WorldWideScience

Sample records for reliability analysis empirical

  1. The effect of loss functions on empirical Bayes reliability analysis

    Directory of Open Access Journals (Sweden)

    Camara Vincent A. R.

    1998-01-01

    Full Text Available The aim of the present study is to investigate the sensitivity of empirical Bayes estimates of the reliability function with respect to the choice of the loss function. In addition to applying some of the basic analytical results on empirical Bayes reliability obtained with the use of the “popular” squared error loss function, we shall derive some expressions corresponding to empirical Bayes reliability estimates obtained with the Higgins–Tsokos, the Harris and our proposed logarithmic loss functions. The concept of efficiency, along with the notion of integrated mean square error, will be used as a criterion to numerically compare our results. It is shown that empirical Bayes reliability functions are in general sensitive to the choice of the loss function, and that the squared error loss does not always yield the best empirical Bayes reliability estimate.

  2. The effect of loss functions on empirical Bayes reliability analysis

    Directory of Open Access Journals (Sweden)

    Vincent A. R. Camara

    1999-01-01

    Full Text Available The aim of the present study is to investigate the sensitivity of empirical Bayes estimates of the reliability function with respect to the choice of the loss function. In addition to applying some of the basic analytical results on empirical Bayes reliability obtained with the use of the “popular” squared error loss function, we shall derive some expressions corresponding to empirical Bayes reliability estimates obtained with the Higgins–Tsokos, the Harris and our proposed logarithmic loss functions. The concept of efficiency, along with the notion of integrated mean square error, will be used as a criterion to numerically compare our results.
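
    The two records above study how Bayes reliability estimates change with the loss function. As a purely illustrative sketch of why the choice matters (it does not reproduce the Higgins–Tsokos, Harris, or logarithmic estimators derived in the papers), the Python snippet below assumes exponential lifetimes with a conjugate gamma prior on the failure rate and contrasts the estimate of R(t) under squared-error loss (posterior mean) with the estimate under absolute-error loss (posterior median); all data and prior values are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative setting (not the papers'): exponential lifetimes with a
    # gamma prior on the failure rate lambda, so R(t) = exp(-lambda * t).
    t_obs = np.array([1.2, 0.7, 2.5, 1.9, 0.4])   # hypothetical failure times
    a0, b0 = 1.0, 1.0                              # assumed gamma prior (shape, rate)

    # The gamma prior is conjugate: posterior is Gamma(a0 + n, b0 + sum(t)).
    a_post, b_post = a0 + t_obs.size, b0 + t_obs.sum()
    lam = rng.gamma(a_post, 1.0 / b_post, size=200_000)  # posterior samples of lambda

    t = 1.0                                        # mission time of interest
    R_samples = np.exp(-lam * t)                   # posterior samples of R(t)

    # Bayes estimate under squared-error loss = posterior mean of R(t);
    # under absolute-error loss = posterior median.  The gap between the two
    # is a simple illustration of sensitivity to the choice of loss function.
    print("squared-error loss estimate :", R_samples.mean())
    print("absolute-error loss estimate:", np.median(R_samples))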

  3. Review of the human reliability analysis performed for Empire State Electric Energy Research Corporation

    International Nuclear Information System (INIS)

    Swart, D.; Banz, I.

    1985-01-01

    The Empire State Electric Energy Research Corporation (ESEERCO) commissioned Westinghouse to conduct a human reliability analysis to identify and quantify human error probabilities associated with operator actions for four specific events which may occur in light water reactors: loss of coolant accident, steam generator tube rupture, steam/feed line break, and stuck open pressurizer spray valve. Human Error Probabilities (HEPs) derived from Swain's Technique for Human Error Rate Prediction (THERP) were compared to data obtained from simulator exercises. A correlation was found between the HEPs derived from Swain and the results of the simulator data. The results of this study provide a unique insight into human factors analysis. The HEPs obtained from such probabilistic studies can be used to prioritize scenarios for operator training situations, and thus improve the correlation between simulator exercises and real control room experiences

  4. Lessons Learned on Benchmarking from the International Human Reliability Analysis Empirical Study

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Forester, John A.; Bye, Andreas; Dang, Vinh N.; Lois, Erasmia

    2010-01-01

    The International Human Reliability Analysis (HRA) Empirical Study is a comparative benchmark of the prediction of HRA methods to the performance of nuclear power plant crews in a control room simulator. There are a number of unique aspects to the present study that distinguish it from previous HRA benchmarks, most notably the emphasis on a method-to-data comparison instead of a method-to-method comparison. This paper reviews seven lessons learned about HRA benchmarking from conducting the study: (1) the dual purposes of the study afforded by joining another HRA study; (2) the importance of comparing not only quantitative but also qualitative aspects of HRA; (3) consideration of both negative and positive drivers on crew performance; (4) a relatively large sample size of crews; (5) the use of multiple methods and scenarios to provide a well-rounded view of HRA performance; (6) the importance of clearly defined human failure events; and (7) the use of a common comparison language to 'translate' the results of different HRA methods. These seven lessons learned highlight how the present study can serve as a useful template for future benchmarking studies.

  5. Lessons Learned on Benchmarking from the International Human Reliability Analysis Empirical Study

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; John A. Forester; Andreas Bye; Vinh N. Dang; Erasmia Lois

    2010-06-01

    The International Human Reliability Analysis (HRA) Empirical Study is a comparative benchmark of the prediction of HRA methods to the performance of nuclear power plant crews in a control room simulator. There are a number of unique aspects to the present study that distinguish it from previous HRA benchmarks, most notably the emphasis on a method-to-data comparison instead of a method-to-method comparison. This paper reviews seven lessons learned about HRA benchmarking from conducting the study: (1) the dual purposes of the study afforded by joining another HRA study; (2) the importance of comparing not only quantitative but also qualitative aspects of HRA; (3) consideration of both negative and positive drivers on crew performance; (4) a relatively large sample size of crews; (5) the use of multiple methods and scenarios to provide a well-rounded view of HRA performance; (6) the importance of clearly defined human failure events; and (7) the use of a common comparison language to “translate” the results of different HRA methods. These seven lessons learned highlight how the present study can serve as a useful template for future benchmarking studies.

  6. Power electronics reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
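
    As a hedged illustration of the fault-tree approach described above (deriving system reliability from component reliability for a fictitious device, much as the report does), the Python sketch below evaluates a two-gate tree: the device fails if its controller fails or if both redundant power modules fail. The structure and the component unavailabilities are invented for this example.

    # Minimal sketch: system reliability from component reliabilities via a
    # small fault tree.  Basic events are assumed independent.

    def and_gate(*unavailabilities):
        """Output event occurs only if all input events occur."""
        p = 1.0
        for q in unavailabilities:
            p *= q
        return p

    def or_gate(*unavailabilities):
        """Output event occurs if any input event occurs (exact complement form)."""
        p_none = 1.0
        for q in unavailabilities:
            p_none *= (1.0 - q)
        return 1.0 - p_none

    q_controller = 0.01        # assumed probability of controller failure
    q_power_module = 0.05      # assumed probability of one power module failing

    q_power = and_gate(q_power_module, q_power_module)   # both modules must fail
    q_system = or_gate(q_controller, q_power)            # top event

    print("system unavailability:", q_system)            # ~0.0125
    print("system reliability  :", 1.0 - q_system)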

  7. Human reliability analysis

    International Nuclear Information System (INIS)

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory. The treatment draws upon reliability analysis, psychology, human factors engineering, and statistics, integrating elements of these fields within a systems framework. It provides a history of human reliability analysis and includes examples of the application of the systems approach.

  8. Multidisciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, electrical circuits, etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines is investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  9. Analysis and Application of Reliability

    International Nuclear Information System (INIS)

    Jeong, Hae Seong; Park, Dong Ho; Kim, Jae Ju

    1999-05-01

    This book covers the analysis and application of reliability, including the definition, importance and historical background of reliability; the reliability function and failure rate; life distributions and reliability assumptions; the reliability of non-repairable and repairable systems; reliability sampling tests; failure analysis by FMEA and FTA, with cases; accelerated life testing, including basic concepts, acceleration and acceleration factors, and the analysis of accelerated life test data; and maintenance policies for replacement and inspection.
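
    The relation between the reliability function and the failure rate that the book covers can be illustrated with a short Python sketch; the Weibull life distribution and the parameter values below are assumptions chosen only for the example.

    import numpy as np

    # For an assumed Weibull life distribution, the reliability function is
    # R(t) = exp(-(t/eta)^beta) and the failure (hazard) rate is
    # h(t) = f(t)/R(t) = (beta/eta) * (t/eta)^(beta - 1).
    beta, eta = 1.8, 1000.0      # illustrative shape and scale parameters

    t = np.linspace(1.0, 3000.0, 4)
    R = np.exp(-(t / eta) ** beta)
    h = (beta / eta) * (t / eta) ** (beta - 1)

    for ti, Ri, hi in zip(t, R, h):
        print(f"t={ti:7.1f}  R(t)={Ri:.4f}  h(t)={hi:.2e} per hour")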

  10. EMPIRICAL RESEARCH AND CONGREGATIONAL ANALYSIS ...

    African Journals Online (AJOL)

    empirical research has made to the process of congregational analysis. 1 Part of this ... contextual congregational analysis – meeting social and divine desires”) at the IAPT .... methodology of a congregational analysis should be regarded as a process. ... essential to create space for a qualitative and quantitative approach.

  11. Waste package reliability analysis

    International Nuclear Information System (INIS)

    Pescatore, C.; Sastre, C.

    1983-01-01

    Proof of future performance of a complex system such as a high-level nuclear waste package over a period of hundreds to thousands of years cannot be had in the ordinary sense of the word. The general method of probabilistic reliability analysis could provide an acceptable framework to identify, organize, and convey the information necessary to satisfy the criterion of reasonable assurance of waste package performance according to the regulatory requirements set forth in 10 CFR 60. General principles which may be used to evaluate the qualitative and quantitative reliability of a waste package design are indicated and illustrated with a sample calculation of a repository concept in basalt. 8 references, 1 table

  12. Empirically Testing Thematic Analysis (ETTA)

    DEFF Research Database (Denmark)

    Gildberg, Frederik Alkier; Bradley, Stephen K.; Tingleff, Elllen B.

    2015-01-01

    Text analysis is not a question of a right or wrong way to go about it, but a question of different traditions. These tend to not only give answers to how to conduct an analysis, but also to provide the answer as to why it is conducted in the way that it is. The problem however may be that the link between tradition and tool is unclear. The main objective of this article is therefore to present Empirical Testing Thematic Analysis, a step by step approach to thematic text analysis; discussing strengths and weaknesses, so that others might assess its potential as an approach that they might utilize/develop for themselves. The advantage of utilizing the presented analytic approach is argued to be the integral empirical testing, which should assure systematic development, interpretation and analysis of the source textual material.

  13. Empirical analysis of consumer behavior

    NARCIS (Netherlands)

    Huang, Yufeng

    2015-01-01

    This thesis consists of three essays in quantitative marketing, focusing on structural empirical analysis of consumer behavior. In the first essay, he investigates the role of a consumer's skill of product usage, and its imperfect transferability across brands, in her product choice. It shows that

  14. Integrated system reliability analysis

    DEFF Research Database (Denmark)

    Gintautas, Tomas; Sørensen, John Dalsgaard

    Specific targets: 1) The report shall describe the state of the art of reliability and risk-based assessment of wind turbine components. 2) Development of methodology for reliability and risk-based assessment of the wind turbine at system level. 3) Describe quantitative and qualitative measures...

  15. Structural systems reliability analysis

    International Nuclear Information System (INIS)

    Frangopol, D.

    1975-01-01

    For an exact evaluation of the reliability of a structure it appears necessary to determine the distribution densities of the loads and resistances and to calculate the correlation coefficients between loads and between resistances. These statistical characteristics can be obtained only on the basis of a long activity period. In cases where such studies are missing, the statistical properties formulated here give upper and lower bounds on the reliability. (orig./HP) [de
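
    A minimal sketch of this kind of bounding is given below; it uses the classic simple (first-order) bounds for a series system of failure modes with unknown correlation, which may differ from the specific bounds formulated in the paper, and the mode failure probabilities are invented.

    # Simple bounds for a series structural system when the dependence between
    # failure modes is unknown: the lower bound corresponds to fully dependent
    # modes, the upper bound to the union (sum) of mode probabilities.
    p_modes = [1e-4, 5e-5, 2e-4]   # assumed failure probabilities of the modes

    lower = max(p_modes)                 # fully dependent failure modes
    upper = min(1.0, sum(p_modes))       # union bound

    print(f"system failure probability between {lower:.1e} and {upper:.1e}")
    print(f"system reliability between {1 - upper:.6f} and {1 - lower:.6f}")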

  16. A reliability simulation language for reliability analysis

    International Nuclear Information System (INIS)

    Deans, N.D.; Miller, A.J.; Mann, D.P.

    1986-01-01

    The results of work being undertaken to develop a Reliability Description Language (RDL) which will enable reliability analysts to describe complex reliability problems in a simple, clear and unambiguous way are described. Component and system features can be stated in a formal manner and subsequently used, along with control statements to form a structured program. The program can be compiled and executed on a general-purpose computer or special-purpose simulator. (DG)

  17. Scyllac equipment reliability analysis

    International Nuclear Information System (INIS)

    Gutscher, W.D.; Johnson, K.J.

    1975-01-01

    Most of the failures in Scyllac can be related to crowbar trigger cable faults. A new cable has been designed, procured, and is currently undergoing evaluation. When the new cable has been proven, it will be worked into the system as quickly as possible without causing too much additional down time. The cable-tip problem may not be easy or even desirable to solve. A tightly fastened permanent connection that maximizes contact area would be more reliable than the plug-in type of connection in use now, but it would make system changes and repairs much more difficult. The balance of the failures have such a low occurrence rate that they do not cause much down time and no major effort is underway to eliminate them. Even though Scyllac was built as an experimental system and has many thousands of components, its reliability is very good. Because of this the experiment has been able to progress at a reasonable pace

  18. Integrating reliability analysis and design

    International Nuclear Information System (INIS)

    Rasmuson, D.M.

    1980-10-01

    This report describes the Interactive Reliability Analysis Project and demonstrates the advantages of using computer-aided design systems (CADS) in reliability analysis. Common cause failure problems require presentations of systems, analysis of fault trees, and evaluation of solutions to these. Results have to be communicated between the reliability analyst and the system designer. Using a computer-aided design system saves time and money in the analysis of design. Computer-aided design systems lend themselves to cable routing, valve and switch lists, pipe routing, and other component studies. At EG and G Idaho, Inc., the Applicon CADS is being applied to the study of water reactor safety systems

  19. Reliability analysis of shutdown system

    International Nuclear Information System (INIS)

    Kumar, C. Senthil; John Arul, A.; Pal Singh, Om; Suryaprakasa Rao, K.

    2005-01-01

    This paper presents the results of reliability analysis of Shutdown System (SDS) of Indian Prototype Fast Breeder Reactor. Reliability analysis carried out using Fault Tree Analysis predicts a value of 3.5 × 10⁻⁸/de for failure of shutdown function in case of global faults and 4.4 × 10⁻⁸/de for local faults. Based on 20 de/y, the frequency of shutdown function failure is 0.7 × 10⁻⁶/ry, which meets the reliability target set by the Indian Atomic Energy Regulatory Board. The reliability is limited by Common Cause Failure (CCF) of actuation part of SDS and to a lesser extent CCF of electronic components. The failure frequency of individual systems is of the order of 10⁻³/ry, which also meets the safety criteria. Uncertainty analysis indicates a maximum error factor of 5 for the top event unavailability
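
    The quoted figures can be cross-checked with a one-line calculation: the per-demand failure probability combined with the 20 demands per year assumed in the abstract gives the annual frequency of shutdown-function failure.

    # Consistency check of the abstract's numbers (global faults).
    p_fail_per_demand = 3.5e-8     # failure probability per demand
    demands_per_year = 20          # de/y assumed in the abstract

    freq_per_reactor_year = p_fail_per_demand * demands_per_year
    print(freq_per_reactor_year)   # 7e-07, i.e. 0.7e-6 per reactor-year as quoted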

  20. Risk analysis and reliability

    International Nuclear Information System (INIS)

    Uppuluri, V.R.R.

    1979-01-01

    Mathematical foundations of risk analysis are addressed. The importance of having the same probability space in order to compare different experiments is pointed out. Then the following topics are discussed: consequences as random variables with infinite expectations; the phenomenon of rare events; series-parallel systems and different kinds of randomness that could be imposed on such systems; and the problem of consensus of estimates of expert opinion

  1. Analyses of reliability characteristics of emergency diesel generator population using empirical Bayes methods

    International Nuclear Information System (INIS)

    Vesely, W.E.; Uryas'ev, S.P.; Samanta, P.K.

    1993-01-01

    Emergency Diesel Generators (EDGs) provide backup power to nuclear power plants in case of failure of AC buses. The reliability of EDGs is important to assure response to loss-of-offsite-power accident scenarios, a dominant contributor to the plant risk. The reliable performance of EDGs has been of concern both for regulators and plant operators. In this paper the authors present an approach and results from the analysis of failure data from a large population of EDGs. They used an empirical Bayes approach to obtain both the population distribution and the individual failure probabilities from EDG failure-to-start and load-run data over 4 years for 194 EDGs at 63 plant units
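
    As a hedged sketch of a parametric empirical Bayes treatment of failure-to-start data in the spirit of this study (the authors' exact formulation may differ), the snippet below fits a beta population distribution to per-EDG failure counts by maximising the beta-binomial marginal likelihood and then shrinks each EDG's estimate toward the population; the counts are invented, not the 194-EDG data set.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import betaln

    # Hypothetical failure-to-start counts and demand counts for ten EDGs.
    failures = np.array([0, 1, 0, 2, 0, 0, 1, 3, 0, 1])
    demands  = np.array([60, 55, 70, 58, 62, 64, 59, 61, 66, 63])

    def neg_marginal_loglik(params):
        """Negative beta-binomial marginal log likelihood (binomial coefficient
        dropped because it does not depend on the prior parameters a, b)."""
        a, b = np.exp(params)                   # keep a, b positive
        ll = betaln(a + failures, b + demands - failures) - betaln(a, b)
        return -ll.sum()

    res = minimize(neg_marginal_loglik, x0=np.log([1.0, 50.0]), method="Nelder-Mead")
    a_hat, b_hat = np.exp(res.x)

    # Fitted population (prior) distribution of the failure probability ...
    print(f"population distribution: Beta({a_hat:.2f}, {b_hat:.2f}), "
          f"mean p = {a_hat / (a_hat + b_hat):.4f}")

    # ... and shrunken estimates of the individual EDG failure probabilities.
    posterior_means = (a_hat + failures) / (a_hat + b_hat + demands)
    print(posterior_means.round(4))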

  2. The quantitative failure of human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.

    1995-07-01

    This philosophical treatise argues the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. The author contends that historic and current HRA have failed to inform policy makers who make decisions based on the risk that humans contribute to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion, and tempers itself with a rational debate over the weight given to subjective and empirical probabilities.

  3. A Reliability Test of a Complex System Based on Empirical Likelihood

    OpenAIRE

    Zhou, Yan; Fu, Liya; Zhang, Jun; Hui, Yongchang

    2016-01-01

    To analyze the reliability of a complex system described by minimal paths, an empirical likelihood method is proposed to solve the reliability test problem when the subsystem distributions are unknown. Furthermore, we provide a reliability test statistic for the complex system and derive the limit distribution of the test statistic. Therefore, we can obtain the confidence interval for reliability and make statistical inferences. The simulation studies also demonstrate the theoretical results.

  4. Travel Time Reliability for Urban Networks : Modelling and Empirics

    NARCIS (Netherlands)

    Zheng, F.; Liu, Xiaobo; van Zuylen, H.J.; Li, Jie; Lu, Chao

    2017-01-01

    The importance of travel time reliability in traffic management, control, and network design has received a lot of attention in the past decade. In this paper, a network travel time distribution model based on the Johnson curve system is proposed. The model is applied to field travel time data

  5. Reliability analysis and operator modelling

    International Nuclear Information System (INIS)

    Hollnagel, Erik

    1996-01-01

    The paper considers the state of operator modelling in reliability analysis. Operator models are needed in reliability analysis because operators are needed in process control systems. HRA methods must therefore be able to account both for human performance variability and for the dynamics of the interaction. A selected set of first generation HRA approaches is briefly described in terms of the operator model they use, their classification principle, and the actual method they propose. In addition, two examples of second generation methods are also considered. It is concluded that first generation HRA methods generally have very simplistic operator models, either referring to the time-reliability relationship or to elementary information processing concepts. It is argued that second generation HRA methods must recognise that cognition is embedded in a context, and be able to account for that in the way human reliability is analysed and assessed

  6. Human reliability under sleep deprivation: Derivation of performance shaping factor multipliers from empirical data

    International Nuclear Information System (INIS)

    Griffith, Candice D.; Mahadevan, Sankaran

    2015-01-01

    This paper develops a probabilistic approach that could use empirical data to derive values of performance shaping factor (PSF) multipliers for use in quantitative human reliability analysis (HRA). The proposed approach is illustrated with data on sleep deprivation effects on performance. A review of existing HRA methods reveals that sleep deprivation is not explicitly included at present, and expert opinion is frequently used to inform HRA model multipliers. In this paper, quantitative data from empirical studies regarding the effect of continuous hours of wakefulness on performance measures (reaction time, accuracy, and number of lapses) are used to develop a method to derive PSF multiplier values for sleep deprivation, in the context of the SPAR-H model. Data is extracted from the identified studies according to the meta-analysis research synthesis method and used to investigate performance trends and error probabilities. The error probabilities in test and control conditions are compared, and the resulting probability ratios are suggested for use in informing the selection of PSF multipliers in HRA methods. Although illustrated for sleep deprivation, the proposed methodology is general, and can be applied to other performance shaping factors. - Highlights: • Method proposed to derive performance shaping factor multipliers from empirical data. • Studies reporting the effect of sleep deprivation on performance are analyzed. • Test data using psychomotor vigilance tasks are analyzed. • Error probability multipliers computed for reaction time, lapses, and accuracy measures.
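
    The core of the proposed derivation, the ratio of error probabilities in the test and control conditions used to inform a PSF multiplier, reduces to a short calculation; the counts below are hypothetical and not the meta-analysis results reported in the paper.

    # Hypothetical error counts from a psychomotor vigilance task.
    errors_control, trials_control = 12, 400     # rested (control) condition
    errors_test, trials_test = 54, 400           # sleep-deprived (test) condition

    p_control = errors_control / trials_control
    p_test = errors_test / trials_test

    # Ratio of error probabilities, suggested as input for selecting a
    # sleep-deprivation PSF multiplier in an HRA method such as SPAR-H.
    psf_multiplier = p_test / p_control
    print(f"suggested PSF multiplier ≈ {psf_multiplier:.1f}")   # here 4.5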

  7. Reliability Analysis of Wind Turbines

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard

    2008-01-01

    In order to minimise the total expected life-cycle costs of a wind turbine it is important to estimate the reliability level for all components in the wind turbine. This paper deals with reliability analysis for the tower and blades of onshore wind turbines placed in a wind farm. The limit states considered are, in the ultimate limit state (ULS), extreme conditions in the standstill position and extreme conditions during operation. For wind turbines, where the magnitude of the loads is influenced by the control system, the ultimate limit state can occur in both cases. In the fatigue limit state (FLS) the reliability level for a wind turbine placed in a wind farm is considered, and wake effects from neighbouring wind turbines are taken into account. An illustrative example with calculation of the reliability for mudline bending of the tower is considered. In the example the design is determined according...

  8. Reliability analysis in intelligent machines

    Science.gov (United States)

    Mcinroy, John E.; Saridis, George N.

    1990-01-01

    Given an explicit task to be executed, an intelligent machine must be able to find the probability of success, or reliability, of alternative control and sensing strategies. By using concepts for information theory and reliability theory, new techniques for finding the reliability corresponding to alternative subsets of control and sensing strategies are proposed such that a desired set of specifications can be satisfied. The analysis is straightforward, provided that a set of Gaussian random state variables is available. An example problem illustrates the technique, and general reliability results are presented for visual servoing with a computed torque-control algorithm. Moreover, the example illustrates the principle of increasing precision with decreasing intelligence at the execution level of an intelligent machine.

  9. Design and experimentation of an empirical multistructure framework for accurate, sharp and reliable hydrological ensembles

    Science.gov (United States)

    Seiller, G.; Anctil, F.; Roy, R.

    2017-09-01

    This paper outlines the design and experimentation of an Empirical Multistructure Framework (EMF) for lumped conceptual hydrological modeling. This concept is inspired from modular frameworks, empirical model development, and multimodel applications, and encompasses the overproduce and select paradigm. The EMF concept aims to reduce subjectivity in conceptual hydrological modeling practice and includes model selection in the optimisation steps, reducing initial assumptions on the prior perception of the dominant rainfall-runoff transformation processes. EMF generates thousands of new modeling options from, for now, twelve parent models that share their functional components and parameters. Optimisation resorts to ensemble calibration, ranking and selection of individual child time series based on optimal bias and reliability trade-offs, as well as accuracy and sharpness improvement of the ensemble. Results on 37 snow-dominated Canadian catchments and 20 climatically-diversified American catchments reveal the excellent potential of the EMF in generating new individual model alternatives, with high respective performance values, that may be pooled efficiently into ensembles of seven to sixty constitutive members, with low bias and high accuracy, sharpness, and reliability. A group of 1446 new models is highlighted to offer good potential on other catchments or applications, based on their individual and collective interests. An analysis of the preferred functional components reveals the importance of the production and total flow elements. Overall, results from this research confirm the added value of ensemble and flexible approaches for hydrological applications, especially in uncertain contexts, and open up new modeling possibilities.

  10. RELIABILITY ANALYSIS OF BENDING ...

    African Journals Online (AJOL)

    Reliability analysis of the safety levels of the criteria slabs, have been .... was also noted [2] that if the risk level or β < 3.1), the ... reliability analysis. A study [6] has shown that all geometric variables, ..... Germany, 1988. 12. Hasofer, A. M and ...

  11. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
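
    A much-reduced sketch of the single-loop idea is shown below: epistemic uncertainty (here simply an uncertain mean of the capacity) and aleatory variability are sampled in one Monte Carlo loop for a toy limit state g = R - S. The paper's auxiliary-variable formulation and its FORM/SORM options are more general; all distributions below are assumptions for illustration only.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 1_000_000

    # Single-loop Monte Carlo: epistemic and aleatory variables sampled together.
    mu_R = rng.normal(10.0, 0.5, size=n)       # epistemic: uncertain mean capacity
    R = rng.normal(mu_R, 1.0)                  # aleatory: capacity given its mean
    S = rng.normal(6.0, 1.5, size=n)           # aleatory: load

    # Failure probability integrated over both uncertainty types at once.
    pf = np.mean(R - S < 0.0)
    print(f"failure probability ≈ {pf:.2e}")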

  12. Reliability analysis techniques in power plant design

    International Nuclear Information System (INIS)

    Chang, N.E.

    1981-01-01

    An overview of reliability analysis techniques is presented as applied to power plant design. The key terms, power plant performance, reliability, availability and maintainability are defined. Reliability modeling, methods of analysis and component reliability data are briefly reviewed. Application of reliability analysis techniques from a design engineering approach to improving power plant productivity is discussed. (author)
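
    Of the key terms listed above, availability is the one most often reduced to a single number; a minimal illustration with assumed (hypothetical) MTBF and MTTR values is:

    # Steady-state availability from mean time between failures and mean time
    # to repair; the figures are hypothetical.
    mtbf_hours = 2000.0    # mean time between failures
    mttr_hours = 24.0      # mean time to repair

    availability = mtbf_hours / (mtbf_hours + mttr_hours)
    print(f"steady-state availability ≈ {availability:.4f}")   # ≈ 0.9881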

  13. Empirical direction in design and analysis

    CERN Document Server

    Anderson, Norman H

    2001-01-01

    The goal of Norman H. Anderson's new book is to help students develop skills of scientific inference. To accomplish this he organized the book around the "Experimental Pyramid"--six levels that represent a hierarchy of considerations in empirical investigation--conceptual framework, phenomena, behavior, measurement, design, and statistical inference. To facilitate conceptual and empirical understanding, Anderson de-emphasizes computational formulas and null hypothesis testing. Other features include: *emphasis on visual inspection as a basic skill in experimental analysis to help student

  14. A novel random-pulser concept for empirical reliability studies of complex systems

    International Nuclear Information System (INIS)

    Priesmeyer, H.G.

    1985-01-01

    The concept of a computer-controlled pseudo-random pulser is described, which is able to produce pulse sequences obeying statistical distributions, used in probability assessments of safety technology. It shall be used in empirical investigations of the reliability of complex systems. (orig.) [de

  15. The Reliability and Stability of an Inferred Phylogenetic Tree from Empirical Data.

    Science.gov (United States)

    Katsura, Yukako; Stanley, Craig E; Kumar, Sudhir; Nei, Masatoshi

    2017-03-01

    The reliability of a phylogenetic tree obtained from empirical data is usually measured by the bootstrap probability (Pb) of interior branches of the tree. If the bootstrap probability is high for most branches, the tree is considered to be reliable. If some interior branches show relatively low bootstrap probabilities, we are not sure that the inferred tree is really reliable. Here, we propose another quantity measuring the reliability of the tree, called the stability of a subtree. This quantity refers to the probability (Ps) of obtaining a subtree of the inferred tree. We then show that if the tree is to be reliable, both Pb and Ps must be high. We also show that Ps is given by the bootstrap probability of the subtree with the closest outgroup sequence, and the computer program RESTA for computing the Pb and Ps values is presented. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  16. The Italian Footwear Industry: an Empirical Analysis

    OpenAIRE

    Pirolo, Luca; Giustiniano, Luca; Nenni, Maria Elena

    2013-01-01

    This paper aims to provide readers with a deep empirical analysis on the Italian footwear industry in order to investigate the evolution of its structure (trends in sales and production, number of firms and employees, main markets, etc.), together with the identification of the main drivers of competitiveness in order to explain the strategies implemented by local actors.

  17. The problem analysis for empirical studies

    NARCIS (Netherlands)

    Groenland, E.A.G.

    2014-01-01

    This article proposes a systematic methodology for the development of a problem analysis for cross-sectional, empirical research. This methodology is referred to as the 'Annabel approach'. It is suitable both for academic studies and applied (business) studies. In addition it can be used for both

  18. Human Reliability Analysis: session summary

    International Nuclear Information System (INIS)

    Hall, R.E.

    1985-01-01

    The use of Human Reliability Analysis (HRA) to identify and resolve human factors issues has significantly increased over the past two years. Today, utilities, research institutions, consulting firms, and the regulatory agency have found a common application of HRA tools and Probabilistic Risk Assessment (PRA). The "1985 IEEE Third Conference on Human Factors and Power Plants" devoted three sessions to the discussion of these applications and a review of the insights so gained. This paper summarizes the three sessions and presents those common conclusions that were discussed during the meeting. The paper concludes that session participants supported the use of an adequately documented "living PRA" to address human factors issues in design and procedural changes, regulatory compliance, and training and that the techniques can produce cost effective qualitative results that are complementary to more classical human factors methods

  19. Fundamentals and applications of systems reliability analysis

    International Nuclear Information System (INIS)

    Boesebeck, K.; Heuser, F.W.; Kotthoff, K.

    1976-01-01

    The lecture surveys the application of reliability analysis methods to assess the safety of nuclear power plants. Particular attention is given to the statements that reliability analysis can provide in connection with the specifications of the nuclear licensing procedure. Existing specifications of safety criteria are also discussed with the help of reliability analysis, using the example of the reliability analysis of a reactor protection system. Beyond the limited application to single safety systems, the significance of reliability analysis for a closed risk concept is explained in the last part of the lecture. (orig./LH) [de

  20. Residential PV system users' perception of profitability, reliability, and failure risk: An empirical survey in a local Japanese municipality

    International Nuclear Information System (INIS)

    Mukai, Toshihiro; Kawamoto, Shishin; Ueda, Yuzuru; Saijo, Miki; Abe, Naoya

    2011-01-01

    Although previous studies have addressed the reliability of residential PV systems in order to improve the dissemination of the systems among individual users and societies, few have examined users' perception of their own PV systems, which might contain solutions to firmly establish the system into society. First, the present paper examined the extent to which residential PV system users understand specification, reliability, and failure risk of their own systems. Second, causal factors affecting users' satisfaction with PV systems were examined. By analyzing data collected in Kakegawa City, this paper revealed that users did not appropriately understand the basic specifications of their residential PV systems, and in particular, the fact that the systems sometimes failed and therefore needed proper maintenance. Furthermore, a strong causal relationship between users' expectations of financial return from the system and their level of satisfaction was confirmed empirically. These results suggested that excessive focus on profitability and relatively low interest in the systems' reliability and failure risk should be addressed more to avoid problems that could potentially hamper the establishment of this technology into society. - Highlights: → We examined PV users' perception of its specification, reliability, and failure risk. → Data for analysis were collected by questionnaire survey in a Japanese local municipality. → We revealed users did not appropriately understand the basic specifications. → A strong causal relationship between users' expectations of financial return and their level of satisfaction was confirmed empirically.

  1. Empirical analysis of uranium spot prices

    International Nuclear Information System (INIS)

    Morman, M.R.

    1988-01-01

    The objective is to empirically test a market model of the uranium industry that incorporates the notion that, if the resource is viewed as an asset by economic agents, then its own rate of return along with the own rate of return of a competing asset would be a major factor in formulating the price of the resource. The model tested is based on a market model of supply and demand. The supply model incorporates the notion that the decision criterion used by uranium mine owners is to select the extraction rate that maximizes the net present value of their extraction receipts. The demand model uses a concept that allows for explicit recognition of the prospect of arbitrage between a natural-resource market and the market for other capital goods. The empirical approach used for estimation was a recursive or causal model. The empirical results were consistent with the theoretical models. The coefficients of the demand and supply equations had the appropriate signs. Tests for causality were conducted to validate the use of the causal model. The results obtained were favorable. The implications of the findings for future studies of exhaustible resources are: (1) in some cases causal models are the appropriate specification for empirical analysis; (2) supply models should incorporate a measure to capture depletion effects

  2. Advances in human reliability analysis in Mexico

    International Nuclear Information System (INIS)

    Nelson, Pamela F.; Gonzalez C, M.; Ruiz S, T.; Guillen M, D.; Contreras V, A.

    2010-10-01

    Human Reliability Analysis (HRA) is a very important part of Probabilistic Risk Analysis (PRA), and constant work is dedicated to improving methods, guidance and data in order to approach realism in the results as well as looking for ways to use these to reduce accident frequency at plants. Further, in order to advance in these areas, several HRA studies are being performed globally. Mexico has participated in the International HRA Empirical Study, with the objective of benchmarking HRA methods by comparing HRA predictions to actual crew performance in a simulator, as well as in the empirical study on a US nuclear power plant currently in progress. The focus of the first study was the development of an understanding of how methods are applied by various analysts, and to characterize the methods for their capability to guide the analysts to identify potential human failures, and associated causes and performance shaping factors. The HRA benchmarking study has been performed by using the Halden simulator, 14 European crews, and 15 HRA teams (NRC, EPRI, and foreign teams using different HRA methods). This effort in Mexico is reflected through the work being performed on updating the Laguna Verde PRA to comply with the ASME PRA standard. For the HRA to be considered technically adequate, that is, capability category II for risk-informed applications, the methodology used for the HRA in the original PRA is not considered sufficiently detailed, and it had to be upgraded. The HCR/CBDT/THERP method was chosen, since this is used in many nuclear plants with similar design. The HRA update includes identification and evaluation of human errors that can occur during testing and maintenance, as well as human errors that can occur during an accident using the Emergency Operating Procedures. The review of procedures for maintenance, surveillance and operation is a necessary step in HRA and provides insight into the possible

  3. On Bayesian System Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen Ringi, M

    1995-05-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs.

  4. On Bayesian System Reliability Analysis

    International Nuclear Information System (INIS)

    Soerensen Ringi, M.

    1995-01-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs

  5. Cost analysis of reliability investigations

    International Nuclear Information System (INIS)

    Schmidt, F.

    1981-01-01

    Taking Epstein's testing theory as a basis, premises are formulated for the selection of cost-optimized reliability inspection plans. Using an example, the expected testing costs and inspection time periods of various inspection plan types, standardized on the basis of the exponential distribution, are compared. It can be shown that sequential reliability tests usually involve lower costs than failure- or time-fixed tests. The most 'costly' test is to be expected with the inspection plan type NOt. (orig.) [de

  6. Reliability Analysis of Money Habitudes

    Science.gov (United States)

    Delgadillo, Lucy M.; Bushman, Brittani S.

    2015-01-01

    Use of the Money Habitudes exercise has gained popularity among various financial professionals. This article reports on the reliability of this resource. A survey administered to young adults at a western state university was conducted, and each Habitude or "domain" was analyzed using Cronbach's alpha procedures. Results showed all six…
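
    For reference, the reliability coefficient used in this study, Cronbach's alpha, can be computed per domain as sketched below; the response matrix is random placeholder data rather than the survey data analysed in the article.

    import numpy as np

    rng = np.random.default_rng(2)
    # 100 respondents, 6 items on a 1-5 scale (placeholder data for one domain).
    scores = rng.integers(1, 6, size=(100, 6)).astype(float)

    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)
    total_variance = scores.sum(axis=1).var(ddof=1)

    # Cronbach's alpha: internal-consistency reliability of the item set.
    alpha = (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)
    print(f"Cronbach's alpha = {alpha:.3f}")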

  7. Power system reliability analysis using fault trees

    International Nuclear Information System (INIS)

    Volkanovski, A.; Cepin, M.; Mavko, B.

    2006-01-01

    The power system reliability analysis method is developed from the aspect of reliable delivery of electrical energy to customers. The method is developed based on the fault tree analysis, which is widely applied in the Probabilistic Safety Assessment (PSA). The method is adapted for the power system reliability analysis. The method is developed in a way that only the basic reliability parameters of the analysed power system are necessary as an input for the calculation of reliability indices of the system. The modeling and analysis was performed on an example power system consisting of eight substations. The results include the level of reliability of current power system configuration, the combinations of component failures resulting in a failed power delivery to loads, and the importance factors for components and subsystems. (author)
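
    A hedged sketch of the kind of output such a method produces is shown below: given minimal cut sets (combinations of component failures that interrupt delivery to a load) and basic-event unavailabilities, it estimates the load-point unavailability with the rare-event approximation and a Fussell-Vesely importance for each component. The small network and all numbers are invented; the eight-substation model of the paper is not reproduced here.

    # Hypothetical basic-event unavailabilities and minimal cut sets for one load point.
    unavail = {"line1": 2e-3, "line2": 2e-3, "transformer": 5e-4, "busbar": 1e-4}
    cut_sets = [("busbar",), ("transformer",), ("line1", "line2")]

    def cut_set_prob(cut_set):
        """Probability that every basic event in the cut set occurs (independence assumed)."""
        p = 1.0
        for comp in cut_set:
            p *= unavail[comp]
        return p

    # Rare-event approximation of the top event (failed power delivery to the load).
    q_top = sum(cut_set_prob(cs) for cs in cut_sets)
    print(f"load-point unavailability ≈ {q_top:.2e}")

    # Fussell-Vesely importance: fraction of the top-event probability
    # contributed by cut sets containing the component.
    for comp in unavail:
        contrib = sum(cut_set_prob(cs) for cs in cut_sets if comp in cs)
        print(f"Fussell-Vesely importance of {comp}: {contrib / q_top:.3f}")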

  8. Unemployment and Mental Disorders - An Empirical Analysis

    DEFF Research Database (Denmark)

    Agerbo, Esben; Eriksson, Tor Viking; Mortensen, Preben Bo

    1998-01-01

    The purpose of this paper is also to analyze the importance of unemployment and other social factors as risk factors for impaired mental health. It departs from previous studies in that we make use of information about first admissions to a psychiatric hospital or ward as our measure of mental...... from the Psychiatric case register. Secondly, we estimate conditional logistic regression models for case-control data on first admissions to a psychiatric hospital. The explanatory variables in the empirical analysis include age, gender, education, marital status, income, wealth, and unemployment (and...

  9. Reliability analysis of software based safety functions

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1993-05-01

    The methods applicable in the reliability analysis of software based safety functions are described in the report. Although the safety functions also include other components, the main emphasis in the report is on the reliability analysis of software. The check-list type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as the software fault tree analysis. The safety analysis based on the Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that the recent software reliability models do not produce the estimates needed in PSA directly. As a result of the study some recommendations and conclusions are drawn. The need for formal methods in the analysis and development of software based systems, the applicability of qualitative reliability engineering methods in connection with PSA, and the need to make the requirements for software based systems and their analyses in the regulatory guides more precise should be mentioned. (orig.). (46 refs., 13 figs., 1 tab.)

  10. Reliability analysis of reactor pressure vessel intensity

    International Nuclear Information System (INIS)

    Zheng Liangang; Lu Yongbo

    2012-01-01

    This paper performs the reliability analysis of a reactor pressure vessel (RPV) with ANSYS. The analysis methods include the direct Monte Carlo Simulation method, Latin Hypercube Sampling, central composite design and Box-Behnken Matrix design. The RPV integrity reliability under the given input conditions is obtained. The result shows that the effects on the RPV base material reliability are internal pressure, allowable basic stress and elasticity modulus of the base material in descending order, and the effects on the bolt reliability are allowable basic stress of the bolt material, preload of the bolt and internal pressure in descending order. (authors)
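
    One of the sampling schemes named above, Latin Hypercube Sampling, can be sketched in a few lines; the two input variables, their distributions, and the exceedance check below are generic assumptions for illustration, not the ANSYS-based RPV model of the paper.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    n = 10_000

    def lhs_uniform(n_samples, rng):
        """One stratified U(0,1) sample per equal-probability stratum, shuffled."""
        strata = (np.arange(n_samples) + rng.random(n_samples)) / n_samples
        return rng.permutation(strata)

    # Map stratified uniforms through assumed marginal distributions.
    pressure = stats.norm.ppf(lhs_uniform(n, rng), loc=15.0, scale=0.5)    # MPa
    strength = stats.lognorm.ppf(lhs_uniform(n, rng), s=0.08, scale=20.0)  # MPa

    # Crude demand-versus-capacity check on the LHS sample.
    p_exceed = np.mean(pressure * 1.2 > strength)
    print(f"estimated exceedance probability ≈ {p_exceed:.4f}")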

  11. Reliability Analysis of Adhesive Bonded Scarf Joints

    DEFF Research Database (Denmark)

    Kimiaeifar, Amin; Toft, Henrik Stensgaard; Lund, Erik

    2012-01-01

    A probabilistic model for the reliability analysis of adhesive bonded scarfed lap joints subjected to static loading is developed. It is representative for the main laminate in a wind turbine blade subjected to flapwise bending. The structural analysis is based on a three dimensional (3D) finite element analysis (FEA). For the reliability analysis a design equation is considered which is related to a deterministic code-based design equation where reliability is secured by partial safety factors together with characteristic values for the material properties and loads. The failure criteria are formulated using a von Mises, a modified von Mises and a maximum stress failure criterion. The reliability level is estimated for the scarfed lap joint and this is compared with the target reliability level implicitly used in the wind turbine standard IEC 61400-1. A convergence study is performed to validate...

  12. Issues in benchmarking human reliability analysis methods: A literature review

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Hendrickson, Stacey M.L.; Forester, John A.; Tran, Tuan Q.; Lois, Erasmia

    2010-01-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  13. Issues in benchmarking human reliability analysis methods : a literature review.

    Energy Technology Data Exchange (ETDEWEB)

    Lois, Erasmia (US Nuclear Regulatory Commission); Forester, John Alan; Tran, Tuan Q. (Idaho National Laboratory, Idaho Falls, ID); Hendrickson, Stacey M. Langfitt; Boring, Ronald L. (Idaho National Laboratory, Idaho Falls, ID)

    2008-04-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  14. Estimating Ordinal Reliability for Likert-Type and Ordinal Item Response Data: A Conceptual, Empirical, and Practical Guide

    Science.gov (United States)

    Gadermann, Anne M.; Guhn, Martin; Zumbo, Bruno D.

    2012-01-01

    This paper provides a conceptual, empirical, and practical guide for estimating ordinal reliability coefficients for ordinal item response data (also referred to as Likert, Likert-type, ordered categorical, or rating scale item responses). Conventionally, reliability coefficients, such as Cronbach's alpha, are calculated using a Pearson…

  15. Analysis of the Reliability of the "Alternator- Alternator Belt" System

    Directory of Open Access Journals (Sweden)

    Ivan Mavrin

    2012-10-01

    Full Text Available Before starting and also during the exploitation of various systems, it is very important to know how the system and its parts will behave during operation regarding breakdowns, i.e. failures. It is possible to predict the service behaviour of a system by determining the functions of reliability, as well as frequency and intensity of failures. The paper considers the theoretical basics of the functions of reliability, frequency and intensity of failures for the two main approaches. One includes 6 equal intervals and the other 13 unequal intervals for the concrete case taken from practice. The reliability of the "alternator - alternator belt" system installed in the buses has been analysed, according to the empirical data on failures. The empirical data on failures provide empirical functions of reliability and frequency and intensity of failures, that are presented in tables and graphically. The first analysis performed by dividing the mean time between failures into 6 equal time intervals has given the forms of empirical functions of failure frequency and intensity that approximately correspond to typical functions. By dividing the failure phase into 13 unequal intervals with two failures in each interval, these functions indicate explicit transitions from the early failure interval into the random failure interval, i.e. into the ageing interval. Functions thus obtained are more accurate and represent a better solution for the given case. In order to estimate the reliability of these systems with greater accuracy, a greater number of failures needs to be analysed.
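
    The empirical functions described in this abstract can be reproduced schematically: group observed times-to-failure into intervals and estimate the failure frequency, failure intensity and reliability in each interval. The sketch below uses six equal intervals and synthetic failure times, not the bus-fleet data of the paper.

    import numpy as np

    # Synthetic times-to-failure (hours), standing in for the field data.
    times = np.sort(np.random.default_rng(3).weibull(1.5, size=60) * 400.0)
    n_total = times.size
    n_intervals = 6
    edges = np.linspace(0.0, times.max() + 1e-9, n_intervals + 1)

    survivors = n_total
    for lo, hi in zip(edges[:-1], edges[1:]):
        d = np.count_nonzero((times >= lo) & (times < hi))   # failures in interval
        width = hi - lo
        f = d / (n_total * width)          # empirical failure frequency (density)
        lam = d / (survivors * width)      # empirical failure intensity (hazard)
        R = (survivors - d) / n_total      # empirical reliability at interval end
        print(f"[{lo:6.1f},{hi:6.1f})  f={f:.5f}  lambda={lam:.5f}  R={R:.3f}")
        survivors -= d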

  16. An Empirical Analysis of the Budget Deficit

    Directory of Open Access Journals (Sweden)

    Ioan Talpos

    2007-11-01

    Full Text Available Economic policies and, particularly, fiscal policies are not designed and implemented in an “empty space”: the structural characteristics of the economic systems, the institutional architecture of societies, the cultural paradigm and the power relations between different social groups define the borders of these policies. This paper tries to deal with these borders, to describe their nature and the implications of their existence for the fiscal policies’ quality and impact, at a theoretical level as well as at an empirical one. The main results of the proposed analysis support the idea that the mentioned variables matter both for the social mandate entrusted by the society to the state, and thus for the role and functions of the state, and for economic growth as supported by the resources collected and distributed by the public authorities.

  17. System Reliability Analysis Considering Correlation of Performances

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Saekyeol; Lee, Tae Hee [Hanyang Univ., Seoul (Korea, Republic of); Lim, Woochul [Mando Corporation, Seongnam (Korea, Republic of)

    2017-04-15

    Reliability analysis of a mechanical system has been developed in order to consider the uncertainties in the product design that may occur from the tolerance of design variables, uncertainties of noise, environmental factors, and material properties. In most of the previous studies, the reliability was calculated independently for each performance of the system. However, the conventional methods cannot consider the correlation between the performances of the system that may lead to a difference between the reliability of the entire system and the reliability of the individual performance. In this paper, the joint probability density function (PDF) of the performances is modeled using a copula which takes into account the correlation between performances of the system. The system reliability is proposed as the integral of joint PDF of performances and is compared with the individual reliability of each performance by mathematical examples and two-bar truss example.

  18. System Reliability Analysis Considering Correlation of Performances

    International Nuclear Information System (INIS)

    Kim, Saekyeol; Lee, Tae Hee; Lim, Woochul

    2017-01-01

    Reliability analysis of a mechanical system has been developed in order to consider the uncertainties in the product design that may occur from the tolerance of design variables, uncertainties of noise, environmental factors, and material properties. In most of the previous studies, the reliability was calculated independently for each performance of the system. However, the conventional methods cannot consider the correlation between the performances of the system that may lead to a difference between the reliability of the entire system and the reliability of the individual performance. In this paper, the joint probability density function (PDF) of the performances is modeled using a copula which takes into account the correlation between performances of the system. The system reliability is proposed as the integral of joint PDF of performances and is compared with the individual reliability of each performance by mathematical examples and two-bar truss example.

  19. Reliability analysis using network simulation

    International Nuclear Information System (INIS)

    Engi, D.

    1985-01-01

    The models that can be used to provide estimates of the reliability of nuclear power systems operate at many different levels of sophistication. The least-sophisticated models treat failure processes that entail only time-independent phenomena (such as demand failure). More advanced models treat processes that also include time-dependent phenomena such as run failure and possibly repair. However, many of these dynamic models are deficient in some respects, either because they disregard the time-dependent phenomena that cannot be expressed in closed-form analytic terms or because they treat these phenomena in quasi-static terms. The next level of modeling requires a dynamic approach that incorporates not only procedures for treating all significant time-dependent phenomena but also procedures for treating these phenomena when they are conditionally linked or characterized by arbitrarily selected probability distributions. The level of sophistication that is required is provided by a dynamic, Monte Carlo modeling approach. A computer code that uses a dynamic, Monte Carlo modeling approach is Q-GERT (Graphical Evaluation and Review Technique - with Queueing), and the present study has demonstrated the feasibility of using Q-GERT for modeling time-dependent, unconditionally and conditionally linked phenomena that are characterized by arbitrarily selected probability distributions.
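
    A minimal sketch of the dynamic Monte Carlo idea (not Q-GERT itself): one repairable component with Weibull times to failure and lognormal repair durations, with availability estimated by simulating many operating histories. All parameter values are hypothetical.

        import numpy as np

        rng = np.random.default_rng(42)
        mission_time = 1000.0          # hours (hypothetical)
        n_histories = 20_000

        def history_downtime(rng):
            # Simulate one operating history: Weibull failures, lognormal repairs.
            t, down = 0.0, 0.0
            while True:
                t += rng.weibull(1.5) * 500.0                 # time to next failure (assumed parameters)
                if t >= mission_time:
                    return down
                repair = rng.lognormal(mean=2.0, sigma=0.5)   # repair duration (assumed parameters)
                down += min(repair, mission_time - t)
                t += repair

        downtime = np.array([history_downtime(rng) for _ in range(n_histories)])
        print(f"estimated mission availability: {1.0 - downtime.mean() / mission_time:.4f}")

    Arbitrary failure and repair distributions, and conditional links between them, can be substituted without changing the simulation structure, which is the advantage the abstract attributes to the Monte Carlo approach.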

  20. On the Reliability of Source Time Functions Estimated Using Empirical Green's Function Methods

    Science.gov (United States)

    Gallegos, A. C.; Xie, J.; Suarez Salas, L.

    2017-12-01

    The Empirical Green's Function (EGF) method (Hartzell, 1978) has been widely used to extract source time functions (STFs). In this method, seismograms generated by collocated events with different magnitudes are deconvolved. Under a fundamental assumption that the STF of the small event is a delta function, the deconvolved Relative Source Time Function (RSTF) yields the large event's STF. While this assumption can be empirically justified by examination of differences in event size and frequency content of the seismograms, there can be a lack of rigorous justification of the assumption. In practice, a small event might have a finite duration when the RSTF is retrieved and interpreted as the large event STF with a bias. In this study, we rigorously analyze this bias using synthetic waveforms generated by convolving a realistic Green's function waveform with pairs of finite-duration triangular or parabolic STFs. The RSTFs are found using a time-domain based matrix deconvolution. We find when the STFs of smaller events are finite, the RSTFs are a series of narrow non-physical spikes. Interpreting these RSTFs as a series of high-frequency source radiations would be very misleading. The only reliable and unambiguous information we can retrieve from these RSTFs is the difference in durations and the moment ratio of the two STFs. We can apply a Tikhonov smoothing to obtain a single-pulse RSTF, but its duration is dependent on the choice of weighting, which may be subjective. We then test the Multi-Channel Deconvolution (MCD) method (Plourde & Bostock, 2017) which assumes that both STFs have finite durations to be solved for. A concern about the MCD method is that the number of unknown parameters is larger, which would tend to make the problem rank-deficient. Because the kernel matrix is dependent on the STFs to be solved for under a positivity constraint, we can only estimate the rank-deficiency with a semi-empirical approach. Based on the results so far, we find that the
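
    A minimal sketch of the mechanics behind EGF deconvolution on synthetic data: a shared Green's function is convolved with two finite-duration triangular source time functions, and a water-level (Tikhonov-style) spectral division recovers a relative STF. The waveforms, durations and regularization level are all hypothetical; this is not the matrix deconvolution or MCD method discussed in the abstract.

        import numpy as np

        rng = np.random.default_rng(0)
        dt, n = 0.01, 1024
        t = np.arange(n) * dt

        # Synthetic Green's function and two finite-duration triangular STFs.
        green = rng.standard_normal(n) * np.exp(-t / 2.0)      # stand-in path/site response
        def triangle(duration):
            stf = np.maximum(0.0, 1.0 - np.abs(t - duration / 2) / (duration / 2))
            return stf / stf.sum()

        big = np.convolve(green, triangle(0.8))[:n]             # large-event record
        small = np.convolve(green, triangle(0.2))[:n]           # EGF record (finite, not a delta)

        # Water-level regularized spectral division: big / small in the frequency domain.
        B, S = np.fft.rfft(big), np.fft.rfft(small)
        eps = 0.01 * np.max(np.abs(S)) ** 2
        rstf = np.fft.irfft(B * np.conj(S) / (np.abs(S) ** 2 + eps), n=n)

        print("samples above 10% of the RSTF peak:", int(np.sum(rstf > 0.1 * rstf.max())))

    Because the small event's STF is not a delta function, the recovered pulse reflects the difference between the two STFs rather than the large event's STF alone, which is the bias the study analyzes.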

  1. Analysis of information security reliability: A tutorial

    International Nuclear Information System (INIS)

    Kondakci, Suleyman

    2015-01-01

    This article presents a concise reliability analysis of network security abstracted from stochastic modeling, reliability, and queuing theories. Network security analysis is composed of threats, their impacts, and recovery of the failed systems. A unique framework with a collection of the key reliability models is presented here to guide the determination of the system reliability based on the strength of malicious acts and performance of the recovery processes. A unique model, called Attack-obstacle model, is also proposed here for analyzing systems with immunity growth features. Most computer science curricula do not contain courses in reliability modeling applicable to different areas of computer engineering. Hence, the topic of reliability analysis is often too diffuse to most computer engineers and researchers dealing with network security. This work is thus aimed at shedding some light on this issue, which can be useful in identifying models, their assumptions and practical parameters for estimating the reliability of threatened systems and for assessing the performance of recovery facilities. It can also be useful for the classification of processes and states regarding the reliability of information systems. Systems with stochastic behaviors undergoing queue operations and random state transitions can also benefit from the approaches presented here. - Highlights: • A concise survey and tutorial in model-based reliability analysis applicable to information security. • A framework of key modeling approaches for assessing reliability of networked systems. • The framework facilitates quantitative risk assessment tasks guided by stochastic modeling and queuing theory. • Evaluation of approaches and models for modeling threats, failures, impacts, and recovery analysis of information systems
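
    As a minimal illustration of the kind of quantity such reliability models produce (and not the article's Attack-obstacle model), the steady-state availability of a system that alternates between operational and compromised states with exponential attack and recovery rates can be computed directly; the rates below are hypothetical.

        # Steady-state availability of a single threatened system with exponential
        # attack (failure) and recovery rates: A = mu / (lambda + mu).
        attack_rate = 0.05     # compromises per hour (hypothetical)
        recovery_rate = 0.50   # recoveries per hour (hypothetical)

        availability = recovery_rate / (attack_rate + recovery_rate)
        mean_time_between_compromises = 1.0 / attack_rate
        mean_recovery_time = 1.0 / recovery_rate

        print(f"steady-state availability: {availability:.3f}")
        print(f"MTBC: {mean_time_between_compromises:.1f} h, MTTR: {mean_recovery_time:.1f} h")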

  2. Space Mission Human Reliability Analysis (HRA) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of this project is to extend current ground-based Human Reliability Analysis (HRA) techniques to a long-duration, space-based tool to more effectively...

  3. Reliability analysis of stiff versus flexible piping

    International Nuclear Information System (INIS)

    Lu, S.C.

    1985-01-01

    The overall objective of this research project is to develop a technical basis for flexible piping designs which will improve piping reliability and minimize the use of pipe supports, snubbers, and pipe whip restraints. The current study was conducted to establish the necessary groundwork based on the piping reliability analysis. A confirmatory piping reliability assessment indicated that removing rigid supports and snubbers tends to either improve or affect very little the piping reliability. The authors then investigated a couple of changes to be implemented in Regulatory Guide (RG) 1.61 and RG 1.122 aimed at more flexible piping design. They concluded that these changes substantially reduce calculated piping responses and allow piping redesigns with significant reduction in number of supports and snubbers without violating ASME code requirements. Furthermore, the more flexible piping redesigns are capable of exhibiting reliability levels equal to or higher than the original stiffer design. An investigation of the malfunction of pipe whip restraints confirmed that the malfunction introduced higher thermal stresses and tended to reduce the overall piping reliability. Finally, support and component reliabilities were evaluated based on available fragility data. Results indicated that the support reliability usually exhibits a moderate decrease as the piping flexibility increases. Most on-line pumps and valves showed an insignificant reduction in reliability for a more flexible piping design

  4. Reliability and validity of risk analysis

    International Nuclear Information System (INIS)

    Aven, Terje; Heide, Bjornar

    2009-01-01

    In this paper we investigate to what extent risk analysis meets the scientific quality requirements of reliability and validity. We distinguish between two types of approaches within risk analysis, relative frequency-based approaches and Bayesian approaches. The former category includes both traditional statistical inference methods and the so-called probability of frequency approach. Depending on the risk analysis approach, the aim of the analysis is different, the results are presented in different ways and consequently the meaning of the concepts reliability and validity are not the same.

  5. Multi-Disciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song

    1997-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code developed under the leadership of NASA Lewis Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines are investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multi-disciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  6. Empirical analysis of online human dynamics

    Science.gov (United States)

    Zhao, Zhi-Dan; Zhou, Tao

    2012-06-01

    Patterns of human activities have attracted increasing academic interest, since the quantitative understanding of human behavior is helpful to uncover the origins of many socioeconomic phenomena. This paper focuses on behaviors of Internet users. Six large-scale systems are studied in our experiments, including the movie-watching in Netflix and MovieLens, the transaction in Ebay, the bookmark-collecting in Delicious, and the posting in FriendFeed and Twitter. Empirical analysis reveals some common statistical features of online human behavior: (1) The total number of user's actions, the user's activity, and the interevent time all follow heavy-tailed distributions. (2) There exists a strongly positive correlation between user's activity and the total number of user's actions, and a significantly negative correlation between the user's activity and the width of the interevent time distribution. We further study the rescaling method and show that this method could to some extent eliminate the different statistics among users caused by the different activities, yet the effectiveness depends on the data sets.
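
    A minimal sketch of one of the measurements described above: computing interevent times and gauging a heavy tail with a simple Hill-type exponent estimate. The data here are synthetic Pareto draws standing in for the empirical interevent times, and the estimator is a rough illustration rather than the paper's analysis.

        import numpy as np

        rng = np.random.default_rng(1)
        # Synthetic activity: interevent times drawn from a Pareto (heavy-tailed) law.
        interevent = rng.pareto(a=1.5, size=50_000) + 1.0

        # A simple Hill-type estimate of the tail exponent over the largest k observations.
        k = 1000
        tail = np.sort(interevent)[-k:]
        hill_alpha = 1.0 / np.mean(np.log(tail / tail[0]))
        print(f"Hill tail-exponent estimate: {hill_alpha:.2f} (true value 1.5)")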

  7. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1982-01-01

    This paper describes a fault tree analysis package that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The technique is applied to the reliability analysis of the recently upgraded HIFAR Containment Isolation System. (author)

  8. An update on the "empirical turn" in bioethics: analysis of empirical research in nine bioethics journals.

    Science.gov (United States)

    Wangmo, Tenzin; Hauri, Sirin; Gennet, Eloise; Anane-Sarpong, Evelyn; Provoost, Veerle; Elger, Bernice S

    2018-02-07

    A review of literature published a decade ago noted a significant increase in empirical papers across nine bioethics journals. This study provides an update on the presence of empirical papers in the same nine journals. It first evaluates whether the empirical trend is continuing as noted in the previous study, and second, how it is changing, that is, what are the characteristics of the empirical works published in these nine bioethics journals. A review of the same nine journals (Bioethics; Journal of Medical Ethics; Journal of Clinical Ethics; Nursing Ethics; Cambridge Quarterly of Healthcare Ethics; Hastings Center Report; Theoretical Medicine and Bioethics; Christian Bioethics; and Kennedy Institute of Ethics Journal) was conducted for a 12-year period from 2004 to 2015. Data obtained were analysed descriptively and using a non-parametric Chi-square test. Of the total number of original papers (N = 5567) published in the nine bioethics journals, 18.1% (n = 1007) collected and analysed empirical data. Journal of Medical Ethics and Nursing Ethics led the empirical publications, accounting for 89.4% of all empirical papers. The former published significantly more quantitative papers than qualitative, whereas the latter published more qualitative papers. Our analysis reveals no significant difference (χ2 = 2.857; p = 0.091) between the proportion of empirical papers published in 2004-2009 and 2010-2015. However, the increasing empirical trend has continued in these journals with the proportion of empirical papers increasing from 14.9% in 2004 to 17.8% in 2015. This study presents the current state of affairs regarding empirical research published in nine bioethics journals. In the quarter century of data that is available about the nine bioethics journals studied in two reviews, the proportion of empirical publications continues to increase, signifying a trend towards empirical research in bioethics. The growing volume is mainly attributable to two

  9. Reliability analysis of Angra I safety systems

    International Nuclear Information System (INIS)

    Oliveira, L.F.S. de; Soto, J.B.; Maciel, C.C.; Gibelli, S.M.O.; Fleming, P.V.; Arrieta, L.A.

    1980-07-01

    An extensive reliability analysis of some safety systems of Angra I, are presented. The fault tree technique, which has been successfully used in most reliability studies of nuclear safety systems performed to date is employed. Results of a quantitative determination of the unvailability of the accumulator and the containment spray injection systems are presented. These results are also compared to those reported in WASH-1400. (E.G.) [pt

  10. Sources of Currency Crisis: An Empirical Analysis

    OpenAIRE

    Weber, Axel A.

    1997-01-01

    Two types of currency crisis models coexist in the literature: first generation models view speculative attacks as being caused by economic fundamentals which are inconsistent with a given parity. Second generation models claim self-fulfilling speculation as the main source of a currency crisis. Recent empirical research in international macroeconomics has attempted to distinguish between the sources of currency crises. This paper adds to this literature by proposing a new empirical approach ...

  11. An empirical analysis of Diaspora bonds

    OpenAIRE

    AKKOYUNLU, Şule; STERN, Max

    2018-01-01

    Abstract. This study is the first to investigate theoretically and empirically the determinants of Diaspora Bonds for eight developing countries (Bangladesh, Ethiopia, Ghana, India, Lebanon, Pakistan, the Philippines, and Sri-Lanka) and one developed country - Israel for the period 1951 and 2008. Empirical results are consistent with the predictions of the theoretical model. The most robust variables are the closeness indicator and the sovereign rating, both on the demand-side. The spread is ...

  12. Swimming pool reactor reliability and safety analysis

    International Nuclear Information System (INIS)

    Li Zhaohuan

    1997-01-01

    A reliability and safety analysis of the Swimming Pool Reactor at the China Institute of Atomic Energy has been performed using event/fault tree techniques. The paper briefly describes the analysis model, the analysis code and the main results. It also describes the impact of unassigned operation status on safety, the estimated effectiveness of defence tactics in maintenance against common cause failure, the effectiveness of recovery actions on system reliability, and a comparison of core damage frequencies obtained with generic and specific data

  13. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1980-01-01

    A fault tree analysis package is described that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage, and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The computations are standard: identification of minimal cut-sets, estimation of reliability parameters, and ranking of the effect of the individual component failure modes and system failure modes on these parameters. The user can vary the fault trees and data on-line, and print selected data for preferred systems in a form suitable for inclusion in safety reports. A case history is given: that of the HIFAR containment isolation system. (author)
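
    A toy illustration of the core computation named above, the identification of minimal cut sets, using a top-down Boolean expansion on a three-gate tree; the gate structure and basic-event probabilities are hypothetical and this is not the package described in the abstract.

        from itertools import product
        from math import prod

        # A tiny fault tree: TOP = AND(G1, G2), G1 = OR(A, B), G2 = OR(B, C).
        tree = {
            "TOP": ("AND", ["G1", "G2"]),
            "G1":  ("OR",  ["A", "B"]),
            "G2":  ("OR",  ["B", "C"]),
        }

        def cut_sets(gate):
            # Expand a gate into a list of cut sets (each a frozenset of basic events).
            if gate not in tree:                      # basic event
                return [frozenset([gate])]
            op, children = tree[gate]
            child_sets = [cut_sets(c) for c in children]
            if op == "OR":
                return [cs for sets in child_sets for cs in sets]
            # AND gate: combine one cut set from each child.
            return [frozenset().union(*combo) for combo in product(*child_sets)]

        def minimal(sets):
            return [s for s in sets if not any(other < s for other in sets)]

        mcs = sorted(set(minimal(cut_sets("TOP"))), key=len)
        print("minimal cut sets:", [sorted(s) for s in mcs])      # [['B'], ['A', 'C']]

        # Rare-event approximation of the top-event probability (assumed event data).
        p = {"A": 1e-3, "B": 5e-4, "C": 2e-3}
        print("P(TOP) ≈", sum(prod(p[e] for e in s) for s in mcs))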

  14. Reliability Analysis of Elasto-Plastic Structures

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Sørensen, John Dalsgaard

    1984-01-01

    . Failure of this type of system is defined either as formation of a mechanism or by failure of a prescribed number of elements. In the first case failure is independent of the order in which the elements fail, but this is not so by the second definition. The reliability analysis consists of two parts...... are described and the two definitions of failure can be used by the first formulation, but only the failure definition based on formation of a mechanism by the second formulation. The second part of the reliability analysis is an estimate of the failure probability for the structure on the basis...

  15. Reliability analysis and assessment of structural systems

    International Nuclear Information System (INIS)

    Yao, J.T.P.; Anderson, C.A.

    1977-01-01

    The study of structural reliability deals with the probability of having satisfactory performance of the structure under consideration within any specific time period. To pursue this study, it is necessary to apply available knowledge and methodology in structural analysis (including dynamics) and design, behavior of materials and structures, experimental mechanics, and the theory of probability and statistics. In addition, various severe loading phenomena such as strong motion earthquakes and wind storms are important considerations. For three decades now, much work has been done on reliability analysis of structures, and during this past decade, certain so-called 'Level I' reliability-based design codes have been proposed and are in various stages of implementation. These contributions will be critically reviewed and summarized in this paper. Because of the undesirable consequences resulting from the failure of nuclear structures, it is important and desirable to consider the structural reliability in the analysis and design of these structures. Moreover, after these nuclear structures are constructed, it is desirable for engineers to be able to assess the structural reliability periodically as well as immediately following the occurrence of severe loading conditions such as a strong-motion earthquake. During this past decade, increasing use has been made of techniques of system identification in structural engineering. On the basis of non-destructive test results, various methods have been developed to obtain an adequate mathematical model (such as the equations of motion with more realistic parameters) to represent the structural system

  16. Culture Representation in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    David Gertman; Julie Marble; Steven Novack

    2006-12-01

    Understanding human-system response is critical to being able to plan and predict mission success in the modern battlespace. Commonly, human reliability analysis has been used to predict failures of human performance in complex, critical systems. However, most human reliability methods fail to take culture into account. This paper takes an easily understood state of the art human reliability analysis method and extends that method to account for the influence of culture, including acceptance of new technology, upon performance. The cultural parameters used to modify the human reliability analysis were determined from two standard industry approaches to cultural assessment: Hofstede’s (1991) cultural factors and Davis’ (1989) technology acceptance model (TAM). The result is called the Culture Adjustment Method (CAM). An example is presented that (1) reviews human reliability assessment with and without cultural attributes for a Supervisory Control and Data Acquisition (SCADA) system attack, (2) demonstrates how country specific information can be used to increase the realism of HRA modeling, and (3) discusses the differences in human error probability estimates arising from cultural differences.

  17. The Interrelation among Faithful Representation (Reliability, Corruption and IFRS Adoption: An Empirical Investigation

    Directory of Open Access Journals (Sweden)

    Alexios Kythreotis

    2015-08-01

    Full Text Available Purpose – The degree of corruption, among other things, indicates the non-implementation of laws, weak enforcement of legal sanctions and the existence of non-transparent economic transactions. Therefore, the expected change in reliability (faithful representation resulting from the adoption of IAS/IFRS does not depend solely on the adoption of IAS/IFRS but is also influenced by the degree of corruption in each country. The purpose of this paper is to examine whether the above statement is true. Design/methodology/approach – The data were taken from the DataStream database and the sample consists of listed companies of fifteen European countries that adopted IAS/IFRS mandatorily. The time horizon is 10 years, from 2000 until 2009. The period between 2000 and 2004 is defined as the period before the adoption, while the period between 2005 and 2009 is defined as the period after the adoption. The reliability/faithful representation of financial statements - as defined by the Conceptual Framework - is detected through regression analysis. Findings – The findings suggest that the adoption of IFRS/IAS alone is not enough. It appears that the level of reliability of financial statements in every country does not depend solely on the adoption of IAS/IFRS but is also influenced by the degree of corruption in each country. Research limitations/implications – The models that are used for the measurement of reliability use short-term accruals as an independent variable. Given that, the models fail to take into consideration accounting treatments that concern non-current assets/liabilities. Originality/value – The findings identified for countries with a high degree of corruption indicate a statistically significant reduction in reliability after the adoption of IAS/IFRS. These findings constitute a useful tool for the IASB and the European Commission as well as for the users of financial statements.

  18. Reliability Analysis of a Steel Frame

    Directory of Open Access Journals (Sweden)

    M. Sýkora

    2002-01-01

    Full Text Available A steel frame with haunches is designed according to Eurocodes. The frame is exposed to self-weight, snow, and wind actions. Lateral-torsional buckling appears to represent the most critical criterion, which is considered as a basis for the limit state function. In the reliability analysis, the probabilistic models proposed by the Joint Committee for Structural Safety (JCSS) are used for basic variables. The uncertainty model coefficients take into account the inaccuracy of the resistance model for the haunched girder and the inaccuracy of the action effect model. The time invariant reliability analysis is based on Turkstra's rule for combinations of snow and wind actions. The time variant analysis describes snow and wind actions by jump processes with intermittencies. Assuming a 50-year lifetime, the obtained values of the reliability index β vary within the range from 3.95 up to 5.56. The cross-profile IPE 330 designed according to Eurocodes seems to be adequate. It appears that the time invariant reliability analysis based on Turkstra's rule provides considerably lower values of β than those obtained by the time variant analysis.
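
    A minimal sketch of how a reliability index β is obtained from a limit state function by crude Monte Carlo, with a Turkstra-style combination (the leading action at its lifetime maximum, the accompanying action at a point-in-time value). All distributions and parameters are hypothetical, not the JCSS models or the frame analysed in the paper.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(2)
        n = 1_000_000

        # Hypothetical basic variables (arbitrary units): resistance, permanent, snow, wind.
        R = rng.lognormal(mean=np.log(300.0), sigma=0.08, size=n)
        G = rng.normal(100.0, 5.0, size=n)
        S_max = rng.gumbel(60.0, 10.0, size=n)      # leading action: lifetime maximum snow
        W_apt = rng.gumbel(20.0, 8.0, size=n)       # accompanying action: point-in-time wind

        g = R - (G + S_max + W_apt)                 # limit state function (failure if g < 0)
        pf = np.mean(g < 0.0)
        beta = -norm.ppf(pf)
        print(f"pf ≈ {pf:.2e}, reliability index β ≈ {beta:.2f}")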

  19. Representative Sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Petersen, Lars; Esbensen, Kim Harry

    2005-01-01

    regime in order to secure the necessary reliability of: samples (which must be representative, from the primary sampling onwards), analysis (which will not mean anything outside the miniscule analytical volume without representativity ruling all mass reductions involved, also in the laboratory) and data...

  20. Reliability analysis of prestressed concrete containment structures

    International Nuclear Information System (INIS)

    Jiang, J.; Zhao, Y.; Sun, J.

    1993-01-01

    The reliability analysis of prestressed concrete containment structures subjected to combinations of static and dynamic loads with consideration of uncertainties of structural and load parameters is presented. Limit state probabilities for given parameters are calculated using the procedure developed at BNL, while that with consideration of parameter uncertainties are calculated by a fast integration for time variant structural reliability. The limit state surface of the prestressed concrete containment is constructed directly incorporating the prestress. The sensitivities of the Choleskey decomposition matrix and the natural vibration character are calculated by simplified procedures. (author)

  1. Prime implicants in dynamic reliability analysis

    International Nuclear Information System (INIS)

    Tyrväinen, Tero

    2016-01-01

    This paper develops an improved definition of a prime implicant for the needs of dynamic reliability analysis. Reliability analyses often aim to identify minimal cut sets or prime implicants, which are minimal conditions that cause an undesired top event, such as a system's failure. Dynamic reliability analysis methods take the time-dependent behaviour of a system into account. This means that the state of a component can change in the analysed time frame and prime implicants can include the failure of a component at different time points. There can also be dynamic constraints on a component's behaviour. For example, a component can be non-repairable in the given time frame. If a non-repairable component needs to be failed at a certain time point to cause the top event, we consider that the condition that it is failed at the latest possible time point is minimal, and the condition in which it fails earlier non-minimal. The traditional definition of a prime implicant does not account for this type of time-related minimality. In this paper, a new definition is introduced and illustrated using a dynamic flowgraph methodology model. - Highlights: • A new definition of a prime implicant is developed for dynamic reliability analysis. • The new definition takes time-related minimality into account. • The new definition is needed in dynamic flowgraph methodology. • Results can be represented by a smaller number of prime implicants.

  2. Insurability of Cyber Risk: An Empirical Analysis

    OpenAIRE

    Biener, Christian; Eling, Martin; Wirfs, Jan Hendrik

    2015-01-01

    This paper discusses the adequacy of insurance for managing cyber risk. To this end, we extract 994 cases of cyber losses from an operational risk database and analyse their statistical properties. Based on the empirical results and recent literature, we investigate the insurability of cyber risk by systematically reviewing the set of criteria introduced by Berliner (1982). Our findings emphasise the distinct characteristics of cyber risks compared with other operational risks and bring to li...

  3. Compassion: An Evolutionary Analysis and Empirical Review

    OpenAIRE

    Goetz, Jennifer L.; Keltner, Dacher; Simon-Thomas, Emiliana

    2010-01-01

    What is compassion? And how did it evolve? In this review, we integrate three evolutionary arguments that converge on the hypothesis that compassion evolved as a distinct affective experience whose primary function is to facilitate cooperation and protection of the weak and those who suffer. Our empirical review reveals compassion to have distinct appraisal processes attuned to undeserved suffering, distinct signaling behavior related to caregiving patterns of touch, posture, and vocalization...

  4. The observational and empirical thermospheric CO2 and NO power do not exhibit power-law behavior; an indication of their reliability

    Science.gov (United States)

    Varotsos, C. A.; Efstathiou, M. N.

    2018-03-01

    In this paper we investigate the evolution of the energy emitted by CO2 and NO from the Earth's thermosphere on a global scale using both observational and empirically derived data. In the beginning, we analyze the daily power observations of CO2 and NO received from the Sounding of the Atmosphere using Broadband Emission Radiometry (SABER) equipment on the NASA Thermosphere-Ionosphere-Mesosphere Energetics and Dynamics (TIMED) satellite for the entire period 2002-2016. We then perform the same analysis on the empirical daily power emitted by CO2 and NO that were derived recently from the infrared energy budget of the thermosphere during 1947-2016. The tool used for the analysis of the observational and empirical datasets is the detrended fluctuation analysis, in order to investigate whether the power emitted by CO2 and by NO from the thermosphere exhibits power-law behavior. The results obtained from both observational and empirical data do not support the establishment of the power-law behavior. This conclusion reveals that the empirically derived data are characterized by the same intrinsic properties as those of the observational ones, thus enhancing the validity of their reliability.
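
    A minimal first-order detrended fluctuation analysis (DFA), the tool named in the abstract, applied to a synthetic series; the fluctuation function F(s) is fitted against scale s on log-log axes to obtain the scaling exponent. The series, scales and expected exponent (≈0.5 for uncorrelated noise) are illustrative, not the SABER data.

        import numpy as np

        def dfa(x, scales):
            # First-order detrended fluctuation analysis of a 1-D series.
            y = np.cumsum(x - np.mean(x))                  # integrated profile
            F = []
            for s in scales:
                n_seg = len(y) // s
                segments = y[: n_seg * s].reshape(n_seg, s)
                t = np.arange(s)
                rms = []
                for seg in segments:
                    coef = np.polyfit(t, seg, 1)           # local linear trend
                    rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
                F.append(np.sqrt(np.mean(rms)))
            return np.array(F)

        rng = np.random.default_rng(3)
        x = rng.standard_normal(5000)                      # uncorrelated noise: alpha ≈ 0.5
        scales = np.array([16, 32, 64, 128, 256])
        F = dfa(x, scales)
        alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
        print(f"DFA scaling exponent alpha ≈ {alpha:.2f}")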

  5. Reliability and risk analysis methods research plan

    International Nuclear Information System (INIS)

    1984-10-01

    This document presents a plan for reliability and risk analysis methods research to be performed mainly by the Reactor Risk Branch (RRB), Division of Risk Analysis and Operations (DRAO), Office of Nuclear Regulatory Research. It includes those activities of other DRAO branches which are very closely related to those of the RRB. Related or interfacing programs of other divisions, offices and organizations are merely indicated. The primary use of this document is envisioned as an NRC working document, covering about a 3-year period, to foster better coordination in reliability and risk analysis methods development between the offices of Nuclear Regulatory Research and Nuclear Reactor Regulation. It will also serve as an information source for contractors and others to more clearly understand the objectives, needs, programmatic activities and interfaces together with the overall logical structure of the program

  6. Human reliability analysis of control room operators

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Isaac J.A.L.; Carvalho, Paulo Victor R.; Grecco, Claudio H.S. [Instituto de Engenharia Nuclear (IEN), Rio de Janeiro, RJ (Brazil)

    2005-07-01

    Human reliability is the probability that a person correctly performs a system-required action in the required time period and performs no extraneous action that can degrade the system. Human reliability analysis (HRA) is the analysis, prediction and evaluation of work-oriented human performance using indices such as human error likelihood and probability of task accomplishment. Significant progress has been made in the HRA field during the last years, mainly in the nuclear area. Some first-generation HRA methods were developed, such as THERP (Technique for Human Error Rate Prediction). Now, an array of so-called second-generation methods is emerging as alternatives, for instance ATHEANA (A Technique for Human Event Analysis). The ergonomics approach uses ergonomic work analysis as its tool. It focuses on the study of the operator's activities in their physical and mental forms, considering at the same time the observed characteristics of the operator and the elements of the work environment as they are presented to and perceived by the operators. The aim of this paper is to propose a methodology to analyze the human reliability of operators of industrial plant control rooms, using a framework that includes the approaches used by ATHEANA and THERP and ergonomic work analysis. (author)

  7. Planning Irreversible Electroporation in the Porcine Kidney: Are Numerical Simulations Reliable for Predicting Empiric Ablation Outcomes?

    International Nuclear Information System (INIS)

    Wimmer, Thomas; Srimathveeravalli, Govindarajan; Gutta, Narendra; Ezell, Paula C.; Monette, Sebastien; Maybody, Majid; Erinjery, Joseph P.; Durack, Jeremy C.; Coleman, Jonathan A.; Solomon, Stephen B.

    2015-01-01

    Purpose: Numerical simulations are used for treatment planning in clinical applications of irreversible electroporation (IRE) to determine ablation size and shape. To assess the reliability of simulations for treatment planning, we compared simulation results with empiric outcomes of renal IRE using computed tomography (CT) and histology in an animal model. Methods: The ablation size and shape for six different IRE parameter sets (70–90 pulses, 2,000–2,700 V, 70–100 µs) for monopolar and bipolar electrodes was simulated using a numerical model. Employing these treatment parameters, 35 CT-guided IRE ablations were created in both kidneys of six pigs and followed up with CT immediately and after 24 h. Histopathology was analyzed from postablation day 1. Results: Ablation zones on CT measured 81 ± 18 % (day 0, p ≤ 0.05) and 115 ± 18 % (day 1, p ≤ 0.09) of the simulated size for monopolar electrodes, and 190 ± 33 % (day 0, p ≤ 0.001) and 234 ± 12 % (day 1, p ≤ 0.0001) for bipolar electrodes. Histopathology indicated smaller ablation zones than simulated (71 ± 41 %, p ≤ 0.047) and measured on CT (47 ± 16 %, p ≤ 0.005) with complete ablation of kidney parenchyma within the central zone and incomplete ablation in the periphery. Conclusion: Both numerical simulations for planning renal IRE and CT measurements may overestimate the size of ablation compared to histology, and ablation effects may be incomplete in the periphery.

  8. Sensitivity analysis in a structural reliability context

    International Nuclear Information System (INIS)

    Lemaitre, Paul

    2014-01-01

    This thesis' subject is sensitivity analysis in a structural reliability context. The general framework is the study of a deterministic numerical model that allows a complex physical phenomenon to be reproduced. The aim of a reliability study is to estimate the failure probability of the system from the numerical model and the uncertainties of the inputs. In this context, the quantification of the impact of the uncertainty of each input parameter on the output might be of interest. This step is called sensitivity analysis. Many scientific works deal with this topic but not in the reliability scope. This thesis' aim is to test existing sensitivity analysis methods, and to propose more efficient original methods. A bibliographical step on sensitivity analysis on one hand and on the estimation of small failure probabilities on the other hand is first proposed. This step raises the need to develop appropriate techniques. Two variable-ranking methods are then explored. The first one proposes to make use of binary classifiers (random forests). The second one measures the departure, at each step of a subset method, between each input's original density and its density given the subset reached. A more general and original methodology reflecting the impact of the input density modification on the failure probability is then explored. The proposed methods are then applied to the CWNR case, which motivates this thesis. (author)

  9. Human reliability analysis using event trees

    International Nuclear Information System (INIS)

    Heslinga, G.

    1983-01-01

    The shut-down procedure of a technologically complex installation as a nuclear power plant consists of a lot of human actions, some of which have to be performed several times. The procedure is regarded as a chain of modules of specific actions, some of which are analyzed separately. The analysis is carried out by making a Human Reliability Analysis event tree (HRA event tree) of each action, breaking down each action into small elementary steps. The application of event trees in human reliability analysis implies more difficulties than in the case of technical systems where event trees were mainly used until now. The most important reason is that the operator is able to recover a wrong performance; memory influences play a significant role. In this study these difficulties are dealt with theoretically. The following conclusions can be drawn: (1) in principle event trees may be used in human reliability analysis; (2) although in practice the operator will recover his fault partly, theoretically this can be described as starting the whole event tree again; (3) compact formulas have been derived, by which the probability of reaching a specific failure consequence on passing through the HRA event tree after several times of recovery is to be calculated. (orig.)
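
    As a simple numerical illustration of the recovery idea described above (and not the paper's compact formulas), the probability of ending a single procedural step in an unrecovered failure can be computed when the operator may retry after each detected error. All probabilities are hypothetical and attempts are assumed independent.

        # Probability of an unrecovered failure for one procedural step with retries.
        p_error = 0.01       # probability of performing the step incorrectly (hypothetical)
        p_detect = 0.9       # probability an error is detected, triggering a retry
        max_retries = 2      # retries allowed after the initial attempt

        def p_unrecovered_failure(p_error, p_detect, max_retries):
            p_fail = 0.0
            p_reach = 1.0                                        # probability of reaching this attempt
            for _ in range(max_retries + 1):
                p_fail += p_reach * p_error * (1.0 - p_detect)   # undetected error: failure
                p_reach *= p_error * p_detect                    # detected error: move to next attempt
            return p_fail + p_reach                              # error detected but retries exhausted

        print(f"P(unrecovered failure) ≈ {p_unrecovered_failure(p_error, p_detect, max_retries):.2e}")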

  10. Structural reliability analysis and seismic risk assessment

    International Nuclear Information System (INIS)

    Hwang, H.; Reich, M.; Shinozuka, M.

    1984-01-01

    This paper presents a reliability analysis method for safety evaluation of nuclear structures. By utilizing this method, it is possible to estimate the limit state probability in the lifetime of structures and to generate analytically the fragility curves for PRA studies. The earthquake ground acceleration, in this approach, is represented by a segment of stationary Gaussian process with a zero mean and a Kanai-Tajimi Spectrum. All possible seismic hazard at a site represented by a hazard curve is also taken into consideration. Furthermore, the limit state of a structure is analytically defined and the corresponding limit state surface is then established. Finally, the fragility curve is generated and the limit state probability is evaluated. In this paper, using a realistic reinforced concrete containment as an example, results of the reliability analysis of the containment subjected to dead load, live load and ground earthquake acceleration are presented and a fragility curve for PRA studies is also constructed
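
    A minimal sketch of the two ingredients named above, a fragility curve and a hazard curve, combined into an annual limit-state frequency. The lognormal fragility parameters and power-law hazard constants are hypothetical and do not represent the containment analysed in the paper.

        import numpy as np
        from scipy.stats import lognorm
        from scipy.integrate import trapezoid

        # Fragility: P(limit state | PGA = a), assumed lognormal with median capacity A_m
        # and logarithmic standard deviation beta_c (hypothetical values).
        A_m, beta_c = 0.9, 0.35                         # in g
        pga = np.linspace(0.05, 2.0, 200)
        fragility = lognorm(s=beta_c, scale=A_m).cdf(pga)

        # Simple power-law seismic hazard curve: annual frequency of exceeding PGA = a.
        k0, k = 1e-4, 2.5                               # hypothetical site constants
        hazard = k0 * (pga / 0.4) ** (-k)

        # Annual limit-state frequency: integrate the fragility against the hazard density.
        hazard_density = -np.gradient(hazard, pga)
        annual_ls_frequency = trapezoid(fragility * hazard_density, pga)
        print(f"annual limit-state frequency ≈ {annual_ls_frequency:.2e} per year")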

  11. Reliability analysis in interdependent smart grid systems

    Science.gov (United States)

    Peng, Hao; Kan, Zhe; Zhao, Dandan; Han, Jianmin; Lu, Jianfeng; Hu, Zhaolong

    2018-06-01

    Complex network theory is a useful way to study many real complex systems. In this paper, a reliability analysis model based on complex network theory is introduced for interdependent smart grid systems. We focus on understanding the structure of smart grid systems, studying the underlying network model, the interactions and relationships between its parts, and how cascading failures occur in interdependent smart grid systems. We propose a practical model for interdependent smart grid systems using complex network theory. Based on percolation theory, we also study the cascading failure effect and present a detailed mathematical analysis of failure propagation in such systems. We analyze the reliability of our proposed model under random attacks or failures by calculating the size of the giant functioning component in interdependent smart grid systems. Our simulation results also show that there exists a threshold for the proportion of faulty nodes, beyond which the smart grid systems collapse. We also determine the critical values for different system parameters. In this way, the reliability analysis model based on complex network theory can be effectively utilized for anti-attack and protection purposes in interdependent smart grid systems.
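
    A minimal sketch of the percolation-style measurement described above, using networkx: remove a random fraction of nodes from a random graph and track the relative size of the giant component. A single network is used here for brevity rather than a coupled interdependent pair, and the graph parameters are hypothetical.

        import random
        import networkx as nx

        random.seed(4)
        n, k_avg = 2000, 4.0
        g0 = nx.gnp_random_graph(n, k_avg / (n - 1), seed=4)

        for frac_removed in [0.0, 0.2, 0.4, 0.6, 0.8]:
            g = g0.copy()
            g.remove_nodes_from(random.sample(list(g.nodes()), int(frac_removed * n)))
            giant = max(nx.connected_components(g), key=len)
            share = len(giant) / n
            print(f"removed {frac_removed:.0%}: giant component holds {share:.2f} of all nodes")

    The sharp drop in the giant-component share as the removed fraction grows is the collapse threshold behaviour the abstract refers to.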

  12. Qualitative analysis in reliability and safety studies

    International Nuclear Information System (INIS)

    Worrell, R.B.; Burdick, G.R.

    1976-01-01

    The qualitative evaluation of system logic models is described as it pertains to assessing the reliability and safety characteristics of nuclear systems. Qualitative analysis of system logic models, i.e., models couched in an event (Boolean) algebra, is defined, and the advantages inherent in qualitative analysis are explained. Certain qualitative procedures that were developed as a part of fault-tree analysis are presented for illustration. Five fault-tree analysis computer-programs that contain a qualitative procedure for determining minimal cut sets are surveyed. For each program the minimal cut-set algorithm and limitations on its use are described. The recently developed common-cause analysis for studying the effect of common-causes of failure on system behavior is explained. This qualitative procedure does not require altering the fault tree, but does use minimal cut sets from the fault tree as part of its input. The method is applied using two different computer programs. 25 refs

  13. Sensitivity analysis in optimization and reliability problems

    International Nuclear Information System (INIS)

    Castillo, Enrique; Minguez, Roberto; Castillo, Carmen

    2008-01-01

    The paper starts giving the main results that allow a sensitivity analysis to be performed in a general optimization problem, including sensitivities of the objective function, the primal and the dual variables with respect to data. In particular, general results are given for non-linear programming, and closed formulas for linear programming problems are supplied. Next, the methods are applied to a collection of civil engineering reliability problems, which includes a bridge crane, a retaining wall and a composite breakwater. Finally, the sensitivity analysis formulas are extended to calculus of variations problems and a slope stability problem is used to illustrate the methods

  14. Sensitivity analysis in optimization and reliability problems

    Energy Technology Data Exchange (ETDEWEB)

    Castillo, Enrique [Department of Applied Mathematics and Computational Sciences, University of Cantabria, Avda. Castros s/n., 39005 Santander (Spain)], E-mail: castie@unican.es; Minguez, Roberto [Department of Applied Mathematics, University of Castilla-La Mancha, 13071 Ciudad Real (Spain)], E-mail: roberto.minguez@uclm.es; Castillo, Carmen [Department of Civil Engineering, University of Castilla-La Mancha, 13071 Ciudad Real (Spain)], E-mail: mariacarmen.castillo@uclm.es

    2008-12-15

    The paper starts giving the main results that allow a sensitivity analysis to be performed in a general optimization problem, including sensitivities of the objective function, the primal and the dual variables with respect to data. In particular, general results are given for non-linear programming, and closed formulas for linear programming problems are supplied. Next, the methods are applied to a collection of civil engineering reliability problems, which includes a bridge crane, a retaining wall and a composite breakwater. Finally, the sensitivity analysis formulas are extended to calculus of variations problems and a slope stability problem is used to illustrate the methods.
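
    As a small worked illustration of objective-function sensitivity to data in linear programming (one of the cases covered by the abstract's closed formulas), the shadow price of each resource can be estimated by perturbing the right-hand sides and re-solving with scipy.optimize.linprog. The LP data below are hypothetical.

        import numpy as np
        from scipy.optimize import linprog

        # Small production-planning LP: maximize 3*x1 + 5*x2 subject to resource limits;
        # linprog minimizes, so the objective is negated.
        c = np.array([-3.0, -5.0])
        A_ub = np.array([[1.0, 0.0],
                         [0.0, 2.0],
                         [3.0, 2.0]])
        b_ub = np.array([4.0, 12.0, 18.0])

        base = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2, method="highs")

        # Sensitivity of the optimal value to each right-hand side (shadow prices),
        # estimated here by finite differences.
        eps = 1e-4
        for i in range(len(b_ub)):
            b_pert = b_ub.copy()
            b_pert[i] += eps
            pert = linprog(c, A_ub=A_ub, b_ub=b_pert, bounds=[(0, None)] * 2, method="highs")
            shadow_price = -(pert.fun - base.fun) / eps
            print(f"resource {i}: shadow price ≈ {shadow_price:.3f}")

    The same values are returned analytically by the dual variables of the LP, which is the closed-form result the paper builds on.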

  15. Infusing Reliability Techniques into Software Safety Analysis

    Science.gov (United States)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  16. Subset simulation for structural reliability sensitivity analysis

    International Nuclear Information System (INIS)

    Song Shufang; Lu Zhenzhou; Qiao Hongwei

    2009-01-01

    Based on two procedures for efficiently generating conditional samples, i.e. Markov chain Monte Carlo (MCMC) simulation and importance sampling (IS), two reliability sensitivity (RS) algorithms are presented. On the basis of the reliability analysis of subset simulation (Subsim), the RS of the failure probability with respect to the distribution parameter of the basic variable is transformed into a set of RS of conditional failure probabilities with respect to the distribution parameter of the basic variable. By use of the conditional samples generated by MCMC simulation and IS, procedures are established to estimate the RS of the conditional failure probabilities. The formulae of the RS estimator, its variance and its coefficient of variation are derived in detail. The results of the illustrations show high efficiency and high precision of the presented algorithms, and they are suitable for highly nonlinear limit state equations and structural systems with single and multiple failure modes.
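
    A minimal subset simulation sketch for a small failure probability, using a random-walk Metropolis sampler to repopulate each conditional level; the limit state, proposal scale and level probability p0 are illustrative choices, and this is the plain Subsim estimate rather than the paper's sensitivity algorithms.

        import numpy as np

        rng = np.random.default_rng(5)

        def g(u):
            # Limit state in standard-normal space; failure when g(u) <= 0.
            # Exact failure probability: P(Z >= 3.5) ≈ 2.3e-4.
            return 3.5 - (u[..., 0] + u[..., 1]) / np.sqrt(2.0)

        def subset_simulation(n=2000, p0=0.1, dim=2, max_levels=10):
            u = rng.standard_normal((n, dim))
            p_f = 1.0
            for _ in range(max_levels):
                vals = g(u)
                order = np.argsort(vals)
                n_seed = int(p0 * n)
                threshold = vals[order[n_seed - 1]]
                if threshold <= 0.0:                          # the failure domain has been reached
                    return p_f * np.mean(vals <= 0.0)
                p_f *= p0
                seeds = u[order[:n_seed]]
                # Repopulate the conditional level with random-walk Metropolis chains.
                samples = []
                for s in seeds:
                    x = s.copy()
                    for _ in range(int(1.0 / p0)):
                        cand = x + 0.8 * rng.standard_normal(dim)
                        accept = rng.random() < np.exp(0.5 * (x @ x - cand @ cand))
                        if accept and g(cand) <= threshold:
                            x = cand
                        samples.append(x.copy())
                u = np.array(samples)[:n]
            return p_f

        print(f"subset simulation estimate: pf ≈ {subset_simulation():.1e} (exact ≈ 2.3e-4)")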

  17. Time-frequency analysis : mathematical analysis of the empirical mode decomposition.

    Science.gov (United States)

    2009-01-01

    Invented over 10 years ago, empirical mode decomposition (EMD) provides a nonlinear time-frequency analysis with the ability to successfully analyze nonstationary signals. Mathematical Analysis of the Empirical Mode Decomposition is a...

  18. A taxonomy for human reliability analysis

    International Nuclear Information System (INIS)

    Beattie, J.D.; Iwasa-Madge, K.M.

    1984-01-01

    A human interaction taxonomy (classification scheme) was developed to facilitate human reliability analysis in a probabilistic safety evaluation of a nuclear power plant, being performed at Ontario Hydro. A human interaction occurs, by definition, when operators or maintainers manipulate, or respond to indication from, a plant component or system. The taxonomy aids the fault tree analyst by acting as a heuristic device. It helps define the range and type of human errors to be identified in the construction of fault trees, while keeping the identification by different analysts consistent. It decreases the workload associated with preliminary quantification of the large number of identified interactions by including a category called 'simple interactions'. Fault tree analysts quantify these according to a procedure developed by a team of human reliability specialists. The interactions which do not fit into this category are called 'complex' and are quantified by the human reliability team. The taxonomy is currently being used in fault tree construction in a probabilistic safety evaluation. As far as can be determined at this early stage, the potential benefits of consistency and completeness in identifying human interactions and streamlining the initial quantification are being realized

  19. System reliability analysis with natural language and expert's subjectivity

    International Nuclear Information System (INIS)

    Onisawa, T.

    1996-01-01

    This paper introduces natural language expressions and expert's subjectivity to system reliability analysis. To this end, this paper defines a subjective measure of reliability and presents the method of the system reliability analysis using the measure. The subjective measure of reliability corresponds to natural language expressions of reliability estimation, which is represented by a fuzzy set defined on [0,1]. The presented method deals with the dependence among subsystems and employs parametrized operations of subjective measures of reliability which can reflect the expert's subjectivity towards the analyzed system. The analysis results are also expressed by linguistic terms. Finally, this paper gives an example of the system reliability analysis by the presented method.

  20. Reliability analysis of containment isolation systems

    International Nuclear Information System (INIS)

    Pelto, P.J.; Counts, C.A.

    1984-06-01

    The Pacific Northwest Laboratory (PNL) is reviewing available information on containment systems design, operating experience, and related research as part of a project being conducted by the Division of Systems Integration, US Nuclear Regulatory Commission. The basic objective of this work is to collect and consolidate data relevant to assessing the functional performance of containment isolation systems and to use this data to the extent possible to characterize containment isolation system reliability for selected reference designs. This paper summarizes the results from initial efforts which focused on collection of data from available documents and briefly describes detailed review and analysis efforts which commenced recently. 5 references

  1. Reliability analysis of containment isolation systems

    International Nuclear Information System (INIS)

    Pelto, P.J.; Ames, K.R.; Gallucci, R.H.

    1985-06-01

    This report summarizes the results of the Reliability Analysis of Containment Isolation System Project. Work was performed in five basic areas: design review, operating experience review, related research review, generic analysis and plant specific analysis. Licensee Event Reports (LERs) and Integrated Leak Rate Test (ILRT) reports provided the major sources of containment performance information used in this study. Data extracted from LERs were assembled into a computer data base. Qualitative and quantitative information developed for containment performance under normal operating conditions and design basis accidents indicate that there is room for improvement. A rough estimate of overall containment unavailability for relatively small leaks which violate plant technical specifications is 0.3. An estimate of containment unavailability due to large leakage events is in the range of 0.001 to 0.01. These estimates are dependent on several assumptions (particularly on event duration times) which are documented in the report

  2. Empirical and theoretical analysis of complex systems

    Science.gov (United States)

    Zhao, Guannan

    structures evolve on a similar timescale to individual level transmission, we investigated the process of transmission through a model population comprising of social groups which follow simple dynamical rules for growth and break-up, and the profiles produced bear a striking resemblance to empirical data obtained from social, financial and biological systems. Finally, for better implementation of a widely accepted power law test algorithm, we have developed a fast testing procedure using parallel computation.

  3. Advancing Usability Evaluation through Human Reliability Analysis

    International Nuclear Information System (INIS)

    Ronald L. Boring; David I. Gertman

    2005-01-01

    This paper introduces a novel augmentation to the current heuristic usability evaluation methodology. The SPAR-H human reliability analysis method was developed for categorizing human performance in nuclear power plants. Despite the specialized use of SPAR-H for safety critical scenarios, the method also holds promise for use in commercial off-the-shelf software usability evaluations. The SPAR-H method shares task analysis underpinnings with human-computer interaction, and it can be easily adapted to incorporate usability heuristics as performance shaping factors. By assigning probabilistic modifiers to heuristics, it is possible to arrive at the usability error probability (UEP). This UEP is not a literal probability of error but nonetheless provides a quantitative basis to heuristic evaluation. When combined with a consequence matrix for usability errors, this method affords ready prioritization of usability issues
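
    A sketch of the multiplicative adjustment described above, in the spirit of SPAR-H: a nominal error probability is scaled by performance-shaping-factor multipliers, here standing in for usability heuristics. The heuristic names, multipliers and nominal value are hypothetical, not the method's calibrated values.

        # SPAR-H-style calculation: nominal error probability scaled by PSF multipliers.
        nominal_hep = 0.001

        psf_multipliers = {
            "visibility_of_system_status": 2.0,    # poor feedback (assumed)
            "match_with_real_world": 1.0,          # nominal
            "error_prevention": 5.0,               # missing confirmation dialogs (assumed)
            "consistency_and_standards": 1.0,      # nominal
        }

        uep = nominal_hep
        for multiplier in psf_multipliers.values():
            uep *= multiplier
        uep = min(uep, 1.0)                        # a probability cannot exceed 1

        print(f"usability error probability (UEP) ≈ {uep:.3g}")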

  4. Integrated Reliability and Risk Analysis System (IRRAS)

    International Nuclear Information System (INIS)

    Russell, K.D.; McKay, M.K.; Sattison, M.B.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance

  5. Government debt in Greece: An empirical analysis

    Directory of Open Access Journals (Sweden)

    Gisele Mah

    2014-06-01

    Full Text Available Greek government debt has been increasing above the percentage stated in the Stability and Growth Pact, from 112.9% in 2008 to 175.6% in 2013. This paper investigates the determinants of general government debt in Greece by means of a Vector Error Correction Model (VECM) framework, variance decomposition and generalized impulse response function analysis. The analysis showed a significant negative relationship between general government debt and the government deficit, and between general government debt and inflation. Shocks to the government deficit and to inflation cause general government debt to increase. The government deficit should be increased, since gross capital formation is included in its calculation and could be invested in income-generating projects. The current account balance should be reduced by improving the net trade balance.
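
    For readers who want to see what such an estimation looks like in practice, the following minimal Python sketch fits a Vector Error Correction Model to synthetic data with statsmodels. The variable names, series length and model settings are illustrative assumptions; the Greek data analysed in the paper are not reproduced here.

        # Minimal VECM sketch on synthetic cointegrated series standing in for
        # debt, deficit and inflation (all values are invented).
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.vector_ar.vecm import VECM

        rng = np.random.default_rng(0)
        n = 120  # assumed number of quarterly observations

        common_trend = np.cumsum(rng.normal(size=n))
        data = pd.DataFrame({
            "gov_debt":    common_trend + rng.normal(scale=0.5, size=n),
            "gov_deficit": 0.6 * common_trend + rng.normal(scale=0.5, size=n),
            "inflation":  -0.3 * common_trend + rng.normal(scale=0.5, size=n),
        })

        # One cointegrating relation, one lagged difference, constant inside the
        # cointegration relation; these choices are illustrative only.
        model = VECM(data, k_ar_diff=1, coint_rank=1, deterministic="ci")
        res = model.fit()

        print("Cointegration vector (beta):\n", res.beta)
        print("Adjustment coefficients (alpha):\n", res.alpha)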

  6. Microscopic saw mark analysis: an empirical approach.

    Science.gov (United States)

    Love, Jennifer C; Derrick, Sharon M; Wiersema, Jason M; Peters, Charles

    2015-01-01

    Microscopic saw mark analysis is a well-published and generally accepted qualitative analytical method. However, little research has focused on identifying and mitigating potential sources of error associated with the method. The presented study proposes the use of classification trees and random forest classifiers as an optimal, statistically sound approach to mitigate the potential for observer variability and outcome error in microscopic saw mark analysis. The statistical model was applied to 58 experimental saw marks created with four types of saws. The saw marks were made in fresh human femurs obtained through anatomical gift and were analyzed using a Keyence digital microscope. The statistical approach weighted the variables based on their discriminatory value and produced decision trees with an associated outcome error rate of 8.62-17.82%. © 2014 American Academy of Forensic Sciences.
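
    The classifier-based workflow described above can be sketched as follows; the feature values, sample sizes and labels are synthetic stand-ins, not the measurements used in the study.

        # Illustrative sketch of the classification-tree / random-forest approach
        # on synthetic saw-mark features.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(42)
        n_marks, n_saw_types = 200, 4

        # Stand-in quantitative features (e.g. kerf width, minimum striae spacing).
        X = rng.normal(size=(n_marks, 5))
        y = rng.integers(0, n_saw_types, size=n_marks)  # saw-type labels 0..3

        clf = RandomForestClassifier(n_estimators=500, random_state=0)

        # Out-of-sample error rate estimated by cross-validation, analogous to the
        # outcome error rate reported for the decision trees.
        accuracy = cross_val_score(clf, X, y, cv=5).mean()
        print(f"Estimated outcome error rate: {1 - accuracy:.2%}")

        # Weighting variables by discriminatory value corresponds to feature importances.
        clf.fit(X, y)
        print("Feature importances:", clf.feature_importances_)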

  7. THE LISBON STRATEGY: AN EMPIRICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Silvestri Marcello

    2010-07-01

    Full Text Available This paper investigates the European economic integration within the framework of the 2000 Lisbon Council, with the aim of studying the dynamics affecting the social and economic life of European countries. Such a descriptive investigation focuses on certain significant variables of the new theories, highlighting the importance of technological innovation and human capital. To this end, the multivariate statistical technique of Principal Component Analysis has been applied in order to classify countries with regard to the investigated phenomenon.
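
    A minimal sketch of the kind of Principal Component Analysis described above is given below; the country labels and indicator values are invented for illustration only.

        # Illustrative PCA positioning of countries with respect to assumed
        # innovation / human-capital indicators.
        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA

        countries = ["A", "B", "C", "D", "E"]
        # Columns: R&D intensity, tertiary education share, patents, employment rate.
        X = np.array([
            [3.2, 0.41, 250, 0.73],
            [1.1, 0.22,  40, 0.61],
            [2.4, 0.35, 120, 0.68],
            [0.8, 0.18,  15, 0.58],
            [2.9, 0.39, 200, 0.71],
        ])

        scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
        for country, (pc1, pc2) in zip(countries, scores):
            print(f"{country}: PC1={pc1:+.2f}  PC2={pc2:+.2f}")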

  8. Some connections for manuals of empirical logic to functional analysis

    International Nuclear Information System (INIS)

    Cook, T.A.

    1981-01-01

    In this informal presentation, the theory of manuals of operations is connected with some familiar concepts in functional analysis; namely, base normed and order unit normed spaces. The purpose of this discussion is to present several general open problems which display the interplay of empirical logic with functional analysis. These are mathematical problems with direct physical interpretation. (orig./HSI)

  9. Diakoptical reliability analysis of transistorized systems

    International Nuclear Information System (INIS)

    Kontoleon, J.M.; Lynn, J.W.; Green, A.E.

    1975-01-01

    Limitations both on high-speed core availability and computation time required for assessing the reliability of large-sized and complex electronic systems, such as used for the protection of nuclear reactors, are very serious restrictions which continuously confront the reliability analyst. Diakoptic methods simplify the solution of the electrical-network problem by subdividing a given network into a number of independent subnetworks and then interconnecting the solutions of these smaller parts by a systematic process involving transformations based on connection-matrix elements associated with the interconnecting links. However, the interconnection process is very complicated and it may be used only if the original system has been cut in such a manner that a relation can be established between the constraints appearing at both sides of the cut. Also, in dealing with transistorized systems, one of the difficulties encountered is that of modelling adequately their performance under various operating conditions, since their parameters are strongly affected by the imposed voltage and current levels. In this paper a new interconnection approach is presented which may be of use in the reliability analysis of large-sized transistorized systems. This is based on the partial optimization of the subdivisions of the torn network as well as on the optimization of the torn paths. The solution of the subdivisions is based on the principles of algebraic topology, with an algebraic structure relating the physical variables in a topological structure which defines the interconnection of the discrete elements. Transistors, and other nonlinear devices, are modelled using their actual characteristics, under normal and abnormal operating conditions. Use of so-called k factors is made to facilitate accounting for electrical stresses. The approach is demonstrated by way of an example. (author)

  10. Reliability Analysis Techniques for Communication Networks in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lim, T. J.; Jang, S. C.; Kang, H. G.; Kim, M. C.; Eom, H. S.; Lee, H. J.

    2006-09-01

    The objective of this project is to investigate and study existing reliability analysis techniques for communication networks in order to develop reliability analysis models for nuclear power plants' safety-critical networks. It is necessary to make a comprehensive survey of current methodologies for communication network reliability. Major outputs of this study are design characteristics of safety-critical communication networks, efficient algorithms for quantifying the reliability of communication networks, and preliminary models for assessing the reliability of safety-critical communication networks.

  11. Empirical analysis of industrial operations in Montenegro

    Directory of Open Access Journals (Sweden)

    Galić Jelena

    2012-12-01

    Full Text Available Since the start of the transition process, industrial production in Montenegro has been faced with serious problems and its share in GDP is constantly decreasing. The global financial crisis has to a large extent negatively influenced industry. Analysis of financial indicators showed that industry had significant losses, a problem of undercapitalisation and liquidity problems. If we look at industry sectors, the situation is more favourable in the production of electricity, gas and water compared to the extracting industry and mining. The paper proposes economic policy measures intended to improve the situation in industry.

  12. Stochastic reliability analysis using Fokker Planck equations

    International Nuclear Information System (INIS)

    Hari Prasad, M.; Rami Reddy, G.; Srividya, A.; Verma, A.K.

    2011-01-01

    The Fokker-Planck equation describes the time evolution of the probability density function of the velocity of a particle, and can be generalized to other observables as well. It is also known as the Kolmogorov forward equation (diffusion). Hence, for any process that evolves with time, the probability density function as a function of time can be represented with the Fokker-Planck equation. In stochastic reliability analysis one is more interested in finding out the reliability or failure probability of the components or structures as a function of time rather than instantaneous failure probabilities. In this analysis the variables are represented with random processes instead of random variables. A random process can be either stationary or non-stationary. If the random process is stationary then the failure probability does not change with time, whereas in the case of non-stationary processes the failure probability changes with time. In the present paper Fokker-Planck equations have been used to find out the probability density function of non-stationary random processes. A flow chart has been provided which describes the step-by-step process for carrying out stochastic reliability analysis using Fokker-Planck equations. As a first step one has to identify the failure function as a function of random processes. Then one has to solve the Fokker-Planck equation for each random process. In this paper the Fokker-Planck equation has been solved by using the finite difference method. As a result one gets the probability density values of the random process in the sample space as well as time space. Later, at each time step an appropriate probability distribution has to be identified based on the available probability density values. To check the goodness of fit of the data, the Kolmogorov-Smirnov test has been performed. In this way one can find out the distribution of the random process at each time step. Once one has the probability distribution
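
    A minimal numerical sketch of this procedure is given below: an assumed one-dimensional Fokker-Planck equation (Ornstein-Uhlenbeck drift) is solved by an explicit finite difference scheme, and the time-dependent probability of exceeding an assumed failure threshold is integrated from the resulting density. All parameter values are illustrative.

        # Explicit finite-difference solution of dp/dt = d/dx[theta*(x-mu)*p] + D*d2p/dx2
        # for an assumed Ornstein-Uhlenbeck process, followed by a threshold-exceedance
        # probability read off from the evolved density.
        import numpy as np

        theta, mu, sigma = 1.0, 0.0, 0.5        # drift and diffusion parameters (assumed)
        D = 0.5 * sigma**2                      # diffusion coefficient
        x = np.arange(-3.0, 3.0 + 1e-9, 0.05)   # sample-space grid
        dx = x[1] - x[0]
        dt = 0.2 * dx**2 / D                    # small step for explicit-scheme stability

        # Initial condition: the process starts near x = -1 (narrow Gaussian), normalized.
        p = np.exp(-((x + 1.0) ** 2) / (2 * 0.05**2))
        p /= p.sum() * dx

        threshold = 0.5                         # failure region x > threshold (assumed)
        drift = theta * (x - mu)

        for _ in range(int(2.0 / dt)):          # integrate the density up to t = 2
            dflux = np.zeros_like(p)
            dflux[1:-1] = (drift[2:] * p[2:] - drift[:-2] * p[:-2]) / (2 * dx)
            d2p = np.zeros_like(p)
            d2p[1:-1] = (p[2:] - 2 * p[1:-1] + p[:-2]) / dx**2
            p = np.clip(p + dt * (dflux + D * d2p), 0.0, None)
            p /= p.sum() * dx                   # renormalize numerical leakage

        failure_probability = p[x > threshold].sum() * dx
        print(f"P(X > {threshold}) at t = 2: {failure_probability:.4f}")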

  13. Structural Reliability Analysis of Wind Turbines: A Review

    Directory of Open Access Journals (Sweden)

    Zhiyu Jiang

    2017-12-01

    Full Text Available The paper presents a detailed review of state-of-the-art research activities on structural reliability analysis of wind turbines between the 1990s and 2017. We describe the reliability methods, including first- and second-order reliability methods and simulation-based reliability methods, and show the procedure for and application areas of structural reliability analysis of wind turbines. Further, we critically review the various structural reliability studies on rotor blades, bottom-fixed support structures, floating systems and mechanical and electrical components. Finally, future applications of structural reliability methods to wind turbine designs are discussed.

  14. Research review and development trends of human reliability analysis techniques

    International Nuclear Information System (INIS)

    Li Pengcheng; Chen Guohua; Zhang Li; Dai Licao

    2011-01-01

    Human reliability analysis (HRA) methods are reviewed. The theoretical basis of human reliability analysis, human error mechanisms, the key elements of HRA methods and the existing HRA methods are respectively introduced and assessed. Their shortcomings, the current research hotspots and difficult problems are identified. Finally, the paper takes a close look at trends in human reliability analysis methods. (authors)

  15. Reliability Analysis of Tubular Joints in Offshore Structures

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Sørensen, John Dalsgaard

    1987-01-01

    Reliability analysis of single tubular joints and offshore platforms with tubular joints is presented. The failure modes considered are yielding, punching, buckling and fatigue failure. Element reliability as well as systems reliability approaches are used and illustrated by several examples.... Finally, optimal design of tubular joints with reliability constraints is discussed and illustrated by an example....

  16. CRITICAL ANALYSIS OF THE RELIABILITY OF INTUITIVE MORAL DECISIONS

    Directory of Open Access Journals (Sweden)

    V. V. Nadurak

    2017-06-01

    Full Text Available The purpose of the research is a critical analysis of the reliability of intuitive moral decisions. Methodology. The work is based on the methodological attitude of empirical ethics, involving the use of findings from empirical research in ethical reflection and decision making. Originality. The main kinds of intuitive moral decisions are identified: (1) intuitively emotional decisions, i.e. decisions made under the influence of the emotions that accompany the process of moral decision making; (2) decisions made under the influence of morally risky psychological aptitudes (unconscious human tendencies that make us think in a certain way and make decisions that are unacceptable from the logical and ethical point of view); (3) intuitively normative decisions (decisions made under the influence of socially learned norms that cause an evaluative feeling of «good-bad» without conscious reasoning). It was found that all of these kinds of intuitive moral decisions can lead to mistakes in moral life. Conclusions. Considering the fact that intuition systematically leads to erroneous moral decisions, intuitive reaction cannot be the only source for making such decisions. Conscious rational reasoning can compensate for the weaknesses of intuition. In this case, there is a need for a theoretical model that would structure knowledge about the interactions between intuitive and rational factors in moral decision making and become the basis for suggestions that would help us make the right moral decision.

  17. Human Reliability Analysis For Computerized Procedures

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Gertman, David I.; Le Blanc, Katya

    2011-01-01

    This paper provides a characterization of human reliability analysis (HRA) issues for computerized procedures in nuclear power plant control rooms. It is beyond the scope of this paper to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper provides a review of HRA as applied to traditional paper-based procedures, followed by a discussion of what specific factors should additionally be considered in HRAs for computerized procedures. Performance shaping factors and failure modes unique to computerized procedures are highlighted. Since there is no definitive guide to HRA for paper-based procedures, this paper also serves to clarify the existing guidance on paper-based procedures before delving into the unique aspects of computerized procedures.

  18. Reliability Analysis of Structural Timber Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Hoffmeyer, P.

    2000-01-01

    Structural systems like timber trussed rafters and roof elements made of timber can be expected to have some degree of redundancy and nonlinear/plastic behaviour when the loading consists of for example snow or imposed load. In this paper this system effect is modelled and the statistic...... of variation. In the paper a stochastic model is described for the strength of a single piece of timber taking into account the stochastic variation of the strength and stiffness with length. Also stochastic models for different types of loads are formulated. First, simple representative systems with different...... types of redundancy and non-linearity are considered. The statistical characteristics of the load bearing capacity are determined by reliability analysis. Next, more complex systems are considered modelling the mechanical behaviour of timber roof elements / stressed skin panels made of timber. Using...

  19. Human reliability analysis of dependent events

    International Nuclear Information System (INIS)

    Swain, A.D.; Guttmann, H.E.

    1977-01-01

    In the human reliability analysis in WASH-1400, the continuous variable of degree of interaction among human events was approximated by selecting four points on this continuum to represent the entire continuum. The four points selected were identified as zero coupling (i.e., zero dependence), complete coupling (i.e., complete dependence), and two intermediate points--loose coupling (a moderate level of dependence) and tight coupling (a high level of dependence). The paper expands the WASH-1400 treatment of common mode failure due to the interaction of human activities. Mathematical expressions for the above four levels of dependence are derived for parallel and series systems. The psychological meaning of each level of dependence is illustrated by examples, with probability tree diagrams to illustrate the use of conditional probabilities resulting from the interaction of human actions in nuclear power plant tasks

  20. Reliability analysis of steel-containment strength

    International Nuclear Information System (INIS)

    Greimann, L.G.; Fanous, F.; Wold-Tinsae, A.; Ketalaar, D.; Lin, T.; Bluhm, D.

    1982-06-01

    A best estimate and uncertainty assessment of the resistance of the St. Lucie, Cherokee, Perry, WPPSS and Browns Ferry containment vessels was performed. The Monte Carlo simulation technique and second moment approach were compared as a means of calculating the probability distribution of the containment resistance. A uniform static internal pressure was used and strain ductility was taken as the failure criterion. Approximate methods were developed and calibrated with finite element analysis. Both approximate and finite element analyses were performed on the axisymmetric containment structure. An uncertainty assessment of the containment strength was then performed by the second moment reliability method. Based upon the approximate methods, the cumulative distribution for the resistance of each of the five containments (shell modes only) is presented

  1. Standardizing the practice of human reliability analysis

    International Nuclear Information System (INIS)

    Hallbert, B.P.

    1993-01-01

    The practice of human reliability analysis (HRA) within the nuclear industry varies greatly in terms of posited mechanisms that shape human performance, methods of characterizing and analytically modeling human behavior, and the techniques that are employed to estimate the frequency with which human error occurs. This variation has been a source of contention among HRA practitioners regarding the validity of results obtained from different HRA methods. It has also resulted in attempts to develop standard methods and procedures for conducting HRAs. For many of the same reasons, the practice of HRA has not been standardized or has been standardized only to the extent that individual analysts have developed heuristics and consistent approaches in their practice of HRA. From the standpoint of consumers and regulators, this has resulted in a lack of clear acceptance criteria for the assumptions, modeling, and quantification of human errors in probabilistic risk assessments

  2. A reliability analysis tool for SpaceWire network

    Science.gov (United States)

    Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou

    2017-04-01

    SpaceWire is a standard for on-board satellite networks and the basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power and fault protection. High reliability is a vital issue for spacecraft; therefore, it is very important to analyze and improve the reliability performance of the SpaceWire network. This paper deals with the problem of reliability modeling and analysis of a SpaceWire network. According to the functional division of the distributed network, a task-based reliability analysis method is proposed: the reliability analysis of every task leads to the system reliability matrix, and the reliability of the network system can be deduced by integrating all of the reliability indexes in the matrix. With this method, we develop a reliability analysis tool for SpaceWire networks based on VC, in which the computation schemes for the reliability matrix and the multi-path task reliability are also implemented. Using this tool, we analyze several cases on typical architectures, and the analytic results indicate that a redundant architecture has better reliability performance than the basic one. In practice, a dual-redundancy scheme has been adopted for some key units to improve the reliability index of the system or task. Finally, this reliability analysis tool will have a direct influence on both task division and topology selection in the SpaceWire network system design phase.
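
    The task-based calculation can be illustrated with the following minimal Python sketch, which combines assumed component reliabilities in series along each route and in parallel across redundant routes; the topology, the reliability values and the simple series/parallel combination are simplifying assumptions, not the matrix scheme or data of the paper.

        # Illustrative task-based reliability for a small SpaceWire-like network.
        from functools import reduce

        # Per-mission reliabilities of nodes, routers and links (assumed values).
        component_reliability = {
            "nodeA": 0.999, "nodeB": 0.999,
            "router1": 0.995, "router2": 0.995,
            "link1": 0.998, "link2": 0.998, "link3": 0.998, "link4": 0.998,
        }

        def series(components):
            """Series combination: every component along the route must work."""
            return reduce(lambda r, c: r * component_reliability[c], components, 1.0)

        def task_reliability(end_nodes, routes):
            """Both end nodes must work; redundant routes between them are in parallel."""
            route_unrel = 1.0
            for route in routes:
                route_unrel *= 1.0 - series(route)
            return series(end_nodes) * (1.0 - route_unrel)

        ends = ["nodeA", "nodeB"]
        route1 = ["link1", "router1", "link2"]
        route2 = ["link3", "router2", "link4"]

        basic = task_reliability(ends, [route1])               # single route per task
        redundant = task_reliability(ends, [route1, route2])   # dual-redundant routing
        print(f"basic architecture:     {basic:.5f}")
        print(f"redundant architecture: {redundant:.5f}")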

  3. Human Reliability Analysis for Small Modular Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2012-06-01

    Because no human reliability analysis (HRA) method was specifically developed for small modular reactors (SMRs), the application of any current HRA method to SMRs represents tradeoffs. A first-generation HRA method like THERP provides clearly defined activity types, but these activity types do not map to the human-system interface or concept of operations confronting SMR operators. A second-generation HRA method like ATHEANA is flexible enough to be used for SMR applications, but there is currently insufficient guidance for the analyst, requiring considerably more first-of-a-kind analyses and extensive SMR expertise in order to complete a quality HRA. Although no current HRA method is optimized to SMRs, it is possible to use existing HRA methods to identify errors, incorporate them as human failure events in the probabilistic risk assessment (PRA), and quantify them. In this paper, we provided preliminary guidance to assist the human reliability analyst and reviewer in understanding how to apply current HRA methods to the domain of SMRs. While it is possible to perform a satisfactory HRA using existing HRA methods, ultimately it is desirable to formally incorporate SMR considerations into the methods. This may require the development of new HRA methods. More practicably, existing methods need to be adapted to incorporate SMRs. Such adaptations may take the form of guidance on the complex mapping between conventional light water reactors and small modular reactors. While many behaviors and activities are shared between current plants and SMRs, the methods must adapt if they are to perform a valid and accurate analysis of plant personnel performance in SMRs.

  4. Application of parameters space analysis tools for empirical model validation

    Energy Technology Data Exchange (ETDEWEB)

    Paloma del Barrio, E. [LEPT-ENSAM UMR 8508, Talence (France); Guyon, G. [Electricite de France, Moret-sur-Loing (France)

    2004-01-01

    A new methodology for empirical model validation has been proposed in the framework of Task 22 (Building Energy Analysis Tools) of the International Energy Agency. It involves two main steps: checking model validity and diagnosis. Both steps, as well as the underlying methods, were presented in the first part of the paper. In this part, they are applied to testing modelling hypotheses in the framework of the thermal analysis of an actual building. Sensitivity analysis tools have first been used to identify the parts of the model that can really be tested on the available data. A preliminary diagnosis is then supplied by principal components analysis. Useful information for improving model behaviour has finally been obtained by optimisation techniques. This example of application shows how analysis of the model parameter space is a powerful tool for empirical validation. In particular, diagnosis possibilities are largely increased in comparison with residuals analysis techniques. (author)

  5. Task Decomposition in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory

    2014-06-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down— defined as a subset of the PSA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  6. An Empirical Bayes Approach to Mantel-Haenszel DIF Analysis.

    Science.gov (United States)

    Zwick, Rebecca; Thayer, Dorothy T.; Lewis, Charles

    1999-01-01

    Developed an empirical Bayes enhancement to Mantel-Haenszel (MH) analysis of differential item functioning (DIF) in which it is assumed that the MH statistics are normally distributed and that the prior distribution of underlying DIF parameters is also normal. (Author/SLD)
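
    A minimal sketch of this kind of normal-normal empirical Bayes shrinkage of Mantel-Haenszel D-DIF statistics is shown below; the item statistics and standard errors are invented for illustration.

        # Empirical Bayes shrinkage of observed MH D-DIF statistics toward a
        # normal prior estimated from the data (method of moments).
        import numpy as np

        mh_ddif = np.array([-1.2, 0.3, 0.8, -0.1, 1.6, 0.0])   # observed MH D-DIF per item
        se = np.array([0.45, 0.30, 0.50, 0.25, 0.60, 0.35])     # their standard errors

        # Method-of-moments estimates of the prior mean and variance of true DIF.
        prior_mean = mh_ddif.mean()
        prior_var = max(mh_ddif.var(ddof=1) - np.mean(se**2), 0.0)

        # Posterior mean shrinks each observed statistic toward the prior mean,
        # more strongly for items measured with larger sampling error.
        weight = se**2 / (se**2 + prior_var) if prior_var > 0 else np.ones_like(se)
        eb_ddif = weight * prior_mean + (1 - weight) * mh_ddif

        for i, (raw, eb) in enumerate(zip(mh_ddif, eb_ddif), start=1):
            print(f"item {i}: observed {raw:+.2f} -> empirical Bayes {eb:+.2f}")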

  7. An Empirical Analysis of the Relationship between Minimum Wage ...

    African Journals Online (AJOL)

    An Empirical Analysis of the Relationship between Minimum Wage, Investment and Economic Growth in Ghana. ... In addition, the ratio of public investment to tax revenue must increase as minimum wage increases since such complementary changes are more likely to lead to economic growth. Keywords: minimum wage ...

  8. Finite element reliability analysis of fatigue life

    International Nuclear Information System (INIS)

    Harkness, H.H.; Belytschko, T.; Liu, W.K.

    1992-01-01

    Fatigue reliability is addressed by the first-order reliability method combined with a finite element method. Two-dimensional finite element models of components with cracks in mode I are considered with crack growth treated by the Paris law. Probability density functions of the variables affecting fatigue are proposed to reflect a setting where nondestructive evaluation is used, and the Rosenblatt transformation is employed to treat non-Gaussian random variables. Comparisons of the first-order reliability results and Monte Carlo simulations suggest that the accuracy of the first-order reliability method is quite good in this setting. Results show that the upper portion of the initial crack length probability density function is crucial to reliability, which suggests that if nondestructive evaluation is used, the probability of detection curve plays a key role in reliability. (orig.)
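
    The following minimal Python sketch illustrates a Monte Carlo version of this kind of fatigue reliability calculation using the Paris law in closed form; the distributions, parameter values and geometry factor are assumptions, and the finite element and first-order reliability machinery of the paper is not reproduced.

        # Monte Carlo fatigue-crack-growth reliability with the Paris law
        # da/dN = C * (Y * dsigma * sqrt(pi * a))**m, integrated from a_0 to a_c.
        import numpy as np

        rng = np.random.default_rng(1)
        n_samples = 200_000
        design_life = 2.0e6          # cycles (assumed)

        m = 3.0                      # Paris exponent (deterministic here)
        Y = 1.12                     # geometry factor for an edge crack (assumed)
        a_c = 0.02                   # critical crack depth, m (assumed)

        # Random variables: initial crack size (reflecting NDE detection limits),
        # Paris coefficient C, and stress range.
        a_0 = rng.lognormal(mean=np.log(1.0e-3), sigma=0.3, size=n_samples)   # m
        C = rng.lognormal(mean=np.log(3.0e-12), sigma=0.4, size=n_samples)    # (m/cycle)/(MPa*sqrt(m))**m
        dsigma = rng.normal(60.0, 6.0, size=n_samples)                        # MPa

        # Closed-form integration of the Paris law for m != 2.
        exponent = 1.0 - m / 2.0
        cycles_to_failure = (a_c**exponent - a_0**exponent) / (
            C * (Y * dsigma * np.sqrt(np.pi))**m * exponent
        )

        pf = np.mean(cycles_to_failure < design_life)
        print(f"Estimated probability of fatigue failure before design life: {pf:.2e}")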

  9. Modeling human reliability analysis using MIDAS

    International Nuclear Information System (INIS)

    Boring, R. L.

    2006-01-01

    This paper documents current efforts to infuse human reliability analysis (HRA) into human performance simulation. The Idaho National Laboratory is teamed with NASA Ames Research Center to bridge the SPAR-H HRA method with NASA's Man-machine Integration Design and Analysis System (MIDAS) for use in simulating and modeling the human contribution to risk in nuclear power plant control room operations. It is anticipated that the union of MIDAS and SPAR-H will pave the path for cost-effective, timely, and valid simulated control room operators for studying current and next generation control room configurations. This paper highlights considerations for creating the dynamic HRA framework necessary for simulation, including event dependency and granularity. This paper also highlights how the SPAR-H performance shaping factors can be modeled in MIDAS across static, dynamic, and initiator conditions common to control room scenarios. This paper concludes with a discussion of the relationship of the workload factors currently in MIDAS and the performance shaping factors in SPAR-H. (authors)

  10. Reliability Analysis and Optimal Design of Monolithic Vertical Wall Breakwaters

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Burcharth, Hans F.; Christiani, E.

    1994-01-01

    Reliability analysis and reliability-based design of monolithic vertical wall breakwaters are considered. Probabilistic models of the most important failure modes, sliding failure, failure of the foundation and overturning failure are described . Relevant design variables are identified...

  11. The Interrelation among Faithful Representation (Reliability), Corruption and IFRS Adoption: An Empirical Investigation

    OpenAIRE

    Alexios Kythreotis

    2015-01-01

    Purpose – The degree of corruption, among other things, indicates the non-implementation of laws, weak enforcement of legal sanctions and the existence of non-transparent economic transactions. Therefore, the expected change in reliability (faithful representation) resulting from the adoption of IAS/IFRS does not depend solely on the adoption of IAS/IFRS but is also influenced by the degree of corruption in each country. The purpose of this paper is to examine whether the above statement is...

  12. The Interrelation among Faithful Representation (Reliability), Corruption and IFRS Adoption: An Empirical Investigation

    OpenAIRE

    Alexios Kythreotis

    2015-01-01

    Purpose – The degree of corruption, among other things, indicates the non-implementation of laws, weak enforcement of legal sanctions and the existence of non-transparent economic transactions. Therefore, the expected change in reliability (faithful representation) resulting from the adoption of IAS/IFRS does not depend solely on the adoption of IAS/IFRS but is also influenced by the degree of corruption in each country. The purpose of this paper is to examine w...

  13. Probabilistic risk assessment course documentation. Volume 3. System reliability and analysis techniques, Session A - reliability

    International Nuclear Information System (INIS)

    Lofgren, E.V.

    1985-08-01

    This course in System Reliability and Analysis Techniques focuses on the quantitative estimation of reliability at the systems level. Various methods are reviewed, but the structure provided by the fault tree method is used as the basis for system reliability estimates. The principles of fault tree analysis are briefly reviewed. Contributors to system unreliability and unavailability are reviewed, models are given for quantitative evaluation, and the requirements for both generic and plant-specific data are discussed. Also covered are issues of quantifying component faults that relate to the systems context in which the components are embedded. All reliability terms are carefully defined. 44 figs., 22 tabs
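
    The cut-set-based quantification that the course builds on can be illustrated with the short Python sketch below; the basic events, unavailabilities and minimal cut sets are invented for illustration.

        # Fault-tree quantification from minimal cut sets using the rare-event
        # (first-order) approximation and the min-cut upper bound.
        from math import prod

        basic_event_unavailability = {
            "pump_A_fails": 3e-3,
            "pump_B_fails": 3e-3,
            "valve_fails_closed": 1e-3,
            "operator_fails_to_start": 1e-2,
            "power_supply_fails": 5e-4,
        }

        # Minimal cut sets of a hypothetical top event "no injection flow".
        minimal_cut_sets = [
            {"pump_A_fails", "pump_B_fails"},
            {"valve_fails_closed"},
            {"pump_A_fails", "operator_fails_to_start"},
            {"power_supply_fails"},
        ]

        cut_set_probs = [prod(basic_event_unavailability[e] for e in cs)
                         for cs in minimal_cut_sets]

        rare_event = sum(cut_set_probs)
        upper_bound = 1.0 - prod(1.0 - p for p in cut_set_probs)
        print(f"Top event unavailability ~ {rare_event:.3e} (upper bound {upper_bound:.3e})")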

  14. HUMAN RELIABILITY ANALYSIS USING THE COGNITIVE RELIABILITY AND ERROR ANALYSIS METHOD (CREAM) APPROACH

    Directory of Open Access Journals (Sweden)

    Zahirah Alifia Maulida

    2015-01-01

    Full Text Available Work accidents in the grinding and welding areas have ranked highest over the last five years at PT. X. These accidents were caused by human error, which occurs under the influence of the physical and non-physical working environment. The present study uses scenarios to predict and reduce the likelihood of human error using the CREAM (Cognitive Reliability and Error Analysis Method) approach. CREAM is a human reliability analysis method used to obtain the Cognitive Failure Probability (CFP), which can be determined in two ways: the basic method and the extended method. The basic method yields only a general failure probability, whereas the extended method yields a CFP for each task. The results show that the factors influencing the occurrence of error in grinding and welding work are the adequacy of the organisation, the adequacy of the Man-Machine Interface (MMI) and operational support, the availability of procedures/plans, and the adequacy of training and experience. The cognitive aspect with the highest error value in grinding work is planning, with a CFP of 0.3, while in welding work it is the cognitive aspect of execution, with a CFP of 0.18. To reduce the cognitive error values in grinding and welding work, the recommendations given are routine training, more detailed work instructions and equipment familiarisation. Keywords: CREAM (cognitive reliability and error analysis method), HRA (human reliability analysis), cognitive error. Abstract: The accidents in the grinding and welding sectors were the highest cases over the last five years at PT. X and were caused by human error. Human error occurs due to the influence of the working environment, both physical and non-physical. This study implements a scenario-based approach using CREAM (Cognitive Reliability and Error Analysis Method). CREAM is one of human

  15. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    Poucet, A.; Guagnini, E.

    1989-01-01

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of Computer Aided Reliability Analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a Computer Aided Reliability Analysis tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule-based expert system is used, in which the deductive (goal-driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst can become aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved

  16. Space Mission Human Reliability Analysis (HRA) Project

    Science.gov (United States)

    Boyer, Roger

    2014-01-01

    The purpose of the Space Mission Human Reliability Analysis (HRA) Project is to extend current ground-based HRA risk prediction techniques to a long-duration, space-based tool. Ground-based HRA methodology has been shown to be a reasonable tool for short-duration space missions, such as Space Shuttle and lunar fly-bys. However, longer-duration deep-space missions, such as asteroid and Mars missions, will require the crew to be in space for as long as 400 to 900 days, with periods of extended autonomy and self-sufficiency. Current indications show that higher risk due to fatigue, physiological effects of extended low-gravity environments, and other factors may impact HRA predictions. For this project, Safety & Mission Assurance (S&MA) will work with Human Health & Performance (HH&P) to establish what is currently used to assess human reliability for human space programs, identify human performance factors that may be sensitive to long-duration space flight, collect available historical data, and update current tools to account for performance shaping factors believed to be important to such missions. This effort will also contribute data to the Human Performance Data Repository and influence the Space Human Factors Engineering research risks and gaps (part of the HRP Program). An accurate risk predictor mitigates Loss of Crew (LOC) and Loss of Mission (LOM). The end result will be an updated HRA model that can effectively predict risk on long-duration missions.

  17. Individual Differences in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe; Ronald L. Boring

    2014-06-01

    While human reliability analysis (HRA) methods include uncertainty in quantification, the nominal model of human error in HRA typically assumes that operator performance does not vary significantly when they are given the same initiating event, indicators, procedures, and training, and that any differences in operator performance are simply aleatory (i.e., random). While this assumption generally holds true when performing routine actions, variability in operator response has been observed in multiple studies, especially in complex situations that go beyond training and procedures. As such, complexity can lead to differences in operator performance (e.g., operator understanding and decision-making). Furthermore, psychological research has shown that there are a number of known antecedents (i.e., attributable causes) that consistently contribute to observable and systematically measurable (i.e., not random) differences in behavior. This paper reviews examples of individual differences taken from operational experience and the psychological literature. The impact of these differences in human behavior and their implications for HRA are then discussed. We propose that individual differences should not be treated as aleatory, but rather as epistemic. Ultimately, by understanding the sources of individual differences, it is possible to remove some epistemic uncertainty from analyses.

  18. Weibull distribution in reliability data analysis in nuclear power plant

    International Nuclear Information System (INIS)

    Ma Yingfei; Zhang Zhijian; Zhang Min; Zheng Gangyang

    2015-01-01

    Reliability is an important issue affecting each stage of the life cycle, from birth to death of a product or system. Reliability engineering includes equipment failure data processing, quantitative assessment of system reliability, maintenance, etc. Reliability data refers to the variety of data that describe the reliability of a system or component during its operation. These data may be in the form of numbers, graphics, symbols, texts and curves. Quantitative reliability assessment is the task of reliability data analysis; it provides the information needed to prevent, detect and correct defects in the reliability design. Reliability data analysis proceeds through the various stages of the product life cycle and the associated reliability activities. Reliability data for Systems, Structures and Components (SSCs) in nuclear power plants is a key factor in probabilistic safety assessment (PSA), reliability-centered maintenance and life cycle management. The Weibull distribution is widely used in reliability engineering, failure analysis and industrial engineering, for example to represent manufacturing and delivery times. It is commonly used to model time to failure, time to repair and material strength. In this paper, an improved Weibull distribution is introduced to analyze the reliability data of SSCs in nuclear power plants, and an example is given to present the result of the new method. The Weibull distribution has a very strong ability to fit reliability data for mechanical equipment in nuclear power plants and is a widely used mathematical model for reliability analysis. The methods commonly used at present are the two-parameter and three-parameter Weibull distributions. Through comparison and analysis, the three-parameter Weibull distribution fits the data better; it can reflect the reliability characteristics of the equipment and is more realistic to the actual situation. (author)
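
    A minimal sketch of fitting and comparing two- and three-parameter Weibull models with scipy is shown below; the synthetic times to failure and the assumed failure-free period are illustrative only.

        # Two-parameter (location fixed at zero) versus three-parameter Weibull fits.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        # Synthetic times-to-failure with a failure-free period of ~500 h (location shift).
        ttf = 500.0 + stats.weibull_min.rvs(c=1.8, scale=2000.0, size=60, random_state=rng)

        # Two-parameter fit: shape and scale only.
        shape2, loc2, scale2 = stats.weibull_min.fit(ttf, floc=0.0)
        # Three-parameter fit: the location (failure-free time) is estimated as well;
        # note that the three-parameter MLE can be numerically delicate.
        shape3, loc3, scale3 = stats.weibull_min.fit(ttf)

        for name, (c, loc, scale) in [("2-parameter", (shape2, loc2, scale2)),
                                      ("3-parameter", (shape3, loc3, scale3))]:
            loglik = np.sum(stats.weibull_min.logpdf(ttf, c, loc=loc, scale=scale))
            print(f"{name}: shape={c:.2f} loc={loc:.0f} scale={scale:.0f} loglik={loglik:.1f}")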

  19. RELIABILITY ANALYSIS OF POWER DISTRIBUTION SYSTEMS

    Directory of Open Access Journals (Sweden)

    Popescu V.S.

    2012-04-01

    Full Text Available Power distribution systems are basic parts of power systems, and the reliability of these systems is at present a key issue for power engineering development and requires special attention. Operation of distribution systems is accompanied by a number of random factors that produce a large number of unplanned interruptions. Research has shown that the predominant factors that have a significant influence on the reliability of distribution systems are: weather conditions (39.7%), defects in equipment (25%) and unknown random factors (20.1%). The article studies the influence of this random behaviour and presents reliability estimates for predominantly rural electrical distribution systems.

  20. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.

  1. Empiric reliability of diagnostic and prognostic estimations of physical standards of children, going in for sports.

    Directory of Open Access Journals (Sweden)

    Zaporozhanov V.A.

    2012-12-01

    Full Text Available In sport-pedagogical practice, objective estimation of the potential of athletes already at the initial stages of long-term preparation is regarded as a topical issue. Appropriate quantitative information allows the preparation of athletes to be individualized according to the requirements of the processes being managed. Research purpose: to demonstrate, logically and metrologically, the expedience of a metrological method for calculating the reliability of control-measurement results used for the diagnostics of psychophysical fitness and the prognosis of growth in skill of athletes in the selected sport. Material and methods. The results of control measurements on four indexes of psychophysical preparedness, and expert estimations of fitness, of 24 children of a gymnastics school were analysed. The results of the initial and final examinations of the gymnasts on the same control tests were processed by methods of mathematical statistics. Metrological estimates of the reliability of the measurements (stability, consistency and informativeness of the control information) were calculated for current diagnostics and prognosis of the sporting potential of those examined. Results. The expedience of using metrological calculations of a complex estimate of the psychophysical state of athletes for these purposes is metrologically substantiated. Conclusions. The research results confirm the expedience of calculating a complex estimate of the psychophysical characteristics of athletes for the diagnostics of fitness in the selected sport and for the prognosis of skill at subsequent stages of preparation.

  2. Empirical Research Concerning the Impact of the Public Internal Audit on the Accounting System and its Reliability in Romanian Universities

    Directory of Open Access Journals (Sweden)

    Drăguşin Cristina-Petrina

    2016-12-01

    Full Text Available The present paper takes the form of an empirical study concerning the impact of internal audit on the accounting system and its reliability in the case of public universities in Romania. In order to carry out the study, it was necessary to capture the different points of view of representatives of the accounting departments of public institutions of academic education, using a statistical survey based on a questionnaire. The research objectives were focused on obtaining conclusions regarding: the importance of internal auditing of the accounting system and its reliability; the extent to which the internal audit manages to provide reasonable assurance regarding the accounting and financial activity; the importance in auditing of the items related to the accounting activity; the assurance and adequacy of the human resources allocated to the internal audit departments; the frequency with which draft internal audit reports are modified in order to follow the recommendations of the audited structure; the extent to which the audit reports reflect reality; and the contribution of the internal audit activity to improving the accounting systems and their reliability in Romanian universities.

  3. Reliability in perceptual analysis of voice quality.

    Science.gov (United States)

    Bele, Irene Velsvik

    2005-12-01

    This study focuses on speaking voice quality in male teachers (n = 35) and male actors (n = 36), who represent untrained and trained voice users, because we wanted to investigate normal and supranormal voices. In this study, both substantive and methodological aspects were considered. It includes a method for perceptual voice evaluation, and a basic issue was rater reliability. A listening group of 10 listeners (7 experienced speech-language therapists and 3 speech-language therapy students) evaluated the voices on 15 vocal characteristics using VA scales. Two sets of voice signals were investigated: text reading (2 loudness levels) and sustained vowel (3 levels). The results indicated a high interrater reliability for most perceptual characteristics. Connected speech was evaluated more reliably, especially at the normal level, but both types of voice signals were evaluated reliably, although the reliability for connected speech was somewhat higher than for vowels. Experienced listeners tended to be more consistent in their ratings than did the student raters. Some vocal characteristics achieved acceptable reliability even with a smaller panel of listeners. The perceptual characteristics, grouped into 4 factors, reflected perceptual dimensions.

  4. Reliability Analysis of Fatigue Fracture of Wind Turbine Drivetrain Components

    DEFF Research Database (Denmark)

    Berzonskis, Arvydas; Sørensen, John Dalsgaard

    2016-01-01

    in the volume of the casted ductile iron main shaft, on the reliability of the component. The probabilistic reliability analysis conducted is based on fracture mechanics models. Additionally, the utilization of the probabilistic reliability for operation and maintenance planning and quality control is discussed....

  5. Component reliability analysis for development of component reliability DB of Korean standard NPPs

    International Nuclear Information System (INIS)

    Choi, S. Y.; Han, S. H.; Kim, S. H.

    2002-01-01

    Reliability data for Korean NPPs that reflect plant-specific characteristics are necessary for PSA and risk-informed applications. We have performed a project to develop a component reliability DB and to calculate component reliability measures such as failure rate and unavailability. We have collected the component operation data and failure/repair data of Korean standard NPPs, and have analyzed the failure data by developing a data analysis method which incorporates the domestic data situation. We have then compared the reliability results with generic data for foreign NPPs

  6. Mathematical Methods in Survival Analysis, Reliability and Quality of Life

    CERN Document Server

    Huber, Catherine; Mesbah, Mounir

    2008-01-01

    Reliability and survival analysis are important applications of stochastic mathematics (probability, statistics and stochastic processes) that are usually covered separately in spite of the similarity of the involved mathematical theory. This title aims to redress this situation: it includes 21 chapters divided into four parts: Survival analysis, Reliability, Quality of life, and Related topics. Many of these chapters were presented at the European Seminar on Mathematical Methods for Survival Analysis, Reliability and Quality of Life in 2006.

  7. Reliability demonstration test planning using bayesian analysis

    International Nuclear Information System (INIS)

    Chandran, Senthil Kumar; Arul, John A.

    2003-01-01

    In nuclear power plants, the reliability of all the safety systems is critical from the safety viewpoint, and it is essential that the required reliability targets be met while satisfying the design constraints. Practical experience shows that the reliability of complex systems such as the Safety Rod Drive Mechanism is of the order of 10^-4 with an uncertainty factor of 10. Demonstrating the reliability of such systems is prohibitive in terms of cost and time, as the number of tests needed is very large. The purpose of this paper is to develop a Bayesian reliability demonstration testing procedure for exponentially distributed failure times with a gamma prior distribution on the failure rate, which can be easily and effectively used to demonstrate component/subsystem/system reliability conformance to stated requirements. The important questions addressed are: with zero failures, how long should the tests be performed and how many components are required to conclude, with a given degree of confidence, that the component under test meets the reliability requirement? The procedure is explained with an example and can also be extended to cases with a larger number of failures. The approach presented is applicable for deriving test plans for demonstrating component failure rates of nuclear power plants, as failure data for similar components are becoming available in existing plants elsewhere. The advantages of this procedure are that the criterion upon which it is based is simple and pertinent, that the fitting of the prior distribution is an integral part of the procedure and is based on information regarding two percentiles of this distribution, and that the procedure is straightforward and easy to apply in practice. (author)
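
    A minimal Python sketch of this type of zero-failure demonstration test plan is given below; the prior parameters, required failure rate and confidence level are assumed values chosen only to illustrate the calculation.

        # Bayesian demonstration test planning: exponential lifetimes, gamma prior
        # on the failure rate, zero observed failures.
        from scipy import stats

        lambda_req = 1.0e-4              # required failure rate per hour (assumed)
        confidence = 0.90                # required posterior probability that lambda <= lambda_req
        a_prior, b_prior = 0.5, 1000.0   # gamma prior shape and rate (hours); assumed percentile fit

        def posterior_confidence(total_test_hours, failures=0):
            """P(lambda <= lambda_req | data) for the gamma-exponential model."""
            a_post = a_prior + failures
            b_post = b_prior + total_test_hours
            return stats.gamma.cdf(lambda_req, a_post, scale=1.0 / b_post)

        # Smallest cumulative failure-free test time that demonstrates the requirement.
        T = 0.0
        while posterior_confidence(T) < confidence:
            T += 100.0
        print(f"Required failure-free test time: about {T:.0f} unit-hours")
        print(f"e.g. {T/10:.0f} hours on each of 10 units, assuming independence")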

  8. Reliability analysis of pipe whip impacts

    International Nuclear Information System (INIS)

    Alzbutas, R.; Dundulis, G.; Kulak, R.F.; Marchertas, P.V.

    2003-01-01

    A probabilistic analysis of a group distribution header (GDH) guillotine break and the damage resulting from the failed GDH impacting against a neighbouring wall was carried out for the Ignalina RBMK-1500 reactor. The NEPTUNE software system was used for the deterministic transient analysis of a GDH guillotine break. Many deterministic analyses were performed using different values of the random variables that were specified by ProFES software. All the deterministic results were transferred to the ProFES system, which then performed probabilistic analyses of piping failure and wall damage. The Monte Carlo Simulation (MCS) method was used to study the sensitivity of the response variables and the effect of uncertainties in material properties and geometry parameters on the probability of limit states. The First Order Reliability Method (FORM) was used to study the probability of failure of the impacted wall and the support wall. The Response Surface (RS/MCS) method was used in order to express the failure probability as a function and to investigate the dependence between impact load and failure probability. The results of the probability analyses for a whipping GDH impacting onto an adjacent wall show that: (i) there is a 0.982 probability that after a GDH guillotine break contact between GDH and wall will occur; (ii) there is a probability of 0.013 that the ultimate tensile strength of concrete at the impact location will be reached, and a through-crack may open; (iii) there is a probability of 0.0126 that the ultimate compressive strength of concrete at the GDH support location will be reached, and the concrete may fail; (iv) at the impact location in the adjacent wall, there is a probability of 0.327 that the ultimate tensile strength of the rebars in the first layer will be reached and the rebars will fail; (v) at the GDH support location, there is a probability of 0.11 that the ultimate stress of the rebars in the first layer will be reached and the rebars will fail

  9. Victim countries of transnational terrorism: an empirical characteristics analysis.

    Science.gov (United States)

    Elbakidze, Levan; Jin, Yanhong

    2012-12-01

    This study empirically investigates the association between country-level socioeconomic characteristics and risk of being victimized in transnational terrorism events. We find that a country's annual financial contribution to the U.N. general operating budget has a positive association with the frequency of being victimized in transnational terrorism events. In addition, per capita GDP, political freedom, and openness to trade are nonlinearly related to the frequency of being victimized in transnational terrorism events. © 2012 Society for Risk Analysis.

  10. Islamic banks and profitability: an empirical analysis of Indonesian banking

    OpenAIRE

    Jordan, Sarah

    2013-01-01

    This paper provides an empirical analysis of the factors that determine the profitability of Indonesian banks between the years 2006 and 2012. In particular, it investigates whether there are any significant differences in terms of profitability between Islamic banks and commercial banks. The results, obtained by applying the system-GMM estimator to the panel of 54 banks, indicate that the high bank profitability during these years was determined mainly by the size of the banks, the market share...

  11. Reliability Analysis for Safety Grade PLC(POSAFE-Q)

    International Nuclear Information System (INIS)

    Choi, Kyung Chul; Song, Seung Whan; Park, Gang Min; Hwang, Sung Jae

    2012-01-01

    Safety Grade PLC (Programmable Logic Controller) POSAFE-Q was developed recently in accordance with nuclear regulatory requirements. This paper describes the reliability analysis for a digital safety-grade PLC (specifically POSAFE-Q). The scope of the reliability analysis covers prediction, calculation of MTBF (Mean Time Between Failures), FMEA (Failure Mode and Effects Analysis), and PFD (Probability of Failure on Demand). (author)
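
    To give a flavour of the quantities involved, the short Python sketch below computes an MTBF from a parts-count prediction and an average PFD for a single periodically proof-tested channel; the module failure rates, dangerous-failure fractions and test interval are assumed values, not POSAFE-Q data.

        # Parts-count MTBF and a standard single-channel (1oo1) low-demand PFD
        # approximation, PFDavg ~ lambda_DU * TI / 2.
        module_failure_rates_per_hour = {
            "processor_module": 2.0e-6,
            "digital_input_module": 1.2e-6,
            "digital_output_module": 1.5e-6,
            "power_supply": 1.0e-6,
        }

        lambda_total = sum(module_failure_rates_per_hour.values())   # series system
        mtbf_hours = 1.0 / lambda_total

        # Assume 30 % of failures are dangerous and half of those are undetected.
        lambda_dangerous_undetected = 0.30 * 0.5 * lambda_total
        proof_test_interval = 8760.0                                  # hours (1 year)

        pfd_avg = lambda_dangerous_undetected * proof_test_interval / 2.0

        print(f"Predicted MTBF: {mtbf_hours:,.0f} hours")
        print(f"Average PFD (1oo1, annual proof test): {pfd_avg:.2e}")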

  12. Explaining Innovation. An Empirical Analysis of Industry Data from Norway

    Directory of Open Access Journals (Sweden)

    Torbjørn Lorentzen

    2016-01-01

    Full Text Available The objective of the paper is to analyse why some firms innovate while others do not. The paper combines different theories of innovation by relating innovation to internal, firm specific assets and external, regional factors. Hypotheses are derived from theories and tested empirically by using logistic regression. The empirical analysis indicates that internal funding of R&D and size of the firm are the most important firm specific attributes for successful innovation. External, regional factors are also important. The analysis shows that firms located in large urban regions have significantly higher innovation rates than firms located in the periphery, and firms involved in regional networking are more likely to innovate compared to firms not involved in networking. The analysis contributes to a theoretical and empirical understanding of factors that influence on innovation and the role innovation plays in the market economy. Innovation policy should be targeted at developing a tax system and building infrastructure which give firms incentives to invest and allocate internal resources to R&D-activities and collaborate with others in innovation. From an economic policy perspective, consideration should be given to allocating more public resources to rural areas in order to compensate for the asymmetric distribution of resources between the centre and periphery. The paper contributes to the scientific literature of innovation by combining the firm oriented perspective with weight on firm specific, internal resources and a system perspective which focuses on external resources and networking as the most important determinants of innovation in firms.

  13. Probabilistic safety analysis and human reliability analysis. Proceedings. Working material

    International Nuclear Information System (INIS)

    1996-01-01

    An international meeting on Probabilistic Safety Assessment (PSA) and Human Reliability Analysis (HRA) was jointly organized by Electricite de France - Research and Development (EDF DER) and SRI International in co-ordination with the International Atomic Energy Agency. The meeting was held in Paris 21-23 November 1994. A group of international and French specialists in PSA and HRA participated at the meeting and discussed the state of the art and current trends in the following six topics: PSA Methodology; PSA Applications; From PSA to Dependability; Incident Analysis; Safety Indicators; Human Reliability. For each topic a background paper was prepared by EDF/DER and reviewed by the international group of specialists who attended the meeting. The results of this meeting provide a comprehensive overview of the most important questions related to the readiness of PSA for specific uses and areas where further research and development is required. Refs, figs, tabs

  14. Probabilistic safety analysis and human reliability analysis. Proceedings. Working material

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    An international meeting on Probabilistic Safety Assessment (PSA) and Human Reliability Analysis (HRA) was jointly organized by Electricite de France - Research and Development (EDF DER) and SRI International in co-ordination with the International Atomic Energy Agency. The meeting was held in Paris 21-23 November 1994. A group of international and French specialists in PSA and HRA participated at the meeting and discussed the state of the art and current trends in the following six topics: PSA Methodology; PSA Applications; From PSA to Dependability; Incident Analysis; Safety Indicators; Human Reliability. For each topic a background paper was prepared by EDF/DER and reviewed by the international group of specialists who attended the meeting. The results of this meeting provide a comprehensive overview of the most important questions related to the readiness of PSA for specific uses and areas where further research and development is required. Refs, figs, tabs.

  15. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Laurids Boring

    2010-11-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  16. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    International Nuclear Information System (INIS)

    Boring, Ronald Laurids

    2010-01-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  17. TIGER reliability analysis in the DSN

    Science.gov (United States)

    Gunn, J. M.

    1982-01-01

    The TIGER algorithm, the inputs to the program and the output are described. TIGER is a computer program designed to simulate a system over a period of time to evaluate system reliability and availability. Results can be used in the Deep Space Network for initial spares provisioning and system evaluation.

  18. Reliability analysis of an offshore structure

    DEFF Research Database (Denmark)

    Sorensen, J. D.; Faber, M. H.; Thoft-Christensen, P.

    1992-01-01

    A jacket type offshore structure from the North Sea is considered. The time variant reliability is estimated for failure defined as brittle fracture and crack through the tubular member walls. The stochastic modelling is described. The hot spot stress spectral moments as function of the stochasti...

  19. Reliability analysis of reactor protection systems

    International Nuclear Information System (INIS)

    Alsan, S.

    1976-07-01

    A theoretical mathematical study of reliability is presented and the concepts subsequently defined applied to the study of nuclear reactor safety systems. The theory is applied to investigations of the operational reliability of the Siloe reactor from the point of view of rod drop. A statistical study conducted between 1964 and 1971 demonstrated that most rod drop incidents arose from circumstances associated with experimental equipment (new set-ups). The reliability of the most suitable safety system for some recently developed experimental equipment is discussed. Calculations indicate that if all experimental equipment were equipped with these new systems, only 1.75 rod drop accidents would be expected to occur per year on average. It is suggested that all experimental equipment should be equipped with these new safety systems and tested every 21 days. The reliability of the new safety system currently being studied for the Siloe reactor was also investigated. The following results were obtained: definite failures must be detected immediately as a result of the disturbances produced; the repair time must not exceed a few hours; the equipment must be tested every week. Under such conditions, the rate of accidental rod drops is about 0.013 on average per year. The level of nondefinite failures is less than 10⁻⁶ per hour and the level of nonprotection is 1 hour per year. (author)

  20. Bypassing BDD Construction for Reliability Analysis

    DEFF Research Database (Denmark)

    Williams, Poul Frederick; Nikolskaia, Macha; Rauzy, Antoine

    2000-01-01

    In this note, we propose a Boolean Expression Diagram (BED)-based algorithm to compute the minimal p-cuts of boolean reliability models such as fault trees. BEDs make it possible to bypass the Binary Decision Diagram (BDD) construction, which is the main cost of fault tree assessment....
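    For orientation, the quantity such tools ultimately deliver is the set of minimal cut sets of a fault tree. The sketch below computes them by straightforward top-down expansion on a toy tree; it is not the BED/BDD algorithm of the paper, and the gate structure is an assumption for illustration.

```python
from itertools import product

# A small fault tree, illustrative only (not from the paper):
# TOP = AND(G1, G2), G1 = OR(A, B), G2 = OR(B, C)
gates = {
    "TOP": ("AND", ["G1", "G2"]),
    "G1":  ("OR",  ["A", "B"]),
    "G2":  ("OR",  ["B", "C"]),
}

def cut_sets(node):
    """Return the cut sets of `node` as a list of frozensets of basic events."""
    if node not in gates:                       # basic event
        return [frozenset([node])]
    op, children = gates[node]
    child_sets = [cut_sets(c) for c in children]
    if op == "OR":                              # union of the children's cut sets
        return [cs for sets in child_sets for cs in sets]
    # AND: every combination of one cut set per child, merged
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimal(sets):
    """Drop any cut set that is a proper superset of another (minimality)."""
    unique = set(sets)
    return [s for s in unique if not any(t < s for t in unique)]

print(sorted(minimal(cut_sets("TOP")), key=lambda s: (len(s), sorted(s))))
# Expected minimal cut sets: {B} and {A, C}
```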

  1. A methodology to incorporate organizational factors into human reliability analysis

    International Nuclear Information System (INIS)

    Li Pengcheng; Chen Guohua; Zhang Li; Xiao Dongsheng

    2010-01-01

    A new holistic methodology for Human Reliability Analysis (HRA) is proposed to model the effects of organizational factors on human reliability. First, a conceptual framework is built, which is used to analyze the causal relationships between organizational factors and human reliability. Then, the inference model for Human Reliability Analysis is built by combining the conceptual framework with Bayesian networks, which is used to execute causal inference and diagnostic inference of human reliability. Finally, a case example is presented to demonstrate the specific application of the proposed methodology. The results show that the proposed methodology of combining the conceptual model with Bayesian networks not only makes it easy to model the causal relationship between organizational factors and human reliability, but also, in a given context, allows the human operational reliability to be measured quantitatively and the most likely root causes of human error, or their prioritization, to be identified. (authors)
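    A minimal sketch of the causal and diagnostic inference the methodology describes, using a three-node chain (organizational factor, performance shaping condition, human error) evaluated by plain enumeration; all conditional probability values are assumptions for illustration, not taken from the paper.

```python
from itertools import product

# Illustrative conditional probability tables (assumed numbers, not from the paper):
# O: organizational factor deficient, S: procedure quality poor, E: human error
p_O = {True: 0.2, False: 0.8}
p_S_given_O = {True: {True: 0.7, False: 0.3}, False: {True: 0.1, False: 0.9}}
p_E_given_S = {True: {True: 0.05, False: 0.95}, False: {True: 0.005, False: 0.995}}

def joint(o, s, e):
    # Chain factorisation P(O) * P(S|O) * P(E|S)
    return p_O[o] * p_S_given_O[o][s] * p_E_given_S[s][e]

# Causal inference: P(human error), marginalised over O and S
p_error = sum(joint(o, s, True) for o, s in product([True, False], repeat=2))

# Diagnostic inference: P(organizational factor deficient | human error observed)
p_O_given_error = sum(joint(True, s, True) for s in [True, False]) / p_error

print(f"P(error)              = {p_error:.4f}")
print(f"P(O deficient | error) = {p_O_given_error:.3f}")
```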

  2. Mechanical reliability analysis of tubes intended for hydrocarbons

    Energy Technology Data Exchange (ETDEWEB)

    Nahal, Mourad; Khelif, Rabia [Badji Mokhtar University, Annaba (Algeria)

    2013-02-15

    Reliability analysis constitutes an essential phase in any study concerning reliability. Many industrialists evaluate and improve the reliability of their products during the development cycle - from design to startup (design, manufacture, and exploitation) - to develop their knowledge of the cost/reliability ratio and to control sources of failure. In this study, we obtain results for hardness, tensile, and hydrostatic tests carried out on steel tubes for transporting hydrocarbons, followed by statistical analysis. The results obtained allow us to conduct a reliability study based on the resistance-demand (stress-strength) approach. The reliability index is thus calculated and the importance of the variables related to the tube is presented. A reliability-based assessment of residual stress effects is applied to underground pipelines under a roadway, with and without active corrosion. Residual stress has been found to greatly increase the probability of failure, especially in the early stages of pipe lifetime.
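    A minimal sketch of the resistance-demand (stress-strength) reliability calculation referred to above, assuming normally distributed resistance and stress; the numerical values are placeholders, not the measured tube data.

```python
from math import sqrt
from statistics import NormalDist

# Illustrative stress-strength (load-resistance) data, assumed normal,
# not the measured tube data from the paper:
mu_R, sigma_R = 480.0, 35.0   # resistance, e.g. hoop strength (MPa)
mu_S, sigma_S = 320.0, 40.0   # stress demand from internal pressure (MPa)

# Cornell reliability index for the linear limit state g = R - S
beta = (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)
pf = NormalDist().cdf(-beta)  # probability of failure

print(f"beta = {beta:.2f}, Pf = {pf:.2e}")
```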

  3. Reliability analysis of RC containment structures under combined loads

    International Nuclear Information System (INIS)

    Hwang, H.; Reich, M.; Kagami, S.

    1984-01-01

    This paper discusses a reliability analysis method and load combination design criteria for reinforced concrete containment structures under combined loads. The probability based reliability analysis method is briefly described. For load combination design criteria, derivations of the load factors for accidental pressure due to a design basis accident and safe shutdown earthquake (SSE) for three target limit state probabilities are presented

  4. Reliability analysis of PLC safety equipment

    Energy Technology Data Exchange (ETDEWEB)

    Yu, J.; Kim, J. Y. [Chungnam Nat. Univ., Daejeon (Korea, Republic of)

    2006-06-15

    The work covers FMEA for the nuclear safety grade PLC, failure rate prediction for the nuclear safety grade PLC, sensitivity analysis of component failure rates for the nuclear safety grade PLC, and unavailability analysis support for the nuclear safety system.

  5. Reliability analysis of PLC safety equipment

    International Nuclear Information System (INIS)

    Yu, J.; Kim, J. Y.

    2006-06-01

    The work covers FMEA for the nuclear safety grade PLC, failure rate prediction for the nuclear safety grade PLC, sensitivity analysis of component failure rates for the nuclear safety grade PLC, and unavailability analysis support for the nuclear safety system.

  6. Reliability Analysis of Dynamic Stability in Waves

    DEFF Research Database (Denmark)

    Søborg, Anders Veldt

    2004-01-01

    The assessment of a ship's intact stability is traditionally based on a semi-empirical deterministic concept that evaluates the characteristics of a ship's calm water restoring leverarm curves. Today the ship is considered safe with respect to dynamic stability if its calm water leverarm curves exhibit sufficient characteristics with respect to slope at zero heel (GM value), maximum leverarm, positive range of stability and area below the leverarm curve. The rule-based requirements to calm water leverarm curves are entirely based on experience obtained from vessels in operation and recorded accidents in the past. The rules therefore leave only little room for evaluation and improvement of the safety of a ship's dynamic stability. A few studies have evaluated the probability of ship stability loss in waves using Monte Carlo simulations. However, since this probability may be in the order of 10...

  7. EMPIRICAL ANALYSIS OF REMITTANCE INFLOW: THE CASE OF NEPAL

    Directory of Open Access Journals (Sweden)

    Karan Singh Thagunna

    2013-01-01

    Full Text Available This paper analyzes nine years of remittance inflow and macroeconomic data for Nepal and studies the effect of remittances on each of those macroeconomic variables. We use the unit root test, least squares regression analysis, and the Granger causality test. The empirical results suggest that remittances exert more causality on the consumption pattern as well as the import pattern, and less on investment. Furthermore, drawing on the available literature, this paper discusses the importance of channeling remittance funds into productive capital, mainly public infrastructure, in comparison with the South Korean case study.

  8. Food labelled Information: An Empirical Analysis of Consumer Preferences

    Directory of Open Access Journals (Sweden)

    Alessandro Banterle

    2012-12-01

    Full Text Available This paper aims at analysing which kinds of currently labelled information are of interest and actually used by consumers, and which additional kinds could improve consumer choices. We investigate the attitude of consumers with respect to innovative strategies for the diffusion of product information, such as smart labels for mobile phones. The empirical analysis was organised as focus groups followed by a survey of 240 consumers. Results show that the most important nutritional claims are vitamins, energy and fat content. Consumers show a high interest in the origin of the products, GMOs, environmental impact, animal welfare and type of breeding.

  9. Advances in methods and applications of reliability and safety analysis

    International Nuclear Information System (INIS)

    Fieandt, J.; Hossi, H.; Laakso, K.; Lyytikaeinen, A.; Niemelae, I.; Pulkkinen, U.; Pulli, T.

    1986-01-01

    The know-how in reliability and safety design and analysis techniques at VTT has been established over several years of analyzing reliability at the Finnish nuclear power plants Loviisa and Olkiluoto. This experience has later been applied and developed for use in the process industry, conventional power industry, automation and electronics. VTT develops and transfers methods and tools for reliability and safety analysis to the private and public sectors. The technology transfer takes place in joint development projects with potential users. Several computer-aided methods, such as RELVEC for reliability modelling and analysis, have been developed. The tools developed are today used by major Finnish companies in the fields of automation, nuclear power, shipbuilding and electronics. Development of computer-aided and other methods needed in the analysis of operating experience, reliability or safety continues in a number of research and development projects.

  10. Digital Processor Module Reliability Analysis of Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lee, Sang Yong; Jung, Jae Hyun; Kim, Jae Ho; Kim, Sung Hun

    2005-01-01

    Systems used in plants, military equipment, satellites, etc. consist of many electronic parts in their control modules, which require higher reliability than other commercial electronic products. In particular, nuclear power plants, because of radiation safety, require high safety and reliability, so most parts are applied at Military-Standard level. Reliability prediction provides a rational basis for system designs and also indicates the safety significance of system operations. Thus various reliability prediction tools have been developed in recent decades; among them, the MIL-HDBK-217 method has been widely used as a powerful tool for prediction. In this work, the reliability analysis of the Digital Processor Module (DPM, the control module of SMART) is performed by the parts stress method based on MIL-HDBK-217F Notice 2. Relex 7.6 from Relex Software Corporation is used, because the reliability analysis process requires extensive part libraries and data for failure rate calculation.
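    A schematic parts stress roll-up in the spirit of MIL-HDBK-217F is sketched below; the real handbook uses part-specific models and factor tables, so the part list, base failure rates and pi factors here are placeholders rather than handbook values or DPM data.

```python
# Illustrative parts-stress roll-up in the spirit of MIL-HDBK-217F.
# The numbers below are placeholders, not actual handbook values or DPM data.
parts = [
    # (name, base failure rate lambda_b [failures per 1e6 h], pi_T, pi_Q, pi_E)
    ("microprocessor", 0.075, 1.8, 1.0, 2.0),
    ("sram",           0.050, 1.5, 1.0, 2.0),
    ("voltage_reg",    0.020, 2.0, 1.0, 2.0),
    ("connector",      0.010, 1.0, 2.0, 2.0),
]

# Part failure rate lambda_p = lambda_b * pi_T * pi_Q * pi_E; the module
# failure rate is the sum over all parts (series assumption).
lambda_module = sum(lb * pt * pq * pe for _, lb, pt, pq, pe in parts)  # per 1e6 h
mtbf_hours = 1e6 / lambda_module

print(f"module failure rate = {lambda_module:.3f} per 1e6 h")
print(f"predicted MTBF      = {mtbf_hours:,.0f} h")
```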

  11. Time-dependent reliability sensitivity analysis of motion mechanisms

    International Nuclear Information System (INIS)

    Wei, Pengfei; Song, Jingwen; Lu, Zhenzhou; Yue, Zhufeng

    2016-01-01

    Reliability sensitivity analysis aims at identifying the source of structure/mechanism failure, and quantifying the effects of each random source or their distribution parameters on failure probability or reliability. In this paper, the time-dependent parametric reliability sensitivity (PRS) analysis as well as the global reliability sensitivity (GRS) analysis is introduced for the motion mechanisms. The PRS indices are defined as the partial derivatives of the time-dependent reliability w.r.t. the distribution parameters of each random input variable, and they quantify the effect of a small change of each distribution parameter on the time-dependent reliability. The GRS indices are defined for quantifying the individual, interaction and total contributions of the uncertainty in each random input variable to the time-dependent reliability. The envelope function method combined with the first order approximation of the motion error function is introduced for efficiently estimating the time-dependent PRS and GRS indices. Both the time-dependent PRS and GRS analysis techniques can be especially useful for reliability-based design. The significance of the proposed methods as well as the effectiveness of the envelope function method for estimating the time-dependent PRS and GRS indices are demonstrated with a four-bar mechanism and a car rack-and-pinion steering linkage. - Highlights: • Time-dependent parametric reliability sensitivity analysis is presented. • Time-dependent global reliability sensitivity analysis is presented for mechanisms. • The proposed method is especially useful for enhancing the kinematic reliability. • An envelope method is introduced for efficiently implementing the proposed methods. • The proposed method is demonstrated by two real planar mechanisms.

  12. Structural reliability analysis applied to pipeline risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gardiner, M. [GL Industrial Services, Loughborough (United Kingdom); Mendes, Renato F.; Donato, Guilherme V.P. [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    Quantitative Risk Assessment (QRA) of pipelines requires two main components to be provided. These are models of the consequences that follow from some loss of containment incident, and models for the likelihood of such incidents occurring. This paper describes how PETROBRAS have used Structural Reliability Analysis for the second of these, to provide pipeline- and location-specific predictions of failure frequency for a number of pipeline assets. This paper presents an approach to estimating failure rates for liquid and gas pipelines, using Structural Reliability Analysis (SRA) to analyze the credible basic mechanisms of failure such as corrosion and mechanical damage. SRA is a probabilistic limit state method: for a given failure mechanism it quantifies the uncertainty in parameters to mathematical models of the load-resistance state of a structure and then evaluates the probability of load exceeding resistance. SRA can be used to benefit the pipeline risk management process by optimizing in-line inspection schedules, and as part of the design process for new construction in pipeline rights of way that already contain multiple lines. A case study is presented to show how the SRA approach has recently been used on PETROBRAS pipelines and the benefits obtained from it. (author)

  13. Systems reliability analysis for the national ignition facility

    International Nuclear Information System (INIS)

    Majumdar, K.C.; Annese, C.E.; MacIntyre, A.T.; Sicherman, A.

    1996-01-01

    A Reliability, Availability and Maintainability (RAM) analysis was initiated for the National Ignition Facility (NIF). The NIF is an inertial confinement fusion research facility designed to achieve controlled thermonuclear reaction; the preferred site for the NIF is the Lawrence Livermore National Laboratory (LLNL). The NIF RAM analysis has three purposes: (1) to allocate top level reliability and availability goals for the systems, (2) to develop an operability model for optimum maintainability, and (3) to determine the achievability of the allocated goals of the RAM parameters for the NIF systems and the facility operation as a whole. An allocation model assigns the reliability and availability goals for front line and support systems by a top-down approach; reliability analysis uses a bottom-up approach to determine the system reliability and availability from component level to system level

  14. Interactive reliability analysis project. FY 80 progress report

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Shepherd, J.C.

    1981-03-01

    This report summarizes the progress to date in the interactive reliability analysis project. The purpose is to develop and demonstrate a reliability and safety technique that can be incorporated early in the design process. Details are illustrated in a simple example of a reactor safety system.

  15. Reliability analysis of grid connected small wind turbine power electronics

    International Nuclear Information System (INIS)

    Arifujjaman, Md.; Iqbal, M.T.; Quaicoe, J.E.

    2009-01-01

    Grid connection of small permanent magnet generator (PMG) based wind turbines requires a power conditioning system comprising a bridge rectifier, a dc-dc converter and a grid-tie inverter. This work presents a reliability analysis and an identification of the least reliable component of the power conditioning system of such grid connection arrangements. Reliability of the configuration is analyzed for the worst case scenario of maximum conversion losses at a particular wind speed. The analysis reveals that the reliability of the power conditioning system of such PMG based wind turbines is fairly low, reducing to 84% of its initial value within one year. The investigation is further enhanced by identifying the least reliable component within the power conditioning system; it is found that the inverter has the dominant effect on the system reliability, while the dc-dc converter has the least significant effect. The reliability analysis demonstrates that a permanent magnet generator based wind energy conversion system is not the best option from the point of view of power conditioning system reliability. The analysis also reveals that new research is required to determine a robust power electronics configuration for small wind turbine conversion systems.
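    A minimal sketch of the series-system view of the power conditioning chain (rectifier, dc-dc converter, grid-tie inverter). The one-year reliability of roughly 0.84 is taken from the abstract; the split of the aggregate failure rate across the three stages is an assumption for illustration.

```python
import math

# Series power-conditioning chain: rectifier -> dc-dc converter -> grid-tie inverter.
# The ~0.84 one-year system reliability is quoted in the abstract; the split of
# the aggregate failure rate between the three stages is purely an assumption.
hours_per_year = 8760.0
lambda_total = -math.log(0.84) / hours_per_year     # aggregate failure rate [1/h]

shares = {"rectifier": 0.15, "dc_dc_converter": 0.25, "inverter": 0.60}  # assumed
lambdas = {k: v * lambda_total for k, v in shares.items()}

def system_reliability(t_hours):
    # Independent series components with constant failure rates
    return math.exp(-sum(lambdas.values()) * t_hours)

print(f"aggregate failure rate = {lambda_total:.2e} /h")
print(f"R(1 year)  = {system_reliability(hours_per_year):.3f}")
print(f"R(5 years) = {system_reliability(5 * hours_per_year):.3f}")
```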

  16. Reliability analysis of wind embedded power generation system for ...

    African Journals Online (AJOL)

    This paper presents a method for Reliability Analysis of wind energy embedded in power generation system for Indian scenario. This is done by evaluating the reliability index, loss of load expectation, for the power generation system with and without integration of wind energy sources in the overall electric power system.
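    A minimal sketch of a loss of load expectation (LOLE) calculation of the kind the abstract describes, using a capacity outage probability table built by enumeration; the unit data, the treatment of wind as a derated unit and the load profile are assumptions for illustration, not the Indian system data.

```python
from itertools import product

# Illustrative generation system (not the system studied in the paper):
# conventional units as (capacity MW, forced outage rate).
units = [(200, 0.05), (150, 0.04), (150, 0.04), (100, 0.03)]
wind_capacity, wind_unavailability = 100, 0.70   # wind modelled crudely as an unreliable unit (assumed)

def outage_table(units):
    """Capacity-outage probability table by enumerating unit up/down states."""
    table = {}
    for states in product([0, 1], repeat=len(units)):   # 1 = unit on outage
        out = sum(cap for (cap, _), s in zip(units, states) if s)
        prob = 1.0
        for (cap, forced_outage_rate), s in zip(units, states):
            prob *= forced_outage_rate if s else (1 - forced_outage_rate)
        table[out] = table.get(out, 0.0) + prob
    return table

def lole(units, daily_peaks):
    """Expected number of days on which peak load exceeds available capacity."""
    installed = sum(cap for cap, _ in units)
    table = outage_table(units)
    return sum(p for peak in daily_peaks
                 for out, p in table.items() if installed - out < peak)

daily_peaks = [430, 450, 470, 500, 520] * 73   # illustrative 365-day peak profile

print(f"LOLE without wind: {lole(units, daily_peaks):.2f} days/yr")
print(f"LOLE with wind:    {lole(units + [(wind_capacity, wind_unavailability)], daily_peaks):.2f} days/yr")
```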

  17. Analysis and assessment of water treatment plant reliability

    Directory of Open Access Journals (Sweden)

    Szpak Dawid

    2017-03-01

    Full Text Available The subject of the publication is the analysis and assessment of the reliability of the surface water treatment plant (WTP. In the study the one parameter method of reliability assessment was used. Based on the flow sheet derived from the water company the reliability scheme of the analysed WTP was prepared. On the basis of the daily WTP work report the availability index Kg for the individual elements included in the WTP, was determined. Then, based on the developed reliability scheme showing the interrelationships between elements, the availability index Kg for the whole WTP was determined. The obtained value of the availability index Kg was compared with the criteria values.

  18. An integrated approach to human reliability analysis -- decision analytic dynamic reliability model

    International Nuclear Information System (INIS)

    Holmberg, J.; Hukki, K.; Norros, L.; Pulkkinen, U.; Pyy, P.

    1999-01-01

    The reliability of human operators in process control is sensitive to the context. In many contemporary human reliability analysis (HRA) methods, this is not sufficiently taken into account. The aim of this article is to argue that integration between probabilistic and psychological approaches to human reliability should be attempted. This is achieved, first, by adopting methods that adequately reflect the essential features of the process control activity, and secondly, by carrying out an interactive HRA process. Description of the activity context, probabilistic modeling, and psychological analysis form an iterative interdisciplinary sequence of analysis in which the results of one sub-task may be input to another. The analysis of the context is carried out first with the help of a common set of conceptual tools. The resulting descriptions of the context promote the probabilistic modeling, through which new results regarding the probabilistic dynamics can be achieved. These can be incorporated in the context descriptions used as reference in the psychological analysis of actual performance. The results also provide new knowledge of the constraints of the activity, by providing information on the premises of the operator's actions. Finally, the stochastic marked point process model gives a tool by which psychological methodology may be interpreted and utilized for reliability analysis.

  19. Reliability analysis applied to structural tests

    Science.gov (United States)

    Diamond, P.; Payne, A. O.

    1972-01-01

    The application of reliability theory to predict, from structural fatigue test data, the risk of failure of a structure under service conditions because its load-carrying capability is progressively reduced by the extension of a fatigue crack, is considered. The procedure is applicable to both safe-life and fail-safe structures and, for a prescribed safety level, it will enable an inspection procedure to be planned or, if inspection is not feasible, it will evaluate the life to replacement. The theory has been further developed to cope with the case of structures with initial cracks, such as can occur in modern high-strength materials which are susceptible to the formation of small flaws during the production process. The method has been applied to a structure of high-strength steel and the results are compared with those obtained by the current life estimation procedures. This has shown that the conventional methods can be unconservative in certain cases, depending on the characteristics of the structure and the design operating conditions. The suitability of the probabilistic approach to the interpretation of the results from full-scale fatigue testing of aircraft structures is discussed and the assumptions involved are examined.

  20. Application of DFM in human reliability analysis

    International Nuclear Information System (INIS)

    Yu Shaojie; Zhao Jun; Tong Jiejuan

    2011-01-01

    In combination with ATHEANA, the possibility of identifying EFCs and UAs using DFM is studied; a Steam Generator Tube Rupture (SGTR) accident is then modeled and solved. Through inductive analysis, 26 Prime Implicants (PIs) are obtained and the meaning of the results is interpreted; one of the PIs is similar to the accident scenario of a human failure event at one nuclear power plant. Finally, this paper discusses methods for quantifying PIs, the analysis of Errors of Commission (EOCs), and so on. (authors)

  1. Analysis of operating reliability of WWER-1000 unit

    International Nuclear Information System (INIS)

    Bortlik, J.

    1985-01-01

    The nuclear power unit was divided into 33 technological units. Input data for the reliability analysis were surveys of operating results obtained from the IAEA information system and certain reliability indexes of the technological equipment determined using the Bayes formula. Missing reliability data for technological equipment were taken from the basic variant. The fault tree of the WWER-1000 unit was determined for the top event, defined as the impossibility of reaching 100%, 75% and 50% of rated power. The periods of nuclear power plant operation at reduced output owing to defects, and the respective times needed for equipment repair, were recorded. The calculation of the availability of the WWER-1000 unit was made for different variant situations. Certain indexes of the operating reliability of the WWER-1000 unit, which are the result of a detailed reliability analysis, are tabulated for selected variants. (E.S.)
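    A minimal sketch of the Bayes-formula update mentioned above, using the conjugate Gamma-Poisson pair for a component failure rate; the prior parameters and observed counts are assumptions, not the WWER-1000 operating data.

```python
# Illustrative Bayesian update of an equipment failure rate using the
# conjugate Gamma-Poisson pair; prior and observed counts are assumptions,
# not the WWER-1000 operating data used in the paper.
prior_alpha, prior_beta = 0.5, 10_000.0   # prior: roughly 0.5 failures per 10,000 h
observed_failures = 3
observed_hours = 52_560.0                 # e.g. six years of operating experience

# Posterior Gamma parameters after observing the data
post_alpha = prior_alpha + observed_failures
post_beta = prior_beta + observed_hours

print(f"prior mean failure rate     = {prior_alpha / prior_beta:.2e} /h")
print(f"posterior mean failure rate = {post_alpha / post_beta:.2e} /h")
```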

  2. Energy efficiency determinants: An empirical analysis of Spanish innovative firms

    International Nuclear Information System (INIS)

    Costa-Campi, María Teresa; García-Quevedo, José; Segarra, Agustí

    2015-01-01

    This paper examines the extent to which innovative Spanish firms pursue improvements in energy efficiency (EE) as an objective of innovation. The increase in energy consumption and its impact on greenhouse gas emissions justifies the greater attention being paid to energy efficiency and especially to industrial EE. The ability of manufacturing companies to innovate and improve their EE has a substantial influence on attaining objectives regarding climate change mitigation. Despite the effort to design more efficient energy policies, the EE determinants in manufacturing firms have been little studied in the empirical literature. From an exhaustive sample of Spanish manufacturing firms and using a logit model, we examine the energy efficiency determinants for those firms that have innovated. To carry out the econometric analysis, we use panel data from the Community Innovation Survey for the period 2008–2011. Our empirical results underline the role of size among the characteristics of firms that facilitate energy efficiency innovation. Regarding company behaviour, firms that consider the reduction of environmental impacts to be an important objective of innovation and that have introduced organisational innovations are more likely to innovate with the objective of increasing energy efficiency. -- Highlights: •Drivers of innovation in energy efficiency at firm-level are examined. •Tangible investments have a greater influence on energy efficiency than R&D. •Environmental and energy efficiency innovation objectives are complementary. •Organisational innovation favors energy efficiency innovation. •Public policies should be implemented to improve firms’ energy efficiency

  3. Simulation Approach to Mission Risk and Reliability Analysis, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — It is proposed to develop and demonstrate an integrated total-system risk and reliability analysis approach that is based on dynamic, probabilistic simulation. This...

  4. Reliability analysis of digital I and C systems at KAERI

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2013-01-01

    This paper provides an overview of the ongoing research activities on a reliability analysis of digital instrumentation and control (I and C) systems of nuclear power plants (NPPs) performed by the Korea Atomic Energy Research Institute (KAERI). The research activities include the development of a new safety-critical software reliability analysis method by integrating the advantages of existing software reliability analysis methods, a fault coverage estimation method based on fault injection experiments, and a new human reliability analysis method for computer-based main control rooms (MCRs) based on human performance data from the APR-1400 full-scope simulator. The research results are expected to be used to address various issues such as the licensing issues related to digital I and C probabilistic safety assessment (PSA) for advanced digital-based NPPs. (author)

  5. PSA applications and piping reliability analysis: where do we stand?

    International Nuclear Information System (INIS)

    Lydell, B.O.Y.

    1997-01-01

    This paper reviews a recently proposed framework for piping reliability analysis. The framework was developed to promote critical interpretation of operational data on pipe failures, and to support application-specific parameter estimation.

  6. Reliability analysis of digital safety systems at nuclear power plants

    International Nuclear Information System (INIS)

    Sopira Vladimir; Kovacs, Zoltan

    2015-01-01

    Reliability analysis of digital reactor protection systems built on the basis of TELEPERM XS is described, and experience gained by the Slovak RELKO company during the past 20 years in this domain is highlighted. (orig.)

  7. reliability analysis of a two span floor designed according

    African Journals Online (AJOL)

    user

    deterministic approach, considering both ultimate and serviceability limit states. Reliability analysis of the floor ... loading, strength and stiffness parameters, dimensions .... to show that there is a direct relation between the failure probability (Pf) ...

  8. Reliability and risk analysis data base development: an historical perspective

    International Nuclear Information System (INIS)

    Fragola, Joseph R.

    1996-01-01

    Collection of empirical data and data base development for use in the prediction of the probability of future events has a long history. Dating back at least to the 17th century, safe passage events and mortality events were collected and analyzed to uncover prospective underlying classes and associated class attributes. Tabulations of these developed classes and associated attributes formed the underwriting basis for the fledgling insurance industry. Much earlier, master masons and architects used design rules of thumb to capture the experience of the ages and thereby produce structures of incredible longevity and reliability (Antona, E., Fragola, J. and Galvagni, R. Risk based decision analysis in design. Fourth SRA Europe Conference Proceedings, Rome, Italy, 18-20 October 1993). These rules served so well in producing robust designs that it was not until almost the 19th century that the analysis (Charlton, T.M., A History Of Theory Of Structures In The 19th Century, Cambridge University Press, Cambridge, UK, 1982) of masonry voussoir arches, begun by Galileo some two centuries earlier (Galilei, G. Discorsi e dimostrazioni mathematiche intorno a due nuove science, (Discourses and mathematical demonstrations concerning two new sciences, Leiden, The Netherlands, 1638), was placed on a sound scientific basis. Still, with the introduction of new materials (such as wrought iron and steel) and the lack of theoretical knowledge and computational facilities, approximate methods of structural design abounded well into the second half of the 20th century. To this day structural designers account for material variations and gaps in theoretical knowledge by employing factors of safety (Benvenuto, E., An Introduction to the History of Structural Mechanics, Part II: Vaulted Structures and Elastic Systems, Springer-Verlag, NY, 1991) or codes of practice (ASME Boiler and Pressure Vessel Code, ASME, New York) originally developed in the 19th century (Antona, E., Fragola, J. and

  9. Human reliability analysis in Loviisa probabilistic safety analysis

    International Nuclear Information System (INIS)

    Illman, L.; Isaksson, J.; Makkonen, L.; Vaurio, J.K.; Vuorio, U.

    1986-01-01

    The human reliability analysis in the Loviisa PSA project is carried out for three major groups of errors in human actions: (A) errors made before an initiating event, (B) errors that initiate a transient and (C) errors made during transients. Recovery possibilities are also included in each group. The methods used or planned for each group are described. A simplified THERP approach is used for group A, with emphasis on test and maintenance error recovery aspects and dependencies between redundancies. For group B, task analyses and human factors assessments are made for startup, shutdown and operational transients, with emphasis on potential common cause initiators. For group C, both misdiagnosis and slow decision making are analyzed, as well as errors made in carrying out necessary or backup actions. New or advanced features of the methodology are described

  10. Safety and reliability analysis based on nonprobabilistic methods

    International Nuclear Information System (INIS)

    Kozin, I.O.; Petersen, K.E.

    1996-01-01

    Imprecise probabilities, developed during the last two decades, offer a considerably more general theory with many advantages that make it very promising for reliability and safety analysis. The objective of the paper is to argue that imprecise probabilities are a more appropriate tool for reliability and safety analysis, that they allow the behavior of nuclear industry objects to be modeled more comprehensively, and that they make it possible to solve some problems that cannot be solved in the framework of the conventional approach. Furthermore, some specific examples are given from which we can see the usefulness of the tool for solving some reliability tasks.

  11. Static Program Analysis for Reliable, Trusted Apps

    Science.gov (United States)

    2017-02-01

    and prevent errors in their Java programs. The Checker Framework includes compiler plug-ins (“checkers”) that find bugs or verify their absence. It...versions of the Java language. 4.8 DATAFLOW FRAMEWORK The dataflow framework enables more accurate analysis of source code. (Despite their similar...names, the dataflow framework is independent of the (Information) Flow Checker of chapter 2.) In Java code, a given operation may be permitted or

  12. Environmental pressure group strength and air pollution. An empirical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Binder, Seth; Neumayer, Eric [Department of Geography and Environment and Center for Environmental Policy and Governance (CEPG), London School of Economics and Political Science, Houghton Street, London WC2A 2AE (United Kingdom)

    2005-12-01

    There is an established theoretical and empirical case-study literature arguing that environmental pressure groups have a real impact on pollution levels. Our original contribution to this literature is to provide the first systematic quantitative test of the strength of environmental non-governmental organizations (ENGOs) on air pollution levels. We find that ENGO strength exerts a statistically significant impact on sulfur dioxide, smoke and heavy particulates concentration levels in a cross-country time-series regression analysis. This result holds true both for ordinary least squares and random-effects estimation. It is robust to controlling for the potential endogeneity of ENGO strength with the help of instrumental variables. The effect is also substantively important. Strengthening ENGOs represents an important strategy by which aid donors, foundations, international organizations and other stakeholders can try to achieve lower pollution levels around the world.

  13. Environmental pressure group strength and air pollution. An empirical analysis

    International Nuclear Information System (INIS)

    Binder, Seth; Neumayer, Eric

    2005-01-01

    There is an established theoretical and empirical case-study literature arguing that environmental pressure groups have a real impact on pollution levels. Our original contribution to this literature is to provide the first systematic quantitative test of the strength of environmental non-governmental organizations (ENGOs) on air pollution levels. We find that ENGO strength exerts a statistically significant impact on sulfur dioxide, smoke and heavy particulates concentration levels in a cross-country time-series regression analysis. This result holds true both for ordinary least squares and random-effects estimation. It is robust to controlling for the potential endogeneity of ENGO strength with the help of instrumental variables. The effect is also substantively important. Strengthening ENGOs represents an important strategy by which aid donors, foundations, international organizations and other stakeholders can try to achieve lower pollution levels around the world

  14. Refined discrete and empirical horizontal gradients in VLBI analysis

    Science.gov (United States)

    Landskron, Daniel; Böhm, Johannes

    2018-02-01

    Missing or incorrect consideration of azimuthal asymmetry of troposphere delays is a considerable error source in space geodetic techniques such as Global Navigation Satellite Systems (GNSS) or Very Long Baseline Interferometry (VLBI). So-called horizontal troposphere gradients are generally utilized for modeling such azimuthal variations and are particularly required for observations at low elevation angles. Apart from estimating the gradients within the data analysis, which has become common practice in space geodetic techniques, there is also the possibility to determine the gradients beforehand from different data sources than the actual observations. Using ray-tracing through Numerical Weather Models (NWMs), we determined discrete gradient values referred to as GRAD for VLBI observations, based on the standard gradient model by Chen and Herring (J Geophys Res 102(B9):20489-20502, 1997. https://doi.org/10.1029/97JB01739) and also for new, higher-order gradient models. These gradients are produced on the same data basis as the Vienna Mapping Functions 3 (VMF3) (Landskron and Böhm in J Geod, 2017.https://doi.org/10.1007/s00190-017-1066-2), so they can also be regarded as the VMF3 gradients as they are fully consistent with each other. From VLBI analyses of the Vienna VLBI and Satellite Software (VieVS), it becomes evident that baseline length repeatabilities (BLRs) are improved on average by 5% when using a priori gradients GRAD instead of estimating the gradients. The reason for this improvement is that the gradient estimation yields poor results for VLBI sessions with a small number of observations, while the GRAD a priori gradients are unaffected from this. We also developed a new empirical gradient model applicable for any time and location on Earth, which is included in the Global Pressure and Temperature 3 (GPT3) model. Although being able to describe only the systematic component of azimuthal asymmetry and no short-term variations at all, even these

  15. Reliability analysis of digital based I and C system

    Energy Technology Data Exchange (ETDEWEB)

    Kang, I. S.; Cho, B. S.; Choi, M. J. [KOPEC, Yongin (Korea, Republic of)

    1999-10-01

    Digital technology is rapidly being applied to replace analog components installed in existing plants and to design control and monitoring systems for new nuclear power plants, in Korea as well as in other countries. Despite the many merits of digital technology, it faces a new problem of reliability assurance. Studies to solve this problem are being pursued vigorously abroad. The reliability of the KNGR Engineered Safety Features Component Control System (ESF-CCS), a digital-based I and C system, was analyzed to verify fulfillment of the ALWR EPRI-URD requirement for reliability analysis and to eliminate hazards in a design applying new technology. A qualitative analysis using FMEA and a quantitative analysis using reliability block diagrams were performed. The results of the analyses are shown in this paper.

  16. Reliability analysis and utilization of PEMs in space application

    Science.gov (United States)

    Jiang, Xiujie; Wang, Zhihua; Sun, Huixian; Chen, Xiaomin; Zhao, Tianlin; Yu, Guanghua; Zhou, Changyi

    2009-11-01

    More and more plastic encapsulated microcircuits (PEMs) are used in space missions to achieve high performance. Since PEMs are designed for use in terrestrial operating conditions, the successful use of PEMs in the harsh space environment is closely tied to reliability issues, which should be considered first. However, there is no ready-made methodology for PEMs in space applications. This paper discusses the reliability aspects of using PEMs in space. This reliability analysis can be divided into five categories: radiation test, radiation hardness, screening test, reliability calculation and reliability assessment. One case study is also presented to illustrate the details of the process, in which a PEM part is used in a joint space program, the Double-Star Project, between the European Space Agency (ESA) and China. The influence of environmental constraints, including radiation, humidity, temperature and mechanical loads, on the PEM part has been considered. Both Double-Star Project satellites are still running well in space.

  17. Discrete event simulation versus conventional system reliability analysis approaches

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to the conventional human reliability analysis models and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper describes briefly the author’s experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...

  18. Reliability analysis of dispersion nuclear fuel elements

    Science.gov (United States)

    Ding, Shurong; Jiang, Xin; Huo, Yongzhong; Li, Linan

    2008-03-01

    Taking a dispersion fuel element as a special particle composite, the representative volume element is chosen to act as the research object. The fuel swelling is simulated through temperature increase. The large strain elastoplastic analysis is carried out for the mechanical behaviors using FEM. The results indicate that the fission swelling is simulated successfully; the thickness increments grow linearly with burnup; with increasing of burnup: (1) the first principal stresses at fuel particles change from tensile ones to compression ones, (2) the maximum Mises stresses at the particles transfer from the centers of fuel particles to the location close to the interfaces between the matrix and the particles, their values increase with burnup; the maximum Mises stresses at the matrix exist in the middle location between the two particles near the mid-plane along the length (or width) direction, and the maximum plastic strains are also at the above region.

  19. Reliability analysis of dispersion nuclear fuel elements

    Energy Technology Data Exchange (ETDEWEB)

    Ding Shurong [Department of Mechanics and Engineering Science, Fudan University, Shanghai 200433 (China)], E-mail: dsr1971@163.com; Jiang Xin [Department of Mechanics and Engineering Science, Fudan University, Shanghai 200433 (China); Huo Yongzhong [Department of Mechanics and Engineering Science, Fudan University, Shanghai 200433 (China)], E-mail: yzhuo@fudan.edu.cn; Li Linan [Department of Mechanics, Tianjin University, Tianjin 300072 (China)

    2008-03-15

    Taking a dispersion fuel element as a special particle composite, the representative volume element is chosen to act as the research object. The fuel swelling is simulated through temperature increase. The large strain elastoplastic analysis is carried out for the mechanical behaviors using FEM. The results indicate that the fission swelling is simulated successfully; the thickness increments grow linearly with burnup; with increasing of burnup: (1) the first principal stresses at fuel particles change from tensile ones to compression ones, (2) the maximum Mises stresses at the particles transfer from the centers of fuel particles to the location close to the interfaces between the matrix and the particles, their values increase with burnup; the maximum Mises stresses at the matrix exist in the middle location between the two particles near the mid-plane along the length (or width) direction, and the maximum plastic strains are also at the above region.

  20. Durability reliability analysis for corroding concrete structures under uncertainty

    Science.gov (United States)

    Zhang, Hao

    2018-02-01

    This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.

  1. Reliability analysis of HVDC grid combined with power flow simulations

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yongtao; Langeland, Tore; Solvik, Johan [DNV AS, Hoevik (Norway); Stewart, Emma [DNV KEMA, Camino Ramon, CA (United States)

    2012-07-01

    Based on a DC grid power flow solver and the proposed GEIR, we carried out a reliability analysis for an HVDC grid test system proposed by CIGRE working group B4-58, where the failure statistics are collected from a literature survey. The proposed methodology is used to evaluate the impact of converter configuration on the overall reliability performance of the HVDC grid, where the symmetrical monopole configuration is compared with the bipole with metallic return wire configuration. The results quantify the improvement in reliability obtained by using the latter alternative. (orig.)

  2. Reliability analysis of neutron transport simulation using Monte Carlo method

    International Nuclear Information System (INIS)

    Souza, Bismarck A. de; Borges, Jose C.

    1995-01-01

    This work presents a statistical and reliability analysis covering data obtained by computer simulation of the neutron transport process, using the Monte Carlo method. A general description of the method and its applications is presented. Several simulations, corresponding to slowing down and shielding problems, have been accomplished. The influence of the physical dimensions of the materials and of the sample size on the reliability level of the results was investigated. The objective was to optimize the sample size in order to obtain reliable results while optimizing computation time. (author). 5 refs, 8 figs
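    A minimal sketch of how the sample-size question can be examined: a toy "shielding" Monte Carlo run at several sample sizes shows the spread of the estimate shrinking roughly as one over the square root of the number of histories. The transport model here is deliberately trivial and is not the simulation used in the paper.

```python
import random
import statistics

random.seed(1)

def transmission_fraction(n_particles, absorb_prob_per_step=0.1, thickness=10):
    """Toy Monte Carlo: fraction of particles surviving `thickness` steps.

    A stand-in for a shielding calculation; the physics is deliberately
    trivial -- the point is how the statistical spread scales with n.
    """
    survived = 0
    for _ in range(n_particles):
        if all(random.random() > absorb_prob_per_step for _ in range(thickness)):
            survived += 1
    return survived / n_particles

for n in (1_000, 10_000, 100_000):
    estimates = [transmission_fraction(n) for _ in range(10)]
    mean = statistics.mean(estimates)
    spread = statistics.stdev(estimates)
    # Relative error of a Monte Carlo estimate shrinks roughly as 1/sqrt(n)
    print(f"n={n:>7}: mean={mean:.4f}  spread={spread:.4f}  rel. error={spread / mean:.2%}")
```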

  3. Analysis of sodium valve reliability data at CREDO

    International Nuclear Information System (INIS)

    Bott, T.F.; Haas, P.M.

    1979-01-01

    The Centralized Reliability Data Organization (CREDO) has been established at Oak Ridge National Laboratory (ORNL) by the Department of Energy to provide a centralized source of data for reliability/maintainability analysis of advanced reactor systems. The current schedule calls for development of the data system at a moderate pace, with the first major distribution of data in late FY-1980. Continuous long-term collection of engineering, operating, and event data has been initiated at EBR-II and FFTF.

  4. Reliability analysis and updating of deteriorating systems with subset simulation

    DEFF Research Database (Denmark)

    Schneider, Ronald; Thöns, Sebastian; Straub, Daniel

    2017-01-01

    An efficient approach to reliability analysis of deteriorating structural systems is presented, which considers stochastic dependence among element deterioration. Information on a deteriorating structure obtained through inspection or monitoring is included in the reliability assessment through Bayesian updating. Subset simulation is an efficient and robust sampling-based algorithm suitable for such analyses. The approach is demonstrated in two case studies considering a steel frame structure and a Daniels system subjected to high-cycle fatigue.

  5. Use of COMCAN III in system design and reliability analysis

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Shepherd, J.C.; Marshall, N.H.; Fitch, L.R.

    1982-03-01

    This manual describes the COMCAN III computer program and its use. COMCAN III is a tool that can be used by the reliability analyst performing a probabilistic risk assessment or by the designer of a system desiring improved performance and efficiency. COMCAN III can be used to determine minimal cut sets of a fault tree, to calculate system reliability characteristics, and to perform qualitative common cause failure analysis

  6. Reliability analysis framework for computer-assisted medical decision systems

    International Nuclear Information System (INIS)

    Habas, Piotr A.; Zurada, Jacek M.; Elmaghraby, Adel S.; Tourassi, Georgia D.

    2007-01-01

    We present a technique that enhances computer-assisted decision (CAD) systems with the ability to assess the reliability of each individual decision they make. Reliability assessment is achieved by measuring the accuracy of a CAD system with known cases similar to the one in question. The proposed technique analyzes the feature space neighborhood of the query case to dynamically select an input-dependent set of known cases relevant to the query. This set is used to assess the local (query-specific) accuracy of the CAD system. The estimated local accuracy is utilized as a reliability measure of the CAD response to the query case. The underlying hypothesis of the study is that CAD decisions with higher reliability are more accurate. The above hypothesis was tested using a mammographic database of 1337 regions of interest (ROIs) with biopsy-proven ground truth (681 with masses, 656 with normal parenchyma). Three types of decision models, (i) a back-propagation neural network (BPNN), (ii) a generalized regression neural network (GRNN), and (iii) a support vector machine (SVM), were developed to detect masses based on eight morphological features automatically extracted from each ROI. The performance of all decision models was evaluated using the Receiver Operating Characteristic (ROC) analysis. The study showed that the proposed reliability measure is a strong predictor of the CAD system's case-specific accuracy. Specifically, the ROC area index for CAD predictions with high reliability was significantly better than for those with low reliability values. This result was consistent across all decision models investigated in the study. The proposed case-specific reliability analysis technique could be used to alert the CAD user when an opinion that is unlikely to be reliable is offered. The technique can be easily deployed in the clinical environment because it is applicable with a wide range of classifiers regardless of their structure and it requires neither additional
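    A minimal sketch of the reliability measure described above: the accuracy of the CAD system on the k nearest known cases in feature space is used as the reliability of its answer to a query. The data, the value of k and the distance metric below are assumptions for illustration, not the mammographic database or the exact neighbourhood selection of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the knowledge base of known cases: feature vectors and
# whether the CAD system classified each case correctly (illustrative data only,
# not the mammographic ROI database from the paper).
n_known, n_features, k = 1000, 8, 25
known_features = rng.normal(size=(n_known, n_features))
cad_correct = rng.random(n_known) < (0.6 + 0.3 * (known_features[:, 0] > 0))

def local_reliability(query, features, correct, k):
    """Estimate CAD reliability for `query` as the accuracy of the CAD system
    on its k nearest known cases in feature space (Euclidean distance)."""
    distances = np.linalg.norm(features - query, axis=1)
    neighbours = np.argsort(distances)[:k]
    return correct[neighbours].mean()

query_case = rng.normal(size=n_features)
print(f"estimated reliability of the CAD decision: "
      f"{local_reliability(query_case, known_features, cad_correct, k):.2f}")
```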

  7. Reliability analysis and initial requirements for FC systems and stacks

    Science.gov (United States)

    Åström, K.; Fontell, E.; Virtanen, S.

    In the year 2000 Wärtsilä Corporation started an R&D program to develop SOFC systems for CHP applications. The program aims to bring to the market highly efficient, clean and cost competitive fuel cell systems with rated power output in the range of 50-250 kW for distributed generation and marine applications. In the program Wärtsilä focuses on system integration and development. System reliability and availability are key issues determining the competitiveness of the SOFC technology. In Wärtsilä, methods have been implemented for analysing the system in respect to reliability and safety as well as for defining reliability requirements for system components. A fault tree representation is used as the basis for reliability prediction analysis. A dynamic simulation technique has been developed to allow for non-static properties in the fault tree logic modelling. Special emphasis has been placed on reliability analysis of the fuel cell stacks in the system. A method for assessing reliability and critical failure predictability requirements for fuel cell stacks in a system consisting of several stacks has been developed. The method is based on a qualitative model of the stack configuration where each stack can be in a functional, partially failed or critically failed state, each of the states having different failure rates and effects on the system behaviour. The main purpose of the method is to understand the effect of stack reliability, critical failure predictability and operating strategy on the system reliability and availability. An example configuration, consisting of 5 × 5 stacks (series of 5 sets of 5 parallel stacks) is analysed in respect to stack reliability requirements as a function of predictability of critical failures and Weibull shape factor of failure rate distributions.
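    A minimal sketch of the reliability arithmetic for the example 5 x 5 configuration (five sets in series, each of five parallel stacks), treating each set as a k-out-of-n group; the stack failure probability and the number of stacks required per set are assumptions for illustration.

```python
from math import comb

# 5 sets in series, each set 5 stacks in parallel, as in the example
# configuration of the abstract. The probabilities below are assumptions.
n_sets, stacks_per_set = 5, 5
p_stack_fail = 0.10          # probability a stack is critically failed (assumed)
min_working_per_set = 3      # assumed: a set still functions with 3 of 5 stacks

def set_reliability(n, k, p_fail):
    """Probability that at least k of n independent parallel stacks work."""
    p_ok = 1.0 - p_fail
    return sum(comb(n, i) * p_ok**i * p_fail**(n - i) for i in range(k, n + 1))

r_set = set_reliability(stacks_per_set, min_working_per_set, p_stack_fail)
r_system = r_set ** n_sets   # the five sets are in series

print(f"reliability of one 5-stack set:        {r_set:.4f}")
print(f"system reliability (5 sets in series): {r_system:.4f}")
```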

  8. Test-retest reliability of trunk accelerometric gait analysis

    DEFF Research Database (Denmark)

    Henriksen, Marius; Lund, Hans; Moe-Nilssen, R

    2004-01-01

    The purpose of this study was to determine the test-retest reliability of a trunk accelerometric gait analysis in healthy subjects. Accelerations were measured during walking using a triaxial accelerometer mounted on the lumbar spine of the subjects. Six men and 14 women (mean age 35.2; range 18...... a definite potential in clinical gait analysis....

  9. Reliability Analysis of a Two Dissimilar Unit Cold Standby System ...

    African Journals Online (AJOL)

    (2009), using linear first order differential equations, evaluated the reliability and availability characteristics of a two-dissimilar-unit cold standby system with three modes, for which no cost-benefit analysis was considered. El-said (1994) contributed a stochastic analysis of a two-dissimilar-unit standby redundant system.

  10. A sensitivity analysis of centrifugal compressors' empirical models

    International Nuclear Information System (INIS)

    Yoon, Sung Ho; Baek, Je Hyun

    2001-01-01

    The mean-line method using empirical models is the most practical method of predicting off-design performance. To gain insight into the empirical models, the influence of the empirical models on the performance prediction results is investigated. We found that, in the two-zone model, the secondary flow mass fraction has a considerable effect on the performance prediction curves at high mass flow-rates. In the TEIS model, the first element changes the slope of the performance curves as well as the stable operating range. The second element makes the performance curves move up and down as it increases or decreases. It is also discovered that the slip factor affects the pressure ratio, but it has little effect on efficiency. Finally, this study reveals that the skin friction coefficient has a significant effect on both the pressure ratio curve and the efficiency curve. These results show the limitations of the present empirical models, and more reasonable empirical models are needed.

  11. Recent advances in computational structural reliability analysis methods

    Science.gov (United States)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-10-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
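    As a hedged illustration of the probabilistic view described above, the sketch below estimates the failure probability P(R < S) for a single failure mode by crude Monte Carlo, with normally distributed resistance R and load effect S; the distributions, parameter values and sample size are assumptions made only for the example.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 1_000_000

        # Assumed random variables: resistance R and load effect S
        R = rng.normal(loc=300.0, scale=30.0, size=n)    # e.g. member strength
        S = rng.normal(loc=200.0, scale=40.0, size=n)    # e.g. applied load effect

        g = R - S                        # limit state function: failure when g < 0
        pf = float(np.mean(g < 0.0))     # Monte Carlo estimate of the failure probability
        se = float(np.sqrt(pf * (1 - pf) / n))
        print(f"Pf ~ {pf:.2e} +/- {se:.1e}")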

  12. Reliability analysis for Atucha II reactor protection system signals

    International Nuclear Information System (INIS)

    Roca, Jose Luis

    1996-01-01

    Atucha II is a 745 MW Argentine nuclear power reactor constructed by ENACE SA, Nuclear Argentine Company for Electrical Power Generation, and SIEMENS AG KWU, Erlangen, Germany. A preliminary modular logic analysis of the RPS (Reactor Protection System) signals was performed by means of the well-known Swedish professional risk and reliability software Risk-Spectrum, taking as a basis a reference signal coded JR17ER003, which commands the two moderator loop valves. From the reliability and behavior knowledge for this reference signal, an estimation of the reliability of the other 97 RPS signals follows. Because of the preliminary character of this analysis, main importance measures are not computed at this stage. Reliability is predicted through the statistic named unavailability. The scope of this analysis is restricted to the path from the measurement elements to the RPS buffer outputs. In the present context only one redundancy is analyzed, so no CCFs (Common Cause Failures) are present for signals in the Instrumentation and Control area. Finally, those unavailability values could be introduced in the failure domain for the subsequent complete Atucha II reliability analysis, which includes all mechanical and electromechanical features. An estimation of the spurious frequency of RPS signals, defined as faulty by no trip, is also performed.

  13. Reliability analysis for Atucha II reactor protection system signals

    International Nuclear Information System (INIS)

    Roca, Jose L.

    2000-01-01

    Atucha II is a 745 MW Argentine nuclear power reactor constructed by Nuclear Argentine Company for Electric Power Generation S.A. (ENACE S.A.) and SIEMENS AG KWU, Erlangen, Germany. A preliminary modular logic analysis of the RPS (Reactor Protection System) signals was performed by means of the well-known Swedish professional risk and reliability software Risk-Spectrum, taking as a basis a reference signal coded JR17ER003, which commands the two moderator loop valves. From the reliability and behavior knowledge for this reference signal, an estimation of the reliability of the other 97 RPS signals follows. Because of the preliminary character of this analysis, main importance measures are not computed at this stage. Reliability is predicted through the statistic named unavailability. The scope of this analysis is restricted to the path from the measurement elements to the RPS buffer outputs. In the present context only one redundancy is analyzed, so no CCFs (Common Cause Failures) are present for signals in the Instrumentation and Control area. Finally, those unavailability values could be introduced in the failure domain for the subsequent complete Atucha II reliability analysis, which includes all mechanical and electromechanical features. An estimation of the spurious frequency of RPS signals, defined as faulty by no trip, is also performed. (author)

  14. Human reliability analysis methods for probabilistic safety assessment

    International Nuclear Information System (INIS)

    Pyy, P.

    2000-11-01

    Human reliability analysis (HRA) of a probabilistic safety assessment (PSA) includes identifying human actions from a safety point of view, modelling the most important of them in PSA models, and assessing their probabilities. As manifested by many incidents and studies, human actions may have both positive and negative effects on safety and economy. Human reliability analysis is one of the areas of probabilistic safety assessment (PSA) that has direct applications outside the nuclear industry. The thesis focuses upon developments in human reliability analysis methods and data. The aim is to support PSA by extending the applicability of HRA. The thesis consists of six publications and a summary. The summary includes general considerations and a discussion about human actions in the nuclear power plant (NPP) environment. A condensed discussion about the results of the attached publications is then given, including new developments in methods and data. At the end of the summary part, the contribution of the publications to good practice in HRA is presented. In the publications, studies based on the collection of data on maintenance-related failures, simulator runs and expert judgement are presented in order to extend the human reliability analysis database. Furthermore, methodological frameworks are presented to perform a comprehensive HRA, including shutdown conditions, to study the reliability of decision making, and to study the effects of wrong human actions. In the last publication, an interdisciplinary approach to analysing human decision making is presented. The publications also include practical applications of the presented methodological frameworks. (orig.)

  15. The development of a reliable amateur boxing performance analysis template.

    Science.gov (United States)

    Thomson, Edward; Lamb, Kevin; Nicholas, Ceri

    2013-01-01

    The aim of this study was to devise a valid performance analysis system for the assessment of the movement characteristics associated with competitive amateur boxing and assess its reliability using analysts of varying experience of the sport and performance analysis. Key performance indicators to characterise the demands of an amateur contest (offensive, defensive and feinting) were developed and notated using a computerised notational analysis system. Data were subjected to intra- and inter-observer reliability assessment using median sign tests and calculating the proportion of agreement within predetermined limits of error. For all performance indicators, intra-observer reliability revealed non-significant differences between observations (P > 0.05) and high agreement was established (80-100%) regardless of whether exact or the reference value of ±1 was applied. Inter-observer reliability was less impressive for both analysts (amateur boxer and experienced analyst), with the proportion of agreement ranging from 33-100%. Nonetheless, there was no systematic bias between observations for any indicator (P > 0.05), and the proportion of agreement within the reference range (±1) was 100%. A reliable performance analysis template has been developed for the assessment of amateur boxing performance and is available for use by researchers, coaches and athletes to classify and quantify the movement characteristics of amateur boxing.
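    A small sketch of the agreement statistics referred to above, computing the proportion of exact agreement and of agreement within the reference value of ±1 between two passes over the same contest; the indicator counts below are invented for illustration.

        def agreement(obs1, obs2, tolerance=0):
            # Proportion of paired observations agreeing within +/- tolerance
            assert len(obs1) == len(obs2)
            hits = sum(abs(a - b) <= tolerance for a, b in zip(obs1, obs2))
            return hits / len(obs1)

        # Invented counts of one indicator (e.g. punches thrown per round) from two observations
        first_pass = [12, 9, 15, 7, 11, 13]
        second_pass = [12, 10, 15, 6, 11, 14]

        print("exact agreement:", agreement(first_pass, second_pass))
        print("agreement within +/-1:", agreement(first_pass, second_pass, tolerance=1))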

  16. Reliability analysis of cluster-based ad-hoc networks

    International Nuclear Information System (INIS)

    Cook, Jason L.; Ramirez-Marquez, Jose Emmanuel

    2008-01-01

    The mobile ad-hoc wireless network (MAWN) is a new and emerging network scheme that is being employed in a variety of applications. The MAWN varies from traditional networks because it is a self-forming and dynamic network. The MAWN is free of infrastructure and, as such, only the mobile nodes comprise the network. Pairs of nodes communicate either directly or through other nodes. To do so, each node acts, in turn, as a source, destination, and relay of messages. The virtue of a MAWN is the flexibility this provides; however, the challenge for reliability analyses is also brought about by this unique feature. The variability and volatility of the MAWN configuration makes typical reliability methods (e.g. reliability block diagram) inappropriate because no single structure or configuration represents all manifestations of a MAWN. For this reason, new methods are being developed to analyze the reliability of this new networking technology. New published methods adapt to this feature by treating the configuration probabilistically or by inclusion of embedded mobility models. This paper joins both methods together and expands upon these works by modifying the problem formulation to address the reliability analysis of a cluster-based MAWN. The cluster-based MAWN is deployed in applications with constraints on networking resources such as bandwidth and energy. This paper presents the problem's formulation, a discussion of applicable reliability metrics for the MAWN, and illustration of a Monte Carlo simulation method through the analysis of several example networks

  17. Fatigue Reliability Analysis of a Mono-Tower Platform

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard; Brincker, Rune

    1991-01-01

    In this paper, a fatigue reliability analysis of a Mono-tower platform is presented. The failure mode, fatigue failure in the butt welds, is investigated with two different models: one with the fatigue strength expressed through SN relations, the other with the fatigue strength expressed through linear-elastic fracture mechanics (LEFM). In determining the cumulative fatigue damage, Palmgren-Miner's rule is applied. Element reliability, as well as systems reliability, is estimated using first-order reliability methods (FORM). The sensitivity of the systems reliability to various parameters, such as the natural period, damping ratio, current, stress spectrum and parameters describing the fatigue strength, is investigated. Further, soil damping is shown to be significant for the Mono-tower.

  18. State of the art report on aging reliability analysis

    International Nuclear Information System (INIS)

    Choi, Sun Yeong; Yang, Joon Eon; Han, Sang Hoon; Ha, Jae Joo

    2002-03-01

    The goal of this report is to describe the state of the art in aging analysis methods that calculate the effects of component aging quantitatively. In this report, we describe some aging analysis methods which calculate the increase of Core Damage Frequency (CDF) due to aging by including the influence of aging in the PSA. We also describe several research topics required for the aging analysis of components of domestic NPPs. We describe a statistical model and a reliability-physics model which calculate the effect of aging quantitatively using the PSA method. It is expected that practical use of the reliability-physics model will increase, even though the process with the reliability-physics model is more complicated than with the statistical model.

  19. IEEE guide for the analysis of human reliability

    International Nuclear Information System (INIS)

    Dougherty, E.M. Jr.

    1987-01-01

    The Institute of Electrical and Electronics Engineers (IEEE) working group 7.4 of the Human Factors and Control Facilities Subcommittee of the Nuclear Power Engineering Committee (NPEC) has released its fifth draft of a Guide for General Principles of Human Action Reliability Analysis for Nuclear Power Generating Stations, for approval by NPEC. A guide is the least mandating in the IEEE hierarchy of standards. The purpose is to enhance the performance of a human reliability analysis (HRA) as a part of a probabilistic risk assessment (PRA), to assure reproducible results, and to standardize documentation. The guide does not recommend or even discuss specific techniques, which are too rapidly evolving today. Considerable maturation in the analysis of human reliability in a PRA context has taken place in recent years. The IEEE guide on this subject is an initial step toward bringing HRA out of the research and development arena into the toolbox of standard engineering practices.

  20. Reliability analysis of the reactor protection system with fault diagnosis

    International Nuclear Information System (INIS)

    Lee, D.Y.; Han, J.B.; Lyou, J.

    2004-01-01

    The main function of a reactor protection system (RPS) is to maintain the reactor core integrity and the reactor coolant system pressure boundary. The RPS consists of a 2-out-of-m redundant architecture to assure reliable operation. The system reliability of the RPS is a very important factor in the probabilistic safety assessment (PSA) evaluation in the nuclear field. Evaluating the system failure rate of a k-out-of-m redundant system is not easy with a deterministic method. In this paper, a reliability analysis method using the binomial process is suggested to calculate the failure rate of an RPS with a fault diagnosis function. The suggested method is compared with the result of a Markov process to verify its validity, and applied to several kinds of RPS architectures for a comparative evaluation of the reliability. (orig.)
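    A minimal sketch of the binomial treatment of a k-out-of-m redundant architecture: the system succeeds when at least k of m identical, independent channels work. The channel availability used below is an arbitrary illustration, not a value from the paper.

        from math import comb

        def k_out_of_m_reliability(k, m, p_channel):
            # P(at least k of m independent channels work), each with success probability p_channel
            return sum(comb(m, i) * p_channel**i * (1 - p_channel)**(m - i)
                       for i in range(k, m + 1))

        # Example: 2-out-of-4 trip logic with an assumed channel availability of 0.99
        print(k_out_of_m_reliability(2, 4, 0.99))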

  1. Reliability Analysis of Wireless Sensor Networks Using Markovian Model

    Directory of Open Access Journals (Sweden)

    Jin Zhu

    2012-01-01

    Full Text Available This paper investigates reliability analysis of wireless sensor networks whose topology is switching among possible connections which are governed by a Markovian chain. We give the quantized relations between network topology, data acquisition rate, nodes' calculation ability, and network reliability. By applying Lyapunov method, sufficient conditions of network reliability are proposed for such topology switching networks with constant or varying data acquisition rate. With the conditions satisfied, the quantity of data transported over wireless network node will not exceed node capacity such that reliability is ensured. Our theoretical work helps to provide a deeper understanding of real-world wireless sensor networks, which may find its application in the fields of network design and topology control.

  2. Reliability of the Emergency Severity Index: Meta-analysis

    Directory of Open Access Journals (Sweden)

    Amir Mirhaghi

    2015-01-01

    Full Text Available Objectives: Although triage systems based on the Emergency Severity Index (ESI) have many advantages in terms of simplicity and clarity, previous research has questioned their reliability in practice. Therefore, the aim of this meta-analysis was to determine the reliability of ESI triage scales. Methods: This meta-analysis was performed in March 2014. Electronic research databases were searched and articles conforming to the Guidelines for Reporting Reliability and Agreement Studies were selected. Two researchers independently examined selected abstracts. Data were extracted in the following categories: version of scale (latest/older), participants (adult/paediatric), raters (nurse, physician or expert), method of reliability (intra/inter-rater), reliability statistics (weighted/unweighted kappa), and the origin and publication year of the study. The effect size was obtained by the Z-transformation of reliability coefficients. Data were pooled with random-effects models and a meta-regression was performed based on the method of moments estimator. Results: A total of 19 studies from six countries were included in the analysis. The pooled coefficient for the ESI triage scales was substantial at 0.791 (95% confidence interval: 0.787‒0.795). Agreement was higher with the latest and adult versions of the scale and among expert raters, compared to agreement with older and paediatric versions of the scales and with other groups of raters, respectively. Conclusion: ESI triage scales showed an acceptable level of overall reliability. However, ESI scales require more development in order to see full agreement from all rater groups. Further studies concentrating on other aspects of reliability assessment are needed.
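    A hedged sketch of the pooling step described above: each reliability coefficient is Z-transformed (Fisher's arctanh), combined, and back-transformed to the original scale. For brevity the sketch uses simple inverse-variance (fixed-effect) weights with the usual 1/(n-3) variance of Fisher's z rather than the full random-effects model; the study coefficients and sample sizes are invented.

        import math

        def pooled_reliability(coeffs, sample_sizes):
            zs = [math.atanh(r) for r in coeffs]       # Fisher Z-transformation of each coefficient
            ws = [n - 3 for n in sample_sizes]         # weight = 1 / var(z) = n - 3
            z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
            return math.tanh(z_bar)                    # back-transform to the coefficient scale

        # Invented per-study reliability coefficients and numbers of rated cases
        kappas = [0.72, 0.85, 0.79, 0.81]
        n_cases = [120, 200, 80, 150]
        print(pooled_reliability(kappas, n_cases))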

  3. An Empirical Analysis of Human Performance and Nuclear Safety Culture

    International Nuclear Information System (INIS)

    Jeffrey Joe; Larry G. Blackwood

    2006-01-01

    The purpose of this analysis, which was conducted for the US Nuclear Regulatory Commission (NRC), was to test whether an empirical connection exists between human performance and nuclear power plant safety culture. This was accomplished through analyzing the relationship between a measure of human performance and a plant's Safety Conscious Work Environment (SCWE). SCWE is an important component of safety culture that the NRC has developed, but it is not synonymous with it. SCWE is an environment in which employees are encouraged to raise safety concerns both to their own management and to the NRC without fear of harassment, intimidation, retaliation, or discrimination. Because the relationship between human performance and allegations is intuitively reciprocal and both relationship directions need exploration, two series of analyses were performed. First, human performance data could be indicative of safety culture, so regression analyses were performed using human performance data to predict SCWE. It also is likely that safety culture contributes to human performance issues at a plant, so a second set of regressions was performed using allegations to predict HFIS results

  4. Empirically characteristic analysis of chaotic PID controlling particle swarm optimization.

    Science.gov (United States)

    Yan, Danping; Lu, Yongzhong; Zhou, Min; Chen, Shiping; Levy, David

    2017-01-01

    Since chaos systems generally have the intrinsic properties of sensitivity to initial conditions, topological mixing and density of periodic orbits, they may tactfully use the chaotic ergodic orbits to achieve the global optimum or their better approximation to given cost functions with high probability. During the past decade, they have increasingly received much attention from academic community and industry society throughout the world. To improve the performance of particle swarm optimization (PSO), we herein propose a chaotic proportional integral derivative (PID) controlling PSO algorithm by the hybridization of chaotic logistic dynamics and hierarchical inertia weight. The hierarchical inertia weight coefficients are determined in accordance with the present fitness values of the local best positions so as to adaptively expand the particles' search space. Moreover, the chaotic logistic map is not only used in the substitution of the two random parameters affecting the convergence behavior, but also used in the chaotic local search for the global best position so as to easily avoid the particles' premature behaviors via the whole search space. Thereafter, the convergent analysis of chaotic PID controlling PSO is under deep investigation. Empirical simulation results demonstrate that compared with other several chaotic PSO algorithms like chaotic PSO with the logistic map, chaotic PSO with the tent map and chaotic catfish PSO with the logistic map, chaotic PID controlling PSO exhibits much better search efficiency and quality when solving the optimization problems. Additionally, the parameter estimation of a nonlinear dynamic system also further clarifies its superiority to chaotic catfish PSO, genetic algorithm (GA) and PSO.
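    The sketch below is not the authors' full PID-controlled variant; it only illustrates the basic ingredient the abstract describes, replacing the two uniform random coefficients in the PSO velocity update with values generated by the chaotic logistic map. The objective function, bounds and parameter values are arbitrary choices for the example.

        import numpy as np

        def sphere(x):
            return float(np.sum(x ** 2))

        def chaotic_pso(f, dim=5, n_particles=20, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
            rng = np.random.default_rng(seed)
            pos = rng.uniform(-5.0, 5.0, size=(n_particles, dim))
            vel = np.zeros_like(pos)
            pbest = pos.copy()
            pbest_val = np.array([f(p) for p in pos])
            gbest = pbest[np.argmin(pbest_val)].copy()
            z = 0.7                       # logistic map state; avoid the map's fixed points
            for _ in range(iters):
                for i in range(n_particles):
                    # logistic map z <- 4 z (1 - z) replaces the two random coefficients
                    z = 4.0 * z * (1.0 - z); r1 = z
                    z = 4.0 * z * (1.0 - z); r2 = z
                    vel[i] = (w * vel[i]
                              + c1 * r1 * (pbest[i] - pos[i])
                              + c2 * r2 * (gbest - pos[i]))
                    pos[i] += vel[i]
                    val = f(pos[i])
                    if val < pbest_val[i]:
                        pbest[i], pbest_val[i] = pos[i].copy(), val
                        if val < f(gbest):
                            gbest = pos[i].copy()
            return gbest, f(gbest)

        best, best_val = chaotic_pso(sphere)
        print(best_val)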

  5. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  6. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    Science.gov (United States)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data analysis collection and evaluation issues and a narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining a risk informed decision making environment that is being sought by NASA requirements and procedures such as 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
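    As a hedged example of the kind of Bayesian data evaluation the guidebook covers, the sketch below performs a conjugate beta-binomial update of a failure-on-demand probability; the prior parameters and the observed evidence are invented for illustration and are not NASA values.

        from scipy import stats

        # Prior belief about the failure-on-demand probability p: Beta(alpha, beta)
        alpha_prior, beta_prior = 0.5, 49.5            # assumed prior, mean ~ 0.01

        # Observed evidence: 2 failures in 500 demands (invented data)
        failures, demands = 2, 500

        # Conjugate update: posterior is Beta(alpha + failures, beta + successes)
        alpha_post = alpha_prior + failures
        beta_post = beta_prior + (demands - failures)
        posterior = stats.beta(alpha_post, beta_post)

        print("posterior mean:", posterior.mean())
        print("90% credible interval:", posterior.ppf([0.05, 0.95]))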

  7. Reliability analysis of reactor inspection robot(RIROB)

    International Nuclear Information System (INIS)

    Eom, H. S.; Kim, J. H.; Lee, J. C.; Choi, Y. R.; Moon, S. S.

    2002-05-01

    This report describes the method and the results of the reliability analysis of RIROB, developed at the Korea Atomic Energy Research Institute. There are many classic techniques and models for reliability analysis. These techniques and models have been used widely and approved in other industries such as the aviation and nuclear industries. Though they have been approved in real fields, they are still insufficient for complicated systems such as RIROB, which are composed of computers, networks, electronic parts, mechanical parts, and software. In particular, the application of these analysis techniques to the digital and software parts of complicated systems is immature at this time, so expert judgement currently plays an important role in evaluating the reliability of such systems. In this report we propose a method which combines diverse evidence relevant to reliability in order to evaluate the reliability of complicated systems such as RIROB. The proposed method combines diverse evidence and performs inference in a formal and quantitative way by using the benefits of Bayesian Belief Nets (BBN).

  8. Empirical Analysis of Closed-Loop Duopoly Advertising Strategies

    OpenAIRE

    Gary M. Erickson

    1992-01-01

    Closed-loop (perfect) equilibria in a Lanchester duopoly differential game of advertising competition are used as the basis for empirical investigation. Two systems of simultaneous nonlinear equations are formed, one from a general Lanchester model and one from a constrained model. Two empirical applications are conducted. In one involving Coca-Cola and Pepsi-Cola, a formal statistical testing procedure is used to detect whether closed-loop equilibrium advertising strategies are used by the c...

  9. Differences in Dynamic Brand Competition Across Markets: An Empirical Analysis

    OpenAIRE

    Jean-Pierre Dubé; Puneet Manchanda

    2005-01-01

    We investigate differences in the dynamics of marketing decisions across geographic markets empirically. We begin with a linear-quadratic game involving forward-looking firms competing on prices and advertising. Based on the corresponding Markov perfect equilibrium, we propose estimable econometric equations for demand and marketing policy. Our model allows us to measure empirically the strategic response of competitors along with economic measures such as firm profitability. We use a rich da...

  10. Science-Based Simulation Model of Human Performance for Human Reliability Analysis

    International Nuclear Information System (INIS)

    Kelly, Dana L.; Boring, Ronald L.; Mosleh, Ali; Smidts, Carol

    2011-01-01

    Human reliability analysis (HRA), a component of an integrated probabilistic risk assessment (PRA), is the means by which the human contribution to risk is assessed, both qualitatively and quantitatively. However, among the literally dozens of HRA methods that have been developed, most cannot fully model and quantify the types of errors that occurred at Three Mile Island. Furthermore, all of the methods lack a solid empirical basis, relying heavily on expert judgment or empirical results derived in non-reactor domains. Finally, all of the methods are essentially static, and are thus unable to capture the dynamics of an accident in progress. The objective of this work is to begin exploring a dynamic simulation approach to HRA, one whose models have a basis in psychological theories of human performance, and whose quantitative estimates have an empirical basis. This paper highlights a plan to formalize collaboration among the Idaho National Laboratory (INL), the University of Maryland, and The Ohio State University (OSU) to continue development of a simulation model initially formulated at the University of Maryland. Initial work will focus on enhancing the underlying human performance models with the most recent psychological research, and on planning follow-on studies to establish an empirical basis for the model, based on simulator experiments to be carried out at the INL and at the OSU.

  11. Reliability-Based Robustness Analysis for a Croatian Sports Hall

    DEFF Research Database (Denmark)

    Čizmar, Dean; Kirkegaard, Poul Henning; Sørensen, John Dalsgaard

    2011-01-01

    This paper presents a probabilistic approach for structural robustness assessment for a timber structure built a few years ago. The robustness analysis is based on a structural reliability based framework for robustness and a simplified mechanical system modelling of a timber truss system. A complex timber structure with a large number of failure modes is modelled with only a few dominant failure modes. First, a component based robustness analysis is performed based on the reliability indices of the remaining elements after the removal of selected critical elements. The robustness is expressed and evaluated by a robustness index. Next, the robustness is assessed using system reliability indices where the probabilistic failure model is modelled by a series system of parallel systems.

  12. Distribution System Reliability Analysis for Smart Grid Applications

    Science.gov (United States)

    Aljohani, Tawfiq Masad

    Reliability of power systems is a key aspect in modern power system planning, design, and operation. The ascendance of the smart grid concept has provided high hopes of developing an intelligent network that is capable of being a self-healing grid, offering the ability to overcome the interruption problems that face the utility and cost it tens of millions in repair and loss. To address its reliability concerns, the power utilities and interested parties have spent extensive amount of time and effort to analyze and study the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to be focused on improving the reliability of the distribution network, the connection joint between the power providers and the consumers where most of the electricity problems occur. In this work, we will examine the effect of the smart grid applications in improving the reliability of the power distribution networks. The test system used in conducting this thesis is the IEEE 34 node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of the automatic switching devices and quantify their proper installation based on the performance of the distribution system. The measures will be the changes in the reliability system indices including SAIDI, SAIFI, and EUE. The goal is to design and simulate the effect of the installation of the Distributed Generators (DGs) on the utility's distribution system and measure the potential improvement of its reliability. The software used in this work is DISREL, which is intelligent power distribution software that is developed by General Reliability Co.
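    A small sketch of how the reliability indices mentioned above are computed from interruption records, assuming each record holds the number of customers interrupted, the outage duration in hours and the unserved energy in kWh; the records and the number of customers served are invented.

        # Each record: (customers interrupted, outage duration in hours, unserved energy in kWh)
        outages = [(120, 2.0, 350.0), (45, 0.5, 60.0), (300, 4.0, 1500.0)]
        customers_served = 5_000

        saifi = sum(c for c, _, _ in outages) / customers_served      # interruptions per customer
        saidi = sum(c * d for c, d, _ in outages) / customers_served  # interruption hours per customer
        eue = sum(e for _, _, e in outages)                           # total unserved energy in kWh

        print(f"SAIFI = {saifi:.3f}, SAIDI = {saidi:.3f} h, EUE = {eue:.0f} kWh")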

  13. Structural reliability analysis based on the cokriging technique

    International Nuclear Information System (INIS)

    Zhao Wei; Wang Wei; Dai Hongzhe; Xue Guofeng

    2010-01-01

    Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, which is a semi-parametric interpolation technique that can be used for deterministic optimization and structural reliability, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points should be generated to fill the design space, and this can be very expensive and even impractical in practical engineering analysis. Therefore, in this paper, a new method, the cokriging method, which is an extension of kriging, is proposed to calculate the structural reliability. Cokriging approximation incorporates secondary information such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method based on some numerical examples. The results indicate that the cokriging procedure described in this work can generate approximation models that improve on the accuracy and efficiency for structural reliability problems and is a viable alternative to kriging.
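    A hedged sketch of the surrogate idea: ordinary kriging (a Gaussian process regressor), not the gradient-enhanced cokriging of the paper, is fitted to a small design of experiments for an assumed limit state function, and the failure probability is then estimated by Monte Carlo on the cheap surrogate; the limit state, sample sizes and kernel are illustrative assumptions.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        rng = np.random.default_rng(0)

        def g(x):
            # Assumed limit state function: failure when g(x) < 0
            return 3.0 - x[:, 0] ** 2 - x[:, 1]

        # Small design of experiments used to train the surrogate
        X_train = rng.uniform(-3.0, 3.0, size=(40, 2))
        y_train = g(X_train)

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
        gp.fit(X_train, y_train)

        # Monte Carlo on the surrogate instead of the expensive limit state evaluations
        X_mc = rng.normal(0.0, 1.0, size=(200_000, 2))
        pf_surrogate = float(np.mean(gp.predict(X_mc) < 0.0))
        print(f"surrogate-based Pf ~ {pf_surrogate:.3e}")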

  14. Reliability analysis - systematic approach based on limited data

    International Nuclear Information System (INIS)

    Bourne, A.J.

    1975-11-01

    The initial approaches required for reliability analysis are outlined. These approaches highlight the system boundaries, examine the conditions under which the system is required to operate, and define the overall performance requirements. The discussion is illustrated by a simple example of an automatic protective system for a nuclear reactor. It is then shown how the initial approach leads to a method of defining the system, establishing performance parameters of interest and determining the general form of reliability models to be used. The overall system model and the availability of reliability data at the system level are next examined. An iterative process is then described whereby the reliability model and data requirements are systematically refined at progressively lower hierarchic levels of the system. At each stage, the approach is illustrated with examples from the protective system previously described. The main advantages of the approach put forward are the systematic process of analysis, the concentration of assessment effort in the critical areas and the maximum use of limited reliability data. (author)

  15. DATMAN: A reliability data analysis program using Bayesian updating

    International Nuclear Information System (INIS)

    Becker, M.; Feltus, M.A.

    1996-01-01

    Preventive maintenance (PM) techniques focus on the prevention of failures, in particular, system components that are important to plant functions. Reliability-centered maintenance (RCM) improves on the PM techniques by introducing a set of guidelines by which to evaluate the system functions. It also minimizes intrusive maintenance, labor, and equipment downtime without sacrificing system performance when its function is essential for plant safety. Both the PM and RCM approaches require that system reliability data be updated as more component failures and operation time are acquired. Systems reliability and the likelihood of component failures can be calculated by Bayesian statistical methods, which can update these data. The DATMAN computer code has been developed at Penn State to simplify the Bayesian analysis by performing tedious calculations needed for RCM reliability analysis. DATMAN reads data for updating, fits a distribution that best fits the data, and calculates component reliability. DATMAN provides a user-friendly interface menu that allows the user to choose from several common prior and posterior distributions, insert new failure data, and visually select the distribution that matches the data most accurately

  16. Human reliability analysis of Lingao Nuclear Power Station

    International Nuclear Information System (INIS)

    Zhang Li; Huang Shudong; Yang Hong; He Aiwu; Huang Xiangrui; Zheng Tao; Su Shengbing; Xi Haiying

    2001-01-01

    The necessity of human reliability analysis (HRA) for the Lingao Nuclear Power Station is analyzed, and the method and operating procedures of HRA are briefly described. One of the human factors events (HFEs) is analyzed in detail and some questions concerning HRA are discussed. The authors present the analytical results for 61 HFEs and briefly introduce the contribution of HRA to the Lingao Nuclear Power Station.

  17. System Reliability Analysis Capability and Surrogate Model Application in RAVEN

    Energy Technology Data Exchange (ETDEWEB)

    Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Alfonsi, Andrea [Idaho National Lab. (INL), Idaho Falls, ID (United States); Huang, Dongli [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gleicher, Frederick [Idaho National Lab. (INL), Idaho Falls, ID (United States); Wang, Bei [Idaho National Lab. (INL), Idaho Falls, ID (United States); Adbel-Khalik, Hany S. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Pascucci, Valerio [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-11-01

    This report collects the effort performed to improve the reliability analysis capabilities of the RAVEN code and explores new opportunities in the usage of surrogate models, by extending the current RAVEN capabilities to multi-physics surrogate models and the construction of surrogate models for high-dimensionality fields.

  18. Factorial validation and reliability analysis of the brain fag syndrome ...

    African Journals Online (AJOL)

    Results: Two valid factors emerged, with items 1-3 and items 4, 5 and 7 loading on them respectively, making the BFSS a two-dimensional (multidimensional) scale which measures two aspects of brain fag [labelled burning sensation and crawling sensation, respectively]. The reliability analysis yielded a Cronbach alpha coefficient of ...
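    A minimal sketch of the Cronbach alpha computation reported above, using the standard formula on an invented respondents-by-items score matrix; neither the data nor the item grouping come from the paper.

        import numpy as np

        def cronbach_alpha(scores):
            # scores: 2-D array, rows = respondents, columns = items
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]                          # number of items
            item_vars = scores.var(axis=0, ddof=1)       # variance of each item
            total_var = scores.sum(axis=1).var(ddof=1)   # variance of the total score
            return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

        # Invented responses of six people to four items
        data = [[1, 2, 2, 1],
                [3, 3, 4, 3],
                [2, 2, 3, 2],
                [4, 4, 4, 5],
                [2, 3, 2, 2],
                [3, 4, 3, 4]]
        print(cronbach_alpha(data))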

  19. Competition in the German pharmacy market: an empirical analysis.

    Science.gov (United States)

    Heinsohn, Jörg G; Flessa, Steffen

    2013-10-10

    Pharmaceutical products are an important component of expenditure on public health insurance in the Federal Republic of Germany. For years, German policy makers have regulated public pharmacies in order to limit the increase in costs. One reform has followed another, main objective being to increase competition in the pharmacy market. It is generally assumed that an increase in competition would reduce healthcare costs. However, there is a lack of empirical proof of a stronger orientation of German public pharmacies towards competition thus far. This paper analyses the self-perceptions of owners of German public pharmacies and their orientation towards competition in the pharmacy markets. It is based on a cross-sectional survey (N = 289) and distinguishes between successful and less successful pharmacies, the location of the pharmacies (e.g. West German States and East German States) and the gender of the pharmacy owner. The data are analysed descriptively by survey items and employing bivariate and structural equation modelling. The analysis reveals that the majority of owners of public pharmacies in Germany do not currently perceive very strong competitive pressure in the market. However, the innovativeness of the pharmacist is confirmed as most relevant for net revenue development and the profit margin. Some differences occur between regions, e.g. public pharmacies in West Germany have a significantly higher profit margin. This study provides evidence that the German healthcare reforms aimed at increasing the competition between public pharmacies in Germany have not been completely successful. Many owners of public pharmacies disregard instruments of active customer-orientated management (such as customer loyalty or an offensive position and economies of scale), which could give them a competitive advantage. However, it is clear that those pharmacists who strive for systematic and innovative management and adopt an offensive and competitive stance are quite

  20. An empirical analysis of the hydropower portfolio in Pakistan

    International Nuclear Information System (INIS)

    Siddiqi, Afreen; Wescoat, James L.; Humair, Salal; Afridi, Khurram

    2012-01-01

    The Indus Basin of Pakistan with 800 hydropower project sites and a feasible hydropower potential of 60 GW, 89% of which is undeveloped, is a complex system poised for large-scale changes in the future. Motivated by the need to understand future impacts of hydropower alternatives, this study conducted a multi-dimensional, empirical analysis of the full hydropower portfolio. The results show that the full portfolio spans multiple scales of capacity from mega (>1000 MW) to micro (<0.1 MW) projects with a skewed spatial distribution within the provinces, as well as among rivers and canals. Of the total feasible potential, 76% lies in two (out of six) administrative regions and 68% lies in two major rivers (out of more than 125 total channels). Once projects currently under implementation are commissioned, there would be a five-fold increase from a current installed capacity of 6720 MW to 36759 MW. It is recommended that the implementation and design decisions should carefully include spatial distribution and environmental considerations upfront. Furthermore, uncertainties in actual energy generation, and broader hydrological risks due to expected climate change effects should be included in the current planning of these systems that are to provide service over several decades into the future. - Highlights: ► Pakistan has a hydropower potential of 60 GW distributed across 800 projects. ► Under-development projects will realize 36.7 GW of this potential by 2030. ► Project locations are skewed towards some sub-basins and provinces. ► Project sizes are very diverse and have quite limited private sector ownership. ► Gaps in data prevent proper risk assessment for Pakistan's hydropower development.

  1. Empirically characteristic analysis of chaotic PID controlling particle swarm optimization

    Science.gov (United States)

    Yan, Danping; Lu, Yongzhong; Zhou, Min; Chen, Shiping; Levy, David

    2017-01-01

    Since chaos systems generally have the intrinsic properties of sensitivity to initial conditions, topological mixing and density of periodic orbits, they may tactfully use the chaotic ergodic orbits to achieve the global optimum or their better approximation to given cost functions with high probability. During the past decade, they have increasingly received much attention from academic community and industry society throughout the world. To improve the performance of particle swarm optimization (PSO), we herein propose a chaotic proportional integral derivative (PID) controlling PSO algorithm by the hybridization of chaotic logistic dynamics and hierarchical inertia weight. The hierarchical inertia weight coefficients are determined in accordance with the present fitness values of the local best positions so as to adaptively expand the particles’ search space. Moreover, the chaotic logistic map is not only used in the substitution of the two random parameters affecting the convergence behavior, but also used in the chaotic local search for the global best position so as to easily avoid the particles’ premature behaviors via the whole search space. Thereafter, the convergent analysis of chaotic PID controlling PSO is under deep investigation. Empirical simulation results demonstrate that compared with other several chaotic PSO algorithms like chaotic PSO with the logistic map, chaotic PSO with the tent map and chaotic catfish PSO with the logistic map, chaotic PID controlling PSO exhibits much better search efficiency and quality when solving the optimization problems. Additionally, the parameter estimation of a nonlinear dynamic system also further clarifies its superiority to chaotic catfish PSO, genetic algorithm (GA) and PSO. PMID:28472050

  2. Modeling gallic acid production rate by empirical and statistical analysis

    Directory of Open Access Journals (Sweden)

    Bratati Kar

    2000-01-01

    Full Text Available For predicting the rate of the enzymatic reaction, empirical correlations based on the experimental results obtained under various operating conditions have been developed. The models represent both the activation as well as the deactivation conditions of the enzymatic hydrolysis, and the results have been analyzed by analysis of variance (ANOVA). The tannase activity was found to be maximum at an incubation time of 5 min, reaction temperature of 40°C, pH 4.0, initial enzyme concentration of 0.12 v/v, initial substrate concentration of 0.42 mg/ml and ionic strength of 0.2 M; under these optimal conditions, the maximum rate of gallic acid production was 33.49 μmoles/ml/min.

  3. Empirically characteristic analysis of chaotic PID controlling particle swarm optimization.

    Directory of Open Access Journals (Sweden)

    Danping Yan

    Full Text Available Since chaos systems generally have the intrinsic properties of sensitivity to initial conditions, topological mixing and density of periodic orbits, they may tactfully use the chaotic ergodic orbits to achieve the global optimum or their better approximation to given cost functions with high probability. During the past decade, they have increasingly received much attention from academic community and industry society throughout the world. To improve the performance of particle swarm optimization (PSO), we herein propose a chaotic proportional integral derivative (PID) controlling PSO algorithm by the hybridization of chaotic logistic dynamics and hierarchical inertia weight. The hierarchical inertia weight coefficients are determined in accordance with the present fitness values of the local best positions so as to adaptively expand the particles' search space. Moreover, the chaotic logistic map is not only used in the substitution of the two random parameters affecting the convergence behavior, but also used in the chaotic local search for the global best position so as to easily avoid the particles' premature behaviors via the whole search space. Thereafter, the convergent analysis of chaotic PID controlling PSO is under deep investigation. Empirical simulation results demonstrate that compared with other several chaotic PSO algorithms like chaotic PSO with the logistic map, chaotic PSO with the tent map and chaotic catfish PSO with the logistic map, chaotic PID controlling PSO exhibits much better search efficiency and quality when solving the optimization problems. Additionally, the parameter estimation of a nonlinear dynamic system also further clarifies its superiority to chaotic catfish PSO, genetic algorithm (GA) and PSO.

  4. An empirical analysis of cigarette demand in Argentina.

    Science.gov (United States)

    Martinez, Eugenio; Mejia, Raul; Pérez-Stable, Eliseo J

    2015-01-01

    To estimate the long-term and short-term effects on cigarette demand in Argentina based on changes in cigarette price and income per person >14 years old. Public data from the Ministry of Economics and Production were analysed based on monthly time series data between 1994 and 2010. The econometric analysis used cigarette consumption per person >14 years of age as the dependent variable and the real income per person >14 years old and the real average price of cigarettes as independent variables. Empirical analyses were done to verify the order of integration of the variables, to test for cointegration to capture the long-term effects and to capture the short-term dynamics of the variables. The demand for cigarettes in Argentina was affected by changes in real income and the real average price of cigarettes. The long-term income elasticity was equal to 0.43, while the own-price elasticity was equal to -0.31, indicating a 10% increase in the growth of real income led to an increase in cigarette consumption of 4.3% and a 10% increase in the price produced a fall of 3.1% in cigarette consumption. The vector error correction model estimated that the short-term income elasticity was 0.25 and the short-term own-price elasticity of cigarette demand was -0.15. A simulation exercise showed that increasing the price of cigarettes by 110% would maximise revenues and result in a potentially large decrease in total cigarette consumption. Econometric analyses of cigarette consumption and their relationship with cigarette price and income can provide valuable information for developing cigarette price policy.

  5. A study of operational and testing reliability in software reliability analysis

    International Nuclear Information System (INIS)

    Yang, B.; Xie, M.

    2000-01-01

    Software reliability is an important aspect of any complex equipment today. Software reliability is usually estimated based on reliability models such as nonhomogeneous Poisson process (NHPP) models. Software systems improve during the testing phase, while they normally do not change during the operational phase. Depending on whether the reliability is to be predicted for the testing phase or the operational phase, different measures should be used. In this paper, two different reliability concepts, namely the operational reliability and the testing reliability, are clarified and studied in detail. These concepts have been mixed up or even misused in some of the existing literature. Using different reliability concepts will lead to different reliability values, and will further lead to different reliability-based decisions. The difference between the estimated reliabilities is studied and the effect on the optimal release time is investigated.
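    A hedged sketch of the NHPP view mentioned above, using the Goel-Okumoto mean value function m(t) = a(1 - exp(-b t)) with assumed parameters: the operational reliability over a mission of length x after testing up to time t is R(x|t) = exp(-(m(t+x) - m(t))), and it improves as testing continues.

        import math

        def m(t, a=120.0, b=0.02):
            # Goel-Okumoto mean value function: expected failures observed by time t (assumed a, b)
            return a * (1.0 - math.exp(-b * t))

        def reliability(x, t, a=120.0, b=0.02):
            # Probability of no failure in (t, t + x] for an NHPP with mean value function m
            return math.exp(-(m(t + x, a, b) - m(t, a, b)))

        # Reliability over a 10-hour mission after 100 versus 300 hours of testing
        print(reliability(10.0, 100.0))
        print(reliability(10.0, 300.0))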

  6. ZERBERUS - the code for reliability analysis of crack containing structures

    International Nuclear Information System (INIS)

    Cizelj, L.; Riesch-Oppermann, H.

    1992-04-01

    A brief description of the First- and Second-Order Reliability Methods, which form the theoretical background of the code, is given. The code structure is described in detail, with special emphasis on the new application fields. The numerical example investigates the failure probability of steam generator tubing affected by stress corrosion cracking. The changes necessary to accommodate this analysis within the ZERBERUS code are explained. Analysis results are compared to different Monte Carlo techniques. (orig./HP)
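    A minimal numeric sketch of the first-order reliability idea behind the code: for a linear limit state g = R - S with independent normal resistance R and load S, the Hasofer-Lind reliability index has the closed form below and the failure probability is Phi(-beta); the means and standard deviations are illustrative, not values from the steam generator example.

        import math
        from scipy.stats import norm

        # Assumed independent normal resistance R and load effect S
        mu_R, sd_R = 300.0, 30.0
        mu_S, sd_S = 200.0, 40.0

        beta = (mu_R - mu_S) / math.hypot(sd_R, sd_S)   # Hasofer-Lind index for g = R - S
        pf = norm.cdf(-beta)                            # first-order failure probability
        print(f"beta = {beta:.3f}, Pf = {pf:.3e}")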

  7. Accident Sequence Evaluation Program: Human reliability analysis procedure

    Energy Technology Data Exchange (ETDEWEB)

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis With emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the ''ASEP HRA Procedure,'' is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs.

  8. Accident Sequence Evaluation Program: Human reliability analysis procedure

    International Nuclear Information System (INIS)

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis With emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the ''ASEP HRA Procedure,'' is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs

  9. Univariate and Bivariate Empirical Mode Decomposition for Postural Stability Analysis

    Directory of Open Access Journals (Sweden)

    Jacques Duchêne

    2008-05-01

    Full Text Available The aim of this paper was to compare empirical mode decomposition (EMD) and two new extended methods of EMD named complex empirical mode decomposition (complex-EMD) and bivariate empirical mode decomposition (bivariate-EMD). All methods were used to analyze stabilogram center of pressure (COP) time series. The two new methods are suitable to be applied to complex time series to extract complex intrinsic mode functions (IMFs) before the Hilbert transform is subsequently applied on the IMFs. The trace of the analytic IMF in the complex plane has a circular form, with each IMF having its own rotation frequency. The area of the circle and the average rotation frequency of IMFs represent efficient indicators of the postural stability status of subjects. Experimental results show the effectiveness of these indicators to identify differences in standing posture between groups.

  10. Maintenance management of railway infrastructures based on reliability analysis

    International Nuclear Information System (INIS)

    Macchi, Marco; Garetti, Marco; Centrone, Domenico; Fumagalli, Luca; Piero Pavirani, Gian

    2012-01-01

    Railway infrastructure maintenance plays a crucial role for rail transport. It aims at guaranteeing safety of operations and availability of railway tracks and related equipment for traffic regulation. Moreover, it is one major cost for rail transport operations. Thus, the increased competition in the traffic market calls for maintenance improvement, aiming at the reduction of maintenance expenditures while keeping the safety of operations. This issue is addressed by the methodology presented in the paper. The first step of the methodology consists of a family-based approach for the equipment reliability analysis; its purpose is the identification of families of railway items which can be given the same reliability targets. The second step builds the reliability model of the railway system for identifying the most critical items, given a required service level for the transportation system. The two methods have been implemented and tested in practical case studies, in the context of Rete Ferroviaria Italiana, the Italian public limited company for railway transportation.

  11. Reliability Analysis of Free Jet Scour Below Dams

    Directory of Open Access Journals (Sweden)

    Chuanqi Li

    2012-12-01

    Full Text Available Current formulas for calculating scour depth below a free overfall are mostly deterministic in nature and do not adequately consider the uncertainties of the various scouring parameters. A reliability-based assessment of scour, taking into account the uncertainties of the parameters and coefficients involved, should therefore be performed. This paper studies the reliability of a dam foundation under the threat of scour. A model for calculating the reliability of scour and estimating the probability of failure of the dam foundation subjected to scour is presented. The Maximum Entropy Method is applied to construct the probability density function (PDF) of the performance function subject to the moment constraints. Monte Carlo simulation (MCS) is applied for uncertainty analysis. An example is considered, and the reliability of its scour is computed; the influence of various random variables on the probability of failure is analyzed.

  12. New Empirical Evidence on the Validity and the Reliability of the Early Life Stress Questionnaire in a Polish Sample.

    Science.gov (United States)

    Sokołowski, Andrzej; Dragan, Wojciech Ł

    2017-01-01

    Background: The Early Life Stress Questionnaire (ELSQ) is widely used to estimate the prevalence of negative events during childhood, including emotional, physical, verbal, and sexual abuse, negligence, severe conflicts, separation, parental divorce, substance abuse, poverty, and so forth. Objective: This study presents the psychometric properties of the Polish adaptation of the ELSQ. It also verifies whether early life stress (ELS) is a good predictor of psychopathology symptoms during adulthood. Materials and Methods: We analyzed data from two samples. Sample 1 was selected by a random quota method from across the country and included 609 participants aged 18-50 years, 306 women (50.2%) and 303 men (49.8%). Sample 2 contained 503 young adults (253 women and 250 men) aged 18-25. Confirmatory and exploratory factor analyses were used to assess the ELSQ's internal consistency. The validity was based on the relation to psychopathological symptoms and substance misuse. Results: Results showed good internal consistency and validity. Exploratory factor analysis indicates a six-factor structure of the ELSQ. ELS was related to psychopathology in adulthood, including depressive, sociophobic, vegetative, and pain symptoms. The ELSQ score also correlated with alcohol use, but not with nicotine dependence. Moreover, ELS was correlated with stress in adulthood. Conclusion: The findings indicate that the Polish version of the ELSQ is a valid and reliable instrument for assessing ELS in the Polish population and may be applied in both clinical and community samples.

  13. Choosing a heuristic and root node for edge ordering in BDD-based network reliability analysis

    International Nuclear Information System (INIS)

    Mo, Yuchang; Xing, Liudong; Zhong, Farong; Pan, Zhusheng; Chen, Zhongyu

    2014-01-01

    In the Binary Decision Diagram (BDD)-based network reliability analysis, heuristics have been widely used to obtain a reasonably good ordering of edge variables. Orderings generated using different heuristics can lead to dramatically different sizes of BDDs, and thus dramatically different running times and memory usages for the analysis of the same network. Unfortunately, due to the nature of the ordering problem (i.e., being an NP-complete problem) no formal guidelines or rules are available for choosing a good heuristic or for choosing a high-performance root node to perform edge searching using a particular heuristic. In this work, we make novel contributions by proposing heuristic and root node selection methods based on the concept of boundary sets for the BDD-based network reliability analysis. Empirical studies show that the proposed selection methods can help to generate high-performance edge orderings for most of the studied cases, enabling the efficient BDD-based reliability analysis of large-scale networks. The proposed methods are demonstrated on different types of networks, including square lattice networks, torus lattice networks and de Bruijn networks.
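
    The boundary-set idea mentioned above can be illustrated with a small sketch: after the first k edges of an ordering are processed, the boundary set is the set of nodes touched by both processed and unprocessed edges, and orderings with small boundary sets tend to keep intermediate BDDs small. The graph, the BFS-style ordering and the scoring below are simplified assumptions for illustration, not the selection methods proposed in the paper.

```python
# Illustrative sketch of the boundary-set idea used to compare edge orderings
# and root nodes. The graph, the BFS ordering, and the scoring are simplified
# assumptions, not the paper's method.
from collections import deque

edges = [("s", "a"), ("s", "b"), ("a", "b"), ("a", "t"), ("b", "t")]  # bridge network

def bfs_edge_ordering(edges, root):
    """Order edges by breadth-first search from a chosen root node."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append((u, v))
        adj.setdefault(v, []).append((u, v))
    seen_edges, order, visited, queue = set(), [], {root}, deque([root])
    while queue:
        u = queue.popleft()
        for e in adj[u]:
            if e not in seen_edges:
                seen_edges.add(e)
                order.append(e)
                w = e[1] if e[0] == u else e[0]
                if w not in visited:
                    visited.add(w)
                    queue.append(w)
    return order

def boundary_set_sizes(order):
    """Boundary set size after each prefix of the edge ordering."""
    sizes = []
    for k in range(1, len(order)):
        processed = {v for e in order[:k] for v in e}
        unprocessed = {v for e in order[k:] for v in e}
        sizes.append(len(processed & unprocessed))
    return sizes

for root in ["s", "a", "t"]:
    order = bfs_edge_ordering(edges, root)
    sizes = boundary_set_sizes(order)
    print(f"root {root}: ordering {order}, boundary sizes {sizes}, max {max(sizes)}")
```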

  14. An exact method for solving logical loops in reliability analysis

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2009-01-01

    This paper presents an exact method for solving logical loops in reliability analysis. The systems that include logical loops are usually described by simultaneous Boolean equations. First, a basic rule for solving simultaneous Boolean equations is presented. Next, the analysis procedures for a three-component system with external supports are shown. Third, more detailed discussions are given on the establishment of the logical loop relation. Finally, two typical structures which include more than one logical loop are taken up. Their analysis results and corresponding GO-FLOW charts are given. The proposed analytical method is applicable to loop structures that can be described by simultaneous Boolean equations, and it is very useful in evaluating the reliability of complex engineering systems.

  15. Reliability analysis of service water system under earthquake

    International Nuclear Information System (INIS)

    Yu Yu; Qian Xiaoming; Lu Xuefeng; Wang Shengfei; Niu Fenglei

    2013-01-01

    The service water system is one of the important safety systems in a nuclear power plant, and its failure probability is usually obtained by system reliability analysis. The probability of equipment failure under an earthquake is a function of the peak acceleration of the earthquake motion, while the occurrence of an earthquake is random; thus the traditional fault tree method used in current probabilistic safety assessment is not powerful enough to deal with such a conditional probability problem. An analysis framework for system reliability evaluation under seismic conditions is put forward in this paper, in which Monte Carlo simulation is used to deal with the conditional probability problem. The annual failure probability of the service water system was calculated, and a failure probability of 1.46×10⁻⁴ per year was obtained. The analysis result is in accordance with the data that indicate the equipment's seismic resistance capability, and the rationality of the model is validated. (authors)
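
    A minimal sketch of the conditional-probability treatment described above is given below: the annual peak ground acceleration is sampled from a simple hazard model and a lognormal fragility curve is evaluated for the system, with Monte Carlo averaging over years. All rates, distributions and parameters are hypothetical placeholders rather than the plant-specific data of the paper.

```python
# Sketch of the conditional-probability treatment: sample the annual peak
# ground acceleration from a simple hazard model and evaluate a lognormal
# fragility curve for the system. All rates, distributions and parameters are
# hypothetical placeholders, not the plant-specific data of the paper.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n_years = 2_000_000

# hypothetical hazard: an earthquake occurs in a given year with rate 0.05/yr,
# and its PGA (in g) follows a lognormal distribution
quake = rng.random(n_years) < 0.05
pga = np.where(quake, rng.lognormal(mean=np.log(0.08), sigma=0.9, size=n_years), 0.0)

# hypothetical lognormal fragility: median capacity 0.6 g, log-standard deviation 0.4
a_m, beta = 0.6, 0.4
p_fail_given_pga = np.where(
    pga > 0.0, norm.cdf(np.log(np.maximum(pga, 1e-12) / a_m) / beta), 0.0
)

fails = rng.random(n_years) < p_fail_given_pga
print(f"Estimated annual failure probability: {fails.mean():.2e}")
```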

  16. A framework for intelligent reliability centered maintenance analysis

    International Nuclear Information System (INIS)

    Cheng Zhonghua; Jia Xisheng; Gao Ping; Wu Su; Wang Jianzhao

    2008-01-01

    To improve the efficiency of reliability-centered maintenance (RCM) analysis, case-based reasoning (CBR), as a kind of artificial intelligence (AI) technology, was successfully introduced into the RCM analysis process, and a framework for intelligent RCM analysis (IRCMA) was studied. The idea for IRCMA is based on the fact that the historical records of RCM analysis on similar items can be referenced and used for the current RCM analysis of a new item. Because many common or similar items may exist in the analyzed equipment, the repeated tasks of RCM analysis can be considerably simplified or avoided by revising the similar cases in conducting RCM analysis. Based on previous theoretical studies, an intelligent RCM analysis system (IRCMAS) prototype was developed. This research focuses on the definition, basic principles, and framework of IRCMA, and discusses the critical techniques involved. Finally, the IRCMAS prototype is presented based on a case study

  17. Ownership dynamics with large shareholders : An empirical analysis

    NARCIS (Netherlands)

    Donelli, M.; Urzua Infante, F.; Larrain, B.

    2013-01-01

    We study the empirical determinants of corporate ownership dynamics in a market where large shareholders are prevalent. We use a unique, hand-collected 20-year dataset on the ownership structure of Chilean companies. Controllers’ blockholdings are on average high -as in continental Europe, for

  18. WHAT FACTORS INFLUENCE QUALITY SERVICE IMPROVEMENT IN MONTENEGRO: EMPIRICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Djurdjica Perovic

    2013-03-01

    Full Text Available In this paper, using an Ordinary Least Squares (OLS) regression, we investigate whether intangible elements influence tourists' perception of service quality. Our empirical results, based on a tourist survey in Montenegro, indicate that the intangible elements of the tourism product have a positive impact on tourists' overall perception of service quality in Montenegro.

  19. Reliability test and failure analysis of high power LED packages

    International Nuclear Information System (INIS)

    Chen Zhaohui; Zhang Qin; Wang Kai; Luo Xiaobing; Liu Sheng

    2011-01-01

    A new type application specific light emitting diode (LED) package (ASLP) with freeform polycarbonate lens for street lighting is developed, whose manufacturing processes are compatible with a typical LED packaging process. The reliability test methods and failure criteria from different vendors are reviewed and compared. It is found that test methods and failure criteria are quite different. Rapid reliability assessment standards are urgently needed for the LED industry. 85 °C/85% RH with 700 mA is used to test our LED modules and those of three other vendors for 1000 h, showing no visible degradation in optical performance for our modules, while the modules of two other vendors show significant degradation. Some failure analysis methods such as C-SAM, nano X-ray CT and optical microscopy are used for LED packages. Some failure mechanisms such as delaminations and cracks are detected in the LED packages after the accelerated reliability testing. The finite element simulation method is helpful for the failure analysis and reliability design of the LED packaging. One example is used to show that a module currently used in industry is vulnerable and may not easily pass harsh thermal cycle testing. (semiconductor devices)

  20. A Reliable and Valid Weighted Scoring Instrument for Use in Grading APA-Style Empirical Research Report

    Science.gov (United States)

    Greenberg, Kathleen Puglisi

    2012-01-01

    The scoring instrument described in this article is based on a deconstruction of the seven sections of an American Psychological Association (APA)-style empirical research report into a set of learning outcomes divided into content-, expression-, and format-related categories. A double-weighting scheme used to score the report yields a final grade…

  1. Opposing the nuclear threat: The convergence of moral analysis and empirical data

    International Nuclear Information System (INIS)

    Hehir, J.B.

    1986-01-01

    This paper examines the concept of nuclear winter from the perspective of religious and moral values. The objective is to identify points of intersection between the empirical arguments about nuclear winter and ethical perspectives on nuclear war. The analysis moves through three steps: (1) the context of the nuclear debate; (2) the ethical and empirical contributions to the nuclear debate; and (3) implications for policy drawn from the ethical-empirical data

  2. Reliability analysis of maintenance operations for railway tracks

    International Nuclear Information System (INIS)

    Rhayma, N.; Bressolette, Ph.; Breul, P.; Fogli, M.; Saussine, G.

    2013-01-01

    Railway engineering is confronted with problems due to degradation of the railway network that requires significant and costly maintenance work. However, because of the lack of knowledge on the geometrical and mechanical parameters of the track, it is difficult to optimize the maintenance management. In this context, this paper presents a new methodology to analyze the behavior of railway tracks. It combines new diagnostic devices, which make it possible to obtain a large amount of data and thus to derive statistics on the geometric and mechanical parameters, with a non-intrusive stochastic approach which can be coupled with any mechanical model. Numerical results show the possibilities of this methodology for reliability analysis of different maintenance operations. In the future this approach will give important information to railway managers to optimize maintenance operations using a reliability analysis

  3. Reliability analysis based on the losses from failures.

    Science.gov (United States)

    Todinov, M T

    2006-04-01

    The conventional reliability analysis is based on the premise that increasing the reliability of a system will decrease the losses from failures. On the basis of counterexamples, it is demonstrated that this is valid only if all failures are associated with the same losses. In case of failures associated with different losses, a system with larger reliability is not necessarily characterized by smaller losses from failures. Consequently, a theoretical framework and models are proposed for a reliability analysis, linking reliability and the losses from failures. Equations related to the distributions of the potential losses from failure have been derived. It is argued that the classical risk equation only estimates the average value of the potential losses from failure and does not provide insight into the variability associated with the potential losses. Equations have also been derived for determining the potential and the expected losses from failures for nonrepairable and repairable systems with components arranged in series, with arbitrary life distributions. The equations are also valid for systems/components with multiple mutually exclusive failure modes. The expected losses given failure is a linear combination of the expected losses from failure associated with the separate failure modes scaled by the conditional probabilities with which the failure modes initiate failure. On this basis, an efficient method for simplifying complex reliability block diagrams has been developed. Branches of components arranged in series whose failures are mutually exclusive can be reduced to single components with equivalent hazard rate, downtime, and expected costs associated with intervention and repair. A model for estimating the expected losses from early-life failures has also been developed. For a specified time interval, the expected losses from early-life failures are a sum of the products of the expected number of failures in the specified time intervals covering the
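
    The statement that the expected losses given failure are a linear combination of the mode-specific expected losses, weighted by the conditional probabilities that each mutually exclusive failure mode initiates failure, can be illustrated numerically. The figures below are hypothetical.

```python
# Numerical illustration of the expected-losses statement above: expected
# losses given failure = sum over mutually exclusive failure modes of
# (conditional probability of initiating failure) x (expected loss of that mode).
# All numbers are hypothetical.
failure_modes = {
    #  mode:       (conditional probability of initiating failure, expected loss [$])
    "fatigue":     (0.50, 120_000.0),
    "overload":    (0.30,  40_000.0),
    "corrosion":   (0.20, 250_000.0),
}

expected_loss_given_failure = sum(p * loss for p, loss in failure_modes.values())
print(f"Expected losses given failure: {expected_loss_given_failure:,.0f} $")

# combined with a failure probability over the time interval, the risk value follows
p_failure = 0.02
print(f"Expected losses over the interval: {p_failure * expected_loss_given_failure:,.0f} $")
```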

  4. Economic Growth and the Environment. An empirical analysis

    Energy Technology Data Exchange (ETDEWEB)

    De Bruyn, S.M.

    1999-12-21

    A number of economists have claimed that economic growth benefits environmental quality as it raises political support and financial means for environmental policy measures. Since the early 1990s this view has increasingly been supported by empirical evidence that has challenged the traditional belief held by environmentalists that economic growth degrades the environment. This study investigates the relationship between economic growth and environmental quality and elaborates on the question of whether economic growth can be combined with a reduced demand for natural resources. Various hypotheses on this relationship are described and empirically tested for a number of indicators of environmental pressure. The outcome of the tests advocates the use of alternative models for estimation that alter conclusions about the relationship between economic growth and the environment and give insight into the driving forces of emission reduction in developed economies. refs.

  5. The Environmental Kuznets Curve. An empirical analysis for OECD countries

    Energy Technology Data Exchange (ETDEWEB)

    Georgiev, E.

    2008-09-15

    This paper tests the Environmental Kuznets Curve hypothesis for four local (SOx, NOx, CO, VOC) and two global (CO2, GHG) air pollutants. Using a new panel data set of thirty OECD countries, the paper finds that the postulated inverted U-shaped relationship between income and pollution does not hold for all gases. A meaningful Environmental Kuznets Curve exists only for CO, VOC and NOx, while for CO2 the curve is monotonically increasing. For GHG there is an indication of an inverted U-shaped relationship between income and pollution, but most countries are still on the increasing path of the curve and hence the future development of the curve is uncertain. For SOx it was found that emissions follow a U-shaped curve. Based on the empirical results, the paper concludes that the Environmental Kuznets Curve does not hold for all gases; it is an empirical artefact rather than a regularity.

  6. The Environmental Kuznets Curve. An empirical analysis for OECD countries

    International Nuclear Information System (INIS)

    Georgiev, E.

    2008-09-01

    This paper tests the Environmental Kuznets Curve hypothesis for four local (SOx, NOx, CO, VOC) and two global (CO2, GHG) air pollutants. Using a new panel data set of thirty OECD countries, the paper finds that the postulated inverted U-shaped relationship between income and pollution does not hold for all gases. A meaningful Environmental Kuznets Curve exists only for CO, VOC and NOx, while for CO2 the curve is monotonically increasing. For GHG there is an indication of an inverted U-shaped relationship between income and pollution, but most countries are still on the increasing path of the curve and hence the future development of the curve is uncertain. For SOx it was found that emissions follow a U-shaped curve. Based on the empirical results, the paper concludes that the Environmental Kuznets Curve does not hold for all gases; it is an empirical artefact rather than a regularity.

  7. Does gender equality promote social trust? An empirical analysis

    OpenAIRE

    Seo-Young Cho

    2015-01-01

    Fairness can be an important factor that promotes social trust among people. In this paper, I investigate empirically whether fairness between men and women increases social trust. Using the data of the World Value Survey from 91 countries, I find that gender discriminatory values negatively affect the trust level of both men and women, while actual conditions on gender equality, measured by labor and educational attainments and political participation, are not a significant determinant of so...

  8. Tax morale : theory and empirical analysis of tax compliance

    OpenAIRE

    Torgler, Benno

    2003-01-01

    Tax morale is puzzling in our society. Observations show that tax compliance cannot be satisfactorily explained by the level of enforcement. Other factors may well be relevant. This paper contains a short survey of important theoretical and empirical findings in the tax morale literature, focussing on personal income tax morale. The following three key topics are discussed: moral sentiments, fairness and the relationship between taxpayer and government. The survey stresses the ...

  9. Empirical fractal geometry analysis of some speculative financial bubbles

    Science.gov (United States)

    Redelico, Francisco O.; Proto, Araceli N.

    2012-11-01

    Empirical evidence of a multifractal signature during the growth of a financial bubble leading to a crash is presented. The April 2000 crash in the NASDAQ composite index and a time series from the discrete Chakrabarti-Stinchcombe model for earthquakes are analyzed using a geometric approach and some common patterns are identified. These patterns can relate the geometry of the rising period of a financial bubble to the non-concave entropy problem.

  10. Solid Rocket Booster Large Main and Drogue Parachute Reliability Analysis

    Science.gov (United States)

    Clifford, Courtenay B.; Hengel, John E.

    2009-01-01

    The parachutes on the Space Transportation System (STS) Solid Rocket Booster (SRB) are the means for decelerating the SRB and allowing it to impact the water at a nominal vertical velocity of 75 feet per second. Each SRB has one pilot, one drogue, and three main parachutes. About four minutes after SRB separation, the SRB nose cap is jettisoned, deploying the pilot parachute. The pilot chute then deploys the drogue parachute. The drogue chute provides initial deceleration and proper SRB orientation prior to frustum separation. At frustum separation, the drogue pulls the frustum from the SRB and allows the main parachutes that are mounted in the frustum to unpack and inflate. These chutes are retrieved, inspected, cleaned, repaired as needed, and returned to the flight inventory and reused. Over the course of the Shuttle Program, several improvements have been introduced to the SRB main parachutes. A major change was the replacement of the small (115 ft. diameter) main parachutes with the larger (136 ft. diameter) main parachutes. Other modifications were made to the main parachutes, main parachute support structure, and SRB frustum to eliminate failure mechanisms, improve damage tolerance, and improve deployment and inflation characteristics. This reliability analysis is limited to the examination of the SRB Large Main Parachute (LMP) and drogue parachute failure history to assess the reliability of these chutes. From the inventory analysis, 68 Large Main Parachutes were used in 651 deployments, and 7 chute failures occurred in the 651 deployments. Logistic regression was used to analyze the LMP failure history, and it showed that reliability growth has occurred over the period of use resulting in a current chute reliability of R = .9983. This result was then used to determine the reliability of the 3 LMPs on the SRB, when all must function. There are 29 drogue parachutes that were used in 244 deployments, and no in-flight failures have occurred. Since there are no
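
    A back-of-the-envelope check of the figures quoted above can be written in a few lines, under the simplifying assumption of independent, identically reliable chutes; the logistic-regression reliability-growth model used in the actual analysis is not reproduced here.

```python
# Back-of-the-envelope check of the quoted parachute figures, assuming
# independent, identically reliable chutes. The reliability-growth logistic
# regression of the report is not reproduced here.
deployments, failures = 651, 7
naive_reliability = 1.0 - failures / deployments          # pooled estimate, ignoring growth
print(f"Pooled per-chute reliability estimate: {naive_reliability:.4f}")

current_reliability = 0.9983                              # value reported after growth analysis
all_three_work = current_reliability ** 3                 # all 3 main chutes must function
print(f"Probability that all three main parachutes function: {all_three_work:.4f}")
```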

  11. Inclusion of task dependence in human reliability analysis

    International Nuclear Information System (INIS)

    Su, Xiaoyan; Mahadevan, Sankaran; Xu, Peida; Deng, Yong

    2014-01-01

    Dependence assessment among human errors in human reliability analysis (HRA) is an important issue, which includes the evaluation of the dependence among human tasks and the effect of the dependence on the final human error probability (HEP). This paper presents a computational model to handle dependence in human reliability analysis. The aim of the study is to automatically provide conclusions on the overall degree of dependence and calculate the conditional human error probability (CHEP) once the judgments of the input factors are given. The dependence influencing factors are first identified by the experts and the priorities of these factors are also taken into consideration. Anchors and qualitative labels are provided as guidance for the HRA analyst's judgment of the input factors. The overall degree of dependence between human failure events is calculated based on the input values and the weights of the input factors. Finally, the CHEP is obtained according to a computing formula derived from the technique for human error rate prediction (THERP) method. The proposed method is able to quantify the subjective judgment from the experts and improve the transparency in the HEP evaluation process. Two examples are illustrated to show the effectiveness and the flexibility of the proposed method. - Highlights: • We propose a computational model to handle dependence in human reliability analysis. • The priorities of the dependence influencing factors are taken into consideration. • The overall dependence degree is determined by input judgments and the weights of factors. • The CHEP is obtained according to a computing formula derived from THERP
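
    The THERP dependence equations referred to above map a nominal HEP and a dependence level (zero, low, moderate, high, complete) to a conditional HEP. A minimal sketch of those standard formulas follows; the part the paper adds, computing the overall degree of dependence from weighted expert judgments, is not reproduced here.

```python
# The conditional HEP formulas from the THERP method (NUREG/CR-1278) that the
# proposed model builds on. Mapping a computed "overall degree of dependence"
# onto one of the five THERP levels is the part this sketch leaves out.
def conditional_hep(hep: float, dependence: str) -> float:
    """THERP conditional human error probability for a given dependence level."""
    formulas = {
        "zero":     lambda p: p,
        "low":      lambda p: (1 + 19 * p) / 20,
        "moderate": lambda p: (1 + 6 * p) / 7,
        "high":     lambda p: (1 + p) / 2,
        "complete": lambda p: 1.0,
    }
    return formulas[dependence](hep)

hep = 0.003   # nominal HEP of the dependent task
for level in ["zero", "low", "moderate", "high", "complete"]:
    print(f"{level:9s}: CHEP = {conditional_hep(hep, level):.4f}")
```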

  12. High-Reliable PLC RTOS Development and RPS Structure Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, H. S.; Song, D. Y.; Sohn, D. S.; Kim, J. H. [Enersys Co., Daejeon (Korea, Republic of)

    2008-04-15

    One of the KNICS objectives is to develop a platform for Nuclear Power Plant (NPP) I and C (Instrumentation and Control) systems, especially the plant protection system. The developed platform is POSAFE-Q, and this work supports the development of POSAFE-Q with the development of a highly reliable real-time operating system (RTOS) and programmable logic device (PLD) software. Another KNICS objective is to develop safety I and C systems, such as the Reactor Protection System (RPS) and Engineered Safety Feature-Component Control System (ESF-CCS). This work plays an important role in the structure analysis for the RPS. Validation and verification (V and V) of the safety critical software is essential work to make the digital plant protection system highly reliable and safe. Generally, the reliability and safety of a software-based system can be improved by a strict quality assurance framework including the software development itself. In other words, through V and V, the reliability and safety of a system can be improved, and the development activities like software requirement specification, software design specification, component tests, integration tests, and system tests shall be appropriately documented for V and V.

  13. High-Reliable PLC RTOS Development and RPS Structure Analysis

    International Nuclear Information System (INIS)

    Sohn, H. S.; Song, D. Y.; Sohn, D. S.; Kim, J. H.

    2008-04-01

    One of the KNICS objectives is to develop a platform for Nuclear Power Plant (NPP) I and C (Instrumentation and Control) systems, especially the plant protection system. The developed platform is POSAFE-Q, and this work supports the development of POSAFE-Q with the development of a highly reliable real-time operating system (RTOS) and programmable logic device (PLD) software. Another KNICS objective is to develop safety I and C systems, such as the Reactor Protection System (RPS) and Engineered Safety Feature-Component Control System (ESF-CCS). This work plays an important role in the structure analysis for the RPS. Validation and verification (V and V) of the safety critical software is essential work to make the digital plant protection system highly reliable and safe. Generally, the reliability and safety of a software-based system can be improved by a strict quality assurance framework including the software development itself. In other words, through V and V, the reliability and safety of a system can be improved, and the development activities like software requirement specification, software design specification, component tests, integration tests, and system tests shall be appropriately documented for V and V.

  14. Reliability analysis with linguistic data: An evidential network approach

    International Nuclear Information System (INIS)

    Zhang, Xiaoge; Mahadevan, Sankaran; Deng, Xinyang

    2017-01-01

    In practical applications of reliability assessment of a system in-service, information about the condition of a system and its components is often available in text form, e.g., inspection reports. Estimation of the system reliability from such text-based records becomes a challenging problem. In this paper, we propose a four-step framework to deal with this problem. In the first step, we construct an evidential network with the consideration of available knowledge and data. Secondly, we train a Naive Bayes text classification algorithm based on the past records. By using the trained Naive Bayes algorithm to classify the new records, we build interval basic probability assignments (BPA) for each new record available in text form. Thirdly, we combine the interval BPAs of multiple new records using an evidence combination approach based on evidence theory. Finally, we propagate the interval BPA through the evidential network constructed earlier to obtain the system reliability. Two numerical examples are used to demonstrate the efficiency of the proposed method. We illustrate the effectiveness of the proposed method by comparing with Monte Carlo Simulation (MCS) results. - Highlights: • We model reliability analysis with linguistic data using evidential network. • Two examples are used to demonstrate the efficiency of the proposed method. • We compare the results with Monte Carlo Simulation (MCS).
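
    The combination step can be illustrated with the classical (point-valued) Dempster rule on a two-element frame {working, failed}; the paper itself combines interval BPAs, which is not shown here, and the mass values below are hypothetical.

```python
# Classical Dempster's rule of combination on the frame {W (working), F (failed)}.
# The paper combines *interval* BPAs built from classified text records; this
# sketch only shows the point-valued rule, with hypothetical mass values.
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments over frozenset focal elements."""
    combined, conflict = {}, 0.0
    for (a, va), (b, vb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + va * vb
        else:
            conflict += va * vb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

W, F = frozenset({"W"}), frozenset({"F"})
WF = W | F   # ignorance (the whole frame)

m_record1 = {W: 0.6, F: 0.1, WF: 0.3}   # evidence from inspection record 1
m_record2 = {W: 0.5, F: 0.2, WF: 0.3}   # evidence from inspection record 2

m = dempster_combine(m_record1, m_record2)
for focal, mass in m.items():
    print(f"m({set(focal)}) = {mass:.3f}")
```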

  15. Reliability analysis for new technology-based transmitters

    Energy Technology Data Exchange (ETDEWEB)

    Brissaud, Florent, E-mail: florent.brissaud.2007@utt.f [Institut National de l' Environnement Industriel et des Risques (INERIS), Parc Technologique Alata, BP 2, 60550 Verneuil-en-Halatte (France); Universite de Technologie de Troyes (UTT), Institut Charles Delaunay (ICD) and STMR UMR CNRS 6279, 12 rue Marie Curie, BP 2060, 10010 Troyes cedex (France); Barros, Anne; Berenguer, Christophe [Universite de Technologie de Troyes (UTT), Institut Charles Delaunay (ICD) and STMR UMR CNRS 6279, 12 rue Marie Curie, BP 2060, 10010 Troyes cedex (France); Charpentier, Dominique [Institut National de l' Environnement Industriel et des Risques (INERIS), Parc Technologique Alata, BP 2, 60550 Verneuil-en-Halatte (France)

    2011-02-15

    The reliability analysis of new technology-based transmitters has to deal with specific issues: various interactions between both material elements and functions, undefined behaviours under faulty conditions, several transmitted data, and little reliability feedback. To handle these particularities, a '3-step' model is proposed, based on goal tree-success tree (GTST) approaches to represent both the functional and material aspects, and includes the faults and failures as a third part for supporting reliability analyses. The behavioural aspects are provided by relationship matrices, also denoted master logic diagrams (MLD), with stochastic values which represent direct relationships between system elements. Relationship analyses are then proposed to assess the effect of any fault or failure on any material element or function. Taking these relationships into account, the probabilities of malfunction and failure modes are evaluated according to time. Furthermore, uncertainty analyses tend to show that even if the input data and system behaviour are not well known, these previous results can be obtained in a relatively precise way. An illustration is provided by a case study on an infrared gas transmitter. These properties make the proposed model and corresponding reliability analyses especially suitable for intelligent transmitters (or 'smart sensors').

  16. Reliability engineering analysis of ATLAS data reprocessing campaigns

    International Nuclear Information System (INIS)

    Vaniachine, A; Golubkov, D; Karpenko, D

    2014-01-01

    During three years of LHC data taking, the ATLAS collaboration completed three petascale data reprocessing campaigns on the Grid, with up to 2 PB of data being reprocessed every year. In reprocessing on the Grid, failures can occur for a variety of reasons, while Grid heterogeneity makes failures hard to diagnose and repair quickly. As a result, Big Data processing on the Grid must tolerate a continuous stream of failures, errors and faults. While ATLAS fault-tolerance mechanisms improve the reliability of Big Data processing in the Grid, their benefits come at costs and result in delays making the performance prediction difficult. Reliability Engineering provides a framework for fundamental understanding of the Big Data processing on the Grid, which is not a desirable enhancement but a necessary requirement. In ATLAS, cost monitoring and performance prediction became critical for the success of the reprocessing campaigns conducted in preparation for the major physics conferences. In addition, our Reliability Engineering approach supported continuous improvements in data reprocessing throughput during LHC data taking. The throughput doubled in 2011 vs. 2010 reprocessing, then quadrupled in 2012 vs. 2011 reprocessing. We present the Reliability Engineering analysis of ATLAS data reprocessing campaigns providing the foundation needed to scale up the Big Data processing technologies beyond the petascale.

  17. Some developments in human reliability analysis approaches and tools

    Energy Technology Data Exchange (ETDEWEB)

    Hannaman, G W; Worledge, D H

    1988-01-01

    Since human actions have been recognized as an important contributor to safety of operating plants in most industries, research has been performed to better understand and account for the way operators interact during accidents through the control room and equipment interface. This paper describes the integration of a series of research projects sponsored by the Electric Power Research Institute to strengthen the methods for performing the human reliability analysis portion of the probabilistic safety studies. It focuses on the analytical framework used to guide the analysis, the development of the models for quantifying time-dependent actions, and simulator experiments used to validate the models.

  18. ANALYSIS OF AVAILABILITY AND RELIABILITY IN RHIC OPERATIONS

    International Nuclear Information System (INIS)

    PILAT, F.; INGRASSIA, P.; MICHNOFF, R.

    2006-01-01

    RHIC has been successfully operated for 5 years as a collider for different species, ranging from heavy ions including gold and copper, to polarized protons. We present a critical analysis of reliability data for RHIC that not only identifies the principal factors limiting availability but also evaluates critical choices made at design time and assesses their impact on present machine performance. RHIC availability data are typical when compared to similar high-energy colliders. The critical analysis of operations data is the basis for studies and plans to improve RHIC machine availability beyond the 50-60% typical of high-energy colliders

  19. A comparative reliability analysis of free-piston Stirling machines

    Science.gov (United States)

    Schreiber, Jeffrey G.

    2001-02-01

    A free-piston Stirling power convertor is being developed for use in an advanced radioisotope power system to provide electric power for NASA deep space missions. These missions are typically long lived, lasting for up to 14 years. The Department of Energy (DOE) is responsible for providing the radioisotope power system for the NASA missions, and has managed the development of the free-piston power convertor for this application. The NASA Glenn Research Center has been involved in the development of Stirling power conversion technology for over 25 years and is currently providing support to DOE. Due to the nature of the potential missions, long life and high reliability are important features for the power system. Substantial resources have been spent on the development of long life Stirling cryocoolers for space applications. As a very general statement, free-piston Stirling power convertors have many features in common with free-piston Stirling cryocoolers, however there are also significant differences. For example, designs exist for both power convertors and cryocoolers that use the flexure bearing support system to provide noncontacting operation of the close-clearance moving parts. This technology and the operating experience derived from one application may be readily applied to the other application. This similarity does not pertain in the case of outgassing and contamination. In the cryocooler, the contaminants normally condense in the critical heat exchangers and foul the performance. In the Stirling power convertor just the opposite is true as contaminants condense on non-critical surfaces. A methodology was recently published that provides a relative comparison of reliability, and is applicable to systems. The methodology has been applied to compare the reliability of a Stirling cryocooler relative to that of a free-piston Stirling power convertor. The reliability analysis indicates that the power convertor should be able to have superior reliability

  20. Modeling of seismic hazards for dynamic reliability analysis

    International Nuclear Information System (INIS)

    Mizutani, M.; Fukushima, S.; Akao, Y.; Katukura, H.

    1993-01-01

    This paper investigates the appropriate indices of seismic hazard curves (SHCs) for seismic reliability analysis. In most seismic reliability analyses of structures, the seismic hazards are defined in the form of the SHCs of peak ground accelerations (PGAs). Usually PGAs play a significant role in characterizing ground motions. However, PGA is not always a suitable index of seismic motions. When random vibration theory developed in the frequency domain is employed to obtain statistics of responses, it is more convenient for the implementation of dynamic reliability analysis (DRA) to utilize an index which can be determined in the frequency domain. In this paper, we summarize relationships among the indices which characterize ground motions. The relationships between the indices and the magnitude M are arranged as well. In this consideration, duration time plays an important role in relating two distinct classes, i.e. the energy class and the power class. Fourier and energy spectra are involved in the energy class, and power and response spectra and PGAs are involved in the power class. These relationships are also investigated by using ground motion records. Through these investigations, we have shown the efficiency of employing the total energy as an index of SHCs, which can be determined in the time and frequency domains and has less variance than the other indices. In addition, we have proposed the procedure of DRA based on total energy. (author)
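
    As a simple illustration of an energy-type ground-motion index of the kind advocated above, the sketch below integrates the squared acceleration of a synthetic record and also reports the Arias intensity, which is that integral scaled by pi/(2g). The synthetic accelerogram is purely illustrative and is not one of the records used in the paper.

```python
# Energy-type ground motion index: integral of squared acceleration over the
# record; Arias intensity is this integral scaled by pi/(2g). The accelerogram
# is synthetic (enveloped white noise), purely for illustration.
import numpy as np

g = 9.81
fs = 100.0
t = np.arange(0.0, 20.0, 1.0 / fs)

rng = np.random.default_rng(4)
envelope = np.exp(-((t - 6.0) / 3.0) ** 2)                 # crude strong-motion envelope
acc = 0.15 * g * envelope * rng.standard_normal(t.size)    # synthetic accelerogram [m/s^2]

pga = np.max(np.abs(acc))
energy_integral = np.sum(acc**2) / fs                      # "total energy" type index [m^2/s^3]
arias = np.pi / (2 * g) * energy_integral                  # Arias intensity [m/s]

print(f"PGA = {pga:.3f} m/s^2, energy integral = {energy_integral:.3f}, "
      f"Arias intensity = {arias:.3f} m/s")
```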

  1. Interrater reliability of videotaped observational gait-analysis assessments.

    Science.gov (United States)

    Eastlack, M E; Arvidson, J; Snyder-Mackler, L; Danoff, J V; McGarvey, C L

    1991-06-01

    The purpose of this study was to determine the interrater reliability of videotaped observational gait-analysis (VOGA) assessments. Fifty-four licensed physical therapists with varying amounts of clinical experience served as raters. Three patients with rheumatoid arthritis who demonstrated an abnormal gait pattern served as subjects for the videotape. The raters analyzed each patient's most severely involved knee during the four subphases of stance for the kinematic variables of knee flexion and genu valgum. Raters were asked to determine whether these variables were inadequate, normal, or excessive. The temporospatial variables analyzed throughout the entire gait cycle were cadence, step length, stride length, stance time, and step width. Generalized kappa coefficients ranged from .11 to .52. Intraclass correlation coefficients (2,1) and (3,1) were slightly higher. Our results indicate that physical therapists' VOGA assessments are only slightly to moderately reliable and that improved interrater reliability is needed when physical therapists utilize this technique. Our data suggest that there is a need for greater standardization of gait-analysis training.
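
    One common generalized multi-rater agreement statistic of the kind reported above is Fleiss' kappa. The sketch below computes it for a small hypothetical rating matrix (raters assigning items to categories such as inadequate/normal/excessive); the data are invented and the coefficient is not necessarily the exact one used in the study.

```python
# Fleiss' kappa for multiple raters assigning items to categories. The rating
# matrix is a small hypothetical example, not the study's data.
import numpy as np

# rows = rated items, columns = categories; entries = number of raters choosing
# that category for that item (here 6 raters, 3 categories, 5 items)
counts = np.array([
    [4, 1, 1],
    [0, 5, 1],
    [1, 1, 4],
    [3, 3, 0],
    [0, 6, 0],
])
n_items, n_raters = counts.shape[0], counts.sum(axis=1)[0]

p_j = counts.sum(axis=0) / (n_items * n_raters)                        # category proportions
P_i = (np.sum(counts**2, axis=1) - n_raters) / (n_raters * (n_raters - 1))
P_bar, P_e = P_i.mean(), np.sum(p_j**2)

kappa = (P_bar - P_e) / (1 - P_e)
print(f"Fleiss' kappa = {kappa:.3f}")
```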

  2. Modeling and Analysis of Component Faults and Reliability

    DEFF Research Database (Denmark)

    Le Guilly, Thibaut; Olsen, Petur; Ravn, Anders Peter

    2016-01-01

    This chapter presents a process to design and validate models of reactive systems in the form of communicating timed automata. The models are extended with faults associated with probabilities of occurrence. This enables a fault tree analysis of the system using minimal cut sets that are automatically generated. The stochastic information on the faults is used to estimate the reliability of the fault-affected system. The reliability is given with respect to properties of the system state space. We illustrate the process on a concrete example using the Uppaal model checker for validating the ideal system model and the fault modeling. Then the statistical version of the tool, UppaalSMC, is used to find reliability estimates.
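
    Once minimal cut sets and fault probabilities are available, a closed-form unreliability estimate can be computed for comparison with the statistical model checking results. The sketch below uses hypothetical cut sets and probabilities and assumes independent faults; the chapter itself obtains its estimates with UppaalSMC.

```python
# System unreliability from minimal cut sets, assuming independent faults.
# The cut sets and fault probabilities are hypothetical placeholders.
from itertools import combinations

fault_prob = {"sensor": 1e-3, "controller": 5e-4, "actuator": 2e-3, "power": 1e-4}
minimal_cut_sets = [{"sensor", "controller"}, {"actuator"}, {"power", "controller"}]

def cut_set_prob(cs):
    p = 1.0
    for comp in cs:
        p *= fault_prob[comp]
    return p

# rare-event (first-order) upper bound: sum of cut set probabilities
upper_bound = sum(cut_set_prob(cs) for cs in minimal_cut_sets)

# exact value by inclusion-exclusion over unions of cut sets
exact = 0.0
for k in range(1, len(minimal_cut_sets) + 1):
    for combo in combinations(minimal_cut_sets, k):
        exact += (-1) ** (k + 1) * cut_set_prob(set().union(*combo))

print(f"Upper bound on system unreliability: {upper_bound:.3e}")
print(f"Inclusion-exclusion value:           {exact:.3e}")
```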

  3. Damage tolerance reliability analysis of automotive spot-welded joints

    International Nuclear Information System (INIS)

    Mahadevan, Sankaran; Ni Kan

    2003-01-01

    This paper develops a damage tolerance reliability analysis methodology for automotive spot-welded joints under multi-axial and variable amplitude loading history. The total fatigue life of a spot weld is divided into two parts, crack initiation and crack propagation. The multi-axial loading history is obtained from transient response finite element analysis of a vehicle model. A three-dimensional finite element model of a simplified joint with four spot welds is developed for static stress/strain analysis. A probabilistic Miner's rule is combined with a randomized strain-life curve family and the stress/strain analysis result to develop a strain-based probabilistic fatigue crack initiation life prediction for spot welds. Afterwards, the fatigue crack inside the base material sheet is modeled as a surface crack. Then a probabilistic crack growth model is combined with the stress analysis result to develop a probabilistic fatigue crack growth life prediction for spot welds. Both methods are implemented with MSC/NASTRAN and MSC/FATIGUE software, and are useful for reliability assessment of automotive spot-welded joints against fatigue and fracture
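
    The crack-initiation part of such a methodology can be caricatured with a probabilistic Miner's rule: randomize the fatigue-curve coefficient and the critical damage sum, accumulate damage over a load spectrum, and count exceedances. The spectrum and material constants below are hypothetical, and the simple Basquin-type S-N curve stands in for the randomized strain-life curve family of the paper.

```python
# Minimal Monte Carlo sketch of a probabilistic Miner's rule. The load
# spectrum, material constants, and scatter parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
n_sim = 200_000

# hypothetical variable-amplitude spectrum: (stress amplitude [MPa], cycles per life block)
spectrum = [(180.0, 20_000), (120.0, 200_000), (80.0, 1_500_000)]

# randomized Basquin-type curve N = A * S**(-m), with lognormal scatter on A
A = rng.lognormal(mean=np.log(2.0e12), sigma=0.35, size=n_sim)
m = 3.0
D_crit = rng.lognormal(mean=np.log(1.0), sigma=0.15, size=n_sim)   # critical damage sum

damage = np.zeros(n_sim)
for S, n_cycles in spectrum:
    N_f = A * S ** (-m)              # cycles to failure at this amplitude
    damage += n_cycles / N_f         # Miner's linear damage accumulation

p_failure = np.mean(damage > D_crit)
print(f"Probability of fatigue failure in one life block: {p_failure:.3e}")
```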

  4. Reliability analysis of offshore structures using OMA based fatigue stresses

    DEFF Research Database (Denmark)

    Silva Nabuco, Bruna; Aissani, Amina; Glindtvad Tarpø, Marius

    2017-01-01

    The focus is on the uncertainty observed in the different stresses used to predict the damage. This uncertainty can be reduced by Modal Based Fatigue Monitoring, which is a technique based on continuously measuring the accelerations at a few points of the structure with the use of accelerometers. Knowing the accelerations at a few points of the structure, the stress history can be calculated at any arbitrary point of the structure. The accuracy of the estimated actual stress is analyzed by experimental tests on a scale model, where the obtained stresses are compared to strain gauge measurements. After evaluating the fatigue stresses directly from the operational response of the structure, a reliability analysis is performed in order to estimate the reliability of using Modal Based Fatigue Monitoring for long-term fatigue studies.

  5. Reliability analysis of neutron flux monitoring system for PFBR

    International Nuclear Information System (INIS)

    Rajesh, M.G.; Bhatnagar, P.V.; Das, D.; Pithawa, C.K.; Vinod, Gopika; Rao, V.V.S.S.

    2010-01-01

    The Neutron Flux Monitoring System (NFMS) measures reactor power, rate of change of power and reactivity changes in the core in all states of operation and shutdown. The system consists of instrument channels that are designed and built to have high reliability. All channels are required to have a Mean Time Between Failures (MTBF) of 150000 hours minimum. Failure Mode and Effects Analysis (FMEA) and failure rate estimation of NFMS channels have been carried out. FMEA is carried out in compliance with MIL-STD-338B. Reliability estimation of the channels is done according to MIL-HDBK-217FN2. The paper discusses the methodology followed for the FMEA and failure rate estimation of two safety channels, and the results. (author)
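
    Under the usual constant-failure-rate series assumption of MIL-HDBK-217-style predictions, a channel failure rate is the sum of its part failure rates and the MTBF is its reciprocal. The part list and failure rates below are hypothetical placeholders, shown only to illustrate how the 150000-hour requirement would be checked.

```python
# MIL-HDBK-217-style series calculation: channel failure rate = sum of part
# failure rates; MTBF = 1 / failure rate. Parts and rates are hypothetical.
part_failure_rates_fpmh = {          # failures per million hours
    "detector/preamplifier": 1.2,
    "pulse amplifier":       0.8,
    "discriminator":         0.6,
    "power supply":          1.5,
    "trip comparator":       0.4,
}

lambda_channel = sum(part_failure_rates_fpmh.values()) / 1e6   # failures per hour
mtbf_hours = 1.0 / lambda_channel
print(f"Channel failure rate: {lambda_channel:.2e} /h, MTBF = {mtbf_hours:,.0f} h")
print(f"Meets 150000 h requirement: {mtbf_hours >= 150_000}")
```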

  6. Photovoltaic module reliability improvement through application testing and failure analysis

    Science.gov (United States)

    Dumas, L. N.; Shumka, A.

    1982-01-01

    During the first four years of the U.S. Department of Energy (DOE) National Photovoltaic Program, the Jet Propulsion Laboratory Low-Cost Solar Array (LSA) Project purchased about 400 kW of photovoltaic modules for tests and experiments. In order to identify, report, and analyze test and operational problems with the Block Procurement modules, a problem/failure reporting and analysis system was implemented by the LSA Project with the main purpose of providing manufacturers with feedback from test and field experience needed for the improvement of product performance and reliability. A description of the more significant types of failures is presented, taking into account interconnects, cracked cells, dielectric breakdown, delamination, and corrosion. Current design practices and reliability evaluations are also discussed. The conducted evaluation indicates that current module designs incorporate damage-resistant and fault-tolerant features which address field failure mechanisms observed to date.

  7. Mechanical Properties for Reliability Analysis of Structures in Glassy Carbon

    CERN Document Server

    Garion, Cédric

    2014-01-01

    Despite its good physical properties, the glassy carbon material is not widely used, especially for structural applications. Nevertheless, its transparency to particles and temperature resistance are interesting properties for applications to vacuum chambers and components in high energy physics. For example, it has been proposed for a fast shutter valve in a particle accelerator [1] [2]. The mechanical properties have to be carefully determined to assess the reliability of structures in such a material. In this paper, mechanical tests have been carried out to determine the elastic parameters, the strength and toughness on commercial grades. A statistical approach, based on the Weibull distribution, is used to characterize the material both in tension and compression. The results are compared to the literature and the difference in properties between these two loading cases is shown. Based on a Finite Element analysis, a statistical approach is applied to define the reliability of a structural component in glassy carbon.
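
    The Weibull-based reliability statement used for brittle materials such as glassy carbon can be sketched directly: the probability of failure at an applied stress follows a two-parameter Weibull law. The modulus and characteristic strength below are hypothetical, not the values measured in the paper.

```python
# Two-parameter Weibull failure probability for a brittle material under a
# uniform applied stress. The modulus and characteristic strength are
# hypothetical placeholders, not the paper's measured values.
import numpy as np

m = 8.0            # Weibull modulus (scatter of strength)
sigma_0 = 90.0     # characteristic strength [MPa]

def failure_probability(sigma):
    """Probability of failure at applied stress sigma [MPa]."""
    return 1.0 - np.exp(-((np.asarray(sigma) / sigma_0) ** m))

for stress in (40.0, 60.0, 80.0):
    pf = failure_probability(stress)
    print(f"sigma = {stress:5.1f} MPa -> P_f = {pf:.4f}, reliability = {1 - pf:.4f}")
```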

  8. User's manual of a support system for human reliability analysis

    International Nuclear Information System (INIS)

    Yokobayashi, Masao; Tamura, Kazuo.

    1995-10-01

    Many kinds of human reliability analysis (HRA) methods have been developed. However, users are required to be skilled in order to use them, and complicated work such as drawing event trees (ET) and calculating uncertainty bounds is also required. Moreover, no single method is complete enough on its own to evaluate human reliability. Therefore, a personal computer (PC) based support system for HRA has been developed to execute HRA practically and efficiently. The system consists of two methods, namely, a simple method and a detailed one. The former uses ASEP, which is a simplified THERP technique, and a combined method of OAT and HRA-ET/DeBDA is used for the latter. Users can select a suitable method for their purpose. Human error probability (HEP) data were collected and a database of them was built for use in the support system. This paper describes the outline of the HRA methods, support functions and user's guide of the system. (author)

  9. Review of the treat upgrade reactor scram system reliability analysis

    International Nuclear Information System (INIS)

    Montague, D.F.; Fussell, J.B.; Krois, P.A.; Morelock, T.C.; Knee, H.E.; Manning, J.J.; Haas, P.M.; West, K.W.

    1984-10-01

    In order to resolve some key LMFBR safety issues, ANL personnel are modifying the TREAT reactor to handle much larger experiments. As a result of these modifications, the upgraded TREAT reactor will not always operate in a self-limited mode. During certain experiments in the upgraded TREAT reactor, it is possible that the fuel could be damaged by overheating if, once the computer systems fail, the reactor scram system (RSS) fails on demand. To help ensure that the upgraded TREAT reactor is shut down when required, ANL personnel have designed a triply redundant RSS for the facility. The RSS is designed to meet three reliability goals: (1) a loss of capability failure probability of 10⁻⁹/demand (independent failures only); (2) an inadvertent shutdown probability of 10⁻³/experiment; and (3) protection against any known potential common cause failures. According to ANL's reliability analysis of the RSS, this system substantially meets these goals

  10. Economic Growth and Transboundary Pollution in Europe. An Empirical Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ansuategi, A. [Ekonomi Analisiaren Oinarriak I Saila, Ekonomi Zientzien Fakultatea, Lehendakari Agirre Etorbidea, 83, 48015 Bilbao (Spain)

    2003-10-01

    The existing empirical evidence suggests that environmental Kuznets curves only exist for pollutants with semi-local and medium term impacts. Ansuategi and Perrings (2000) have considered the behavioral basis for the correlation observed between different spatial incidence of environmental degradation and the relation between economic growth and environmental quality. They show that self-interested planners following a Nash-type strategy tend to address environmental effects sequentially: addressing those with the most immediate costs first, and those whose costs are displaced in space later. This paper tests such behavioral basis in the context of sulphur dioxide emissions in Europe.

  11. Economic Growth and Transboundary Pollution in Europe. An Empirical Analysis

    International Nuclear Information System (INIS)

    Ansuategi, A.

    2003-01-01

    The existing empirical evidence suggests that environmental Kuznets curves only exist for pollutants with semi-local and medium term impacts. Ansuategi and Perrings (2000) have considered the behavioral basis for the correlation observed between different spatial incidence of environmental degradation and the relation between economic growth and environmental quality. They show that self-interested planners following a Nash-type strategy tend to address environmental effects sequentially: addressing those with the most immediate costs first, and those whose costs are displaced in space later. This paper tests such behavioral basis in the context of sulphur dioxide emissions in Europe

  12. Development of an empirical model of turbine efficiency using the Taylor expansion and regression analysis

    International Nuclear Information System (INIS)

    Fang, Xiande; Xu, Yu

    2011-01-01

    The empirical model of turbine efficiency is necessary for the control- and/or diagnosis-oriented simulation and useful for the simulation and analysis of dynamic performances of the turbine equipment and systems, such as air cycle refrigeration systems, power plants, turbine engines, and turbochargers. Existing empirical models of turbine efficiency are insufficient because there is no suitable form available for air cycle refrigeration turbines. This work performs a critical review of empirical models (called mean value models in some literature) of turbine efficiency and develops an empirical model in the desired form for air cycle refrigeration, the dominant cooling approach in aircraft environmental control systems. The Taylor series and regression analysis are used to build the model, with the Taylor series being used to expand functions with the polytropic exponent and the regression analysis to finalize the model. The measured data of a turbocharger turbine and two air cycle refrigeration turbines are used for the regression analysis. The proposed model is compact and able to present the turbine efficiency map. Its predictions agree with the measured data very well, with the corrected coefficient of determination R_c² ≥ 0.96 and the mean absolute percentage deviation = 1.19% for the three turbines. -- Highlights: → Performed a critical review of empirical models of turbine efficiency. → Developed an empirical model in the desired form for air cycle refrigeration, using the Taylor expansion and regression analysis. → Verified the method for developing the empirical model. → Verified the model.
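
    The final step of such a model is an ordinary least-squares fit of a polynomial form to measured efficiency data. The sketch below fits a generic quadratic in pressure ratio and corrected speed to synthetic data; the actual regressors of the paper follow from its Taylor expansion in the polytropic exponent and are not reproduced here.

```python
# Least-squares fit of a generic quadratic efficiency map to synthetic data.
# The regressors and the synthetic "measurements" are illustrative assumptions,
# not the paper's model form or data.
import numpy as np

rng = np.random.default_rng(3)
n = 200
pr = rng.uniform(1.5, 4.0, n)            # expansion pressure ratio
ns = rng.uniform(0.6, 1.1, n)            # corrected (normalized) speed

# synthetic "measured" efficiency with noise
eta = 0.82 - 0.05 * (pr - 2.5) ** 2 - 0.20 * (ns - 0.9) ** 2 + 0.01 * rng.standard_normal(n)

# design matrix for a full quadratic model in (pr, ns)
X = np.column_stack([np.ones(n), pr, ns, pr**2, ns**2, pr * ns])
coef, *_ = np.linalg.lstsq(X, eta, rcond=None)

eta_hat = X @ coef
ss_res = np.sum((eta - eta_hat) ** 2)
ss_tot = np.sum((eta - eta.mean()) ** 2)
print("coefficients:", np.round(coef, 4))
print(f"R^2 = {1 - ss_res / ss_tot:.3f}")
```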

  13. EMPIRICAL ANALYSIS OF THE ROLE OF THE FIRMS’ VALUE DRIVERS

    Directory of Open Access Journals (Sweden)

    Anita KISS

    2015-12-01

    Full Text Available This paper focuses on value creation and the value drivers. One of the objectives of this paper is to present the concept of maximizing shareholder value. The main goal is to categorize the most important value drivers and their role in the firms' value. This study proceeds as follows. The first section presents the value chain, the primary and the support activities. The second part describes the theoretical background of maximizing shareholder value. The third part illustrates the key value drivers. The fourth, empirical section of the study analyses a database featuring data from 18 European countries, 10 sectors and 1553 firms in the period between 2004 and 2011. Finally, the fifth section includes concluding remarks. Based on the literature reviewed and on the empirical research conducted, it can be concluded that EBIT, reinvestment, invested capital, the return on invested capital, the net margin and the sales growth rate all have a positive effect on firm value, while the tax rate and the market value of return on assets (MROA) have a negative one.

  14. Business ethics and economic growth: An empirical analysis for Turkish economy

    Directory of Open Access Journals (Sweden)

    Ekrem Erdem

    2015-12-01

    Full Text Available Purpose – The roots of the science of modern economics originate from the ideas of Adam Smith, who is not a pure economist but a moralist-philosopher. Basic concepts in the Wealth of Nations, which is perceived as the handbook of economics, depend on the arguments that Adam Smith suggests in his Theory of Moral Sentiments. In this theory, business ethics, as a part of the Law of Sympathy, appears as one of the factors that allow the invisible hand to operate properly. In light of this property, it is possible to regard business ethics as one of the components of the market mechanism. In this context, this study aims to analyse the link between business ethics and economic growth in the Turkish economy. Design/methodology/approach – The study employs bounced cheques and protested bonds to represent the degradation of business ethics and tries to show how this degradation affects economic growth in the Turkish economy for the period 1988-2013. Findings – Both illustrative and empirical results show that business ethics is an important determinant of economic growth in the Turkish economy and that damaging it negatively affects the growth rate of the economy. Research limitations/implications – One of the main limitations of the present empirical analysis is the lack of more varied and longer data sets. Using different indicators of business ethics with a longer time span would definitely increase the reliability of the study. However, in the current form, the results imply that a policy capable of limiting failures of business ethics may boost the Turkish economy. Originality/value – The results tend to support the close link between business ethics and economic growth.

  15. Empirical Analysis for the Heat Exchange Effectiveness of a Thermoelectric Liquid Cooling and Heating Unit

    Directory of Open Access Journals (Sweden)

    Hansol Lim

    2018-03-01

    Full Text Available This study aims to estimate the performance of a thermoelectric module (TEM) heat pump for simultaneous liquid cooling and heating and to propose empirical models for predicting the heat exchange effectiveness. Experiments were conducted to investigate and collect the performance data of the TEM heat pump, where the working fluid was water. A total of 57 sets of experimental data were statistically analyzed to estimate the effects of each independent variable on the heat exchange effectiveness using analysis of variance (ANOVA). To develop the empirical model, six design parameters were measured: the number of transfer units (NTU) of the heat exchangers (i.e., water blocks), and the inlet water temperatures and temperatures of the water blocks at the cold and hot sides of the TEM. As a result, two polynomial equations predicting heat exchange effectiveness at the cold and hot sides of the TEM heat pump were derived as a function of the six selected design parameters. Also, the proposed models and the theoretical model of a conventional condenser and evaporator for heat exchange effectiveness were compared with additional measurement data to validate the reliability of the proposed models. Consequently, two conclusions have been made: (1) the possibility of using the TEM heat pump for simultaneous cooling and heating was examined, with a maximum temperature difference of 30 °C between the cold and hot sides of the TEM, and (2) the comparison between the proposed models and the theoretical model reveals that the TEM heat pump differs from the conventional evaporator and condenser due to heat conduction and the Joule effect in the TEM.

  16. Reliability Analysis of the CERN Radiation Monitoring Electronic System CROME

    CERN Document Server

    AUTHOR|(CDS)2126870

    For the new in-house developed CERN Radiation Monitoring Electronic System (CROME) a reliability analysis is necessary to ensure compliance with the statutory requirements regarding the Safety Integrity Level. The Safety Integrity Level required by the IEC 60532 standard is SIL 2 (for the Safety Integrated Functions Measurement, Alarm Triggering and Interlock Triggering). The first step of the reliability analysis was a system and functional analysis which served as the basis for the implementation of the CROME system in the software “Isograph”. In the “Prediction” module of Isograph the failure rates of all components were calculated. Failure rates for passive components were calculated according to Military Standard 217, and failure rates for active components were obtained from lifetime tests by the manufacturers. The FMEA was carried out together with the board designers and implemented in the “FMECA” module of Isograph. The FMEA served as the basis for the Fault Tree Analysis and the detection of weak points...

  17. Decision theory, the context for risk and reliability analysis

    International Nuclear Information System (INIS)

    Kaplan, S.

    1985-01-01

    According to this model of the decision process, then, the optimum decision is the option having the largest expected utility. This is the fundamental model of a decision situation. It is necessary to remark that in order for the model to represent a real-life decision situation, it must include all the options present in that situation, including, for example, the option of not deciding, which is itself a decision, although usually not the optimum one. Similarly, it should include the option of delaying the decision while further information is gathered. Both of these options have probabilities, outcomes, impacts, and utilities like any other option and should be included explicitly in the decision diagram. The reason for doing a quantitative risk or reliability analysis is always that, somewhere underlying it, there is a decision to be made. The decision analysis therefore always forms the context for the risk or reliability analysis, and this context shapes the form and language of that analysis. The authors therefore give in this section a brief review of the well-known decision theory diagram.
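
    As a simple illustration of the expected-utility rule described in this abstract, the sketch below scores a few options, including "not deciding" and "delaying for more information", and picks the one with the largest expected utility. The options, probabilities and utilities are invented for illustration only.

```python
# Minimal sketch of the expected-utility decision rule. All numbers are
# invented; each option is a list of (probability, utility) outcome pairs.
options = {
    "act now":                       [(0.7, 10.0), (0.3, -20.0)],
    "do not decide":                 [(1.0, -2.0)],   # still a decision
    "delay and gather information":  [(0.5, 8.0), (0.5, -5.0)],
}

def expected_utility(outcomes):
    return sum(p * u for p, u in outcomes)

best = max(options, key=lambda name: expected_utility(options[name]))
for name, outcomes in options.items():
    print(f"{name}: EU = {expected_utility(outcomes):.2f}")
print("optimum decision:", best)
```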

  18. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  19. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  20. Limitations in simulator time-based human reliability analysis methods

    International Nuclear Information System (INIS)

    Wreathall, J.

    1989-01-01

    Developments in human reliability analysis (HRA) methods have evolved slowly. Current methods are little changed from those of almost a decade ago, particularly in the use of time-reliability relationships. While these methods were suitable as an interim step, the time (and the need) has come to specify the next evolution of HRA methods. As with any performance-oriented data source, power plant simulator data have no direct connection to HRA models. Errors reported in data are normal deficiencies observed in human performance; failures are events modeled in probabilistic risk assessments (PRAs). Not all errors cause failures; not all failures are caused by errors. Second, the times at which actions are taken provide no measure of the likelihood of failures to act correctly within an accident scenario. Inferences can be made about human reliability, but they must be made with great care. Specific limitations are discussed. Simulator performance data are useful in providing qualitative evidence of the variety of error types and their potential influences on operating systems. More work is required to combine recent developments in the psychology of error with the qualitative data collected at simulators. Until data become openly available, however, such an advance will not be practical.

  1. The DYLAM approach for the dynamic reliability analysis of systems

    International Nuclear Information System (INIS)

    Cojazzi, Giacomo

    1996-01-01

    In many real systems, failures occurring to the components, control failures and human interventions often interact with the physical system evolution in such a way that a simple reliability analysis, de-coupled from process dynamics, is very difficult or even impossible. In the last ten years many dynamic reliability approaches have been proposed to properly assess the reliability of these systems characterized by dynamic interactions. The DYLAM methodology, now implemented in its latest version, DYLAM-3, offers a powerful tool for integrating deterministic and failure events. This paper describes the main features of the DYLAM-3 code with reference to the classic fault-tree and event-tree techniques. Some aspects connected to the practical problems underlying dynamic event-trees are also discussed. A simple system, already analyzed with other dynamic methods, is used as a reference for the numerical applications. The same system is also studied with a time-dependent fault-tree approach in order to show some features of dynamic methods vs. classical techniques. Examples including stochastic failures, without and with repair, failures on demand and time-dependent failure rates give an extensive overview of DYLAM-3 capabilities.
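
    The abstract mentions examples with stochastic failures without and with repair and with time-dependent failure rates. The short sketch below is not the DYLAM-3 code; it only shows the elementary constant-failure-rate formulas that typically serve as the classical reference in such comparisons, with placeholder numbers.

```python
# Minimal sketch: component unreliability with a constant failure rate,
# with and without repair, as a classical reference case. Values invented.
import math

lam = 1.0e-3   # failure rate [1/h]
mu = 1.0e-1    # repair rate  [1/h]
t = 1000.0     # mission time [h]

# Unrepaired component: probability of having failed by time t.
q_no_repair = 1.0 - math.exp(-lam * t)

# Repaired component: time-dependent unavailability of a two-state
# Markov model, tending to the steady-state value lam / (lam + mu).
q_repair = (lam / (lam + mu)) * (1.0 - math.exp(-(lam + mu) * t))

print(f"unreliability without repair : {q_no_repair:.4e}")
print(f"unavailability with repair   : {q_repair:.4e}")
```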

  2. Reliability analysis of self-actuated shutdown system

    International Nuclear Information System (INIS)

    Itooka, S.; Kumasaka, K.; Okabe, A.; Satoh, K.; Tsukui, Y.

    1991-01-01

    An analytical study was performed for the reliability of a self-actuated shutdown system (SASS) under the unprotected loss of flow (ULOF) event in a typical loop-type liquid metal fast breeder reactor (LMFBR) by the use of the response surface Monte Carlo analysis method. Dominant parameters for the SASS, such as Curie point characteristics, subassembly outlet coolant temperature, electromagnetic surface condition, etc., were selected and their probability density functions (PDFs) were determined by the design study information and experimental data. To get the response surface function (RSF) for the maximum coolant temperature, transient analyses of ULOF were performed by utilizing the experimental design method in the determination of analytical cases. Then, the RSF was derived by the multi-variable regression analysis. The unreliability of the SASS was evaluated as a probability that the maximum coolant temperature exceeded an acceptable level, employing the Monte Carlo calculation using the above PDFs and RSF. In this study, sensitivities to the dominant parameters were compared. The dispersion of subassembly outlet coolant temperature near the SASS was found to be one of the most sensitive parameters. Fault tree analysis was performed using this value for the SASS in order to evaluate the shutdown system reliability. As a result of this study, the effectiveness of the SASS on the reliability improvement in the LMFBR shutdown system was analytically confirmed. This study has been performed as a part of joint research and development projects for DFBR under the sponsorship of the nine Japanese electric power companies, Electric Power Development Company and the Japan Atomic Power Company. (author)
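
    The response-surface Monte Carlo step described above can be illustrated with a short sketch: dominant parameters are sampled from assumed distributions, a fitted response surface returns the maximum coolant temperature, and the unreliability is the fraction of samples exceeding an acceptable level. The response-surface coefficients, distributions and temperature limit below are invented, not those of the study.

```python
# Minimal sketch of a response-surface Monte Carlo unreliability estimate.
# Coefficients, parameter distributions and the limit are placeholders.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical dominant parameters (normalized deviations from nominal).
curie_point      = rng.normal(0.0, 1.0, n)   # Curie point characteristics
outlet_temp_disp = rng.normal(0.0, 1.0, n)   # outlet temperature dispersion
surface_cond     = rng.normal(0.0, 1.0, n)   # electromagnetic surface condition

def response_surface(x1, x2, x3):
    """Hypothetical RSF for the maximum coolant temperature [deg C]."""
    return 820.0 + 12.0 * x1 + 25.0 * x2 + 6.0 * x3 + 3.0 * x2 ** 2

t_max = response_surface(curie_point, outlet_temp_disp, surface_cond)
limit = 900.0  # acceptable maximum coolant temperature [deg C]

unreliability = np.mean(t_max > limit)
print(f"estimated unreliability: {unreliability:.2e}")
```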

  3. Failure and Reliability Analysis for the Master Pump Shutdown System

    International Nuclear Information System (INIS)

    BEVINS, R.R.

    2000-01-01

    The Master Pump Shutdown System (MPSS) will be installed in the 200 Areas of the Hanford Site to monitor and control the transfer of liquid waste between tank farms and between the 200 West and 200 East areas through the Cross-Site Transfer Line. The Safety Function provided by the MPSS is to shut down any waste transfer process within or between tank farms if a waste leak should occur along the selected transfer route. The MPSS, which provides this Safety Class Function, is composed of Programmable Logic Controllers (PLCs), interconnecting wires, relays, Human to Machine Interfaces (HMI), and software. These components are defined as providing a Safety Class Function and will be designated in this report as MPSS/PLC. Input signals to the MPSS/PLC are provided by leak detection systems from each of the tank farm leak detector locations along the waste transfer route. The combination of the MPSS/PLC, leak detection system, and transfer pump controller system will be referred to as MPSS/SYS. The components addressed in this analysis are associated with the MPSS/SYS. The purpose of this failure and reliability analysis is to address the following design issues of the Project Development Specification (PDS) for the MPSS/SYS (HNF 2000a): (1) Single Component Failure Criterion, (2) System Status Upon Loss of Electrical Power, (3) Physical Separation of Safety Class cables, (4) Physical Isolation of Safety Class Wiring from General Service Wiring, and (5) Meeting the MPSS/PLC Option 1b (RPP 1999) Reliability estimate. The failure and reliability analysis examined the system on a component level basis and identified any hardware or software elements that could fail and/or prevent the system from performing its intended safety function.

  4. Fifty Years of THERP and Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    In 1962 at a Human Factors Society symposium, Alan Swain presented a paper introducing a Technique for Human Error Rate Prediction (THERP). This was followed in 1963 by a Sandia Laboratories monograph outlining basic human error quantification using THERP and, in 1964, by a special journal edition of Human Factors on quantification of human performance. Throughout the 1960s, Swain and his colleagues focused on collecting human performance data for the Sandia Human Error Rate Bank (SHERB), primarily in connection with supporting the reliability of nuclear weapons assembly in the US. In 1969, Swain met with Jens Rasmussen of Risø National Laboratory and discussed the applicability of THERP to nuclear power applications. By 1975, in WASH-1400, Swain had articulated the use of THERP for nuclear power applications, and the approach was finalized in the watershed publication of the NUREG/CR-1278 in 1983. THERP is now 50 years old, and remains the most well known and most widely used HRA method. In this paper, the author discusses the history of THERP, based on published reports and personal communication and interviews with Swain. The author also outlines the significance of THERP. The foundations of human reliability analysis are found in THERP: human failure events, task analysis, performance shaping factors, human error probabilities, dependence, event trees, recovery, and pre- and post-initiating events were all introduced in THERP. While THERP is not without its detractors, and it is showing signs of its age in the face of newer technological applications, the longevity of THERP is a testament to its tremendous significance. THERP started the field of human reliability analysis. This paper concludes with a discussion of THERP in the context of newer methods, which can be seen as extensions of or departures from Swain’s pioneering work.

  5. Reliability and Robustness Analysis of the Masinga Dam under Uncertainty

    Directory of Open Access Journals (Sweden)

    Hayden Postle-Floyd

    2017-02-01

    Full Text Available Kenya’s water abstraction must meet the projected growth in municipal and irrigation demand by the end of 2030 in order to achieve the country’s industrial and economic development plan. The Masinga dam, on the Tana River, is the key to meeting this goal to satisfy the growing demands whilst also continuing to provide hydroelectric power generation. This study quantitatively assesses the reliability and robustness of the Masinga dam system under uncertain future supply and demand using probabilistic climate and population projections, and examines how long-term planning may improve the longevity of the dam. River flow and demand projections are used alongside each other as inputs to the dam system simulation model linked to an optimisation engine to maximise water availability. Water availability after demand satisfaction is assessed for future years, and the projected reliability of the system is calculated for selected years. The analysis shows that maximising power generation on a short-term year-by-year basis achieves 80%, 50% and 1% reliability by 2020, 2025 and 2030 onwards, respectively. Longer term optimal planning, however, has increased system reliability to up to 95% in 2020, 80% in 2025, and more than 40% in 2030 onwards. In addition, increasing the capacity of the reservoir by around 25% can significantly improve the robustness of the system for all future time periods. This study provides a platform for analysing the implication of different planning and management of Masinga dam and suggests that careful consideration should be given to account for growing municipal needs and irrigation schemes in both the immediate and the associated Tana River basin.
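
    The reliability figures quoted above are time-based: the fraction of periods in which demand is fully satisfied. The sketch below is not the study's optimisation model; it only illustrates that reliability definition with a simple monthly reservoir mass balance and invented inflow, demand and capacity numbers.

```python
# Minimal sketch of a time-based reliability calculation for one reservoir.
# Inflows, demand and capacity are invented for illustration only.
import numpy as np

rng = np.random.default_rng(1)
months = 12 * 20
inflow = rng.gamma(shape=2.0, scale=60.0, size=months)   # Mm^3/month
demand = 100.0                                           # Mm^3/month
capacity = 1_560.0                                       # Mm^3 (illustrative)

storage = capacity / 2.0
satisfied = 0
for q_in in inflow:
    storage = min(storage + q_in, capacity)   # fill, spill any excess
    release = min(storage, demand)            # release up to the demand
    storage -= release
    satisfied += release >= demand

print(f"time-based reliability: {satisfied / months:.1%}")
```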

  6. Empirical modeling and data analysis for engineers and applied scientists

    CERN Document Server

    Pardo, Scott A

    2016-01-01

    This textbook teaches advanced undergraduate and first-year graduate students in Engineering and Applied Sciences to gather and analyze empirical observations (data) in order to aid in making design decisions. While science is about discovery, the primary paradigm of engineering and "applied science" is design. Scientists are in the discovery business and want, in general, to understand the natural world rather than to alter it. In contrast, engineers and applied scientists design products, processes, and solutions to problems. That said, statistics, as a discipline, is mostly oriented toward the discovery paradigm. Young engineers come out of their degree programs having taken courses such as "Statistics for Engineers and Scientists" without any clear idea as to how they can use statistical methods to help them design products or processes. Many seem to think that statistics is only useful for demonstrating that a device or process actually does what it was designed to do. Statistics courses emphasize creati...

  7. An empirical analysis of the corporate call decision

    International Nuclear Information System (INIS)

    Carlson, M.D.

    1998-01-01

    An economic study of the behaviour of financial managers of utility companies was presented. The study examined whether or not an option pricing based model of the call decision does a better job of explaining callable preferred share prices and call decisions compared to other models. In this study, the Rust (1987) empirical technique was extended to include the use of information from preferred share prices in addition to the call decisions. Reasonable estimates were obtained from data of shares of the Pacific Gas and Electric Company (PGE) for the transaction costs associated with a call. It was concluded that the managers of the PGE clearly take into account the value of the option to delay the call when making their call decisions

  8. Empirical Analysis of Using Erasure Coding in Outsourcing Data Storage With Provable Security

    Science.gov (United States)

    2016-06-01

    As computing and communication technologies become powerful and advanced, people are exchanging a huge amount of data, and they are demanding more storage... (Naval Postgraduate School thesis, Monterey, California, 2016: "Empirical Analysis of Using Erasure Coding in Outsourcing Data Storage with Provable Security")

  9. Reliability analysis of structures under periodic proof tests in service

    Science.gov (United States)

    Yang, J.-N.

    1976-01-01

    A reliability analysis of structures subjected to random service loads and periodic proof tests treats gust loads and maneuver loads as random processes. Crack initiation, crack propagation, and strength degradation are treated as the fatigue process. The time to fatigue crack initiation and ultimate strength are random variables. Residual strength decreases during crack propagation, so that failure rate increases with time. When a structure fails under periodic proof testing, a new structure is built and proof-tested. The probability of structural failure in service is derived from treatment of all the random variables, strength degradations, service loads, proof tests, and the renewal of failed structures. Some numerical examples are worked out.
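
    The ingredients listed above (random time to crack initiation, degrading residual strength, random service loads, periodic proof tests with renewal) can be illustrated by a small Monte Carlo sketch. This is not the authors' analytical treatment; all distributions and numbers are invented for illustration.

```python
# Minimal Monte Carlo sketch: structures with random crack initiation time,
# linearly degrading residual strength, yearly random peak loads, and a
# proof test every few years that replaces structures failing it.
import numpy as np

rng = np.random.default_rng(7)
n_struct, service_years, proof_interval = 50_000, 20, 5
proof_load = 1.2          # proof-test load (x design load)
failures = 0

for _ in range(n_struct):
    t_init = rng.weibull(2.0) * 15.0          # time to crack initiation [yr]
    strength = rng.normal(1.8, 0.15)          # initial ultimate strength
    age = 0.0
    for year in range(1, service_years + 1):
        age += 1.0
        degraded = strength - max(0.0, age - t_init) * 0.05
        if rng.gumbel(0.8, 0.1) > degraded:   # yearly peak service load
            failures += 1                     # in-service failure
            break
        if year % proof_interval == 0 and degraded < proof_load:
            # failed the proof test: replace with a new structure
            t_init = rng.weibull(2.0) * 15.0
            strength = rng.normal(1.8, 0.15)
            age = 0.0

print(f"estimated in-service failure probability: {failures / n_struct:.3e}")
```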

  10. Creation and Reliability Analysis of Vehicle Dynamic Weighing Model

    Directory of Open Access Journals (Sweden)

    Zhi-Ling XU

    2014-08-01

    Full Text Available In this paper, the portable axle-load meter of a dynamic weighing system is modeled using ADAMS, and the weighing process is simulated while controlling a single variable, yielding simulation weighing data for different speeds and weights. A portable weighing system with the same parameters is used in parallel to obtain actual measurements. Comparative analysis of the simulation results under the same conditions shows that, at 30 km/h or less, the simulated and measured values do not differ by more than 5%. This not only verifies the reliability of the dynamic weighing model, but also makes it possible to improve the efficiency of algorithm studies by using simulations based on the dynamic weighing model.

  11. Human Performance Modeling for Dynamic Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory; Mandelli, Diego [Idaho National Laboratory

    2015-08-01

    Part of the U.S. Department of Energy’s (DOE’s) Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk framework. In this paper, we review simulation-based and non-simulation-based human reliability analysis (HRA) methods. This paper summarizes the foundational information needed to develop a feasible approach to modeling human interactions in RISMC simulations.

  12. IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    G. W. Parry; J.A Forester; V.N. Dang; S. M. L. Hendrickson; M. Presley; E. Lois; J. Xing

    2013-09-01

    This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System), that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA) that is based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), the development of the associated time-line to identify the critical tasks, i.e., those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.

  13. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

    Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree” which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to its influencing factors.

  14. Tailoring a Human Reliability Analysis to Your Industry Needs

    Science.gov (United States)

    DeMott, D. L.

    2016-01-01

    Accidents caused by human error that result in catastrophic consequences include: airline industry mishaps, medical malpractice, medication mistakes, aerospace failures, major oil spills, transportation mishaps, power production failures and manufacturing facility incidents. Human Reliability Assessment (HRA) is used to analyze the inherent risk of human behavior or actions introducing errors into the operation of a system or process. These assessments can be used to identify where errors are most likely to arise and the potential risks involved if they do occur. Using the basic concepts of HRA, an evolving group of methodologies is used to meet various industry needs. Determining which methodology or combination of techniques will provide a quality human reliability assessment is a key element in developing effective strategies for understanding and dealing with risks caused by human errors. There are a number of concerns and difficulties in "tailoring" a Human Reliability Assessment (HRA) for different industries. Although a variety of HRA methodologies are available to analyze human error events, determining the most appropriate tools to provide the most useful results can depend on industry-specific cultures and requirements. Methodology selection may be based on a variety of factors that include: 1) how people act and react in different industries, 2) expectations based on industry standards, 3) factors that influence how human errors could occur, such as tasks, tools, environment, workplace, support, training and procedures, 4) the type and availability of data, 5) how the industry views risk and reliability, and 6) the types of emergencies, contingencies and routine tasks. Other considerations for methodology selection should be based on what information is needed from the assessment. If the principal concern is determination of the primary risk factors contributing to the potential human error, a more detailed analysis method may be employed.

  15. Population, internal migration, and economic growth: an empirical analysis.

    Science.gov (United States)

    Moreland, R S

    1982-01-01

    The role of population growth in the development process has received increasing attention during the last 15 years, as manifested in the literature in 3 broad categories. In the 1st category, the effects of rapid population growth on the growth of income have been studied with the use of simulation models, which sometimes include endogenous population growth. The 2nd category of the literature is concerned with theoretical and empirical studies of the economic determinants of various demographic rates--most usually fertility. Internal migration and dualism is the 3rd population development category to receive attention. An attempt is made to synthesize developments in these 3 categories by estimating from a consistent set of data a 2 sector economic demographic model in which the major demographic rates are endogenous. Due to the fact that the interactions between economic and demographic variables are nonlinear and complex, the indirect effects of changes in a particular variable may depend upon the balance of numerical coefficients. For this reason it was felt that the model should be empirically grounded. A brief overview of the model is provided, and the model is compared to some similar existing models. Estimation of the model's 9 behavior equations is discussed, followed by a "base run" simulation of a developing country "stereotype" and a report of a number of policy experiments. The relatively new field of economic determinants of demographic variables was drawn upon in estimating equations to endogenize demographic phenomena that are frequently left exogenous in simulation models. The fertility and labor force participation rate functions are fairly standard, but a step beyond existing literature was taken in the life expectancy and intersectoral migration equations. On the economic side, sectoral savings functions were estimated, and it was found that the marginal propensity to save is lower in agriculture than in nonagriculture. Testing to see the

  16. Inclusion of fatigue effects in human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Griffith, Candice D. [Vanderbilt University, Nashville, TN (United States); Mahadevan, Sankaran, E-mail: sankaran.mahadevan@vanderbilt.edu [Vanderbilt University, Nashville, TN (United States)

    2011-11-15

    The effect of fatigue on human performance has been observed to be an important factor in many industrial accidents. However, defining and measuring fatigue is not easily accomplished. This creates difficulties in including fatigue effects in probabilistic risk assessments (PRA) of complex engineering systems that seek to include human reliability analysis (HRA). Thus the objectives of this paper are to discuss (1) the importance of the effects of fatigue on performance, (2) the difficulties associated with defining and measuring fatigue, (3) the current status of inclusion of fatigue in HRA methods, and (4) the future directions and challenges for the inclusion of fatigue, specifically sleep deprivation, in HRA. - Highlights: → We highlight the need for fatigue and sleep deprivation effects on performance to be included in human reliability analysis (HRA) methods. Current methods do not explicitly include sleep deprivation effects. → We discuss the difficulties in defining and measuring fatigue. → We review sleep deprivation research, and discuss the limitations and future needs of the current HRA methods.

  17. A methodology for strain-based fatigue reliability analysis

    International Nuclear Information System (INIS)

    Zhao, Y.X.

    2000-01-01

    A significant scatter of the cyclic stress-strain (CSS) responses should be noted for a nuclear reactor material, 1Cr18Ni9Ti pipe-weld metal. The existence of this scatter implies that a random applied cyclic strain history will be introduced under any loading mode, even for a deterministic loading history. A non-conservative evaluation might be given in practice without considering the scatter. A methodology for strain-based fatigue reliability analysis, which takes the scatter into account, is developed. The responses are approximately modeled by probability-based CSS curves of the Ramberg-Osgood relation. The strain-life data are modeled, similarly, by probability-based strain-life curves of the Coffin-Manson law. The reliability assessment is constructed by considering the interference between the random applied fatigue strain history and the strain capacity history. Probability density functions of the applied and capacity histories are given analytically. The methodology can be conveniently applied to the case of a deterministic CSS relation, as existing methods do. The non-conservatism of evaluations based on the deterministic CSS relation and the applicability of the present methodology are indicated by an analysis of the material test results.
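
    The interference idea described above can be sketched with a small Monte Carlo calculation: at a given number of cycles, both the applied strain amplitude and the strain capacity are random, and the failure probability is the probability that the applied strain exceeds the capacity. The sketch below is not the authors' probability-based curves; the Coffin-Manson coefficients and scatter values are invented.

```python
# Minimal sketch of strain-based interference: P(applied strain > capacity)
# at N cycles, with a Coffin-Manson type capacity. All values are invented.
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Random applied strain amplitude, reflecting scatter of the cyclic
# stress-strain response under a nominally deterministic load history.
applied = rng.lognormal(mean=np.log(3.0e-3), sigma=0.10, size=n)

# Random strain capacity at N cycles from a Coffin-Manson type relation
# eps_a = (sigma_f'/E) * (2N)^b + eps_f' * (2N)^c with scattered coefficients.
N = 1.0e4
sigma_f_over_E = rng.normal(6.0e-3, 5.0e-4, n)
eps_f = rng.lognormal(np.log(0.4), 0.15, n)
b, c = -0.09, -0.55
capacity = sigma_f_over_E * (2 * N) ** b + eps_f * (2 * N) ** c

p_failure = np.mean(applied > capacity)
print(f"failure probability at N = {N:.0e} cycles: {p_failure:.2e}")
```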

  18. Inclusion of fatigue effects in human reliability analysis

    International Nuclear Information System (INIS)

    Griffith, Candice D.; Mahadevan, Sankaran

    2011-01-01

    The effect of fatigue on human performance has been observed to be an important factor in many industrial accidents. However, defining and measuring fatigue is not easily accomplished. This creates difficulties in including fatigue effects in probabilistic risk assessments (PRA) of complex engineering systems that seek to include human reliability analysis (HRA). Thus the objectives of this paper are to discuss (1) the importance of the effects of fatigue on performance, (2) the difficulties associated with defining and measuring fatigue, (3) the current status of inclusion of fatigue in HRA methods, and (4) the future directions and challenges for the inclusion of fatigue, specifically sleep deprivation, in HRA. - Highlights: →We highlight the need for fatigue and sleep deprivation effects on performance to be included in human reliability analysis (HRA) methods. Current methods do not explicitly include sleep deprivation effects. → We discuss the difficulties in defining and measuring fatigue. → We review sleep deprivation research, and discuss the limitations and future needs of the current HRA methods.

  19. A Temporal Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Hilger, Klaus Baggesen; Andersen, Ole Baltazar

    2002-01-01

    This paper describes a temporal extension to traditional empirical orthogonal function (EOF) analysis, which provides a non-temporal analysis of one variable over time. The temporal extension proves its strength in separating the signals at different periods in an analysis of relevant oceanographic properties related to one of the largest El Niño events ever recorded.

  20. Data envelopment analysis in service quality evaluation: an empirical study

    Science.gov (United States)

    Najafi, Seyedvahid; Saati, Saber; Tavana, Madjid

    2015-09-01

    Service quality is often conceptualized as the comparison between service expectations and the actual performance perceptions. It enhances customer satisfaction, decreases customer defection, and promotes customer loyalty. Substantial literature has examined the concept of service quality, its dimensions, and measurement methods. We introduce the perceived service quality index (PSQI) as a single measure for evaluating the multiple-item service quality construct based on the SERVQUAL model. A slack-based measure (SBM) of efficiency with constant inputs is used to calculate the PSQI. In addition, a non-linear programming model based on the SBM is proposed to delineate an improvement guideline and improve service quality. An empirical study is conducted to assess the applicability of the method proposed in this study. A large number of studies have used DEA as a benchmarking tool to measure service quality. These models do not propose a coherent performance evaluation construct and consequently fail to deliver improvement guidelines for improving service quality. The DEA models proposed in this study are designed to evaluate and improve service quality within a comprehensive framework and without any dependency on external data.

  1. Regime switching model for financial data: Empirical risk analysis

    Science.gov (United States)

    Salhi, Khaled; Deaconu, Madalina; Lejay, Antoine; Champagnat, Nicolas; Navet, Nicolas

    2016-11-01

    This paper constructs a regime switching model for the univariate Value-at-Risk estimation. Extreme value theory (EVT) and hidden Markov models (HMM) are combined to estimate a hybrid model that takes volatility clustering into account. In the first stage, HMM is used to classify data into crisis and steady periods, while in the second stage, EVT is applied to the previously classified data to remove the delay between regime switches and their detection. This new model is applied to prices of numerous stocks exchanged on NYSE Euronext Paris over the period 2001-2011. We focus on daily returns for which calibration has to be done on a small dataset. The relative performance of the regime switching model is benchmarked against other well-known modeling techniques, such as stable, power laws and GARCH models. The empirical results show that the regime switching model increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. This suggests that the regime switching model is a robust forecasting variant of the power laws model while remaining practical for implementing the VaR measurement.
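
    The sketch below illustrates only the EVT ingredient of such a model, not the full HMM regime-switching construction: a peaks-over-threshold fit of the loss tail with a generalized Pareto distribution and the resulting one-day Value-at-Risk. The return series is simulated purely for illustration.

```python
# Minimal sketch: peaks-over-threshold VaR with a generalized Pareto tail.
# Data are simulated; this is not the paper's hybrid HMM-EVT model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
losses = -rng.standard_t(df=4, size=2500) * 0.01     # daily losses

u = np.quantile(losses, 0.95)                        # tail threshold
excesses = losses[losses > u] - u
xi, loc, beta = stats.genpareto.fit(excesses, floc=0.0)

alpha = 0.99                                         # VaR confidence level
n, n_u = len(losses), len(excesses)
var_99 = u + (beta / xi) * (((n / n_u) * (1 - alpha)) ** (-xi) - 1)
print(f"GPD tail index xi = {xi:.3f}, one-day 99% VaR = {var_99:.4f}")
```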

  2. The Twin Deficits Hypothesis: An Empirical Analysis for Tanzania

    Directory of Open Access Journals (Sweden)

    Manamba Epaphra

    2017-09-01

    Full Text Available This paper examines the relationship between current account and government budget deficits in Tanzania. The paper tests the validity of the twin deficits hypothesis, using annual time series data for the 1966-2015 period. The paper is thought to be significant because the concept of the twin deficit hypothesis is fraught with controversy. Some studies support the hypothesis that there is a positive relationship between current account deficits and fiscal deficits in the economy while others do not. In this paper, the empirical tests fail to reject the twin deficits hypothesis, indicating that rising budget deficits put more strain on the current account deficits in Tanzania. Specifically, the Vector Error Correction Model results support the conventional theory of a positive relationship between fiscal and external balances, with a relatively high speed of adjustment toward the equilibrium position. This evidence is consistent with a small open economy. To address the problem that may result from this kind of relationship, appropriate policy variables for reducing budget deficits, such as reduction in non-development expenditure, enhancement of domestic revenue collection and active measures against corruption and tax evasion, should be adopted. The government should also target export-oriented firms and encourage an import substitution industry by creating favorable business environments.

  3. Environmentalism and elitism: a conceptual and empirical analysis

    Science.gov (United States)

    Morrison, Denton E.; Dunlap, Riley E.

    1986-09-01

    The frequent charge that environmentalism is “elitist” is examined conceptually and empirically. First, the concept of elitism is analyzed by distinguishing between three types of accusations made against the environmental movement: (a) compositional elitism suggests that environmentalists are drawn from privileged socioeconomic strata, (b) ideological elitism suggests that environmental reforms are a subterfuge for distributing benefits to environmentalists and/or costs to others, and (c) impact elitism suggests that environmental reforms, whether intentionally or not, do in fact have regressive social impacts. The evidence bearing on each of the three types of elitism is examined in some detail, and the following conclusions are drawn: Compositional elitism is an exaggeration, for although environmentalists are typically above average in socioeconomic status (as are most sociopolitical activists), few belong to the upper class. Ideological elitism may hold in some instances, but environmentalists have shown increasing sensitivity to equity concerns and there is little evidence of consistent pursuit of self-interest. Impact elitism is the most important issue, and also the most difficult to assess. It appears that there has been a general tendency for environmental reforms to have regressive impacts. However, it is increasingly recognized that problems such as workplace pollution and toxic waste contamination disproportionately affect the lower socioeconomic strata, and thus reforms aimed at such problems will likely have more progressive impacts.

  4. Production functions for climate policy modeling. An empirical analysis

    International Nuclear Information System (INIS)

    Van der Werf, Edwin

    2008-01-01

    Quantitative models for climate policy modeling differ in the production structure used and in the sizes of the elasticities of substitution. The empirical foundation for both is generally lacking. This paper estimates the parameters of 2-level CES production functions with capital, labour and energy as inputs, and is the first to systematically compare all nesting structures. Using industry-level data from 12 OECD countries, we find that the nesting structure where capital and labour are combined first, fits the data best, but for most countries and industries we cannot reject that all three inputs can be put into one single nest. These two nesting structures are used by most climate models. However, while several climate policy models use a Cobb-Douglas function for (part of the) production function, we reject elasticities equal to one, in favour of considerably smaller values. Finally we find evidence for factor-specific technological change. With lower elasticities and with factor-specific technological change, some climate policy models may find a bigger effect of endogenous technological change on mitigating the costs of climate policy. (author)

  5. Analysis of dependent failures in risk assessment and reliability evaluation

    International Nuclear Information System (INIS)

    Fleming, K.N.; Mosleh, A.; Kelley, A.P. Jr. (Gas-Cooled Reactors Associates, La Jolla, CA)

    1983-01-01

    The ability to estimate the risk of potential reactor accidents is largely determined by the ability to analyze statistically dependent multiple failures. The importance of dependent failures has been indicated in recent probabilistic risk assessment (PRA) studies as well as in reports of reactor operating experiences. This article highlights the importance of several different types of dependent failures from the perspective of the risk and reliability analyst and provides references to the methods and data available for their analysis. In addition to describing the current state of the art, some recent advances, pitfalls, misconceptions, and limitations of some approaches to dependent failure analysis are addressed. A summary is included of the discourse on this subject, which is presented in the Institute of Electrical and Electronics Engineers/American Nuclear Society PRA Procedures Guide

  6. Model-based human reliability analysis: prospects and requirements

    International Nuclear Information System (INIS)

    Mosleh, A.; Chang, Y.H.

    2004-01-01

    Major limitations of the conventional methods for human reliability analysis (HRA), particularly those developed for operator response analysis in probabilistic safety assessments (PSA) of nuclear power plants, are summarized as a motivation for the need and a basis for developing requirements for the next generation HRA methods. It is argued that a model-based approach that provides explicit cognitive causal links between operator behaviors and directly or indirectly measurable causal factors should be at the core of the advanced methods. An example of such a causal model is briefly reviewed, which, due to its complexity and input requirements, can currently be implemented only in a dynamic PSA environment. The computer simulation code developed for this purpose is also described briefly, together with current limitations in the models, data, and the computer implementation

  7. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  8. Standardization of domestic human reliability analysis and experience of human reliability analysis in probabilistic safety assessment for NPPs under design

    International Nuclear Information System (INIS)

    Kang, D. I.; Jung, W. D.

    2002-01-01

    This paper introduces the background and development activities of the domestic standardization of procedures and methods for Human Reliability Analysis (HRA), intended to avoid, as far as possible, the intervention of the HRA analyst's subjectivity in Probabilistic Safety Assessment (PSA), together with a review of the HRA results for domestic nuclear power plants under design studied by the Korea Atomic Energy Research Institute. We identify the HRA methods used in PSAs for domestic NPPs and discuss the subjectivity of the HRA analyst shown in performing an HRA. We also introduce the PSA guidelines published in the USA and review the HRA results based on them. We propose a framework for the standard HRA procedure and method to be developed

  9. Systems reliability analysis: applications of the SPARCS System-Reliability Assessment Computer Program

    International Nuclear Information System (INIS)

    Locks, M.O.

    1978-01-01

    SPARCS-2 (Simulation Program for Assessing the Reliabilities of Complex Systems, Version 2) is a PL/1 computer program for assessing (establishing interval estimates for) the reliability and the MTBF of a large and complex s-coherent system of any modular configuration. The system can consist of a complex logical assembly of independently failing attribute (binomial-Bernoulli) and time-to-failure (Poisson-exponential) components, without regard to their placement. Alternatively, it can be a configuration of independently failing modules, where each module has either or both attribute and time-to-failure components. SPARCS-2 also has an improved super modularity feature. Modules with minimal-cut unreliability calculations can be mixed with those having minimal-path reliability calculations. All output has been standardized to system reliability or probability of success, regardless of the form in which the input data is presented, and whatever the configuration of modules or elements within modules
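
    The kind of calculation SPARCS-2 automates can be illustrated with a small point-estimate sketch: module reliabilities built from attribute (binomial-Bernoulli) and time-to-failure (Poisson-exponential) components, combined through a series/parallel system structure. This is not the SPARCS program itself; the component values and structure are invented.

```python
# Minimal sketch: point reliabilities of mixed modules combined in a
# series/parallel system. All failure rates and probabilities are invented.
import math

def exp_reliability(lam, t):
    """Time-to-failure component: survival probability over mission time t."""
    return math.exp(-lam * t)

def series(rs):
    p = 1.0
    for r in rs:
        p *= r
    return p

def parallel(rs):
    q = 1.0
    for r in rs:
        q *= (1.0 - r)
    return 1.0 - q

t = 720.0  # mission time [h]

# Module A: two exponential components in series.
mod_a = series([exp_reliability(2e-4, t), exp_reliability(5e-5, t)])
# Module B: an attribute component (success probability 0.98) in parallel
# with an exponential unit (treated here, simplistically, as active).
mod_b = parallel([0.98, exp_reliability(1e-3, t)])

system = series([mod_a, mod_b])
print(f"module A = {mod_a:.4f}, module B = {mod_b:.4f}, system = {system:.4f}")
```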

  10. A Bivariate Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Hilger, Klaus Baggesen; Andersen, Ole Baltazar

    2002-01-01

    This paper describes the application of canonical correlations analysis to the joint analysis of global monthly mean values of 1996-1997 sea surface temperature (SST) and height (SSH) data. The SST data are considered as one set and the SSH data as another set of multivariate observations, both w...... as for example an increase in the SST will lead to an increase in the SSH. The analysis clearly shows the build-up of one of the largest El Niño events on record. Also the analysis indicates a phase lag of approximately one month between the SST and SSH fields....

  11. Empirical Comparison of Publication Bias Tests in Meta-Analysis.

    Science.gov (United States)

    Lin, Lifeng; Chu, Haitao; Murad, Mohammad Hassan; Hong, Chuan; Qu, Zhiyong; Cole, Stephen R; Chen, Yong

    2018-04-16

    Decision makers rely on meta-analytic estimates to trade off benefits and harms. Publication bias impairs the validity and generalizability of such estimates. The performance of various statistical tests for publication bias has been largely compared using simulation studies and has not been systematically evaluated in empirical data. This study compares seven commonly used publication bias tests (i.e., Begg's rank test, trim-and-fill, Egger's, Tang's, Macaskill's, Deeks', and Peters' regression tests) based on 28,655 meta-analyses available in the Cochrane Library. Egger's regression test detected publication bias more frequently than other tests (15.7% in meta-analyses of binary outcomes and 13.5% in meta-analyses of non-binary outcomes). The proportion of statistically significant publication bias tests was greater for larger meta-analyses, especially for Begg's rank test and the trim-and-fill method. The agreement among Tang's, Macaskill's, Deeks', and Peters' regression tests for binary outcomes was moderately strong (most κ's were around 0.6). Tang's and Deeks' tests had fairly similar performance (κ > 0.9). The agreement among Begg's rank test, the trim-and-fill method, and Egger's regression test was weak or moderate (κ < 0.5). Given the relatively low agreement between many publication bias tests, meta-analysts should not rely on a single test and may apply multiple tests with various assumptions. Non-statistical approaches to evaluating publication bias (e.g., searching clinical trials registries, records of drug approving agencies, and scientific conference proceedings) remain essential.
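
    Of the tests compared above, Egger's regression test is the most frequently significant; a minimal sketch of that test is given below, assuming the common formulation in which the standardized effect (effect divided by its standard error) is regressed on precision (the reciprocal of the standard error) and the intercept is tested against zero. The study data are invented.

```python
# Minimal sketch of Egger's regression test for funnel-plot asymmetry.
# Effects and standard errors are simulated with a small induced asymmetry.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
k = 30
se = rng.uniform(0.05, 0.40, k)                    # per-study standard errors
effect = rng.normal(0.2, se) + 0.8 * se            # + 0.8*se injects asymmetry

y = effect / se                                    # standard normal deviates
x = 1.0 / se                                       # precisions
X = np.column_stack([np.ones(k), x])

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
s2 = resid @ resid / (k - 2)
cov = s2 * np.linalg.inv(X.T @ X)
t_stat = beta[0] / np.sqrt(cov[0, 0])
p_value = 2 * stats.t.sf(abs(t_stat), df=k - 2)

print(f"Egger intercept = {beta[0]:.3f}, t = {t_stat:.2f}, p = {p_value:.3f}")
```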

  12. Empirical Green's function analysis: Taking the next step

    Science.gov (United States)

    Hough, S.E.

    1997-01-01

    An extension of the empirical Green's function (EGF) method is presented that involves determination of source parameters using standard EGF deconvolution, followed by inversion for a common attenuation parameter for a set of colocated events. Recordings of three or more colocated events can thus be used to constrain a single path attenuation estimate. I apply this method to recordings from the 1995-1996 Ridgecrest, California, earthquake sequence; I analyze four clusters consisting of 13 total events with magnitudes between 2.6 and 4.9. I first obtain corner frequencies, which are used to infer Brune stress drop estimates. I obtain stress drop values of 0.3-53 MPa (with all but one between 0.3 and 11 MPa), with no resolved increase of stress drop with moment. With the corner frequencies constrained, the inferred attenuation parameters are very consistent; they imply an average shear wave quality factor of approximately 20-25 for alluvial sediments within the Indian Wells Valley. Although the resultant spectral fitting (using corner frequency and κ) is good, the residuals are consistent among the clusters analyzed. Their spectral shape is similar to the theoretical one-dimensional response of a layered low-velocity structure in the valley (an absolute site response cannot be determined by this method, because of an ambiguity between absolute response and source spectral amplitudes). I show that even this subtle site response can significantly bias estimates of corner frequency and κ, if it is ignored in an inversion for only source and path effects. The multiple-EGF method presented in this paper is analogous to a joint inversion for source, path, and site effects; the use of colocated sets of earthquakes appears to offer significant advantages in improving resolution of all three estimates, especially if data are from a single site or sites with similar site response.

  13. Empirical analysis on the runners' velocity distribution in city marathons

    Science.gov (United States)

    Lin, Zhenquan; Meng, Fan

    2018-01-01

    In recent decades, much research has been performed on human temporal activity and mobility patterns, while few investigations have been made to examine the features of the velocity distributions of human mobility patterns. In this paper, we investigated empirically the velocity distributions of finishers in the New York City marathon, the Chicago marathon, the Berlin marathon and the London marathon. By statistical analyses on the datasets of the finish time records, we captured some statistical features of human behaviors in marathons: (1) The velocity distributions of all finishers and of partial finishers in the fastest age group both follow a log-normal distribution; (2) In the New York City marathon, the velocity distribution of all male runners in eight 5-kilometer internal timing courses undergoes two transitions: from a log-normal distribution at the initial stage (several initial courses) to a Gaussian distribution at the middle stage (several middle courses), and back to a log-normal distribution at the last stage (several last courses); (3) The intensity of the competition, which is described by the root-mean-square value of the rank changes of all runners, weakens from the initial stage to the middle stage, corresponding to the transition of the velocity distribution from log-normal to Gaussian, and when the competition gets stronger in the last course of the middle stage, a transition from the Gaussian distribution back to the log-normal one occurs at the last stage. This study may enrich research on human mobility patterns and attract attention to the velocity features of human mobility.
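
    The distribution check described above can be sketched by fitting log-normal and Gaussian models to a velocity sample and comparing their goodness of fit. The sketch below uses simulated velocities, not the marathon datasets of the paper, and a simple Kolmogorov-Smirnov distance as the comparison criterion.

```python
# Minimal sketch: compare log-normal and Gaussian fits to runner velocities.
# Velocities are simulated for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
velocity = rng.lognormal(mean=np.log(3.0), sigma=0.15, size=5000)  # m/s

# Log-normal fit (location fixed at zero) and Gaussian fit.
shape, loc, scale = stats.lognorm.fit(velocity, floc=0.0)
mu, sd = stats.norm.fit(velocity)

ks_lognorm = stats.kstest(velocity, "lognorm", args=(shape, loc, scale)).statistic
ks_norm = stats.kstest(velocity, "norm", args=(mu, sd)).statistic
print(f"KS distance: log-normal = {ks_lognorm:.4f}, Gaussian = {ks_norm:.4f}")
```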

  14. 188 An Empirical Investigation of Value-Chain Analysis and ...

    African Journals Online (AJOL)

    User

    Analysis has a positive but insignificant impact on Competitive Advantage of a manufacturing firm in ... chain analysis is a means of increasing customer satisfaction and managing costs more ... The linkages express the relationships between the ... margins, return on assets, benchmarking, and capital budgeting. When a.

  15. Reliability and availability requirements analysis for DEMO: fuel cycle system

    International Nuclear Information System (INIS)

    Pinna, T.; Borgognoni, F.

    2015-01-01

    The Demonstration Power Plant (DEMO) will be a fusion reactor prototype designed to demonstrate the capability to produce electrical power in a commercially acceptable way. Two of the key elements of the engineering development of the DEMO reactor are the definitions of reliability and availability requirements (or targets). The availability target for a hypothesized Fuel Cycle has been analysed as a test case. The analysis has been done on the basis of the experience gained in operating existing tokamak fusion reactors and developing the ITER design. Plant Breakdown Structure (PBS) and Functional Breakdown Structure (FBS) related to the DEMO Fuel Cycle and correlations between PBS and FBS have been identified. At first, a set of availability targets has been allocated to the various systems on the basis of their operating, protection and safety functions. 75% and 85% of availability has been allocated to the operating functions of fuelling system and tritium plant respectively. 99% of availability has been allocated to the overall systems in executing their safety functions. The chances of the systems to achieve the allocated targets have then been investigated through a Failure Mode and Effect Analysis and Reliability Block Diagram analysis. The following results have been obtained: 1) the target of 75% for the operations of the fuelling system looks reasonable, while the target of 85% for the operations of the whole tritium plant should be reduced to 80%, even though all the tritium plant systems can individually reach quite high availability targets, over 90% - 95%; 2) all the DEMO Fuel Cycle systems can reach the target of 99% in accomplishing their safety functions. (authors)

  16. Empirical Risk Analysis of Severe Reactor Accidents in Nuclear Power Plants after Fukushima

    OpenAIRE

    Kaiser, Jan Christian

    2012-01-01

    Many countries are reexamining the risks connected with nuclear power generation after the Fukushima accidents. To provide updated information for the corresponding discussion a simple empirical approach is applied for risk quantification of severe reactor accidents with International Nuclear and Radiological Event Scale (INES) level ≥5. The analysis is based on worldwide data of commercial nuclear facilities. An empirical hazard of 21 (95% confidence intervals (CI) 4; 62) severe accidents am...

  17. reliability reliability

    African Journals Online (AJOL)

    eobe

    RELIABILITY .... given by the code of practice. However, checks must .... an optimization procedure over the failure domain F corresponding .... of Concrete Members based on Utility Theory, Technical ...

  18. Reliability analysis of large scaled structures by optimization technique

    International Nuclear Information System (INIS)

    Ishikawa, N.; Mihara, T.; Iizuka, M.

    1987-01-01

    This paper presents a reliability analysis based on the optimization technique using PNET (Probabilistic Network Evaluation Technique) method for the highly redundant structures having a large number of collapse modes. This approach makes the best use of the merit of the optimization technique in which the idea of PNET method is used. The analytical process involves the minimization of safety index of the representative mode, subjected to satisfaction of the mechanism condition and of the positive external work. The procedure entails the sequential performance of a series of the NLP (Nonlinear Programming) problems, where the correlation condition as the idea of PNET method pertaining to the representative mode is taken as an additional constraint to the next analysis. Upon succeeding iterations, the final analysis is achieved when a collapse probability at the subsequent mode is extremely less than the value at the 1st mode. The approximate collapse probability of the structure is defined as the sum of the collapse probabilities of the representative modes classified by the extent of correlation. Then, in order to confirm the validity of the proposed method, the conventional Monte Carlo simulation is also revised by using the collapse load analysis. Finally, two fairly large structures were analyzed to illustrate the scope and application of the approach. (orig./HP)

  19. Modeling ionospheric foF2 by using empirical orthogonal function analysis

    Directory of Open Access Journals (Sweden)

    E. A

    2011-08-01

    Full Text Available A similar-parameters interpolation method and an empirical orthogonal function analysis are used to construct empirical models for the ionospheric foF2 by using the observational data from three ground-based ionosonde stations in Japan, namely Wakkanai (Geographic 45.4° N, 141.7° E), Kokubunji (Geographic 35.7° N, 140.1° E) and Yamagawa (Geographic 31.2° N, 130.6° E), during the years 1971–1987. The impact of different drivers on ionospheric foF2 can be well indicated by choosing appropriate proxies. It is shown that the missing data of the original foF2 can be optimally refilled using the similar-parameters method. The characteristics of the base functions and associated coefficients of the EOF model are analyzed. The diurnal variation of the base functions reflects the essential nature of ionospheric foF2, while the coefficients represent the long-term alteration tendency. The 1st order EOF coefficient A1 reflects the feature of the components with solar cycle variation. A1 also contains an evident semi-annual variation component as well as a relatively weak annual fluctuation component, both of which are less pronounced than the solar cycle variation. The 2nd order coefficient A2 contains mainly annual variation components. The 3rd order coefficient A3 and 4th order coefficient A4 contain both annual and semi-annual variation components. The seasonal variation, solar rotation oscillation and small-scale irregularities are also included in the 4th order coefficient A4. The amplitude range and developing tendency of all these coefficients depend on the level of solar activity and geomagnetic activity. The reliability and validity of the EOF model are verified by comparison with observational data and with the International Reference Ionosphere (IRI). The agreement between observations and the EOF model is quite good, indicating that the EOF model can reflect the major changes and the temporal distribution characteristics of the mid-latitude ionosphere of the
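
    As a companion to the record above, the following is a minimal sketch of how an EOF decomposition can be computed with a singular value decomposition. The synthetic hours-by-days matrix and noise levels are assumptions for illustration only; they do not reproduce the Japanese ionosonde data set.

```python
import numpy as np

# Synthetic stand-in for an hours-by-days foF2 matrix (real models use long
# ionosonde records; shapes and values here are illustrative only).
rng = np.random.default_rng(0)
hours, days = 24, 365
t = np.arange(days)
diurnal = np.sin(np.linspace(0, 2 * np.pi, hours))[:, None]
annual = (1 + 0.3 * np.sin(2 * np.pi * t / 365))[None, :]
foF2 = 6.0 * diurnal * annual + 0.2 * rng.standard_normal((hours, days))

# EOF analysis: remove the mean, then take the SVD. Columns of U are the
# base functions (diurnal shapes); rows of s * Vt are the time-varying
# coefficients A1, A2, ...
mean = foF2.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(foF2 - mean, full_matrices=False)
coefficients = s[:, None] * Vt                 # A_k(t) for each EOF order k
explained = s**2 / np.sum(s**2)

print("variance explained by first 4 EOFs:", explained[:4].round(3))
reconstructed = mean + U[:, :4] @ coefficients[:4]   # 4th-order EOF model
```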

  20. Application of human reliability analysis methodology of second generation

    International Nuclear Information System (INIS)

    Ruiz S, T. de J.; Nelson E, P. F.

    2009-10-01

    The human reliability analysis (HRA) is a very important part of probabilistic safety analysis. The main contribution of HRA in nuclear power plants is the identification and characterization of the issues that combine to produce an error in the human tasks carried out under normal operation conditions and in those performed after an abnormal event. Additionally, in the analysis of various accidents in history, it was found that the human component has been a contributing factor in the cause. The need to understand the forms and probability of human error led, in the 1960s, to the collection of generic data that resulted in the development of the first generation of HRA methodologies. Methods were subsequently developed to include in their models additional performance shaping factors and the interactions between them. By the mid-1990s, what are considered the second-generation methodologies emerged. Among these is the methodology A Technique for Human Event Analysis (ATHEANA). The application of this method to a generic human failure event is interesting because its modeling includes errors of commission, the quantification of additional deviations from the nominal scenario considered in the accident sequence of the probabilistic safety analysis and, for this event, the evaluation of dependency between actions. That is, the generic human failure event first required an independent evaluation of the two related human failure events. Thus the gathering of the new human error probabilities involves the quantification of the nominal scenario and of the significant deviation cases considered for their potential impact on the analyzed human failure events. As in the probabilistic safety analysis, the analysis of the sequences yielded the more specific factors with the highest contribution to the human error probabilities. (Author)

  1. Analogical reasoning for reliability analysis based on generic data

    Energy Technology Data Exchange (ETDEWEB)

    Kozin, Igor O

    1996-10-01

    The paper suggests using the systemic concept 'analogy' for the foundation of an approach to analyze system reliability on the basis of generic data, describing the method of structuring the set that defines analogical models, an approach of transition from the analogical model to a reliability model and a way of obtaining reliability intervals of analogous objects.

  2. Analogical reasoning for reliability analysis based on generic data

    International Nuclear Information System (INIS)

    Kozin, Igor O.

    1996-01-01

    The paper suggests using the systemic concept 'analogy' for the foundation of an approach to analyze system reliability on the basis of generic data, describing the method of structuring the set that defines analogical models, an approach of transition from the analogical model to a reliability model and a way of obtaining reliability intervals of analogous objects

  3. Applicability of simplified human reliability analysis methods for severe accidents

    Energy Technology Data Exchange (ETDEWEB)

    Boring, R.; St Germain, S. [Idaho National Lab., Idaho Falls, Idaho (United States); Banaseanu, G.; Chatri, H.; Akl, Y. [Canadian Nuclear Safety Commission, Ottawa, Ontario (Canada)

    2016-03-15

    Most contemporary human reliability analysis (HRA) methods were created to analyse design-basis accidents at nuclear power plants. As part of a comprehensive expansion of risk assessments at many plants internationally, HRAs will begin considering severe accident scenarios. Severe accidents, while extremely rare, constitute high consequence events that significantly challenge successful operations and recovery. Challenges during severe accidents include degraded and hazardous operating conditions at the plant, the shift in control from the main control room to the technical support center, the unavailability of plant instrumentation, and the need to use different types of operating procedures. Such shifts in operations may also test key assumptions in existing HRA methods. This paper discusses key differences between design basis and severe accidents, reviews efforts to date to create customized HRA methods suitable for severe accidents, and recommends practices for adapting existing HRA methods that are already being used for HRAs at the plants. (author)

  4. Fatigue Reliability Analysis of Wind Turbine Cast Components

    DEFF Research Database (Denmark)

    Rafsanjani, Hesam Mirzaei; Sørensen, John Dalsgaard; Fæster, Søren

    2017-01-01

    The fatigue life of wind turbine cast components, such as the main shaft in a drivetrain, is generally determined by defects from the casting process. These defects may reduce the fatigue life and they are generally distributed randomly in components. The foundries, cutting facilities and test facilities can affect the verification of properties by testing. Hence, it is important to have a tool to identify which foundry, cutting and/or test facility produces components which, based on the relevant uncertainties, have the largest expected fatigue life or, alternatively, have the largest reliability (...) and to quantify the relevant uncertainties using available fatigue tests. Illustrative results are presented as obtained by statistical analysis of a large set of fatigue data for casted test components typically used for wind turbines. Furthermore, the SN curves (fatigue life curves based on applied stress ...

  5. Basic aspects of stochastic reliability analysis for redundancy systems

    International Nuclear Information System (INIS)

    Doerre, P.

    1989-01-01

    Much confusion has been created by trying to establish common cause failure (CCF) as an extra phenomenon which has to be treated with extra methods in reliability and data analysis. This paper takes another approach which can be roughly described by the statement that dependent failure is the basic phenomenon, while 'independent failure' refers to a special limiting case, namely the perfectly homogeneous population. This approach is motivated by examples demonstrating that common causes do not lead to dependent failure, so far as physical dependencies like shared components are excluded, and that stochastic dependencies are not related to common causes. The possibility to select more than one failure behaviour from an inhomogeneous population is identified as an additional random process which creates stochastic dependence. However, this source of randomness is usually treated in the deterministic limit, which destroys dependence and hence yields incorrect multiple failure frequencies for redundancy structures, thus creating the need for applying corrective CCF models. (author)

  6. An Application of Graph Theory in Markov Chains Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Pavel Skalny

    2014-01-01

    Full Text Available The paper presents a reliability analysis carried out for an industrial company. The aim of the paper is to present the use of discrete-time Markov chains and the network flow approach. Discrete Markov chains, a well-known method of stochastic modelling, describe the issue. The method is suitable for many systems occurring in practice where a distinct set of states can easily be identified. Markov chains are used to describe transitions between the states of the process. The industrial process is described as a graph network. The maximal flow in the network corresponds to the production. The Ford-Fulkerson algorithm is used to quantify the production for each state. The combination of both methods is utilized to quantify the expected amount of manufactured products for the given time period.
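
    The following sketch combines the two ingredients described above: a discrete-time Markov chain over plant states and a Ford-Fulkerson-type maximum flow per state, using networkx. The transition probabilities, network topology and capacities are hypothetical placeholders, not data from the studied company.

```python
import numpy as np
import networkx as nx

# Two illustrative plant states: 0 = all machines up, 1 = one line degraded.
# Transition matrix of the discrete-time Markov chain (hypothetical values).
P = np.array([[0.9, 0.1],
              [0.6, 0.4]])

# Stationary distribution pi solving pi = pi P (left eigenvector for 1).
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
pi = pi / pi.sum()

def production(capacity_scale):
    """Max flow from source to sink for a small production network."""
    G = nx.DiGraph()
    G.add_edge("src", "m1", capacity=10 * capacity_scale)
    G.add_edge("src", "m2", capacity=8 * capacity_scale)
    G.add_edge("m1", "sink", capacity=9 * capacity_scale)
    G.add_edge("m2", "sink", capacity=8 * capacity_scale)
    flow_value, _ = nx.maximum_flow(G, "src", "sink")
    return flow_value

# Expected production = sum over states of (stationary prob x max flow).
flows = [production(1.0), production(0.5)]
expected = sum(p * f for p, f in zip(pi, flows))
print(f"stationary distribution {pi.round(3)}, expected production {expected:.1f}")
```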

  7. Time-dependent reliability analysis and condition assessment of structures

    International Nuclear Information System (INIS)

    Ellingwood, B.R.

    1997-01-01

    Structures generally play a passive role in assurance of safety in nuclear plant operation, but are important if the plant is to withstand the effect of extreme environmental or abnormal events. Relative to mechanical and electrical components, structural systems and components would be difficult and costly to replace. While the performance of steel or reinforced concrete structures in service generally has been very good, their strengths may deteriorate during an extended service life as a result of changes brought on by an aggressive environment, excessive loading, or accidental loading. Quantitative tools for condition assessment of aging structures can be developed using time-dependent structural reliability analysis methods. Such methods provide a framework for addressing the uncertainties attendant to aging in the decision process

  8. Reliability analysis for the quench detection in the LHC machine

    CERN Document Server

    Denz, R; Vergara-Fernández, A

    2002-01-01

    The Large Hadron Collider (LHC) will incorporate a large number of superconducting elements that require protection in case of a quench. Key elements in the quench protection system are the electronic quench detectors. Their reliability will have an important impact on the down time as well as on the operational cost of the collider. The expected rates of both false and missed quenches have been computed for several redundant detection schemes. The developed model takes account of the maintainability of the system to optimise the frequency of foreseen checks, and to evaluate their influence on the performance of different detection topologies. Given the uncertainty in the failure rate of the components, combined with the LHC tunnel environment, the study has been completed with a sensitivity analysis of the results. The chosen detection scheme and the maintainability strategy for each detector family are given.

  9. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    Science.gov (United States)

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue. Copyright © 2016 Elsevier B.V. All rights reserved.
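
    The ERA Toolbox itself is Matlab software; the sketch below is only an illustration, in Python, of the underlying idea of a generalizability-style coefficient: variance components from a persons-by-trials design determine how the reliability of the trial-averaged score grows with the number of retained trials. The simulated data and variance values are assumptions.

```python
import numpy as np

# Simulated single-group design: persons x trials ERP amplitude scores.
rng = np.random.default_rng(1)
n_p, n_t = 40, 30
person_effect = rng.normal(0, 2.0, size=(n_p, 1))      # true score variance
noise = rng.normal(0, 4.0, size=(n_p, n_t))             # trial-level error
scores = 5.0 + person_effect + noise

# Two-way ANOVA variance components for a p x t design (no replication).
grand = scores.mean()
ms_p = n_t * np.sum((scores.mean(axis=1) - grand) ** 2) / (n_p - 1)
resid = scores - scores.mean(axis=1, keepdims=True) - scores.mean(axis=0) + grand
ms_e = np.sum(resid ** 2) / ((n_p - 1) * (n_t - 1))

var_e = ms_e
var_p = max((ms_p - ms_e) / n_t, 0.0)

# Reliability of the n_t-trial average (generalizability-style coefficient).
reliability = var_p / (var_p + var_e / n_t)
print(f"person variance {var_p:.2f}, error variance {var_e:.2f}, "
      f"reliability of {n_t}-trial average {reliability:.2f}")
```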

  10. A reliability analysis of the revised competitiveness index.

    Science.gov (United States)

    Harris, Paul B; Houston, John M

    2010-06-01

    This study examined the reliability of the Revised Competitiveness Index by investigating the test-retest reliability, inter-item reliability, and factor structure of the measure based on a sample of 280 undergraduates (200 women, 80 men) ranging in age from 18 to 28 years (M = 20.1, SD = 2.1). The findings indicate that the Revised Competitiveness Index has high test-retest reliability, high inter-item reliability, and a stable factor structure. The results support the assertion that the Revised Competitiveness Index assesses competitiveness as a stable trait rather than a dynamic state.
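
    For readers unfamiliar with the two statistics reported above, the following sketch computes Cronbach's alpha (inter-item reliability) and a test-retest correlation on made-up item responses. The sample size and item counts are arbitrary and do not correspond to the published data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Made-up item responses: 280 respondents x 14 items (illustrative only).
trait = rng.normal(0, 1, size=(280, 1))
items_t1 = trait + rng.normal(0, 0.8, size=(280, 14))
items_t2 = trait + rng.normal(0, 0.8, size=(280, 14))   # retest session

# Inter-item reliability: Cronbach's alpha.
k = items_t1.shape[1]
item_var = items_t1.var(axis=0, ddof=1).sum()
total_var = items_t1.sum(axis=1).var(ddof=1)
alpha = k / (k - 1) * (1 - item_var / total_var)

# Test-retest reliability: correlation of total scores across sessions.
retest_r = np.corrcoef(items_t1.sum(axis=1), items_t2.sum(axis=1))[0, 1]

print(f"Cronbach's alpha {alpha:.2f}, test-retest r {retest_r:.2f}")
```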

  11. Empirical analysis of scaling and fractal characteristics of outpatients

    International Nuclear Information System (INIS)

    Zhang, Li-Jiang; Liu, Zi-Xian; Guo, Jin-Li

    2014-01-01

    The paper uses power-law frequency distribution, power spectrum analysis, detrended fluctuation analysis, and surrogate data testing to evaluate outpatient registration data of two hospitals in China and to investigate the human dynamics of systems that use “first come, first served” protocols. The research results reveal that outpatient behavior follows scaling laws. The results also suggest that the time series of inter-arrival times exhibit 1/f noise and have positive long-range correlation. Our research may contribute to operational optimization and resource allocation in hospitals based on FCFS admission protocols.
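
    A compact sketch of one of the techniques named above, detrended fluctuation analysis, is given below. The synthetic inter-arrival series and window sizes are assumptions; a scaling exponent near 0.5 indicates an uncorrelated series, while values above 0.5 indicate positive long-range correlation.

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: returns F(n) for each window size n."""
    y = np.cumsum(x - np.mean(x))             # integrated profile
    F = []
    for n in scales:
        n_seg = len(y) // n
        segs = y[: n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        rms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)       # local linear trend
            rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    return np.array(F)

rng = np.random.default_rng(3)
series = rng.exponential(scale=5.0, size=4096)    # stand-in inter-arrival times

scales = np.array([16, 32, 64, 128, 256, 512])
F = dfa(series, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"DFA scaling exponent alpha = {alpha:.2f} "
      "(0.5 = uncorrelated, >0.5 = positive long-range correlation)")
```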

  12. Empirical Analysis of Pneumatic Tire Friction on Ice

    OpenAIRE

    Holley, Troy Nigel

    2010-01-01

    Pneumatic tire friction on ice is an under-researched area of tire mechanics. This study covers the design and analysis of a series of pneumatic tire tests on a flat-level ice road surface. The terramechanics rig of the Advanced Vehicle Dynamics Lab (AVDL) is a single-wheel test rig that allows for the experimental analysis of the forces and moments on a tire, providing directly the data for the drawbar pull of said tire, thus supporting the calculation of friction based on this data. This...

  13. Formal analysis of empirical traces in incident management

    International Nuclear Information System (INIS)

    Hoogendoorn, Mark; Jonker, Catholijn M.; Maanen, Peter-Paul van; Sharpanskykh, Alexei

    2008-01-01

    Within the field of incident management split second decisions have to be made, usually on the basis of incomplete and partially incorrect information. As a result of these conditions, errors occur in such decision processes. In order to avoid repetition of such errors, historic cases, disaster plans, and training logs need to be thoroughly analysed. This paper presents a formal approach for such an analysis that pays special attention to spatial and temporal aspects, to information exchange, and to organisational structure. The formal nature of the approach enables automation of analysis, which is illustrated by case studies of two disasters

  14. Empirical Analysis of Religiosity as Predictor of Social Media Addiction

    OpenAIRE

    Jamal J Almenayes

    2015-01-01

    This study sought to examine the dimensions of social media addiction and its relationship to religiosity. To investigate the matter, the present research utilized a well-known Internet addiction scale and modified it to fit social media (Young, 1996). Based on a factor analysis of items generated by a sample of 1326 participants, three addiction factors were apparent. These factors were later regressed on a scale of religiosity. This scale contained a single factor based on factor analysis. Result...

  15. Empirical analysis of scaling and fractal characteristics of outpatients

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Li-Jiang, E-mail: zljjiang@gmail.com [College of Management and Economics, Tianjin University, Tianjin 300072 (China); Management Institute, Xinxiang Medical University, Xinxiang 453003, Henan (China); Liu, Zi-Xian, E-mail: liuzixian@tju.edu.cn [College of Management and Economics, Tianjin University, Tianjin 300072 (China); Guo, Jin-Li, E-mail: phd5816@163.com [Business School, University of Shanghai for Science and Technology, Shanghai 200093 (China)

    2014-01-31

    The paper uses power-law frequency distribution, power spectrum analysis, detrended fluctuation analysis, and surrogate data testing to evaluate outpatient registration data of two hospitals in China and to investigate the human dynamics of systems that use “first come, first served” protocols. The research results reveal that outpatient behavior follows scaling laws. The results also suggest that the time series of inter-arrival times exhibit 1/f noise and have positive long-range correlation. Our research may contribute to operational optimization and resource allocation in hospitals based on FCFS admission protocols.

  16. Physical Violence between Siblings: A Theoretical and Empirical Analysis

    Science.gov (United States)

    Hoffman, Kristi L.; Kiecolt, K. Jill; Edwards, John N.

    2005-01-01

    This study develops and tests a theoretical model to explain sibling violence based on the feminist, conflict, and social learning theoretical perspectives and research in psychology and sociology. A multivariate analysis of data from 651 young adults generally supports hypotheses from all three theoretical perspectives. Males with brothers have…

  17. Unemployment and Labor Market Institutions : An Empirical Analysis

    NARCIS (Netherlands)

    Belot, M.V.K.; van Ours, J.C.

    2001-01-01

    The development of the unemployment rate differs substantially between OECD countries. In this paper we investigate to what extent these differences are related to labor market institutions. In our analysis we use data of eighteen OECD countries over the period 1960-1994 and show that the way in which

  18. Development of the integrated system reliability analysis code MODULE

    International Nuclear Information System (INIS)

    Han, S.H.; Yoo, K.J.; Kim, T.W.

    1987-01-01

    The major components of a system reliability analysis are the determination of cut sets, importance measures, and uncertainty analysis. Various computer codes have been used for these purposes. For example, SETS and FTAP are used to determine cut sets; Importance for importance calculations; and Sample, CONINT, and MOCUP for uncertainty analysis. Problems have arisen when these codes are run in sequence and their inputs and outputs are not linked, which can result in errors when preparing input for each code. The code MODULE was developed to carry out the above calculations in a single run without passing inputs and outputs to other codes. MODULE can also prepare input for SETS in the case of a large fault tree that cannot be handled by MODULE itself. The flow diagram of the MODULE code is shown. To verify the MODULE code, two examples are selected and the results and computation times are compared with those of SETS, FTAP, CONINT, and MOCUP on both a Cyber 170-875 and an IBM PC/AT. The two examples are fault trees of the auxiliary feedwater systems (AFWS) of Korea Nuclear Units (KNU)-1 and -2, which have 54 gates and 115 events, and 39 gates and 92 events, respectively. The MODULE code has the advantage that it can calculate the cut sets, importances, and uncertainties in a single run with little increase in computing time over other codes, and that it can be used on personal computers.
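
    Independent of the MODULE code itself, the sketch below shows the kind of quantities such codes compute from minimal cut sets: a rare-event approximation of the top-event probability and Fussell-Vesely importance measures. The cut sets and basic-event probabilities are invented for illustration.

```python
# Minimal cut sets and basic-event probabilities are illustrative only.
cut_sets = [{"P1", "P2"}, {"V1"}, {"P1", "V2"}]
prob = {"P1": 1e-2, "P2": 2e-2, "V1": 1e-4, "V2": 5e-3}

def cs_prob(cs):
    """Probability of a single minimal cut set (independent basic events)."""
    p = 1.0
    for e in cs:
        p *= prob[e]
    return p

# Rare-event approximation for the top event: sum of cut-set probabilities.
top = sum(cs_prob(cs) for cs in cut_sets)

# Fussell-Vesely importance: fraction of the top-event probability carried
# by the cut sets that contain the event.
for event in sorted(prob):
    contrib = sum(cs_prob(cs) for cs in cut_sets if event in cs)
    print(f"{event}: FV importance {contrib / top:.3f}")

print(f"top event probability ~ {top:.2e}")
```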

  19. Human reliability analysis for advanced control room of KNGR

    International Nuclear Information System (INIS)

    Kim, Myung-Ro; Park, Seong-Kyu

    2000-01-01

    The Human Reliability Analysis (HRA) performed during the Korean Next Generation Reactor (KNGR) Phase 2 research project had two purposes. One was to present the human error probability quantification results for the Probabilistic Safety Assessment (PSA), and the other was to provide a list of the critical operator actions for Human Factors Engineering (HFE). Critical operator actions were identified from the KNGR HRA/RSA based on selection criteria and incorporated in the MMI Task Analysis, where they receive additional treatment. The use of HRA/PSA results in design, procedure development, and training was ensured by their incorporation in the MMI task analysis and in the MCR design, such as fixed-position alarms, displays and controls. Any dominant PSA sequence that takes credit for human performance to achieve acceptable results was incorporated in MMIS validation activities through the PSA-based critical operator actions. The integration of the KNGR HRA into the MMI design sufficiently addressed all applicable review criteria of NUREG-0800, Chapter 18, Section 2 F and NUREG-0711. (S.Y.)

  20. Models and data requirements for human reliability analysis

    International Nuclear Information System (INIS)

    1989-03-01

    It has been widely recognised for many years that the safety of nuclear power generation depends heavily on the human factors related to plant operation. This has been confirmed by the accidents at Three Mile Island and Chernobyl. Both these cases revealed how human actions can defeat engineered safeguards, and the need for special operator training to cover the possibility of unexpected plant conditions. The importance of the human factor also stands out in the analysis of abnormal events and in insights from probabilistic safety assessments (PSAs), which reveal a large proportion of cases having their origin in faulty operator performance. A consultants' meeting, organized jointly by the International Atomic Energy Agency (IAEA) and the International Institute for Applied Systems Analysis (IIASA), was held at IIASA in Laxenburg, Austria, December 7-11, 1987, with the aim of reviewing existing models used in Probabilistic Safety Assessment (PSA) for Human Reliability Analysis (HRA) and of identifying the data required. The report collects both the contributions offered by the members of the Expert Task Force and the findings of the extensive discussions that took place during the meeting. Refs, figs and tabs

  1. Reliability analysis of component of affination centrifugal 1 machine by using reliability engineering

    Science.gov (United States)

    Sembiring, N.; Ginting, E.; Darnello, T.

    2017-12-01

    In a company that produces refined sugar, the production floor has not reached the target availability level for its critical machines because they often suffer damage (breakdowns). This results in a sudden loss of production time and production opportunities. This problem can be addressed with the Reliability Engineering method, in which a statistical approach is applied to historical failure data to identify the pattern of the distribution. The method provides the reliability, failure rate, and availability of a machine for a given maintenance interval schedule. The distribution test on the time-between-failure data (MTTF) shows that the flexible hose component follows a lognormal distribution while the teflon cone lifting component follows a Weibull distribution. For the repair time data (MTTR), the flexible hose component follows an exponential distribution while the teflon cone lifting component follows a Weibull distribution. For the flexible hose component on a replacement schedule of 720 hours, the reliability is 0.2451 and the availability 0.9960, while for the critical teflon cone lifting component on a replacement schedule of 1944 hours, the reliability is 0.4083 and the availability 0.9927.
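
    The sketch below mirrors the workflow described above: fit a life distribution to time-between-failure data, then evaluate reliability at a replacement interval and steady-state availability from MTTF and MTTR. The failure and repair data are simulated, and a single Weibull fit is shown for brevity even though the study found different distributions for different components.

```python
import numpy as np
from scipy import stats

# Illustrative time-between-failure data (hours) for one critical component.
rng = np.random.default_rng(4)
ttf = stats.weibull_min.rvs(c=1.8, scale=900.0, size=40, random_state=rng)
ttr = rng.exponential(scale=6.0, size=40)           # repair times (hours)

# Fit a two-parameter Weibull (location fixed at 0).
shape, loc, scale = stats.weibull_min.fit(ttf, floc=0)

mttf = stats.weibull_min.mean(shape, loc=loc, scale=scale)
mttr = ttr.mean()

t_replace = 720.0                                   # replacement interval
reliability = stats.weibull_min.sf(t_replace, shape, loc=loc, scale=scale)
availability = mttf / (mttf + mttr)                 # steady-state estimate

print(f"shape {shape:.2f}, scale {scale:.0f} h, MTTF {mttf:.0f} h")
print(f"R({t_replace:.0f} h) = {reliability:.3f}, availability = {availability:.4f}")
```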

  2. Bearing Procurement Analysis Method by Total Cost of Ownership Analysis and Reliability Prediction

    Science.gov (United States)

    Trusaji, Wildan; Akbar, Muhammad; Sukoyo; Irianto, Dradjad

    2018-03-01

    In bearing procurement analysis, both price and reliability must be considered as decision criteria, since price determines the direct (acquisition) cost while the reliability of the bearing determines indirect costs such as maintenance cost. Although the indirect cost is hard to identify and measure, it contributes substantially to the overall cost that will be incurred, so the indirect cost of reliability must be considered in the procurement analysis. This paper describes a bearing evaluation method based on total cost of ownership analysis that treats price and maintenance cost as decision criteria. Furthermore, since failure data are lacking at the evaluation stage, a reliability prediction method is used to predict bearing reliability from its dynamic load rating parameter. With this method, a more expensive but more reliable bearing is preferable for long-term planning, whereas for short-term planning the cheaper, less reliable one is preferable. This contextuality can give rise to conflict between stakeholders. Thus, the planning horizon needs to be agreed by all stakeholders before making a procurement decision.
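
    A hedged illustration of the reliability-prediction step is given below, using the standard basic rating life (L10) relation between dynamic load rating and equivalent load, and a Weibull-based conversion to other reliability levels. The catalogue values, speed, load and Weibull shape are assumptions, not data from the paper.

```python
import math

def l10_hours(C_kN, P_kN, rpm, exponent=3.0):
    """Basic rating life in operating hours from the dynamic load rating C
    and the equivalent dynamic load P (exponent 3 for ball bearings,
    10/3 for roller bearings)."""
    l10_million_rev = (C_kN / P_kN) ** exponent
    return l10_million_rev * 1e6 / (60.0 * rpm)

def life_at_reliability(l10, reliability, weibull_shape=1.5):
    """Scale the 90%-reliability life to another reliability level assuming
    a Weibull life distribution (the shape value is an assumption)."""
    return l10 * (math.log(1 / reliability) / math.log(1 / 0.9)) ** (1 / weibull_shape)

# Two candidate bearings (hypothetical catalogue values): the pricier one
# has a higher dynamic load rating, hence a longer predicted life.
for name, C in [("budget", 35.0), ("premium", 52.0)]:
    l10 = l10_hours(C_kN=C, P_kN=8.0, rpm=1500)
    l99 = life_at_reliability(l10, 0.99)
    print(f"{name}: L10 = {l10:,.0f} h, life at 99% reliability = {l99:,.0f} h")
```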

  3. A 16-year examination of domestic violence among Asians and Asian Americans in the empirical knowledge base: a content analysis.

    Science.gov (United States)

    Yick, Alice G; Oomen-Early, Jody

    2008-08-01

    Until recently, research studies have implied that domestic violence does not affect Asian American and immigrant communities, or even Asians abroad, because ethnicity or culture has not been addressed. In this content analysis, the authors examined trends in publications in leading scholarly journals on violence relating to Asian women and domestic violence. A coding schema was developed, with two raters coding the data with high interrater reliability. Sixty articles were published over the 16 years studied, most atheoretical and focusing on individual levels of analysis. The terms used in discussing domestic violence reflected a feminist perspective. Three quarters of the studies were empirical, with most guided by logical positivism using quantitative designs. Most targeted specific Asian subgroups (almost a third focused on Asian Indians) rather than categorizing Asians as a general ethnic category. The concept of "Asian culture" was most often assessed by discussing Asian family structure. Future research is discussed in light of the findings.

  4. An Empirical Study of Precise Interprocedural Array Analysis

    Directory of Open Access Journals (Sweden)

    Michael Hind

    1994-01-01

    Full Text Available In this article we examine the role played by the interprocedural analysis of array accesses in the automatic parallelization of Fortran programs. We use the PTRAN system to provide measurements of several benchmarks to compare different methods of representing interprocedurally accessed arrays. We examine issues concerning the effectiveness of automatic parallelization using these methods and the efficiency of a precise summarization method.

  5. Empirical Analysis of Religiosity as Predictor of Social Media Addiction

    Directory of Open Access Journals (Sweden)

    Jamal J Almenayes

    2015-10-01

    Full Text Available This study sought to examine the dimensions of social media addiction and its relationship to religiosity. To investigate the matter, the present research utilized a well-known Internet addiction scale and modified it to fit social media (Young, 1996). Based on a factor analysis of items generated by a sample of 1326 participants, three addiction factors were apparent. These factors were later regressed on a scale of religiosity. This scale contained a single factor based on factor analysis. Results indicated that social media addiction had three factors: "Social Consequences", "Time Displacement" and "Compulsive Feelings". Religiosity, on the other hand, contained a single factor. Both of these results were arrived at using factor analysis of their respective scales. The relationship between religiosity and social media addiction was then examined using linear regression. The results indicated that only two of the addiction factors were significantly related to religiosity. Future research should address the operationalization of the concept of religiosity to account for multiple dimensions.

  6. Exploring Advertising in Higher Education: An Empirical Analysis in North America, Europe, and Japan

    Science.gov (United States)

    Papadimitriou, Antigoni; Blanco Ramírez, Gerardo

    2015-01-01

    This empirical study explores higher education advertising campaigns displayed in five world cities: Boston, New York, Oslo, Tokyo, and Toronto. The study follows a mixed-methods research design relying on content analysis and multimodal semiotic analysis and employs a conceptual framework based on the knowledge triangle of education, research,…

  7. Reliability analysis of land pipelines for hydrocarbons transportation in Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Leon, D.; Cortes, C. [Inst. Mexicano del Petroleo (Mexico)

    2004-07-01

    The reliability of a land pipeline operated by PEMEX in Mexico was estimated under a range of failure modes. Reliability and safety were evaluated in terms of the pipeline's internal pressure, bending, fracture toughness and its tension failure mode conditions. Loading conditions were applied individually, while bending and tension loads were applied in a combined fashion. The mechanical properties of the steel were also considered along with the degradation effect of internal corrosion resulting from the type of product being transported. A set of internal pressures and mechanical properties were generated randomly using Monte Carlo simulation. Commercial software was used to obtain the pipeline response under each modeled condition. The response analysis was based on the nonlinear finite element method involving a calculation of maximum stresses and stress concentration factors under conditions of corrosion and no corrosion. The margin between maximum stresses due to internal pressure, tension and bending was evaluated along with the margin between stress concentration factor and fracture initiation toughness. The study showed that internal pressure, stress concentration and tension-bending were the critical failure modes. It was suggested that more research should be conducted to improve the modeling of the deteriorating effects of corrosion and to adjust the probability distribution for fracture toughness and the length/depth defect ratio. The consideration of welding geometries along with features of marine pipelines and pipeline products would help to generalize the study to facilitate the creation of codes for the construction, design, inspection and maintenance of pipelines in Mexico. 7 refs., 1 tab., 14 figs.
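
    The study above relies on Monte Carlo simulation combined with nonlinear finite element response analysis. The sketch below keeps only the Monte Carlo idea and substitutes a much simpler Barlow-type burst limit state on the corroded wall, so the distributions and the resulting failure probability are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200_000

# Illustrative random variables (distribution types and parameters are
# assumptions, not the study's calibrated data).
yield_strength = rng.normal(450e6, 25e6, n)          # Pa
wall_thickness = rng.normal(0.012, 0.0005, n)        # m, nominal wall
corrosion_depth = rng.exponential(0.0015, n)         # m, metal loss
pressure = rng.lognormal(np.log(9e6), 0.08, n)       # Pa, operating pressure
diameter = 0.61                                      # m

# Simplified burst limit state (Barlow-type formula on the remaining wall);
# the cited study uses nonlinear finite elements instead.
burst_pressure = 2 * yield_strength * (wall_thickness - corrosion_depth) / diameter
pf = np.mean(pressure > burst_pressure)
print(f"estimated failure probability: {pf:.2e}")
```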

  8. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book is about reliability engineering. It covers the definition and importance of reliability; the development of reliability engineering; failure rates and failure probability density functions and their types; CFR and the exponential (index) distribution; IFR with the normal and Weibull distributions; maintainability and movability; reliability tests and reliability assumptions for the exponential, normal and Weibull distribution types; reliability sampling tests; system reliability; reliability design; and functionality failure analysis by FTA.
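
    As a quick reference for the quantities listed above, the following sketch evaluates the reliability function, failure probability density and hazard (failure) rate for a constant-failure-rate (exponential) case and an increasing-failure-rate Weibull case. Parameter values are arbitrary.

```python
import numpy as np

t = np.linspace(1e-3, 2000.0, 5)      # a few evaluation times (hours)

# Constant failure rate (CFR) -> exponential distribution.
lam = 1e-3                            # failures per hour
R_exp = np.exp(-lam * t)              # reliability function R(t)
f_exp = lam * np.exp(-lam * t)        # failure probability density f(t)
h_exp = f_exp / R_exp                 # hazard rate, constant and equal to lam

# Increasing failure rate (IFR) example -> Weibull with shape > 1.
beta, eta = 2.5, 1200.0
R_wbl = np.exp(-(t / eta) ** beta)
h_wbl = (beta / eta) * (t / eta) ** (beta - 1)

print("exponential hazard (constant):", h_exp.round(5))
print("Weibull hazard (increasing):  ", h_wbl.round(5))
```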

  9. RELOSS, Reliability of Safety System by Fault Tree Analysis

    International Nuclear Information System (INIS)

    Allan, R.N.; Rondiris, I.L.; Adraktas, A.

    1981-01-01

    1 - Description of problem or function: Program RELOSS is used in the reliability/safety assessment of any complex system with predetermined operational logic in qualitative and (if required) quantitative terms. The program calculates the possible system outcomes following an abnormal operating condition and the probability of occurrence, if required. Furthermore, the program deduces the minimal cut or tie sets of the system outcomes and identifies the potential common mode failures. 4. Method of solution: The reliability analysis performed by the program is based on the event tree methodology. Using this methodology, the program develops the event tree of a system or a module of that system and relates each path of this tree to its qualitative and/or quantitative impact on specified system or module outcomes. If the system being analysed is subdivided into modules the program assesses each module in turn as described previously and then combines the module information to obtain results for the overall system. Having developed the event tree of a module or a system, the program identifies which paths lead or do not lead to various outcomes depending on whether the cut or the tie sets of the outcomes are required and deduces the corresponding sets. Furthermore the program identifies for a specific system outcome, the potential common mode failures and the cut or tie sets containing potential dependent failures of some components. 5. Restrictions on the complexity of the problem: The present dimensions of the program are as follows. They can however be easily modified: Maximum number of modules (equivalent components): 25; Maximum number of components in a module: 15; Maximum number of levels of parentheses in a logical statement: 10 Maximum number of system outcomes: 3; Maximum number of module outcomes: 2; Maximum number of points in time for which quantitative analysis is required: 5; Maximum order of any cut or tie set: 10; Maximum order of a cut or tie of any

  10. Tourism Competitiveness Index – An Empirical Analysis Romania vs. Bulgaria

    Directory of Open Access Journals (Sweden)

    Mihai CROITORU

    2011-09-01

    Full Text Available In the conditions of the current economic downturn, many specialists consider tourism as one of the sectors with the greatest potential to provide worldwide economic growth and development. A growing tourism sector can contribute effectively to employment, increase national income, and can also make a decisive mark on the balance of payments. Thus, tourism can be an important driving force for growth and prosperity, especially in emerging economies, being a key element in reducing poverty and regional disparities. Despite its contribution to economic growth, tourism sector development can be undermined by a series of economic and legislative barriers that can affect the competitiveness of this sector. In this context, the World Economic Forum proposes, via the Tourism Competitiveness Index (TCI, in addition to a methodology to identify key factors that contribute to increasing tourism competitiveness, tools for analysis and evaluation of these factors. In this context, this paper aims to analyze the underlying determinants of TCI from the perspective of two directly competing states, Romania and Bulgaria in order to highlight the effects of communication on the competitiveness of the tourism sector. The purpose of this analysis is to provide some answers, especially in terms of communication strategies, which may explain the completely different performances of the two national economies in the tourism sector.

  11. IFRS and Stock Returns: An Empirical Analysis in Brazil

    Directory of Open Access Journals (Sweden)

    Rodrigo F. Malaquias

    2016-09-01

    Full Text Available In recent years, the convergence of accounting standards has been an issue that motivated new studies in the accounting field. It is expected that the convergence provides users, especially external users of accounting information, with comparable reports among different economies. Considering this scenario, this article was developed in order to compare the effect of accounting numbers on the stock market before and after the accounting convergence in Brazil. The sample of the study involved Brazilian listed companies at BM&FBOVESPA that had American Depository Receipts (levels II and III) at the New York Stock Exchange (NYSE). For data analysis, descriptive statistics and graphic analysis were employed in order to analyze the behavior of stock returns around the publication dates. The main results indicate that the stock market reacts to the accounting reports. Therefore, the accounting numbers contain relevant information for the decision making of investors in the stock market. Moreover, it is observed that after the accounting convergence, the stock returns of the companies seem to present lower volatility.

  12. Hybrid modeling and empirical analysis of automobile supply chain network

    Science.gov (United States)

    Sun, Jun-yan; Tang, Jian-ming; Fu, Wei-ping; Wu, Bing-ying

    2017-05-01

    Based on the connection mechanism of nodes which automatically select upstream and downstream agents, a simulation model for dynamic evolutionary process of consumer-driven automobile supply chain is established by integrating ABM and discrete modeling in the GIS-based map. Firstly, the rationality is proved by analyzing the consistency of sales and changes in various agent parameters between the simulation model and a real automobile supply chain. Second, through complex network theory, hierarchical structures of the model and relationships of networks at different levels are analyzed to calculate various characteristic parameters such as mean distance, mean clustering coefficients, and degree distributions. By doing so, it verifies that the model is a typical scale-free network and small-world network. Finally, the motion law of this model is analyzed from the perspective of complex self-adaptive systems. The chaotic state of the simulation system is verified, which suggests that this system has typical nonlinear characteristics. This model not only macroscopically illustrates the dynamic evolution of complex networks of automobile supply chain but also microcosmically reflects the business process of each agent. Moreover, the model construction and simulation of the system by means of combining CAS theory and complex networks supplies a novel method for supply chain analysis, as well as theory bases and experience for supply chain analysis of auto companies.

  13. The reliability of mercury analysis in environmental materials

    International Nuclear Information System (INIS)

    Heinonen, J.; Suschny, O.

    1973-01-01

    Mercury occurs in nature in its native elemental as well as in different mineral forms. It has been mined for centuries and is used in many branches of industry, agriculture and medicine. Mercury is very toxic to man and reports of poisoning due to the presence of the element in fish and shellfish caught at Minamata and Niigata, Japan have led not only to local investigations but to multi-national research into the sources and the levels of mercury in the environment. The concentrations at which the element has to be determined in these studies are extremely small, usually of the order of a few parts in 10^9 parts of environmental material. Few analytical techniques provide the required sensitivity for analysis at such low concentrations, and only two are normally used for mercury: neutron activation analysis and atomic absorption photometry. They are also the most convenient end points of various separation schemes for different organic mercury compounds. Mercury analysis at the ppb-level is beset with many problems: volatility of the metal and its compounds, impurity of reagents, interference by other elements and many other analytical difficulties may influence the results. To be able to draw valid conclusions from the analyses it is necessary to know the reliability attached to the values obtained. To assist laboratories in the evaluation of their analytical performance, the International Atomic Energy Agency through its own laboratory at Seibersdorf already organised in 1967 an intercomparison of mercury analysis in flour. Based on the results obtained at that time, a whole series of intercomparisons of mercury determinations in nine different environmental materials was undertaken in 1971. The materials investigated included corn and wheat flour, spray-dried animal blood serum, fish solubles, milk powder, saw dust, cellulose, lacquer paint and coloric material

  14. The reliability of mercury analysis in environmental materials

    Energy Technology Data Exchange (ETDEWEB)

    Heinonen, J.; Suschny, O

    1973-01-01

    Mercury occurs in nature in its native elemental as well as in different mineral forms. It has been mined for centuries and is used in many branches of industry, agriculture and medicine. Mercury is very toxic to man and reports of poisoning due to the presence of the element in fish and shellfish caught at Minamata and Niigata, Japan have led not only to local investigations but to multi-national research into the sources and the levels of mercury in the environment. The concentrations at which the element has to be determined in these studies are extremely small, usually of the order of a few parts in 10^9 parts of environmental material. Few analytical techniques provide the required sensitivity for analysis at such low concentrations, and only two are normally used for mercury: neutron activation analysis and atomic absorption photometry. They are also the most convenient end points of various separation schemes for different organic mercury compounds. Mercury analysis at the ppb-level is beset with many problems: volatility of the metal and its compounds, impurity of reagents, interference by other elements and many other analytical difficulties may influence the results. To be able to draw valid conclusions from the analyses it is necessary to know the reliability attached to the values obtained. To assist laboratories in the evaluation of their analytical performance, the International Atomic Energy Agency through its own laboratory at Seibersdorf already organised in 1967 an intercomparison of mercury analysis in flour. Based on the results obtained at that time, a whole series of intercomparisons of mercury determinations in nine different environmental materials was undertaken in 1971. The materials investigated included corn and wheat flour, spray-dried animal blood serum, fish solubles, milk powder, saw dust, cellulose, lacquer paint and coloric material.

  15. Multi-Unit Considerations for Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    St. Germain, S.; Boring, R.; Banaseanu, G.; Akl, Y.; Chatri, H.

    2017-03-01

    This paper uses the insights from the Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H) methodology to help identify human actions currently modeled in the single unit PSA that may need to be modified to account for additional challenges imposed by a multi-unit accident as well as identify possible new human actions that might be modeled to more accurately characterize multi-unit risk. In identifying these potential human action impacts, the use of the SPAR-H strategy to include both errors in diagnosis and errors in action is considered as well as identifying characteristics of a multi-unit accident scenario that may impact the selection of the performance shaping factors (PSFs) used in SPAR-H. The lessons learned from the Fukushima Daiichi reactor accident will be addressed to further help identify areas where improved modeling may be required. While these multi-unit impacts may require modifications to a Level 1 PSA model, it is expected to have much more importance for Level 2 modeling. There is little currently written specifically about multi-unit HRA issues. A review of related published research will be presented. While this paper cannot answer all issues related to multi-unit HRA, it will hopefully serve as a starting point to generate discussion and spark additional ideas towards the proper treatment of HRA in a multi-unit PSA.
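
    A rough sketch of SPAR-H-style quantification is shown below: a nominal human error probability is multiplied by performance shaping factor multipliers, with the commonly cited adjustment factor applied when three or more negative PSFs are present. The nominal HEP and multiplier values are illustrative assumptions and do not represent a validated multi-unit analysis; the method itself is documented in NUREG/CR-6883.

```python
def spar_h_hep(nominal_hep, psf_multipliers):
    """SPAR-H style quantification: nominal HEP times the composite PSF,
    with the adjustment factor applied when three or more negative
    (multiplier > 1) PSFs are assigned, so the result stays below 1."""
    composite = 1.0
    negative = 0
    for m in psf_multipliers:
        composite *= m
        if m > 1:
            negative += 1
    if negative >= 3:
        return nominal_hep * composite / (nominal_hep * (composite - 1) + 1)
    return min(nominal_hep * composite, 1.0)

# Illustrative multi-unit scenario: less available time, higher stress and
# degraded ergonomics relative to the single-unit case (values assumed).
single_unit = spar_h_hep(1e-3, [1, 2, 1, 1])       # action, moderate stress
multi_unit = spar_h_hep(1e-3, [10, 5, 2, 5])       # several negative PSFs
print(f"single-unit HEP {single_unit:.1e}, multi-unit HEP {multi_unit:.1e}")
```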

  16. 78 FR 45447 - Revisions to Modeling, Data, and Analysis Reliability Standard

    Science.gov (United States)

    2013-07-29

    ...; Order No. 782] Revisions to Modeling, Data, and Analysis Reliability Standard AGENCY: Federal Energy... Analysis (MOD) Reliability Standard MOD- 028-2, submitted to the Commission for approval by the North... Organization. The Commission finds that the proposed Reliability Standard represents an improvement over the...

  17. Field Programmable Gate Array Reliability Analysis Guidelines for Launch Vehicle Reliability Block Diagrams

    Science.gov (United States)

    Al Hassan, Mohammad; Britton, Paul; Hatfield, Glen Spencer; Novack, Steven D.

    2017-01-01

    Field Programmable Gate Array (FPGA) integrated circuits (ICs) are among the key electronic components in the complex avionic systems of today's sophisticated launch and space vehicles, largely due to their superb reprogrammable and reconfigurable capabilities combined with relatively low non-recurring engineering (NRE) costs and short design cycles. Consequently, FPGAs are prevalent ICs in communication protocols and control signal commands. This paper identifies reliability concerns and high-level guidelines to estimate FPGA total failure rates in a launch vehicle application. The paper discusses hardware, hardware description language, and radiation-induced failures. The hardware contribution of the approach accounts for physical failures of the IC. The hardware description language portion discusses the high-level FPGA programming languages and software/code reliability growth. The radiation portion discusses FPGA susceptibility to space environment radiation.
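
    The guideline of summing per-mechanism contributions can be illustrated with a simple constant-failure-rate series model, as below. The FIT values for hardware, design and radiation-induced failures are placeholders, not qualified data for any specific FPGA.

```python
import math

# Illustrative failure-rate contributions in FIT (failures per 1e9 hours);
# the values are placeholders, not qualified data for any specific part.
contributions_fit = {
    "physical_hardware": 120.0,
    "hdl_design_faults": 40.0,
    "radiation_induced_seu": 300.0,
}

lam_total = sum(contributions_fit.values()) / 1e9    # failures per hour
mission_hours = 0.5                                  # short launch phase
reliability = math.exp(-lam_total * mission_hours)

print(f"total failure rate {lam_total:.2e} /h, "
      f"mission reliability {reliability:.6f}")
```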

  18. The reliability analysis of cutting tools in the HSM processes

    OpenAIRE

    W.S. Lin

    2008-01-01

    Purpose: This article mainly describes the reliability of the cutting tools in high speed turning by the normal distribution model. Design/methodology/approach: A series of experimental tests have been done to evaluate the reliability variation of the cutting tools. From experimental results, the tool wear distribution and the tool life are determined, and the tool life distribution and the reliability function of cutting tools are derived. Further, the reliability of cutting tools at any time for h...

  19. Empirical analysis of the effects of cyber security incidents.

    Science.gov (United States)

    Davis, Ginger; Garcia, Alfredo; Zhang, Weide

    2009-09-01

    We analyze the time series associated with web traffic for a representative set of online businesses that have suffered widely reported cyber security incidents. Our working hypothesis is that cyber security incidents may prompt (security conscious) online customers to opt out and conduct their business elsewhere or, at the very least, to refrain from accessing online services. For companies relying almost exclusively on online channels, this presents an important business risk. We test for structural changes in these time series that may have been caused by these cyber security incidents. Our results consistently indicate that cyber security incidents do not affect the structure of web traffic for the set of online businesses studied. We discuss various public policy considerations stemming from our analysis.

  20. Modeling for Determinants of Human Trafficking: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Seo-Young Cho

    2015-02-01

    Full Text Available This study aims to identify robust push and pull factors of human trafficking. I test for the robustness of 70 push and 63 pull factors suggested in the literature. In doing so, I employ an extreme bound analysis, running more than two million regressions with all possible combinations of variables for up to 153 countries during the period of 1995–2010. My results show that crime prevalence robustly explains human trafficking both in destination and origin countries. Income level also has a robust impact, suggesting that the cause of human trafficking shares that of economic migration. Law enforcement matters more in origin countries than destination countries. Interestingly, a very low level of gender equality may have constraining effects on human trafficking outflow, possibly because gender discrimination limits female mobility that is necessary for the occurrence of human trafficking.

  1. Adjoint sensitivity analysis of dynamic reliability models based on Markov chains - II: Application to IFMIF reliability assessment

    Energy Technology Data Exchange (ETDEWEB)

    Cacuci, D. G. [Commiss Energy Atom, Direct Energy Nucl, Saclay, (France); Cacuci, D. G.; Balan, I. [Univ Karlsruhe, Inst Nucl Technol and Reactor Safetly, Karlsruhe, (Germany); Ionescu-Bujor, M. [Forschungszentrum Karlsruhe, Fus Program, D-76021 Karlsruhe, (Germany)

    2008-07-01

    In Part II of this work, the adjoint sensitivity analysis procedure developed in Part I is applied to perform sensitivity analysis of several dynamic reliability models of systems of increasing complexity, culminating with the consideration of the International Fusion Materials Irradiation Facility (IFMIF) accelerator system. Section II presents the main steps of a procedure for the automated generation of Markov chains for reliability analysis, including the abstraction of the physical system, construction of the Markov chain, and the generation and solution of the ensuing set of differential equations; all of these steps have been implemented in a stand-alone computer code system called QUEFT/MARKOMAG-S/MCADJSEN. This code system has been applied to sensitivity analysis of dynamic reliability measures for a paradigm '2-out-of-3' system comprising five components and also to a comprehensive dynamic reliability analysis of the IFMIF accelerator system facilities for the average availability and, respectively, the system's availability at the final mission time. The QUEFT/MARKOMAG-S/MCADJSEN has been used to efficiently compute sensitivities to 186 failure and repair rates characterizing components and subsystems of the first-level fault tree of the IFMIF accelerator system. (authors)

  2. Adjoint sensitivity analysis of dynamic reliability models based on Markov chains - II: Application to IFMIF reliability assessment

    International Nuclear Information System (INIS)

    Cacuci, D. G.; Cacuci, D. G.; Balan, I.; Ionescu-Bujor, M.

    2008-01-01

    In Part II of this work, the adjoint sensitivity analysis procedure developed in Part I is applied to perform sensitivity analysis of several dynamic reliability models of systems of increasing complexity, culminating with the consideration of the International Fusion Materials Irradiation Facility (IFMIF) accelerator system. Section II presents the main steps of a procedure for the automated generation of Markov chains for reliability analysis, including the abstraction of the physical system, construction of the Markov chain, and the generation and solution of the ensuing set of differential equations; all of these steps have been implemented in a stand-alone computer code system called QUEFT/MARKOMAG-S/MCADJSEN. This code system has been applied to sensitivity analysis of dynamic reliability measures for a paradigm '2-out-of-3' system comprising five components and also to a comprehensive dynamic reliability analysis of the IFMIF accelerator system facilities for the average availability and, respectively, the system's availability at the final mission time. The QUEFT/MARKOMAG-S/MCADJSEN has been used to efficiently compute sensitivities to 186 failure and repair rates characterizing components and subsystems of the first-level fault tree of the IFMIF accelerator system. (authors)
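
    Separately from the adjoint sensitivity machinery and the QUEFT/MARKOMAG-S/MCADJSEN code, the sketch below shows the underlying dynamic-reliability calculation for a small repairable 2-out-of-3 system: solve the Markov (Chapman-Kolmogorov) equations and read off the point availability at the mission time and the average availability. The failure and repair rates and the mission time are assumed values.

```python
import numpy as np
from scipy.integrate import solve_ivp

lam, mu = 1e-3, 1e-1      # failure and repair rates per hour (assumed values)

# States for a simplified repairable 2-out-of-3 system:
#   0 = all three components up, 1 = two up (system still works),
#   2 = fewer than two up (system failed). Generator (transition-rate) matrix.
Q = np.array([
    [-3 * lam,         3 * lam,     0.0],
    [      mu, -(mu + 2 * lam), 2 * lam],
    [     0.0,              mu,     -mu],
])

def kolmogorov(t, p):
    # Forward Chapman-Kolmogorov equations, dp/dt = p Q (p is a row vector).
    return p @ Q

T = 1000.0                                  # mission time in hours
sol = solve_ivp(kolmogorov, (0.0, T), [1.0, 0.0, 0.0], dense_output=True)

times = np.linspace(0.0, T, 2001)
p = sol.sol(times)
availability = p[0] + p[1]                  # system works in states 0 and 1

print(f"availability at mission time: {availability[-1]:.4f}")
print(f"average availability over the mission: {availability.mean():.4f}")
```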

  3. Generalisability of an online randomised controlled trial: an empirical analysis.

    Science.gov (United States)

    Wang, Cheng; Mollan, Katie R; Hudgens, Michael G; Tucker, Joseph D; Zheng, Heping; Tang, Weiming; Ling, Li

    2018-02-01

    Investigators increasingly use online methods to recruit participants for randomised controlled trials (RCTs). However, the extent to which participants recruited online represent populations of interest is unknown. We evaluated how generalisable an online RCT sample is to men who have sex with men in China. Inverse probability of sampling weights (IPSW) and the G-formula were used to examine the generalisability of an online RCT using model-based approaches. Online RCT data and national cross-sectional study data from China were analysed to illustrate the process of quantitatively assessing generalisability. The RCT (identifier NCT02248558) randomly assigned participants to a crowdsourced or health marketing video for promotion of HIV testing. The primary outcome was self-reported HIV testing within 4 weeks, with a non-inferiority margin of -3%. In the original online RCT analysis, the estimated difference in proportions of HIV tested between the two arms (crowdsourcing and health marketing) was 2.1% (95% CI, -5.4% to 9.7%). The hypothesis that the crowdsourced video was not inferior to the health marketing video to promote HIV testing was not demonstrated. The IPSW and G-formula estimated differences were -2.6% (95% CI, -14.2 to 8.9) and 2.7% (95% CI, -10.7 to 16.2), with both approaches also not establishing non-inferiority. Conducting generalisability analysis of an online RCT is feasible. Examining the generalisability of online RCTs is an important step before an intervention is scaled up. NCT02248558. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  4. Analysis of trends in aviation maintenance risk: An empirical approach

    International Nuclear Information System (INIS)

    Marais, Karen B.; Robichaud, Matthew R.

    2012-01-01

    Safety is paramount in the airline industry. A significant amount of effort has been devoted to reducing mechanical failures and pilot errors. Recently, more attention has been devoted to the contribution of maintenance to accidents and incidents. This study investigates and quantifies the contribution of maintenance, both in terms of frequency and severity, to passenger airline risk by analyzing three different sources of data from 1999 to 2008: 769 NTSB accident reports, 3242 FAA incident reports, and 7478 FAA records of fines and other legal actions taken against airlines and associated organizations. We analyze several safety related metrics and develop an aviation maintenance risk scorecard that collects these metrics to synthesize a comprehensive track record of maintenance contribution to airline accidents and incidents. We found for example that maintenance-related accidents are approximately 6.5 times more likely to be fatal than accidents in general, and that when fatalities do occur, maintenance accidents result in approximately 3.6 times more fatalities on average. Our analysis of accident trends indicates that this contribution to accident risk has remained fairly constant over the past decade. Our analysis of incidents and FAA fines and legal actions also revealed similar trends. We found that at least 10% of incidents involving mechanical failures such as ruptured hydraulic lines can be attributed to maintenance, suggesting that there may be issues surrounding both the design of and compliance with maintenance plans. Similarly 36% of FAA fines and legal actions involve inadequate maintenance, with recent years showing a decline to about 20%, which may be a reflection of improved maintenance practices. Our results can aid industry and government in focusing resources to continue improving aviation safety.

  5. Science Based Human Reliability Analysis: Using Digital Nuclear Power Plant Simulators for Human Reliability Research

    Science.gov (United States)

    Shirley, Rachel Elizabeth

    Nuclear power plant (NPP) simulators are proliferating in academic research institutions and national laboratories in response to the availability of affordable, digital simulator platforms. Accompanying the new research facilities is a renewed interest in using data collected in NPP simulators for Human Reliability Analysis (HRA) research. An experiment conducted in The Ohio State University (OSU) NPP Simulator Facility develops data collection methods and analytical tools to improve use of simulator data in HRA. In the pilot experiment, student operators respond to design basis accidents in the OSU NPP Simulator Facility. Thirty-three undergraduate and graduate engineering students participated in the research. Following each accident scenario, student operators completed a survey about perceived simulator biases and watched a video of the scenario. During the video, they periodically recorded their perceived strength of significant Performance Shaping Factors (PSFs) such as Stress. This dissertation reviews three aspects of simulator-based research using the data collected in the OSU NPP Simulator Facility: First, a qualitative comparison of student operator performance to computer simulations of expected operator performance generated by the Information Decision Action Crew (IDAC) HRA method. Areas of comparison include procedure steps, timing of operator actions, and PSFs. Second, development of a quantitative model of the simulator bias introduced by the simulator environment. Two types of bias are defined: Environmental Bias and Motivational Bias. This research examines Motivational Bias--that is, the effect of the simulator environment on an operator's motivations, goals, and priorities. A bias causal map is introduced to model motivational bias interactions in the OSU experiment. Data collected in the OSU NPP Simulator Facility are analyzed using Structural Equation Modeling (SEM). Data include crew characteristics, operator surveys, and time to recognize

  6. The Effect of Shocks: An Empirical Analysis of Ethiopia

    Directory of Open Access Journals (Sweden)

    Yilebes Addisu Damtie

    2015-07-01

    Besides striving for the increase of production and development, it is also necessary to reduce the losses created by shocks. The people of Ethiopia are exposed to the impact of both natural and man-made shocks. Following this, policy makers, governmental and non-governmental organizations need to identify the important shocks and their effects and use them as an input. This study was conducted to identify the food insecurity shocks and to estimate their effects based on the conceptual framework developed in Ethiopia, Amhara National Regional State, Libo Kemkem District. Descriptive statistical analysis, multiple regression, binary logistic regression, chi-squared and independent sample t-tests were used as data analysis techniques. The results showed eight shocks affecting households: weather variability, weed, plant insect and pest infestation, soil fertility problem, animal disease and epidemics, human disease and epidemics, price fluctuation problem and conflict. Weather variability, plant insect and pest infestation, weed, and animal disease and epidemics created a mean loss of 3,821.38, 886.06, 508.04 and 1,418.32 Birr, respectively. In addition, human disease and epidemics, price fluctuation problem and conflict affected 68.11%, 88.11% and 14.59% of households, respectively. Among the sample households, 28.1% were not able to meet their food need throughout the year while 71.9% could. The results of the multiple regression models revealed that weed existence (β = –0.142, p < 0.05), plant insect and pest infestation (β = –0.279, p < 0.01) and soil fertility problem (β = –0.321, p < 0.01) had a significant effect on income. Assets were found to be significantly affected by plant insect and pest infestation (β = –0.229, p < 0.01), human disease and epidemics (β = 0.145, p < 0.05), and soil fertility problem (β = –0.317, p < 0.01), while food production was affected by soil fertility problem (β = –0.314, p < 0.01). Binary logistic
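    The two model types used in the study can be illustrated as follows. This is not the authors' code: the file name and the column names (income, food_secure, weed, pest, soil_fertility) are hypothetical placeholders, and only a subset of the shock variables is shown.

```python
# Illustrative OLS regression of household income on shock indicators and a binary
# logistic regression for whether the household met its food need all year.
import pandas as pd
import statsmodels.formula.api as smf

households = pd.read_csv("libo_kemkem_households.csv")  # hypothetical data file

ols_fit = smf.ols("income ~ weed + pest + soil_fertility", data=households).fit()
logit_fit = smf.logit("food_secure ~ weed + pest + soil_fertility", data=households).fit(disp=0)

print(ols_fit.params)    # signs comparable to the reported standardised betas
print(logit_fit.params)
```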

  7. INTER-RATER RELIABILITY FOR MOVEMENT PATTERN ANALYSIS (MPA): MEASURING PATTERNING OF BEHAVIORS VERSUS DISCRETE BEHAVIOR COUNTS AS INDICATORS OF DECISION-MAKING STYLE

    Directory of Open Access Journals (Sweden)

    Brenda L Connors

    2014-06-01

    The unique yield of collecting observational data on human movement has received increasing attention in a number of domains, including the study of decision-making style. As such, interest has grown in the nuances of core methodological issues, including the best ways of assessing inter-rater reliability. In this paper we focus on one key topic – the distinction between establishing reliability for the patterning of behaviors as opposed to the computation of raw counts – and suggest that reliability for each be compared empirically rather than determined a priori. We illustrate by assessing inter-rater reliability for key outcome measures derived from Movement Pattern Analysis (MPA), an observational methodology that records body movements as indicators of decision-making style with demonstrated predictive validity. While reliability ranged from moderate to good for raw counts of behaviors reflecting each of two Overall Factors generated within MPA (Assertion and Perspective), inter-rater reliability for patterning (proportional indicators of each factor) was significantly higher and excellent (ICC = .89). Furthermore, patterning, as compared to raw counts, provided better prediction of the observable decision-making process assessed in the laboratory. These analyses support the utility of using an empirical approach to inform the consideration of measuring discrete behavioral counts versus patterning of behaviors when determining inter-rater reliability of observable behavior. They also speak to the substantial reliability that may be achieved via application of theoretically grounded observational systems such as MPA that reveal thinking and action motivations via visible movement patterns.
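    For an observational instrument like MPA, an ICC of the kind reported above can be computed directly from a targets-by-raters matrix of scores. The function below is a generic ICC(2,1) (two-way random effects, single measure) implementation; the paper does not state which ICC form was used, so the choice here is an assumption.

```python
# Generic ICC(2,1) from a two-way ANOVA decomposition; `ratings` is an
# (n_targets x k_raters) array of scores.
import numpy as np

def icc_2_1(ratings: np.ndarray) -> float:
    n, k = ratings.shape
    grand = ratings.mean()
    ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()    # between targets
    ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()    # between raters
    ss_err = ((ratings - grand) ** 2).sum() - ss_rows - ss_cols  # residual

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))

    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)
```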

  8. An empirical study of tourist preferences using conjoint analysis

    Directory of Open Access Journals (Sweden)

    Tripathi, S.N.

    2010-01-01

    Tourism and hospitality have become key global economic activities as expectations with regard to our use of leisure time have evolved, attributing greater meaning to our free time. While the growth in tourism has been impressive, India's share in total global tourism arrivals and earnings is quite insignificant. It is an accepted fact that India has tremendous potential for the development of tourism. This anomaly and the various underlying factors responsible for it are the focus of our study. The objective is to determine customer preferences for multi-attribute hybrid services such as tourism, so as to enable the state tourism board to deliver a desired combination of intrinsic attributes, helping it to create a sustainable competitive advantage and leading to greater customer satisfaction and positive word of mouth. Conjoint analysis has been used for this purpose, which estimates the structure of a consumer's preferences, given his/her overall evaluations of a set of alternatives that are pre-specified in terms of levels of different attributes.
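    A minimal sketch of how part-worth utilities are typically estimated in a ratings-based conjoint study is shown below: respondents rate profiles defined by attribute levels, and dummy-coded OLS recovers the utility of each level. The attribute names (price, lodging, activity) and the data file are hypothetical, not taken from the study.

```python
# Part-worth estimation via dummy-coded OLS on profile ratings.
import pandas as pd
import statsmodels.formula.api as smf

profiles = pd.read_csv("tourist_profiles.csv")  # one row per respondent x profile (hypothetical)
fit = smf.ols("rating ~ C(price) + C(lodging) + C(activity)", data=profiles).fit()

part_worths = fit.params            # utilities relative to each attribute's base level
print(part_worths.sort_values())
```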

  9. Empirical Analysis on CSR Communication in Romania: Transparency and Participation

    Directory of Open Access Journals (Sweden)

    Irina-Eugenia Iamandi

    2012-12-01

    In the specific field of corporate social responsibility (CSR), the participation of companies in supporting social and environmental issues is mainly analysed and/or measured based on their CSR communication policy; in this way, the transparency of CSR reporting procedures is one of the most pressing challenges for researchers and practitioners in the field. The main research objective of the present paper is to distinguish between different types of CSR participation by identifying the reasons behind CSR communication for a series of companies acting on the Romanian market. The descriptive analysis – conducted both at integrated and corporate level for the Romanian companies – took into account five main CSR communication related issues: CSR site, CSR report, CSR listing, CSR budget and CSR survey. The results highlight both the declarative/prescriptive and practical/descriptive perspectives of CSR communication in Romania, showing that the Romanian CSR market is reaching its full maturity. In more specific terms, the majority of the investigated companies are already using different types of CSR participation, marking the transition from CSR just for commercial purposes to CSR for long-term strategic use. The achieved results are broadly analysed in the paper and specific conclusions are emphasized.

  10. Political determinants of social expenditures in Greece: an empirical analysis

    Directory of Open Access Journals (Sweden)

    Ebru Canikalp

    2017-09-01

    A view prominently expounded is that the interaction between the composition and the volume of public expenditures is directly affected by political, institutional, psephological and ideological indicators. A crucial component of public expenditures, social expenditures play an important role in the economy as they directly and indirectly affect the distribution of income and wealth. Social expenditures aim at redressing the unequal distribution of income and wealth. These expenditures comprise cash benefits, direct in-kind provision of goods and services, and tax breaks with social purposes. The aim of this study is to determine the relationship between political structure, i.e. government fragmentation, ideological composition, elections and so on, and social expenditures in Greece. Employing data from the Comparative Political Dataset (CPDS) and the OECD Social Expenditure Database (SOCX), a time series analysis was conducted for Greece for the 1980-2014 period. The findings of the study indicate that voter turnout, spending on the elderly population and the number of government changes have positive and statistically significant effects on social expenditures in Greece, while debt stock and cabinet composition have negative effects.

  11. Empirical Analysis on The Existence of The Phillips Curve

    Directory of Open Access Journals (Sweden)

    Shaari Mohd Shahidan

    2018-01-01

    The Phillips curve shows the trade-off relationship between the inflation and unemployment rates. When inflation rises on the back of high economic growth, more jobs are available and unemployment therefore falls. However, the existence of the Phillips curve in high-income countries has not been much discussed. High-income countries should have low unemployment rates, suggesting high inflation. However, some high-income countries, the United States in the 1970s for example, could not avert stagflation, whereby a high unemployment rate and high inflation occurred at the same time. This situation is contrary to the Phillips curve. Therefore, this study investigates the existence of the Phillips curve in high-income countries for the period 1990-2014 using panel data analysis. The most interesting finding of this study is the existence of a bidirectional relationship between the unemployment rate and the inflation rate in both the long and short runs. Governments should therefore choose whether to stabilize the inflation rate or to reduce the unemployment rate.
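    A minimal fixed-effects sketch of the trade-off tested above is shown below. The file name and column names are hypothetical, and the study's panel-causality tests, which establish the bidirectional relationship, are not reproduced here.

```python
# Country fixed effects via dummy variables; a negative coefficient on unemployment
# would be consistent with a Phillips-curve trade-off.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("high_income_panel.csv")  # columns: country, year, inflation, unemployment (hypothetical)

fe_fit = smf.ols("inflation ~ unemployment + C(country)", data=panel).fit()
print(fe_fit.params["unemployment"])
```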

  12. Prominent feature extraction for review analysis: an empirical study

    Science.gov (United States)

    Agarwal, Basant; Mittal, Namita

    2016-05-01

    Sentiment analysis (SA) research has increased tremendously in recent times. SA aims to determine the sentiment orientation of a given text as positive or negative polarity. The motivation for SA research is the need for industry to know the opinion of users about their products from online portals, blogs, discussion boards, reviews and so on. Efficient features need to be extracted for machine-learning algorithms to achieve better sentiment classification. In this paper, various features are first extracted from the text, such as unigrams, bi-grams and dependency features. In addition, new bi-tagged features are also extracted that conform to predefined part-of-speech patterns. Furthermore, various composite features are created using these features. Information gain (IG) and minimum redundancy maximum relevancy (mRMR) feature selection methods are used to eliminate the noisy and irrelevant features from the feature vector. Finally, machine-learning algorithms are used for classifying the review document into a positive or negative class. The effects of different categories of features are investigated on four standard data-sets, namely, movie review and product (book, DVD and electronics) review data-sets. Experimental results show that composite features created from prominent unigram and bi-tagged features perform better than other features for sentiment classification. mRMR is a better feature selection method compared with IG for sentiment classification. The Boolean Multinomial Naïve Bayes algorithm performs better than the support vector machine classifier for SA in terms of accuracy and execution time.
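    The feature pipeline described can be approximated with scikit-learn as sketched below. This is not the authors' implementation: binary term counts stand in for the Boolean Multinomial variant, and univariate mutual-information selection is an information-gain-style filter (mRMR itself is not available in scikit-learn).

```python
# Sketch of an n-gram + feature-selection + Naive Bayes sentiment classifier.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import Pipeline

pipeline = Pipeline([
    ("vectorize", CountVectorizer(ngram_range=(1, 2), binary=True)),  # unigrams + bi-grams, Boolean counts
    ("select", SelectKBest(mutual_info_classif, k=2000)),             # keep the 2000 most informative features
    ("classify", MultinomialNB()),
])

# reviews: list of raw review texts; labels: 1 = positive, 0 = negative
# pipeline.fit(reviews, labels); predictions = pipeline.predict(new_reviews)
```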

  13. Reliability analysis of the service water system of Angra 1 reactor

    International Nuclear Information System (INIS)

    Tayt-Sohn, L.C.; Oliveira, L.F.S. de.

    1984-01-01

    A reliability analysis of the service water system is performed for use in evaluating the unreliability of the Component Cooling System (SRC) during large loss-of-coolant accidents in nuclear power plants. (E.G.) [pt

  14. Reliability analysis of the service water system of Angra 1 reactor

    International Nuclear Information System (INIS)

    Oliveira, L.F.S. de; Fleming, P.V.; Frutuoso e Melo, P.F.F.; Tayt-Sohn, L.C.

    1983-01-01

    A reliability analysis of the service water system is performed for use in evaluating the unreliability of the Component Cooling System (SRC) during large loss-of-coolant accidents in nuclear power plants. (E.G.) [pt

  15. Human Reliability Analysis in Support of Risk Assessment for Positive Train Control

    Science.gov (United States)

    2003-06-01

    This report describes an approach to evaluating the reliability of human actions that are modeled in a probabilistic risk assessment (PRA) of train control operations. This approach to human reliability analysis (HRA) has been applied in the case o...

  16. Comparison of Methods for Dependency Determination between Human Failure Events within Human Reliability Analysis

    International Nuclear Information System (INIS)

    Cepin, M.

    2008-01-01

    The human reliability analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features, which may decrease subjectivity of human reliability analysis. Human reliability methods are compared with focus on dependency comparison between Institute Jozef Stefan human reliability analysis (IJS-HRA) and standardized plant analysis risk human reliability analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by development of more detailed guidelines for human reliability analysis with many practical examples for all steps of the process of evaluation of human performance

  17. Comparison of methods for dependency determination between human failure events within human reliability analysis

    International Nuclear Information System (INIS)

    Cepis, M.

    2007-01-01

    The Human Reliability Analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features, which may decrease the subjectivity of human reliability analysis. Human reliability methods are compared with focus on dependency comparison between Institute Jozef Stefan - Human Reliability Analysis (IJS-HRA) and Standardized Plant Analysis Risk Human Reliability Analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by development of more detailed guidelines for human reliability analysis with many practical examples for all steps of the process of evaluation of human performance. (author)

  18. Operator reliability analysis during NPP small break LOCA

    International Nuclear Information System (INIS)

    Zhang Jiong; Chen Shenglin

    1990-01-01

    To assess the human factors characteristics of an NPP main control room (MCR) design, the MCR operator reliability during a small break LOCA is analyzed, and some approaches for improving MCR operator reliability are proposed based on the analysis results.

  19. Wind turbine reliability : a database and analysis approach.

    Energy Technology Data Exchange (ETDEWEB)

    Linsday, James (ARES Corporation); Briand, Daniel; Hill, Roger Ray; Stinebaugh, Jennifer A.; Benjamin, Allan S. (ARES Corporation)

    2008-02-01

    The US wind industry has experienced remarkable growth since the turn of the century. At the same time, the physical size and electrical generation capabilities of wind turbines have also grown remarkably. As the market continues to expand, and as wind generation continues to gain a significant share of the generation portfolio, the reliability of wind turbine technology becomes increasingly important. This report addresses how operations and maintenance costs are related to unreliability, that is, the failures experienced by systems and components. Reliability tools are demonstrated, the data needed to understand and catalog failure events are described, and practical wind turbine reliability models are illustrated, including preliminary results. This report also presents a continuing process for addressing industry requirements, needs, and expectations related to Reliability, Availability, Maintainability, and Safety. A simply stated goal of this process is to better understand and to improve the operational reliability of wind turbine installations.

  20. Extending Failure Modes and Effects Analysis Approach for Reliability Analysis at the Software Architecture Design Level

    NARCIS (Netherlands)

    Sözer, Hasan; Tekinerdogan, B.; Aksit, Mehmet; de Lemos, Rogerio; Gacek, Cristina

    2007-01-01

    Several reliability engineering approaches have been proposed to identify and recover from failures. A well-known and mature approach is the Failure Mode and Effect Analysis (FMEA) method that is usually utilized together with Fault Tree Analysis (FTA) to analyze and diagnose the causes of failures.

  1. Uncertainty analysis and validation of environmental models. The empirically based uncertainty analysis

    International Nuclear Information System (INIS)

    Monte, Luigi; Hakanson, Lars; Bergstroem, Ulla; Brittain, John; Heling, Rudie

    1996-01-01

    The principles of Empirically Based Uncertainty Analysis (EBUA) are described. EBUA is based on the evaluation of 'performance indices' that express the level of agreement between the model and sets of empirical independent data collected in different experimental circumstances. Some of these indices may be used to evaluate the confidence limits of the model output. The method is based on the statistical analysis of the distribution of the index values and on the quantitative relationship of these values with the ratio 'experimental data/model output'. Some performance indices are described in the present paper. Among these, the so-called 'functional distance' (d) between the logarithm of model output and the logarithm of the experimental data, defined as $d^2 = \sum_{i=1}^{n} (\ln M_i - \ln O_i)^2 / n$, where $M_i$ is the i-th experimental value, $O_i$ the corresponding model evaluation and $n$ the number of couplets 'experimental value, predicted value', is an important tool for the EBUA method. From the statistical distribution of this performance index, it is possible to infer the characteristics of the distribution of the ratio 'experimental data/model output' and, consequently, to evaluate the confidence limits for the model predictions. This method was applied to calculate the uncertainty level of a model developed to predict the migration of radiocaesium in lacustrine systems. Unfortunately, performance indices are affected by the uncertainty of the experimental data used in validation. Indeed, measurement results of environmental levels of contamination are generally associated with large uncertainty due to the measurement and sampling techniques and to the large variability in space and time of the measured quantities. It is demonstrated that this non-desired effect, in some circumstances, may be corrected by means of simple formulae
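    Given paired arrays of measurements and model predictions, the functional distance above reduces to a few lines; the snippet below is a direct transcription of the formula, not code from the paper.

```python
# Compute the 'functional distance' performance index d from paired arrays of
# experimental values M and model predictions O.
import numpy as np

def functional_distance(M: np.ndarray, O: np.ndarray) -> float:
    """d such that d^2 = (1/n) * sum_i (ln M_i - ln O_i)^2."""
    log_ratio = np.log(M) - np.log(O)
    return float(np.sqrt(np.mean(log_ratio ** 2)))
```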

  2. Reliability analysis of water distribution systems under uncertainty

    International Nuclear Information System (INIS)

    Kansal, M.L.; Kumar, Arun; Sharma, P.B.

    1995-01-01

    In most developing countries, Water Distribution Networks (WDN) are of the intermittent type because of the shortage of safe drinking water. Failure of a pipeline(s) in such cases causes not only a fall in one or more nodal heads but also poor connectivity of the source with various demand nodes of the system. Most previous works have used a two-step algorithm based on a pathset or cutset approach for connectivity analysis. The computations become more cumbersome when the connectivity of all demand nodes, taken together with that of the supply, is evaluated. In the present paper, network connectivity based on the concept of the Appended Spanning Tree (AST) is suggested to compute global network connectivity, which is defined as the probability of the source node being connected with all the demand nodes simultaneously. The concept of the AST has distinct advantages as it attacks the problem directly rather than indirectly, as most of the studies so far have done. Since the water distribution system is a repairable one, a general expression for pipeline availability using the failure/repair rate is considered. Furthermore, the sensitivity of global reliability estimates to likely errors in the estimation of the failure/repair rates of various pipelines is also studied.

  3. Monte Carlo methods for the reliability analysis of Markov systems

    International Nuclear Information System (INIS)

    Buslik, A.J.

    1985-01-01

    This paper presents Monte Carlo methods for the reliability analysis of Markov systems. Markov models are useful in treating dependencies between components. The present paper shows how the adjoint Monte Carlo method for the continuous time Markov process can be derived from the method for the discrete-time Markov process by a limiting process. The straightforward extensions to the treatment of mean unavailability (over a time interval) are given. System unavailabilities can also be estimated; this is done by making the system failed states absorbing, and not permitting repair from them. A forward Monte Carlo method is presented in which the weighting functions are related to the adjoint function. In particular, if the exact adjoint function is known then weighting factors can be constructed such that the exact answer can be obtained with a single Monte Carlo trial. Of course, if the exact adjoint function is known, there is no need to perform the Monte Carlo calculation. However, the formulation is useful since it gives insight into choices of the weight factors which will reduce the variance of the estimator
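    As a point of reference for the adjoint schemes discussed above, a plain forward Monte Carlo estimate of the unavailability of a single repairable (two-state Markov) component can be written as follows; the rates used are arbitrary illustrative values, and the adjoint weighting itself is not shown.

```python
# Forward Monte Carlo estimate of the unavailability at time T of one repairable
# component with constant failure rate lam and repair rate mu.
import random

def unavailability(lam: float, mu: float, T: float, trials: int = 100_000) -> float:
    failed_at_T = 0
    for _ in range(trials):
        t, up = 0.0, True
        while True:
            # Exponential sojourn in the current state, then switch state.
            t += random.expovariate(lam if up else mu)
            if t > T:
                break            # T falls inside this sojourn, so `up` is the state at T
            up = not up
        if not up:
            failed_at_T += 1
    return failed_at_T / trials

# Analytic check: lam/(lam+mu) * (1 - exp(-(lam+mu)*T))
print(unavailability(1e-3, 1e-1, 100.0))
```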

  4. Reliability analysis of load-sharing systems with memory.

    Science.gov (United States)

    Wang, Dewei; Jiang, Chendi; Park, Chanseok

    2018-02-22

    The load-sharing model has been studied since the early 1940s to account for the stochastic dependence of components in a parallel system. It assumes that, as components fail one by one, the total workload applied to the system is shared by the remaining components and thus affects their performance. Such dependent systems have been studied in many engineering applications, including fiber composites, manufacturing, power plants, workload analysis of computing, and software and hardware reliability. Many statistical models have been proposed to analyze the impact of each redistribution of the workload, i.e., the changes in the hazard rate of each remaining component. However, they do not consider how long a surviving component has worked prior to the redistribution. We call such load-sharing models memoryless. To remedy this potential limitation, we propose a general framework for load-sharing models that account for the work history. Through simulation studies, we show that an inappropriate use of the memoryless assumption could lead to inaccurate inference on the impact of redistribution. Further, a real-data example of plasma display devices is analyzed to illustrate our methods.
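    The memoryless baseline that the paper generalises can be illustrated with two identical exponential components sharing a load equally: when the first component fails, the survivor's hazard rate is multiplied by an acceleration factor. The rates and the acceleration factor below are arbitrary illustrative values, not the authors' history-dependent model.

```python
# Simulate the lifetime of a two-component equal load-sharing system under the
# memoryless assumption: the survivor's remaining life restarts at the higher rate.
import random

def system_lifetime(base_rate: float, acceleration: float) -> float:
    first_failure = min(random.expovariate(base_rate), random.expovariate(base_rate))
    return first_failure + random.expovariate(base_rate * acceleration)

samples = [system_lifetime(0.01, 2.0) for _ in range(10_000)]
print(sum(samples) / len(samples))   # mean system lifetime
```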

  5. Procedure for conducting a human-reliability analysis for nuclear power plants. Final report

    International Nuclear Information System (INIS)

    Bell, B.J.; Swain, A.D.

    1983-05-01

    This document describes in detail a procedure to be followed in conducting a human reliability analysis as part of a probabilistic risk assessment when such an analysis is performed according to the methods described in NUREG/CR-1278, Handbook for Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications. An overview of the procedure describing the major elements of a human reliability analysis is presented along with a detailed description of each element and an example of an actual analysis. An appendix consists of some sample human reliability analysis problems for further study

  6. Spinal appearance questionnaire: factor analysis, scoring, reliability, and validity testing.

    Science.gov (United States)

    Carreon, Leah Y; Sanders, James O; Polly, David W; Sucato, Daniel J; Parent, Stefan; Roy-Beaudry, Marjolaine; Hopkins, Jeffrey; McClung, Anna; Bratcher, Kelly R; Diamond, Beverly E

    2011-08-15

    Cross-sectional. This study presents the factor analysis of the Spinal Appearance Questionnaire (SAQ) and its psychometric properties. Although the SAQ has been administered to a large sample of patients with adolescent idiopathic scoliosis (AIS) treated surgically, its psychometric properties have not been fully evaluated. This study presents the factor analysis and scoring of the SAQ and evaluates its psychometric properties. The SAQ and the Scoliosis Research Society-22 (SRS-22) were administered to AIS patients who were being observed, braced or scheduled for surgery. Standard demographic data and radiographic measures including Lenke type and curve magnitude were also collected. Of the 1802 patients, 83% were female, with a mean age of 14.8 years and a mean initial Cobb angle of 55.8° (range, 0°-123°). From the 32 items of the SAQ, 15 loaded on two factors with consistent and significant correlations across all Lenke types. There is an Appearance (items 1-10) and an Expectations factor (items 12-15). Responses are summed giving a range of 5 to 50 for the Appearance domain and 5 to 20 for the Expectations domain. Cronbach's α was 0.88 for both domains and the Total score, with a test-retest reliability of 0.81 for Appearance and 0.91 for Expectations. Correlations with major curve magnitude were higher for the SAQ Appearance and SAQ Total scores compared to correlations between the SRS Appearance and SRS Total scores. The SAQ and SRS-22 scores were statistically significantly different in patients who were scheduled for surgery compared to those who were observed or braced. The SAQ is a valid measure of self-image in patients with AIS with greater correlation to curve magnitude than SRS Appearance and Total score. It also discriminates between patients who require surgery from those who do not.
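    Cronbach's α, as reported for the Appearance and Expectations domains, can be computed from a respondents-by-items score matrix; the snippet below is a generic implementation, not the scoring code used in the study.

```python
# Cronbach's alpha for an (n_respondents x k_items) array of item scores.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()      # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of the summed score
    return (k / (k - 1)) * (1.0 - item_vars / total_var)
```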

  7. An efficient phased mission reliability analysis for autonomous vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Remenyte-Prescott, R., E-mail: R.Remenyte-Prescott@nottingham.ac.u [Nottingham Transportation Engineering Centre, Faculty of Engineering, University of Nottingham, Nottingham NG7 2RD (United Kingdom); Andrews, J.D. [Nottingham Transportation Engineering Centre, Faculty of Engineering, University of Nottingham, Nottingham NG7 2RD (United Kingdom); Chung, P.W.H. [Department of Computer Science, Loughborough University, Loughborough LE11 3TU (United Kingdom)

    2010-03-15

    Autonomous systems are becoming more commonly used, especially in hazardous situations. Such systems are expected to make their own decisions about future actions when some capabilities degrade due to failures of their subsystems. Such decisions are made without human input, therefore they need to be well-informed in a short time when the situation is analysed and future consequences of the failure are estimated. The future planning of the mission should take account of the likelihood of mission failure. The reliability analysis for autonomous systems can be performed using the methodologies developed for phased mission analysis, where the causes of failure for each phase in the mission can be expressed by fault trees. Unmanned autonomous vehicles (UAVs) are of a particular interest in the aeronautical industry, where it is a long term ambition to operate them routinely in civil airspace. Safety is the main requirement for the UAV operation and the calculation of failure probability of each phase and the overall mission is the topic of this paper. When components or subsystems fail or environmental conditions throughout the mission change, these changes can affect the future mission. The new proposed methodology takes into account the available diagnostics data and is used to predict future capabilities of the UAV in real time. Since this methodology is based on the efficient BDD method, the quickly provided advice can be used in making decisions. When failures occur appropriate actions are required in order to preserve safety of the autonomous vehicle. The overall decision making strategy for autonomous vehicles is explained in this paper. Some limitations of the methodology are discussed and further improvements are presented based on experimental results.

  8. An efficient phased mission reliability analysis for autonomous vehicles

    International Nuclear Information System (INIS)

    Remenyte-Prescott, R.; Andrews, J.D.; Chung, P.W.H.

    2010-01-01

    Autonomous systems are becoming more commonly used, especially in hazardous situations. Such systems are expected to make their own decisions about future actions when some capabilities degrade due to failures of their subsystems. Such decisions are made without human input, therefore they need to be well-informed in a short time when the situation is analysed and future consequences of the failure are estimated. The future planning of the mission should take account of the likelihood of mission failure. The reliability analysis for autonomous systems can be performed using the methodologies developed for phased mission analysis, where the causes of failure for each phase in the mission can be expressed by fault trees. Unmanned autonomous vehicles (UAVs) are of a particular interest in the aeronautical industry, where it is a long term ambition to operate them routinely in civil airspace. Safety is the main requirement for the UAV operation and the calculation of failure probability of each phase and the overall mission is the topic of this paper. When components or subsystems fail or environmental conditions throughout the mission change, these changes can affect the future mission. The new proposed methodology takes into account the available diagnostics data and is used to predict future capabilities of the UAV in real time. Since this methodology is based on the efficient BDD method, the quickly provided advice can be used in making decisions. When failures occur appropriate actions are required in order to preserve safety of the autonomous vehicle. The overall decision making strategy for autonomous vehicles is explained in this paper. Some limitations of the methodology are discussed and further improvements are presented based on experimental results.

  9. Distribution-level electricity reliability: Temporal trends using statistical analysis

    International Nuclear Information System (INIS)

    Eto, Joseph H.; LaCommare, Kristina H.; Larsen, Peter; Todd, Annika; Fisher, Emily

    2012-01-01

    This paper helps to address the lack of comprehensive, national-scale information on the reliability of the U.S. electric power system by assessing trends in U.S. electricity reliability based on the information reported by the electric utilities on power interruptions experienced by their customers. The research analyzes up to 10 years of electricity reliability information collected from 155 U.S. electric utilities, which together account for roughly 50% of total U.S. electricity sales. We find that reported annual average duration and annual average frequency of power interruptions have been increasing over time at a rate of approximately 2% annually. We find that, independent of this trend, installation or upgrade of an automated outage management system is correlated with an increase in the reported annual average duration of power interruptions. We also find that reliance on IEEE Standard 1366-2003 is correlated with higher reported reliability compared to reported reliability not using the IEEE standard. However, we caution that we cannot attribute reliance on the IEEE standard as having caused or led to higher reported reliability because we could not separate the effect of reliance on the IEEE standard from other utility-specific factors that may be correlated with reliance on the IEEE standard. Highlights: trends in electricity reliability are assessed from the information reported by electric utilities; rigorous statistical techniques account for utility-specific differences; modest declines in reliability are found when analyzing the interruption duration and frequency experienced by utility customers; installation or upgrade of an OMS is correlated with an increase in the reported duration of power interruptions; and reliance on IEEE Standard 1366 is correlated with higher reported reliability.
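    The roughly 2% annual trend reported above corresponds to the coefficient on year in a log-linear regression with utility fixed effects; the sketch below assumes a hypothetical panel file and column names and is not the authors' model specification.

```python
# Log-linear trend in annual average interruption duration with utility fixed effects.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

saidi = pd.read_csv("utility_reliability.csv")      # columns: utility, year, saidi_minutes (hypothetical)
saidi["log_saidi"] = np.log(saidi["saidi_minutes"])

trend = smf.ols("log_saidi ~ year + C(utility)", data=saidi).fit()
print(trend.params["year"])   # ~0.02 would correspond to the reported ~2% per year
```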

  10. Modeling cognition dynamics and its application to human reliability analysis

    International Nuclear Information System (INIS)

    Mosleh, A.; Smidts, C.; Shen, S.H.

    1996-01-01

    For the past two decades, a number of approaches have been proposed for the identification and estimation of the likelihood of human errors, particularly for use in the risk and reliability studies of nuclear power plants. Despite the wide-spread use of the most popular among these methods, their fundamental weaknesses are widely recognized, and the treatment of human reliability has been considered as one of the soft spots of risk studies of large technological systems. To alleviate the situation, new efforts have focused on the development of human reliability models based on a more fundamental understanding of operator response and its cognitive aspects

  11. An empirical analysis of ERP adoption by oil and gas firms

    Science.gov (United States)

    Romero, Jorge

    2005-07-01

    Despite the growing popularity of enterprise-resource-planning (ERP) systems for the information technology infrastructure of large and medium-sized businesses, there is limited empirical evidence on the competitive benefits of ERP implementations. Case studies of individual firms provide insights but do not provide sufficient evidence to draw reliable inferences and cross-sectional studies of firms in multiple industries provide a broad-brush perspective of the performance effects associated with ERP installations. To narrow the focus to a specific competitive arena, I analyze the impact of ERP adoption on various dimensions of performance for firms in the Oil and Gas Industry. I selected the Oil and Gas Industry because several companies installed a specific type of ERP system, SAP R/3, during the period from 1990 to 2002. In fact, SAP was the dominant provider of enterprise software to oil and gas companies during this period. I evaluate performance of firms that implemented SAP R/3 relative to firms that did not adopt ERP systems in the pre-implementation, implementation and post-implementation periods. My analysis takes two different approaches, the first from a financial perspective and the second from a strategic perspective. Using the Sloan (General Motors) model commonly applied in financial statement analysis, I examine changes in performance for ERP-adopting firms versus non-adopting firms along the dimensions of asset utilization and return on sales. Asset utilization is more closely aligned with changes in leanness of operations, and return on sales is more closely aligned with customer-value-added. I test hypotheses related to the timing and magnitude of the impact of ERP implementation with respect to leanness of operations and customer value added. I find that SAP-adopting companies performed relatively better in terms of asset turnover than non-SAP-adopting companies during both the implementation and post-implementation periods and that SAP

  12. An Empirical Analysis of the Impact of Capital Market Activities on ...

    African Journals Online (AJOL)

    An Empirical Analysis of the Impact of Capital Market Activities on the Nigerian Economy. ... Others include the expansion of the stock market in terms of depth and breadth and the attraction of foreign direct investment and foreign portfolio investment into the Nigerian economic landscape. Keywords: Nigeria, Market ...

  13. Perception of urban retailing environments : an empirical analysis of consumer information and usage fields

    NARCIS (Netherlands)

    Timmermans, H.J.P.; vd Heijden, R.E.C.M.; Westerveld, J.

    1982-01-01

    This article reports on an empirical analysis of consumer information and usage fields in the city of Eindhoven. The main purposes of this study are to investigate the distance, sectoral and directional biases of these fields, to analyse whether the degree of biases is related to personal

  14. Steering the Ship through Uncertain Waters: Empirical Analysis and the Future of Evangelical Higher Education

    Science.gov (United States)

    Rine, P. Jesse; Guthrie, David S.

    2016-01-01

    Leaders of evangelical Christian colleges must navigate a challenging environment shaped by public concern about college costs and educational quality, federal inclinations toward increased regulation, and lingering fallout from the Great Recession. Proceeding from the premise that empirical analysis empowers institutional actors to lead well in…

  15. Deriving Multidimensional Poverty Indicators: Methodological Issues and an Empirical Analysis for Italy

    Science.gov (United States)

    Coromaldi, Manuela; Zoli, Mariangela

    2012-01-01

    Theoretical and empirical studies have recently adopted a multidimensional concept of poverty. There is considerable debate about the most appropriate degree of multidimensionality to retain in the analysis. In this work we add to the received literature in two ways. First, we derive indicators of multiple deprivation by applying a particular…

  16. Extended Analysis of Empirical Citations with Skinner's "Verbal Behavior": 1984-2004

    Science.gov (United States)

    Dixon, Mark R.; Small, Stacey L.; Rosales, Rocio

    2007-01-01

    The present paper comments on and extends the citation analysis of verbal operant publications based on Skinner's "Verbal Behavior" (1957) by Dymond, O'Hora, Whelan, and O'Donovan (2006). Variations in population parameters were evaluated for only those studies that Dymond et al. categorized as empirical. Preliminary results indicate that the…

  17. Does risk management contribute to IT project success? A meta-analysis of empirical evidence

    NARCIS (Netherlands)

    de Bakker, K.F.C.; Boonstra, A.; Wortmann, J.C.

    The question whether risk management contributes to IT project success is considered relevant by people from both academic and practitioners' communities already for a long time. This paper presents a meta-analysis of the empirical evidence that either supports or opposes the claim that risk

  18. Critical Access Hospitals and Retail Activity: An Empirical Analysis in Oklahoma

    Science.gov (United States)

    Brooks, Lara; Whitacre, Brian E.

    2011-01-01

    Purpose: This paper takes an empirical approach to determining the effect that a critical access hospital (CAH) has on local retail activity. Previous research on the relationship between hospitals and economic development has primarily focused on single-case, multiplier-oriented analysis. However, as the efficacy of federal and state-level rural…

  19. Human Reliability Analysis. Applicability of the HRA-concept in maintenance shutdown

    International Nuclear Information System (INIS)

    Obenius, Aino

    2007-08-01

    Probabilistic Safety Analysis (PSA) is performed for Swedish nuclear power plants in order to make predictions about and improvements to system safety. The analysis of the Three Mile Island and Chernobyl accidents contributed to broadening the approach to nuclear power plant safety. A system perspective focusing on the interaction between aspects of Man, Technology and Organization (MTO) emerged in addition to the development of Human Factors knowledge. To take the human influence on the technical system into consideration when performing PSAs, a Human Reliability Analysis (HRA) is performed. PSA is performed for different stages and plant operating states, and the operating state currently addressed by Swedish analyses is Low Power and Shutdown (LPSD), also called Shutdown PSA (SPSA). The purpose of this master's thesis is to describe methods and basic models used when analysing human reliability for the LPSD state. The following questions are at issue: 1. How can the LPSD state be characterised and defined? 2. What is important to take into consideration when performing an LPSD HRA? 3. How can human behaviour be modelled for an LPSD risk analysis? 4. According to available empirical material, how are the questions above treated in performed analyses of human operation during LPSD? 5. How do the answers to the questions above affect the way methods for LPSD analysis could and/or should be developed? The procedure of this project has mainly consisted of literature studies of available theory for the modelling of human behaviour and risk analysis of the LPSD state. This study concerns the analysis of planned outages during which maintenance, fuel change, tests and inspections are performed. The outage period is characterised by planned maintenance activities performed in rotating three-shift work around the clock, with many of the work tasks on the plant performed by external contractors. The working conditions are characterised by stress due to heat, radiation and physically demanding or monotonous

  20. An Empirical Analysis of the Default Rate of Informal Lending—Evidence from Yiwu, China

    Science.gov (United States)

    Lu, Wei; Yu, Xiaobo; Du, Juan; Ji, Feng

    This study empirically analyzes the underlying factors contributing to the default rate of informal lending. The paper adopts snowball sampling interviews to collect data and uses a logistic regression model to explore the specific factors. The results of these analyses validate the explanation of how informal lending differs from commercial loans. Factors that contribute to the default rate have particular attributes, while sharing some similarities with commercial bank or FICO credit scoring indices. Finally, our concluding remarks draw some inferences from the empirical analysis and speculate as to what this may imply for the role of the formal and informal financial sectors.

  1. Reliability analysis of reinforced concrete grids with nonlinear material behavior

    Energy Technology Data Exchange (ETDEWEB)

    Neves, Rodrigo A [EESC-USP, Av. Trabalhador Sao Carlense, 400, 13566-590 Sao Carlos (Brazil); Chateauneuf, Alaa [LaMI-UBP and IFMA, Campus de Clermont-Fd, Les Cezeaux, BP 265, 63175 Aubiere cedex (France)]. E-mail: alaa.chateauneuf@ifma.fr; Venturini, Wilson S [EESC-USP, Av. Trabalhador Sao Carlense, 400, 13566-590 Sao Carlos (Brazil)]. E-mail: venturin@sc.usp.br; Lemaire, Maurice [LaMI-UBP and IFMA, Campus de Clermont-Fd, Les Cezeaux, BP 265, 63175 Aubiere cedex (France)

    2006-06-15

    Reinforced concrete grids are usually used to support large floor slabs. These grids are characterized by a great number of critical cross-sections, where the overall failure is usually sudden. However, the nonlinear behavior of concrete leads to the redistribution of internal forces, and accurate reliability assessment becomes mandatory. This paper presents a reliability study of reinforced concrete (RC) grids based on coupling Monte Carlo simulations with response surface techniques. This approach allows us to analyze real RC grids with a large number of failure components. The response surface is used to evaluate the structural safety by using first order reliability methods. The application to simple grids shows the interest of the proposed method and the role of moment redistribution in the reliability assessment.
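    The Monte Carlo side of such a reliability study can be illustrated with a generic limit-state function; the distributions and parameter values below are placeholders, and the response-surface coupling and RC grid mechanics themselves are not reproduced.

```python
# Crude Monte Carlo estimate of the failure probability for g(R, S) = R - S,
# with lognormal resistance R and load effect S (illustrative values only).
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
R = rng.lognormal(mean=np.log(500.0), sigma=0.10, size=n)   # resistance
S = rng.lognormal(mean=np.log(350.0), sigma=0.20, size=n)   # load effect

pf = np.mean(R - S <= 0.0)
print(f"estimated failure probability: {pf:.2e}")
```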

  2. Analysis of travel time reliability on Indiana interstates.

    Science.gov (United States)

    2009-09-15

    Travel-time reliability is a key performance measure in any transportation system. It is a measure of quality of travel time experienced by transportation system users and reflects the efficiency of the transportation system to serve citizens, bu...

  3. Development of a Conservative Model Validation Approach for Reliable Analysis

    Science.gov (United States)

    2015-01-01

    CIE 2015, August 2-5, 2015, Boston, Massachusetts, USA. [DRAFT] DETC2015-46982: Development of a Conservative Model Validation Approach for Reliable Analysis. The approach aims to obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the... In Section 3, the proposed conservative model validation is briefly compared to the conventional model validation approach. Section 4 describes how to account

  4. CARES/PC - CERAMICS ANALYSIS AND RELIABILITY EVALUATION OF STRUCTURES

    Science.gov (United States)

    Szatmary, S. A.

    1994-01-01

    The beneficial properties of structural ceramics include their high-temperature strength, light weight, hardness, and corrosion and oxidation resistance. For advanced heat engines, ceramics have demonstrated functional abilities at temperatures well beyond the operational limits of metals. This is offset by the fact that ceramic materials tend to be brittle. When a load is applied, their lack of significant plastic deformation causes the material to crack at microscopic flaws, destroying the component. CARES/PC performs statistical analysis of data obtained from the fracture of simple, uniaxial tensile or flexural specimens and estimates the Weibull and Batdorf material parameters from this data. CARES/PC is a subset of the program CARES (COSMIC program number LEW-15168) which calculates the fast-fracture reliability or failure probability of ceramic components utilizing the Batdorf and Weibull models to describe the effects of multi-axial stress states on material strength. CARES additionally requires that the ceramic structure be modeled by a finite element program such as MSC/NASTRAN or ANSYS. The more limited CARES/PC does not perform fast-fracture reliability estimation of components. CARES/PC estimates ceramic material properties from uniaxial tensile or from three- and four-point bend bar data. In general, the parameters are obtained from the fracture stresses of many specimens (30 or more are recommended) whose geometry and loading configurations are held constant. Parameter estimation can be performed for single or multiple failure modes by using the least-squares analysis or the maximum likelihood method. Kolmogorov-Smirnov and Anderson-Darling goodness-of-fit tests measure the accuracy of the hypothesis that the fracture data comes from a population with a distribution specified by the estimated Weibull parameters. Ninety-percent confidence intervals on the Weibull parameters and the unbiased value of the shape parameter for complete samples are provided
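    The Weibull parameter estimation performed by CARES/PC can be illustrated with scipy's maximum-likelihood fit. This is not the original FORTRAN code: the fracture stresses shown are made-up placeholders, and the goodness-of-fit tests and confidence intervals are omitted.

```python
# Two-parameter Weibull fit to fracture stresses (location fixed at zero).
import numpy as np
from scipy import stats

fracture_stress = np.array([312., 355., 367., 389., 401., 420., 433., 455., 470., 502.])  # MPa, placeholder data

shape_m, _, scale_sigma0 = stats.weibull_min.fit(fracture_stress, floc=0.0)
print(f"Weibull modulus m = {shape_m:.2f}, characteristic strength = {scale_sigma0:.1f} MPa")
```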

  5. Reliability model analysis and primary experimental evaluation of laser triggered pulse trigger

    International Nuclear Information System (INIS)

    Chen Debiao; Yang Xinglin; Li Yuan; Li Jin

    2012-01-01

    A high-performance pulse trigger can enhance the performance and stability of the PPS. It is necessary to evaluate the reliability of the LTGS pulse trigger, so we establish a reliability analysis model of this pulse trigger based on the CARMES software; the reliability evaluation is in accord with the statistical results. (authors)

  6. Reliability analysis for the creep rupture mode of failure

    International Nuclear Information System (INIS)

    Vaidyanathan, S.

    1975-01-01

    An analytical study has been carried out to relate the factors of safety employed in the design of a component to the probability of failure in the thermal creep rupture mode. The analysis considers the statistical variations in the operating temperature, stress and rupture time, and applies the life fraction damage criterion as the indicator of failure. Typical results for solution annealed type 304 stainless steel material for the temperature and stress variations expected in an LMFBR environment have been obtained. The analytical problem was solved by considering the joint distribution of the independent variables and deriving the distribution for the function associated with the probability of failure by integrating over proper regions as dictated by the deterministic design rule. This leads to a triple integral for the final probability of failure where the coefficients of variation associated with the temperature, stress and rupture time distributions can be specified by the user. The derivation is general, and can be used for time varying stress histories and the case of irradiated material where the rupture time varies with accumulated fluence. Example calculations applied to solution annealed type 304 stainless steel material have been carried out for an assumed coefficient of variation of 2% for temperature and 6% for stress. The results show that the probability of failure associated with dependent stress intensity limits specified in the ASME Boiler and Pressure Vessel Section III Code Case 1592 is less than 5×10⁻⁸. Rupture under thermal creep conditions is a highly complicated phenomenon. It is believed that the present study will help in quantifying the reliability to be expected with deterministic design factors of safety.

  7. Reliability analysis of a sensitive and independent stabilometry parameter set.

    Science.gov (United States)

    Nagymáté, Gergely; Orlovits, Zsanett; Kiss, Rita M

    2018-01-01

    Recent studies have suggested reduced independent and sensitive parameter sets for stabilometry measurements based on correlation and variance analyses. However, the reliability of these recommended parameter sets has not been studied in the literature or not in every stance type used in stabilometry assessments, for example, single leg stances. The goal of this study is to evaluate the test-retest reliability of different time-based and frequency-based parameters that are calculated from the center of pressure (CoP) during bipedal and single leg stance for 30- and 60-second measurement intervals. Thirty healthy subjects performed repeated standing trials in a bipedal stance with eyes open and eyes closed conditions and in a single leg stance with eyes open for 60 seconds. A force distribution measuring plate was used to record the CoP. The reliability of the CoP parameters was characterized by using the intraclass correlation coefficient (ICC), standard error of measurement (SEM), minimal detectable change (MDC), coefficient of variation (CV) and CV compliance rate (CVCR). Based on the ICC, SEM and MDC results, many parameters yielded fair to good reliability values, while the CoP path length yielded the highest reliability (smallest ICC > 0.67 (0.54-0.79), largest SEM% = 19.2%). Usually, frequency type parameters and extreme value parameters yielded poor reliability values. There were differences in the reliability of the maximum CoP velocity (better with 30 seconds) and mean power frequency (better with 60 seconds) parameters between the different sampling intervals.
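    The SEM and MDC values reported for each CoP parameter follow directly from the ICC and the between-subject standard deviation of that parameter; the numbers in the example below are illustrative only.

```python
# Standard error of measurement and 95% minimal detectable change from an ICC.
import math

def sem_and_mdc(sd_between: float, icc: float) -> tuple[float, float]:
    sem = sd_between * math.sqrt(1.0 - icc)   # standard error of measurement
    mdc = 1.96 * math.sqrt(2.0) * sem         # minimal detectable change (95%)
    return sem, mdc

print(sem_and_mdc(sd_between=120.0, icc=0.67))   # e.g. CoP path length in mm (illustrative)
```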

  8. Reliability analysis of a sensitive and independent stabilometry parameter set

    Science.gov (United States)

    Nagymáté, Gergely; Orlovits, Zsanett

    2018-01-01

    Recent studies have suggested reduced independent and sensitive parameter sets for stabilometry measurements based on correlation and variance analyses. However, the reliability of these recommended parameter sets has not been studied in the literature or not in every stance type used in stabilometry assessments, for example, single leg stances. The goal of this study is to evaluate the test-retest reliability of different time-based and frequency-based parameters that are calculated from the center of pressure (CoP) during bipedal and single leg stance for 30- and 60-second measurement intervals. Thirty healthy subjects performed repeated standing trials in a bipedal stance with eyes open and eyes closed conditions and in a single leg stance with eyes open for 60 seconds. A force distribution measuring plate was used to record the CoP. The reliability of the CoP parameters was characterized by using the intraclass correlation coefficient (ICC), standard error of measurement (SEM), minimal detectable change (MDC), coefficient of variation (CV) and CV compliance rate (CVCR). Based on the ICC, SEM and MDC results, many parameters yielded fair to good reliability values, while the CoP path length yielded the highest reliability (smallest ICC > 0.67 (0.54–0.79), largest SEM% = 19.2%). Usually, frequency type parameters and extreme value parameters yielded poor reliability values. There were differences in the reliability of the maximum CoP velocity (better with 30 seconds) and mean power frequency (better with 60 seconds) parameters between the different sampling intervals. PMID:29664938

  9. Analysis of NPP protection structure reliability under impact of a falling aircraft

    International Nuclear Information System (INIS)

    Shul'man, G.S.

    1996-01-01

    A methodology for evaluating the reliability of NPP protection structures under the impact of a falling aircraft is considered. The methodology is based on the probabilistic analysis of all potential events. The problem is solved in three stages: determination of the loads on structural units, calculation of the local reliability of the protection structures under the assigned loads, and estimation of the overall structure reliability. The proposed methodology may be applied at the NPP design stage and for determining the reliability of existing structures.

  10. Method of reliability allocation based on fault tree analysis and fuzzy math in nuclear power plants

    International Nuclear Information System (INIS)

    Chen Zhaobing; Deng Jian; Cao Xuewu

    2005-01-01

    Reliability allocation is a difficult multi-objective optimization problem. It can be applied not only to determine the reliability characteristics of reactor systems, subsystems and main components but also to improve the design, operation and maintenance of nuclear plants. In this paper, fuzzy mathematics, a powerful tool for fuzzy optimization, and fault tree analysis, an effective method of reliability analysis, are applied to the reliability allocation model to handle, respectively, the fuzzy character of some factors and the choice of subsystems. We thus develop a failure rate allocation model based on fault tree analysis and fuzzy mathematics. For the reliability constraint factors, six important ones are chosen according to the practical needs of the reliability allocation. Subsystems are selected by a top-level fault tree analysis so as to avoid allocating reliability to all equipment and components, including unnecessary parts. During the allocation process, some factors can be calculated or measured quantitatively, while others can only be assessed qualitatively by expert rating. We therefore adopt fuzzy decision making and dualistic contrast, together with fault tree analysis, to carry out the reliability allocation. Finally, the reliability allocation of an emergency diesel generator is used as an example to illustrate the model and to show that it is simple and applicable. (authors)
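
    As a rough illustration of the allocation step only (not the authors' model), a failure-rate budget can be distributed over subsystems in proportion to weights aggregated from fuzzy ratings of allocation factors; the subsystems, factors, ratings and target value below are all hypothetical.

        # Hypothetical sketch: allocate a system failure-rate budget to subsystems
        # using normalized weights aggregated from fuzzy ratings of several factors.
        SYSTEM_TARGET = 1.0e-4   # assumed overall failure-rate target [1/h]

        # fuzzy ratings in [0, 1] per subsystem for illustrative allocation factors
        ratings = {
            "governor":     {"complexity": 0.8, "criticality": 0.6, "maturity": 0.4},
            "fuel_system":  {"complexity": 0.5, "criticality": 0.7, "maturity": 0.6},
            "start_system": {"complexity": 0.3, "criticality": 0.9, "maturity": 0.5},
        }
        factor_weights = {"complexity": 0.4, "criticality": 0.4, "maturity": 0.2}

        def score(r):
            # weighted average of the fuzzy ratings (a higher score gets a larger share of the budget)
            return sum(factor_weights[f] * v for f, v in r.items())

        scores = {name: score(r) for name, r in ratings.items()}
        total = sum(scores.values())
        allocation = {name: SYSTEM_TARGET * s / total for name, s in scores.items()}
        for name, lam in allocation.items():
            print(f"{name:>12s}: allocated failure rate {lam:.2e} per hour")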

  11. Improvement of human reliability analysis method for PRA

    International Nuclear Information System (INIS)

    Tanji, Junichi; Fujimoto, Haruo

    2013-09-01

    As part of the development and improvement of methods used in probabilistic risk assessments (PRAs), the human reliability analysis (HRA) method needs to be refined by, for example, incorporating consideration of the operator's cognitive process into the evaluation of diagnosis errors and decision-making errors. JNES has developed an HRA method based on ATHENA which is suited to handling the structured relationship among diagnosis errors, decision-making errors and the operator cognition process. This report summarizes the outcomes of improving the HRA method, with enhancements to evaluate how a degraded plant condition affects the operator's cognitive process and to evaluate human error probabilities (HEPs) corresponding to the contents of operator tasks. In addition, this report describes the results of case studies on representative accident sequences to investigate the applicability of the developed HRA method. HEPs for the same accident sequences are also estimated using the THERP method, the most widely used HRA method, and the results of the two methods are compared to show their differences and the issues to be resolved. The main conclusions are as follows: (1) Improvement of the HRA method using an operator cognitive action model. Factors to be considered in the evaluation of human errors were clarified, degraded plant safety conditions were incorporated into the HRA, and HEPs affected by the contents of operator tasks were investigated, so as to improve the HRA method that integrates the operator cognitive action model into the ATHENA method. In addition, the procedures of the improved method were delineated in the form of a flowchart. (2) Case studies and comparison with results evaluated by the THERP method. Four operator actions modeled in the PRAs of representative BWR5 and 4-loop PWR plants were selected and evaluated as case studies. These cases were also evaluated using

  12. Application of Metric-based Software Reliability Analysis to Example Software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Smidts, Carol

    2008-07-01

    The software reliability of the TELLERFAST ATM software is analyzed by using two metric-based software reliability analysis methods, a state transition diagram-based method and a test coverage-based method. The procedures for the software reliability analysis using the two methods and the analysis results are provided in this report. It is found that the two methods are complementary, and therefore further research on combining the two methods, so that the software reliability analysis benefits from this complementary effect, is recommended

  13. Reliability Analysis Multiple Redundancy Controller for Nuclear Safety Systems

    International Nuclear Information System (INIS)

    Son, Gwangseop; Kim, Donghoon; Son, Choulwoong

    2013-01-01

    In this paper, the architecture of the multiple redundancy controller (MRC) for nuclear safety systems is described. The MRC is configured for multiple modular redundancy (MMR) composed of dual modular redundancy (DMR) and triple modular redundancy (TMR). Markov models of the MRC architecture were developed, and the reliability and Mean Time To Failure (MTTF) were then analyzed using these models. From the reliability analyses of the MRC, it is found that the failure rate of each module in the MRC should be less than 2 × 10⁻⁴/hour and that the average increase rate of MTTF per FCF increment, i.e. ΔMTTF/ΔFCF, is 4 months per 0.1
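
    A minimal sketch of how such Markov figures are typically obtained (this is not the authors' model): for a 2-out-of-3 (TMR) group with exponential module failures and no repair, the MTTF follows from the absorbing Markov chain over the number of healthy modules. The failure-rate value is reused from the abstract purely as an example input.

        import numpy as np

        lam = 2.0e-4   # assumed per-module failure rate [1/h], taken from the abstract as an example

        # Transient states: 3 healthy modules, 2 healthy modules (2-out-of-3 still works).
        # Generator restricted to the transient states (absorption = fewer than 2 healthy modules).
        Q = np.array([[-3 * lam, 3 * lam],
                      [0.0,     -2 * lam]])

        # Expected times to absorption solve (-Q) t = 1 (standard absorbing Markov chain result).
        t = np.linalg.solve(-Q, np.ones(2))
        print(f"MTTF starting from all modules healthy: {t[0]:.3e} h")   # analytically 5 / (6 * lam)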

  14. Reliability Engineering Analysis of ATLAS Data Reprocessing Campaigns

    CERN Document Server

    Vaniachine, A; The ATLAS collaboration; Karpenko, D

    2013-01-01

    During three years of LHC data taking, the ATLAS collaboration completed three petascale data reprocessing campaigns on the Grid, with up to 2 PB of data being reprocessed every year. In reprocessing on the Grid, failures can occur for a variety of reasons, while Grid heterogeneity makes failures hard to diagnose and repair quickly. As a result, Big Data processing on the Grid must tolerate a continuous stream of failures, errors and faults. While ATLAS fault-tolerance mechanisms improve the reliability of Big Data processing in the Grid, their benefits come at a cost and introduce delays, making performance prediction difficult. Reliability Engineering provides a framework for a fundamental understanding of Big Data processing on the Grid, and such a framework is not a desirable enhancement but a necessary requirement. In ATLAS, cost monitoring and performance prediction became critical for the success of the reprocessing campaigns conducted in preparation for the major physics conferences. In addition, our Reliability...

  15. Reliability Engineering Analysis of ATLAS Data Reprocessing Campaigns

    CERN Document Server

    Vaniachine, A; The ATLAS collaboration; Karpenko, D

    2014-01-01

    During three years of LHC data taking, the ATLAS collaboration completed three petascale data reprocessing campaigns on the Grid, with up to 2 PB of data being reprocessed every year. In reprocessing on the Grid, failures can occur for a variety of reasons, while Grid heterogeneity makes failures hard to diagnose and repair quickly. As a result, Big Data processing on the Grid must tolerate a continuous stream of failures, errors and faults. While ATLAS fault-tolerance mechanisms improve the reliability of Big Data processing in the Grid, their benefits come at a cost and introduce delays, making performance prediction difficult. Reliability Engineering provides a framework for a fundamental understanding of Big Data processing on the Grid, and such a framework is not a desirable enhancement but a necessary requirement. In ATLAS, cost monitoring and performance prediction became critical for the success of the reprocessing campaigns conducted in preparation for the major physics conferences. In addition, our Reliability...

  16. Pharmacoeconomic analysis of voriconazole vs. caspofungin in the empirical antifungal therapy of febrile neutropenia in Australia.

    Science.gov (United States)

    Al-Badriyeh, Daoud; Liew, Danny; Stewart, Kay; Kong, David C M

    2012-05-01

    In two major clinical trials, voriconazole and caspofungin were recommended as alternatives to liposomal amphotericin B for empirical use in febrile neutropenia. This study investigated the health economic impact of using voriconazole vs. caspofungin in patients with febrile neutropenia. A decision analytic model was developed to measure downstream consequences of empirical antifungal therapy. Clinical outcomes measured were success, breakthrough infection, persistent base-line infection, persistent fever, premature discontinuation and death. Treatment transition probabilities and patterns were derived directly from data in two relevant randomised controlled trials. Resource use was estimated using an expert clinical panel. Cost inputs were obtained from the latest Australian sources. The analysis adopted the perspective of the Australian hospital system. The use of caspofungin led to a lower expected mean cost per patient than voriconazole (AU$40,558 vs. AU$41,356), with a net cost saving of AU$798 (1.9%) per patient. Results were most sensitive to the duration of therapy and the alternative therapy used post-discontinuation. In uncertainty analysis, the cost associated with caspofungin was less than that with voriconazole in 65.5% of cases. This is the first economic evaluation of voriconazole vs. caspofungin for empirical therapy. Caspofungin appears to have a higher probability of being cost-saving than voriconazole for empirical therapy; however, the difference between the two medications does not appear to be statistically significant. © 2011 Blackwell Verlag GmbH.
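
    The decision-analytic core of such an evaluation can be sketched as a probability-weighted sum of branch costs per strategy; the branch probabilities and costs below are placeholders for illustration, not the study's inputs.

        # Placeholder branch probabilities and per-branch costs (AU$); NOT the study's data.
        strategies = {
            "caspofungin": [
                ("success",                   0.33, 35000.0),
                ("breakthrough infection",    0.05, 60000.0),
                ("persistent fever",          0.40, 42000.0),
                ("premature discontinuation", 0.17, 38000.0),
                ("death",                     0.05, 50000.0),
            ],
            "voriconazole": [
                ("success",                   0.26, 35000.0),
                ("breakthrough infection",    0.02, 60000.0),
                ("persistent fever",          0.45, 43000.0),
                ("premature discontinuation", 0.22, 40000.0),
                ("death",                     0.05, 50000.0),
            ],
        }

        for name, branches in strategies.items():
            assert abs(sum(p for _, p, _ in branches) - 1.0) < 1e-9   # branches must be exhaustive
            expected = sum(p * c for _, p, c in branches)             # probability-weighted cost
            print(f"{name:>12s}: expected cost per patient AU${expected:,.0f}")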

  17. Long term reliability analysis of standby diesel generators

    International Nuclear Information System (INIS)

    Winfield, D.J.

    1988-01-01

    The long term reliability of 11 diesel generators of 125 to 250 kVA size has been analysed from 26 years of database information on individual diesel service as standby power supplies for the Chalk River research reactor facilities. Failure-to-start-on-demand and failure-to-run data are presented, and failures by diesel subsystem and multiple failures are also analysed. A brief comparison is made with reliability studies of larger diesel generator units used for standby power service in nuclear power plants. (author)

  18. Technical information report: Plasma melter operation, reliability, and maintenance analysis

    International Nuclear Information System (INIS)

    Hendrickson, D.W.

    1995-01-01

    This document provides a technical report on the operability, reliability, and maintenance of a plasma melter for low-level waste vitrification, in support of the Hanford Tank Waste Remediation System (TWRS) Low-Level Waste (LLW) Vitrification Program. A process description is provided that minimizes maintenance and downtime and includes material and energy balances, equipment sizes and arrangement, startup/operation/maintenance/shutdown cycle descriptions, and the basis for scale-up to a 200 metric ton/day production facility. Operational requirements are provided, including utilities, feeds, labor, and maintenance. Equipment reliability estimates and maintenance requirements are provided, including a list of failure modes, responses, and consequences

  19. Embedded mechatronic systems 1 analysis of failures, predictive reliability

    CERN Document Server

    El Hami, Abdelkhalak

    2015-01-01

    In operation, mechatronic embedded systems are stressed by loads of different kinds: climatic (temperature, humidity), vibration, electrical and electromagnetic. The stresses in components which induce failure mechanisms should be identified and modeled for better control. AUDACE is a collaborative project of the Mov'eo cluster that addresses issues specific to the reliability of embedded mechatronic systems. AUDACE means analyzing the causes of failure of components of onboard mechatronic systems. The goal of the project is to optimize the design of mechatronic devices for reliability. The projec

  20. Total Longitudinal Moment Calculation and Reliability Analysis of Yacht Structures

    Science.gov (United States)

    Zhi, Wenzheng; Lin, Shaofen

    In order to check the reliability of a yacht built from FRP (Fiber Reinforced Plastic) materials, this paper analyzes the vertical forces and the calculation method for the overall longitudinal bending moment on the yacht. In particular, the paper focuses on the impact of speed on the still-water bending moment. Then, considering the mechanical properties of cap-type stiffeners in composite materials, the ultimate bearing capacity of the yacht is worked out, and finally the reliability of the yacht is calculated using response surface methodology. The results can be used in yacht design and operation.

  1. Reliability modeling and analysis of smart power systems

    CERN Document Server

    Karki, Rajesh; Verma, Ajit Kumar

    2014-01-01

    The volume presents the research work in understanding, modeling and quantifying the risks associated with different ways of implementing smart grid technology in power systems in order to plan and operate a modern power system with an acceptable level of reliability. Power systems throughout the world are undergoing significant changes creating new challenges to system planning and operation in order to provide reliable and efficient use of electrical energy. The appropriate use of smart grid technology is an important drive in mitigating these problems and requires considerable research acti

  2. Exploratory factor analysis and reliability analysis with missing data: A simple method for SPSS users

    Directory of Open Access Journals (Sweden)

    Bruce Weaver

    2014-09-01

    Full Text Available Missing data is a frequent problem for researchers conducting exploratory factor analysis (EFA) or reliability analysis. The SPSS FACTOR procedure allows users to select listwise deletion, pairwise deletion or mean substitution as a method for dealing with missing data. The shortcomings of these methods are well-known. Graham (2009) argues that a much better way to deal with missing data in this context is to use a matrix of expectation maximization (EM) covariances (or correlations) as input for the analysis. SPSS users who have the Missing Values Analysis add-on module can obtain vectors of EM means and standard deviations plus EM correlation and covariance matrices via the MVA procedure. But unfortunately, MVA has no /MATRIX subcommand, and therefore cannot write the EM correlations directly to a matrix dataset of the type needed as input to the FACTOR and RELIABILITY procedures. We describe two macros that (in conjunction with an intervening MVA command) carry out the data management steps needed to create two matrix datasets, one containing EM correlations and the other EM covariances. Either of those matrix datasets can then be used as input to the FACTOR procedure, and the EM correlations can also be used as input to RELIABILITY. We provide an example that illustrates the use of the two macros to generate the matrix datasets and how to use those datasets as input to the FACTOR and RELIABILITY procedures. We hope that this simple method for handling missing data will prove useful to both students and researchers who are conducting EFA or reliability analysis.
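
    Outside SPSS, the same idea can be sketched directly: estimate an EM mean/covariance under a multivariate-normal assumption and feed the result to the reliability or factor analysis (Cronbach's alpha is used here as a simple reliability example). This is a bare-bones sketch under stated assumptions, not a port of the macros described in the paper.

        import numpy as np

        def em_covariance(X, n_iter=100):
            """ML mean/covariance of data with missing values (NaN) under a
            multivariate-normal model, via a basic EM loop."""
            X = np.asarray(X, dtype=float)
            n, p = X.shape
            mu = np.nanmean(X, axis=0)
            sigma = np.diag(np.nanvar(X, axis=0))
            for _ in range(n_iter):
                sum_x, sum_xx = np.zeros(p), np.zeros((p, p))
                for row in X:
                    x = row.copy()
                    cov_add = np.zeros((p, p))
                    miss = np.isnan(row)
                    if miss.any():
                        obs = ~miss
                        s_oo = sigma[np.ix_(obs, obs)]
                        s_mo = sigma[np.ix_(miss, obs)]
                        # E-step: conditional mean of the missing entries given the observed ones
                        x[miss] = mu[miss] + s_mo @ np.linalg.solve(s_oo, row[obs] - mu[obs])
                        # conditional covariance correction for the missing block
                        cov_add[np.ix_(miss, miss)] = (sigma[np.ix_(miss, miss)]
                                                       - s_mo @ np.linalg.solve(s_oo, s_mo.T))
                    sum_x += x
                    sum_xx += np.outer(x, x) + cov_add
                # M-step: update mean and covariance from the expected sufficient statistics
                mu = sum_x / n
                sigma = sum_xx / n - np.outer(mu, mu)
            return mu, sigma

        def cronbach_alpha(cov):
            k = cov.shape[0]
            return k / (k - 1) * (1 - np.trace(cov) / cov.sum())

        # usage sketch: X is an (n, k) item-score matrix with NaN for missing entries
        # mu, S = em_covariance(X); print(cronbach_alpha(S))
        # a correlation matrix for factor-analysis input: S / np.sqrt(np.outer(np.diag(S), np.diag(S)))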

  3. A reliable method for the stability analysis of structures ...

    African Journals Online (AJOL)

    The detection of structural configurations with singular tangent stiffness matrix is essential because they can be unstable. The secondary paths, especially in unstable buckling, can play the most important role in the loss of stability and collapse of the structure. A new method for reliable detection and accurate computation of ...

  4. Fiber Access Networks: Reliability Analysis and Swedish Broadband Market

    Science.gov (United States)

    Wosinska, Lena; Chen, Jiajia; Larsen, Claus Popp

    Fiber access network architectures such as active optical networks (AONs) and passive optical networks (PONs) have been developed to support the growing bandwidth demand. Whereas particularly Swedish operators prefer AON, this may not be the case for operators in other countries. The choice depends on a combination of technical requirements, practical constraints, business models, and cost. Due to the increasing importance of reliable access to network services, connection availability is becoming one of the most crucial issues for access networks, which should be reflected in the network owner's architecture decision. In many cases protection against failures is realized by adding backup resources. However, there is a trade-off between the cost of protection and the level of service reliability, since improving reliability performance by duplication of network resources (and capital expenditures, CAPEX) may be too expensive. In this paper we present the evolution of fiber access networks and compare reliability performance in relation to investment and management cost for some representative cases. We consider both standard and novel architectures for deployment in both sparsely and densely populated areas. While some recent works have focused on PON protection schemes with reduced CAPEX, the current and future effort should be put on minimizing the operational expenditures (OPEX) during the access network lifetime.

  5. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    DEFF Research Database (Denmark)

    Negra, Nicola Barberis; Holmstrøm, Ole; Bak-Jensen, Birgitte

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays...

  6. Windfarm generation assessment for reliability analysis of power systems

    DEFF Research Database (Denmark)

    Negra, N.B.; Holmstrøm, O.; Bak-Jensen, B.

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays...

  7. A comparative reliability analysis of ETCS train radio communications

    NARCIS (Netherlands)

    Hermanns, H.; Becker, B.; Jansen, D.N.; Damm, W.; Usenko, Y.S.; Fränzle, M.; Olderog, E.-R.; Podelski, A.; Wilhelm, R.

    StoCharts have been proposed as a UML statechart extension for performance and dependability evaluation, and were applied in the context of train radio reliability assessment to show the principal tractability of realistic cases with this approach. In this paper, we extend on this bare feasibility

  8. Architecture-Based Reliability Analysis of Web Services

    Science.gov (United States)

    Rahmani, Cobra Mariam

    2012-01-01

    In a Service Oriented Architecture (SOA), the hierarchical complexity of Web Services (WS) and their interactions with the underlying Application Server (AS) create new challenges in providing a realistic estimate of WS performance and reliability. The current approaches often treat the entire WS environment as a black-box. Thus, the sensitivity…

  9. Erratum: Comparative Analysis of Some Reliability Characteristics of ...

    African Journals Online (AJOL)

    ... are analyzed using Kolmogorov's forward equation method. Comparisons are performed for specific values of system parameters. Finally, the configurations are ranked based on MTSF and availability (AV(∞)), and the results show that configuration 3 is optimal. Keywords: Reliability, Availability, Deterioration, Repair, Replacement.

  10. Reliability analysis of common hazardous waste treatment processes

    International Nuclear Information System (INIS)

    Waters, R.D.

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption

  11. Reliability analysis of common hazardous waste treatment processes

    Energy Technology Data Exchange (ETDEWEB)

    Waters, Robert D. [Vanderbilt Univ., Nashville, TN (United States)

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.

  12. Reliability of Computer Analysis of Electrocardiograms (ECG) of ...

    African Journals Online (AJOL)

    Background: Computer programmes have been introduced to electrocardiography (ECG) with most physicians in Africa depending on computer interpretation of ECG. This study was undertaken to evaluate the reliability of computer interpretation of the 12-Lead ECG in the Black race. Methodology: Using the SCHILLER ...

  13. Reliability analysis of the epidural spinal cord compression scale.

    Science.gov (United States)

    Bilsky, Mark H; Laufer, Ilya; Fourney, Daryl R; Groff, Michael; Schmidt, Meic H; Varga, Peter Paul; Vrionis, Frank D; Yamada, Yoshiya; Gerszten, Peter C; Kuklo, Timothy R

    2010-09-01

    The evolution of imaging techniques, along with highly effective radiation options, has changed the way metastatic epidural tumors are treated. While high-grade epidural spinal cord compression (ESCC) frequently serves as an indication for surgical decompression, no consensus exists in the literature about the precise definition of this term. The advancement of treatment paradigms in patients with metastatic tumors of the spine requires a clear grading scheme of ESCC. The degree of ESCC often serves as a major determinant in the decision to operate or irradiate. The purpose of this study was to determine the reliability and validity of a 6-point, MR imaging-based grading system for ESCC. To determine the reliability of the grading scale, a survey was distributed to 7 spine surgeons who participate in the Spine Oncology Study Group. The MR images of 25 cervical or thoracic spinal tumors were distributed, each consisting of 1 sagittal image and 3 axial images at the identical level, including T1-weighted, T2-weighted, and Gd-enhanced T1-weighted images. The survey was administered 3 times at 2-week intervals. The inter- and intrarater reliability was assessed. The inter- and intrarater reliability ranged from good to excellent when surgeons were asked to rate the degree of spinal cord compression using T2-weighted axial images. The T2-weighted images were superior indicators of ESCC compared with T1-weighted images with and without Gd. The ESCC scale provides a valid and reliable instrument that may be used to describe the degree of ESCC based on T2-weighted MR images. This scale accounts for recent advances in the treatment of spinal metastases and may be used to provide an ESCC classification scheme for multicenter clinical trial and outcome studies.
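
    For context, agreement of several raters on an ordinal scale such as this one is often summarized with a chance-corrected statistic; the sketch below computes Fleiss' kappa on made-up ratings (the paper itself reports inter- and intrarater reliability, and its exact statistic is not reproduced here).

        import numpy as np

        def fleiss_kappa(ratings, n_categories):
            """ratings: (n_subjects, n_raters) integer category labels in [0, n_categories)."""
            ratings = np.asarray(ratings)
            n_subjects, n_raters = ratings.shape
            # counts[i, c] = number of raters assigning category c to subject i
            counts = np.zeros((n_subjects, n_categories))
            for c in range(n_categories):
                counts[:, c] = (ratings == c).sum(axis=1)
            p_cat = counts.sum(axis=0) / (n_subjects * n_raters)       # overall category proportions
            p_i = (np.sum(counts ** 2, axis=1) - n_raters) / (n_raters * (n_raters - 1))
            p_bar, p_e = p_i.mean(), np.sum(p_cat ** 2)                # observed vs. chance agreement
            return (p_bar - p_e) / (1 - p_e)

        # illustrative only: 25 image sets, 7 raters, a 6-point scale coded 0..5
        rng = np.random.default_rng(1)
        true_grade = rng.integers(0, 6, size=25)
        demo = np.clip(true_grade[:, None] + rng.integers(-1, 2, size=(25, 7)), 0, 5)
        print(f"Fleiss' kappa: {fleiss_kappa(demo, 6):.2f}")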

  14. Time-dependent reliability analysis of nuclear reactor operators using probabilistic network models

    International Nuclear Information System (INIS)

    Oka, Y.; Miyata, K.; Kodaira, H.; Murakami, S.; Kondo, S.; Togo, Y.

    1987-01-01

    Human factors are very important for the reliability of a nuclear power plant. Human behavior has essentially a time-dependent nature. The details of thinking and decision making processes are important for detailed analysis of human reliability. They have, however, not been well considered by the conventional methods of human reliability analysis. The present paper describes the models for the time-dependent and detailed human reliability analysis. Recovery by an operator is taken into account and two-operators models are also presented

  15. Reliability Analysis Study of Digital Reactor Protection System in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Guo, Xiao Ming; Liu, Tao; Tong, Jie Juan; Zhao, Jun

    2011-01-01

    Digital I and C systems are generally believed to improve a plant's safety and reliability. The reliability analysis of digital I and C systems has become a research hotspot. The traditional fault tree method is one means of quantifying digital I and C system reliability. A review of the digital protection system evaluation for the advanced nuclear power plant AP1000 clarifies both the application of fault trees and the analysis process for digital system reliability. A typical digital protection system for an advanced reactor has been developed, and its reliability evaluation is necessary for design demonstration. The construction of this typical digital protection system is introduced in the paper, and the process of applying FMEA and fault tree analysis to its reliability evaluation is described. Reliability data and bypass logic modeling are two points given special attention in the paper. Because time-sequence and feedback factors are not obviously present in the reactor protection system, the dynamic features of the digital system are not discussed

  16. Reliability analysis of stainless steel piping using a single stress corrosion cracking damage parameter

    International Nuclear Information System (INIS)

    Guedri, A.

    2013-01-01

    This article presents the results of an investigation that combines standard methods of fracture mechanics, empirical correlations of stress-corrosion cracking, and probabilistic methods to provide an assessment of Intergranular Stress Corrosion Cracking (IGSCC) of stainless steel piping. This is done by simulating the cracking of stainless steel piping under IGSCC conditions using the general methodology recommended in the modified computer program Piping Reliability Analysis Including Seismic Events, and by characterizing IGSCC using a single damage parameter. Good correlation between the pipe end-of-life probability of leak and the damage values was found. These correlations were later used to generalize this probabilistic fracture model. Also, probability of detection curves and the benefits of in-service inspection in reducing the probability of leak for nuclear piping systems subjected to IGSCC are discussed for several pipe sizes. It was found that greater benefits could be gained from inspections for large pipes than for small pipe sizes. Also, the results indicate that the use of a better inspection procedure can be more effective than a tenfold increase in the number of inspections of inferior quality. -- Highlights: • We simulate the pipe probability of failure under different levels of SCC damage. • The residual stresses are adjusted to calibrate the model. • Good correlations between 40-year cumulative leak probabilities and Dσ are found. • These correlations were used to generalize this probabilistic fracture model. • We assess the effect of inspection procedures and scenarios on leak probabilities

  17. Reliability importance analysis of Markovian systems at steady state using perturbation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Phuc Do Van [Institut Charles Delaunay - FRE CNRS 2848, Systems Modeling and Dependability Group, Universite de technologie de Troyes, 12, rue Marie Curie, BP 2060-10010 Troyes cedex (France); Barros, Anne [Institut Charles Delaunay - FRE CNRS 2848, Systems Modeling and Dependability Group, Universite de technologie de Troyes, 12, rue Marie Curie, BP 2060-10010 Troyes cedex (France)], E-mail: anne.barros@utt.fr; Berenguer, Christophe [Institut Charles Delaunay - FRE CNRS 2848, Systems Modeling and Dependability Group, Universite de technologie de Troyes, 12, rue Marie Curie, BP 2060-10010 Troyes cedex (France)

    2008-11-15

    Sensitivity analysis has been primarily defined for static systems, i.e. systems described by combinatorial reliability models (fault or event trees). Several structural and probabilistic measures have been proposed to assess the components importance. For dynamic systems including inter-component and functional dependencies (cold spare, shared load, shared resources, etc.), and described by Markov models or, more generally, by discrete events dynamic systems models, the problem of sensitivity analysis remains widely open. In this paper, the perturbation method is used to estimate an importance factor, called multi-directional sensitivity measure, in the framework of Markovian systems. Some numerical examples are introduced to show why this method offers a promising tool for steady-state sensitivity analysis of Markov processes in reliability studies.

  18. Reliability importance analysis of Markovian systems at steady state using perturbation analysis

    International Nuclear Information System (INIS)

    Phuc Do Van; Barros, Anne; Berenguer, Christophe

    2008-01-01

    Sensitivity analysis has been primarily defined for static systems, i.e. systems described by combinatorial reliability models (fault or event trees). Several structural and probabilistic measures have been proposed to assess the components importance. For dynamic systems including inter-component and functional dependencies (cold spare, shared load, shared resources, etc.), and described by Markov models or, more generally, by discrete events dynamic systems models, the problem of sensitivity analysis remains widely open. In this paper, the perturbation method is used to estimate an importance factor, called multi-directional sensitivity measure, in the framework of Markovian systems. Some numerical examples are introduced to show why this method offers a promising tool for steady-state sensitivity analysis of Markov processes in reliability studies
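
    A minimal numerical sketch of the underlying idea (not the authors' perturbation estimator): compute the steady-state availability of a small repairable two-unit Markov model and approximate its sensitivity to the failure rate by a finite-difference perturbation. The model structure and rate values are assumptions chosen for illustration.

        import numpy as np

        def steady_state(Q):
            """Stationary distribution of a CTMC with generator Q (rows sum to zero)."""
            n = Q.shape[0]
            A = np.vstack([Q.T, np.ones(n)])        # solve pi Q = 0 together with sum(pi) = 1
            b = np.zeros(n + 1); b[-1] = 1.0
            return np.linalg.lstsq(A, b, rcond=None)[0]

        def availability(lam, mu):
            # states: 0 = both units up, 1 = one unit up, 2 = both units down (system failed)
            Q = np.array([[-2 * lam,      2 * lam,  0.0],
                          [      mu, -(mu + lam),  lam],
                          [     0.0,          mu,  -mu]])
            pi = steady_state(Q)
            return pi[0] + pi[1]                    # available whenever at least one unit is up

        lam, mu = 1e-3, 1e-1                        # assumed failure and repair rates [1/h]
        base = availability(lam, mu)
        h = 1e-6
        sens = (availability(lam + h, mu) - base) / h   # finite-difference sensitivity dA/d(lambda)
        print(f"steady-state availability = {base:.6f}, dA/dlambda ~ {sens:.3f}")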

  19. Development of RBDGG Solver and Its Application to System Reliability Analysis

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2010-01-01

    For the purpose of making system reliability analysis easier and more intuitive, the RBDGG (Reliability Block Diagram with General Gates) methodology was introduced as an extension of the conventional reliability block diagram. The advantage of the RBDGG methodology is that the structure of an RBDGG model is very similar to the actual structure of the analyzed system, and therefore the modeling of a system for system reliability and unavailability analysis becomes very intuitive and easy. The main idea behind the development of the RBDGG methodology is similar to that behind the development of the RGGG (Reliability Graph with General Gates) methodology, which is an extension of the conventional reliability graph. The newly proposed methodology is now implemented into a software tool, RBDGG Solver. RBDGG Solver was developed as a WIN32 console application. RBDGG Solver receives information on the failure modes and failure probabilities of each component in the system, along with the connection structure and connection logic among the components in the system. Based on the received information, RBDGG Solver automatically generates a system reliability analysis model for the system, and then provides the analysis results. In this paper, the application of RBDGG Solver to the reliability analysis of an example system and verification of the calculation results are provided for the purpose of demonstrating how RBDGG Solver is used for system reliability analysis
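
    As background only (this is the conventional reliability block diagram evaluation that RBDGG generalizes, not the RBDGG algorithm itself), series and parallel block structures reduce to simple probability rules, sketched here with hypothetical component reliabilities.

        from functools import reduce

        def series(reliabilities):
            """All blocks must work: R = product of R_i."""
            return reduce(lambda a, b: a * b, reliabilities, 1.0)

        def parallel(reliabilities):
            """At least one block must work: R = 1 - product of (1 - R_i)."""
            return 1.0 - reduce(lambda a, b: a * (1.0 - b), reliabilities, 1.0)

        # hypothetical system: a redundant pump pair in series with a valve and a controller
        pumps = parallel([0.95, 0.95])
        system = series([pumps, 0.99, 0.999])
        print(f"system reliability ~ {system:.5f}")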

  20. The effect of marketing expenses on car sales – an empirical analysis

    Directory of Open Access Journals (Sweden)

    Tudose Mihaela Brînduşa

    2017-01-01

    Full Text Available The paper assesses empirically the relationship between marketing expenditures and sales in a highly competitive industry, namely automotive, by analyzing the marketing spending of Automobile Dacia S.A. The first part of the paper presents the state of the art and discusses previously conducted studies which focus on the structure, dynamics and impact of marketing expenses, while the second part consists of an empirical analysis of Automobile Dacia S.A.'s marketing spending. The results of the study show that the company managed to increase its market share by adopting differentiated marketing for each geographical area. Although the research revealed that the percentage of sales allocated to marketing spending is relatively low (5-6%), the analysis of the cost per unit sold reveals a share of 3% for marketing spending.

  1. Reliability evaluation of nuclear power plants by fault tree analysis

    International Nuclear Information System (INIS)

    Iwao, H.; Otsuka, T.; Fujita, I.

    1993-01-01

    As a work sponsored by the Ministry of International Trade and Industry, the Safety Information Research Center of NUPEC, using reliability data based on the operational experience of domestic LWR plants, has implemented FTA for the standard PWRs and BWRs in Japan, with reactor scram due to system failures as the top event. Up to this point, we have obtained the FT chart and minimal cut sets for each type of system failure for qualitative evaluation, and we have estimated system unavailability, Fussell-Vesely importance and risk worth for components for quantitative evaluation. As the second stage of this series of reliability evaluation work, another program was started to establish a support system. The aim of this system is to assist foreign and domestic plants in creating countermeasures when incidents occur, by providing them with the necessary information using the above analytical method and its results. (author)

  2. Reliability Analysis of Timber Structures through NDT Data Upgrading

    DEFF Research Database (Denmark)

    Sousa, Hélder; Sørensen, John Dalsgaard; Kirkegaard, Poul Henning

    The first part of this document presents, in chapter 2, a description of timber characteristics and commonly used NDT and MDT for timber elements. Stochastic models for timber properties and damage accumulation models are also referred to. According to timber's properties, a framework is proposed for a safety reassessment procedure. For that purpose, a theoretical background for structural reliability assessment, including probabilistic concepts for structural systems and stochastic models, is given in chapter 3. System models, both series and parallel systems, are presented, and methods for robustness are dealt with in chapter 5. The second part of this document begins in chapter 6, where a practical application of the premise definitions and methodologies is given through the implementation of upgraded models with NDT and MDT data. Structural life-cycle is, therefore, assessed and reliability...

  3. Reliability analysis of numerical simulation in near field behavior

    International Nuclear Information System (INIS)

    Kobayashi, Akira; Yamamoto, Kiyohito; Chijimatsu, Masakazu; Fujita, Tomoo

    2008-01-01

    The effects of uncertainties in the boundary conditions, the elastic modulus and Poisson's ratio on the mechanical behavior in the near field of a high-level radioactive waste repository were examined. The method used to examine the error propagation was the first-order second-moment method. The reliability of the maximum principal stress and the maximum shear stress at the crown of the tunnel and of the minimum principal stress at the spring line was examined for one million years. For the elastic model, the reliability of the maximum shear stress gradually decreased while that of the maximum principal stress increased. That of the minimum principal stress was relatively low for one million years. This tendency was similar to that from the damage model. (author)

  4. Empowering Kanban through TPS-Principles - An Empirical Analysis of the Toyota Production System

    OpenAIRE

    Thun , Jörn-Henrik; Drüke , Martin; Grübner , Andre

    2010-01-01

    The purpose of this paper is the empirical investigation of the Toyota Production System in order to test existing relationships as they are proposed in theory. The underlying model consists of seven factors reflecting the key practices of the Toyota Production System. Using data from 188 manufacturing plants participating in the High Performance Manufacturing research project, the model's measurement characteristics were validated through confirmatory factor analysis. Pat...

  5. Risk and Protective Factors of Internet Addiction: A Meta-Analysis of Empirical Studies in Korea

    OpenAIRE

    Koo, Hoon Jung; Kwon, Jung-Hye

    2014-01-01

    Purpose: A meta-analysis of empirical studies performed in Korea was conducted to systematically investigate the associations between the indices of Internet addiction (IA) and psychosocial variables. Materials and Methods: Systematic literature searches were carried out using the Korean Studies Information Service System, Research Information Sharing Service, Science Direct, Google Scholar, and references in review articles. The key words were Internet addiction, (Internet) game addiction, and...

  6. Empirical evidence from an inter-industry descriptive analysis of overall materiality measures

    OpenAIRE

    N. Pecchiari; C. Emby; G. Pogliani

    2013-01-01

    This study presents an empirical cross-industry descriptive analysis of overall quantitative materiality measures. We examine the behaviour of four commonly used quantitative materiality measures within and across industries with respect to their size, relative size and stability, over ten years. The sample consists of large- and medium-sized European companies, representing 24 different industry categories for the years 1998 through 2007 (a total sample of over 36,000 data points). Our resul...

  7. Characterizing Social Interaction in Tobacco-Oriented Social Networks: An Empirical Analysis

    OpenAIRE

    Liang, Yunji; Zheng, Xiaolong; Zeng, Daniel Dajun; Zhou, Xingshe; Leischow, Scott James; Chung, Wingyan

    2015-01-01

    Social media is becoming a new battlefield for tobacco 'wars'. Evaluating the current situation is very crucial for the advocacy of tobacco control in the age of social media. To reveal the impact of tobacco-related user-generated content, this paper characterizes user interaction and social influence utilizing social network analysis and information theoretic approaches. Our empirical studies demonstrate that the exploding pro-tobacco content has long-lasting effects with more active users a...

  8. Establishment of Grain Farmers' Supply Response Model and Empirical Analysis under Minimum Grain Purchase Price Policy

    OpenAIRE

    Zhang, Shuang

    2012-01-01

    Based on farmers' supply behavior theory and price expectations theory, this paper establishes a grain farmers' supply response model for two major grain varieties (early indica rice and mixed wheat) in the major producing areas, to test whether the minimum grain purchase price policy can have a price-oriented effect on grain production and supply in the major producing areas. Empirical analysis shows that the minimum purchase price published annually by the government has significant positive imp...

  9. An ACE-based Nonlinear Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Hilger, Klaus Baggesen; Nielsen, Allan Aasbjerg; Andersen, Ole

    2001-01-01

    This paper shows the application of the empirical orthogonal functions/principal component transformation on global sea surface height and temperature data from 1996 and 1997. A nonlinear correlation analysis of the transformed data is proposed and performed by applying the alternating conditional expectations (ACE) algorithm. New canonical variates are found that indicate that the highest correlation between ocean temperature and height is associated with the build-up of the El Niño during the last half of 1997.

  10. An Empirical Study Based on the SPSS Variance Analysis of College Teachers' Sports Participation and Satisfaction

    OpenAIRE

    Yunqiu Liang

    2013-01-01

    This study empirically examines the relationship between university teachers' sports participation and their job satisfaction, taking teachers' participation in group sports activities as the object of study and using questionnaire investigation and mathematical-statistical analysis in SPSS. The results show that teachers who participate in sports groups report higher job satisfaction than those who do not, and that job satisfaction differs with the form of sports participation. Recommendations for college teachers to address...

  11. An empirical analysis of the relationship between the consumption of alcohol and liver cirrhosis mortality

    DEFF Research Database (Denmark)

    Bentzen, Jan Børsen; Smith, Valdemar

    The question whether intake of alcohol is associated with liver cirrhosis mortality is analyzed using aggregate data for alcohol consumption, alcohol related diseases and alcohol policies of 16 European countries. The empirical analysis gives support to a close association between cirrhosis mortality and intake of alcohol - and the latter also concerns each of the specific beverages, i.e. spirits, wine and beer, where other studies usually only find evidence of spirits and wine related to liver cirrhosis mortality.

  12. Reliability Analysis of Public Survey in Satisfaction with Nuclear Safety

    Energy Technology Data Exchange (ETDEWEB)

    Park, Moon Soo; Moon, Joo Hyun; Kang, Chang Sun [Seoul National Univ., Seoul (Korea, Republic of)

    2005-07-01

    Korea Institute of Nuclear Safety (KINS) carried out a questionnaire survey on the public's understanding of nuclear safety and regulation in order to gauge public acceptance of nuclear energy. The survey was planned to help analyze public opinion on nuclear energy and to provide basic data for advertising strategy and policy development. In this study, based on the results of the survey, the reliability of the survey was evaluated for each nuclear site.

  13. Reliability analysis of Airbus A-330 computer flight management system

    OpenAIRE

    Fajmut, Metod

    2010-01-01

    This diploma thesis deals with the digitized, computerized »fly-by-wire« flight control system and the security aspects of the computer system of the Airbus A330 aircraft. As in space and military aircraft structures, a large part of the financial investment in commercial airplanes is devoted to reliability. Conventional aircraft control systems relied, and some still rely, on mechanical and hydraulic connections between the controls operated by the pilot and the control surfaces. But newer a...

  14. Reliability Analysis of Public Survey in Satisfaction with Nuclear Safety

    International Nuclear Information System (INIS)

    Park, Moon Soo; Moon, Joo Hyun; Kang, Chang Sun

    2005-01-01

    Korea Institute of Nuclear Safety (KINS) carried out a questionnaire survey on the public's understanding of nuclear safety and regulation in order to gauge public acceptance of nuclear energy. The survey was planned to help analyze public opinion on nuclear energy and to provide basic data for advertising strategy and policy development. In this study, based on the results of the survey, the reliability of the survey was evaluated for each nuclear site.

  15. Launch and Assembly Reliability Analysis for Human Space Exploration Missions

    Science.gov (United States)

    Cates, Grant; Gelito, Justin; Stromgren, Chel; Cirillo, William; Goodliff, Kandyce

    2012-01-01

    NASA's future human space exploration strategy includes single and multi-launch missions to various destinations including cis-lunar space, near Earth objects such as asteroids, and ultimately Mars. Each campaign is being defined by Design Reference Missions (DRMs). Many of these missions are complex, requiring multiple launches and assembly of vehicles in orbit. Certain missions also have constrained departure windows to the destination. These factors raise concerns regarding the reliability of launching and assembling all required elements in time to support planned departure. This paper describes an integrated methodology for analyzing launch and assembly reliability in any single DRM or set of DRMs starting with flight hardware manufacturing and ending with final departure to the destination. A discrete event simulation is built for each DRM that includes the pertinent risk factors including, but not limited to: manufacturing completion; ground transportation; ground processing; launch countdown; ascent; rendezvous and docking, assembly, and orbital operations leading up to trans-destination-injection. Each reliability factor can be selectively activated or deactivated so that the most critical risk factors can be identified. This enables NASA to prioritize mitigation actions so as to improve mission success.
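
    A toy Monte Carlo sketch of this kind of campaign analysis is shown below; the number of launches, probabilities, delays and window length are invented placeholders rather than NASA values, and the model collapses the many risk factors listed above into just launch success and schedule slip.

        import random

        random.seed(42)

        # invented placeholder parameters for a 3-launch assembly campaign
        N_LAUNCHES = 3
        P_LAUNCH_SUCCESS = 0.97        # probability each ascent succeeds
        P_DELAY = 0.30                 # probability a launch slips
        MEAN_DELAY_DAYS = 20.0         # mean slip when a delay occurs (exponential)
        NOMINAL_SPACING_DAYS = 30.0    # planned spacing between launches
        WINDOW_DAYS = 150.0            # time from first planned launch to departure window close

        def simulate_campaign():
            t = 0.0
            for _ in range(N_LAUNCHES):
                if random.random() < P_DELAY:
                    t += random.expovariate(1.0 / MEAN_DELAY_DAYS)
                if random.random() > P_LAUNCH_SUCCESS:
                    return False               # loss of an element ends the campaign
                t += NOMINAL_SPACING_DAYS
            return t <= WINDOW_DAYS            # all elements up before the window closes

        trials = 100_000
        hits = sum(simulate_campaign() for _ in range(trials))
        print(f"estimated probability of on-time departure: {hits / trials:.3f}")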

  16. Reliability analysis of a complex standby redundant systems

    International Nuclear Information System (INIS)

    Subramanian, R.; Anantharaman, V.

    1995-01-01

    In any redundant system, the state of the standby unit is usually taken to be hot, warm or cold. In this paper, we present a new model of a two unit standby system wherein the standby unit is put in cold state for a certain amount of time before it is allowed to become warm. Upon failure of the online unit, the standby unit, if in warm state, instantaneously starts operating online; if it is in cold state, an emergency switching is made which takes it to warm state (and hence online) either instantaneously or non-instantaneously--each with some probability; if it is under repair, the system breaks down. Assuming all the associated distributions to be general except that of the life time of the standby unit in the warm state, various reliability characteristics that are of interest to reliability engineers and system designers are derived. A comprehensive cost function is also constructed and is then optimized with respect to three different control parameters numerically. In addition numerical results are presented to illustrate the behaviour of the various reliability characteristics derived

  17. Assessment of modern methods of human factor reliability analysis in PSA studies

    International Nuclear Information System (INIS)

    Holy, J.

    2001-12-01

    The report is structured as follows: Classical terms and objects (Probabilistic safety assessment as a framework for human reliability assessment; Human failure within the PSA model; Basic types of operator failure modelled in a PSA study and analyzed by HRA methods; Qualitative analysis of human reliability; Quantitative analysis of human reliability used; Process of analysis of nuclear reactor operator reliability in a PSA study); New terms and objects (Analysis of dependences; Errors of omission; Errors of commission; Error forcing context); and Overview and brief assessment of human reliability analysis (Basic characteristics of the methods; Assets and drawbacks of the use of each of HRA method; History and prospects of the use of the methods). (P.A.)

  18. Bayesian belief networks for human reliability analysis: A review of applications and gaps

    International Nuclear Information System (INIS)

    Mkrtchyan, L.; Podofillini, L.; Dang, V.N.

    2015-01-01

    The use of Bayesian Belief Networks (BBNs) in risk analysis (and in particular Human Reliability Analysis, HRA) is fostered by a number of features, attractive in fields with a shortage of data and consequent reliance on subjective judgments: the intuitive graphical representation, the possibility of combining diverse sources of information, the use of the probabilistic framework to characterize uncertainties. In HRA, BBN applications are steadily increasing, each emphasizing a different BBN feature or a different HRA aspect to improve. This paper aims at a critical review of these features as well as at suggesting research needs. Five groups of BBN applications are analysed: modelling of organizational factors, analysis of the relationships among failure influencing factors, BBN-based extensions of existing HRA methods, dependency assessment among human failure events, assessment of situation awareness. Further, the paper analyses the process for building BBNs and in particular how expert judgment is used in the assessment of the BBN conditional probability distributions. The gaps identified in the review suggest the need for establishing more systematic frameworks to integrate the different sources of information relevant for HRA (cognitive models, empirical data, and expert judgment) and to investigate algorithms to avoid elicitation of many relationships via expert judgment. - Highlights: • We analyze BBN uses for HRA applications, but some conclusions can be generalized. • Special review focus on BBN building approaches, key for model acceptance. • Gaps relate to the transparency of the BBN building and quantification phases. • Need for more systematic framework to integrate different sources of information. • Need of ways to avoid elicitation of many relationships via expert judgment

  19. Optimizing the design and operation of reactor emergency systems using reliability analysis techniques

    International Nuclear Information System (INIS)

    Snaith, E.R.

    1975-01-01

    Following a reactor trip, various reactor emergency systems, e.g. essential power supplies, emergency core cooling and boiler feed water arrangements, are required to operate with a high degree of reliability. These systems must therefore be critically assessed to confirm their capability of operation and determine their reliability of performance. The use of probability analysis techniques enables the potential operating reliability of the systems to be calculated, and this can then be compared with the overall reliability requirements. However, a system reliability analysis does much more than calculate an overall reliability value for the system. It establishes the reliability of all parts of the system and thus identifies the most sensitive areas of unreliability. This indicates the areas where any required improvements should be made and enables the overall system designs and modes of operation to be optimized, to meet the system and hence the overall reactor safety criteria. This paper gives specific examples of sensitive areas of unreliability that were identified as a result of a reliability analysis that was carried out on a reactor emergency core cooling system. Details are given of modifications to design and operation that were implemented, with a resulting improvement in reliability of various reactor sub-systems. The report concludes that an initial calculation of system reliability should represent only the beginning of a continuing process of system assessment. Data on equipment and system performance, particularly in those areas shown to be sensitive in their effect on the overall nuclear power plant reliability, should be collected and processed to give reliability data. These data should then be applied in further probabilistic analyses and the results correlated with the original analysis. This will demonstrate whether the required and the originally predicted system reliability is likely to be achieved, in the light of the actual history to date of

  20. Reliability Worth Analysis of Distribution Systems Using Cascade Correlation Neural Networks

    DEFF Research Database (Denmark)

    Heidari, Alireza; Agelidis, Vassilios; Pou, Josep

    2018-01-01

    Reliability worth analysis is of great importance in the area of distribution network planning and operation. The reliability worth's precision can be affected greatly by the customer interruption cost model used. The choice of the cost models can change system and load point reliability indices....... In this study, a cascade correlation neural network is adopted to further develop two cost models comprising a probabilistic distribution model and an average or aggregate model. A contingency-based analytical technique is adopted to conduct the reliability worth analysis. Furthermore, the possible effects...

  1. Reliability analysis of reactor systems by applying probability method; Analiza pouzdanosti reaktorskih sistema primenom metoda verovatnoce

    Energy Technology Data Exchange (ETDEWEB)

    Milivojevic, S [Institute of Nuclear Sciences Boris Kidric, Vinca, Beograd (Serbia and Montenegro)

    1974-12-15

    The probability method chosen for analysing reactor system reliability is considered realistic since it is based on verified experimental data; in effect it is a statistical method. The probability method developed takes into account the probability distribution of permitted levels of relevant parameters and their particular influence on the reliability of the system as a whole. The proposed method is rather general, and was used for the problem of thermal safety analysis of a reactor system. This analysis makes it possible to examine the basic properties of the system under different operating conditions; expressed in the form of probabilities, the results show the reliability of the system as a whole as well as the reliability of each component.
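
    One way to read this approach, sketched with assumed numbers only: treat the relevant thermal parameter and its permitted level as random variables and estimate the probability that the operating value stays within the limit.

        import random

        random.seed(7)

        # assumed illustrative distributions: operating clad temperature vs. permitted limit [deg C]
        def operating_temp():
            return random.gauss(310.0, 12.0)

        def permitted_limit():
            return random.gauss(360.0, 5.0)

        trials = 200_000
        ok = sum(operating_temp() < permitted_limit() for _ in range(trials))
        print(f"estimated thermal reliability: {ok / trials:.5f}")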

  2. Reliability analysis of the automatic control and power supply of reactor equipment

    International Nuclear Information System (INIS)

    Monori, Pal; Nagy, J.A.; Meszaros, Zoltan; Konkoly, Laszlo; Szabo, Antal; Nagy, Laszlo

    1988-01-01

    Based on reliability analysis, the shortcomings of nuclear facilities are discovered. Fault tree types constructed for the technology of automatic control and for power supply serve as input data of the ORCHARD 2 computer code. In order to characterize the reliability of the system, availability, failure rates and time intervals between failures are calculated. The results of the reliability analysis of the feedwater system of the Paks Nuclear Power Plant showed that the system consisted of elements of similar reliabilities. (V.N.) 8 figs.; 3 tabs

  3. Using a Hybrid Cost-FMEA Analysis for Wind Turbine Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Nacef Tazi

    2017-02-01

    Full Text Available Failure mode and effects analysis (FMEA) has been proven to be an effective methodology to improve system design reliability. However, the standard approach reveals some weaknesses when applied to wind turbine systems. The conventional criticality assessment method has been criticized as having many limitations, such as the weighting of severity and detection factors. In this paper, we aim to overcome these drawbacks and develop a hybrid cost-FMEA by integrating cost factors to assess the criticality; these costs vary from replacement costs to expected failure costs. Then, a quantitative comparative study is carried out to point out the average failure rate, main causes of failure, expected failure costs and failure detection techniques. A special reliability analysis of the gearbox and rotor blades is presented.
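
    A compact sketch of the cost-weighted criticality idea (the failure modes, rates and costs below are illustrative, not the paper's data): each failure mode is ranked by failure rate times expected failure cost instead of by severity and detection scores.

        # illustrative wind-turbine failure modes: (name, failures per turbine-year, cost per failure in EUR)
        failure_modes = [
            ("gearbox bearing damage", 0.10, 230_000),
            ("rotor blade crack",      0.15,  90_000),
            ("converter fault",        0.40,  15_000),
            ("pitch system fault",     0.25,  12_000),
        ]

        # cost-based criticality = failure rate x expected failure cost (EUR per turbine-year)
        ranked = sorted(((rate * cost, name) for name, rate, cost in failure_modes), reverse=True)
        for crit, name in ranked:
            print(f"{name:<24s} expected annual cost ~ {crit:10,.0f} EUR")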

  4. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eom, H S; Kim, J H; Lee, J C; Choi, Y R; Moon, S S

    2000-12-01

    Reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system which is under development in our current project. The contents of this report are: 1. Survey of reliability and safety analysis techniques - the reviewed reliability and safety analysis techniques are generally accepted in many industries, including the nuclear industry, and we selected a few techniques which are suitable for our robot system: fault tree analysis, failure mode and effect analysis, reliability block diagram, Markov model, combinational method, and simulation method. 2. Survey of the characteristics of robot systems which distinguish them from other systems and which are important to the analysis. 3. Survey of the nuclear environmental factors which affect the reliability and safety analysis of the robot system. 4. Collection of case studies of robot reliability and safety analysis performed in foreign countries. The analysis results of this survey will be applied to the improvement of the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.

  5. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    International Nuclear Information System (INIS)

    Eom, H.S.; Kim, J.H.; Lee, J.C.; Choi, Y.R.; Moon, S.S.

    2000-12-01

    The reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system which is under development in our current project. The contents of this report are: 1. Survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries, including the nuclear industry, and we selected a few that are suitable for our robot system: fault tree analysis, failure mode and effects analysis, reliability block diagram, Markov model, the combinational method, and the simulation method. 2. Survey of the characteristics of robot systems which distinguish them from other systems and which are important to the analysis. 3. Survey of the nuclear environmental factors which affect the reliability and safety analysis of the robot system. 4. Collection of case studies of robot reliability and safety analyses performed in foreign countries. The analysis results of this survey will be applied to improve the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system

  6. Assessment of autonomic nervous system by using empirical mode decomposition-based reflection wave analysis during non-stationary conditions

    International Nuclear Information System (INIS)

    Chang, C C; Hsiao, T C; Kao, S C; Hsu, H Y

    2014-01-01

    Arterial blood pressure (ABP) is an important indicator of cardiovascular circulation and reflects various intrinsic regulatory mechanisms. It has been found that the intrinsic characteristics of blood vessels can be assessed quantitatively by ABP analysis (called reflection wave analysis (RWA)), but conventional RWA is insufficient for assessment during non-stationary conditions, such as the Valsalva maneuver. Recently, a novel adaptive method called empirical mode decomposition (EMD) was proposed for non-stationary data analysis. This study proposes an RWA algorithm based on EMD (EMD-RWA). A total of 51 subjects participated in this study, including 39 healthy subjects and 12 patients with autonomic nervous system (ANS) dysfunction. The results showed that EMD-RWA provided a reliable estimation of reflection time at baseline and during head-up tilt (HUT). Moreover, the estimated reflection time is able to assess ANS function non-invasively, both in normal, healthy subjects and in patients with ANS dysfunction. EMD-RWA provides a new approach to reflection time estimation in non-stationary conditions, and also supports non-invasive ANS assessment. (paper)
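    The EMD-RWA algorithm itself is not reproduced here; the sketch below shows only a crude single-IMF sifting step of empirical mode decomposition on a synthetic signal, to convey the idea of separating fast and slow components. A mature implementation (e.g. the PyEMD package) would normally be used instead.

```python
import numpy as np
from scipy.signal import argrelextrema
from scipy.interpolate import CubicSpline

def first_imf(signal, t, n_sift=10):
    """Crude single-IMF sifting; real EMD iterates a stopping criterion and extracts several IMFs."""
    h = signal.copy()
    for _ in range(n_sift):
        maxima = argrelextrema(h, np.greater)[0]
        minima = argrelextrema(h, np.less)[0]
        if len(maxima) < 2 or len(minima) < 2:
            break
        upper = CubicSpline(t[maxima], h[maxima])(t)   # upper envelope through local maxima
        lower = CubicSpline(t[minima], h[minima])(t)   # lower envelope through local minima
        h = h - 0.5 * (upper + lower)                  # subtract the local mean envelope
    return h

t = np.linspace(0, 10, 2000)
x = np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.sin(2 * np.pi * 0.1 * t)  # fast + slow component
imf1 = first_imf(x, t)   # should roughly recover the fast oscillation
```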

  7. Calibrating a combined energy systems analysis and controller design method with empirical data

    International Nuclear Information System (INIS)

    Murphy, Gavin Bruce; Counsell, John; Allison, John; Brindley, Joseph

    2013-01-01

    The drive towards low-carbon construction has seen buildings increasingly utilise many different energy systems simultaneously to control the human comfort of the indoor environment, such as ventilation with heat recovery, various heating solutions and applications of renewable energy. This paper describes a dynamic modelling and simulation method (IDEAS – Inverse Dynamics based Energy Assessment and Simulation) for analysing the energy utilisation of a building and its complex servicing systems. The IDEAS case study presented in this paper is based upon small perturbation theory and can be used for the analysis of the performance of complex energy systems and also for the design of smart control systems. The paper presents a process by which any dynamic model can be calibrated against a more empirically based data model, in this case the UK Government's SAP (Standard Assessment Procedure). The intended audiences of this work are building simulation experts analysing the energy use of a building and control engineers designing smart control systems for dwellings. The calibration process presented is transferable and can assist simulation experts in calibrating any dynamic building simulation method against an empirically based method. - Highlights: • Presentation of an energy systems analysis method for assessing the energy utilisation of buildings and their complex servicing systems. • An inverse dynamics based controller design method is detailed. • Method of how a dynamic model can be calibrated with an empirically based model
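    A minimal sketch of the calibration idea, assuming the dynamic model can be reduced to a function of one uncertain parameter that is tuned until the model reproduces an empirical benchmark figure; the benchmark value, parameter and stand-in model are hypothetical, not the IDEAS or SAP formulations.

```python
from scipy.optimize import minimize_scalar

# Hypothetical benchmark: annual heating demand from an empirical method (a SAP-like figure).
benchmark_kwh = 9500.0

def dynamic_model_annual_demand(heat_loss_coeff):
    """Stand-in for a dynamic simulation: demand grows with the heat-loss coefficient."""
    degree_hours = 60_000.0                           # illustrative annual heating degree-hours
    return heat_loss_coeff * degree_hours / 1000.0    # kWh

# Calibrate the uncertain parameter so the dynamic model reproduces the empirical figure.
res = minimize_scalar(lambda k: (dynamic_model_annual_demand(k) - benchmark_kwh) ** 2,
                      bounds=(0.05, 1.0), method="bounded")
print(f"calibrated heat-loss coefficient = {res.x:.3f} kW/K")
```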

  8. Summary of component reliability data for probabilistic safety analysis of Korean standard nuclear power plant

    International Nuclear Information System (INIS)

    Choi, S. Y.; Han, S. H.

    2004-01-01

    Reliability data for Korean NPPs that reflect plant-specific characteristics are necessary for the PSA of Korean nuclear power plants. We have performed a study to develop a component reliability database and software for component reliability analysis. Using this system, we collected component operation data and failure/repair data from the start of plant operation up to 1998 and 2000 for YGN 3,4 and UCN 3,4, respectively. Recently, we upgraded the database by collecting additional data up to 2002 for Korean standard nuclear power plants and performed the component reliability analysis and Bayesian analysis again. In this paper, we present a summary of the component reliability data for the probabilistic safety analysis of the Korean standard nuclear power plant and describe the plant-specific characteristics compared to generic data
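    A minimal sketch of the kind of Bayesian update mentioned above, assuming a Gamma prior on a constant failure rate combined with hypothetical plant-specific failure counts and exposure times; the hyperparameters and data are illustrative, not the Korean plant values.

```python
from scipy.stats import gamma

# Generic prior for a component failure rate, expressed as a Gamma(alpha, beta) distribution
# (illustrative hyperparameters).
alpha0, beta0 = 0.5, 1.0e5          # prior shape, prior exposure time in hours

# Hypothetical plant-specific evidence collected for the database.
failures, exposure_hours = 3, 4.0e5

# Conjugate update: posterior is Gamma(alpha0 + failures, beta0 + exposure).
alpha1, beta1 = alpha0 + failures, beta0 + exposure_hours
posterior = gamma(a=alpha1, scale=1.0 / beta1)

print(f"posterior mean failure rate = {posterior.mean():.2e} /h")
print(f"90% interval = ({posterior.ppf(0.05):.2e}, {posterior.ppf(0.95):.2e}) /h")
```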

  9. An empirical analysis of the relationship between cost of control activities and project management success

    Directory of Open Access Journals (Sweden)

    Al-Tmeemy Samiaah

    2018-01-01

    Full Text Available To achieve the objectives of continuous improvement programs, construction managers must link the achievement of quality with cost. This paper aims to associate project management success (PMS) with cost of control (COC) activities in an empirical manner. Thus, the purpose is to determine the extent to which COC activities impact PMS. A quantitative method was adopted to collect data from Malaysian building companies using postal and email surveys. The hypothesis is tested using correlation and simple linear regression analysis. The findings of this study indicate that COC activities are positively associated with PMS. The empirical evidence obtained from this research provides financial justification for all quality improvement efforts. This can assist building contractors to enhance the success of project management by reducing the level of business failures due to poor quality, cost overruns and delays.
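    A minimal sketch of the correlation and simple linear regression analysis described, using invented survey scores for cost-of-control (COC) activity and project management success (PMS).

```python
import numpy as np
from scipy.stats import pearsonr, linregress

# Hypothetical survey scores: cost-of-control activity level vs project management success.
coc = np.array([2.1, 3.4, 2.8, 4.0, 3.1, 4.5, 2.5, 3.8])
pms = np.array([2.8, 3.6, 3.0, 4.2, 3.3, 4.6, 2.9, 4.0])

r, p_value = pearsonr(coc, pms)
fit = linregress(coc, pms)

print(f"Pearson r = {r:.2f} (p = {p_value:.3f})")
print(f"PMS = {fit.intercept:.2f} + {fit.slope:.2f} * COC, R^2 = {fit.rvalue**2:.2f}")
```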

  10. Tools for Empirical and Operational Analysis of Mobile Offloading in Loop-Based Applications

    Directory of Open Access Journals (Sweden)

    Alexandru-Corneliu OLTEANU

    2013-01-01

    Full Text Available Offloading for mobile devices is an increasingly popular research topic, matching the popularity mobile devices have in the general population. Studying mobile offloading is challenging because of device and application heterogeneity. However, we believe that focusing on a specific type of application can bring advances in offloading for mobile devices, while still keeping a wide range of applicability. In this paper we focus on loop-based applications, in which most of the functionality is given by iterating an execution loop. We model the main loop of the application with a graph that consists of a cycle and propose an operational analysis to study offloading on this model. We also propose a testbed based on a real-world application to empirically evaluate offloading. We conduct performance evaluation using both tools and compare the analytical and empirical results.

  11. Joint production and corporate pricing: An empirical analysis of joint products in the petroleum industry

    International Nuclear Information System (INIS)

    Karimnejad, H.

    1990-01-01

    This dissertation investigates the pricing mechanism of joint products in large multi-plant and multi-product corporations. The primary objective of this dissertation is to show the consistency of classical theories of production with the corporate pricing of joint products. The dissertation has two major parts. Part One provides a theoretical framework for joint production and corporate pricing. In this part, joint production is defined and its historical treatment by classical and contemporary economists is analyzed. Part Two conducts an empirical analysis of joint products in the US petroleum industry. Methods of cost allocation are used in the pricing of each individual petroleum product. Three methods are employed to distribute joint production costs to individual petroleum products: the sales value method, the barrel gravity method and the average unit cost method. The empirical findings of the dissertation provide useful guidelines for the pricing policies of large multi-product corporations
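    Of the three allocation methods named, the sales value method is sketched below: the joint production cost is distributed to products in proportion to their sales value. Product names, volumes, prices and the joint cost are illustrative.

```python
# Sales value method: allocate the joint refining cost in proportion to each product's sales value.
joint_cost = 1_000_000.0   # illustrative joint production cost

# Hypothetical output volumes (barrels) and market prices per barrel.
products = {"gasoline": (40_000, 95.0), "diesel": (30_000, 88.0), "jet_fuel": (10_000, 100.0)}

sales_values = {p: qty * price for p, (qty, price) in products.items()}
total_sales = sum(sales_values.values())

for p, sv in sales_values.items():
    allocated = joint_cost * sv / total_sales
    unit_cost = allocated / products[p][0]
    print(f"{p:9s} allocated cost = {allocated:10.0f}, unit cost = {unit_cost:.2f} per barrel")
```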

  12. Canonical Least-Squares Monte Carlo Valuation of American Options: Convergence and Empirical Pricing Analysis

    Directory of Open Access Journals (Sweden)

    Xisheng Yu

    2014-01-01

    Full Text Available The paper by Liu (2010) introduces a method termed canonical least-squares Monte Carlo (CLM), which combines a martingale-constrained entropy model and a least-squares Monte Carlo algorithm to price American options. In this paper, we first provide the convergence results of CLM and numerically examine its convergence properties. Then, a comparative analysis is conducted empirically using a large sample of S&P 100 Index (OEX) puts and IBM puts. The results on convergence show that choosing the shifted Legendre polynomials with four regressors is more appropriate considering the pricing accuracy and the computational cost. With this choice, the CLM method is empirically demonstrated to be superior to the benchmark methods of the binomial tree and finite differences with historical volatilities.
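    A minimal sketch of the least-squares regression step that CLM shares with a plain Longstaff-Schwartz valuation, using a Legendre polynomial basis; the martingale-constrained entropy weighting of CLM is not reproduced, and all market inputs are invented.

```python
import numpy as np
from numpy.polynomial.legendre import legvander

rng = np.random.default_rng(1)

# One regression step of a least-squares Monte Carlo valuation of an American put
# (plain Longstaff-Schwartz style; CLM's entropy-based risk-neutral weights are omitted).
K, r, dt = 100.0, 0.03, 1.0 / 50.0
S_t = rng.lognormal(mean=np.log(100.0), sigma=0.2, size=10_000)          # simulated prices at this step
V_next = np.maximum(K - S_t * rng.lognormal(0.0, 0.03, S_t.size), 0.0)   # stand-in value at the next step

itm = (K - S_t) > 0.0                                  # regress on in-the-money paths only
lo, hi = S_t[itm].min(), S_t[itm].max()
x = 2.0 * (S_t[itm] - lo) / (hi - lo) - 1.0            # map prices onto [-1, 1] for the basis
basis = legvander(x, 4)                                # Legendre polynomials up to degree 4
coef, *_ = np.linalg.lstsq(basis, np.exp(-r * dt) * V_next[itm], rcond=None)

continuation = basis @ coef                            # estimated continuation value
exercise = K - S_t[itm]                                # immediate exercise value
value = np.where(exercise > continuation, exercise, np.exp(-r * dt) * V_next[itm])
print(f"mean value on in-the-money paths: {value.mean():.3f}")
```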

  13. Reliability analysis of the solar array based on Fault Tree Analysis

    International Nuclear Information System (INIS)

    Wu Jianing; Yan Shaoze

    2011-01-01

    The solar array is an important device used in the spacecraft, which influences the quality of in-orbit operation of the spacecraft and even the launches. This paper analyzes the reliability of the mechanical system and identifies the most vital subsystem of the solar array. The fault tree analysis (FTA) model is established according to the operating process of the mechanical system of the DFH-3 satellite; the logical expression of the top event is obtained by Boolean algebra and the reliability of the solar array is calculated. The conclusion shows that the hinges are the most vital links between the solar arrays. By analyzing the structural importance (SI) of the hinge's FTA model, some fatal causes, including faults of the seal, insufficient torque of the locking spring, temperature in space, and friction force, can be identified. Damage is the initial stage of a fault, so limiting damage is significant for preventing faults. Furthermore, recommendations for improving reliability through damage limitation are discussed, which can be used for redesigning the solar array and for reliability growth planning.
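    A minimal sketch of the gate arithmetic behind such a fault tree evaluation for independent basic events; the basic-event probabilities and tree structure below are hypothetical, not the DFH-3 data.

```python
# Minimal fault-tree gate arithmetic for independent basic events.
def or_gate(*p):
    """Event occurs if any input event occurs."""
    q = 1.0
    for x in p:
        q *= (1.0 - x)
    return 1.0 - q

def and_gate(*p):
    """Event occurs only if all input events occur."""
    q = 1.0
    for x in p:
        q *= x
    return q

# Illustrative basic-event probabilities for a deployment hinge (not the DFH-3 data).
p_seal_fault    = 1e-3
p_spring_torque = 5e-4
p_thermal       = 2e-4
p_friction      = 8e-4

p_hinge_fail = or_gate(p_seal_fault, p_spring_torque, p_thermal, p_friction)
p_top = or_gate(p_hinge_fail, p_hinge_fail)   # e.g. either of two hinges failing blocks deployment
print(f"P(hinge fails) = {p_hinge_fail:.2e}, P(top event) = {p_top:.2e}")
```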

  14. Reliability analysis of the solar array based on Fault Tree Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wu Jianing; Yan Shaoze, E-mail: yansz@mail.tsinghua.edu.cn [State Key Laboratory of Tribology, Department of Precision Instruments and Mechanology, Tsinghua University,Beijing 100084 (China)

    2011-07-19

    The solar array is an important device used in the spacecraft, which influences the quality of in-orbit operation of the spacecraft and even the launches. This paper analyzes the reliability of the mechanical system and identifies the most vital subsystem of the solar array. The fault tree analysis (FTA) model is established according to the operating process of the mechanical system of the DFH-3 satellite; the logical expression of the top event is obtained by Boolean algebra and the reliability of the solar array is calculated. The conclusion shows that the hinges are the most vital links between the solar arrays. By analyzing the structural importance (SI) of the hinge's FTA model, some fatal causes, including faults of the seal, insufficient torque of the locking spring, temperature in space, and friction force, can be identified. Damage is the initial stage of a fault, so limiting damage is significant for preventing faults. Furthermore, recommendations for improving reliability through damage limitation are discussed, which can be used for redesigning the solar array and for reliability growth planning.

  15. Design and reliability analysis of a novel laser acupuncture device

    Science.gov (United States)

    Pan, Boan; Zhong, Fulin; Zhao, Ke; Li, Ting

    2018-02-01

    Acupuncture has a long history of more than 2000 years in China. However, traditional acupuncture uses metallic needles, which may cause discomfort and pricking pain for patients. Laser acupuncture (LA) is a non-invasive and painless way to achieve certain therapeutic effects, and compared to traditional acupuncture, LA carries no risk of infection. Taking these advantages of LA into consideration, we developed a portable laser acupuncture device combining a therapy part and a detection part. The therapy part delivers laser light at a wavelength of 650 nm onto specific acupoints of patients. The detection part includes an integrated light-emitting diode (LED, 735/805/850 nm) and a photodiode (OPT101), and is used to collect data for the calculation of hemodynamic parameters based on near-infrared spectroscopy (NIRS). In this work, we carried out a current-power test to assess the sensitivity of the therapy part, and we conducted a liquid-model optical experiment and an arm blocking test to assess the sensitivity and effectiveness of the detection part. The final results demonstrated the great potential and reliability of the novel laser acupuncture device. In the future, we will apply this device in clinical settings to verify its effectiveness and improve its reliability for the treatment of more diseases.

  16. Time-dependent reliability analysis of flood defences

    International Nuclear Information System (INIS)

    Buijs, F.A.; Hall, J.W.; Sayers, P.B.; Gelder, P.H.A.J.M. van

    2009-01-01

    This paper describes the underlying theory and a practical process for establishing time-dependent reliability models for components in a realistic and complex flood defence system. Though time-dependent reliability models have been applied frequently in, for example, the offshore, structural safety and nuclear industries, application in the safety-critical field of flood defence has to date been limited. The modelling methodology involves identifying relevant variables and processes, characterisation of those processes in appropriate mathematical terms, numerical implementation, parameter estimation and prediction. A combination of stochastic, hierarchical and parametric processes is employed. The approach is demonstrated for selected deterioration mechanisms in the context of a flood defence system. The paper demonstrates that this structured methodology enables the definition of credible statistical models for the time-dependence of flood defences in data-scarce situations. In applying those models, one of the main findings is that the time variability in the deterioration process tends to be governed by the time-dependence of one or a small number of critical attributes. It is demonstrated how the need for further data collection depends upon the relevance of the time-dependence to the performance of the flood defence system.

  17. On Bayesian reliability analysis with informative priors and censoring

    International Nuclear Information System (INIS)

    Coolen, F.P.A.

    1996-01-01

    In the statistical literature many methods have been presented to deal with censored observations, both within the Bayesian and non-Bayesian frameworks, and such methods have been successfully applied to, e.g., reliability problems. In reliability theory it is also often emphasized that, owing to the shortage of statistical data and of possibilities for experiments, one often needs to rely heavily on the judgements of engineers, or other experts, which makes Bayesian methods attractive. It is therefore important that such judgements can be elicited easily to provide informative prior distributions that reflect the knowledge of the engineers well. In this paper we focus on this aspect, especially on the situation in which the judgements of the consulted engineers are based on experience in environments where censoring has also been present previously. We suggest the use of the attractive interpretation of the hyperparameters of conjugate prior distributions when these are available for the assumed parametric lifetime models, and we show how one may go beyond the standard conjugate priors, using similar interpretations of the hyperparameters, to enable easier elicitation when censoring has been present in the past. This may even lead to more flexibility for modelling prior knowledge than when using standard conjugate priors, whereas the disadvantage of the more complicated calculations that may be needed to determine posterior distributions plays a minor role due to the advanced mathematical and statistical software that is widely available these days
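    A minimal sketch of a conjugate analysis with censoring, assuming exponential lifetimes and a Gamma prior on the failure rate: only the number of observed failures and the total time on test (including censored times) enter the posterior. All numbers are illustrative stand-ins for elicited judgement.

```python
from scipy.stats import gamma

# Exponential lifetimes with a conjugate Gamma(alpha, beta) prior on the failure rate.
# Hyperparameters below are illustrative stand-ins for elicited engineering judgement.
alpha0, beta0 = 2.0, 8000.0          # roughly "2 failures' worth" of prior evidence over 8000 h

# Observed data: some units failed, others were right-censored (still working when observation stopped).
failure_times = [1200.0, 3400.0, 2600.0]       # hours to failure
censoring_times = [5000.0, 5000.0, 4100.0]     # hours on test without failure

# With censoring, only the number of failures and the total time on test enter the posterior.
alpha1 = alpha0 + len(failure_times)
beta1 = beta0 + sum(failure_times) + sum(censoring_times)

posterior_rate = gamma(a=alpha1, scale=1.0 / beta1)
print(f"posterior mean failure rate = {posterior_rate.mean():.2e} /h")
```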

  18. Regulatory reforms and productivity: An empirical analysis of the Japanese electricity industry

    International Nuclear Information System (INIS)

    Nakano, Makiko; Managi, Shunsuke

    2008-01-01

    The Japanese electricity industry has experienced regulatory reforms since the mid-1990s. This article measures productivity in Japan's steam power-generation sector and examines the effect of reforms on the productivity of this industry over the period 1978-2003. We estimate the Luenberger productivity indicator, which is a generalization of the commonly used Malmquist productivity index, using a data envelopment analysis approach. Factors associated with productivity change are investigated through dynamic generalized method of moments (GMM) estimation of panel data. Our empirical analysis shows that the regulatory reforms have contributed to productivity growth in the steam power-generation sector in Japan

  19. A Price Index Model for Road Freight Transportation and Its Empirical analysis in China

    Directory of Open Access Journals (Sweden)

    Liu Zhishuo

    2017-01-01

    Full Text Available The aim of a price index for road freight transportation (RFT) is to reflect changes of price in the road transport market. Firstly, a price index model for RFT is built based on sample data from the Alibaba logistics platform. The model is a three-level index system comprising a total index, classification indices and individual indices, and the Laspeyres method is applied to calculate these indices. Finally, an empirical analysis of the price index for the RFT market in Zhejiang Province is performed. In order to demonstrate the correctness and validity of the index model, a comparative analysis with port throughput and the PMI index is carried out.
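    A minimal sketch of the Laspeyres index calculation at the individual/classification level, with invented freight categories, base-period weights and prices.

```python
# Laspeyres price index: base-period quantities weight the price changes.
# Illustrative freight categories with base-period (price, quantity) and current prices.
base = {"parcel": (1.20, 10_000), "LTL": (0.85, 6_000), "FTL": (0.60, 4_000)}   # (price per t-km, t-km)
current_prices = {"parcel": 1.32, "LTL": 0.88, "FTL": 0.63}

numerator = sum(current_prices[k] * q for k, (_, q) in base.items())
denominator = sum(p * q for p, q in base.values())
laspeyres = 100.0 * numerator / denominator
print(f"Laspeyres price index = {laspeyres:.1f}")
```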

  20. Application of reliability analysis methods to the comparison of two safety circuits

    International Nuclear Information System (INIS)

    Signoret, J.-P.

    1975-01-01

    Two circuits of different design, both intended to perform the ''Low Pressure Safety Injection'' function in PWR reactors, are analyzed using reliability methods. The reliability analysis of these circuits allows the fault trees to be established and the failure probabilities to be derived. The dependence of these results on testing and maintenance is emphasized, as are the critical paths. The large number of results obtained allows a well-informed choice to be made, taking account of the reliability required for this type of circuit [fr