WorldWideScience

Sample records for reliability analysis empirical

  1. Lessons Learned on Benchmarking from the International Human Reliability Analysis Empirical Study

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; John A. Forester; Andreas Bye; Vinh N. Dang; Erasmia Lois

    2010-06-01

    The International Human Reliability Analysis (HRA) Empirical Study is a comparative benchmark of the prediction of HRA methods to the performance of nuclear power plant crews in a control room simulator. There are a number of unique aspects to the present study that distinguish it from previous HRA benchmarks, most notably the emphasis on a method-to-data comparison instead of a method-to-method comparison. This paper reviews seven lessons learned about HRA benchmarking from conducting the study: (1) the dual purposes of the study afforded by joining another HRA study; (2) the importance of comparing not only quantitative but also qualitative aspects of HRA; (3) consideration of both negative and positive drivers on crew performance; (4) a relatively large sample size of crews; (5) the use of multiple methods and scenarios to provide a well-rounded view of HRA performance; (6) the importance of clearly defined human failure events; and (7) the use of a common comparison language to “translate” the results of different HRA methods. These seven lessons learned highlight how the present study can serve as a useful template for future benchmarking studies.

  2. Validity and reliability of guidelines for neck pain treatment in primary health care. A nationwide empirical analysis in Spain.

    Science.gov (United States)

    Saturno, Pedro J; Medina, Francesc; Valera, Fermin; Montilla, Joaquina; Escolar, Pilar; Gascón, Juan J

    2003-12-01

    To assess the reliability and validity of existing clinical guidelines on neck-pain physiotherapy treatment and follow-up in Spain. We identified existing guidelines through a nationwide census and listed their recommendations, grouped according to the main steps of the process flow-chart. To assess reliability we analysed the variability of statements. To analyse validity we assessed the type of scientific evidence supporting the recommendations, and we compared them with a list of evidence-based recommendations developed for this study. Primary health care centres (n = 24) with guidelines for neck-pain treatment and follow-up. We quantified the number of recommendations, the proportion of valid statements, the frequencies of non-evidence-based recommendations, and the absence of the evidence-based recommendations we had identified. The 34 identified guidelines contained 325 recommendations, with great variation between guidelines with respect to the number, type (for up to 26 different clinical decisions), and content of the recommendations they provided. Direct assessment of the scientific evidence was not possible because no specific reference was given to support any recommendation. When compared with our list, only 20.9% of the recommendations could be considered evidence-based. No guideline contained all eight evidence-based recommendations we identified. The results question the guidelines' reliability and validity, and their usefulness in ensuring quality. We conclude that guidelines should be reviewed and re-designed with greater scientific rigour.

  3. Power electronics reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
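
The fault-tree route from component reliability to system reliability described above can be sketched in a few lines. The gate layout and failure probabilities below are invented for illustration; they are not from the report's fictitious device.

```python
# Toy fault-tree evaluation: failure probabilities propagate upward
# through AND gates (product of inputs) and OR gates (complement of
# the product of complements), assuming independent component failures.

def and_gate(*p_fail):
    """Top event occurs only if ALL inputs fail."""
    prob = 1.0
    for p in p_fail:
        prob *= p
    return prob

def or_gate(*p_fail):
    """Top event occurs if ANY input fails."""
    prob = 1.0
    for p in p_fail:
        prob *= (1.0 - p)
    return 1.0 - prob

# Hypothetical power-electronics module: the converter fails if the
# switching stage fails OR both redundant gate drivers fail.
p_switch = 0.02
p_driver = 0.05
p_system = or_gate(p_switch, and_gate(p_driver, p_driver))
# System reliability is the complement of the top-event probability.
reliability = 1.0 - p_system
```

The same baseline model can then feed an optimization loop: recompute `reliability` as candidate component upgrades change the leaf probabilities, and rank upgrades by improvement per unit cost.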

  4. Empirical Measurement of the Software Testing and Reliability

    Institute of Scientific and Technical Information of China (English)

    Zou Feng-zhong; Xu Ren-zuo

    2004-01-01

    The meanings of the parameters of software reliability models are investigated in terms of the software testing process and of other software measurements. Based on this investigation, empirical estimation of the parameters is addressed. On the one hand, these empirical estimates are themselves measurements of the software, which can be used to control and to optimize the software development process. On the other hand, by treating the empirical estimates as Bayesian priors, software reliability models are extended so that engineers' experience can be integrated into, and hence improve, the models.

  5. Multidisciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. To achieve this objective, the mathematical equivalence between system behavior models in different disciplines is investigated. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer, and fluid flow disciplines.

  6. System Reliability Analysis: Foundations.

    Science.gov (United States)

    1982-07-01

    Performance formulas for systems subject to preventive maintenance are given. Richard E... [A two-terminal network reliability polynomial h(p), the probability that source s can communicate with terminal t, appears here but is too garbled in the scanned source to reconstruct.] For undirected networks, the basic reference is A. Satyanarayana and Kevin Wood (1982). For directed networks, the basic reference is Avinash

  7. ATLAS reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bartsch, R.R.

    1995-09-01

    Key elements of the 36 MJ ATLAS capacitor bank have been evaluated for individual probabilities of failure. These have been combined to estimate system reliability which is to be greater than 95% on each experimental shot. This analysis utilizes Weibull or Weibull-like distributions with increasing probability of failure with the number of shots. For transmission line insulation, a minimum thickness is obtained and for the railgaps, a method for obtaining a maintenance interval from forthcoming life tests is suggested.
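
A minimal sketch of the kind of Weibull wear-out model the analysis describes, where the per-shot failure probability increases with accumulated shots. The shape and scale values below are made-up placeholders, not ATLAS data.

```python
import math

def weibull_reliability(n_shots, beta, eta):
    """Probability of surviving n_shots: R(n) = exp(-(n/eta)**beta).
    beta > 1 gives an increasing hazard (wear-out), as assumed for
    capacitor-bank components; beta and eta here are illustrative."""
    return math.exp(-((n_shots / eta) ** beta))

def weibull_hazard(n_shots, beta, eta):
    """Failure intensity h(n) = (beta/eta) * (n/eta)**(beta - 1)."""
    return (beta / eta) * (n_shots / eta) ** (beta - 1)

# With wear-out (beta = 2), the hazard grows with accumulated shots,
# which is what motivates a maintenance interval from life tests.
r_100 = weibull_reliability(100, beta=2.0, eta=1000.0)
```

A maintenance interval can then be read off as the largest shot count for which the predicted system reliability stays above the 95% requirement.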

  8. Empirically Testing Thematic Analysis (ETTA)

    DEFF Research Database (Denmark)

    Alkier Gildberg, Frederik; Bradley, Stephen; Tingleff, Ellen Boldrup

    2015-01-01

    Text analysis is not a question of a right or wrong way to go about it, but a question of different traditions. These traditions tend not only to give answers on how to conduct an analysis, but also to provide the answer as to why it is conducted in the way that it is. The problem, however, may be that the link between tradition and tool is unclear. The main objective of this article is therefore to present Empirically Testing Thematic Analysis (ETTA), a step-by-step approach to thematic text analysis, and to discuss its strengths and weaknesses, so that others might assess its potential as an approach to utilize or develop for themselves. The advantage of the presented analytic approach is argued to be its integral empirical testing, which should assure systematic development, interpretation and analysis of the source textual material.

  9. Empirical analysis of consumer behavior

    NARCIS (Netherlands)

    Huang, Yufeng

    2015-01-01

    This thesis consists of three essays in quantitative marketing, focusing on structural empirical analysis of consumer behavior. The first essay investigates the role of a consumer's skill at product usage, and its imperfect transferability across brands, in product choice. It shows that

  12. The quantitative failure of human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.

    1995-07-01

    This philosophical treatise argues the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. In effect, the author attacks historic and current HRA as having failed to inform policy makers who make decisions based on the risk that humans contribute to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion and tempers itself with a rational debate over the weight given to subjective and empirical probabilities.

  13. Empirical likelihood method in survival analysis

    CERN Document Server

    Zhou, Mai

    2015-01-01

    Add the Empirical Likelihood to Your Nonparametric Toolbox. Empirical Likelihood Method in Survival Analysis explains how to use the empirical likelihood method for right-censored survival data. The author uses R for calculating empirical likelihood and includes many worked-out examples with the associated R code. The datasets and code are available for download on his website and CRAN. The book covers all the standard survival analysis topics treated with empirical likelihood, including hazard functions, cumulative distribution functions, analysis of the Cox model, and computation of empiric
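
The estimator at the heart of this material can be sketched without R: for right-censored data, the nonparametric (empirical) likelihood is maximized by the Kaplan-Meier product-limit estimator. A minimal pure-Python version, with invented data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate for right-censored data.
    times: observed times; events: 1 if failure observed, 0 if censored.
    Returns a list of (time, S(t)) pairs at each distinct failure time."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        ties = sum(1 for tt, e in data if tt == t)
        if deaths > 0:
            # Product-limit step: multiply by conditional survival.
            surv *= (n_at_risk - deaths) / n_at_risk
            curve.append((t, surv))
        n_at_risk -= ties
        i += ties
    return curve
```

For example, `kaplan_meier([1, 2, 3], [1, 0, 1])` treats the observation at time 2 as censored: it leaves the curve unchanged there but still removes that subject from the risk set.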

  14. Reliability Analysis of Wind Turbines

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard

    2008-01-01

    In order to minimise the total expected life-cycle costs of a wind turbine it is important to estimate the reliability level for all components in the wind turbine. This paper deals with reliability analysis for the tower and blades of onshore wind turbines placed in a wind farm. The limit states considered in the ultimate limit state (ULS) are extreme conditions in the standstill position and extreme conditions during operation. For wind turbines, where the magnitude of the loads is influenced by the control system, the ultimate limit state can occur in both cases. In the fatigue limit state (FLS) the reliability level for a wind turbine placed in a wind farm is considered, and wake effects from neighbouring wind turbines are taken into account. An illustrative example with calculation of the reliability for mudline bending of the tower is considered. In the example the design is determined according

  15. Reliability analysis in intelligent machines

    Science.gov (United States)

    Mcinroy, John E.; Saridis, George N.

    1990-01-01

    Given an explicit task to be executed, an intelligent machine must be able to find the probability of success, or reliability, of alternative control and sensing strategies. Using concepts from information theory and reliability theory, new techniques are proposed for finding the reliability corresponding to alternative subsets of control and sensing strategies, such that a desired set of specifications can be satisfied. The analysis is straightforward, provided that a set of Gaussian random state variables is available. An example problem illustrates the technique, and general reliability results are presented for visual servoing with a computed torque-control algorithm. Moreover, the example illustrates the principle of increasing precision with decreasing intelligence at the execution level of an intelligent machine.

  16. Sparse Empirical Bayes Analysis (SEBA)

    CERN Document Server

    Bochkina, Natalia

    2009-01-01

    We consider joint processing of n independent sparse regression problems. Each is based on a sample (y_{i1}, x_{i1}), ..., (y_{im}, x_{im}) of m i.i.d. observations from y_{i1} = x_{i1}^T β_i + ε_{i1}, with y_{i1} ∈ R, x_{i1} ∈ R^p, i = 1, ..., n, and ε_{i1} ~ N(0, σ²), say. Here p is large enough that the empirical risk minimizer is not consistent. We consider three possible extensions of the lasso estimator to deal with this problem: the lassoes, the group lasso and the RING lasso, each utilizing a different assumption about how these problems are related. For each estimator we give a Bayesian interpretation, and we present both persistency analysis and non-asymptotic error bounds based on restricted-eigenvalue-type assumptions.

  17. Design and experimentation of an empirical multistructure framework for accurate, sharp and reliable hydrological ensembles

    Science.gov (United States)

    Seiller, G.; Anctil, F.; Roy, R.

    2017-09-01

    This paper outlines the design and experimentation of an Empirical Multistructure Framework (EMF) for lumped conceptual hydrological modeling. The concept is inspired by modular frameworks, empirical model development, and multimodel applications, and encompasses the overproduce-and-select paradigm. EMF aims to reduce subjectivity in conceptual hydrological modeling practice and includes model selection in the optimisation steps, reducing initial assumptions on the prior perception of the dominant rainfall-runoff transformation processes. EMF generates thousands of new modeling options from, for now, twelve parent models that share their functional components and parameters. Optimisation resorts to ensemble calibration, ranking and selection of individual child time series based on optimal bias and reliability trade-offs, as well as accuracy and sharpness improvement of the ensemble. Results on 37 snow-dominated Canadian catchments and 20 climatically diversified American catchments reveal the excellent potential of the EMF in generating new individual model alternatives, with high respective performance values, that may be pooled efficiently into ensembles of seven to sixty constitutive members, with low bias and high accuracy, sharpness, and reliability. A group of 1446 new models is highlighted that offers good potential on other catchments or applications, based on their individual and collective interest. An analysis of the preferred functional components reveals the importance of the production and total flow elements. Overall, results from this research confirm the added value of ensemble and flexible approaches for hydrological applications, especially in uncertain contexts, and open up new modeling possibilities.

  18. Hybrid reliability model for fatigue reliability analysis of steel bridges

    Institute of Scientific and Technical Information of China (English)

    曹珊珊; 雷俊卿

    2016-01-01

    A hybrid reliability model is presented to solve fatigue reliability problems of steel bridges. The cumulative damage model is one of the models used in fatigue reliability analysis; the characteristics of its parameters can be described as probabilistic and interval. The two-stage hybrid reliability model is given with a theoretical foundation and a solving algorithm for hybrid reliability problems. The theoretical foundation is established from the theoretical consistency relationships between the interval reliability model and the probability reliability model with normally distributed variables. The solving process combines the definition of the interval reliability index with the probabilistic algorithm. With consideration of the parameter characteristics of the S-N curve, the cumulative damage model with hybrid variables is given based on standards from different countries. Lastly, a steel structure case from the Neville Island Bridge is analyzed to verify the applicability of the hybrid reliability model in fatigue reliability analysis based on AASHTO.

  19. Sensitivity Analysis of Component Reliability

    Institute of Scientific and Technical Information of China (English)

    Zhenhua Ge

    2004-01-01

    In a system, every component has its unique position within the system and its unique failure characteristics. When a component's reliability is changed, the effect on system reliability differs from component to component. Component reliability sensitivity is a measure of the effect on system reliability when a component's reliability is changed. In this paper, the definition and relative matrix of component reliability sensitivity are proposed, and some of their characteristics are analyzed. All of this helps to analyse and improve system reliability.
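
The idea of component reliability sensitivity can be illustrated with Birnbaum's importance measure for a simple series system, where the sensitivity of system reliability to component i is the product of the other components' reliabilities. The series structure and the numbers below are illustrative assumptions, not the paper's matrix formulation.

```python
def series_reliability(r):
    """System reliability of a series system: product of component r's."""
    prod = 1.0
    for ri in r:
        prod *= ri
    return prod

def birnbaum_importance(r, i):
    """Sensitivity dR_sys/dR_i for a series system: the product of all
    OTHER component reliabilities (Birnbaum's measure)."""
    prod = 1.0
    for j, rj in enumerate(r):
        if j != i:
            prod *= rj
    return prod

r = [0.99, 0.90, 0.95]
# In a series system the least reliable component has the highest
# sensitivity, so it is the best candidate for improvement.
sens = [birnbaum_importance(r, i) for i in range(len(r))]
```

Ranking `sens` reproduces the paper's point that equal reliability changes in different components have unequal effects on the system.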

  20. Reliability Analysis of Sensor Networks

    Institute of Scientific and Technical Information of China (English)

    JIN Yan; YANG Xiao-zong; WANG Ling

    2005-01-01

    To integrate the capacities of sensing, communication, computing, and actuating, one of the compelling technological advances of recent years has been the appearance of distributed wireless sensor networks (DSNs) for information-gathering tasks. In order to save energy, multi-hop routing between the sensor nodes and the sink node is necessary because of limited resources. In addition, unpredictable environmental factors make the sensor nodes unreliable. In this paper, the reliability of routing designed for sensor networks and some dependability issues of DSNs, such as MTTF (mean time to failure) and the probability of connectivity between the sensor nodes and the sink node, are analyzed. Unfortunately, an exact result cannot be obtained for an arbitrary network topology, as this is a #P-hard problem; instead, a reliability analysis of restricted clustering-based topologies is given. The method proposed in this paper offers a constructive idea of how to place energy-constrained sensor nodes in the network efficiently from the perspective of reliability.
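
A sketch of the series/parallel reasoning behind multi-hop route reliability and MTTF. Since exact computation for arbitrary topologies is #P-hard, restricted structures such as the disjoint routes below are what remain tractable; all link reliabilities and failure rates are invented for illustration.

```python
def route_reliability(link_rel):
    """A multi-hop route works only if EVERY link works (series)."""
    prob = 1.0
    for r in link_rel:
        prob *= r
    return prob

def disjoint_routes_reliability(routes):
    """The sink is reachable if ANY disjoint route works (parallel)."""
    fail = 1.0
    for links in routes:
        fail *= 1.0 - route_reliability(links)
    return 1.0 - fail

def series_mttf(failure_rates):
    """MTTF of a series of exponentially failing links: 1 / sum(lambda)."""
    return 1.0 / sum(failure_rates)

# Two node-disjoint routes to the sink: a 2-hop and a 3-hop route.
r = disjoint_routes_reliability([[0.9, 0.9], [0.8, 0.8, 0.8]])
```

Note how adding the less reliable 3-hop route still raises connectivity above the 2-hop route's own 0.81, which is the reliability argument for redundant placement.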

  1. The Reliability and Stability of an Inferred Phylogenetic Tree from Empirical Data.

    Science.gov (United States)

    Katsura, Yukako; Stanley, Craig E; Kumar, Sudhir; Nei, Masatoshi

    2017-01-18

    The reliability of a phylogenetic tree obtained from empirical data is usually measured by the bootstrap probability (Pb) of the interior branches of the tree. If the bootstrap probability is high for most branches, the tree is considered to be reliable; if some interior branches show relatively low bootstrap probabilities, we are not sure that the inferred tree is really reliable. Here, we propose another quantity measuring the reliability of the tree, called the stability of a subtree: the probability (Ps) of obtaining a given subtree of the inferred tree. We then show that for the tree to be reliable, both Pb and Ps must be high. We also show that Ps is given by the bootstrap probability of the subtree with the closest outgroup sequence, and a computer program, RESTA, for computing the Pb and Ps values is presented.
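
The bootstrap-probability idea can be illustrated on a plain statistic: resample the data with replacement and count how often a statement about it continues to hold. This is an analogy only; real Pb computation resamples alignment columns and re-infers the tree for each replicate.

```python
import random

def bootstrap_support(sample, statement, n_boot=2000, seed=1):
    """Fraction of bootstrap resamples in which `statement` holds.
    Analogous in spirit to branch support: a branch's Pb is the fraction
    of resampled datasets whose inferred tree contains that branch.
    Here we bootstrap a plain statistic instead of a tree."""
    rng = random.Random(seed)
    n = len(sample)
    hits = 0
    for _ in range(n_boot):
        resample = [sample[rng.randrange(n)] for _ in range(n)]
        if statement(resample):
            hits += 1
    return hits / n_boot

# Invented data; the "branch" here is the claim that the mean is positive.
data = [0.8, 1.2, 0.5, 1.9, -0.1, 1.1, 0.7, 1.4]
pb = bootstrap_support(data, lambda s: sum(s) / len(s) > 0)
```

As in the paper's argument, a high support value alone does not settle reliability; it only says the statement is stable under resampling of the observed data.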

  2. Empirical direction in design and analysis

    CERN Document Server

    Anderson, Norman H

    2001-01-01

    The goal of Norman H. Anderson's new book is to help students develop skills of scientific inference. To accomplish this he organized the book around the "Experimental Pyramid", six levels that represent a hierarchy of considerations in empirical investigation: conceptual framework, phenomena, behavior, measurement, design, and statistical inference. To facilitate conceptual and empirical understanding, Anderson de-emphasizes computational formulas and null hypothesis testing. Other features include an emphasis on visual inspection as a basic skill in experimental analysis to help student

  3. Creep-rupture reliability analysis

    Science.gov (United States)

    Peralta-Duran, A.; Wirsching, P. H.

    1985-01-01

    A probabilistic approach to the correlation and extrapolation of creep-rupture data is presented. Time-temperature parameters (TTPs) are used to correlate the data, and an analytical expression for the master curve is developed. The expression provides a simple model for the statistical distribution of strength and fits neatly into a probabilistic design format. The analysis focuses on the Larson-Miller and on the Manson-Haferd parameters, but it can be applied to any of the TTPs. A method is developed for evaluating material-dependent constants for TTPs. It is shown that optimized constants can provide a significant improvement in the correlation of the data, thereby reducing modelling error. Attempts were made to quantify the performance of the proposed method in predicting long-term behavior. Uncertainty in predicting long-term behavior from short-term tests was derived for several sets of data. Examples are presented which illustrate the theory and demonstrate the application of state-of-the-art reliability methods to the design of components under creep.
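
A sketch of how a time-temperature parameter trades temperature against rupture life, using the Larson-Miller form LMP = T(C + log10 t_r). The constant C = 20 and the inputs below are illustrative placeholders; the paper's method would optimize such constants per material.

```python
import math

def larson_miller(T_abs, rupture_hours, C=20.0):
    """Larson-Miller parameter LMP = T * (C + log10(t_r)).
    T_abs on an absolute scale (K or R); C ~ 20 is the classic default,
    standing in here for the optimized material constant."""
    return T_abs * (C + math.log10(rupture_hours))

def rupture_life(T_abs, lmp, C=20.0):
    """Invert the parameter to extrapolate rupture time at another T."""
    return 10.0 ** (lmp / T_abs - C)

# Equal LMP implies a trade-off between temperature and time to rupture:
lmp = larson_miller(900.0, 10_000.0)   # 900 K, 1e4 h (invented test point)
t_at_950 = rupture_life(950.0, lmp)    # shorter life at higher temperature
```

The probabilistic treatment in the paper then replaces this single master curve with a statistical distribution of strength about it.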

  4. On Bayesian System Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen Ringi, M.

    1995-05-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so-called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non-identical environments. 85 refs.
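
The thesis's Bayesian view, in which reliability estimates update as the state of knowledge changes, can be sketched with the standard conjugate model for a constant failure rate: a Gamma prior (possibly built from failure data of similar components) updated by observed failures. This is a textbook sketch, not the thesis's dependence model, and all numbers are illustrative.

```python
def gamma_posterior(a0, b0, n_failures, total_time):
    """Conjugate Bayesian update for an exponential failure rate.
    Prior: lambda ~ Gamma(a0, b0); data: n_failures in total_time.
    Posterior: Gamma(a0 + n_failures, b0 + total_time)."""
    return a0 + n_failures, b0 + total_time

# Prior shaped by failure data of similar components elsewhere
# (a0 = 2 pseudo-failures in b0 = 1000 pseudo-hours), then updated
# with 3 observed failures over 5000 operating hours.
a, b = gamma_posterior(a0=2.0, b0=1000.0, n_failures=3, total_time=5000.0)
posterior_mean_rate = a / b   # point estimate of lambda (per hour)
```

Each new batch of field data simply feeds another `gamma_posterior` call, which is the "reliability changes as knowledge changes" loop in miniature.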

  5. Empirical Bayes analysis of single nucleotide polymorphisms

    Directory of Open Access Journals (Sweden)

    Ickstadt Katja

    2008-03-01

    Background: An important goal of whole-genome studies concerned with single nucleotide polymorphisms (SNPs) is the identification of SNPs associated with a covariate of interest such as case-control status or the type of cancer. Since these studies often comprise the genotypes of hundreds of thousands of SNPs, methods are required that can cope with the corresponding multiple testing problem. For the analysis of gene expression data, approaches such as the empirical Bayes analysis of microarrays have been developed, particularly for the detection of genes associated with the response. However, the empirical Bayes analysis of microarrays has only been suggested for binary responses when considering expression values, i.e. continuous predictors. Results: In this paper, we propose a modification of this empirical Bayes analysis that can be used to analyze high-dimensional categorical SNP data. This approach, along with a generalized version of the original empirical Bayes method, is available in the R package siggenes version 1.10.0 and later, which can be downloaded from http://www.bioconductor.org. Conclusion: As applications to two subsets of the HapMap data show, the empirical Bayes analysis of microarrays can not only be used to analyze continuous gene expression data, but can also be applied to categorical SNP data, where the response is not restricted to be binary. In association studies in which typically several tens to a few hundred SNPs are considered, our approach can furthermore be employed to test interactions of SNPs. Moreover, the posterior probabilities resulting from the empirical Bayes analysis of (prespecified) interactions/genotypes can also be used to quantify the importance of these interactions.

  6. Analysis of Empirical Software Effort Estimation Models

    CERN Document Server

    Basha, Saleem

    2010-01-01

    Reliable effort estimation remains an ongoing challenge for software engineers. Accurate effort estimation is the state of the art of software engineering; effort estimation of software is the preliminary phase between the client and the business enterprise. The relationship between the client and the business enterprise begins with the estimation of the software, and the credibility of the client to the business enterprise increases with accurate estimation. Effort estimation often requires generalizing from a small number of historical projects, and generalization from such limited experience is an inherently under-constrained problem. Accurate estimation is a complex process because it can be visualized as software effort prediction and, as the term indicates, a prediction never becomes an actual. This work follows the basics of the empirical software effort estimation models. The goal of this paper is to study empirical software effort estimation. The primary conclusion is that no single technique is best for all sit...

  7. Reliability Analysis of Money Habitudes

    Science.gov (United States)

    Delgadillo, Lucy M.; Bushman, Brittani S.

    2015-01-01

    Use of the Money Habitudes exercise has gained popularity among various financial professionals. This article reports on the reliability of this resource. A survey administered to young adults at a western state university was conducted, and each Habitude or "domain" was analyzed using Cronbach's alpha procedures. Results showed all six…
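
Cronbach's alpha, the statistic used in the analysis above, can be computed directly from item-score columns. A minimal sketch with invented survey data (not the Money Habitudes items):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
    Population (n-divisor) variances are used throughout."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    item_var_sum = sum(var(col) for col in items)
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Three items answered by five respondents (illustrative 1-5 scores):
items = [[3, 4, 3, 5, 4],
         [2, 4, 3, 5, 3],
         [3, 5, 4, 5, 4]]
alpha = cronbach_alpha(items)
```

In a study like the one above, this calculation would be repeated once per domain, each time over that domain's items.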

  9. The problem analysis for empirical studies

    NARCIS (Netherlands)

    Groenland, E.A.G.

    2014-01-01

    This article proposes a systematic methodology for the development of a problem analysis for cross-sectional, empirical research. This methodology is referred to as the 'Annabel approach'. It is suitable for both academic studies and applied (business) studies. In addition it can be used for both

  11. An empirical look at the Defense Mechanism Test (DMT): reliability and construct validity.

    Science.gov (United States)

    Ekehammar, Bo; Zuber, Irena; Konstenius, Marja-Liisa

    2005-07-01

    Although the Defense Mechanism Test (DMT) has been in use for almost half a century, there are still quite contradictory views about whether it is a reliable instrument, and if so, what it really measures. Thus, based on data from 39 female students, we first examined DMT inter-coder reliability by analyzing the agreement among trained judges in their coding of the same DMT protocols. Second, we constructed a "parallel" photographic picture that retained all structural characteristics of the original and analyzed DMT parallel-test reliability. Third, we examined the construct validity of the DMT by (a) employing three self-report defense-mechanism inventories and analyzing the intercorrelations between DMT defense scores and corresponding defenses in these instruments, (b) studying the relationships between DMT responses and scores on trait and state anxiety, and (c) relating DMT defense scores to measures of self-esteem. The main results showed that the DMT can be coded with high reliability by trained coders, that the parallel-test reliability is unsatisfactory compared to traditional psychometric standards, that there is a certain generalizability in the number of perceptual distortions that people display from one picture to another, and that the construct validation provided meager empirical evidence for the conclusion that the DMT measures what it purports to measure, that is, psychological defense mechanisms.
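
Inter-coder agreement checks of the kind reported above are commonly summarized with a chance-corrected statistic. The abstract does not name the one used, so Cohen's kappa appears below purely as an illustration, on invented category labels:

```python
def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' category labels:
    kappa = (p_obs - p_exp) / (1 - p_exp), where p_exp is the agreement
    expected from each coder's marginal label frequencies."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    cats = set(coder_a) | set(coder_b)
    p_obs = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    p_exp = sum((coder_a.count(c) / n) * (coder_b.count(c) / n)
                for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)

# Two hypothetical coders labeling five protocol segments:
kappa = cohens_kappa(list("AABBC"), list("AABBB"))
```

Raw percent agreement (4/5 here) overstates reliability when a few categories dominate; kappa discounts the agreement expected by chance.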

  12. Combination of structural reliability and interval analysis

    Institute of Scientific and Technical Information of China (English)

    Zhiping Qiu; Di Yang; Isaac Elishakoff

    2008-01-01

    In engineering applications, probabilistic reliability theory appears to be presently the most important method; however, in many cases precise probabilistic reliability theory cannot be considered an adequate and credible model of the real state of affairs. In this paper, we develop a hybrid of probabilistic and non-probabilistic reliability theory, which describes the structural uncertain parameters as interval variables when statistical data are insufficient. By using interval analysis, a new method for calculating the interval of the structural reliability as well as the reliability index is introduced, and the traditional probabilistic theory is incorporated with the interval analysis. Moreover, the new method preserves the useful part of the traditional probabilistic reliability theory, but removes the restriction of its strict requirement on data acquisition. An example is presented to demonstrate the feasibility and validity of the proposed theory.
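
The interval side of such a hybrid can be sketched for the classical reliability index beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2): when the resistance and load means are only known to within intervals, monotonicity of beta in each mean gives an interval for beta directly. This is a simplification of the paper's two-stage algorithm, with invented values:

```python
import math

def beta_interval(muR, muS, sigR, sigS):
    """Interval of the reliability index when the means are interval
    variables (lo, hi) and the standard deviations are treated as crisp.
    beta is increasing in mu_R and decreasing in mu_S, so the extremes
    of the means give the bounds."""
    denom = math.sqrt(sigR ** 2 + sigS ** 2)
    beta_lo = (muR[0] - muS[1]) / denom   # weakest resistance, heaviest load
    beta_hi = (muR[1] - muS[0]) / denom   # strongest resistance, lightest load
    return beta_lo, beta_hi

# Resistance mean in [95, 105], load mean in [60, 70] (illustrative):
lo, hi = beta_interval(muR=(95.0, 105.0), muS=(60.0, 70.0),
                       sigR=8.0, sigS=6.0)
```

A design would then be judged by the worst case, `lo`, which is the conservative stance interval models take when data are too sparse for full distributions.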

  13. Issues in benchmarking human reliability analysis methods : a literature review.

    Energy Technology Data Exchange (ETDEWEB)

    Lois, Erasmia (US Nuclear Regulatory Commission); Forester, John Alan; Tran, Tuan Q. (Idaho National Laboratory, Idaho Falls, ID); Hendrickson, Stacey M. Langfitt; Boring, Ronald L. (Idaho National Laboratory, Idaho Falls, ID)

    2008-04-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  14. Analysis of the Reliability of the "Alternator - Alternator Belt" System

    Directory of Open Access Journals (Sweden)

    Ivan Mavrin

    2012-10-01

    Full Text Available Before starting and also during the exploitation of various systems, it is very important to know how the system and its parts will behave during operation regarding breakdowns, i.e. failures. It is possible to predict the service behaviour of a system by determining the functions of reliability, as well as the frequency and intensity of failures. The paper considers the theoretical basics of the functions of reliability, frequency and intensity of failures for the two main approaches: one uses 6 equal intervals and the other 13 unequal intervals, for a concrete case taken from practice. The reliability of the "alternator - alternator belt" system installed in buses has been analysed according to the empirical data on failures. The empirical data on failures provide empirical functions of reliability and of the frequency and intensity of failures, which are presented in tables and graphically. The first analysis, performed by dividing the mean time between failures into 6 equal time intervals, gives forms of the empirical failure frequency and intensity functions that approximately correspond to the typical functions. By dividing the failure phase into 13 unequal intervals with two failures in each interval, these functions indicate explicit transitions from the early-failure interval into the random-failure interval, i.e. into the ageing interval. The functions thus obtained are more accurate and represent a better solution for the given case. In order to estimate the reliability of these systems with greater accuracy, a greater number of failures needs to be analysed.
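
    The empirical functions described above can be sketched as follows (a simplified illustration assuming equal-width intervals and a hypothetical set of failure times; the paper's 13-unequal-interval variant differs):

```python
def empirical_failure_functions(failure_times, n_intervals=6):
    # Empirical reliability R, failure frequency f and failure intensity
    # (hazard) lam per interval, computed from observed failure times
    # grouped into equal-width time intervals.
    n = len(failure_times)
    width = max(failure_times) / n_intervals
    survivors = n
    rows = []
    for i in range(n_intervals):
        lo, hi = i * width, (i + 1) * width
        failed = sum(lo < t <= hi for t in failure_times)
        f = failed / (n * width)                       # frequency: failures per unit time, per unit
        lam = failed / (survivors * width) if survivors else 0.0  # intensity (hazard)
        survivors -= failed
        rows.append((survivors / n, f, lam))           # R(t) at the end of the interval
    return rows
```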

  15. Integrated Methodology for Software Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Marian Pompiliu CRISTESCU

    2012-01-01

    Full Text Available The techniques most used to ensure safety and reliability of systems are applied together as a whole, yet in most cases the software components are overlooked or analyzed too little. The present paper describes the applicability of fault-tree analysis to software systems, an analysis known as Software Fault Tree Analysis (SFTA); the fault trees are evaluated using binary decision diagrams, all of these being integrated and used with the help of a Java reliability library.

  16. Reliability Sensitivity Analysis for Location Scale Family

    Institute of Scientific and Technical Information of China (English)

    洪东跑; 张海瑞

    2011-01-01

    Many products operate under various complex environmental conditions. To describe the dynamic influence of environmental factors on their reliability, a method of reliability sensitivity analysis is proposed. In this method, the location parameter is assumed to be a function of the relevant environment variables, while the scale parameter is assumed to be an unknown positive constant. The location parameter function is then constructed using the radial basis function method. Using the varied-environment test data, the log-likelihood function is transformed into a generalized linear expression by describing the indicator as a Poisson variable. With the generalized linear model, the maximum likelihood estimates of the model coefficients are obtained; with the reliability model, the reliability sensitivity is obtained. An instance analysis shows that the method is feasible for analyzing the dynamic variation of reliability with environmental factors and is straightforward in engineering application.

  17. Space Mission Human Reliability Analysis (HRA) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of this project is to extend current ground-based Human Reliability Analysis (HRA) techniques to a long-duration, space-based tool to more effectively...

  18. Production Facility System Reliability Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Dale, Crystal Buchanan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Klein, Steven Karl [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-10-06

    This document describes the reliability, maintainability, and availability (RMA) modeling of the Los Alamos National Laboratory (LANL) design for the Closed Loop Helium Cooling System (CLHCS) planned for the NorthStar accelerator-based 99Mo production facility. The current analysis incorporates a conceptual helium recovery system, beam diagnostics, and prototype control system into the reliability analysis. The results from the 1000 hr blower test are addressed.

  19. Structural reliability analysis and reliability-based design optimization: Recent advances

    Science.gov (United States)

    Qiu, ZhiPing; Huang, Ren; Wang, XiaoJun; Qi, WuChao

    2013-09-01

    We review recent research activities on structural reliability analysis, reliability-based design optimization (RBDO) and applications in complex engineering structural design. Several novel uncertainty propagation methods and reliability models, which are the basis of the reliability assessment, are given. In addition, recent developments on reliability evaluation and sensitivity analysis are highlighted as well as implementation strategies for RBDO.

  20. Multi-Disciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song

    1997-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code developed under the leadership of NASA Lewis Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines are investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multi-disciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  1. An Empirical Analysis of Humanitarian Warehouse Locations

    Directory of Open Access Journals (Sweden)

    Sander de Leeuw

    2016-06-01

    Full Text Available The purpose of this paper is to empirically verify characteristics of current warehouse locations of humanitarian organizations (based on public information) and to relate those to the model developed by Richardson, de Leeuw and Dullaert (2016). This paper is based on desk research. Public data such as (annual) reports and databases are used to determine the features of the locations in empirical terms. We find that a significant proportion of our sample co-locates their products at UNHRD premises. This suggests that organizations prefer to cluster their warehouse activities, particularly when there is no fee involved for using the warehouse (as is the case in the UNHRD network). The geographic map of the current warehouses, together with the quantified location factors, provides an overview of the current warehouse locations. We found that the characteristics of the current warehouse locations are aligned with the literature on location selection factors. Current locations can be characterized by infrastructure characteristics (in particular closeness to an airport and safety concerns) and by the low occurrence of disasters. Other factors that we considered but that were not supported by empirical evidence were labor quality and availability as well as the political environment. In our study we were only able to use a limited sample of warehouses. We also focused our research on countries where two or more organizations have their warehouses located. We did not account for warehouse sizes or the kinds of products stored in our analysis.

  2. Reliability Analysis of DOOF for Weibull Distribution

    Institute of Scientific and Technical Information of China (English)

    陈文华; 崔杰; 樊小燕; 卢献彪; 相平

    2003-01-01

    A hierarchical Bayesian method for estimating the failure probability under DOOF, taking the quasi-Beta distribution as the prior distribution, is proposed in this paper. The weighted least squares estimate method was used to obtain the formula for computing the reliability distribution parameters and estimating the reliability characteristic values under DOOF. Taking one type of aerospace electrical connector as an example, the correctness of the above method was verified through statistical analysis of electrical connector accelerated life test data.
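
    The least-squares idea can be illustrated with the common median-rank regression for Weibull parameters (a plain, unweighted sketch using Bernard's approximation; the paper's weighted formulation and quasi-Beta prior are not reproduced here):

```python
import math

def weibull_lsq(failure_times):
    # Median-rank regression: fit ln(-ln(1 - F_i)) = shape*ln(t_i) - shape*ln(scale),
    # with F_i from Bernard's approximation (i - 0.3) / (n + 0.4).
    ts = sorted(failure_times)
    n = len(ts)
    xs = [math.log(t) for t in ts]
    ys = [math.log(-math.log(1.0 - (i - 0.3) / (n + 0.4))) for i in range(1, n + 1)]
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    shape = slope
    scale = math.exp(-intercept / slope)
    return shape, scale
```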

  3. Reliability analysis of flood defence systems

    NARCIS (Netherlands)

    Steenbergen, H.M.G.M.; Lassing, B.L.; Vrouwenvelder, A.C.W.M.; Waarts, P.H.

    2004-01-01

    In recent years an advanced program for the reliability analysis of flood defence systems has been under development. This paper describes the global data requirements for the application and the setup of the models. The analysis generates the probability of system failure and the contribution of ea

  4. Reliability Analysis of High Rockfill Dam Stability

    Directory of Open Access Journals (Sweden)

    Ping Yi

    2015-01-01

    Full Text Available A program, 3DSTAB, combining slope stability analysis and reliability analysis is developed and validated. In this program, the limit equilibrium method is utilized to calculate safety factors of critical slip surfaces. The first-order reliability method is used to compute reliability indexes corresponding to critical probabilistic surfaces. When derivatives of the performance function are calculated by the finite difference method, the previous iteration’s critical slip surface is saved and reused. This sequential approximation strategy notably improves efficiency. Using this program, the stability reliability analyses of concrete faced rockfill dams and earth core rockfill dams with different heights and different slope ratios are performed. The results show that both safety factors and reliability indexes decrease as the dam’s slope increases at a constant height and as the dam’s height increases at a constant slope. They decrease dramatically as the dam height increases from 100 m to 200 m, while they decrease slowly once the dam height exceeds 250 m, which deserves attention. Additionally, both safety factors and reliability indexes of the upstream slope of earth core rockfill dams are higher than those of the downstream slope. Thus, the downstream slope stability is the key failure mode for earth core rockfill dams.
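
    The first-order reliability index used above reduces, for a linear limit state with independent normal variables, to a closed form (a minimal sketch; the paper couples this with limit-equilibrium slip-surface searches, which are not shown):

```python
import math
from statistics import NormalDist

def form_normal(mu_r, sigma_r, mu_s, sigma_s):
    # Reliability index and failure probability for g = R - S,
    # with R (resistance) and S (load effect) independent and normal.
    beta = (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)
    pf = NormalDist().cdf(-beta)   # P(g < 0) = Phi(-beta)
    return beta, pf

# hypothetical resistance and load-effect statistics:
beta, pf = form_normal(mu_r=3.0, sigma_r=0.4, mu_s=2.0, sigma_s=0.3)
```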

  5. SMART empirical approaches for predicting field performance of PV modules from results of reliability tests

    Science.gov (United States)

    Hardikar, Kedar Y.; Liu, Bill J. J.; Bheemreddy, Venkata

    2016-09-01

    Gaining an understanding of degradation mechanisms and their characterization is critical in developing relevant accelerated tests to ensure PV module performance warranty over a typical lifetime of 25 years. As newer technologies are adapted for PV, including new PV cell technologies, new packaging materials, and newer product designs, the availability of field data over extended periods of time for product performance assessment cannot be expected within the typical timeframe for business decisions. In this work, to enable product design decisions and product performance assessment for PV modules utilizing newer technologies, Simulation and Mechanism based Accelerated Reliability Testing (SMART) methodology and empirical approaches to predict field performance from accelerated test results are presented. The method is demonstrated for field life assessment of flexible PV modules based on degradation mechanisms observed in two accelerated tests, namely, Damp Heat and Thermal Cycling. The method is based on design of an accelerated testing scheme with the intent to develop relevant acceleration factor models. The acceleration factor model is validated by extensive reliability testing under different conditions going beyond the established certification standards. Once the acceleration factor model is validated for the test matrix, a modeling scheme is developed to predict field performance from results of accelerated testing for particular failure modes of interest. Further refinement of the model can continue as more field data becomes available. While the demonstration of the method in this work is for thin film flexible PV modules, the framework and methodology can be adapted to other PV products.
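
    One common ingredient of such acceleration factor models is an Arrhenius temperature term, sketched below as a generic illustration with an assumed activation energy (the paper's validated models for Damp Heat and Thermal Cycling are not given in the abstract):

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_af(ea_ev, t_use_c, t_stress_c):
    # Acceleration factor between field-use and chamber stress temperatures
    # for a thermally activated mechanism with activation energy ea_ev (eV).
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / K_B) * (1.0 / t_use - 1.0 / t_stress))

# e.g. an assumed 0.7 eV mechanism, 25 C field use vs an 85 C damp-heat chamber:
af = arrhenius_af(0.7, 25.0, 85.0)
# one chamber-hour then corresponds to roughly `af` field-hours for that mechanism
```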

  6. RELIABILITY ANALYSIS OF BENDING ...

    African Journals Online (AJOL)

    eobe

    performance of any structural system be eva ... by the Joint crete slabs, bending, shear, deflection, reliability, design codes. ement such as ... could be sensitive to this distribution. Table 1: ..... Ang, A. H-S and Tang, W. H. Probability Concepts in.

  7. Culture Representation in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    David Gertman; Julie Marble; Steven Novack

    2006-12-01

    Understanding human-system response is critical to being able to plan and predict mission success in the modern battlespace. Commonly, human reliability analysis has been used to predict failures of human performance in complex, critical systems. However, most human reliability methods fail to take culture into account. This paper takes an easily understood state of the art human reliability analysis method and extends that method to account for the influence of culture, including acceptance of new technology, upon performance. The cultural parameters used to modify the human reliability analysis were determined from two standard industry approaches to cultural assessment: Hofstede’s (1991) cultural factors and Davis’ (1989) technology acceptance model (TAM). The result is called the Culture Adjustment Method (CAM). An example is presented that (1) reviews human reliability assessment with and without cultural attributes for a Supervisory Control and Data Acquisition (SCADA) system attack, (2) demonstrates how country specific information can be used to increase the realism of HRA modeling, and (3) discusses the differences in human error probability estimates arising from cultural differences.

  8. Reliability Analysis of a Steel Frame

    Directory of Open Access Journals (Sweden)

    M. Sýkora

    2002-01-01

    Full Text Available A steel frame with haunches is designed according to Eurocodes. The frame is exposed to self-weight, snow, and wind actions. Lateral-torsional buckling appears to represent the most critical criterion, which is considered as a basis for the limit state function. In the reliability analysis, the probabilistic models proposed by the Joint Committee for Structural Safety (JCSS are used for basic variables. The uncertainty model coefficients take into account the inaccuracy of the resistance model for the haunched girder and the inaccuracy of the action effect model. The time invariant reliability analysis is based on Turkstra's rule for combinations of snow and wind actions. The time variant analysis describes snow and wind actions by jump processes with intermittencies. Assuming a 50-year lifetime, the obtained values of the reliability index β vary within the range from 3.95 up to 5.56. The cross-profile IPE 330 designed according to Eurocodes seems to be adequate. It appears that the time invariant reliability analysis based on Turkstra's rule provides considerably lower values of β than those obtained by the time variant analysis.
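
    Turkstra's rule, as used in the time-invariant analysis above, can be sketched for two independent normal loads (hypothetical numbers; the JCSS probabilistic models and the intermittent jump processes of the paper are beyond this illustration):

```python
import math

def beta_linear(mu_r, sig_r, mu_loads, sig_loads):
    # Reliability index for g = R - sum(S_i), all independent normal.
    mu_g = mu_r - sum(mu_loads)
    sig_g = math.sqrt(sig_r ** 2 + sum(s ** 2 for s in sig_loads))
    return mu_g / sig_g

def turkstra_beta(mu_r, sig_r, snow_max, snow_apt, wind_max, wind_apt):
    # Turkstra's rule: evaluate one load at its lifetime maximum while the
    # other takes its arbitrary-point-in-time value; the lower (governing)
    # reliability index is kept. Loads are (mean, std) tuples.
    case_snow = beta_linear(mu_r, sig_r,
                            [snow_max[0], wind_apt[0]], [snow_max[1], wind_apt[1]])
    case_wind = beta_linear(mu_r, sig_r,
                            [snow_apt[0], wind_max[0]], [snow_apt[1], wind_max[1]])
    return min(case_snow, case_wind)

# hypothetical resistance and load statistics:
beta = turkstra_beta(10.0, 1.0, snow_max=(4, 1), snow_apt=(1, 0.5),
                     wind_max=(3, 1), wind_apt=(1, 0.5))
```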

  9. Event/Time/Availability/Reliability-Analysis Program

    Science.gov (United States)

    Viterna, L. A.; Hoffman, D. J.; Carr, Thomas

    1994-01-01

    ETARA is interactive, menu-driven program that performs simulations for analysis of reliability, availability, and maintainability. Written to evaluate performance of electrical power system of Space Station Freedom, but methodology and software applied to any system represented by block diagram. Program written in IBM APL.

  10. Reliability analysis of DOOF for Weibull distribution

    Institute of Scientific and Technical Information of China (English)

    陈文华; 崔杰; 樊晓燕; 卢献彪; 相平

    2003-01-01

    A hierarchical Bayesian method for estimating the failure probability Pi under DOOF, taking the quasi-Beta distribution B(pi-1, 1, 1, b) as the prior distribution, is proposed in this paper. The weighted least squares estimate method was used to obtain the formula for computing the reliability distribution parameters and estimating the reliability characteristic values under DOOF. Taking one type of aerospace electrical connector as an example, the correctness of the above method was verified through statistical analysis of electrical connector accelerated life test data.

  11. An Empirical Analysis of the Budget Deficit

    Directory of Open Access Journals (Sweden)

    Ioan Talpos

    2007-11-01

    Full Text Available Economic policies and, particularly, fiscal policies are not designed and implemented in an “empty space”: the structural characteristics of the economic systems, the institutional architecture of societies, the cultural paradigm and the power relations between different social groups define the borders of these policies. This paper tries to deal with these borders, to describe their nature and the implications of their existence for the quality and impact of fiscal policies, at a theoretical level as well as at an empirical one. The main results of the proposed analysis support the idea that the mentioned variables matter both for the social mandate entrusted by the society to the state, and thus for the role and functions of the state, and for economic growth as supported by the resources collected and distributed by the public authorities.

  12. EMPIRICAL ANALYSIS OF SEASONALITY PATTERNS IN TOURISM

    Directory of Open Access Journals (Sweden)

    Biljana Petrevska

    2013-04-01

    Full Text Available The paper attempts to empirically investigate the presence of seasonality patterns in tourism. For that purpose, the case of Macedonia is elaborated, introducing data on tourist arrivals for the period 1992-2012. The analysis is based upon the Gini coefficient, one of the most commonly applied indicators for measuring and expressing inequalities caused by temporary disorders. The computed data reject the research hypothesis and highlight new facts regarding seasonality in tourism demand in Macedonia. Namely, the outcomes point to the conclusion that seasonality is absent, i.e. tourism flow concentration is not significant to tourism development. Hence, this study underlines that the modest tourism results to date must not be attributed to seasonality as a strong and limiting factor for tourism development in Macedonia, since no such factor exists.
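
    The Gini coefficient used in the paper can be computed directly from, e.g., monthly arrival counts (a standard formulation; the threshold for judging seasonality significant is a separate modelling choice):

```python
def gini(values):
    # Gini coefficient of non-negative values (e.g. monthly tourist arrivals):
    # 0 = perfectly even distribution; values near 1 = strong concentration.
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return 2.0 * weighted / (n * total) - (n + 1) / n

# perfectly even arrivals over 12 months give 0;
# all arrivals in a single month give (n - 1) / n.
```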

  13. SUPPORT VECTOR MACHINE FOR STRUCTURAL RELIABILITY ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    LI Hong-shuang; LÜ Zhen-zhou; YUE Zhu-feng

    2006-01-01

    Support vector machine (SVM) was introduced to analyze the reliability of the implicit performance function, which is difficult to implement by the classical methods such as the first order reliability method (FORM) and the Monte Carlo simulation (MCS). As a classification method where the underlying structural risk minimization inference rule is employed, SVM possesses excellent learning capacity with a small amount of information and good capability of generalization over the complete data. Hence, two approaches, i.e., SVM-based FORM and SVM-based MCS, were presented for the structural reliability analysis of the implicit limit state function. Compared to the conventional response surface method (RSM) and the artificial neural network (ANN), which are widely used to replace the implicit state function for alleviating the computation cost, the more important advantages of SVM are that it can approximate the implicit function with higher precision and better generalization under a small amount of information and avoid the "curse of dimensionality". The SVM-based reliability approaches can approximate the actual performance function over the complete sampling data with a decreased number of implicit performance function analyses (usually finite element analyses), and the computational precision can satisfy the engineering requirement, as demonstrated by illustrations.

  14. A fast and reliable empirical approach for estimating solubility of crystalline drugs in polymers for hot melt extrusion formulations.

    Science.gov (United States)

    Kyeremateng, Samuel O; Pudlas, Marieke; Woehrle, Gerd H

    2014-09-01

    A novel empirical analytical approach for estimating solubility of crystalline drugs in polymers has been developed. The approach utilizes a combination of differential scanning calorimetry measurements and a reliable mathematical algorithm to construct complete solubility curve of a drug in polymer. Compared with existing methods, this novel approach reduces the required experimentation time and amount of material by approximately 80%. The predictive power and relevance of such solubility curves in development of amorphous solid dispersion (ASD) formulations are shown by applications to a number of hot-melt extrudate formulations of ibuprofen and naproxen in Soluplus. On the basis of the temperature-drug load diagrams using the solubility curves and the glass transition temperatures, physical stability of the extrudate formulations was predicted and checked by placing the formulations on real-time stability studies. An analysis of the stability samples with microscopy, thermal, and imaging techniques confirmed the predicted physical stability of the formulations. In conclusion, this study presents a fast and reliable approach for estimating solubility of crystalline drugs in polymer matrixes. This powerful approach can be applied by formulation scientists as an early and convenient tool in designing ASD formulations for maximum drug load and physical stability.

  15. Human reliability analysis of control room operators

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Isaac J.A.L.; Carvalho, Paulo Victor R.; Grecco, Claudio H.S. [Instituto de Engenharia Nuclear (IEN), Rio de Janeiro, RJ (Brazil)

    2005-07-01

    Human reliability is the probability that a person correctly performs some system-required action in a required time period and performs no extraneous action that can degrade the system. Human reliability analysis (HRA) is the analysis, prediction and evaluation of work-oriented human performance using indices such as human error likelihood and probability of task accomplishment. Significant progress has been made in the HRA field during the last years, mainly in the nuclear area. Some first-generation HRA methods were developed, such as THERP (Technique for Human Error Rate Prediction). Now, an array of so-called second-generation methods is emerging as alternatives, for instance ATHEANA (A Technique for Human Event Analysis). The ergonomics approach uses ergonomic work analysis as a tool. It focuses on the study of operators' activities in physical and mental form, considering at the same time the observed characteristics of the operator and the elements of the work environment as they are presented to and perceived by the operators. The aim of this paper is to propose a methodology to analyze the human reliability of the operators of industrial plant control rooms, using a framework that includes the approaches used by ATHEANA and THERP and the ergonomic work analysis. (author)

  16. ACCOUNTING POLICIES AND FINANCIAL ANALYSIS INTERDEPENDENCES - EMPIRICAL EVIDENCE

    Directory of Open Access Journals (Sweden)

    Nino Serdarević

    2011-06-01

    Full Text Available This paper presents empirical evidence on the interdependences between applied analysis and the accounting policies and estimates created within Bosnia and Herzegovina (BIH) private commercial entities, specifically targeting the practice-oriented relevance of financial indicators, non-financial indicators, enterprise resource planning and management accounting insight frequencies. Recently, standard setters (the International Accounting Standards Board and the International Federation of Accountants) have published outcomes of an internationally organized research effort on the usefulness of financial reports, recommending enforced usage of enterprise-relevant information, non-financial indicators and the implications of risks for asset and liability positions. These imply litigation and possible income smoothing. In regard to financial reporting reliability, many authors suggest accounting conservatism as a measure to compose risk assessment and the earnings response ratio. The author argues that recently suggested financial management measures involving cash and asset management, liquidity ratios and turns do not directly imply accounting information quality, prior computed within applied accounting conservatism.

  17. Reliability Analysis of Elasto-Plastic Structures

    DEFF Research Database (Denmark)

    1984-01-01

    . Failure of this type of system is defined either as formation of a mechanism or by failure of a prescribed number of elements. In the first case failure is independent of the order in which the elements fail, but this is not so by the second definition. The reliability analysis consists of two parts...... are described and the two definitions of failure can be used by the first formulation, but only the failure definition based on formation of a mechanism by the second formulation. The second part of the reliability analysis is an estimate of the failure probability for the structure on the basis...... are obtained if the failure mechanisms are used. Lower bounds can be calculated on the basis of series systems where the elements are the non-failed elements in a non-failed structure (see Augusti & Baratta [3])....

  18. Bridging Resilience Engineering and Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2010-06-01

    There has been strong interest in the new and emerging field called resilience engineering. This field has been quick to align itself with many existing safety disciplines, but it has also distanced itself from the field of human reliability analysis. To date, the discussion has been somewhat one-sided, with much discussion about the new insights afforded by resilience engineering. This paper presents an attempt to address resilience engineering from the perspective of human reliability analysis (HRA). It is argued that HRA shares much in common with resilience engineering and that, in fact, it can help strengthen nascent ideas in resilience engineering. This paper seeks to clarify and ultimately refute the arguments that have served to divide HRA and resilience engineering.

  19. Empirical mode decomposition analysis for visual stylometry.

    Science.gov (United States)

    Hughes, James M; Mao, Dong; Rockmore, Daniel N; Wang, Yang; Wu, Qiang

    2012-11-01

    In this paper, we show how the tools of empirical mode decomposition (EMD) analysis can be applied to the problem of “visual stylometry,” generally defined as the development of quantitative tools for the measurement and comparisons of individual style in the visual arts. In particular, we introduce a new form of EMD analysis for images and show that it is possible to use its output as the basis for the construction of effective support vector machine (SVM)-based stylometric classifiers. We present the methodology and then test it on collections of two sets of digital captures of drawings: a set of authentic and well-known imitations of works attributed to the great Flemish artist Pieter Bruegel the Elder (1525-1569) and a set of works attributed to Dutch master Rembrandt van Rijn (1606-1669) and his pupils. Our positive results indicate that EMD-based methods may hold promise generally as a technique for visual stylometry.

  20. Reliability analysis of wastewater treatment plants.

    Science.gov (United States)

    Oliveira, Sílvia C; Von Sperling, Marcos

    2008-02-01

    This article presents a reliability analysis of 166 full-scale wastewater treatment plants operating in Brazil. Six different processes have been investigated, comprising septic tank+anaerobic filter, facultative pond, anaerobic pond+facultative pond, activated sludge, upflow anaerobic sludge blanket (UASB) reactors alone and UASB reactors followed by post-treatment. A methodology developed by Niku et al. [1979. Performance of activated sludge process and reliability-based design. J. Water Pollut. Control Assoc., 51(12), 2841-2857] is used for determining the coefficients of reliability (COR), in terms of the compliance of effluent biochemical oxygen demand (BOD), chemical oxygen demand (COD), total suspended solids (TSS), total nitrogen (TN), total phosphorus (TP) and fecal or thermotolerant coliforms (FC) with discharge standards. The design concentrations necessary to meet the prevailing discharge standards and the expected compliance percentages have been calculated from the COR obtained. The results showed that few plants, under the observed operating conditions, would be able to present reliable performances considering the compliance with the analyzed standards. The article also discusses the importance of understanding the lognormal behavior of the data in setting up discharge standards, in interpreting monitoring results and compliance with the legislation.
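
    Niku et al.'s coefficient of reliability for lognormally distributed effluent quality has a closed form in the coefficient of variation and the desired compliance level (sketched below; the CV value used is plant-specific and hypothetical here):

```python
import math
from statistics import NormalDist

def coefficient_of_reliability(cv, reliability):
    # Niku et al. COR for lognormally distributed effluent concentrations:
    # design (mean) concentration = discharge standard * COR, where cv is the
    # coefficient of variation and reliability the target compliance level.
    z = NormalDist().inv_cdf(reliability)      # e.g. 0.95 for 95% compliance
    ln1cv2 = math.log(1.0 + cv * cv)
    return math.sqrt(1.0 + cv * cv) * math.exp(-z * math.sqrt(ln1cv2))

# e.g. 95% compliance with an observed effluent CV of 0.6:
cor = coefficient_of_reliability(0.6, 0.95)
# design_mean = discharge_standard * cor  (cor < 1, so the plant must be
# designed to operate below the standard on average)
```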

  1. Empirical analysis on risk of security investment

    Institute of Scientific and Technical Information of China (English)

    AN Peng; LI Sheng-hong

    2009-01-01

The paper analyzes the theory and application of the Markowitz Mean-Variance Model and the CAPM. First, it explains the development and standpoints of the two models and deduces them in detail. Then 30 stocks are chosen from the Shangzheng 50 and tested to determine whether Shanghai stock prices conform to the two models. Using time-series and panel-data techniques, the stock risk and efficient portfolios are analyzed with ORIGIN and MATLAB software. The results show that the Shanghai stock market conforms to the Markowitz Mean-Variance Model to a certain extent and can give investors reliable suggestions for gaining higher returns, but there is no positive relation between systematic risk and profit ratio, and the CAPM does not function well in China's security market.
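As a pointer for readers, the mean-variance machinery the paper tests has a simple closed form in the two-asset case; the sketch below uses made-up return and covariance figures, not estimates from the Shanghai data.

```python
def min_variance_weights(var1, var2, cov12):
    """Closed-form weights of the minimum-variance two-asset Markowitz
    portfolio (weights sum to 1; short sales permitted)."""
    w1 = (var2 - cov12) / (var1 + var2 - 2 * cov12)
    return w1, 1.0 - w1

def portfolio_stats(w1, w2, mu1, mu2, var1, var2, cov12):
    """Expected return and variance of the weighted portfolio."""
    ret = w1 * mu1 + w2 * mu2
    var = w1**2 * var1 + w2**2 * var2 + 2 * w1 * w2 * cov12
    return ret, var

# Illustrative annualized figures for two hypothetical stocks.
w1, w2 = min_variance_weights(var1=0.04, var2=0.09, cov12=0.006)
ret, var = portfolio_stats(w1, w2, 0.08, 0.12, 0.04, 0.09, 0.006)
```

Diversification shows up directly: the portfolio variance comes out below the variance of either stock held alone.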

  2. Representative Sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Petersen, Lars; Esbensen, Kim Harry

    2005-01-01

    regime in order to secure the necessary reliability of: samples (which must be representative, from the primary sampling onwards), analysis (which will not mean anything outside the miniscule analytical volume without representativity ruling all mass reductions involved, also in the laboratory) and data...... analysis (“data” do not exist in isolation of their provenance). The Total Sampling Error (TSE) is by far the dominating contribution to all analytical endeavours, often 100+ times larger than the Total Analytical Error (TAE).We present a summarizing set of only seven Sampling Unit Operations (SUOs...

  3. Empirical analysis of online human dynamics

    Science.gov (United States)

    Zhao, Zhi-Dan; Zhou, Tao

    2012-06-01

Patterns of human activity have attracted increasing academic interest, since a quantitative understanding of human behavior helps to uncover the origins of many socioeconomic phenomena. This paper focuses on the behavior of Internet users. Six large-scale systems are studied in our experiments, including movie-watching in Netflix and MovieLens, transactions in Ebay, bookmark-collecting in Delicious, and posting in FriendFeed and Twitter. Empirical analysis reveals some common statistical features of online human behavior: (1) the total number of a user's actions, the user's activity, and the interevent time all follow heavy-tailed distributions; (2) there exists a strongly positive correlation between a user's activity and the total number of the user's actions, and a significantly negative correlation between the user's activity and the width of the interevent time distribution. We further study the rescaling method and show that it can to some extent eliminate the differences in statistics among users caused by their different activities, although its effectiveness depends on the data sets.
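The interevent-time statistic at the center of findings like (1) and (2) above is simple to compute from an activity log; the timestamps below are made up for illustration.

```python
def interevent_times(timestamps):
    """Waiting times between a user's consecutive actions; the
    heavy-tailed distributions discussed above are distributions
    of exactly these gaps."""
    ts = sorted(timestamps)
    return [b - a for a, b in zip(ts, ts[1:])]

# A burst of activity followed by a long silence (illustrative log):
gaps = interevent_times([0, 1, 2, 10, 11, 300])
```

The mix of many short gaps and one very long one in even this toy log is the qualitative pattern that, at scale, produces a heavy tail.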

  4. Planning Irreversible Electroporation in the Porcine Kidney: Are Numerical Simulations Reliable for Predicting Empiric Ablation Outcomes?

    Energy Technology Data Exchange (ETDEWEB)

    Wimmer, Thomas, E-mail: thomas.wimmer@medunigraz.at; Srimathveeravalli, Govindarajan; Gutta, Narendra [Memorial Sloan-Kettering Cancer Center, Interventional Radiology Service, Department of Radiology (United States); Ezell, Paula C. [The Rockefeller University, Research Animal Resource Center, Memorial Sloan-Kettering Cancer Center, Weill Cornell Medical College (United States); Monette, Sebastien [The Rockefeller University, Laboratory of Comparative Pathology, Memorial Sloan-Kettering Cancer Center, Weill Cornell Medical College (United States); Maybody, Majid; Erinjery, Joseph P.; Durack, Jeremy C. [Memorial Sloan-Kettering Cancer Center, Interventional Radiology Service, Department of Radiology (United States); Coleman, Jonathan A. [Memorial Sloan-Kettering Cancer Center, Urology Service, Department of Surgery (United States); Solomon, Stephen B. [Memorial Sloan-Kettering Cancer Center, Interventional Radiology Service, Department of Radiology (United States)

    2015-02-15

Purpose: Numerical simulations are used for treatment planning in clinical applications of irreversible electroporation (IRE) to determine ablation size and shape. To assess the reliability of simulations for treatment planning, we compared simulation results with empiric outcomes of renal IRE using computed tomography (CT) and histology in an animal model. Methods: The ablation size and shape for six different IRE parameter sets (70–90 pulses, 2,000–2,700 V, 70–100 µs) for monopolar and bipolar electrodes were simulated using a numerical model. Employing these treatment parameters, 35 CT-guided IRE ablations were created in both kidneys of six pigs and followed up with CT immediately and after 24 h. Histopathology was analyzed from postablation day 1. Results: Ablation zones on CT measured 81 ± 18 % (day 0, p ≤ 0.05) and 115 ± 18 % (day 1, p ≤ 0.09) of the simulated size for monopolar electrodes, and 190 ± 33 % (day 0, p ≤ 0.001) and 234 ± 12 % (day 1, p ≤ 0.0001) for bipolar electrodes. Histopathology indicated smaller ablation zones than simulated (71 ± 41 %, p ≤ 0.047) and measured on CT (47 ± 16 %, p ≤ 0.005), with complete ablation of kidney parenchyma within the central zone and incomplete ablation in the periphery. Conclusion: Both numerical simulations for planning renal IRE and CT measurements may overestimate the size of ablation compared to histology, and ablation effects may be incomplete in the periphery.

  5. Reliability of photographic posture analysis of adolescents.

    Science.gov (United States)

    Hazar, Zeynep; Karabicak, Gul Oznur; Tiftikci, Ugur

    2015-10-01

[Purpose] Postural problems of adolescents need to be evaluated accurately because they may lead to greater problems in the musculoskeletal system as adolescents develop. Although photographic posture analysis is frequently used, simpler and more accessible methods are still needed. The purpose of this study was to investigate the inter- and intra-rater reliability of photographic posture analysis using MB-Ruler software. [Subjects and Methods] Subjects were 30 adolescents (15 girls and 15 boys; mean age: 16.4±0.4 years, mean height: 166.3±6.7 cm, mean weight: 63.8±15.1 kg), and photographs of their habitual standing posture were taken in the sagittal plane. For the evaluation of postural angles, reflective markers were placed on anatomical landmarks. For angular measurements, MB-Ruler (Markus Bader - MB Software Solutions, triangular screen ruler) was used. Photographic evaluations were performed by two observers, with a repetition after a week. Test-retest and inter-rater reliability were calculated using intra-class correlation coefficients (ICC). [Results] Inter-rater (ICC>0.972) and test-retest (ICC>0.774) reliability were found to be in the range of acceptable to excellent. [Conclusion] Reference angles for postural evaluation were found to be reliable and repeatable. The present method is easy and non-invasive, and it may be utilized by researchers in search of an alternative method for photographic postural assessment.
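Intra-class correlation coefficients like those reported above come in several variants; as an illustration, here is the two-way consistency form ICC(3,1) of Shrout and Fleiss, computed from the subject-by-rater ANOVA decomposition. The abstract does not specify which ICC form the study used, so treat this as a representative sketch rather than the paper's exact procedure.

```python
def icc_3_1(ratings):
    """ICC(3,1), two-way mixed model, consistency, single rater:
    (MSR - MSE) / (MSR + (k - 1) * MSE), with mean squares from the
    subject-by-rater ANOVA. `ratings` holds one tuple of k rater
    scores per subject."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(r) for r in ratings) / (n * k)
    subj_means = [sum(r) / k for r in ratings]
    rater_means = [sum(r[j] for r in ratings) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for r in ratings for x in r)
    ss_rows = k * sum((m - grand) ** 2 for m in subj_means)   # subjects
    ss_cols = n * sum((m - grand) ** 2 for m in rater_means)  # raters
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse)
```

Two raters who differ only by a constant offset still get ICC(3,1) = 1, because the consistency form ignores systematic rater bias; agreement forms such as ICC(2,1) would penalize it.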

  6. EMPIRICAL-NUMERICAL ANALYSIS OF HEADCUT MIGRATION

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

Headcut migration is studied using empirical and numerical modeling approaches. Empirical formulas for headcut migration are established using available measurement data; they consider not only the flow strength but also the properties of the soil. A numerical model for headcut migration is proposed, in which the influences of the dynamic pressure gradient, downward flow, and bed slope on sediment entrainment are considered. The local erosion patterns and migration speeds of the headcut calculated by the numerical model agree reasonably well with observed data.

  7. Representative Sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Petersen, Lars; Esbensen, Kim Harry

    2005-01-01

    The Theory of Sampling (TOS) provides a description of all errors involved in sampling of heterogeneous materials as well as all necessary tools for their evaluation, elimination and/or minimization. This tutorial elaborates on—and illustrates—selected central aspects of TOS. The theoretical...... regime in order to secure the necessary reliability of: samples (which must be representative, from the primary sampling onwards), analysis (which will not mean anything outside the miniscule analytical volume without representativity ruling all mass reductions involved, also in the laboratory) and data...

  8. Reliability Analysis of Adhesive Bonded Scarf Joints

    DEFF Research Database (Denmark)

    Kimiaeifar, Amin; Toft, Henrik Stensgaard; Lund, Erik;

    2012-01-01

    A probabilistic model for the reliability analysis of adhesive bonded scarfed lap joints subjected to static loading is developed. It is representative for the main laminate in a wind turbine blade subjected to flapwise bending. The structural analysis is based on a three dimensional (3D) finite...... the FEA model, and a sensitivity analysis on the influence of various geometrical parameters and material properties on the maximum stress is conducted. Because the yield behavior of many polymeric structural adhesives is dependent on both deviatoric and hydrostatic stress components, different ratios...... of the compressive to tensile adhesive yield stresses in the failure criterion are considered. It is shown that the chosen failure criterion, the scarf angle and the load are significant for the assessment of the probability of failure....

  9. RELAV - RELIABILITY/AVAILABILITY ANALYSIS PROGRAM

    Science.gov (United States)

    Bowerman, P. N.

    1994-01-01

    RELAV (Reliability/Availability Analysis Program) is a comprehensive analytical tool to determine the reliability or availability of any general system which can be modeled as embedded k-out-of-n groups of items (components) and/or subgroups. Both ground and flight systems at NASA's Jet Propulsion Laboratory have utilized this program. RELAV can assess current system performance during the later testing phases of a system design, as well as model candidate designs/architectures or validate and form predictions during the early phases of a design. Systems are commonly modeled as System Block Diagrams (SBDs). RELAV calculates the success probability of each group of items and/or subgroups within the system assuming k-out-of-n operating rules apply for each group. The program operates on a folding basis; i.e. it works its way towards the system level from the most embedded level by folding related groups into single components. The entire folding process involves probabilities; therefore, availability problems are performed in terms of the probability of success, and reliability problems are performed for specific mission lengths. An enhanced cumulative binomial algorithm is used for groups where all probabilities are equal, while a fast algorithm based upon "Computing k-out-of-n System Reliability", Barlow & Heidtmann, IEEE TRANSACTIONS ON RELIABILITY, October 1984, is used for groups with unequal probabilities. Inputs to the program include a description of the system and any one of the following: 1) availabilities of the items, 2) mean time between failures and mean time to repairs for the items from which availabilities are calculated, 3) mean time between failures and mission length(s) from which reliabilities are calculated, or 4) failure rates and mission length(s) from which reliabilities are calculated. The results are probabilities of success of each group and the system in the given configuration. RELAV assumes exponential failure distributions for
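The group-level computation described above can be illustrated independently of RELAV itself: a cumulative binomial covers the equal-probability case, and a small dynamic program over the number of working components handles unequal probabilities in the spirit of the Barlow & Heidtmann algorithm. This is a sketch of the technique, not RELAV's actual implementation.

```python
from math import comb

def k_out_of_n_equal(k, n, p):
    """Cumulative binomial: P(at least k of n identical components work)."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

def k_out_of_n_reliability(k, probs):
    """General case: dynamic program over the count of working components.
    dist[j] holds P(exactly j of the components processed so far work)."""
    n = len(probs)
    dist = [1.0] + [0.0] * n
    for p in probs:
        for j in range(n, 0, -1):
            dist[j] = dist[j] * (1 - p) + dist[j - 1] * p
        dist[0] *= 1 - p
    return sum(dist[k:])

r_equal = k_out_of_n_equal(2, 3, 0.9)                  # 2-out-of-3, p = 0.9
r_mixed = k_out_of_n_reliability(2, [0.9, 0.8, 0.95])  # unequal components
```

Folding a group into a single "component" whose success probability is r_equal or r_mixed is exactly the kind of step RELAV repeats from the most embedded level up to the system level.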

  10. Integrated Reliability and Risk Analysis System (IRRAS)

    Energy Technology Data Exchange (ETDEWEB)

Russell, K. D.; McKay, M. K.; Sattison, M. B.; Skinner, N. L.; Wood, S. T. [EG and G Idaho, Inc., Idaho Falls, ID (United States)]; Rasmuson, D. M. [Nuclear Regulatory Commission, Washington, DC (United States)

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance.

  11. Advancing Usability Evaluation through Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2005-07-01

This paper introduces a novel augmentation to the current heuristic usability evaluation methodology. The SPAR-H human reliability analysis method was developed for categorizing human performance in nuclear power plants. Despite the specialized use of SPAR-H for safety critical scenarios, the method also holds promise for use in commercial off-the-shelf software usability evaluations. The SPAR-H method shares task analysis underpinnings with human-computer interaction, and it can be easily adapted to incorporate usability heuristics as performance shaping factors. By assigning probabilistic modifiers to heuristics, it is possible to arrive at the usability error probability (UEP). This UEP is not a literal probability of error but nonetheless provides a quantitative basis for heuristic evaluation. When combined with a consequence matrix for usability errors, this method affords ready prioritization of usability issues.
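The quantification step described above amounts to multiplying a nominal error probability by performance-shaping-factor multipliers, as in SPAR-H; the nominal value and multipliers below are invented for illustration and are not the paper's calibrated figures.

```python
def usability_error_probability(nominal_hep, psf_multipliers):
    """SPAR-H-style sketch: scale a nominal human error probability by
    the multipliers assigned to violated usability heuristics, capping
    the result at 1.0 so it remains a probability."""
    uep = nominal_hep
    for m in psf_multipliers:
        uep *= m
    return min(uep, 1.0)

# Illustrative: nominal HEP of 0.01 degraded by two violated heuristics
# with hypothetical multipliers of 10 and 2.
uep = usability_error_probability(0.01, [10, 2])
```

Even as a non-literal probability, the resulting UEP ranks usability issues: larger products flag heuristics whose violation is modeled as more consequential.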

  12. Compassion: An Evolutionary Analysis and Empirical Review

    Science.gov (United States)

    Goetz, Jennifer L.; Keltner, Dacher; Simon-Thomas, Emiliana

    2010-01-01

    What is compassion? And how did it evolve? In this review, we integrate 3 evolutionary arguments that converge on the hypothesis that compassion evolved as a distinct affective experience whose primary function is to facilitate cooperation and protection of the weak and those who suffer. Our empirical review reveals compassion to have distinct…

  13. Typology of Empirical Attributes: Dissimilarity Linkage Analysis (DLA).

    Science.gov (United States)

    Dubin, Robert; Champoux, Joseph E.

    Dissimilarity Linkage Analysis (DLA) is an extremely simple procedure for developing a typology from empirical attributes that permits the clustering of entities. First the procedure develops a taxonomy of types from empirical attributes possessed by entities in the sample. Second, the procedure assigns entities to one, and only one, type in the…

  14. Reliability Analysis of Tubular Joints in Offshore Structures

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Sørensen, John Dalsgaard

    1987-01-01

Reliability analysis of single tubular joints and offshore platforms with tubular joints is presented. The failure modes considered are yielding, punching, buckling and fatigue failure. Element reliability as well as systems reliability approaches are used and illustrated by several examples....... Finally, optimal design of tubular joints with reliability constraints is discussed and illustrated by an example....

  15. Software Architecture Reliability Analysis using Failure Scenarios

    NARCIS (Netherlands)

    Tekinerdogan, B.; Sözer, Hasan; Aksit, Mehmet

    With the increasing size and complexity of software in embedded systems, software has now become a primary threat for the reliability. Several mature conventional reliability engineering techniques exist in literature but traditionally these have primarily addressed failures in hardware components

  16. HUMAN RELIABILITY ANALYSIS FOR COMPUTERIZED PROCEDURES

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman; Katya Le Blanc

    2011-09-01

    This paper provides a characterization of human reliability analysis (HRA) issues for computerized procedures in nuclear power plant control rooms. It is beyond the scope of this paper to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper provides a review of HRA as applied to traditional paper-based procedures, followed by a discussion of what specific factors should additionally be considered in HRAs for computerized procedures. Performance shaping factors and failure modes unique to computerized procedures are highlighted. Since there is no definitive guide to HRA for paper-based procedures, this paper also serves to clarify the existing guidance on paper-based procedures before delving into the unique aspects of computerized procedures.

  17. Human Reliability Analysis for Small Modular Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2012-06-01

Because no human reliability analysis (HRA) method was specifically developed for small modular reactors (SMRs), the application of any current HRA method to SMRs represents tradeoffs. A first-generation HRA method like THERP provides clearly defined activity types, but these activity types do not map to the human-system interface or concept of operations confronting SMR operators. A second-generation HRA method like ATHEANA is flexible enough to be used for SMR applications, but there is currently insufficient guidance for the analyst, requiring considerably more first-of-a-kind analyses and extensive SMR expertise in order to complete a quality HRA. Although no current HRA method is optimized to SMRs, it is possible to use existing HRA methods to identify errors, incorporate them as human failure events in the probabilistic risk assessment (PRA), and quantify them. In this paper, we provided preliminary guidance to assist the human reliability analyst and reviewer in understanding how to apply current HRA methods to the domain of SMRs. While it is possible to perform a satisfactory HRA using existing HRA methods, ultimately it is desirable to formally incorporate SMR considerations into the methods. This may require the development of new HRA methods. More practicably, existing methods need to be adapted to incorporate SMRs. Such adaptations may take the form of guidance on the complex mapping between conventional light water reactors and small modular reactors. While many behaviors and activities are shared between current plants and SMRs, the methods must adapt if they are to perform a valid and accurate analysis of plant personnel performance in SMRs.

  18. [Qualitative analysis: theory, steps and reliability].

    Science.gov (United States)

    Minayo, Maria Cecília de Souza

    2012-03-01

    This essay seeks to conduct in-depth analysis of qualitative research, based on benchmark authors and the author's own experience. The hypothesis is that in order for an analysis to be considered reliable, it needs to be based on structuring terms of qualitative research, namely the verbs 'comprehend' and 'interpret', and the nouns 'experience', 'common sense' and 'social action'. The 10 steps begin with the construction of the scientific object by its inclusion on the national and international agenda; the development of tools that make the theoretical concepts tangible; conducting field work that involves the researcher empathetically with the participants in the use of various techniques and approaches, making it possible to build relationships, observations and a narrative with perspective. Finally, the author deals with the analysis proper, showing how the object, which has already been studied in all the previous steps, should become a second-order construct, in which the logic of the actors in their diversity and not merely their speech predominates. The final report must be a theoretic, contextual, concise and clear narrative.

  19. Task Decomposition in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory

    2014-06-01

In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human-factors-driven approaches would tend to look for opportunities for human error first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down, defined as a subset of the PSA, whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up, derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  20. Reliability and risk analysis data base development: an historical perspective

    Energy Technology Data Exchange (ETDEWEB)

    Fragola, Joseph R

    1996-02-01

    Collection of empirical data and data base development for use in the prediction of the probability of future events has a long history. Dating back at least to the 17th century, safe passage events and mortality events were collected and analyzed to uncover prospective underlying classes and associated class attributes. Tabulations of these developed classes and associated attributes formed the underwriting basis for the fledgling insurance industry. Much earlier, master masons and architects used design rules of thumb to capture the experience of the ages and thereby produce structures of incredible longevity and reliability (Antona, E., Fragola, J. and Galvagni, R. Risk based decision analysis in design. Fourth SRA Europe Conference Proceedings, Rome, Italy, 18-20 October 1993). These rules served so well in producing robust designs that it was not until almost the 19th century that the analysis (Charlton, T.M., A History Of Theory Of Structures In The 19th Century, Cambridge University Press, Cambridge, UK, 1982) of masonry voussoir arches, begun by Galileo some two centuries earlier (Galilei, G. Discorsi e dimostrazioni mathematiche intorno a due nuove science, (Discourses and mathematical demonstrations concerning two new sciences, Leiden, The Netherlands, 1638), was placed on a sound scientific basis. Still, with the introduction of new materials (such as wrought iron and steel) and the lack of theoretical knowledge and computational facilities, approximate methods of structural design abounded well into the second half of the 20th century. To this day structural designers account for material variations and gaps in theoretical knowledge by employing factors of safety (Benvenuto, E., An Introduction to the History of Structural Mechanics, Part II: Vaulted Structures and Elastic Systems, Springer-Verlag, NY, 1991) or codes of practice (ASME Boiler and Pressure Vessel Code, ASME, New York) originally developed in the 19th century (Antona, E., Fragola, J. 
and

  1. An Empirical Analysis of Perceptual Judgments

    Directory of Open Access Journals (Sweden)

    Nicholas Ray

    2014-12-01

Full Text Available This paper is a defense of Reformed Empiricism, especially against those critics who take Reformed Empiricism to be a viable account of empirical rationality only if it avails itself of certain rationalist assumptions that are inconsistent with empiricism. I argue against three broad types of criticism found in the current literature, and propose a way of characterising Gupta's constraints for any model of experience as analytic of empiricism itself, avoiding the charge by some (e.g., McDowell, Berker, and Schafer) who think that the constraints are substantive.

  2. Empirical Analysis of the Online Rating Systems

    CERN Document Server

    Lu, Xin-Yi; Guo, Qiang; Liu, Jian-Guo

    2015-01-01

This paper analyzes the properties of evolving bipartite networks from four aspects: the growth of the networks, the degree distribution, the popularity of objects, and the diversity of user behaviours, leading to a deep understanding of the empirical data. Through empirical studies of data from the online bookstore Amazon and the question-and-answer site Stack Overflow, both rating bipartite networks, we reveal rules for the evolution of bipartite networks. These rules have significant practical meaning for maintaining the operation of real systems and preparing for their future development. We find that the degree distribution of users follows a power law with an exponential cutoff. Also, according to the evolution of popularity for objects, we find that large-degree objects tend to receive more new ratings than expected given their current degrees, while small-degree objects receive fewer ratings in terms of their degrees. Moreover, the user behaviours show such a trend that the l...

  3. Reliability analysis of an associated system

    Institute of Scientific and Technical Information of China (English)

    陈长杰; 魏一鸣; 蔡嗣经

    2002-01-01

Based on the engineering reliability of large complex systems and the distinct characteristics of soft systems, new concepts and theory concerning medium elements and the associated system are developed, and a reliability logic model of the associated system is provided. In this paper, through field investigation of the trial operation, the engineering reliability of the paste fill system in the No.2 mine of Jinchuan Non-ferrous Metallic Corporation is analyzed using the theory of the associated system.

  4. Sensitivity Analysis for the System Reliability Function

    Science.gov (United States)

    1987-12-01

reliabilities. The unique feature of the approach is that sample data collected on K independent replications using a specified component reliability ... Carlo method. The polynomial-time algorithm of Agrawal and Satyanarayana for the exact reliability computation for series-parallel systems exemplifies ... consideration. As an example for the s-t connectedness problem, let ... denote edge-disjoint minimal s-t paths of G and let ... denote edge-disjoint

  5. A Novel Two-Terminal Reliability Analysis for MANET

    OpenAIRE

    Xibin Zhao; Zhiyang You; Hai Wan

    2013-01-01

Mobile ad hoc network (MANET) is a dynamic wireless communication network. Because of its dynamic and infrastructureless characteristics, MANET is vulnerable in terms of reliability. This paper presents a novel reliability analysis for MANET. The effect of node mobility and the node reliability, based on a real MANET platform, are modeled and analyzed. An effective Monte Carlo method for reliability analysis is proposed, and a detailed evaluation is performed in terms of the experiment results.
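The Monte Carlo step of such an analysis can be sketched in its simplest static form: sample each link's survival independently and count the fraction of trials in which the source and terminal stay connected. This ignores the paper's node-mobility model and assumes fixed edges with a common reliability, so it illustrates the general method rather than the paper's procedure.

```python
import random
from collections import deque

def two_terminal_reliability(nodes, edges, s, t, edge_rel, trials=20000, seed=1):
    """Monte Carlo estimate of two-terminal (s-t) reliability: each edge
    survives independently with probability edge_rel; return the fraction
    of trials in which s can still reach t (checked by BFS)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        adj = {v: [] for v in nodes}
        for u, v in edges:
            if rng.random() < edge_rel:
                adj[u].append(v)
                adj[v].append(u)
        seen, queue = {s}, deque([s])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    queue.append(w)
        hits += t in seen
    return hits / trials

# Four-node ring: two edge-disjoint s-t paths, so the exact answer is
# 1 - (1 - 0.9**2)**2 = 0.9639, which the estimate should approach.
est = two_terminal_reliability([0, 1, 2, 3],
                               [(0, 1), (1, 2), (2, 3), (3, 0)], 0, 2, 0.9)
```

Exact two-terminal reliability is #P-hard in general, which is why sampling estimators like this one are the workhorse for nontrivial topologies.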

  6. A Novel Two-Terminal Reliability Analysis for MANET

    Directory of Open Access Journals (Sweden)

    Xibin Zhao

    2013-01-01

Full Text Available Mobile ad hoc network (MANET) is a dynamic wireless communication network. Because of its dynamic and infrastructureless characteristics, MANET is vulnerable in terms of reliability. This paper presents a novel reliability analysis for MANET. The effect of node mobility and the node reliability, based on a real MANET platform, are modeled and analyzed. An effective Monte Carlo method for reliability analysis is proposed, and a detailed evaluation is performed in terms of the experiment results.

  7. Solving reliability analysis problems in the polar space

    OpenAIRE

    Ghasem Ezzati; Musa Mammadov; Siddhivinayak Kulkarni

    2014-01-01

An optimization model widely used in engineering problems is Reliability-Based Design Optimization (RBDO). Input data of the RBDO are non-deterministic and constraints are probabilistic. The RBDO aims at minimizing cost while ensuring that reliability is at least at an accepted level. Reliability analysis is an important step in two-level RBDO approaches. Although many methods have been introduced for the reliability analysis loop of the RBDO, there are still many drawbacks in their efficie...

  8. Reliability Analysis and Optimal Design of Monolithic Vertical Wall Breakwaters

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Burcharth, Hans F.; Christiani, E.

    1994-01-01

    Reliability analysis and reliability-based design of monolithic vertical wall breakwaters are considered. Probabilistic models of the most important failure modes, sliding failure, failure of the foundation and overturning failure are described . Relevant design variables are identified and relia......Reliability analysis and reliability-based design of monolithic vertical wall breakwaters are considered. Probabilistic models of the most important failure modes, sliding failure, failure of the foundation and overturning failure are described . Relevant design variables are identified...

  9. Reliability in Cross-National Content Analysis.

    Science.gov (United States)

    Peter, Jochen; Lauf, Edmund

    2002-01-01

    Investigates how coder characteristics such as language skills, political knowledge, coding experience, and coding certainty affected inter-coder and coder-training reliability. Shows that language skills influenced both reliability types. Suggests that cross-national researchers should pay more attention to cross-national assessments of…

  10. Software architecture reliability analysis using failure scenarios

    NARCIS (Netherlands)

    Tekinerdogan, Bedir; Sozer, Hasan; Aksit, Mehmet

    2008-01-01

    With the increasing size and complexity of software in embedded systems, software has now become a primary threat for the reliability. Several mature conventional reliability engineering techniques exist in literature but traditionally these have primarily addressed failures in hardware components a

  11. Software reliability experiments data analysis and investigation

    Science.gov (United States)

    Walker, J. Leslie; Caglayan, Alper K.

    1991-01-01

    The objectives are to investigate the fundamental reasons which cause independently developed software programs to fail dependently, and to examine fault tolerant software structures which maximize reliability gain in the presence of such dependent failure behavior. The authors used 20 redundant programs from a software reliability experiment to analyze the software errors causing coincident failures, to compare the reliability of N-version and recovery block structures composed of these programs, and to examine the impact of diversity on software reliability using subpopulations of these programs. The results indicate that both conceptually related and unrelated errors can cause coincident failures and that recovery block structures offer more reliability gain than N-version structures if acceptance checks that fail independently from the software components are available. The authors present a theory of general program checkers that have potential application for acceptance tests.

  12. Reliability Analysis of Slope Stability by Central Point Method

    OpenAIRE

    Li, Chunge; WU Congliang

    2015-01-01

    Given the uncertainty and variability of slope stability analysis parameters, this paper proceeds from the perspective of probability theory and statistics, based on reliability theory. Through the central point method of reliability analysis, a performance function for the reliability of slope stability analysis is established. Furthermore, the central point method and conventional limit equilibrium methods are compared through a calculation example. The approach's numerical ...

  13. Individual Differences in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe; Ronald L. Boring

    2014-06-01

    While human reliability analysis (HRA) methods include uncertainty in quantification, the nominal model of human error in HRA typically assumes that operator performance does not vary significantly when they are given the same initiating event, indicators, procedures, and training, and that any differences in operator performance are simply aleatory (i.e., random). While this assumption generally holds true when performing routine actions, variability in operator response has been observed in multiple studies, especially in complex situations that go beyond training and procedures. As such, complexity can lead to differences in operator performance (e.g., operator understanding and decision-making). Furthermore, psychological research has shown that there are a number of known antecedents (i.e., attributable causes) that consistently contribute to observable and systematically measurable (i.e., not random) differences in behavior. This paper reviews examples of individual differences taken from operational experience and the psychological literature. The impact of these differences in human behavior and their implications for HRA are then discussed. We propose that individual differences should not be treated as aleatory, but rather as epistemic. Ultimately, by understanding the sources of individual differences, it is possible to remove some epistemic uncertainty from analyses.

  14. RELIABILITY ANALYSIS OF POWER DISTRIBUTION SYSTEMS

    Directory of Open Access Journals (Sweden)

    Popescu V.S.

    2012-04-01

    Power distribution systems are basic parts of power systems, and the reliability of these systems is at present a key issue for power engineering development that requires special attention. Operation of distribution systems is accompanied by a number of random factors that produce a large number of unplanned interruptions. Research has shown that the predominant factors with a significant influence on the reliability of distribution systems are: weather conditions (39.7%), defects in equipment (25%) and unknown random factors (20.1%). The article studies the influence of this random behavior and presents reliability estimates for predominantly rural electrical distribution systems.

  15. Reliability Analysis on English Writing Test of SHSEE in Shanghai

    Institute of Scientific and Technical Information of China (English)

    黄玉麒; 黄芳

    2014-01-01

    As a subjective test, the validity of the writing test is acceptable. But what about its reliability? The writing test occupies a special position in the senior high school entrance examination (SHSEE for short), so it is important to ensure its reliability. Through an analysis of recent years' English writing items in the SHSEE, the author offers suggestions on how to guarantee the reliability of writing tests.

  16. Analysis on Some of Software Reliability Models

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The software reliability & maintainability evaluation tool (SRMET 3.0), developed by the Software Evaluation and Test Center of China Aerospace Mechanical Corporation, is introduced in detail in this paper. SRMET 3.0 is supported by seven software reliability models and four software maintainability models. Numerical characteristics of all those models are studied in depth, and corresponding numerical algorithms for each model are also given.

  17. System reliability analysis for kinematic performance of planar mechanisms

    Institute of Scientific and Technical Information of China (English)

    ZHANG YiMin; HUANG XianZhen; ZHANG XuFang; HE XiangDong; WEN BangChun

    2009-01-01

    Based on the reliability and mechanism kinematic accuracy theories, we propose a general methodology for system reliability analysis of kinematic performance of planar mechanisms. The loop closure equations are used to estimate the kinematic performance errors of planar mechanisms. Reliability and system reliability theories are introduced to develop the limit state functions (LSF) for failure of kinematic performance qualities. The statistical fourth moment method and the Edgeworth series technique are used on system reliability analysis for kinematic performance of planar mechanisms, which relax the restrictions of probability distribution of design variables. Finally, the practicality, efficiency and accuracy of the proposed method are demonstrated by numerical examples.

  18. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Laurids Boring

    2010-11-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  19. A Temporal Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Hilger, Klaus Baggesen; Andersen, Ole Baltazar

    2002-01-01

    This paper describes the application of temporal maximum autocorrelation factor analysis to global monthly mean values of 1996-1997 sea surface temperature (SST) and sea surface height (SSH) data. This type of analysis can be considered as an extension of traditional empirical orthogonal function...

  20. Empirical and theoretical analysis of complex systems

    Science.gov (United States)

    Zhao, Guannan

    structures evolve on a similar timescale to individual level transmission, we investigated the process of transmission through a model population comprising social groups which follow simple dynamical rules for growth and break-up, and the profiles produced bear a striking resemblance to empirical data obtained from social, financial and biological systems. Finally, for better implementation of a widely accepted power law test algorithm, we have developed a fast testing procedure using parallel computation.

  1. Analysis on testing and operational reliability of software

    Institute of Scientific and Technical Information of China (English)

    ZHAO Jing; LIU Hong-wei; CUI Gang; WANG Hui-qiang

    2008-01-01

    Software reliability was estimated based on NHPP software reliability growth models. Testing reliability and operational reliability may be essentially different. On the basis of an analysis of the similarities and differences between the testing phase and the operational phase, and using the concepts of operational reliability and testing reliability, different forms of the comparison between the operational failure ratio and the predicted testing failure ratio were conducted, and a detailed mathematical discussion and analysis were performed. Finally, optimal software release was studied using software failure data. The results show that two kinds of conclusions can be derived by applying this method: one is to continue testing to meet the required reliability level of users; the other is that testing stops when the required operational reliability is met, thus reducing the testing cost.
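
    The record above mentions NHPP software reliability growth models without naming a specific one. As a hedged illustration only, the sketch below fits the Goel-Okumoto model m(t) = a(1 - e^(-bt)), one common NHPP form but not necessarily the one used in the paper; the weekly failure counts and the grid-search fitting routine are invented for demonstration.

```python
import numpy as np

def goel_okumoto(t, a, b):
    """Expected cumulative failures by time t under the Goel-Okumoto NHPP model."""
    return a * (1.0 - np.exp(-b * t))

# Synthetic (hypothetical) weekly cumulative failure counts, noiseless for clarity.
t = np.arange(1, 21, dtype=float)
counts = goel_okumoto(t, 100.0, 0.15)

# Crude 1-D search over b; for a fixed b, the least-squares a has a closed form.
best = None
for b in np.linspace(0.01, 1.0, 1000):
    shape = 1.0 - np.exp(-b * t)
    a = (shape @ counts) / (shape @ shape)
    sse = np.sum((counts - a * shape) ** 2)
    if best is None or sse < best[0]:
        best = (sse, a, b)
_, a_hat, b_hat = best

# Failure intensity lambda(t) = a*b*exp(-b*t); a release decision could compare
# its current value against a target failure ratio.
intensity = a_hat * b_hat * np.exp(-b_hat * t)
```

    The fitted intensity curve is what a testing-versus-operational failure ratio comparison would be built on.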

  2. Reliability estimation in a multilevel confirmatory factor analysis framework.

    Science.gov (United States)

    Geldhof, G John; Preacher, Kristopher J; Zyphur, Michael J

    2014-03-01

    Scales with varying degrees of measurement reliability are often used in the context of multistage sampling, where variance exists at multiple levels of analysis (e.g., individual and group). Because methodological guidance on assessing and reporting reliability at multiple levels of analysis is currently lacking, we discuss the importance of examining level-specific reliability. We present a simulation study and an applied example showing different methods for estimating multilevel reliability using multilevel confirmatory factor analysis and provide supporting Mplus program code. We conclude that (a) single-level estimates will not reflect a scale's actual reliability unless reliability is identical at each level of analysis, (b) 2-level alpha and composite reliability (omega) perform relatively well in most settings, (c) estimates of maximal reliability (H) were more biased when estimated using multilevel data than either alpha or omega, and (d) small cluster size can lead to overestimates of reliability at the between level of analysis. We also show that Monte Carlo confidence intervals and Bayesian credible intervals closely reflect the sampling distribution of reliability estimates under most conditions. We discuss the estimation of credible intervals using Mplus and provide R code for computing Monte Carlo confidence intervals.
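
    The multilevel alpha and omega estimates in the abstract require multilevel CFA software such as Mplus. As a minimal single-level sketch only (assuming a complete numeric item matrix, not the paper's two-level setup), Cronbach's alpha can be computed directly:

```python
import numpy as np

def cronbach_alpha(items):
    """Single-level Cronbach's alpha; items is an (n_respondents, k_items) array."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of the scale total
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Hypothetical 5-respondent, 3-item scale.
scores = [[4, 5, 4], [2, 3, 2], [5, 5, 4], [1, 2, 1], [3, 3, 3]]
alpha = cronbach_alpha(scores)
```

    Per the paper's conclusion (a), this single-level value will only match the level-specific reliabilities if they happen to be identical across levels.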

  3. Mechanical reliability analysis of tubes intended for hydrocarbons

    Energy Technology Data Exchange (ETDEWEB)

    Nahal, Mourad; Khelif, Rabia [Badji Mokhtar University, Annaba (Algeria)

    2013-02-15

    Reliability analysis constitutes an essential phase in any study concerning reliability. Many industrialists evaluate and improve the reliability of their products during the development cycle - from design to startup (design, manufacture, and exploitation) - to develop their knowledge of the cost/reliability ratio and to control sources of failure. In this study, we obtain results for hardness, tensile, and hydrostatic tests carried out on steel tubes for transporting hydrocarbons, followed by statistical analysis. The results obtained allow us to conduct a reliability study based on resistance and demand. Thus, the reliability index is calculated and the importance of the variables related to the tube is presented. Reliability-based assessment of residual stress effects is applied to underground pipelines under a roadway, with and without active corrosion. Residual stress has been found to greatly increase the probability of failure, especially in the early stages of pipe lifetime.

  4. Empirical Research Concerning the Impact of the Public Internal Audit on the Accounting System and its Reliability in Romanian Universities

    Directory of Open Access Journals (Sweden)

    Drăguşin Cristina-Petrina

    2016-12-01

    The present paper reports an empirical study of the impact of internal audit on the accounting system and its reliability in the case of public universities in Romania. To carry out the study, it was necessary to learn the different points of view of representatives of the accounting departments of public institutions of academic education, using a statistical survey based on a questionnaire. The research objectives were focused on obtaining conclusions regarding: the importance of internal auditing of the accounting system and its reliability; the extent to which internal audit manages to provide reasonable assurance regarding accounting and financial activity; the importance in auditing of the items related to the accounting activity; the assurance and adequacy of the human resources allocated to the internal audit departments; the frequency with which internal audit report drafts are modified to follow the audited structure's recommendations; the extent to which the audit reports reflect reality; and the contribution of internal audit activity to improving accounting systems and their reliability in Romanian universities.

  5. Analysis of Reliability of CET Band4

    Institute of Scientific and Technical Information of China (English)

    王铁琳

    2005-01-01

    CET Band 4 has been carried out for more than a decade. It has become so large-scaled, so popular and so influential that many testing experts and foreign language teachers are willing to do research on it. In this paper, I mainly analyse its reliability from the perspective of the writing test and the speaking test.

  6. Bypassing BDD Construction for Reliability Analysis

    DEFF Research Database (Denmark)

    Williams, Poul Frederick; Nikolskaia, Macha; Rauzy, Antoine

    2000-01-01

    In this note, we propose a Boolean Expression Diagram (BED)-based algorithm to compute the minimal p-cuts of Boolean reliability models such as fault trees. BEDs make it possible to bypass the Binary Decision Diagram (BDD) construction, which is the main cost of fault tree assessment…
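
    The BED algorithm itself is not reproduced here. As a toy illustration of what "minimal cut sets of a fault tree" means, the sketch below uses naive top-down expansion with absorption (a MOCUS-style approach, not the paper's BED method); the gate structure and basic-event names are invented.

```python
from itertools import product

def cut_sets(node):
    """Minimal cut sets of a fault tree given as nested ('AND'|'OR', child, ...)
    tuples; leaves are basic-event names (strings)."""
    if isinstance(node, str):
        return [frozenset([node])]
    op, *children = node
    child_sets = [cut_sets(c) for c in children]
    if op == 'OR':       # the gate fails if any child fails
        sets = {s for cs in child_sets for s in cs}
    else:                # 'AND': one cut set from every child must hold jointly
        sets = {frozenset().union(*combo) for combo in product(*child_sets)}
    # absorption: drop any set that strictly contains another (non-minimal)
    return [s for s in sets if not any(t < s for t in sets)]

# Hypothetical top event: the valve fails together with either pump.
tree = ('AND', ('OR', 'pump_A', 'pump_B'), 'valve')
cuts = cut_sets(tree)    # minimal cut sets {pump_A, valve} and {pump_B, valve}
```

    This exhaustive expansion blows up exponentially on large trees, which is exactly the cost that BDD- and BED-based methods are designed to avoid.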

  7. Reliability Analysis of an Offshore Structure

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle; Rackwitz, R.

    1992-01-01

    A jacket type offshore structure from the North Sea is considered. The time variant reliability is estimated for failure defined as brittle fracture and crack through the tubular member walls. The stochastic modelling is described. The hot spot stress spectral moments as function of the stochastic…

  8. Empiric reliability of diagnostic and prognostic estimations of physical standards of children, going in for sports.

    Directory of Open Access Journals (Sweden)

    Zaporozhanov V.A.

    2012-12-01

    In sporting-pedagogical practice, objective estimation of the potential of athletes already at the initial stages of long-term preparation is considered a topical issue. Proper quantitative information makes it possible to individualize the preparation of athletes according to the requirements of the managed training process. The purpose of the research was to demonstrate, logically and metrologically, the expedience of a metrological method for calculating the reliability of control measurement results used for diagnosing psychophysical fitness and predicting the growth of proficiency in the selected sport. Material and methods: the results of control measurements on four indexes of psychophysical preparedness, and expert estimations of fitness, were analysed for 24 children at a gymnastics school. The results of initial and final examinations of the gymnasts on the same control tests were processed by methods of mathematical statistics. Metrological estimates of the reliability of the measurements - stability, consistency and informativeness of the control information - were computed for current diagnostics and prognosis of the athletes' potential. Results: the expedience of using metrological calculations of a complex estimate of the psychophysical state of athletes is metrologically grounded. Conclusions: the results confirm the expedience of calculating a complex estimate of the psychophysical features of athletes for diagnosing fitness in the selected sport and for predicting proficiency at subsequent stages of preparation.

  9. THE LISBON STRATEGY: AN EMPIRICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Silvestri Marcello

    2010-07-01

    This paper investigates European economic integration within the framework of the 2000 Lisbon Council, with the aim of studying the dynamics affecting the social and economic life of European countries. This descriptive investigation focuses on certain significant variables of the new theories highlighting the importance of technological innovation and human capital. To this end, the multivariate statistical technique of Principal Component Analysis has been applied in order to classify countries with regard to the investigated phenomenon.

  10. Reliability sensitivity-based correlation coefficient calculation in structural reliability analysis

    Science.gov (United States)

    Yang, Zhou; Zhang, Yimin; Zhang, Xufang; Huang, Xianzhen

    2012-05-01

    The correlation coefficients of the random variables of mechanical structures are generally chosen from experience or even ignored, which cannot actually reflect the effects of parameter uncertainties on reliability. To discuss the selection of correlation coefficients from the reliability-based sensitivity point of view, the theoretical principle of the problem is established based on the results of the reliability sensitivity, and the criterion of correlation among random variables is shown. The values of the correlation coefficients are obtained according to the proposed principle and the reliability sensitivity problem is discussed. Numerical studies have shown the following results: (1) If the sensitivity value of the correlation coefficient ρ is sufficiently small (on the order of 0.00001), the correlation can be ignored, which simplifies the procedure without introducing additional error. (2) However, when the difference between ρs, the coefficient most sensitive to the reliability, and ρR, the coefficient giving the smallest reliability, is less than 0.001, ρs is suggested for modelling the dependency of random variables. This ensures the robust quality of the system without loss of the safety requirement. (3) In the case of |Eabs| > 0.001 and also |Erel| > 0.001, ρR should be employed to quantify the correlation among random variables in order to ensure the accuracy of reliability analysis. Application of the proposed approach provides a practical routine for mechanical design and manufacture to study the reliability and reliability-based sensitivity of basic design variables in mechanical reliability analysis and design.

  11. OTC Derivatives and Global Economic Activity: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Gordon Bodnar

    2017-06-01

    That the global market for derivatives has expanded beyond recognition is well known. What is not known is how this market interacts with economic activity. We provide the first empirical characterization of the interdependencies between OECD economic activity and the global OTC derivatives market. To this end, we apply a vector error-correction model to OTC derivatives disaggregated across instruments and counterparties. The results indicate that, with one exception, the heterogeneity of OTC contracts is too pronounced to be reliably summarized by our measures of economic activity. The one exception is interest-rate derivatives held by Other Financial Institutions.

  12. Empirical analysis of industrial operations in Montenegro

    Directory of Open Access Journals (Sweden)

    Galić Jelena

    2012-12-01

    Since the start of the transition process, industrial production in Montenegro has faced serious problems, and its share in GDP is constantly decreasing. The global financial crisis has to a large extent negatively influenced industry. Analysis of financial indicators showed that industry had significant losses, a problem of undercapitalisation, and liquidity problems. Looking across industry sectors, the situation is more favourable in the production of electricity, gas and water than in the extracting industry and mining. The paper proposes economic policy measures to improve the situation in industry.

  13. Reliability analysis of ceramic matrix composite laminates

    Science.gov (United States)

    Thomas, David J.; Wetherhold, Robert C.

    1991-01-01

    At a macroscopic level, a composite lamina may be considered as a homogeneous orthotropic solid whose directional strengths are random variables. Incorporation of these random variable strengths into failure models, either interactive or non-interactive, allows for the evaluation of the lamina reliability under a given stress state. Using a non-interactive criterion for demonstration purposes, laminate reliabilities are calculated assuming previously established load sharing rules for the redistribution of load as the failure of laminae occur. The matrix cracking predicted by ACK theory is modeled to allow a loss of stiffness in the fiber direction. The subsequent failure in the fiber direction is controlled by a modified bundle theory. Results using this modified bundle model are compared with previous models which did not permit separate consideration of matrix cracking, as well as to results obtained from experimental data.

  14. Universality in voting behavior: an empirical analysis

    Science.gov (United States)

    Chatterjee, Arnab; Mitrović, Marija; Fortunato, Santo

    2013-01-01

    Election data represent a precious source of information to study human behavior at a large scale. In proportional elections with open lists, the number of votes received by a candidate, rescaled by the average performance of all competitors in the same party list, has the same distribution regardless of the country and the year of the election. Here we provide the first thorough assessment of this claim. We analyzed election datasets of 15 countries with proportional systems. We confirm that a class of nations with similar election rules fulfill the universality claim. Discrepancies from this trend in other countries with open-lists elections are always associated with peculiar differences in the election rules, which matter more than differences between countries and historical periods. Our analysis shows that the role of parties in the electoral performance of candidates is crucial: alternative scalings not taking into account party affiliations lead to poor results.
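
    The universal quantity in the abstract is a candidate's vote count divided by the average over the candidate's own party list. A minimal sketch of that rescaling (with invented vote counts, not data from the study) could look like:

```python
import numpy as np

def rescaled_performance(party_votes):
    """party_votes: {party: list of vote counts for its candidates}.
    Returns the pooled values v / <v>_list, the quantity claimed to be universal."""
    pooled = []
    for votes in party_votes.values():
        votes = np.asarray(votes, dtype=float)
        pooled.extend(votes / votes.mean())   # rescale by the list average
    return np.array(pooled)

# Hypothetical open-list results for two party lists.
scaled = rescaled_performance({'party_A': [10, 30], 'party_B': [5, 5]})
```

    Pooling these rescaled values across lists, countries and years is what allows the distributions to be compared on a common axis; the paper's point is that omitting the party-level normalization destroys the collapse.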

  15. Empirical Analysis of Kyrgyz Trade Patterns

    Directory of Open Access Journals (Sweden)

    Elvira KURMANALIEVA

    2008-05-01

    Naturally located between two big markets in Europe and Asia, Kyrgyzstan, together with the other Central Asian countries, does not have direct access to sea ports. Landlockedness limits the volume of international trade and creates obstacles to economic growth. The results of statistical analysis show that Kyrgyz trade follows neither the Heckscher-Ohlin model nor the intra-industry trade model. Another finding is that the open and liberal trade policy of Kyrgyzstan has a large positive effect on trade volumes, suggesting that bilateral trade will expand markedly if the country continues liberalization of its trade policy with other countries. The quality of infrastructure and transportation costs play a crucial role for landlocked countries, and a free trade agreement with other countries looks like a good opportunity to overcome natural barriers and diversify trade.

  16. Universality in voting behavior: an empirical analysis

    CERN Document Server

    Chatterjee, Arnab; Fortunato, Santo

    2012-01-01

    Election data represent a precious source of information to study human behavior at a large scale. In proportional elections with open lists, the number of votes received by a candidate, rescaled by the average performance of all competitors in the same party list, has the same distribution regardless of the country and the year of the election. Here we provide the first thorough assessment of this claim. We analyzed election datasets of 15 countries with proportional systems. We confirm that a class of nations with similar election rules fulfill the universality claim. Discrepancies from this trend in other countries with open-lists elections are always associated with peculiar differences in the election rules, which matter more than differences between countries and historical periods. Our analysis shows that the role of parties in the electoral performance of candidates is crucial: alternative scalings not taking into account party affiliations lead to poor results.

  17. DFTCalc: Reliability centered maintenance via fault tree analysis (tool paper)

    NARCIS (Netherlands)

    Guck, Dennis; Spel, Jip; Stoelinga, Mariëlle Ida Antoinette; Butler, Michael; Conchon, Sylvain; Zaïdi, Fatiha

    2015-01-01

    Reliability, availability, maintenance and safety (RAMS) analysis is essential in the evaluation of safety critical systems like nuclear power plants and the railway infrastructure. A widely used methodology within RAMS analysis are fault trees, representing failure propagations throughout a system.

  18. DFTCalc: reliability centered maintenance via fault tree analysis (tool paper)

    NARCIS (Netherlands)

    Guck, Dennis; Spel, Jip; Stoelinga, Mariëlle; Butler, Michael; Conchon, Sylvain; Zaïdi, Fatiha

    2015-01-01

    Reliability, availability, maintenance and safety (RAMS) analysis is essential in the evaluation of safety critical systems like nuclear power plants and the railway infrastructure. A widely used methodology within RAMS analysis are fault trees, representing failure propagations throughout a system.

  19. MULTIDIMENSIONAL RELIABILITY OF INSTRUMENT STUDENTS’ SATISFACTION USING CONFIRMATORY FACTOR ANALYSIS

    Directory of Open Access Journals (Sweden)

    Gaguk Margono

    2014-11-01

    The purpose of this paper is to compare the unidimensional and multidimensional reliability of an instrument measuring students' satisfaction as internal customers. Multidimensional reliability measurement is rarely used in research. Multidimensional reliability is estimated by using Confirmatory Factor Analysis (CFA) on a Structural Equation Model (SEM). The measurements and calculations are described in this article using the students' satisfaction instrument. A survey method was used in this study, with simple random sampling. The instrument was tried out on 173 students. The conclusion is that measuring students' satisfaction as internal customers with a multidimensional reliability coefficient gives higher accuracy than a unidimensional reliability coefficient. Future research is expected to use other multidimensional reliability formulas, also in combination with SEM.
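
    One common way to turn CFA estimates into a composite reliability coefficient is McDonald's omega, computed from standardized factor loadings and error variances. The sketch below uses invented loadings and covers only the single-factor case, not the paper's full multidimensional model:

```python
def mcdonald_omega(loadings, error_vars):
    """Composite reliability omega = (sum of loadings)^2 /
    ((sum of loadings)^2 + sum of error variances), single-factor case."""
    lam = sum(loadings)
    return lam ** 2 / (lam ** 2 + sum(error_vars))

# Hypothetical standardized CFA solution: four items each loading 0.7 on one factor.
loadings = [0.7, 0.7, 0.7, 0.7]
error_vars = [1 - l ** 2 for l in loadings]   # 1 - lambda^2 for standardized items
omega = mcdonald_omega(loadings, error_vars)  # about 0.79
```

    A multidimensional version sums this logic over several factors; the single-factor formula is shown here only to make the CFA-to-reliability link concrete.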

  20. A Theoretical and Empirical Analysis of Expected Sarsa

    NARCIS (Netherlands)

    van Seijen, Harm; van Hasselt, Hado; Whiteson, Shimon; Wiering, Marco

    2009-01-01

    This paper presents a theoretical and empirical analysis of Expected Sarsa, a variation on Sarsa, the classic on-policy temporal-difference method for model-free reinforcement learning. Expected Sarsa exploits knowledge about stochasticity in the behavior policy to perform updates with lower variance.

  1. A theoretical and empirical analysis of expected sarsa

    NARCIS (Netherlands)

    Seijen, H.H. van; Hasselt, H. van; Whiteson, S.; Wiering, M.

    2009-01-01

    This paper presents a theoretical and empirical analysis of Expected Sarsa, a variation on Sarsa, the classic on-policy temporal-difference method for model-free reinforcement learning. Expected Sarsa exploits knowledge about stochasticity in the behavior policy to perform updates with lower variance.
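
    Expected Sarsa replaces Sarsa's sampled next-action value with its expectation under the behavior policy. A minimal tabular sketch, assuming an epsilon-greedy policy (the parameter values and state/action sizes here are illustrative, not from the paper):

```python
import numpy as np

def expected_sarsa_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.9, eps=0.1):
    """One tabular Expected Sarsa step: the target uses the expected value of
    Q(s_next, .) under an epsilon-greedy policy instead of a sampled action."""
    n_actions = Q.shape[1]
    probs = np.full(n_actions, eps / n_actions)   # exploration mass
    probs[np.argmax(Q[s_next])] += 1.0 - eps      # greedy mass
    expected_q = probs @ Q[s_next]
    Q[s, a] += alpha * (r + gamma * expected_q - Q[s, a])
    return Q

Q = np.zeros((3, 2))                              # 3 states, 2 actions
expected_sarsa_update(Q, s=0, a=1, r=1.0, s_next=2)
```

    Averaging over the policy rather than sampling one next action is precisely the source of the lower update variance discussed in the abstract.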

  2. Determinants of Crime in Virginia: An Empirical Analysis

    Science.gov (United States)

    Ali, Abdiweli M.; Peek, Willam

    2009-01-01

    This paper is an empirical analysis of the determinants of crime in Virginia. Over a dozen explanatory variables that current literature suggests as important determinants of crime are collected. The data is from 1970 to 2000. These include economic, fiscal, demographic, political, and social variables. The regression results indicate that crime…

  3. Empirical Bayes Model Comparisons for Differential Methylation Analysis

    Directory of Open Access Journals (Sweden)

    Mingxiang Teng

    2012-01-01

    A number of empirical Bayes models (each with different statistical distribution assumptions) have been developed to analyze differential DNA methylation using high-density oligonucleotide tiling arrays. However, it remains unclear which model performs best. For example, for the analysis of differentially methylated regions for conserved and functional sequence characteristics (e.g., enrichment of transcription factor-binding sites (TFBSs)), the sensitivity of such analyses using various empirical Bayes models remains unclear. In this paper, five empirical Bayes models were constructed, based on either a gamma distribution or a log-normal distribution, for the identification of differentially methylated loci and their cell-division- (1, 3, and 5) and drug-treatment- (cisplatin) dependent methylation patterns. While differential methylation patterns generated by log-normal models were enriched with numerous TFBSs, we observed almost no TFBS-enriched sequences using gamma-assumption models. Statistical and biological results suggest the log-normal, rather than the gamma, empirical Bayes model distribution to be a highly accurate and precise method for differential methylation microarray analysis. In addition, we present one of the log-normal models for differential methylation analysis and test its reproducibility by simulation study. We believe this research to be the first extensive comparison of statistical modeling for the analysis of differential DNA methylation, an important biological phenomenon that precisely regulates gene transcription.

  4. Multiplicity of data in trial reports and the reliability of meta-analyses: empirical study

    DEFF Research Database (Denmark)

    Tendal, Britta; Nüesch, Eveline; Higgins, Julian P T;

    2011-01-01

    To examine the extent of multiplicity of data in trial reports and to assess the impact of multiplicity on meta-analysis results.

  5. Reliability analysis of PLC safety equipment

    Energy Technology Data Exchange (ETDEWEB)

    Yu, J.; Kim, J. Y. [Chungnam Nat. Univ., Daejeon (Korea, Republic of)

    2006-06-15

    This work covers FMEA analysis for nuclear safety grade PLCs, failure rate prediction for nuclear safety grade PLCs, sensitivity analysis of component failure rates of nuclear safety grade PLCs, and unavailability analysis support for nuclear safety systems.

  6. Reliability Analysis of Dynamic Stability in Waves

    DEFF Research Database (Denmark)

    Søborg, Anders Veldt

    2004-01-01

    The assessment of a ship's intact stability is traditionally based on a semi-empirical deterministic concept that evaluates the characteristics of the ship's calm water restoring leverarm curves. Today the ship is considered safe with respect to dynamic stability if its calm water leverarm curves exhibit sufficient characteristics with respect to slope at zero heel (GM value), maximum leverarm, positive range of stability and area below the leverarm curve. The rule-based requirements to calm water leverarm curves are entirely based on experience obtained from vessels in operation and recorded accidents in the past. The rules therefore leave only little room for evaluation and improvement of the safety of a ship's dynamic stability. A few studies have evaluated the probability of ship stability loss in waves using Monte Carlo simulations. However, since this probability may be in the order of 10…
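
    The abstract notes that Monte Carlo simulation has been used to estimate the (small) probability of stability loss. As a generic sketch of crude Monte Carlo for a small failure probability (the limit state, distributions and numbers below are invented, not the ship model from the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical limit state: failure when wave-induced demand exceeds the
# restoring capacity; both treated as independent normal random variables.
capacity = rng.normal(1.0, 0.10, n)
demand = rng.normal(0.5, 0.12, n)
p_f = np.mean(demand > capacity)      # crude Monte Carlo failure probability

# Standard error of the estimator; for rare events it shrinks only as 1/sqrt(n),
# which is why probabilities of order 1e-4 or smaller need very large samples
# or variance-reduction techniques.
se = np.sqrt(p_f * (1.0 - p_f) / n)
```
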

  7. Earth slope reliability analysis under seismic loadings using neural network

    Institute of Scientific and Technical Information of China (English)

    PENG Huai-sheng; DENG Jian; GU De-sheng

    2005-01-01

    A new method was proposed to cope with the earth slope reliability problem under seismic loadings. The algorithm integrates the concepts of artificial neural network, the first order second moment reliability method and the deterministic stability analysis method of earth slope. The performance function and its derivatives in slope stability analysis under seismic loadings were approximated by a trained multi-layer feed-forward neural network with differentiable transfer functions. The statistical moments calculated from the performance function values and the corresponding gradients using neural network were then used in the first order second moment method for the calculation of the reliability index in slope safety analysis. Two earth slope examples were presented for illustrating the applicability of the proposed approach. The new method is effective in slope reliability analysis. And it has potential application to other reliability problems of complicated engineering structure with a considerably large number of random variables.
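The abstract above describes feeding the statistical moments of a network-approximated performance function into the first order second moment (FOSM) method. A minimal sketch of that final step, with a simple analytic performance function standing in for the trained neural network and entirely hypothetical slope parameters:

```python
import numpy as np

# Hypothetical performance function standing in for the trained neural
# network surrogate: a toy safety margin for a slope under a seismic
# coefficient. All numbers are illustrative assumptions.
def g(x):
    c, phi, k = x          # cohesion, friction angle (rad), seismic coefficient
    return c + 10.0 * np.tan(phi) - 25.0 * k

def grad_g(x, h=1e-6):
    # Central finite-difference gradient; a trained network with
    # differentiable transfer functions would supply this analytically.
    x = np.asarray(x, dtype=float)
    return np.array([(g(x + h * e) - g(x - h * e)) / (2 * h)
                     for e in np.eye(len(x))])

mu = np.array([15.0, 0.35, 0.15])      # assumed means of the random variables
sigma = np.array([3.0, 0.05, 0.03])    # assumed standard deviations

# First order second moment approximation:
# beta = E[g] / Std[g], with Std[g] from a first-order Taylor expansion.
mean_g = g(mu)
std_g = np.sqrt(np.sum((grad_g(mu) * sigma) ** 2))
beta = mean_g / std_g
print(f"reliability index beta = {beta:.3f}")
```

The failure probability then follows as Phi(-beta) under the normality assumption implicit in FOSM.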

  8. Design and Analysis for Reliability of Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    Yongxian Song

    2012-12-01

Full Text Available Reliability is an important performance indicator of a wireless sensor network; for application fields with high reliability demands it is particularly important to ensure the reliability of the network. At present there are many research findings on wireless sensor network reliability, both domestic and international, but they mainly improve network reliability through network topology, reliable protocols, application layer fault correction and so on; work that considers network reliability comprehensively from both the hardware and software aspects is much scarcer. This paper adopts bionic hardware to implement bionic reconfiguration of wireless sensor network nodes, so that the nodes are able to change their structure and behavior autonomously and dynamically when part of the hardware fails, realizing bionic self-healing. Secondly, a Markov state diagram and probability analysis methods are adopted to solve the functional reliability model: the relationship between reliability and the characteristic parameters of sink nodes is established and the sink node reliability model is analyzed, so as to determine reasonable model parameters and ensure the reliability of the sink nodes.

  9. Reliability-Analysis of Offshore Structures using Directional Loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Bloch, Allan; Sterndorff, M. J.

    2000-01-01

Reliability analyses of offshore structures such as steel jacket platforms are usually performed using stochastic models for the wave loads based on the omnidirectional wave height. However, reliability analyses with respect to structural failure modes such as total collapse of a structure benefit from a stochastic model for the directional wave heights, here established using data from the central part of the North Sea. It is described how the stochastic model for the directional wave heights can be used in a reliability analysis where total collapse of offshore steel jacket platforms is considered.

  10. Statistical analysis on reliability and serviceability of caterpillar tractor

    Institute of Scientific and Technical Information of China (English)

    WANG Jinwu; LIU Jiafu; XU Zhongxiang

    2007-01-01

To further understand the reliability and serviceability of tractors and to furnish scientific and technical theory for their promotion and application, experiments and statistical analyses of the reliability (reliability and MTBF) and serviceability (service and MTTR) of the Dongfanghong-1002 and Dongfanghong-802 tractors were conducted. The results showed that the mean time between failures of these two tractors was 182.62 h and 160.2 h, respectively, and that the weakest assembly of both was the engine.
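The MTBF and MTTR figures reported above come from straightforward averages over a failure log. A minimal sketch of those definitions, using made-up failure data rather than the Dongfanghong measurements:

```python
# Hypothetical failure log: operating hours between successive failures,
# and the duration of each repair. All values are illustrative.
between_failures = [150.0, 210.5, 175.0, 198.0, 160.0]
repairs = [2.5, 4.0, 3.0, 5.5, 2.0]

mtbf = sum(between_failures) / len(between_failures)  # mean time between failures
mttr = sum(repairs) / len(repairs)                    # mean time to repair
availability = mtbf / (mtbf + mttr)                   # steady-state availability

print(f"MTBF = {mtbf:.1f} h, MTTR = {mttr:.1f} h, availability = {availability:.3f}")
```

With these toy numbers the MTBF is 178.7 h, comparable in magnitude to the intervals reported in the abstract.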

  11. Modified Bayesian Kriging for Noisy Response Problems for Reliability Analysis

    Science.gov (United States)

    2015-01-01

A modified Bayesian Kriging surrogate model is used for Monte Carlo simulation (MCS) prediction in the reliability analysis underlying the sampling-based reliability-based design optimization (RBDO) method...

  12. Reliability analysis of large, complex systems using ASSIST

    Science.gov (United States)

    Johnson, Sally C.

    1988-01-01

    The SURE reliability analysis program is discussed as well as the ASSIST model generation program. It is found that semi-Markov modeling using model reduction strategies with the ASSIST program can be used to accurately solve problems at least as complex as other reliability analysis tools can solve. Moreover, semi-Markov analysis provides the flexibility needed for modeling realistic fault-tolerant systems.

  13. Evaluating some Reliability Analysis Methodologies in Seismic Design

    Directory of Open Access Journals (Sweden)

    A. E. Ghoulbzouri

    2011-01-01

Full Text Available Problem statement: Accounting for the uncertainties present in the geometric and material data of reinforced concrete buildings is performed in this study within the context of performance-based seismic engineering design. Approach: The reliability of the expected performance state is assessed by using various methodologies based on finite element nonlinear static pushover analysis and a specialized reliability software package. The reliability approaches considered included full coupling with an external finite element code and response-surface-based methods in conjunction with either the first order reliability method or the importance sampling method. Various types of probability distribution functions modelling parameter uncertainties were introduced. Results: The probability of failure according to the reliability analysis method used and to the selected distribution of probabilities was obtained. A convergence analysis of the importance sampling method was performed, and the required duration of analysis as a function of the reliability method used was evaluated. Conclusion/Recommendations: It was found that reliability results are sensitive to the reliability analysis method used and to the selected distribution of probabilities. Durations of analysis for coupling methods were found to be higher than those associated with response-surface-based methods; one should however include the time needed to derive the response surfaces. For the reinforced concrete building considered in this study, significant variations were found between all the considered reliability methodologies. The fully coupled importance sampling method is recommended, but the first order reliability method applied to a response surface model can be used with good accuracy. Finally, the distributions of probabilities should be carefully identified, since giving only the mean and the standard deviation was found to be insufficient.

  14. Reliability Distribution of Numerical Control Lathe Based on Correlation Analysis

    Institute of Scientific and Technical Information of China (English)

    Xiaoyan Qi; Guixiang Shen; Yingzhi Zhang; Shuguang Sun; Bingkun Chen

    2016-01-01

Combining reliability distribution with correlation analysis, a new method is proposed for reliability distribution that considers the structural correlation and failure correlation of subsystems. Firstly, the subsystems are ranked by means of TOPSIS, which incorporates the usual reliability allocation considerations; a Copula connecting function is then introduced to set up a distribution model based on structure correlation, failure correlation and target correlation, and the reliability target region of every subsystem is obtained with Matlab. In this method not only the traditional distribution considerations but also correlation influences are taken into account, supplementing the available information and optimizing the distribution.

  15. Reliability and safety analysis of redundant vehicle management computer system

    Institute of Scientific and Technical Information of China (English)

    Shi Jian; Meng Yixuan; Wang Shaoping; Bian Mengmeng; Yan Dungong

    2013-01-01

Redundant techniques are widely adopted in vehicle management computers (VMC) to ensure that the VMC has high reliability and safety. At the same time, this gives the VMC special characteristics, e.g., failure correlation, event simultaneity, and failure self-recovery, and the reliability and safety analysis of a redundant VMC system (RVMCS) accordingly becomes more difficult. Aimed at the difficulties in RVMCS reliability modeling, this paper adopts generalized stochastic Petri nets to establish the reliability and safety models of an RVMCS, and then analyzes RVMCS operating states and potential threats to the flight control system. It is verified by simulation that the reliability of a VMC is not the product of hardware reliability and software reliability, and that interactions between hardware and software faults can markedly reduce the real reliability of the VMC. Furthermore, failure-undetected states and false-alarm states inevitably exist in an RVMCS due to the influences of limited fault monitoring coverage and the false alarm probability of fault monitoring devices (FMD). An RVMCS operating in some failure-undetected states produces fatal threats to the safety of the flight control system, and an RVMCS operating in some false-alarm states has markedly reduced utility. The results abstracted in this paper can guide reliable VMC and efficient FMD designs, and the methods adopted can also be used to analyze the reliability of other intelligent systems.

  16. Seismic reliability analysis of large electric power systems

    Institute of Scientific and Technical Information of China (English)

    何军; 李杰

    2004-01-01

Based on the De Morgan laws and Boolean simplification, a recursive decomposition method is introduced in this paper to identify the main exclusive safe paths and failed paths of a network. The reliability or the reliability bound of a network can then be conveniently expressed as the summation of the joint probabilities of these paths. Under the multivariate normal distribution assumption, a conditioned reliability index method is developed to evaluate the joint probabilities of the various exclusive safe paths and failed paths and, finally, the seismic reliability or the reliability bound of an electric power system. Examples given in the paper show that the method is very simple and provides accurate results in seismic reliability analysis.
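The quantity the decomposition method targets, the probability that a source and a sink remain connected, can be computed exactly for a toy network by brute-force state enumeration rather than the paper's recursive decomposition. A sketch with a five-edge bridge network and hypothetical survival probabilities:

```python
from itertools import product

# Small bridge network: undirected edges with independent survival
# probabilities. All probabilities are illustrative assumptions.
edges = {("s", "a"): 0.9, ("s", "b"): 0.9, ("a", "b"): 0.8,
         ("a", "t"): 0.9, ("b", "t"): 0.9}

def connected(up_edges):
    # Depth-first search from s over surviving edges; is t reachable?
    adj = {}
    for u, v in up_edges:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    seen, stack = {"s"}, ["s"]
    while stack:
        for w in adj.get(stack.pop(), []):
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return "t" in seen

names = list(edges)
rel = 0.0
for states in product([True, False], repeat=len(names)):
    # Probability of this exact up/down configuration of the edges.
    p = 1.0
    for e, up in zip(names, states):
        p *= edges[e] if up else 1 - edges[e]
    if connected([e for e, up in zip(names, states) if up]):
        rel += p
print(f"two-terminal reliability = {rel:.4f}")
```

Enumeration is exponential in the number of components, which is exactly why methods like recursive decomposition into exclusive paths are needed for realistic power networks.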

  17. Simulation Approach to Mission Risk and Reliability Analysis Project

    Data.gov (United States)

    National Aeronautics and Space Administration — It is proposed to develop and demonstrate an integrated total-system risk and reliability analysis approach that is based on dynamic, probabilistic simulation. This...

  18. Reliability analysis of ship structure system with multi-defects

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

This paper analyzes the influence of multiple defects, including initial distortions, welding residual stresses, cracks and local dents, on the ultimate strength of the plate element, and works out expressions for the reliability calculation and sensitivity analysis of the plate element. A reliability analysis is made for the system with multi-defect plate elements. The failure mechanism, failure paths and the calculating approach to the global reliability index are also worked out. After plate elements with multiple defects fail, the formula for the reverse node forces which affect the residual structure is deduced, as are the sensitivity expressions of the system reliability index. This ensures calculating accuracy and rationality for the reliability analysis, and makes it convenient to find the weak plate elements which affect the reliability of the structure system. Finally, to validate the proposed approach, the numerical example of a ship cabin is taken to compare and contrast the reliability and sensitivity analyses of the structure system with multiple defects with those of the structure system with no defects. The approach has implications for structure design, rational maintenance and renewal strategy.

  19. Victim countries of transnational terrorism: an empirical characteristics analysis.

    Science.gov (United States)

    Elbakidze, Levan; Jin, Yanhong

    2012-12-01

    This study empirically investigates the association between country-level socioeconomic characteristics and risk of being victimized in transnational terrorism events. We find that a country's annual financial contribution to the U.N. general operating budget has a positive association with the frequency of being victimized in transnational terrorism events. In addition, per capita GDP, political freedom, and openness to trade are nonlinearly related to the frequency of being victimized in transnational terrorism events. © 2012 Society for Risk Analysis.

  20. Islamic banks and profitability: an empirical analysis of Indonesian banking

    OpenAIRE

    Jordan, Sarah

    2013-01-01

    This paper provides an empirical analysis of the factors that determine the profitability of Indonesian banks between the years 2006-2012. In particular, it investigates whether there are any significant differences in terms of profitability between Islamic banks and commercial banks. The results, obtained by applying the system-GMM estimator to the panel of 54 banks, indicate that the high bank profitability during these years were determined mainly by the size of the banks, the market share...

  1. Requalification of offshore structures. Reliability analysis of platform

    Energy Technology Data Exchange (ETDEWEB)

Bloch, A.; Dalsgaard Soerensen, J. [Aalborg Univ. (Denmark)]

    1999-03-01

A preliminary reliability analysis has been performed for an example platform. In order to model the structural response such that it is possible to calculate reliability indices, approximate quadratic response surfaces have been determined for the cross-sectional forces. Based on a deterministic, code-based analysis, the elements and joints which can be expected to be the most critical are selected and response surfaces are established for the cross-sectional forces in them. A stochastic model is established for the uncertain variables. The reliability analysis shows that with this stochastic model the smallest reliability indices for elements are about 3.9. The reliability index for collapse (pushover) is estimated at 6.7, and the reliability index for fatigue failure, using a crude model, is estimated at 3.2 for the expected most critical detail, corresponding to the accumulated damage during the design lifetime of the platform. These reliability indices are considered to be reasonable compared with values recommended by e.g. ISO. The most important stochastic variables are found to be the wave height and the drag coefficient (including the model uncertainty related to estimation of wave forces on the platform). (au)

  2. Maritime shipping as a high reliability industry: A qualitative analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mannarelli, T.; Roberts, K.; Bea, R.

    1994-10-01

    The maritime oil shipping industry has great public demands for safe and reliable organizational performance. Researchers have identified a set of organizations and industries that operate at extremely high levels of reliability, and have labelled them High Reliability Organizations (HRO). Following the Exxon Valdez oil spill disaster of 1989, public demands for HRO-level operations were placed on the oil industry. It will be demonstrated that, despite enormous improvements in safety and reliability, maritime shipping is not operating as an HRO industry. An analysis of the organizational, environmental, and cultural history of the oil industry will help to provide justification and explanation. The oil industry will be contrasted with other HRO industries and the differences will inform the shortfalls maritime shipping experiences with regard to maximizing reliability. Finally, possible solutions for the achievement of HRO status will be offered.

  3. Reliability Analysis of OMEGA Network and Its Variants

    Directory of Open Access Journals (Sweden)

    Suman Lata

    2012-04-01

Full Text Available The performance of a computer system depends directly on the time required to perform a basic operation and the number of these basic operations that can be performed concurrently. High performance computing systems can be designed using parallel processing, which is achieved by using more than one processor or computer working together and communicating with each other to solve a given problem. MINs provide a better way for the communication between different processors or memory modules, with less complexity, fast communication, good fault tolerance, high reliability and low cost. The reliability of a system is the probability that it will successfully perform its intended operations for a given time under stated operating conditions. From the reliability analysis it has been observed that the addition of one stage to an Omega network provides higher terminal reliability than the addition of two stages to the corresponding network.
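Terminal reliability of a multistage interconnection network can be sketched from series-parallel reasoning: a plain N x N Omega network has exactly one path of log2(N) switches between a source-destination pair, while an extra stage adds redundant paths at the cost of one more switch per path. The switch reliability and the assumption of two disjoint paths below are illustrative simplifications, not figures from the paper:

```python
from math import log2

def omega_terminal_reliability(n, r):
    # Single path through log2(n) switches, each surviving with probability r.
    return r ** int(log2(n))

def extra_stage_terminal_reliability(n, r, paths=2):
    # Assumed: the extra stage yields `paths` disjoint paths, each one
    # switch longer than in the plain Omega network.
    path_rel = r ** (int(log2(n)) + 1)
    return 1 - (1 - path_rel) ** paths    # parallel redundancy

r, n = 0.95, 16
print(omega_terminal_reliability(n, r))        # 0.8145...
print(extra_stage_terminal_reliability(n, r))  # higher despite longer paths
```

Even though each path gets longer, the redundancy dominates, which matches the abstract's observation that one extra stage improves terminal reliability.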

  4. Discrete event simulation versus conventional system reliability analysis approaches

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to the conventional human reliability analysis models and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper describes briefly the author's experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...

  5. Seismic reliability analysis of urban water distribution network

    Institute of Scientific and Technical Information of China (English)

    Li Jie; Wei Shulin; Liu Wei

    2006-01-01

An approach to analyze the seismic reliability of water distribution networks by combining a hydraulic analysis with a first-order reliability method (FORM) is proposed in this paper. The hydraulic analysis method for normal conditions is modified to accommodate the special conditions necessary to perform a seismic hydraulic analysis. In order to calculate the leakage area and leakage flow of the pipelines in the hydraulic analysis, a new leakage model derived from the seismic response analysis of buried pipelines is presented. To validate the proposed approach, a network with 17 nodes and 24 pipelines is investigated in detail. The approach is also applied to an actual project consisting of 463 nodes and 767 pipelines. The results show that the proposed approach achieves satisfactory results in analyzing the seismic reliability of large-scale water distribution networks.

  6. A Passive System Reliability Analysis for a Station Blackout

    Energy Technology Data Exchange (ETDEWEB)

    Brunett, Acacia; Bucknor, Matthew; Grabaskas, David; Sofu, Tanju; Grelle, Austin

    2015-05-03

The latest iterations of advanced reactor designs have included increased reliance on passive safety systems to maintain plant integrity during unplanned sequences. While these systems are advantageous in reducing the reliance on human intervention and availability of power, the phenomenological foundations on which these systems are built require a novel approach to a reliability assessment. Passive systems possess the unique ability to fail functionally without failing physically, a result of their explicit dependency on existing boundary conditions that drive their operating mode and capacity. Argonne National Laboratory is performing ongoing analyses that demonstrate various methodologies for the characterization of passive system reliability within a probabilistic framework. Two reliability analysis techniques are utilized in this work. The first approach, the Reliability Method for Passive Systems, provides a mechanistic technique employing deterministic models and conventional static event trees. The second approach, a simulation-based technique, utilizes discrete dynamic event trees to treat time-dependent phenomena during scenario evolution. For this demonstration analysis, both reliability assessment techniques are used to analyze an extended station blackout in a pool-type sodium fast reactor (SFR) coupled with a reactor cavity cooling system (RCCS). This work demonstrates the entire process of a passive system reliability analysis, including identification of important parameters and failure metrics, treatment of uncertainties and analysis of results.

  7. Reliability Analysis of Dynamic Stability in Waves

    DEFF Research Database (Denmark)

    Søborg, Anders Veldt

    2004-01-01

The calm water leverarm curves must exhibit sufficient characteristics with respect to slope at zero heel (GM value), maximum leverarm, positive range of stability and area below the leverarm curve. The rule-based requirements to calm water leverarm curves are entirely based on experience obtained from vessels in operation and recorded accidents. Since the probability of stability loss may be in the order of 10^-4 per ship year, brute force Monte-Carlo simulations are not always feasible due to the required computational resources. Previous studies of the dynamic stability of ships in waves typically focused on the capsizing event. In this study the objective is to establish a procedure that can identify ... The distribution of the exceedance probability may be established by an estimation of the out-crossing rate of the "safe set" defined by the utility function. This out-crossing rate will be established using the so-called Madsen's Formula. A by-product of this analysis is a set of short wave time series...

  8. Ratio Versus Regression Analysis: Some Empirical Evidence in Brazil

    Directory of Open Access Journals (Sweden)

    Newton Carneiro Affonso da Costa Jr.

    2004-06-01

Full Text Available This work compares the traditional methodology for ratio analysis, applied to a sample of Brazilian firms, with the alternative of regression analysis, for both cross-industry and intra-industry samples. The structural validity of the traditional methodology was tested through a model that represents its analogous regression format. The data are from 156 Brazilian public companies in nine industrial sectors for the year 1997. The results provide weak empirical support for the traditional ratio methodology, as it was verified that the validity of this methodology may differ between ratios.

  9. Reliability Analysis of Fatigue Fracture of Wind Turbine Drivetrain Components

    DEFF Research Database (Denmark)

    Berzonskis, Arvydas; Sørensen, John Dalsgaard

    2016-01-01

The manufacturing of casted drivetrain components, like the main shaft of a wind turbine, commonly results in many smaller defects through the volume of the component, with sizes that depend on the manufacturing method. This paper considers the effect of the initial defects present in the volume of the casted ductile iron main shaft on the reliability of the component. The probabilistic reliability analysis conducted is based on fracture mechanics models. Additionally, the utilization of the probabilistic reliability for operation and maintenance planning and quality control is discussed.

  10. Analysis on Operation Reliability of Generating Units in 2009

    Institute of Scientific and Technical Information of China (English)

    Zhou

    2010-01-01

This paper presents data on operation reliability indices and relevant analyses of China's conventional power generating units in 2009. The units brought into the statistical analysis include thermal generating units of 100 MW or above, hydro generating units of 40 MW or above, and all nuclear generating units. The reliability indices covered include utilization hours, times and hours of scheduled outages, times and hours of unscheduled outages, equivalent forced outage rate and equivalent availability factor.
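The two headline indices named above can be sketched from one common set of definitions (availability based on period hours; equivalent forced outage rate based on service and forced outage hours). The outage hours below are made-up figures for a single unit over one year, not data from the Chinese statistics:

```python
# Illustrative annual figures for one generating unit (hypothetical).
period_hours = 8760.0
planned_outage_hours = 300.0
forced_outage_hours = 120.0
equiv_forced_derated_hours = 40.0   # derated operation converted to equivalent full-outage hours

available_hours = period_hours - planned_outage_hours - forced_outage_hours
service_hours = available_hours     # assume the unit runs whenever available

# Equivalent availability factor and equivalent forced outage rate, in percent.
eaf = 100.0 * (available_hours - equiv_forced_derated_hours) / period_hours
efor = 100.0 * (forced_outage_hours + equiv_forced_derated_hours) / (
    service_hours + forced_outage_hours)
print(f"EAF = {eaf:.1f}%  EFOR = {efor:.1f}%")
```

With these toy numbers the unit shows an EAF near 95% and an EFOR near 2%, the kind of magnitudes such annual statistics typically report.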

  11. Reliability analysis and initial requirements for FC systems and stacks

    Science.gov (United States)

    Åström, K.; Fontell, E.; Virtanen, S.

    In the year 2000 Wärtsilä Corporation started an R&D program to develop SOFC systems for CHP applications. The program aims to bring to the market highly efficient, clean and cost competitive fuel cell systems with rated power output in the range of 50-250 kW for distributed generation and marine applications. In the program Wärtsilä focuses on system integration and development. System reliability and availability are key issues determining the competitiveness of the SOFC technology. In Wärtsilä, methods have been implemented for analysing the system in respect to reliability and safety as well as for defining reliability requirements for system components. A fault tree representation is used as the basis for reliability prediction analysis. A dynamic simulation technique has been developed to allow for non-static properties in the fault tree logic modelling. Special emphasis has been placed on reliability analysis of the fuel cell stacks in the system. A method for assessing reliability and critical failure predictability requirements for fuel cell stacks in a system consisting of several stacks has been developed. The method is based on a qualitative model of the stack configuration where each stack can be in a functional, partially failed or critically failed state, each of the states having different failure rates and effects on the system behaviour. The main purpose of the method is to understand the effect of stack reliability, critical failure predictability and operating strategy on the system reliability and availability. An example configuration, consisting of 5 × 5 stacks (series of 5 sets of 5 parallel stacks) is analysed in respect to stack reliability requirements as a function of predictability of critical failures and Weibull shape factor of failure rate distributions.
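The 5 x 5 example configuration above (5 series sets of 5 parallel stacks) lends itself to a simple series-parallel sketch. The Weibull shape and scale values and the assumption that a set needs at least 4 of its 5 stacks are all illustrative, not Wärtsilä figures:

```python
from math import comb, exp

def weibull_reliability(t, shape, scale):
    # Probability a single stack survives to time t (two-parameter Weibull).
    return exp(-(t / scale) ** shape)

def k_out_of_n(k, n, p):
    # Probability that at least k of n independent stacks are functional.
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def system_reliability(t, shape=2.0, scale=40000.0, k=4, n_parallel=5, n_series=5):
    # 5 sets in series, each set a k-out-of-5 parallel group of stacks.
    p = weibull_reliability(t, shape, scale)
    return k_out_of_n(k, n_parallel, p) ** n_series

print(f"R(10000 h) = {system_reliability(10000.0):.4f}")
```

Sweeping the shape factor in this sketch shows the effect the abstract mentions: the Weibull shape of the failure rate distribution strongly influences the stack reliability required to meet a system-level availability target.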

  12. Coverage Modeling and Reliability Analysis Using Multi-state Function

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

Fault tree analysis is an effective method for predicting the reliability of a system. It gives a pictorial representation and logical framework for analyzing reliability, and has long been used as an effective method for the quantitative and qualitative analysis of the failure modes of critical systems. In this paper, we propose a new general coverage model (GCM) based on hardware-independent faults. Using this model, an effective software tool can be constructed to detect, locate and recover from faults in the system. The model can also be applied to identify the key components whose failure can cause the failure of the system, using failure mode and effects analysis (FMEA).

  13. Reliability analysis of flood defence systems in the Netherlands

    NARCIS (Netherlands)

    Lassing, B.L.; Vrouwenvelder, A.C.W.M.; Waarts, P.H.

    2003-01-01

In recent years an advanced program for the reliability analysis of dike systems has been under development in the Netherlands. This paper describes the global data requirements for application and the set-up of the models in the Netherlands. The analysis generates an estimate of the probability of sys...

  14. Recent advances in computational structural reliability analysis methods

    Science.gov (United States)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, the problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.

  15. On reliability analysis of multi-categorical forecasts

    Directory of Open Access Journals (Sweden)

    J. Bröcker

    2008-08-01

Full Text Available Reliability analysis of probabilistic forecasts, in particular through the rank histogram or Talagrand diagram, is revisited. Two shortcomings are pointed out: firstly, a uniform rank histogram is but a necessary condition for reliability; secondly, if the forecast is assumed to be reliable, an indication is needed of how far a histogram is expected to deviate from uniformity merely due to randomness. Concerning the first shortcoming, it is suggested that forecasts be grouped or stratified along suitable criteria, and that reliability be analyzed individually for each forecast stratum. A reliable forecast should have uniform histograms for all individual forecast strata, not only for all forecasts as a whole. As to the second shortcoming, instead of the observed frequencies, the probability of the observed frequency is plotted, providing an indication of the likelihood of the result under the hypothesis that the forecast is reliable. Furthermore, a Goodness-of-Fit statistic is discussed which is essentially the reliability term of the Ignorance score. The discussed tools are applied to medium range forecasts for 2-m temperature anomalies at several locations and lead times. The forecasts are stratified along the expected ranked probability score. Those forecasts which feature a high expected score turn out to be particularly unreliable.
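The paper's second point, plotting the probability of the observed bin frequency under the reliability hypothesis rather than the raw count, can be sketched directly: for a reliable forecast every rank is equally likely, so each bin count is binomially distributed. A minimal version with synthetic Gaussian ensembles (all data made up):

```python
import random
from math import comb

random.seed(1)
m = 9                      # ensemble members -> m + 1 rank bins
n = 500                    # number of forecast cases
counts = [0] * (m + 1)
for _ in range(n):
    ens = sorted(random.gauss(0, 1) for _ in range(m))
    obs = random.gauss(0, 1)               # reliable case: obs from same distribution
    rank = sum(e < obs for e in ens)       # position of obs among the members
    counts[rank] += 1

p = 1 / (m + 1)                            # uniform rank probability if reliable
def binom_pmf(k):
    # Probability of seeing exactly k hits in a bin under reliability.
    return comb(n, k) * p**k * (1 - p)**(n - k)

for k in counts:
    print(f"count={k:3d}  P(count | reliable)={binom_pmf(k):.4f}")
```

Very small pmf values flag bins whose occupation would be unlikely for a reliable forecast, which is exactly the indication the abstract calls for.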

  16. RELIABILITY ANALYSIS OF RING, AGENT AND CLUSTER BASED DISTRIBUTED SYSTEMS

    Directory of Open Access Journals (Sweden)

    R.SEETHALAKSHMI

    2011-08-01

Full Text Available The introduction of pervasive and mobile devices has led to immense growth in real-time distributed processing. In such a context the reliability of the computing environment is very important. Reliability is the probability that the devices, links, processes, programs and files work efficiently for the specified period of time and under the specified conditions. Distributed systems are available as conventional ring networks, clusters and agent-based systems, and the reliability of such systems is the focus here. These networks are heterogeneous and scalable in nature. Several factors have to be considered for reliability estimation. These include application-related factors such as algorithms, data-set sizes, memory usage patterns, input-output, communication patterns, task granularity and load balancing. They also include hardware-related factors such as processor architecture, memory hierarchy, input-output configuration and network, while the software-related factors concerning reliability are operating systems, compilers, communication protocols, libraries and preprocessor performance. In estimating the reliability of a system, performance estimation is an important aspect. Reliability analysis is approached using probability.

  17. The development of a reliable amateur boxing performance analysis template.

    Science.gov (United States)

    Thomson, Edward; Lamb, Kevin; Nicholas, Ceri

    2013-01-01

    The aim of this study was to devise a valid performance analysis system for the assessment of the movement characteristics associated with competitive amateur boxing and assess its reliability using analysts of varying experience of the sport and performance analysis. Key performance indicators to characterise the demands of an amateur contest (offensive, defensive and feinting) were developed and notated using a computerised notational analysis system. Data were subjected to intra- and inter-observer reliability assessment using median sign tests and calculating the proportion of agreement within predetermined limits of error. For all performance indicators, intra-observer reliability revealed non-significant differences between observations (P > 0.05) and high agreement was established (80-100%) regardless of whether exact or the reference value of ±1 was applied. Inter-observer reliability was less impressive for both analysts (amateur boxer and experienced analyst), with the proportion of agreement ranging from 33-100%. Nonetheless, there was no systematic bias between observations for any indicator (P > 0.05), and the proportion of agreement within the reference range (±1) was 100%. A reliable performance analysis template has been developed for the assessment of amateur boxing performance and is available for use by researchers, coaches and athletes to classify and quantify the movement characteristics of amateur boxing.

  18. Reliability analysis of cluster-based ad-hoc networks

    Energy Technology Data Exchange (ETDEWEB)

    Cook, Jason L. [Quality Engineering and System Assurance, Armament Research Development Engineering Center, Picatinny Arsenal, NJ (United States); Ramirez-Marquez, Jose Emmanuel [School of Systems and Enterprises, Stevens Institute of Technology, Castle Point on Hudson, Hoboken, NJ 07030 (United States)], E-mail: Jose.Ramirez-Marquez@stevens.edu

    2008-10-15

    The mobile ad-hoc wireless network (MAWN) is a new and emerging network scheme that is being employed in a variety of applications. The MAWN varies from traditional networks because it is a self-forming and dynamic network. The MAWN is free of infrastructure and, as such, only the mobile nodes comprise the network. Pairs of nodes communicate either directly or through other nodes. To do so, each node acts, in turn, as a source, destination, and relay of messages. The virtue of a MAWN is the flexibility this provides; however, the challenge for reliability analyses is also brought about by this unique feature. The variability and volatility of the MAWN configuration makes typical reliability methods (e.g. reliability block diagram) inappropriate because no single structure or configuration represents all manifestations of a MAWN. For this reason, new methods are being developed to analyze the reliability of this new networking technology. New published methods adapt to this feature by treating the configuration probabilistically or by inclusion of embedded mobility models. This paper joins both methods together and expands upon these works by modifying the problem formulation to address the reliability analysis of a cluster-based MAWN. The cluster-based MAWN is deployed in applications with constraints on networking resources such as bandwidth and energy. This paper presents the problem's formulation, a discussion of applicable reliability metrics for the MAWN, and illustration of a Monte Carlo simulation method through the analysis of several example networks.
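The Monte Carlo approach named at the end of the abstract reduces, for any one sampled configuration, to a two-terminal connectivity check. A minimal sketch for a single static snapshot (the topology, link-up probability, terminals, and trial count are invented for illustration; the paper's cluster-based MAWN formulation is more elaborate):

```python
import random

# Illustrative static snapshot of a small topology (one MAWN configuration).
EDGES = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]
N_NODES, SOURCE, DEST = 4, 0, 3
P_LINK_UP = 0.9   # assumed per-link availability

def connected(up_edges, src, dst, n):
    """Depth-first search: is dst reachable from src over the surviving links?"""
    adj = {i: [] for i in range(n)}
    for a, b in up_edges:
        adj[a].append(b)
        adj[b].append(a)
    stack, seen = [src], {src}
    while stack:
        v = stack.pop()
        if v == dst:
            return True
        for w in adj[v]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return False

def mc_two_terminal_reliability(trials=20000, seed=1):
    """Fraction of sampled link states in which source and destination connect."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        up = [e for e in EDGES if rng.random() < P_LINK_UP]
        ok += connected(up, SOURCE, DEST, N_NODES)
    return ok / trials

est = mc_two_terminal_reliability()
```

Averaging the connectivity indicator over sampled configurations gives the reliability estimate; embedding a mobility or cluster model would change how each trial's edge set is drawn.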

  19. Reliability Analysis of Wireless Sensor Networks Using Markovian Model

    Directory of Open Access Journals (Sweden)

    Jin Zhu

    2012-01-01

    Full Text Available This paper investigates reliability analysis of wireless sensor networks whose topology switches among possible connections governed by a Markov chain. We give the quantitative relations between network topology, data acquisition rate, nodes' calculation ability, and network reliability. By applying the Lyapunov method, sufficient conditions for network reliability are proposed for such topology-switching networks with constant or varying data acquisition rates. With the conditions satisfied, the quantity of data transported over a wireless network node will not exceed node capacity, such that reliability is ensured. Our theoretical work helps to provide a deeper understanding of real-world wireless sensor networks, which may find application in the fields of network design and topology control.

  20. Food labelled Information: An Empirical Analysis of Consumer Preferences

    Directory of Open Access Journals (Sweden)

    Alessandro Banterle

    2012-12-01

    Full Text Available This paper aims at analysing which kinds of currently labelled information are of interest and actually used by consumers, and which additional kinds could improve consumer choices. We investigate the attitude of consumers with respect to innovative strategies for the diffusion of product information, such as smart labels for mobile phones. The empirical analysis was organised in focus groups followed by a survey of 240 consumers. Results show that the most important nutritional claims are vitamins, energy and fat content. Consumers show a high interest in the origin of the products, GMOs, environmental impact, animal welfare and type of breeding.

  1. An Empirical Analysis of Odd Pricing Using PSM Data

    OpenAIRE

    Okuse, Yoshiyuki

    2016-01-01

    It is evident in our daily lives that most consumer goods are not sold at the just price, but rather at the just-below price. To examine the effect of odd pricing, including just-below pricing, numerous empirical studies have been conducted. In spite of these efforts, a consistent conclusion has not been obtained so far. The goals of this research are: (1) to examine the existence of the effect of odd pricing on consumers' price acceptance using PSM analysis, and (2) to examine the mechanisms ...

  2. Reliability of the Emergency Severity Index: Meta-analysis

    Directory of Open Access Journals (Sweden)

    Amir Mirhaghi

    2015-01-01

    Full Text Available Objectives: Although triage systems based on the Emergency Severity Index (ESI) have many advantages in terms of simplicity and clarity, previous research has questioned their reliability in practice. Therefore, the aim of this meta-analysis was to determine the reliability of ESI triage scales. Methods: This meta-analysis was performed in March 2014. Electronic research databases were searched and articles conforming to the Guidelines for Reporting Reliability and Agreement Studies were selected. Two researchers independently examined selected abstracts. Data were extracted in the following categories: version of scale (latest/older), participants (adult/paediatric), raters (nurse, physician or expert), method of reliability (intra/inter-rater), reliability statistics (weighted/unweighted kappa) and the origin and publication year of the study. The effect size was obtained by the Z-transformation of reliability coefficients. Data were pooled with random-effects models and a meta-regression was performed based on the method of moments estimator. Results: A total of 19 studies from six countries were included in the analysis. The pooled coefficient for the ESI triage scales was substantial at 0.791 (95% confidence interval: 0.787‒0.795). Agreement was higher with the latest and adult versions of the scale and among expert raters, compared to agreement with older and paediatric versions of the scales and with other groups of raters, respectively. Conclusion: ESI triage scales showed an acceptable level of overall reliability. However, ESI scales require more development in order to see full agreement from all rater groups. Further studies concentrating on other aspects of reliability assessment are needed.
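The pooling step in the Methods, z-transforming per-study coefficients before combining them, can be sketched as follows. This is a simplified inverse-variance (fixed-effect) version rather than the study's random-effects model with a method-of-moments estimator, and the kappa values and sample sizes are invented for illustration:

```python
import math

# Hypothetical per-study kappa coefficients and sample sizes (illustrative only).
studies = [(0.76, 120), (0.82, 200), (0.71, 90), (0.85, 310)]

def fisher_z(r):
    return 0.5 * math.log((1.0 + r) / (1.0 - r))

def pooled_kappa(studies):
    """Inverse-variance pooling on the Fisher z scale; var(z) ~ 1/(n - 3)."""
    num = sum((n - 3) * fisher_z(k) for k, n in studies)
    den = sum(n - 3 for _, n in studies)
    return math.tanh(num / den)   # back-transform the pooled z

k_pooled = pooled_kappa(studies)
```

A random-effects version would add a between-study variance component to each weight.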

  3. A regional comparative analysis of empirical and theoretical flood peak-volume relationships

    Directory of Open Access Journals (Sweden)

    Szolgay Ján

    2016-12-01

    Full Text Available This paper analyses the bivariate relationship between flood peaks and corresponding flood event volumes modelled by empirical and theoretical copulas in a regional context, with a focus on flood generation processes in general, the regional differentiation of these processes, and the effect of the sample size on reliable discrimination among models. A total of 72 catchments in the North-West of Austria are analysed for the period 1976–2007. From the hourly runoff data set, 25 697 flood events were isolated and assigned to one of three flood process types: synoptic floods (including long- and short-rain floods), flash floods, or snowmelt floods (both rain-on-snow and snowmelt floods). The first step of the analysis examines whether the empirical peak-volume copulas of different flood process types are regionally statistically distinguishable, separately for each catchment, and the role of the sample size in the strength of these statements. The results indicate that the empirical copulas of flash floods tend to be different from those of the synoptic and snowmelt floods. The second step examines how similar the empirical flood peak-volume copulas are between catchments for a given flood type across the region. Empirical copulas of synoptic floods are the least similar between the catchments; however, with decreasing sample size the difference between the performances of the process types becomes small. The third step examines the goodness-of-fit of different commonly used copula types to the data samples that represent the annual maxima of flood peaks and the respective volumes, both regardless of flood generating processes (the traditional engineering approach) and considering the three process-based classes. Extreme value copulas (Galambos, Gumbel and Hüsler-Reiss) show the best performance both for synoptic and flash floods, while the Frank copula shows the best performance for snowmelt floods. It is concluded that there is merit in treating flood
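For the Gumbel copula named above, the dependence parameter theta relates to Kendall's tau by tau = 1 - 1/theta, so a simple moment-style estimate inverts the observed tau. A minimal sketch (the peak and volume values are invented, not taken from the Austrian data set):

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall's tau from concordant/discordant pair counts (no tie handling)."""
    conc = disc = 0
    for i, j in combinations(range(len(x)), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            conc += 1
        elif s < 0:
            disc += 1
    return (conc - disc) / (len(x) * (len(x) - 1) / 2)

def gumbel_theta(tau):
    """Gumbel copula: tau = 1 - 1/theta, hence theta = 1/(1 - tau)."""
    return 1.0 / (1.0 - tau)

# Invented flood peaks (m^3/s) and event volumes (10^6 m^3).
peaks = [5.0, 9.0, 3.0, 7.0, 6.0]
volumes = [55.0, 80.0, 40.0, 75.0, 50.0]
tau = kendall_tau(peaks, volumes)
theta = gumbel_theta(tau)
```

Formal goodness-of-fit comparison across copula families, as in the paper's third step, would then test the fitted copula against the empirical one.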

  4. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  5. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    Science.gov (United States)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both: A broad perspective on data analysis collection and evaluation issues. A narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining a risk informed decision making environment that is being sought by NASA requirements and procedures such as 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).

  6. Notes on numerical reliability of several statistical analysis programs

    Science.gov (United States)

    Landwehr, J.M.; Tasker, Gary D.

    1999-01-01

    This report presents a benchmark analysis of several statistical analysis programs currently in use in the USGS. The benchmark consists of a comparison between the values provided by a statistical analysis program for variables in the reference data set ANASTY and their known or calculated theoretical values. The ANASTY data set is an amendment of the Wilkinson NASTY data set that has been used in the statistical literature to assess the reliability (computational correctness) of calculated analytical results.

  7. Distribution System Reliability Analysis for Smart Grid Applications

    Science.gov (United States)

    Aljohani, Tawfiq Masad

    Reliability of power systems is a key aspect in modern power system planning, design, and operation. The ascendance of the smart grid concept has provided high hopes of developing an intelligent network that is capable of being a self-healing grid, offering the ability to overcome the interruption problems that face the utility and cost it tens of millions in repair and losses. To address these reliability concerns, power utilities and interested parties have spent an extensive amount of time and effort analyzing and studying the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to improving the reliability of the distribution network, the connection joint between the power providers and the consumers, where most electricity problems occur. In this work, we examine the effect of smart grid applications in improving the reliability of power distribution networks. The test system used in this thesis is the IEEE 34 node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of automatic switching devices and quantify their proper installation based on the performance of the distribution system, measured as changes in the system reliability indices, including SAIDI, SAIFI, and EUE. A further goal is to design and simulate the effect of installing Distributed Generators (DGs) on the utility's distribution system and measure the potential improvement in its reliability. The software used in this work is DISREL, an intelligent power distribution package developed by General Reliability Co.
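The indices named above have simple definitions (per IEEE Std 1366): SAIFI is total customer interruptions divided by customers served, and SAIDI is total customer interruption duration divided by customers served. A minimal sketch with invented outage data:

```python
# Invented outage log: (customers_affected, duration_hours) per interruption.
outages = [(150, 2.0), (40, 0.5), (300, 1.25)]
TOTAL_CUSTOMERS = 1000

# SAIFI: average number of interruptions per customer served.
saifi = sum(c for c, _ in outages) / TOTAL_CUSTOMERS
# SAIDI: average interruption duration (here in hours) per customer served.
saidi = sum(c * d for c, d in outages) / TOTAL_CUSTOMERS
```

Placing switches or DGs changes which customers each fault interrupts and for how long, which is how the improvement shows up in these indices.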

  8. Reliability analysis of retaining walls with multiple failure modes

    Institute of Scientific and Technical Information of China (English)

    张道兵; 孙志彬; 朱川曲

    2013-01-01

    To reduce the errors that arise in the reliability analysis of retaining wall structures when establishing the performance function, estimating parameters and choosing the algorithm, firstly, two new reliability and stability models for anti-sliding and anti-overturning, based on the upper-bound theory of limit analysis, were established, and the two failure modes were treated as a series system with multiple correlated failure modes. Then, the statistical characteristics of the parameters of the retaining wall structure were inferred by the maximum entropy principle. Finally, the structural reliabilities for the single failure mode and for multiple failure modes were calculated by the Monte Carlo method in MATLAB, and the results were compared and analyzed for sensitivity. The analysis indicates that this method, with high precision, is not only easy to program and quick in calculation, but is also free of the limits of nonlinear functions and non-normal random variables. The results obtained by applying the limit analysis theory, the maximum entropy principle and the Monte Carlo method together to the reliability of retaining wall structures are more scientific, accurate and reliable than those calculated by the traditional method.
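The Monte Carlo treatment of anti-sliding and anti-overturning as a series system with correlated modes can be sketched as below. The limit-state functions and parameter distributions are invented stand-ins, not the paper's upper-bound limit analysis models; the two modes are coupled through the shared friction angle:

```python
import random

def failure_probability(trials=50000, seed=2):
    """Crude Monte Carlo for a two-mode series system (illustrative only)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        phi = rng.gauss(30.0, 2.0)     # friction angle (deg), shared by both modes
        gamma = rng.gauss(18.0, 1.0)   # soil unit weight (kN/m^3)
        g_slide = 0.9 * phi - 22.0             # invented anti-sliding margin
        g_overturn = 0.6 * phi + gamma - 32.0  # invented anti-overturning margin
        if g_slide < 0.0 or g_overturn < 0.0:  # series system: either mode fails
            failures += 1
    return failures / trials

pf = failure_probability()
```

Because both margins depend on phi, the modes are correlated and the system failure probability is less than the sum of the individual mode probabilities.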

  9. Reliability Analysis of a Green Roof Under Different Storm Scenarios

    Science.gov (United States)

    William, R. K.; Stillwell, A. S.

    2015-12-01

    Urban environments continue to face the challenges of localized flooding and decreased water quality brought on by the increasing amount of impervious area in the built environment. Green infrastructure provides an alternative to conventional storm sewer design by using natural processes to filter and store stormwater at its source. However, there are currently few consistent standards available in North America to ensure that installed green infrastructure is performing as expected. This analysis offers a method for characterizing green roof failure using a visual aid commonly used in earthquake engineering: fragility curves. We adapted the concept of the fragility curve based on the efficiency in runoff reduction provided by a green roof compared to a conventional roof under different storm scenarios. We then used the 2D distributed surface water-groundwater coupled model MIKE SHE to model the impact that a real green roof might have on runoff in different storm events. We then employed a multiple regression analysis to generate an algebraic demand model that was input into the Matlab-based reliability analysis model FERUM, which was then used to calculate the probability of failure. The use of reliability analysis as a part of green infrastructure design code can provide insights into green roof weaknesses and areas for improvement. It also supports the design of code that is more resilient than current standards and is easily testable for failure. Finally, the understanding of reliability of a single green roof module under different scenarios can support holistic testing of system reliability.
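Fragility curves borrowed from earthquake engineering are most often parameterized as a lognormal CDF of the demand measure, and evaluating one is a one-liner. The parameterization and numbers here are generic assumptions, not the study's fitted MIKE SHE/FERUM demand model:

```python
import math

def lognormal_fragility(demand, median, beta):
    """P(failure | demand) for a lognormal fragility curve.

    median: demand at which failure probability is 50%;
    beta: lognormal dispersion (log-standard deviation).
    """
    z = math.log(demand / median) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative: probability that a roof "fails" (runoff reduction below target)
# at increasing storm depths, with an assumed median capacity of 100 mm.
curve = [lognormal_fragility(d, 100.0, 0.4) for d in (60.0, 100.0, 150.0)]
```

Reading failure probabilities off such a curve for each storm scenario is what makes the design "easily testable for failure", as the abstract puts it.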

  10. Semigroup Method for a Mathematical Model in Reliability Analysis

    Institute of Scientific and Technical Information of China (English)

    Geni Gupur; LI Xue-zhi

    2001-01-01

    The system, which consists of a reliable machine, an unreliable machine and a storage buffer with infinitely many workpieces, has been studied. The existence of a unique positive time-dependent solution of the model corresponding to the system has been obtained by using the C0-semigroup theory of linear operators in functional analysis.

  11. Reliability-Based Robustness Analysis for a Croatian Sports Hall

    DEFF Research Database (Denmark)

    Čizmar, Dean; Kirkegaard, Poul Henning; Sørensen, John Dalsgaard

    2011-01-01

    . A complex timber structure with a large number of failure modes is modelled with only a few dominant failure modes. First, a component based robustness analysis is performed based on the reliability indices of the remaining elements after the removal of selected critical elements. The robustness...

  12. Reliability-Based Robustness Analysis for a Croatian Sports Hall

    DEFF Research Database (Denmark)

    Čizmar, Dean; Kirkegaard, Poul Henning; Sørensen, John Dalsgaard;

    2011-01-01

    This paper presents a probabilistic approach for structural robustness assessment for a timber structure built a few years ago. The robustness analysis is based on a structural reliability based framework for robustness and a simplified mechanical system modelling of a timber truss system. A comp...

  13. System Reliability Analysis Capability and Surrogate Model Application in RAVEN

    Energy Technology Data Exchange (ETDEWEB)

    Rabiti, Cristian; Alfonsi, Andrea; Huang, Dongli; Gleicher, Frederick; Wang, Bei; Adbel-Khalik, Hany S.; Pascucci, Valerio; Smith, Curtis L.

    2015-11-01

    This report collects the work performed to improve the reliability analysis capabilities of the RAVEN code and explores new opportunities in the use of surrogate models, extending the current RAVEN capabilities to multi-physics surrogate models and the construction of surrogate models for high-dimensionality fields.

  14. Test-retest reliability of trunk accelerometric gait analysis

    DEFF Research Database (Denmark)

    Henriksen, Marius; Lund, Hans; Moe-Nilssen, R

    2004-01-01

    The purpose of this study was to determine the test-retest reliability of a trunk accelerometric gait analysis in healthy subjects. Accelerations were measured during walking using a triaxial accelerometer mounted on the lumbar spine of the subjects. Six men and 14 women (mean age 35.2; range 18...

  15. Human Reliability Analysis for Digital Human-Machine Interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2014-06-01

    This paper addresses the fact that existing human reliability analysis (HRA) methods do not provide guidance on digital human-machine interfaces (HMIs). Digital HMIs are becoming ubiquitous in nuclear power operations, whether through control room modernization or new-build control rooms. Legacy analog technologies like instrumentation and control (I&C) systems are costly to support, and vendors no longer develop or support analog technology, which is considered technologically obsolete. Yet, despite the inevitability of digital HMI, no current HRA method provides guidance on how to treat human reliability considerations for digital technologies.

  16. Modelling application for cognitive reliability and error analysis method

    Directory of Open Access Journals (Sweden)

    Fabio De Felice

    2013-10-01

    Full Text Available The automation of production systems has delegated to machines the execution of highly repetitive and standardized tasks. In the last decade, however, the failure of the automatic factory model has led to partially automated configurations of production systems. In this scenario, the centrality and responsibility of the role entrusted to human operators are therefore heightened, because the role requires problem-solving and decision-making ability. Thus, the human operator is the core of a cognitive process that leads to decisions, influencing the safety of the whole system as a function of the operator's reliability. The aim of this paper is to propose a modelling application for the cognitive reliability and error analysis method.

  17. Classification using least squares support vector machine for reliability analysis

    Institute of Scientific and Technical Information of China (English)

    Zhi-wei GUO; Guang-chen BAI

    2009-01-01

    In order to improve the efficiency of the support vector machine (SVM) for classification when dealing with a large number of samples, the least squares support vector machine (LSSVM) classification method is introduced into reliability analysis. To reduce the computational cost, the solution of the SVM is transformed from a quadratic programming problem to a group of linear equations. The numerical results indicate that the reliability method based on the LSSVM for classification has higher accuracy and requires less computational cost than the SVM method.
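The key computational point of the abstract, replacing the SVM's quadratic program with one linear system, is the standard LS-SVM formulation (Suykens); a compact sketch with an RBF kernel on synthetic two-class data (the kernel width, regularization, and data are illustrative choices):

```python
import numpy as np

def rbf(X1, X2, s=1.0):
    """Gaussian (RBF) kernel matrix."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * s ** 2))

def lssvm_train(X, y, gamma=10.0, s=1.0):
    """LS-SVM classifier: one (n+1)x(n+1) linear system instead of a QP."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = rbf(X, X, s) * np.outer(y, y) + np.eye(n) / gamma
    rhs = np.zeros(n + 1)
    rhs[1:] = 1.0
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # multipliers alpha, bias b

def lssvm_predict(X, y, alpha, b, X_new, s=1.0):
    return np.sign(rbf(X_new, X, s) @ (alpha * y) + b)

# Two well-separated synthetic classes (illustrative data).
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(-2.0, 0.5, (20, 2)), rng.normal(2.0, 0.5, (20, 2))])
y = np.array([-1.0] * 20 + [1.0] * 20)
alpha, b = lssvm_train(X, y)
pred = lssvm_predict(X, y, alpha, b, X)
```

Solving for the bias and all multipliers in one linear system is the source of the speedup over the QP that the abstract reports.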

  18. The Role of Temperature in Economic Exchange - An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Dimitrijević Bojan

    2015-09-01

    Full Text Available As a synthesis of economics and physics and an attempt to apply the methods and models of statistical physics to economics, econophysics is presently a new, growing and very dynamic branch of modern science. The subject of this paper is therefore to analyse the relationship and interdependence between thermodynamics and economics, and it aims to show similarities, analogies and correspondence between the main categories, methods and models of thermodynamics on the one hand, and economics on the other. The paper analyses the relation between economics and thermodynamics, as well as the probability distributions in the kinetic theory of gases corresponding to money, income and wealth distributions; it connects entropy with utility, and the principle of operation of the thermal engine with economic exchange. The final part of the paper empirically analyses temperature differences in the exchange between Serbia and selected EU countries. There are differences in temperature between Serbia and the group of selected countries. The results of the empirical analysis show that the exchange between countries is based on principles of thermodynamics and that developed countries generate more profits and benefits from exchange.

  19. Accident Sequence Evaluation Program: Human reliability analysis procedure

    Energy Technology Data Exchange (ETDEWEB)

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the "ASEP HRA Procedure," is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs.

  20. Strength Reliability Analysis of Turbine Blade Using Surrogate Models

    Directory of Open Access Journals (Sweden)

    Wei Duan

    2014-05-01

    Full Text Available There are many stochastic parameters that affect the reliability of steam turbine blade performance in practical operation. In order to improve the reliability of blade design, it is necessary to take these stochastic parameters into account. In this study, a variable cross-section twisted blade is investigated, and geometrical parameters, material parameters and load parameters are considered as random variables. A reliability analysis method combining a Finite Element Method (FEM), a surrogate model and Monte Carlo Simulation (MCS) is applied to the blade reliability analysis. Based on the blade finite element parametrical model and the experimental design, two kinds of surrogate models, Polynomial Response Surface (PRS) and Artificial Neural Network (ANN), are applied to construct approximate analytical expressions between the blade responses (including maximum stress and deflection) and the random input variables, which act as a surrogate of the finite element solver to drastically reduce the number of simulations required. The surrogate is then used for most of the samples needed in the Monte Carlo method, and the statistical parameters and cumulative distribution functions of the maximum stress and deflection are obtained by Monte Carlo simulation. Finally, a probabilistic sensitivity analysis, which combines the magnitude of the gradient and the width of the scatter range of the random input variables, is applied to evaluate how much the maximum stress and deflection of the blade are influenced by the random nature of the input parameters.
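The surrogate-plus-MCS loop described above, fitting a cheap approximation to a handful of expensive solver runs and then running Monte Carlo on the approximation, can be sketched with a quadratic response surface. The "FEM" response, input distributions, and failure threshold below are invented; a real blade model would replace `expensive_stress`:

```python
import numpy as np

def expensive_stress(x):
    """Stand-in for an FEM maximum-stress solve (invented response, MPa)."""
    return 200.0 + x[:, 0] ** 2 / 50.0 + 3.0 * x[:, 1]

def quad_features(x):
    """Full quadratic basis in two variables for the response surface."""
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

rng = np.random.default_rng(4)

# Small experimental design: the only "expensive" solver calls we pay for.
X_design = rng.uniform([40.0, 20.0], [120.0, 60.0], size=(30, 2))
beta, *_ = np.linalg.lstsq(quad_features(X_design),
                           expensive_stress(X_design), rcond=None)
fit_err = float(np.max(np.abs(quad_features(X_design) @ beta
                              - expensive_stress(X_design))))

# Monte Carlo on the cheap surrogate (input distributions are assumptions).
X_mc = np.column_stack([rng.normal(80.0, 10.0, 200_000),
                        rng.normal(40.0, 5.0, 200_000)])
pf = float(np.mean(quad_features(X_mc) @ beta > 530.0))  # invented allowable stress
```

The 200 000 surrogate evaluations cost almost nothing compared with 30 solver runs, which is exactly the trade the paper exploits; an ANN surrogate would simply replace the polynomial fit.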

  1. Generating function approach to reliability analysis of structural systems

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    The generating function approach is an important tool for performance assessment in multi-state systems. Aiming at strength reliability analysis of structural systems, the generating function approach is introduced and developed. Static reliability models of statically determinate and indeterminate systems, as well as fatigue reliability models, are built by constructing special generating functions, which are used to describe probability distributions of strength (resistance), stress (load) and fatigue life, and by defining composite operators of generating functions and the performance structure functions thereof. When composition operators are executed, computational costs can be reduced by a large margin by collecting like terms. The results of theoretical analysis and numerical simulation show that the generating function approach can be widely used for probability modeling of large complex systems with hierarchical structures due to its unified form, compact expression, computer program realizability and high universality. Because the new method considers twin loads giving rise to component failure dependency, it can provide a theoretical reference and act as a powerful tool for static and dynamic reliability analysis of civil engineering structures and mechanical equipment systems with multi-mode damage coupling.
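The composite operator the abstract describes amounts to combining discrete stress and strength distributions (each a "generating function" of value-probability terms) and collecting like terms. A minimal single-component sketch with invented pmfs:

```python
# Discrete stress and strength pmfs written as {value: probability} terms,
# playing the role of generating functions (values are invented).
stress = {80: 0.2, 100: 0.5, 120: 0.3}
strength = {90: 0.1, 110: 0.4, 130: 0.5}

def reliability(stress_pmf, strength_pmf):
    """Composite operator: total probability of terms with strength > stress."""
    return sum(p_s * p_r
               for s, p_s in stress_pmf.items()
               for r, p_r in strength_pmf.items()
               if r > s)

R = reliability(stress, strength)
```

For a system, the component generating functions would themselves be composed through the structure function, with like terms merged at each step to keep the expression compact.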

  2. Identifying Sources of Difference in Reliability in Content Analysis

    Directory of Open Access Journals (Sweden)

    Elizabeth Murphy

    2005-07-01

    Full Text Available This paper reports on a case study which identifies and illustrates sources of difference in agreement in relation to reliability in a context of quantitative content analysis of a transcript of an online asynchronous discussion (OAD. Transcripts of 10 students in a month-long online asynchronous discussion were coded by two coders using an instrument with two categories, five processes, and 19 indicators of Problem Formulation and Resolution (PFR. Sources of difference were identified in relation to: coders; tasks; and students. Reliability values were calculated at the levels of categories, processes, and indicators. At the most detailed level of coding on the basis of the indicator, findings revealed that the overall level of reliability between coders was .591 when measured with Cohen’s kappa. The difference between tasks at the same level ranged from .349 to .664, and the difference between participants ranged from .390 to .907. Implications for training and research are discussed.
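Cohen's kappa, used above to quantify inter-coder agreement, corrects the observed agreement for the chance agreement implied by each coder's marginal code frequencies. A minimal sketch with invented codes from two coders:

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Chance-corrected agreement between two coders over the same units."""
    n = len(codes_a)
    p_obs = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    ca, cb = Counter(codes_a), Counter(codes_b)
    p_exp = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / n ** 2
    return (p_obs - p_exp) / (1.0 - p_exp)

# Invented codes for eight transcript segments (P = problem, R = resolution).
coder_a = ["P", "P", "R", "R", "P", "R", "P", "R"]
coder_b = ["P", "R", "R", "R", "P", "R", "P", "P"]
kappa = cohens_kappa(coder_a, coder_b)
```

Here raw agreement is 0.75 but kappa is lower, which illustrates why the paper reports kappa alongside proportion-of-agreement figures.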

  3. Reliability Analysis of Free Jet Scour Below Dams

    Directory of Open Access Journals (Sweden)

    Chuanqi Li

    2012-12-01

    Full Text Available Current formulas for calculating scour depth below a free overfall are mostly deterministic in nature and do not adequately consider the uncertainties of the various scouring parameters. A reliability-based assessment of scour, taking into account the uncertainties of the parameters and coefficients involved, should therefore be performed. This paper studies the reliability of a dam foundation under the threat of scour. A model for calculating the reliability of scour and estimating the probability of failure of the dam foundation subjected to scour is presented. The Maximum Entropy Method is applied to construct the probability density function (PDF) of the performance function subject to the moment constraints. Monte Carlo simulation (MCS) is applied for uncertainty analysis. An example is considered, the reliability of its scour is computed, and the influence of the various random variables on the probability of failure is analyzed.

  4. Modeling and Analysis of Component Faults and Reliability

    DEFF Research Database (Denmark)

    Le Guilly, Thibaut; Olsen, Petur; Ravn, Anders Peter;

    2016-01-01

    This chapter presents a process to design and validate models of reactive systems in the form of communicating timed automata. The models are extended with faults associated with probabilities of occurrence. This enables a fault tree analysis of the system using minimal cut sets that are automatically generated. The stochastic information on the faults is used to estimate the reliability of the fault-affected system. The reliability is given with respect to properties of the system state space. We illustrate the process on a concrete example using the Uppaal model checker for validating the ideal system model and the fault modeling. Then the statistical version of the tool, UppaalSMC, is used to find reliability estimates.
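Independent of Uppaal, the minimal-cut-set idea behind fault tree analysis can be sketched by brute-force enumeration over basic events; the toy AND/OR tree below is an illustration, not the chapter's generated model:

```python
from itertools import combinations

def evaluates(tree, failed):
    """Evaluate an AND/OR fault tree given a set of failed basic events."""
    if isinstance(tree, str):            # leaf: basic event
        return tree in failed
    gate, *children = tree
    results = [evaluates(c, failed) for c in children]
    return all(results) if gate == 'AND' else any(results)

def basic_events(tree, acc=None):
    acc = set() if acc is None else acc
    if isinstance(tree, str):
        acc.add(tree)
    else:
        for c in tree[1:]:
            basic_events(c, acc)
    return acc

def minimal_cut_sets(tree):
    """Smallest sets of basic events whose joint failure triggers the top event."""
    events = sorted(basic_events(tree))
    cuts = []
    for r in range(1, len(events) + 1):
        for combo in combinations(events, r):
            s = set(combo)
            # Keep s only if it causes the top event and no known cut is inside it
            if evaluates(tree, s) and not any(c <= s for c in cuts):
                cuts.append(s)
    return cuts
```

Exhaustive enumeration is exponential in the number of basic events; tools like Uppaal derive the cut sets symbolically instead.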

  5. Reliability analysis of two unit parallel repairable industrial system

    Directory of Open Access Journals (Sweden)

    Mohit Kumar Kakkar

    2015-09-01

    Full Text Available The aim of this work is to present a reliability and profit analysis of a two-dissimilar-unit parallel system under the assumptions that the operative unit cannot fail after post-repair inspection and replacement and that there is only one repair facility. Failure and repair times of each unit are assumed to be uncorrelated. Using the regenerative point technique, various reliability characteristics are obtained which are useful to system designers and industrial managers; in particular, some important measures for this two non-identical unit system model with repair, inspection and post repair are derived. Graphical behaviors of the mean time to system failure (MTSF) and the profit function have also been studied.
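The regenerative point technique itself is beyond a short sketch, but for the simplest special case, two identical units in parallel with one repair facility and exponential failure/repair times, the MTSF follows from first-step analysis of the underlying Markov chain (the paper's model with dissimilar units, inspection and post repair is more elaborate):

```python
def mtsf_two_unit_parallel(lam, mu):
    """Mean time to system failure for two identical parallel units,
    failure rate lam per unit, single repair facility with rate mu.
    States: 2 up, 1 up, 0 up (absorbing). First-step equations:
      T2 = 1/(2*lam) + T1
      T1 = 1/(lam + mu) + (mu/(lam + mu)) * T2
    which solve to T1 = (2*lam + mu)/(2*lam**2)."""
    t1 = (2 * lam + mu) / (2 * lam ** 2)
    return 1 / (2 * lam) + t1
```

As expected, MTSF grows with the repair rate: faster repair makes simultaneous failure of both units less likely.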

  6. Reliability and maintainability analysis of electrical system of drum shearers

    Institute of Scientific and Technical Information of China (English)

    SEYED Hadi Hoseinie; MOHAMMAD Ataei; REZA Khalokakaie; UDAY Kumar

    2011-01-01

    The reliability and maintainability of the electrical system of a drum shearer at the Parvade.l Coal Mine in central Iran were analyzed. The maintenance and failure data were collected during 19 months of shearer operation. According to trend and serial correlation tests, the data were independent and identically distributed (iid), and therefore statistical techniques were used for modeling. The data analysis shows that the time between failures (TBF) and time to repair (TTR) data follow the lognormal and three-parameter Weibull distributions, respectively. Reliability-based preventive maintenance time intervals for the electrical system of the drum shearer were calculated from the reliability plot. The reliability-based maintenance intervals for the 90%, 80%, 70% and 50% reliability levels are 9.91, 17.96, 27.56 and 56.1 h, respectively. The calculations also show that the time to repair (TTR) of this system varies in the range of 0.17-4 h, with a mean time to repair (MTTR) of 1.002 h. There is an 80% chance that a repair of the electrical system of the shearer will be accomplished within 1.45 h.
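Given a fitted three-parameter Weibull reliability function, the reliability-based maintenance interval for a target reliability level is obtained by inverting R(t). A sketch with illustrative parameters, since the paper's fitted shearer parameters are not given in the abstract:

```python
import math

def pm_interval(reliability, beta, eta, gamma=0.0):
    """Time at which the 3-parameter Weibull survival function
    R(t) = exp(-((t - gamma)/eta)**beta) drops to `reliability`.
    beta: shape, eta: scale, gamma: location (all illustrative here)."""
    return gamma + eta * (-math.log(reliability)) ** (1 / beta)
```

Lower target reliability yields a longer permissible interval, matching the pattern of the 9.91 h (90%) to 56.1 h (50%) intervals reported above.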

  7. Reliability analysis method for slope stability based on sample weight

    Directory of Open Access Journals (Sweden)

    Zhi-gang YANG

    2009-09-01

    Full Text Available The single safety factor criteria for slope stability evaluation, derived from the rigid limit equilibrium method or finite element method (FEM, may not include some important information, especially for steep slopes with complex geological conditions. This paper presents a new reliability method that uses sample weight analysis. Based on the distribution characteristics of random variables, the minimal sample size of every random variable is extracted according to a small sample t-distribution under a certain expected value, and the weight coefficient of each extracted sample is considered to be its contribution to the random variables. Then, the weight coefficients of the random sample combinations are determined using the Bayes formula, and different sample combinations are taken as the input for slope stability analysis. According to one-to-one mapping between the input sample combination and the output safety coefficient, the reliability index of slope stability can be obtained with the multiplication principle. Slope stability analysis of the left bank of the Baihetan Project is used as an example, and the analysis results show that the present method is reasonable and practicable for the reliability analysis of steep slopes with complex geological conditions.

  8. Semantic Web for Reliable Citation Analysis in Scholarly Publishing

    Directory of Open Access Journals (Sweden)

    Ruben Tous

    2011-03-01

    Full Text Available Analysis of the impact of scholarly artifacts is constrained by current unreliable practices in cross-referencing, citation discovering, and citation indexing and analysis, which have not kept pace with the technological advances that are occurring in several areas such as knowledge management and security. Because citation analysis has become the primary component in scholarly impact factor calculation, and considering the relevance of this metric within both the scholarly publishing value chain and (especially important) the professional curriculum evaluation of scholarly professionals, we argue that current practices need to be revised. This paper describes a reference architecture that aims to provide openness and reliability to the citation-tracking lifecycle. The solution relies on the use of digitally signed semantic metadata in the different stages of the scholarly publishing workflow, in such a manner that authors, publishers, repositories, and citation-analysis systems will have access to independent reliable evidence that is resistant to forgery, impersonation, and repudiation. As far as we know, this is the first paper to combine Semantic Web technologies and public-key cryptography to achieve reliable citation analysis in scholarly publishing.

  9. Empirical Analysis of the Vegetable Industry in Hebei Province

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    We first introduce the status quo of the development of the vegetable industry in Hebei Province, and then conduct an empirical analysis of its development. Further, we analyze the development advantage of the vegetable industry in Hebei Province using SAI (Scale Advantage Indices) and SCA (Symmetric Comparative Advantage), drawing the conclusion that the vegetable industry in Hebei Province has much room for development; at the same time, we analyze the factors influencing vegetable consumption of residents in Hebei Province through a regression model, drawing the conclusion that the vegetable consumer price index is the main factor affecting consumption. Finally, we make recommendations for the development of the vegetable industry in Hebei Province as follows: increasing financial input and promoting policy guarantee capacity; implementing brand strategy to promote the competitiveness of products; and improving the ecological environment to promote the industrialization of pollution-free vegetables.

  10. Empirical Analysis on Factors Influencing Distribution of Vegetal Production

    Institute of Scientific and Technical Information of China (English)

    Wenjie; WU

    2015-01-01

    Since the reform and opening-up, there has been a great change in the spatial pattern of China's vegetable production. This paper studied vegetable production in the provinces of China over 1978-2013. In terms of sequential characteristics, China's vegetable production area is constantly growing and exhibits stage characteristics. In terms of spatial distribution, China's vegetable production shows a trend of "going down to the south" and "marching to the west". In order to grasp the rules of change in vegetable production and the influencing factors, this paper made a theoretical and empirical analysis of the factors possibly influencing the distribution of vegetable production. Results show that the major factors influencing the distribution of China's vegetable production include irrigation conditions, non-agricultural employment, market demand, knowledge spillover, comparative effectiveness, rural roads and government policies.

  11. Empirical Analysis: Business Cycles and Inward FDI in China

    Directory of Open Access Journals (Sweden)

    Qiyun Fang

    2007-01-01

    Full Text Available It is well-known that the current speeding-up of globalization has been, on one hand, spreading macroeconomic effects around the world while, on the other, fueling firms’ activities of crossing national borders. Then, are there any links between these two influences? In this paper, we chose China as our subject to try to clarify this. A set of models for the Granger causality test and VAR impulse responses were constructed, and econometric estimations and empirical analysis were made by employing the latest 20 years of authorized annual statistical data. The findings clearly indicate that firms’ foreign activities (inward FDI) do respond pro-cyclically to business cycle developments in the long term.

  12. Empirical Analysis of Xinjiang's Bilateral Trade: Gravity Model Approach

    Institute of Scientific and Technical Information of China (English)

    CHEN Xuegang; YANG Zhaoping; LIU Xuling

    2008-01-01

    Based on the basic trade gravity model and Xinjiang's practical situation, new explanatory variables (GDP, GDPpc and SCO) are introduced to build an extended trade gravity model fitting Xinjiang's bilateral trade. From the empirical analysis of this model, it is proposed that those three variables affect Xinjiang's bilateral trade positively, whereas geographic distance is found to be a significant factor influencing Xinjiang's bilateral trade negatively. Then, by the extended trade gravity model, this article analyzes the present trade situation between Xinjiang and its main trade partners quantitatively in 2004. The results indicate that Xinjiang cooperates successfully with most of its trade partners in terms of present economic scale and development level. Xinjiang has successfully established trade partnerships with Central Asia, Central and Eastern Europe, Western Europe, East Asia and South Asia. However, the development of foreign trade with West Asia is much slower. Finally, some suggestions on developing Xinjiang's foreign trade are put forward.
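A gravity model of this kind is typically estimated as a log-linear regression. A self-contained sketch on synthetic data with illustrative coefficients (not Xinjiang's estimates), using ordinary least squares via the normal equations:

```python
import random

def ols(X, y):
    """Ordinary least squares via (X'X) b = X'y, solved by Gaussian
    elimination with partial pivoting; fine for a handful of regressors."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for i in range(k):                       # forward elimination
        p = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        b[i], b[p] = b[p], b[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            for c in range(i, k):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    coef = [0.0] * k
    for i in reversed(range(k)):             # back substitution
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, k))) / A[i][i]
    return coef

# Synthetic gravity data: ln T = b0 + b1*ln(GDP) + b2*ln(distance) + noise
rng = random.Random(0)
rows, y = [], []
for _ in range(400):
    ln_gdp, ln_dist = rng.uniform(5, 9), rng.uniform(5, 9)
    rows.append([1.0, ln_gdp, ln_dist])
    y.append(1.0 + 0.8 * ln_gdp - 0.6 * ln_dist + rng.gauss(0, 0.05))

coef = ols(rows, y)  # recovers approximately [1.0, 0.8, -0.6]
```

The positive GDP elasticity and negative distance elasticity mirror the signs the abstract reports.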

  13. Empirical Analysis and Automated Classification of Security Bug Reports

    Science.gov (United States)

    Tyo, Jacob P.

    2016-01-01

    With the ever expanding amount of sensitive data being placed into computer systems, the need for effective cybersecurity is of utmost importance. However, there is a shortage of detailed empirical studies of security vulnerabilities from which cybersecurity metrics and best practices could be determined. This thesis has two main research goals: (1) to explore the distribution and characteristics of security vulnerabilities based on the information provided in bug tracking systems and (2) to develop data analytics approaches for automatic classification of bug reports as security or non-security related. This work is based on using three NASA datasets as case studies. The empirical analysis showed that the majority of software vulnerabilities belong to only a small number of types. Addressing these types of vulnerabilities will consequently lead to cost-efficient improvement of software security. Since this analysis requires labeling of each bug report in the bug tracking system, we explored using machine learning to automate the classification of each bug report as security or non-security related (two-class classification), as well as each security related bug report as a specific security type (multiclass classification). In addition to using supervised machine learning algorithms, a novel unsupervised machine learning approach is proposed. An accuracy of 92%, recall of 96%, precision of 92%, probability of false alarm of 4%, F-Score of 81% and G-Score of 90% were the best results achieved during two-class classification. Furthermore, an accuracy of 80%, recall of 80%, precision of 94%, and F-score of 85% were the best results achieved during multiclass classification.
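The reported metrics all derive from the confusion matrix. A sketch of their computation; the G-Score definition used here, the harmonic mean of recall and (1 − PFA), is an assumption, since the abstract does not define it:

```python
def classifier_metrics(tp, fp, tn, fn):
    """Two-class metrics as used for security bug report classification."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    recall = tp / (tp + fn)
    precision = tp / (tp + fp)
    pfa = fp / (fp + tn)                  # probability of false alarm
    f_score = 2 * precision * recall / (precision + recall)
    # Assumed definition: harmonic mean of recall and (1 - PFA)
    g_score = 2 * recall * (1 - pfa) / (recall + (1 - pfa))
    return dict(accuracy=accuracy, recall=recall, precision=precision,
                pfa=pfa, f_score=f_score, g_score=g_score)
```

On imbalanced data such as security bug reports, recall and PFA are usually more informative than raw accuracy.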

  14. Reliability test and failure analysis of high power LED packages*

    Institute of Scientific and Technical Information of China (English)

    Chen Zhaohui; Zhang Qin; Wang Kai; Luo Xiaobing; Liu Sheng

    2011-01-01

    A new type of application-specific light emitting diode (LED) package (ASLP) with a freeform polycarbonate lens for street lighting is developed, whose manufacturing processes are compatible with a typical LED packaging process. The reliability test methods and failure criteria from different vendors are reviewed and compared. It is found that test methods and failure criteria are quite different, and rapid reliability assessment standards are urgently needed by the LED industry. Testing at 85 °C/85% RH with 700 mA was used on our LED modules and those of three other vendors for 1000 h: our modules showed no visible degradation in optical performance, while those of two other vendors showed significant degradation. Failure analysis methods such as C-SAM, nano X-ray CT and optical microscopy were applied to the LED packages. Failure mechanisms such as delaminations and cracks were detected in the LED packages after the accelerated reliability testing. The finite element simulation method is helpful for failure analysis and reliability design of LED packaging. One example shows that a module currently used in industry is vulnerable and may not easily pass harsh thermal cycle testing.

  15. A Most Probable Point-Based Method for Reliability Analysis, Sensitivity Analysis and Design Optimization

    Science.gov (United States)

    Hou, Gene J.-W.; Gumbert, Clyde R.; Newman, Perry A.

    2004-01-01

    A major step in a most probable point (MPP)-based method for reliability analysis is to determine the MPP. This is usually accomplished by using an optimization search algorithm. The optimal solutions associated with the MPP provide measurements related to safety probability. This study focuses on two commonly used approximate probability integration methods, i.e., the Reliability Index Approach (RIA) and the Performance Measurement Approach (PMA). Their reliability sensitivity equations are first derived in this paper, based on the derivatives of their respective optimal solutions. Examples are then provided to demonstrate the use of these derivatives for better reliability analysis and Reliability-Based Design Optimization (RBDO).
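A common way to find the MPP is the Hasofer-Lind Rackwitz-Fiessler (HL-RF) iteration in standard normal space; the reliability index β is then the distance from the origin to the MPP. A sketch with a finite-difference gradient (a generic illustration, not the paper's RIA/PMA sensitivity derivations):

```python
import math

def hlrf_beta(g, n_vars, iters=50):
    """HL-RF search for the most probable point of g(u) = 0 in
    standard normal space. Returns (beta, mpp)."""
    u = [0.0] * n_vars
    h = 1e-6
    for _ in range(iters):
        # Forward-difference gradient of the limit-state function
        grad = []
        for i in range(n_vars):
            up = u[:]
            up[i] += h
            grad.append((g(up) - g(u)) / h)
        norm2 = sum(gi * gi for gi in grad)
        # HL-RF update: u_new = ((grad . u - g(u)) / |grad|^2) * grad
        scale = (sum(gi * ui for gi, ui in zip(grad, u)) - g(u)) / norm2
        u = [scale * gi for gi in grad]
    beta = math.sqrt(sum(ui * ui for ui in u))
    return beta, u
```

For a linear limit state such as g = 3 + u0 − u1, the iteration converges in one step to β = 3/√2.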

  16. Fatigue Reliability Analysis of a Mono-Tower Platform

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard; Brincker, Rune

    1991-01-01

    In this paper, a fatigue reliability analysis of a Mono-tower platform is presented. The failure mode, fatigue failure in the butt welds, is investigated with two different models: the one with the fatigue strength expressed through SN relations, the other with the fatigue strength expressed through ... of the natural period, damping ratio, current, stress spectrum and parameters describing the fatigue strength. Further, soil damping is shown to be significant for the Mono-tower.
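For the SN-relation model, the deterministic core is a Palmgren-Miner damage sum over the stress spectrum; a fatigue reliability analysis then treats the SN parameters and stresses as random variables. The deterministic part can be sketched as:

```python
def fatigue_damage(stress_cycles, a, m):
    """Palmgren-Miner damage sum for an SN curve N(S) = a * S**(-m).
    stress_cycles: list of (stress_range, applied_cycles) pairs.
    Failure is predicted when the sum reaches 1."""
    return sum(n / (a * s ** (-m)) for s, n in stress_cycles)
```

With a = 1e12 and m = 3 (illustrative values), a stress range of 100 gives N = 1e6 cycles to failure, so 5e5 applied cycles consume half the fatigue life.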

  17. Analysis of Gumbel Model for Software Reliability Using Bayesian Paradigm

    Directory of Open Access Journals (Sweden)

    Raj Kumar

    2012-12-01

    Full Text Available In this paper, we have illustrated the suitability of the Gumbel model for software reliability data. The model parameters are estimated using likelihood-based inferential procedures: classical as well as Bayesian. The quasi Newton-Raphson algorithm is applied to obtain the maximum likelihood estimates and associated probability intervals. The Bayesian estimates of the parameters of the Gumbel model are obtained using the Markov Chain Monte Carlo (MCMC) simulation method in OpenBUGS (established software for Bayesian analysis using Markov Chain Monte Carlo methods). R functions are developed to study the statistical properties, model validation and comparison tools of the model, and the output analysis of MCMC samples generated from OpenBUGS. Details of applying MCMC to parameter estimation for the Gumbel model are elaborated, and a real software reliability data set is considered to illustrate the methods of inference discussed in this paper.
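As a simpler stand-in for the likelihood-based fits discussed in the paper, Gumbel parameters can be estimated by the method of moments, and samples drawn by inverting the CDF:

```python
import math
import random
import statistics

def gumbel_moment_fit(data):
    """Method-of-moments estimates for the Gumbel (maximum) model:
    scale = s*sqrt(6)/pi, location = mean - Euler_gamma * scale."""
    mean = statistics.fmean(data)
    s = statistics.stdev(data)
    scale = s * math.sqrt(6) / math.pi
    loc = mean - 0.5772156649 * scale
    return loc, scale

def gumbel_sample(loc, scale, n, seed=1):
    """Draw n Gumbel variates via the inverse CDF x = loc - scale*ln(-ln U)."""
    rng = random.Random(seed)
    return [loc - scale * math.log(-math.log(rng.random())) for _ in range(n)]
```

Moment estimates are less efficient than maximum likelihood but need no iterative solver, which makes them useful starting values for the quasi Newton-Raphson search the paper employs.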

  18. Empirical Analysis of Customer Behaviors in Chinese E-Commerce

    Directory of Open Access Journals (Sweden)

    Jinlong Wang

    2010-10-01

    Full Text Available With the burgeoning e-business websites, e-commerce in China has been developing rapidly in recent years. From the analysis of the Chinese e-commerce market, it is possible to discover customer purchasing patterns or behavior characteristics, which are indispensable knowledge for the expansion of the Chinese e-commerce market. This paper presents an empirical analysis of the sale transactions from the 360buy website, based on the analysis of time interval distributions from the perspective of customers. Results reveal that in most situations the time intervals approximately obey a power-law distribution over two orders of magnitude. Additionally, the time interval between a customer’s successive purchases can reflect how loyal the customer is to a specific product category. Moreover, we also find an interesting phenomenon about human behaviors that could be related to the psychology of customers. In general, customers’ requirements in different product categories are similar. The investigation into individual behaviors may help researchers understand how customers’ group behaviors are generated.
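A power-law claim of this kind is usually checked with the continuous maximum-likelihood (Hill) estimator of the exponent; a sketch on synthetic inter-purchase times (not the 360buy data):

```python
import math
import random

def powerlaw_alpha(intervals, x_min):
    """Continuous power-law MLE (Hill estimator):
    alpha = 1 + n / sum(ln(x_i / x_min)) over x_i >= x_min."""
    tail = [x for x in intervals if x >= x_min]
    return 1 + len(tail) / sum(math.log(x / x_min) for x in tail)

def pareto_sample(alpha, x_min, n, seed=7):
    """Synthetic power-law interarrival times via the inverse CDF."""
    rng = random.Random(7 if seed is None else seed)
    return [x_min * rng.random() ** (-1 / (alpha - 1)) for _ in range(n)]
```

On real data one would also choose x_min by a goodness-of-fit criterion rather than fixing it in advance.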

  19. Satellite time series analysis using Empirical Mode Decomposition

    Science.gov (United States)

    Pannimpullath, R. Renosh; Doolaeghe, Diane; Loisel, Hubert; Vantrepotte, Vincent; Schmitt, Francois G.

    2016-04-01

    Geophysical fields possess large fluctuations over many spatial and temporal scales. Successive satellite images provide interesting sampling of this spatio-temporal multiscale variability. Here we propose to consider such variability by performing satellite time series analysis, pixel by pixel, using Empirical Mode Decomposition (EMD). EMD is a time series analysis technique able to decompose an original time series into a sum of modes, each one having a different mean frequency. It can be used to smooth signals and to extract trends. It is built in a data-adaptive way, and is able to extract information from nonlinear signals. Here we use MERIS Suspended Particulate Matter (SPM) data, on a weekly basis, over 10 years, giving 458 successive time steps. We have selected 5 different regions of coastal waters for the present study: Vietnam coastal waters, the Brahmaputra region, the St. Lawrence, the English Channel and the McKenzie. These regions have high SPM concentrations due to large-scale river runoff. Trend and Hurst exponents are derived for each pixel in each region. The energy is also extracted using Hilbert Spectral Analysis (HSA) along with the EMD method, and the energy of each mode is normalised by the total energy of all modes in each region.

  20. Reliability analysis method applied in slope stability: slope prediction and forecast on stability analysis

    Institute of Scientific and Technical Information of China (English)

    Wenjuan ZHANG; Li CHEN; Ning QU; Hai'an LIANG

    2006-01-01

    Landslide is one kind of geologic hazard that often happens all over the world. It brings huge losses to human life and property; therefore, it is very important to research it. This study focused on the combination of single and regional landslide analysis, and of the traditional slope stability analysis method with the reliability analysis method. Meanwhile, methods for slope prediction and forecasting and for reliability analysis were discussed.

  1. Reliability analysis based on the losses from failures.

    Science.gov (United States)

    Todinov, M T

    2006-04-01

    The conventional reliability analysis is based on the premise that increasing the reliability of a system will decrease the losses from failures. On the basis of counterexamples, it is demonstrated that this is valid only if all failures are associated with the same losses. In the case of failures associated with different losses, a system with larger reliability is not necessarily characterized by smaller losses from failures. Consequently, a theoretical framework and models are proposed for a reliability analysis linking reliability and the losses from failures. Equations related to the distributions of the potential losses from failure have been derived. It is argued that the classical risk equation only estimates the average value of the potential losses from failure and does not provide insight into the variability associated with the potential losses. Equations have also been derived for determining the potential and the expected losses from failures for nonrepairable and repairable systems with components arranged in series, with arbitrary life distributions. The equations are also valid for systems/components with multiple mutually exclusive failure modes. The expected loss given failure is a linear combination of the expected losses from failure associated with the separate failure modes, scaled by the conditional probabilities with which the failure modes initiate failure. On this basis, an efficient method for simplifying complex reliability block diagrams has been developed. Branches of components arranged in series whose failures are mutually exclusive can be reduced to single components with equivalent hazard rate, downtime, and expected costs associated with intervention and repair. A model for estimating the expected losses from early-life failures has also been developed.
For a specified time interval, the expected losses from early-life failures are a sum of the products of the expected number of failures in the specified time intervals covering the
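The stated linear-combination property for mutually exclusive failure modes is straightforward to compute; a sketch with hypothetical mode probabilities and losses:

```python
def expected_loss_given_failure(modes):
    """Expected loss given failure as a linear combination of the
    mode-specific expected losses, weighted by the conditional
    probabilities that each mutually exclusive mode initiates failure.
    modes: list of (conditional_probability, expected_loss) pairs."""
    total_p = sum(p for p, _ in modes)
    assert abs(total_p - 1.0) < 1e-9, "mode probabilities must sum to 1"
    return sum(p * loss for p, loss in modes)
```

For example, a rare expensive mode (p = 0.2, loss 1000) and a common cheap mode (p = 0.8, loss 100) combine to an expected loss of 280 given failure.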

  2. A Sensitivity Analysis on Component Reliability from Fatigue Life Computations

    Science.gov (United States)

    1992-02-01

    AD-A247 430. MTL TR 92-5: A Sensitivity Analysis on Component Reliability from Fatigue Life Computations. Donald M. Neal, William T. Matthews, Mark G. Vangel, and Trevor Rudalevige. Distribution: Defense Technical Information Center, Cameron Station, Building 5, 5010 Duke Street, Alexandria, VA 22304-6145.

  3. THE INVESTMENT RELIABILITY ANALYSIS FOR A SURFACE MINE

    Institute of Scientific and Technical Information of China (English)

    彭世济; 卢明银; 张达贤

    1990-01-01

    It is stipulated in the Chinese national document "The Economical Appraisal Methods for Construction Projects" that dynamic analysis should dominate project economic appraisal methods. This paper sets up a dynamic investment forecast model for the Yuanbaoshan Surface Coal Mine. Based on this model, the investment reliability has been analysed using simulation and analytic methods, and the probability that the designed internal rate of return can reach 8.4% has also been studied from an economic point of view.
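The appraisal criterion, whether the internal rate of return reaches 8.4%, rests on an IRR computation; under uncertainty one would repeat it over simulated cash flows and count how often the threshold is met. A minimal IRR-by-bisection sketch with illustrative cash flows, not the Yuanbaoshan forecast:

```python
def npv(rate, cash_flows):
    """Net present value of cash flows indexed by period 0, 1, 2, ..."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return by bisection on the NPV sign change;
    assumes a conventional profile (outflow first, NPV decreasing in rate)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

Wrapping this in a Monte Carlo loop over uncertain cash flows yields the probability that the IRR reaches the 8.4% target.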

  4. Spectral Analysis of Surface Wave for Empirical Elastic Design of Anchored Foundations

    Directory of Open Access Journals (Sweden)

    S. E. Chen

    2012-01-01

    Full Text Available Helical anchors are vital support components for power transmission lines. Failure of a single anchor can lead to the loss of an entire transmission line structure, which results in the loss of power for the downstream community. Despite their importance, it is not practical to use the conventional borehole method of subsurface exploration, which is labor intensive and costly, for estimating soil properties and anchor holding capacity. This paper describes the use of an empirical and elasticity-based design technique coupled with the spectral analysis of surface waves (SASW) technique to provide subsurface information for anchor foundation designs. Based on small-strain wave propagation, SASW determines the shear wave velocity profile, which is then correlated to anchor holding capacity. A pilot project involving over 400 anchor installations has been performed and demonstrated that this technique is reliable and can be implemented in transmission line structure designs.

  5. Reliability analysis for new technology-based transmitters

    Energy Technology Data Exchange (ETDEWEB)

    Brissaud, Florent, E-mail: florent.brissaud.2007@utt.f [Institut National de l' Environnement Industriel et des Risques (INERIS), Parc Technologique Alata, BP 2, 60550 Verneuil-en-Halatte (France); Universite de Technologie de Troyes (UTT), Institut Charles Delaunay (ICD) and STMR UMR CNRS 6279, 12 rue Marie Curie, BP 2060, 10010 Troyes cedex (France); Barros, Anne; Berenguer, Christophe [Universite de Technologie de Troyes (UTT), Institut Charles Delaunay (ICD) and STMR UMR CNRS 6279, 12 rue Marie Curie, BP 2060, 10010 Troyes cedex (France); Charpentier, Dominique [Institut National de l' Environnement Industriel et des Risques (INERIS), Parc Technologique Alata, BP 2, 60550 Verneuil-en-Halatte (France)

    2011-02-15

    The reliability analysis of new technology-based transmitters has to deal with specific issues: various interactions between both material elements and functions, undefined behaviours under faulty conditions, several transmitted data, and little reliability feedback. To handle these particularities, a '3-step' model is proposed, based on goal tree-success tree (GTST) approaches to represent both the functional and material aspects, and includes the faults and failures as a third part for supporting reliability analyses. The behavioural aspects are provided by relationship matrices, also denoted master logic diagrams (MLD), with stochastic values which represent direct relationships between system elements. Relationship analyses are then proposed to assess the effect of any fault or failure on any material element or function. Taking these relationships into account, the probabilities of malfunction and failure modes are evaluated according to time. Furthermore, uncertainty analyses tend to show that even if the input data and system behaviour are not well known, these previous results can be obtained in a relatively precise way. An illustration is provided by a case study on an infrared gas transmitter. These properties make the proposed model and corresponding reliability analyses especially suitable for intelligent transmitters (or 'smart sensors').

  6. Analysis and Reliability Performance Comparison of Different Facial Image Features

    Directory of Open Access Journals (Sweden)

    J. Madhavan

    2014-11-01

    Full Text Available This study performs a reliability analysis of different facial features, using weighted retrieval accuracy on facial databases of increasing size. Many methods analyzed in existing papers use facial databases of constant size, and little work has been carried out to study performance in terms of reliability, or how a method performs as the size of the database increases. In this study, certain feature extraction methods were analyzed on the regular performance measure, and the performance measures were also modified to fit real-time requirements by giving weightages to closer matches. Four facial feature extraction methods are evaluated: DWT with PCA, LWT with PCA, HMM with SVD, and Gabor wavelet with HMM. The reliability of these methods is analyzed and reported. Among these methods, Gabor wavelet with HMM gives higher reliability than the other three. Experiments are carried out to evaluate the proposed approach on the Olivetti Research Laboratory (ORL) face database.

  7. New Empirical Evidence on the Validity and the Reliability of the Early Life Stress Questionnaire in a Polish Sample

    Science.gov (United States)

    Sokołowski, Andrzej; Dragan, Wojciech Ł.

    2017-01-01

    Background: The Early Life Stress Questionnaire (ELSQ) is widely used to estimate the prevalence of negative events during childhood, including emotional, physical, verbal, and sexual abuse, negligence, severe conflicts, separation, parental divorce, substance abuse, poverty, and so forth. Objective: This study presents the psychometric properties of the Polish adaptation of the ELSQ. It also verifies whether early life stress (ELS) is a good predictor of psychopathology symptoms during adulthood. Materials and Methods: We analyzed data from two samples. Sample 1 was selected by a random quota method from across the country and included 609 participants aged 18–50 years, 306 women (50.2%) and 303 men (49.8%). Sample 2 contained 503 young adults (253 women and 250 men) aged 18–25. Confirmatory and exploratory factor analyses were used to measure ELSQ internal consistency. The validity was based on the relation to psychopathological symptoms and substance misuse. Results: Results showed good internal consistency and validity. Exploratory factor analysis indicates a six-factor structure of the ELSQ. ELS was related to psychopathology in adulthood, including depressive, sociophobic, vegetative as well as pain symptoms. The ELSQ score also correlated with alcohol use, but not nicotine dependence. Moreover, ELS was correlated with stress in adulthood. Conclusion: The findings indicate that the Polish version of the ELSQ is a valid and reliable instrument for assessing ELS in the Polish population and may be applied in both clinical and community samples.
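Internal consistency of a scale such as the ELSQ is conventionally summarized by Cronbach's alpha; the sketch below is generic (the study itself reports factor-analytic results, not this statistic):

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.
    `items` is a list of columns, one list of scores per item,
    all over the same respondents."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Each respondent's total scale score
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))
```

Values above roughly 0.7 are usually read as acceptable internal consistency; perfectly correlated items yield alpha = 1.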

  8. Using Empirical Article Analysis to Assess Research Methods Courses

    Science.gov (United States)

    Bachiochi, Peter; Everton, Wendi; Evans, Melanie; Fugere, Madeleine; Escoto, Carlos; Letterman, Margaret; Leszczynski, Jennifer

    2011-01-01

    Developing students who can apply their knowledge of empirical research is a key outcome of the undergraduate psychology major. This learning outcome was assessed in two research methods courses by having students read and analyze a condensed empirical journal article. At the start and end of the semester, students in multiple sections of an…

  10. ANALYSIS OF AVAILABILITY AND RELIABILITY IN RHIC OPERATIONS.

    Energy Technology Data Exchange (ETDEWEB)

    PILAT, F.; INGRASSIA, P.; MICHNOFF, R.

    2006-06-26

    RHIC has been successfully operated for 5 years as a collider for different species, ranging from heavy ions including gold and copper, to polarized protons. We present a critical analysis of reliability data for RHIC that not only identifies the principal factors limiting availability but also evaluates critical choices made at design time and assesses their impact on present machine performance. RHIC availability data are typical when compared to similar high-energy colliders. The critical analysis of operations data is the basis for studies and plans to improve RHIC machine availability beyond the 50-60% typical of high-energy colliders.

  11. Using functional analysis diagrams to improve product reliability and cost

    Directory of Open Access Journals (Sweden)

    Ioannis Michalakoudis

    2016-12-01

    Full Text Available Failure mode and effects analysis and value engineering are well-established methods in the manufacturing industry, commonly applied to optimize product reliability and cost, respectively. Both processes, however, require cross-functional teams to identify and evaluate the product/process functions and are resource-intensive, hence their application is mostly limited to large organizations. In this article, we present a methodology involving the concurrent execution of failure mode and effects analysis and value engineering, assisted by a set of hierarchical functional analysis diagram models, along with the outcomes of a pilot application in a UK-based manufacturing small and medium enterprise. Analysis of the results indicates that this new approach could significantly enhance the resource efficiency and effectiveness of both failure mode and effects analysis and value engineering processes.

  12. Mutation Analysis Approach to Develop Reliable Object-Oriented Software

    Directory of Open Access Journals (Sweden)

    Monalisa Sarma

    2014-01-01

Full Text Available In general, modern programs are large and complex, and it is essential that they be highly reliable in applications. To support the development of highly reliable software, the Java programming language provides a rich set of exceptions and exception handling mechanisms, which are intended to help developers build robust programs. Given a program with exception handling constructs, effective testing must detect whether all possible exceptions are raised and caught. However, complex exception handling constructs make it tedious to trace which exceptions are handled where, and which exceptions are passed on. In this paper, we address this problem and propose a mutation analysis approach to developing reliable object-oriented programs. We apply a number of mutation operators to create a large set of mutant programs with different types of faults, and then generate test cases and test data to uncover exception-related faults. The resulting test suite is applied to the mutant programs to measure the mutation score and hence verify the effectiveness of the test suite. We have tested our approach on a number of case studies to substantiate the efficacy of the proposed mutation analysis technique.
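
The mutation-score idea described in this abstract can be illustrated with a minimal sketch (here in Python, with hand-written mutants standing in for operator-generated ones; the function, mutants, and tests are all hypothetical):

```python
# Minimal mutation-score sketch (hypothetical function, mutants, and tests).
def add_abs(a, b):
    return abs(a) + abs(b)

# Hand-written mutants standing in for operator/construct mutations.
mutants = [
    lambda a, b: abs(a) - abs(b),   # '+' mutated to '-'
    lambda a, b: a + abs(b),        # abs() dropped on the first argument
    lambda a, b: abs(b) + abs(a),   # equivalent mutant: cannot be killed
]

tests = [((3, -4), 7), ((-1, 0), 1)]  # (inputs, expected output)

def killed(mutant):
    """A mutant is killed if any test distinguishes it from the original."""
    return any(mutant(*args) != expected for args, expected in tests)

mutation_score = sum(killed(m) for m in mutants) / len(mutants)
print(f"mutation score: {mutation_score:.2f}")
```

A mutant that survives every test either exposes a gap in the test suite or, as with the third mutant above, is semantically equivalent to the original and can never be killed.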

  13. Strength Reliability Analysis of Stiffened Cylindrical Shells Considering Failure Correlation

    Institute of Scientific and Technical Information of China (English)

    Xu Bai; Liping Sun; Wei Qin; Yongkun Lv

    2014-01-01

The stiffened cylindrical shell is commonly used for the pressure hull of submersibles and the legs of offshore platforms. There are various failure modes because of uncertainty in structural dimensions and material properties, uncertainty in the calculation model, and machining errors. Correlations among failure modes must be considered in the structural reliability analysis of stiffened cylindrical shells, but the traditional method cannot handle these correlations effectively. The aim of this study is to present a reliability analysis method for stiffened cylindrical shells that accounts for the correlations among failure modes. Firstly, the joint failure probability of two correlated failure modes is derived using the 2D joint probability density function. Secondly, the full probability formula of the series structural system is given with consideration of the correlations among failure modes. Finally, the accuracy of the system reliability calculation is verified through Monte Carlo simulation. The analysis shows that the failure probability of stiffened cylindrical shells can be obtained by combining the failure probabilities of the individual modes.
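
The effect of correlation between failure modes on a series-system failure probability can be sketched with a small Monte Carlo experiment (the reliability indices and correlation below are hypothetical; this illustrates the general idea, not the paper's derivation):

```python
import math
import random

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

beta1, beta2, rho = 2.0, 2.5, 0.7   # hypothetical reliability indices and correlation
random.seed(1)
N = 200_000
n1 = n2 = n_sys = 0
for _ in range(N):
    z1 = random.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * random.gauss(0.0, 1.0)
    f1 = z1 > beta1            # failure in mode 1
    f2 = z2 > beta2            # failure in mode 2
    n1 += f1
    n2 += f2
    n_sys += (f1 or f2)        # a series system fails if any mode fails
p1, p2, p_sys = n1 / N, n2 / N, n_sys / N
# First-order bounds: max(p1, p2) <= p_sys <= p1 + p2
print(p1, p2, p_sys, p1 + p2)
```

With positively correlated modes, the simple sum p1 + p2 overestimates the system failure probability, which is why the joint failure probability term matters.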

  14. Reliability Analysis of Penetration Systems Using Nondeterministic Methods

    Energy Technology Data Exchange (ETDEWEB)

    FIELD JR.,RICHARD V.; PAEZ,THOMAS L.; RED-HORSE,JOHN R.

    1999-10-27

    Device penetration into media such as metal and soil is an application of some engineering interest. Often, these devices contain internal components and it is of paramount importance that all significant components survive the severe environment that accompanies the penetration event. In addition, the system must be robust to perturbations in its operating environment, some of which exhibit behavior which can only be quantified to within some level of uncertainty. In the analysis discussed herein, methods to address the reliability of internal components for a specific application system are discussed. The shock response spectrum (SRS) is utilized in conjunction with the Advanced Mean Value (AMV) and Response Surface methods to make probabilistic statements regarding the predicted reliability of internal components. Monte Carlo simulation methods are also explored.

  15. Analytical reliability analysis of soil-water characteristic curve

    Directory of Open Access Journals (Sweden)

    Johari A.

    2016-01-01

Full Text Available The Soil Water Characteristic Curve (SWCC), also known as the soil water-retention curve, is an important part of any constitutive relationship for unsaturated soils. Deterministic assessment of the SWCC has received considerable attention in the past few years. However, the uncertainties of the parameters which affect the SWCC dictate that the problem is probabilistic rather than deterministic in nature. In this research, a Gene Expression Programming (GEP)-based SWCC model is employed to assess the reliability of the SWCC. For this purpose, the Jointly Distributed Random Variables (JDRV) method is used as an analytical method for reliability analysis. All input parameters of the model, namely initial void ratio, initial water content, and silt and clay contents, are treated as stochastic and modelled using truncated normal probability density functions. The results are compared with those of Monte Carlo (MC) simulation. It is shown that the initial water content is the most influential parameter in the SWCC.
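
A Monte Carlo counterpart to the JDRV analysis can be sketched as follows; the truncated-normal sampler is generic, but the response surface standing in for the GEP model and all parameter values are hypothetical:

```python
import random
import statistics

def trunc_normal(mu, sigma, lo, hi):
    """Rejection-sample a normal variate truncated to [lo, hi]."""
    while True:
        x = random.gauss(mu, sigma)
        if lo <= x <= hi:
            return x

def swcc_response(e0, w0, silt, clay):
    """Hypothetical linear stand-in for the paper's GEP-based SWCC model."""
    return 100.0 * w0 + 20.0 * clay + 5.0 * silt - 30.0 * e0

random.seed(2)
samples = [
    swcc_response(
        trunc_normal(0.8, 0.05, 0.6, 1.0),     # initial void ratio
        trunc_normal(0.25, 0.03, 0.15, 0.35),  # initial water content
        trunc_normal(30.0, 5.0, 10.0, 50.0),   # silt content (%)
        trunc_normal(25.0, 5.0, 10.0, 40.0),   # clay content (%)
    )
    for _ in range(10_000)
]
mean_resp = statistics.mean(samples)
limit = 700.0                                   # hypothetical threshold
p_exceed = sum(s > limit for s in samples) / len(samples)
print(mean_resp, p_exceed)
```

The same loop, run with each input fixed at its mean in turn, gives a crude sensitivity ranking of the four parameters.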

  16. Optimization Based Efficiencies in First Order Reliability Analysis

    Science.gov (United States)

    Peck, Jeffrey A.; Mahadevan, Sankaran

    2003-01-01

This paper develops a method for updating the gradient vector of the limit state function in reliability analysis using Broyden's rank one updating technique. In problems that use a commercial code as a black box, the gradient calculations are usually done using a finite difference approach, which becomes very expensive for large system models. The proposed method replaces the finite difference gradient calculations in a standard first order reliability method (FORM) with Broyden's quasi-Newton technique. The resulting algorithm of Broyden updates within a FORM framework (BFORM) is used to run several example problems, and the results are compared to standard FORM results. It is found that BFORM typically requires fewer function evaluations than FORM to converge to the same answer.
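
The core of the BFORM idea, replacing repeated finite-difference gradients with Broyden's rank-one secant update, can be sketched in a few lines (the limit state function and evaluation points below are hypothetical):

```python
def g(x):
    """Hypothetical limit state function (failure when g < 0)."""
    return 20.0 - x[0] ** 2 - 2.0 * x[1]

def fd_gradient(f, x, h=1e-6):
    """Finite-difference gradient: n extra function calls, done once."""
    f0 = f(x)
    return [(f(x[:i] + [x[i] + h] + x[i + 1:]) - f0) / h for i in range(len(x))]

def broyden_update(grad, x_old, x_new, g_old, g_new):
    """Rank-one secant update of the gradient: costs only the one new g value."""
    dx = [a - b for a, b in zip(x_new, x_old)]
    denom = sum(d * d for d in dx)
    predicted = sum(gr * d for gr, d in zip(grad, dx))   # grad . dx
    scale = (g_new - g_old - predicted) / denom
    return [gr + scale * d for gr, d in zip(grad, dx)]

x0 = [2.0, 3.0]
grad0 = fd_gradient(g, x0)                        # close to [-4, -2]
x1 = [2.5, 2.5]                                   # next iterate (hypothetical)
grad1 = broyden_update(grad0, x0, x1, g(x0), g(x1))
print(grad0, grad1)
```

Each subsequent iteration then needs a single limit-state evaluation instead of n + 1, which is where the savings over repeated finite differencing come from.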

  17. A Reliable and Valid Weighted Scoring Instrument for Use in Grading APA-Style Empirical Research Report

    Science.gov (United States)

    Greenberg, Kathleen Puglisi

    2012-01-01

    The scoring instrument described in this article is based on a deconstruction of the seven sections of an American Psychological Association (APA)-style empirical research report into a set of learning outcomes divided into content-, expression-, and format-related categories. A double-weighting scheme used to score the report yields a final grade…

  19. POSSIBILITY AND EVIDENCE-BASED RELIABILITY ANALYSIS AND DESIGN OPTIMIZATION

    Directory of Open Access Journals (Sweden)

    Hong-Zhong Huang

    2013-01-01

Full Text Available Engineering design under uncertainty has gained considerable attention in recent years. A great multitude of new design optimization methodologies and reliability analysis approaches have been put forth to accommodate various uncertainties. Uncertainties in practical engineering applications are commonly classified into two categories, i.e., aleatory uncertainty and epistemic uncertainty. Aleatory uncertainty arises from unpredictable variation in the performance and processes of systems; it is irreducible even when more data or knowledge is added. Epistemic uncertainty, on the other hand, stems from lack of knowledge of the system due to limited data, measurement limitations, or simplified approximations in modeling system behavior, and it can be reduced by obtaining more data or knowledge. More specifically, aleatory uncertainty is naturally represented by a statistical distribution whose parameters can be characterized from sufficient data. If, however, the data are too limited to support a statistical characterization, treating the uncertainty as epistemic is an alternative in such a situation. Of the several optional treatments for epistemic uncertainty, possibility theory and evidence theory have proved to be the most computationally efficient and stable for reliability analysis and engineering design optimization. This study first attempts to provide a better understanding of uncertainty in engineering design by giving a comprehensive overview of its classifications, theories and design considerations. A review is then conducted of general topics such as the foundations and applications of possibility theory and evidence theory. This overview includes the most recent results from theoretical research, computational developments and performance improvement of possibility theory and evidence theory, with an emphasis on revealing the capability and characteristics of quantifying uncertainty from different perspectives.

  20. An Empirical Analysis of Foreign Direct Investment in Pakistan

    Directory of Open Access Journals (Sweden)

    Akbar Minhas

    2015-04-01

Full Text Available The aim of this paper is to explore the trends in Foreign Direct Investment (FDI) inflows in Pakistan and to identify the key determinants of FDI for the period 2000-2013. The country experienced a continuous surge in FDI inflows from 2000-2008. On the contrary, the phase of 2009-2013 was characterized by a persistent decline in FDI in Pakistan. This slump is mainly attributed to political and economic instability as well as the poor law and order situation in the country. Keeping these periods with differing results in perspective, multiple regression analysis is employed to empirically analyze the key determinants that are expected to explain variation in FDI in Pakistan. The selected variables were found to be significant determinants of FDI in Pakistan. Gross Domestic Product (GDP), degree of trade openness and regime of dictatorship have a significant positive effect on FDI, while terrorist attacks, foreign debt, exchange rate, political instability, and domestic capital formation are significantly negative determinants of FDI inflows in Pakistan. Considering the dynamic changes in the broad macro factors in the economy, this study provides a fresh perspective on the factors that determine FDI in Pakistan. Moreover, the study findings provide important insights to policy makers to design policy measures that enhance FDI inflows in Pakistan.

  1. RELIABILITY ANALYSIS OF URBAN RAINWATER HARVESTING FOR THREE TEXAS CITIES

    Directory of Open Access Journals (Sweden)

    Dustin Lawrence

    2016-01-01

    Full Text Available The purpose of this study was to inform decision makers at state and local levels, as well as property owners about the amount of water that can be supplied by rainwater harvesting systems in Texas so that it may be included in any future planning. Reliability of a rainwater tank is important because people want to know that a source of water can be depended on. Performance analyses were conducted on rainwater harvesting tanks for three Texas cities under different rainfall conditions and multiple scenarios to demonstrate the importance of optimizing rainwater tank design. Reliability curves were produced and reflect the percentage of days in a year that water can be supplied by a tank. Operational thresholds were reached in all scenarios and mark the point at which reliability increases by only 2% or less with an increase in tank size. A payback period analysis was conducted on tank sizes to estimate the amount of time it would take to recoup the cost of installing a rainwater harvesting system.
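
The reliability measure used here (the percentage of days on which the tank can meet demand) can be reproduced with a simple daily mass-balance simulation; the roof area, tank size, demand, and runoff coefficient below are hypothetical:

```python
def tank_reliability(daily_rain_mm, roof_m2, tank_l, demand_l, runoff_coeff=0.85):
    """Fraction of days on which a rainwater tank can supply the full demand.

    1 mm of rain on 1 m2 of roof yields 1 litre before runoff losses.
    """
    storage = 0.0
    days_met = 0
    for rain in daily_rain_mm:
        storage = min(tank_l, storage + rain * roof_m2 * runoff_coeff)
        if storage >= demand_l:
            storage -= demand_l
            days_met += 1
    return days_met / len(daily_rain_mm)

# Hypothetical scenarios: a uniformly wet year and a fully dry year.
r_wet = tank_reliability([10.0] * 365, roof_m2=100.0, tank_l=5000.0, demand_l=200.0)
r_dry = tank_reliability([0.0] * 365, roof_m2=100.0, tank_l=5000.0, demand_l=200.0)
print(r_wet, r_dry)
```

Sweeping tank_l over a range of sizes and plotting the returned reliability reproduces the reliability curves described above; the size at which the curve gains 2% or less per increment is the operational threshold.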

  2. A Bayesian Framework for Reliability Analysis of Spacecraft Deployments

    Science.gov (United States)

    Evans, John W.; Gallo, Luis; Kaminsky, Mark

    2012-01-01

Deployable subsystems are essential to mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a two stage sequential Bayesian framework for reliability estimation of spacecraft deployment was developed for this purpose. This process was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the Optical Telescope Element. Initially, detailed studies of NASA deployment history, "heritage information", were conducted, extending over 45 years of spacecraft launches. This information was then coupled to a non-informative prior and a binomial likelihood function to create a posterior distribution for deployments of various subsystems using Markov Chain Monte Carlo sampling. Select distributions were then coupled to a subsequent analysis, using test data and anomaly occurrences on successive ground test deployments of scale model test articles of JWST hardware, to update the NASA heritage data. This allowed for a realistic prediction for the reliability of the complex Sunshield deployment, with credibility limits, within this two stage Bayesian framework.
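
The two-stage Beta-Binomial update at the heart of such a framework can be sketched with conjugate arithmetic (all counts below are hypothetical, not the JWST data):

```python
import random

# Stage 0: non-informative Beta(1, 1) prior on deployment success probability.
a, b = 1.0, 1.0

# Stage 1: update with heritage deployment history (hypothetical counts).
n_heritage, s_heritage = 45, 44
a, b = a + s_heritage, b + (n_heritage - s_heritage)

# Stage 2: update with ground-test deployments of scale-model articles.
n_test, s_test = 10, 10
a, b = a + s_test, b + (n_test - s_test)

post_mean = a / (a + b)

# 5th-percentile credibility bound via Monte Carlo sampling of the posterior.
random.seed(0)
draws = sorted(random.betavariate(a, b) for _ in range(20_000))
lower90 = draws[int(0.05 * len(draws))]
print(post_mean, lower90)
```

A conjugate prior keeps both stages closed-form; the MCMC sampling mentioned in the abstract becomes necessary when the prior or likelihood is no longer conjugate.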

  3. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  4. Reliability and risk analysis using artificial neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, D.G. [Sandia National Labs., Albuquerque, NM (United States)

    1995-12-31

    This paper discusses preliminary research at Sandia National Laboratories into the application of artificial neural networks for reliability and risk analysis. The goal of this effort is to develop a reliability based methodology that captures the complex relationship between uncertainty in material properties and manufacturing processes and the resulting uncertainty in life prediction estimates. The inputs to the neural network model are probability density functions describing system characteristics and the output is a statistical description of system performance. The most recent application of this methodology involves the comparison of various low-residue, lead-free soldering processes with the desire to minimize the associated waste streams with no reduction in product reliability. Model inputs include statistical descriptions of various material properties such as the coefficients of thermal expansion of solder and substrate. Consideration is also given to stochastic variation in the operational environment to which the electronic components might be exposed. Model output includes a probabilistic characterization of the fatigue life of the surface mounted component.

  5. Towards MOOC for Technical Courses: A Blended Learning Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Siti Feirusz Ahmad Fesol

    2016-12-01

Full Text Available Massive Open Online Learning (MOOC) is one of the most rapidly growing and trending online learning platforms throughout the world. As reported by Class Central, up until December 2015 there were more than 4200 courses, which enrolled more than 35 million students and were adopted by more than 500 universities all over the world. The objective of this study is to identify students' readiness towards MOOC technical courses based on a blended learning approach. This study adopted a quantitative approach to analyse the data gathered. Descriptive analysis and factor analysis are used to empirically analyse a total of 39 items on student attitude towards blended learning. The study succeeded in developing six dimensions of student attitude towards the implementation of MOOC learning, namely attitude towards learning flexibility, online learning, study management, technology, online interaction, and classroom learning. The findings can be summarized as follows: when students had a positive attitude towards learning flexibility, online learning, study management, technology, and online interaction, they were more likely to adapt to blended learning and were highly ready for MOOC learning. On the other hand, when students had a positive attitude towards classroom learning, they were less ready for MOOC learning, as they would prefer to meet their lecturers and friends in a physical lecture class rather than on the web. Understanding students' readiness for MOOC learning based on a blended learning approach is one of the critical success factors for implementing successful MOOCs in higher learning institutions.

  6. Fifty Years of THERP and Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

In 1962 at a Human Factors Society symposium, Alan Swain presented a paper introducing a Technique for Human Error Rate Prediction (THERP). This was followed in 1963 by a Sandia Laboratories monograph outlining basic human error quantification using THERP and, in 1964, by a special journal edition of Human Factors on quantification of human performance. Throughout the 1960s, Swain and his colleagues focused on collecting human performance data for the Sandia Human Error Rate Bank (SHERB), primarily in connection with supporting the reliability of nuclear weapons assembly in the US. In 1969, Swain met with Jens Rasmussen of Risø National Laboratory and discussed the applicability of THERP to nuclear power applications. By 1975, in WASH-1400, Swain had articulated the use of THERP for nuclear power applications, and the approach was finalized in the watershed publication of NUREG/CR-1278 in 1983. THERP is now 50 years old, and remains the most well-known and most widely used HRA method. In this paper, the author discusses the history of THERP, based on published reports and personal communication and interviews with Swain. The author also outlines the significance of THERP. The foundations of human reliability analysis are found in THERP: human failure events, task analysis, performance shaping factors, human error probabilities, dependence, event trees, recovery, and pre- and post-initiating events were all introduced in THERP. While THERP is not without its detractors, and it is showing signs of its age in the face of newer technological applications, the longevity of THERP is a testament to its tremendous significance. THERP started the field of human reliability analysis. This paper concludes with a discussion of THERP in the context of newer methods, which can be seen as extensions of or departures from Swain's pioneering work.
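
One of the THERP elements listed above, dependence between successive tasks, is handled in NUREG/CR-1278 with a small set of closed-form equations; a sketch of those equations follows (the nominal HEP value is hypothetical):

```python
def conditional_hep(p, level):
    """THERP conditional human error probability given failure on the prior task.

    Levels: ZD (zero), LD (low), MD (moderate), HD (high), CD (complete dependence).
    """
    equations = {
        "ZD": p,
        "LD": (1.0 + 19.0 * p) / 20.0,
        "MD": (1.0 + 6.0 * p) / 7.0,
        "HD": (1.0 + p) / 2.0,
        "CD": 1.0,
    }
    return equations[level]

nominal_hep = 0.01   # hypothetical nominal HEP for the second task
for level in ("ZD", "LD", "MD", "HD", "CD"):
    print(level, conditional_hep(nominal_hep, level))
```

The conditional probability climbs from the nominal value at zero dependence to 1.0 at complete dependence, which is why a mis-assessed dependence level can dominate an HRA result.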

  7. Reliability and Robustness Analysis of the Masinga Dam under Uncertainty

    Directory of Open Access Journals (Sweden)

    Hayden Postle-Floyd

    2017-02-01

Full Text Available Kenya’s water abstraction must meet the projected growth in municipal and irrigation demand by the end of 2030 in order to achieve the country’s industrial and economic development plan. The Masinga dam, on the Tana River, is the key to meeting this goal to satisfy the growing demands whilst also continuing to provide hydroelectric power generation. This study quantitatively assesses the reliability and robustness of the Masinga dam system under uncertain future supply and demand using probabilistic climate and population projections, and examines how long-term planning may improve the longevity of the dam. River flow and demand projections are used alongside each other as inputs to the dam system simulation model linked to an optimisation engine to maximise water availability. Water availability after demand satisfaction is assessed for future years, and the projected reliability of the system is calculated for selected years. The analysis shows that maximising power generation on a short-term year-by-year basis achieves 80%, 50% and 1% reliability by 2020, 2025 and 2030 onwards, respectively. Longer term optimal planning, however, has increased system reliability to up to 95% in 2020, 80% in 2025, and more than 40% in 2030 onwards. In addition, increasing the capacity of the reservoir by around 25% can significantly improve the robustness of the system for all future time periods. This study provides a platform for analysing the implications of different planning and management strategies for the Masinga dam and suggests that careful consideration should be given to account for growing municipal needs and irrigation schemes in both the immediate and the associated Tana River basin.

  8. An Empirical Analysis of Money Supply Process in Nepal

    OpenAIRE

    Prakash Kumar Shrestha Ph.D.

    2013-01-01

This paper examines the money supply process in Nepal empirically on the basis of mainstream and Post-Keynesian theoretical perspectives for both the pre- and post-liberalization periods, covering the sample period 1965/66-2009/10. The relative contribution of different components of money supply has been computed, and the money supply and money multiplier functions have been estimated. Empirical results show that disposable high powered money is found to be a major contributor to the change ...

  9. Human Performance Modeling for Dynamic Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory; Mandelli, Diego [Idaho National Laboratory

    2015-08-01

Part of the U.S. Department of Energy’s (DOE’s) Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk framework. In this paper, we review simulation-based and non-simulation-based human reliability analysis (HRA) methods. This paper summarizes the foundational information needed to develop a feasible approach to modeling human interactions in RISMC simulations.

  10. Reliability Analysis of a Mono-Tower Platform

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Enevoldsen, I.; Sørensen, John Dalsgaard;

In this paper a reliability analysis of a Mono-tower platform is presented. The failure modes considered are yielding in the tube cross-sections, and fatigue failure in the butt welds. The fatigue failure mode is investigated with a fatigue model, where the fatigue strength is expressed through SN...... that the fatigue limit state is a significant failure mode for the Mono-tower platform. Further, it is shown for the fatigue failure mode that the largest contributions to the overall uncertainty are due to the damping ratio, the inertia coefficient, the stress concentration factor, the model uncertainties...

  11. Reliability Analysis of a Mono-Tower Platform

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Enevoldsen, I.; Sørensen, John Dalsgaard;

    1990-01-01

In this paper, a reliability analysis of a Mono-tower platform is presented. The failure modes considered are yielding in the tube cross sections and fatigue failure in the butt welds. The fatigue failure mode is investigated with a fatigue model, where the fatigue strength is expressed through SN...... that the fatigue limit state is a significant failure mode for the Mono-tower platform. Further, it is shown for the fatigue failure mode that the largest contributions to the overall uncertainty are due to the damping ratio, the inertia coefficient, the stress concentration factor, the model uncertainties...
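
The SN-based fatigue assessment referred to in both abstracts is typically combined with a Palmgren-Miner linear damage sum; a generic sketch follows, with SN parameters and a stress-range spectrum chosen purely for illustration:

```python
def cycles_to_failure(stress_range_mpa, log_a=12.164, m=3.0):
    """SN curve N = a * S^(-m); the parameter values here are illustrative only."""
    return 10.0 ** log_a * stress_range_mpa ** (-m)

def miner_damage(spectrum):
    """Palmgren-Miner damage sum over (stress range in MPa, cycle count) bins."""
    return sum(n / cycles_to_failure(s) for s, n in spectrum)

# Hypothetical one-year long-term stress-range spectrum.
spectrum = [(100.0, 1.0e4), (60.0, 1.0e5), (30.0, 1.0e6)]
annual_damage = miner_damage(spectrum)
fatigue_life_years = 1.0 / annual_damage
print(annual_damage, fatigue_life_years)
```

In a reliability setting, the inputs feeding the stress ranges (damping ratio, inertia coefficient, stress concentration factor) become random variables, and the probability that the damage sum exceeds a critical value is the fatigue failure probability.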

  12. Fault Diagnosis and Reliability Analysis Using Fuzzy Logic Method

    Institute of Scientific and Technical Information of China (English)

    Miao Zhinong; Xu Yang; Zhao Xiangyu

    2006-01-01

A new fuzzy logic fault diagnosis method is proposed. In this method, fuzzy equations are employed to estimate the component state of a system based on the measured system performance and the relationship between component state and system performance, which is called the "performance-parameter" knowledge base and is constructed by experts. Compared with the traditional fault diagnosis method, this fuzzy logic method can use human intuitive knowledge and does not need a precise mapping between system performance and component state. Simulation proves its effectiveness in fault diagnosis. The reliability analysis is then performed based on the fuzzy logic method.

  13. IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    G. W. Parry; J.A Forester; V.N. Dang; S. M. L. Hendrickson; M. Presley; E. Lois; J. Xing

    2013-09-01

    This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System) that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA) that is based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), and the development of the associated time-line to identify the critical tasks, i.e. those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.

  14. Genetic diversity analysis of highly incomplete SNP genotype data with imputations: an empirical assessment.

    Science.gov (United States)

    Fu, Yong-Bi

    2014-03-13

Genotyping by sequencing (GBS) has recently emerged as a promising genomic approach for assessing genetic diversity on a genome-wide scale. However, concerns remain about the uniquely large imbalance in GBS genotype data. Although genotype imputation has been proposed to infer missing observations, little is known about the reliability of a genetic diversity analysis of GBS data with up to 90% of observations missing. Here we performed an empirical assessment of accuracy in genetic diversity analysis of highly incomplete single nucleotide polymorphism (SNP) genotypes with imputations. Three large SNP genotype data sets for corn, wheat, and rice were acquired; missing data with up to 90% of observations absent were randomly generated and then imputed with three map-independent imputation methods. Estimating heterozygosity and the inbreeding coefficient from original, missing, and imputed data revealed variable patterns of bias across the assessed levels of missingness and genotype imputation, but the estimation biases were smaller for missing data without genotype imputation. The estimates of genetic differentiation were rather robust up to 90% missing observations but became substantially biased when missing genotypes were imputed. The estimates of topology accuracy for four representative samples of the groups of interest generally decreased with increased levels of missing genotypes. Probabilistic principal component analysis based imputation performed better in terms of topology accuracy than analyses of missing data without genotype imputation. These findings are significant not only for understanding the reliability of genetic diversity analysis in the presence of large amounts of missing data and genotype imputation but also for performing a proper genetic diversity analysis of highly incomplete GBS or other genotype data.

  15. Integration of human reliability analysis into the high consequence process

    Energy Technology Data Exchange (ETDEWEB)

    Houghton, F.K.; Morzinski, J.

    1998-12-01

When performing a hazards analysis (HA) for a high consequence process, human error often plays a significant role. In order to integrate human error into the hazards analysis, a human reliability analysis (HRA) is performed. Human reliability is the probability that a person will correctly perform a system-required activity in a required time period and will perform no extraneous activity that will affect the correct performance. Even though human error is a very complex subject that can only approximately be addressed in risk assessment, an attempt must be made to estimate the effect of human errors. The HRA provides data that can be incorporated in the hazard analysis event. This paper will discuss the integration of HRA into a HA for the disassembly of a high explosive component. The process was designed to use a retaining fixture to hold the high explosive in place during a rotation of the component. This tool was designed as a redundant safety feature to help prevent a drop of the explosive. This paper will use the retaining fixture to demonstrate the phases of the HRA methodology. The first phase is to perform a task analysis. The second phase is the identification of the potential human functions, both cognitive and psychomotor, performed by the worker. During the last phase the human errors are quantified. In reality, the HRA process is an iterative process in which the stages overlap and information gathered in one stage may be used to refine a previous stage. The rationale for the decision to use or not use the retaining fixture and the role the HRA played in the decision will be discussed.

  16. Development of Items for a Pedagogical Content Knowledge Test Based on Empirical Analysis of Pupils' Errors

    Science.gov (United States)

    Jüttner, Melanie; Neuhaus, Birgit J.

    2012-05-01

    In view of the lack of instruments for measuring biology teachers' pedagogical content knowledge (PCK), this article reports on a study about the development of PCK items for measuring teachers' knowledge of pupils' errors and of ways for dealing with them. The study investigated 9th and 10th grade German pupils' (n = 461) drawings in an achievement test about the knee-jerk reflex in biology, which were analysed using inductive qualitative content analysis. The empirical data were used for the development of the items in the PCK test. The validity of the items was examined with think-aloud interviews of German secondary school teachers (n = 5). Once the items were finalised, reliability was estimated from the results of German secondary school biology teachers (n = 65) who took the PCK test. The results indicated that the items are satisfactorily reliable (Cronbach's alpha values ranged from 0.60 to 0.65). We suggest that further studies use a larger sample and include American biology teachers. The findings from the PCK test could provide new information about the influence of teachers' professional knowledge on their pupils' understanding of biology and on pupils' possible errors in learning biology.
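
    Cronbach's alpha, the reliability statistic reported above, is computed from the ratio of summed item variances to total-score variance. A minimal sketch on hypothetical scores (not the study's data):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of total scores
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Hypothetical scores: 6 respondents x 4 items (illustrative only).
scores = np.array([
    [3, 4, 3, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [1, 2, 1, 2],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
])
alpha = cronbach_alpha(scores)
print(round(alpha, 3))
```

    With strongly correlated items like these, alpha is high; the 0.60 to 0.65 range reported in the study reflects weaker inter-item correlations.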

  17. ON LINE COMPUTATION OF EMPIRICAL FORMULA BY MEANS OF ELEMENTAL ANALYSIS AND ITS PROGRAM DESIGN

    Institute of Scientific and Technical Information of China (English)

    钱朴; 胡培荣; 金正成

    1995-01-01

    On-line determination of the empirical formula of organic compounds with an automatic elemental analyzer, and the design of the corresponding program, were investigated. The results show that the reliability of the computed formula depends on the accuracy of the analytical data and on the organic compound itself.
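
    The underlying computation is the classic mole-ratio method: convert mass percentages to moles, divide by the smallest, and search for a small integer multiplier. A hedged sketch (the element set, tolerance, and oxygen-by-difference convention are assumptions; a real analyzer program must also propagate the measurement error that, as the abstract notes, limits reliability):

```python
# Atomic masses used in the illustration.
MASS = {"C": 12.011, "H": 1.008, "N": 14.007, "O": 15.999}

def empirical_formula(percent, tol=0.1):
    """Empirical formula from mass percentages; oxygen taken as remainder."""
    pct = dict(percent)
    pct.setdefault("O", 100.0 - sum(pct.values()))
    moles = {el: p / MASS[el] for el, p in pct.items() if p > 1e-6}
    smallest = min(moles.values())
    ratios = {el: m / smallest for el, m in moles.items()}
    # Try small integer multipliers until every ratio is near-integral.
    for mult in range(1, 9):
        scaled = {el: r * mult for el, r in ratios.items()}
        if all(abs(v - round(v)) < tol for v in scaled.values()):
            return {el: int(round(v)) for el, v in scaled.items()}
    raise ValueError("no small integer ratio found within tolerance")

# Glucose-like composition (40.0% C, 6.7% H, remainder O) -> CH2O.
print(empirical_formula({"C": 40.0, "H": 6.7}))
```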

  18. Tailoring a Human Reliability Analysis to Your Industry Needs

    Science.gov (United States)

    DeMott, D. L.

    2016-01-01

    Companies at risk of accidents caused by human error that result in catastrophic consequences include: airline industry mishaps, medical malpractice, medication mistakes, aerospace failures, major oil spills, transportation mishaps, power production failures and manufacturing facility incidents. Human Reliability Assessment (HRA) is used to analyze the inherent risk of human behavior or actions introducing errors into the operation of a system or process. These assessments can be used to identify where errors are most likely to arise and the potential risks involved if they do occur. Using the basic concepts of HRA, an evolving group of methodologies are used to meet various industry needs. Determining which methodology or combination of techniques will provide a quality human reliability assessment is a key element to developing effective strategies for understanding and dealing with risks caused by human errors. There are a number of concerns and difficulties in "tailoring" a Human Reliability Assessment (HRA) for different industries. Although a variety of HRA methodologies are available to analyze human error events, determining the most appropriate tools to provide the most useful results can depend on industry specific cultures and requirements. Methodology selection may be based on a variety of factors that include: 1) how people act and react in different industries, 2) expectations based on industry standards, 3) factors that influence how the human errors could occur such as tasks, tools, environment, workplace, support, training and procedure, 4) type and availability of data, 5) how the industry views risk & reliability, and 6) types of emergencies, contingencies and routine tasks. Other considerations for methodology selection should be based on what information is needed from the assessment. 
If the principal concern is determination of the primary risk factors contributing to the potential human error, a more detailed analysis method may be employed.
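
    At its simplest, the quantification output of an HRA can be sketched with a series-task model: the task fails if any subtask fails. This assumes independent subtask errors and purely illustrative human error probabilities (HEPs); real HRA methods add dependence models and performance-shaping factors.

```python
# Hypothetical HEPs for three subtasks; values are illustrative,
# not taken from any published assessment.
heps = {
    "install retaining fixture": 1.0e-3,
    "verify fixture engagement": 5.0e-3,
    "rotate component": 2.0e-3,
}

def task_failure_probability(subtask_heps):
    """P(at least one subtask error), assuming independent subtask errors."""
    p_success = 1.0
    for p in subtask_heps.values():
        p_success *= (1.0 - p)
    return 1.0 - p_success

p_fail = task_failure_probability(heps)
print(f"overall HEP: {p_fail:.2e}")
```

    For small HEPs the result is close to, but slightly below, the simple sum of the subtask probabilities.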

  19. Empirical Markov Chain Monte Carlo Bayesian analysis of fMRI data.

    Science.gov (United States)

    de Pasquale, F; Del Gratta, C; Romani, G L

    2008-08-01

    In this work an Empirical Markov Chain Monte Carlo Bayesian approach to analyse fMRI data is proposed. The Bayesian framework is appealing since complex models can be adopted in the analysis both for the image and noise model. Here, the noise autocorrelation is taken into account by adopting an AutoRegressive model of order one and a versatile non-linear model is assumed for the task-related activation. Model parameters include the noise variance and autocorrelation, activation amplitudes and the hemodynamic response function parameters. These are estimated at each voxel from samples of the Posterior Distribution. Prior information is included by means of a 4D spatio-temporal model for the interaction between neighbouring voxels in space and time. The results show that this model can provide smooth estimates from low SNR data while important spatial structures in the data can be preserved. A simulation study is presented in which the accuracy and bias of the estimates are addressed. Furthermore, some results on convergence diagnostic of the adopted algorithm are presented. To validate the proposed approach a comparison of the results with those from a standard GLM analysis, spatial filtering techniques and a Variational Bayes approach is provided. This comparison shows that our approach outperforms the classical analysis and is consistent with other Bayesian techniques. This is investigated further by means of the Bayes Factors and the analysis of the residuals. The proposed approach applied to Blocked Design and Event Related datasets produced reliable maps of activation.

  20. Liquefaction of Tangier soils by using physically based reliability analysis modelling

    Directory of Open Access Journals (Sweden)

    Dubujet P.

    2012-07-01

    Approaches that are widely used to characterize propensity of soils to liquefaction are mainly of empirical type. The potential of liquefaction is assessed by using correlation formulas that are based on field tests such as the standard and the cone penetration tests. These correlations depend however on the site where they were derived. In order to adapt them to other sites where seismic case histories are not available, further investigation is required. In this work, a rigorous one-dimensional modelling of the soil dynamics yielding liquefaction phenomenon is considered. Field tests consisting of core sampling and cone penetration testing were performed. They provided the necessary data for numerical simulations performed by using DeepSoil software package. Using reliability analysis, the probability of liquefaction was estimated and the obtained results were used to adapt Juang method to the particular case of sandy soils located in Tangier.

  1. Competition in the German pharmacy market: an empirical analysis.

    Science.gov (United States)

    Heinsohn, Jörg G; Flessa, Steffen

    2013-10-10

    Pharmaceutical products are an important component of expenditure on public health insurance in the Federal Republic of Germany. For years, German policy makers have regulated public pharmacies in order to limit the increase in costs. One reform has followed another, the main objective being to increase competition in the pharmacy market. It is generally assumed that an increase in competition would reduce healthcare costs. However, there is a lack of empirical proof of a stronger orientation of German public pharmacies towards competition thus far. This paper analyses the self-perceptions of owners of German public pharmacies and their orientation towards competition in the pharmacy markets. It is based on a cross-sectional survey (N = 289) and distinguishes between successful and less successful pharmacies, the location of the pharmacies (e.g. West German States and East German States) and the gender of the pharmacy owner. The data are analysed descriptively by survey items and employing bivariate and structural equation modelling. The analysis reveals that the majority of owners of public pharmacies in Germany do not currently perceive very strong competitive pressure in the market. However, the innovativeness of the pharmacist is confirmed as most relevant for net revenue development and the profit margin. Some differences occur between regions, e.g. public pharmacies in West Germany have a significantly higher profit margin. This study provides evidence that the German healthcare reforms aimed at increasing the competition between public pharmacies in Germany have not been completely successful. Many owners of public pharmacies disregard instruments of active customer-orientated management (such as customer loyalty or an offensive position and economies of scale), which could give them a competitive advantage. However, it is clear that those pharmacists who strive for systematic and innovative management and adopt an offensive and competitive stance are quite successful.

  2. Competition in the German pharmacy market: an empirical analysis

    Science.gov (United States)

    2013-01-01

    Background Pharmaceutical products are an important component of expenditure on public health insurance in the Federal Republic of Germany. For years, German policy makers have regulated public pharmacies in order to limit the increase in costs. One reform has followed another, the main objective being to increase competition in the pharmacy market. It is generally assumed that an increase in competition would reduce healthcare costs. However, there is a lack of empirical proof of a stronger orientation of German public pharmacies towards competition thus far. Methods This paper analyses the self-perceptions of owners of German public pharmacies and their orientation towards competition in the pharmacy markets. It is based on a cross-sectional survey (N = 289) and distinguishes between successful and less successful pharmacies, the location of the pharmacies (e.g. West German States and East German States) and the gender of the pharmacy owner. The data are analysed descriptively by survey items and employing bivariate and structural equation modelling. Results The analysis reveals that the majority of owners of public pharmacies in Germany do not currently perceive very strong competitive pressure in the market. However, the innovativeness of the pharmacist is confirmed as most relevant for net revenue development and the profit margin. Some differences occur between regions, e.g. public pharmacies in West Germany have a significantly higher profit margin. Conclusions This study provides evidence that the German healthcare reforms aimed at increasing the competition between public pharmacies in Germany have not been completely successful. Many owners of public pharmacies disregard instruments of active customer-orientated management (such as customer loyalty or an offensive position and economies of scale), which could give them a competitive advantage. However, it is clear that those pharmacists who strive for systematic and innovative management and adopt an offensive and competitive stance are quite successful.

  3. An empirical analysis of cigarette demand in Argentina

    Science.gov (United States)

    Martinez, Eugenio; Mejia, Raul; Pérez-Stable, Eliseo J

    2014-01-01

    Objective To estimate the long-term and short-term effects on cigarette demand in Argentina based on changes in cigarette price and income per person >14 years old. Method Public data from the Ministry of Economics and Production were analysed based on monthly time series data between 1994 and 2010. The econometric analysis used cigarette consumption per person >14 years of age as the dependent variable and the real income per person >14 years old and the real average price of cigarettes as independent variables. Empirical analyses were done to verify the order of integration of the variables, to test for cointegration to capture the long-term effects and to capture the short-term dynamics of the variables. Results The demand for cigarettes in Argentina was affected by changes in real income and the real average price of cigarettes. The long-term income elasticity was equal to 0.43, while the own-price elasticity was equal to −0.31, indicating a 10% increase in the growth of real income led to an increase in cigarette consumption of 4.3% and a 10% increase in the price produced a fall of 3.1% in cigarette consumption. The vector error correction model estimated that the short-term income elasticity was 0.25 and the short-term own-price elasticity of cigarette demand was −0.15. A simulation exercise showed that increasing the price of cigarettes by 110% would maximise revenues and result in a potentially large decrease in total cigarette consumption. Conclusion Econometric analyses of cigarette consumption and their relationship with cigarette price and income can provide valuable information for developing cigarette price policy. PMID:23760657

  4. An empirical analysis of cigarette demand in Argentina.

    Science.gov (United States)

    Martinez, Eugenio; Mejia, Raul; Pérez-Stable, Eliseo J

    2015-01-01

    To estimate the long-term and short-term effects on cigarette demand in Argentina based on changes in cigarette price and income per person >14 years old. Public data from the Ministry of Economics and Production were analysed based on monthly time series data between 1994 and 2010. The econometric analysis used cigarette consumption per person >14 years of age as the dependent variable and the real income per person >14 years old and the real average price of cigarettes as independent variables. Empirical analyses were done to verify the order of integration of the variables, to test for cointegration to capture the long-term effects and to capture the short-term dynamics of the variables. The demand for cigarettes in Argentina was affected by changes in real income and the real average price of cigarettes. The long-term income elasticity was equal to 0.43, while the own-price elasticity was equal to -0.31, indicating a 10% increase in the growth of real income led to an increase in cigarette consumption of 4.3% and a 10% increase in the price produced a fall of 3.1% in cigarette consumption. The vector error correction model estimated that the short-term income elasticity was 0.25 and the short-term own-price elasticity of cigarette demand was -0.15. A simulation exercise showed that increasing the price of cigarettes by 110% would maximise revenues and result in a potentially large decrease in total cigarette consumption. Econometric analyses of cigarette consumption and their relationship with cigarette price and income can provide valuable information for developing cigarette price policy.
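
    The quoted elasticities translate directly into consumption predictions under the constant-elasticity (log-log) demand model used in such econometric analyses. A sketch: the linear approximation reproduces the 4.3% and 3.1% figures above, and the exact log-log change is shown alongside.

```python
# Long-run elasticities reported in the abstract.
INCOME_ELASTICITY = 0.43
PRICE_ELASTICITY = -0.31

def approx_change(price_pct=0.0, income_pct=0.0):
    """Linear approximation: %dQ = e_p * %dP + e_y * %dY."""
    return PRICE_ELASTICITY * price_pct + INCOME_ELASTICITY * income_pct

def exact_change(price_pct=0.0, income_pct=0.0):
    """Exact change under a constant-elasticity (log-log) demand curve."""
    factor = ((1.0 + price_pct / 100.0) ** PRICE_ELASTICITY
              * (1.0 + income_pct / 100.0) ** INCOME_ELASTICITY)
    return (factor - 1.0) * 100.0

print(approx_change(price_pct=10.0))   # about -3.1, as quoted above
print(approx_change(income_pct=10.0))  # about 4.3
print(exact_change(price_pct=10.0))    # slightly smaller in magnitude
```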

  5. Fatigue Reliability Analysis of Wind Turbine Cast Components

    Directory of Open Access Journals (Sweden)

    Hesam Mirzaei Rafsanjani

    2017-04-01

    The fatigue life of wind turbine cast components, such as the main shaft in a drivetrain, is generally determined by defects from the casting process. These defects may reduce the fatigue life and they are generally distributed randomly in components. The foundries, cutting facilities and test facilities can affect the verification of properties by testing. Hence, it is important to have a tool to identify which foundry, cutting and/or test facility produces components which, based on the relevant uncertainties, have the largest expected fatigue life or, alternatively, have the largest reliability to be used for decision-making if additional cost considerations are added. In this paper, a statistical approach is presented based on statistical hypothesis testing and analysis of covariance (ANCOVA) which can be applied to compare different groups (manufacturers, suppliers, test facilities, etc.) and to quantify the relevant uncertainties using available fatigue tests. Illustrative results are presented as obtained by statistical analysis of a large set of fatigue data for cast test components typically used for wind turbines. Furthermore, the SN curves (fatigue life curves based on applied stress) for fatigue assessment are estimated based on the statistical analyses and by introduction of physical, model and statistical uncertainties used for the illustration of reliability assessment.
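
    The SN-curve estimation step can be sketched as a least-squares fit of the Basquin relation on log-log axes. The data below are synthetic (the paper's fatigue data are not reproduced here); the residual scatter is the kind of statistical uncertainty a reliability assessment then consumes.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic fatigue test data: Basquin relation
# log10(N) = log10(K) - m * log10(S), plus random scatter.
m_true, logK_true = 3.0, 12.0
stress = np.array([80, 100, 120, 150, 180, 220, 260, 300], dtype=float)  # MPa
logN = logK_true - m_true * np.log10(stress) + rng.normal(0, 0.1, stress.size)

# Least-squares fit of the SN curve on log-log axes.
slope, intercept = np.polyfit(np.log10(stress), logN, 1)
m_hat, logK_hat = -slope, intercept

# Standard deviation of residuals: feeds the statistical/model uncertainty
# terms used in the reliability assessment.
resid = logN - (intercept + slope * np.log10(stress))
sigma = resid.std(ddof=2)

print(m_hat, logK_hat, sigma)
```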

  6. Inclusion of fatigue effects in human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Griffith, Candice D. [Vanderbilt University, Nashville, TN (United States); Mahadevan, Sankaran, E-mail: sankaran.mahadevan@vanderbilt.edu [Vanderbilt University, Nashville, TN (United States)

    2011-11-15

    The effect of fatigue on human performance has been observed to be an important factor in many industrial accidents. However, defining and measuring fatigue is not easily accomplished. This creates difficulties in including fatigue effects in probabilistic risk assessments (PRA) of complex engineering systems that seek to include human reliability analysis (HRA). Thus the objectives of this paper are to discuss (1) the importance of the effects of fatigue on performance, (2) the difficulties associated with defining and measuring fatigue, (3) the current status of inclusion of fatigue in HRA methods, and (4) the future directions and challenges for the inclusion of fatigue, specifically sleep deprivation, in HRA. Highlights: We highlight the need for fatigue and sleep deprivation effects on performance to be included in human reliability analysis (HRA) methods; current methods do not explicitly include sleep deprivation effects. We discuss the difficulties in defining and measuring fatigue. We review sleep deprivation research, and discuss the limitations and future needs of the current HRA methods.

  7. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  8. Transient Reliability Analysis Capability Developed for CARES/Life

    Science.gov (United States)

    Nemeth, Noel N.

    2001-01-01

    The CARES/Life software developed at the NASA Glenn Research Center provides a general-purpose design tool that predicts the probability of the failure of a ceramic component as a function of its time in service. This award-winning software has been widely used by U.S. industry to establish the reliability and life of brittle material (e.g., ceramic, intermetallic, and graphite) structures in a wide variety of 21st century applications. Present capabilities of the NASA CARES/Life code include probabilistic life prediction of ceramic components subjected to fast fracture, slow crack growth (stress corrosion), and cyclic fatigue failure modes. Currently, this code can compute the time-dependent reliability of ceramic structures subjected to simple time-dependent loading. For example, in slow crack growth failure conditions CARES/Life can handle sustained and linearly increasing time-dependent loads, whereas in cyclic fatigue applications various types of repetitive constant-amplitude loads can be accounted for. However, in real applications applied loads are rarely that simple but vary with time in more complex ways such as engine startup, shutdown, and dynamic and vibrational loads. In addition, when a given component is subjected to transient environmental and or thermal conditions, the material properties also vary with time. A methodology has now been developed to allow the CARES/Life computer code to perform reliability analysis of ceramic components undergoing transient thermal and mechanical loading. This means that CARES/Life will be able to analyze finite element models of ceramic components that simulate dynamic engine operating conditions. The methodology developed is generalized to account for material property variation (on strength distribution and fatigue) as a function of temperature. This allows CARES/Life to analyze components undergoing rapid temperature change, in other words, components undergoing thermal shock. In addition, the capability has

  9. Univariate and Bivariate Empirical Mode Decomposition for Postural Stability Analysis

    Directory of Open Access Journals (Sweden)

    Jacques Duchêne

    2008-05-01

    The aim of this paper was to compare empirical mode decomposition (EMD) and two new extended methods of EMD, named complex empirical mode decomposition (complex-EMD) and bivariate empirical mode decomposition (bivariate-EMD). All methods were used to analyze stabilogram center of pressure (COP) time series. The two new methods are suitable for application to complex time series to extract complex intrinsic mode functions (IMFs) before the Hilbert transform is subsequently applied to the IMFs. The trace of the analytic IMF in the complex plane has a circular form, with each IMF having its own rotation frequency. The area of the circle and the average rotation frequency of IMFs represent efficient indicators of the postural stability status of subjects. Experimental results show the effectiveness of these indicators in identifying differences in standing posture between groups.
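
    The two indicators named above (circle area and average rotation frequency of an analytic IMF) can be sketched for a single synthetic IMF; a real analysis would first obtain the IMFs by EMD sifting of the COP signal. The analytic signal is built with the standard FFT construction (the same idea as scipy.signal.hilbert), so the example stays self-contained.

```python
import numpy as np

def analytic_signal(x):
    """Discrete analytic signal via the FFT (as in scipy.signal.hilbert)."""
    n = x.size
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

# Illustrative IMF-like oscillation: amplitude 1.5, frequency 0.8 Hz.
fs = 100.0
t = np.arange(0, 20, 1 / fs)
imf = 1.5 * np.cos(2 * np.pi * 0.8 * t)

z = analytic_signal(imf)
phase = np.unwrap(np.angle(z))

# Average rotation frequency of the IMF's trace in the complex plane...
rot_freq = (phase[-1] - phase[0]) / (2 * np.pi * (t[-1] - t[0]))

# ...and the mean area of the circle swept by the analytic trace.
area = np.pi * np.mean(np.abs(z) ** 2)

print(rot_freq, area)
```

    For this pure oscillation the rotation frequency recovers 0.8 Hz and the area is close to pi times the squared amplitude.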

  10. Productivity enhancement and reliability through AutoAnalysis

    Science.gov (United States)

    Garetto, Anthony; Rademacher, Thomas; Schulz, Kristian

    2015-09-01

    The decreasing size and increasing complexity of photomask features, driven by the push to ever smaller technology nodes, places more and more challenges on the mask house, particularly in terms of yield management and cost reduction. Particularly challenging for mask shops is the inspection, repair and review cycle which requires more time and skill from operators due to the higher number of masks required per technology node and larger nuisance defect counts. While the measurement throughput of the AIMS™ platform has been improved in order to keep pace with these trends, the analysis of aerial images has seen little advancement and remains largely a manual process. This manual analysis of aerial images is time consuming, dependent on the skill level of the operator and significantly contributes to the overall mask manufacturing process flow. AutoAnalysis, the first application available for the FAVOR® platform, offers a solution to these problems by providing fully automated analysis of AIMS™ aerial images. Direct communication with the AIMS™ system allows automated data transfer and analysis parallel to the measurements. User defined report templates allow the relevant data to be output in a manner that can be tailored to various internal needs and support the requests of your customers. Productivity is significantly improved due to the fast analysis, operator time is saved and made available for other tasks and reliability is no longer a concern as the most defective region is always and consistently captured. In this paper the concept and approach of AutoAnalysis will be presented as well as an update to the status of the project. The benefits arising from the use of AutoAnalysis will be discussed in more detail and a study will be performed in order to demonstrate these benefits.

  11. What Is a Reference Book? A Theoretical and Empirical Analysis.

    Science.gov (United States)

    Bates, Marcia J.

    1986-01-01

    Provides a definition of reference books based on their organizational structure and describes an empirical study which was conducted in three libraries to identify types of book organization and determine their frequency in reference departments and stack collections. The data are analyzed and shown to support the definition. (EM)

  12. WHAT FACTORS INFLUENCE QUALITY SERVICE IMPROVEMENT IN MONTENEGRO: EMPIRICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Djurdjica Perovic

    2013-03-01

    In this paper, using Ordinary Least Squares (OLS) regression, we investigate whether intangible elements influence tourists' perception of service quality. Our empirical results, based on a tourist survey in Montenegro, indicate that intangible elements of the tourism product have a positive impact on tourists' overall perception of service quality in Montenegro.

  13. An empirical analysis of asset-backed securitization

    NARCIS (Netherlands)

    Vink, D.; Thibeault, A.

    2007-01-01

    In this study we provide empirical evidence demonstrating a relationship between the nature of the assets and the primary market spread. The model also provides predictions on how other pricing characteristics affect spread, since little is known about how and why spreads of asset-backed securities

  15. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    Science.gov (United States)

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue.
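
    The G-theory machinery the toolbox wraps can be sketched for the simplest crossed persons-by-trials design, on simulated scores (illustrative variances, not ERP data): variance components come from the expected mean squares, and the generalizability coefficient shows how score reliability grows with the number of trials retained for averaging.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated scores: 40 persons x 30 trials, additive effects model.
n_p, n_t = 40, 30
person = rng.normal(0, 2.0, (n_p, 1))    # true-score SD 2 (variance 4)
trial = rng.normal(0, 0.5, (1, n_t))     # trial (occasion) effects
noise = rng.normal(0, 3.0, (n_p, n_t))   # residual SD 3 (variance 9)
scores = 5.0 + person + trial + noise

grand = scores.mean()
ms_p = n_t * ((scores.mean(axis=1) - grand) ** 2).sum() / (n_p - 1)
resid = (scores - scores.mean(axis=1, keepdims=True)
         - scores.mean(axis=0, keepdims=True) + grand)
ms_e = (resid ** 2).sum() / ((n_p - 1) * (n_t - 1))

# Expected-mean-square solutions for the variance components.
var_e = ms_e                       # residual variance, ~9 here
var_p = (ms_p - ms_e) / n_t        # person (true score) variance, ~4 here

def g_coefficient(k):
    """Generalizability of a k-trial average (relative decisions)."""
    return var_p / (var_p + var_e / k)

print(g_coefficient(30), g_coefficient(5))
```

    Averaging more trials shrinks the error term, so the coefficient for 30 trials is well above that for 5, which is exactly the trial-count trade-off the ERA Toolbox quantifies.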

  16. Reliability analysis and updating of deteriorating systems with subset simulation

    DEFF Research Database (Denmark)

    Schneider, Ronald; Thöns, Sebastian; Straub, Daniel

    2017-01-01

    Bayesian updating of the system deterioration model. The updated system reliability is then obtained through coupling the updated deterioration model with a probabilistic structural model. The underlying high-dimensional structural reliability problems are solved using subset simulation, which...

  17. Suitability Analysis of Continuous-Use Reliability Growth Projection Models

    Science.gov (United States)

    2015-03-26

    exists for all types, shapes, and sizes. The primary focus of this study is a comparison of reliability growth projection models designed for...requirements to use reliability growth models, recent studies have noted trends in reliability failures throughout the DoD. In [14] Dr. Michael Gilmore...so a strict exponential distribution was used to stay within their assumptions. In reality, however, reliability growth models often must be used

  18. New Mathematical Derivations Applicable to Safety and Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, J.A.; Ferson, S.

    1999-04-19

    Boolean logic expressions are often derived in safety and reliability analysis. Since the values of the operands are rarely exact, accounting for uncertainty with the tightest justifiable bounds is important. Accurate determination of result bounds is difficult when the inputs have constraints. One example of a constraint is that an uncertain variable that appears multiple times in a Boolean expression must always have the same value, although the value cannot be exactly specified. A solution for this repeated variable problem is demonstrated for two Boolean classes. The classes, termed functions with unate variables (including, but not limited to unate functions), and exclusive-or functions, frequently appear in Boolean equations for uncertain outcomes portrayed by logic trees (event trees and fault trees).
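
    The repeated-variable problem can be illustrated with a tiny interval computation, using hypothetical probability bounds. For an expression that is unate (monotone) in each variable, exact bounds are attained at interval endpoints with every occurrence of a repeated variable held equal; naive interval arithmetic that lets the occurrences vary independently over-widens the result.

```python
from itertools import product

def or_prob(a, b):
    """P(A or B) for independent events, with 'a' appearing twice."""
    return a + (1.0 - a) * b

A = (0.2, 0.4)   # uncertain P(A), hypothetical bounds
B = (0.1, 0.3)   # uncertain P(B), hypothetical bounds

# Exact bounds: the expression is unate in each variable, so extremes occur
# at endpoints with both occurrences of 'a' taking the same value.
exact = [or_prob(a, b) for a, b in product(A, B)]
exact_lo, exact_hi = min(exact), max(exact)

# Naive interval arithmetic lets the two occurrences of 'a' vary
# independently, which over-widens the bounds.
naive = [a1 + (1.0 - a2) * b for a1, a2, b in product(A, A, B)]
naive_lo, naive_hi = min(naive), max(naive)

print((exact_lo, exact_hi), (naive_lo, naive_hi))
```

    Here the exact bounds are [0.28, 0.58] while the naive bounds widen to [0.26, 0.64], showing why tight, constraint-aware bounds matter in safety analysis.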

  19. Applicability of simplified human reliability analysis methods for severe accidents

    Energy Technology Data Exchange (ETDEWEB)

    Boring, R.; St Germain, S. [Idaho National Lab., Idaho Falls, Idaho (United States); Banaseanu, G.; Chatri, H.; Akl, Y. [Canadian Nuclear Safety Commission, Ottawa, Ontario (Canada)

    2016-03-15

    Most contemporary human reliability analysis (HRA) methods were created to analyse design-basis accidents at nuclear power plants. As part of a comprehensive expansion of risk assessments at many plants internationally, HRAs will begin considering severe accident scenarios. Severe accidents, while extremely rare, constitute high consequence events that significantly challenge successful operations and recovery. Challenges during severe accidents include degraded and hazardous operating conditions at the plant, the shift in control from the main control room to the technical support center, the unavailability of plant instrumentation, and the need to use different types of operating procedures. Such shifts in operations may also test key assumptions in existing HRA methods. This paper discusses key differences between design basis and severe accidents, reviews efforts to date to create customized HRA methods suitable for severe accidents, and recommends practices for adapting existing HRA methods that are already being used for HRAs at the plants. (author)

  20. Time-dependent reliability analysis and condition assessment of structures

    Energy Technology Data Exchange (ETDEWEB)

    Ellingwood, B.R. [Johns Hopkins Univ., Baltimore, MD (United States)

    1997-01-01

    Structures generally play a passive role in assurance of safety in nuclear plant operation, but are important if the plant is to withstand the effect of extreme environmental or abnormal events. Relative to mechanical and electrical components, structural systems and components would be difficult and costly to replace. While the performance of steel or reinforced concrete structures in service generally has been very good, their strengths may deteriorate during an extended service life as a result of changes brought on by an aggressive environment, excessive loading, or accidental loading. Quantitative tools for condition assessment of aging structures can be developed using time-dependent structural reliability analysis methods. Such methods provide a framework for addressing the uncertainties attendant to aging in the decision process.
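
    A minimal Monte Carlo sketch of time-dependent structural reliability, with purely illustrative numbers (linear resistance degradation, lognormal annual extreme loads): the cumulative failure probability grows with service life, which is the quantity a condition assessment updates.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative model: resistance R(t) = R0 * (1 - k*t) degrades linearly;
# the member sees one extreme load per year. Numbers are not plant data.
n_sim, years = 200_000, 40
r0 = rng.normal(100.0, 8.0, n_sim)                 # initial resistance
k = 0.005                                          # degradation per year
loads = rng.lognormal(mean=np.log(40.0), sigma=0.25, size=(n_sim, years))

t = np.arange(1, years + 1)
resistance = r0[:, None] * (1.0 - k * t[None, :])

# Cumulative failure: failed by year t if any earlier load exceeded the
# (then-current) resistance.
failed_by_t = np.cumsum(loads > resistance, axis=1) > 0
pf = failed_by_t.mean(axis=0)                      # P(failure within t years)

print(pf[9], pf[39])   # 10-year vs 40-year failure probability
```

    As expected, pf(t) is non-decreasing and the 40-year value exceeds the 10-year value, reflecting both load accumulation and strength deterioration.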

  1. Reliability analysis for the quench detection in the LHC machine

    CERN Document Server

    Denz, R; Vergara-Fernández, A

    2002-01-01

    The Large Hadron Collider (LHC) will incorporate a large number of superconducting elements that require protection in case of a quench. Key elements in the quench protection system are the electronic quench detectors. Their reliability will have an important impact on the downtime as well as on the operational cost of the collider. The expected rates of both false and missed quenches have been computed for several redundant detection schemes. The developed model takes account of the maintainability of the system in order to optimise the frequency of scheduled checks and to evaluate their influence on the performance of different detection topologies. Given the uncertainty in the component failure rates combined with the LHC tunnel environment, the study has been complemented with a sensitivity analysis of the results. The chosen detection scheme and the maintainability strategy for each detector family are given.
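
    The trade-off this record computes, between missed quenches and false trips under redundant detection, can be sketched for two simple voting logics. The per-channel probabilities below are illustrative assumptions, not the study's figures.

```python
# Sketch: missed-quench vs. false-trip trade-off for redundant detector
# voting logics, assuming independent channels. Numbers are illustrative.
def one_out_of_two(q_miss, p_false):
    """1oo2: the system trips if either channel trips."""
    return q_miss ** 2, 1 - (1 - p_false) ** 2

def two_out_of_two(q_miss, p_false):
    """2oo2: the system trips only if both channels trip."""
    return 1 - (1 - q_miss) ** 2, p_false ** 2

q, f = 1e-3, 1e-2   # assumed per-channel miss and false-trip probabilities
for name, logic in [("1oo2", one_out_of_two), ("2oo2", two_out_of_two)]:
    miss, false = logic(q, f)
    print(f"{name}: P(missed quench)={miss:.2e}, P(false trip)={false:.2e}")
```

    The 1oo2 scheme suppresses missed quenches at the cost of more false trips, and vice versa for 2oo2, which is why the choice of topology matters for downtime.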

  2. A reliability analysis of the revised competitiveness index.

    Science.gov (United States)

    Harris, Paul B; Houston, John M

    2010-06-01

    This study examined the reliability of the Revised Competitiveness Index by investigating the test-retest reliability, inter-item reliability, and factor structure of the measure based on a sample of 280 undergraduates (200 women, 80 men) ranging in age from 18 to 28 years (M = 20.1, SD = 2.1). The findings indicate that the Revised Competitiveness Index has high test-retest reliability, high inter-item reliability, and a stable factor structure. The results support the assertion that the Revised Competitiveness Index assesses competitiveness as a stable trait rather than a dynamic state.
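
    The inter-item reliability reported in this record is typically quantified with Cronbach's alpha. A minimal sketch of the computation, on made-up illustrative item scores rather than the study's data:

```python
# Sketch: Cronbach's alpha from a respondents-by-items score matrix.
# The scores below are illustrative, not the study's data.
def cronbach_alpha(items):
    """items: list of per-respondent lists of item scores."""
    k = len(items[0])          # number of items
    def var(xs):               # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[j] for row in items]) for j in range(k)]
    total_var = var([sum(row) for row in items])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

scores = [[4, 5, 4], [2, 2, 3], [5, 4, 5], [1, 2, 1], [3, 3, 4]]
print(f"alpha = {cronbach_alpha(scores):.3f}")
```

    Test-retest reliability, by contrast, is usually the correlation between total scores at two administrations rather than an internal-consistency statistic.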

  3. Derivation of reliable empirical models describing lead transfer from metal-polluted soils to radish (Raphanus sativa L.): Determining factors and soil criteria.

    Science.gov (United States)

    Zhang, Sha; Song, Jing; Cheng, Yinwen; Christie, Peter; Long, Jian; Liu, Lingfei

    2017-09-10

    Reliable models describing Pb transfer from soils to food crops are useful in the improvement of soil protection guidelines. This study provides mechanistic insights, from in-situ soil solution measurements, into Pb uptake in the root tissues (RF) of radish grown in 25 representative Pb-contaminated agricultural soils. Lead speciation and regression analysis indicate that >88.6% of the variation in RF Pb is attributable to free Pb(2+) activity (aPb(2+)) in the soil solution, which is predominantly controlled by pH and DOC. Higher DOC increases the total dissolved Pb (CSol-Pb) in the soil solution but reduces the bioavailability of Pb to radish. CSol-Pb performs poorly in predicting RF Pb unless pH and DOC are included. However, 0.01 M CaCl2-extractable Pb (CCC-Pb) alone can satisfactorily predict RF Pb, attributable to the fact that CCC-Pb is consistent with aPb(2+). CCC-Pb can be predicted using CSol-Pb and pH. Total soil Pb (CT-Pb) and 0.43 M HNO3-extractable Pb (CNA-Pb) have strong, non-linear correlations with CSol-Pb and CCC-Pb, so it is not surprising that CT-Pb or CNA-Pb, together with pH and CEC, can also satisfactorily predict RF Pb. The derived models are effective in identifying soils where RF Pb exceeds the food quality standard (FQS). Soil Pb criteria based on CT-Pb, CNA-Pb and CCC-Pb are derived by inverse use of the empirical models. The derived Pb criterion (target value) based on CCC-Pb is 0.02 mg kg(-1) and the stricter criterion (safe value) is 0.01 mg kg(-1), which allows a 5% probability for RF Pb to exceed the FQS. Safe values based on CT-Pb and CNA-Pb ranged from 26 to 1036 mg kg(-1) and from 9 to 745 mg kg(-1), respectively.
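
    The "inverse use of empirical models" step in this record can be sketched as follows: given a fitted transfer model predicting plant Pb from an extractable soil fraction, the soil criterion is the soil concentration at which the predicted plant Pb just reaches the food quality standard. The log-linear model form, coefficients, and FQS value below are illustrative assumptions, not the fitted ones from the study.

```python
# Sketch: deriving a soil criterion by inverting an empirical transfer model.
# Model form, coefficients, and FQS are illustrative assumptions.
import math

a, b = 1.2, 0.8   # assumed model: log10(RF_Pb) = a + b * log10(CCC_Pb)
FQS = 0.1         # assumed food quality standard for RF Pb (mg/kg)

def predict_rf(ccc_pb):
    """Forward use of the (assumed) transfer model."""
    return 10 ** (a + b * math.log10(ccc_pb))

# Inverse use: the CCC-Pb at which predicted RF Pb just reaches the FQS.
ccc_criterion = 10 ** ((math.log10(FQS) - a) / b)
print(f"derived CCC-Pb criterion: {ccc_criterion:.4f} mg/kg")
```

    A probabilistic criterion (such as the 5% exceedance "safe value" in the record) would additionally shift this deterministic inverse by the model's residual scatter.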

  4. The Environmental Kuznets Curve. An empirical analysis for OECD countries

    Energy Technology Data Exchange (ETDEWEB)

    Georgiev, E.

    2008-09-15

    This paper tests the Environmental Kuznets Curve hypothesis for four local (SOx, NOx, CO, VOC) and two global (CO2, GHG) air pollutants. Using a new panel data set of thirty OECD countries, the paper finds that the postulated inverted-U-shaped relationship between income and pollution does not hold for all gases. A meaningful Environmental Kuznets Curve exists only for CO, VOC and NOx, whereas for CO2 the curve is monotonically increasing. For GHG there is indication of an inverted-U-shaped relationship between income and pollution, but most countries are still on the increasing part of the curve and hence its future development is uncertain. For SOx, emissions were found to follow a U-shaped curve. Based on the empirical results, the paper concludes that the Environmental Kuznets Curve does not hold for all gases; it is an empirical artefact rather than a regularity.
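
    The curve shapes this record distinguishes are usually read off the signs of the fitted income coefficients: with emissions regressed on income and income squared, an inverted U requires a positive linear and a negative quadratic term, and the turning point falls at -b1/(2*b2). A small sketch of that classification, with illustrative coefficients rather than the paper's estimates:

```python
# Sketch: classifying an estimated income-pollution relationship from the
# coefficients of income (b1) and income^2 (b2). Values are illustrative.
def ekc_shape(b1, b2):
    if b1 > 0 and b2 < 0:
        return "inverted U", -b1 / (2 * b2)
    if b1 < 0 and b2 > 0:
        return "U-shaped", -b1 / (2 * b2)
    if b1 > 0:
        return "monotonically increasing", None
    return "monotonically decreasing", None

shape, turning_point = ekc_shape(b1=0.8, b2=-0.00002)
print(shape, f"turning point at income {turning_point:,.0f}")
```

    Whether the turning point lies inside or beyond the observed income range is what decides, as for GHG in the record, if the downward branch is actually supported by the data.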

  5. Lagrangian analysis of fluid transport in empirical vortex ring flows

    OpenAIRE

    Shadden, Shawn C.; Dabiri, John O.; Marsden, Jerrold E.

    2006-01-01

    In this paper we apply dynamical systems analyses and computational tools to fluid transport in empirically measured vortex ring flows. Measurements of quasisteadily propagating vortex rings generated by a mechanical piston-cylinder apparatus reveal lobe dynamics during entrainment and detrainment that are consistent with previous theoretical and numerical studies. In addition, the vortex ring wake of a free-swimming Aurelia aurita jellyfish is measured and analyzed in the framework of dynami...

  6. Islamic Banks and Financial Stability; An Empirical Analysis

    OpenAIRE

    Martin Cihak; Heiko Hesse

    2008-01-01

    The relative financial strength of Islamic banks is assessed empirically based on evidence covering individual Islamic and commercial banks in 18 banking systems with a substantial presence of Islamic banking. We find that (i) small Islamic banks tend to be financially stronger than small commercial banks; (ii) large commercial banks tend to be financially stronger than large Islamic banks; and (iii) small Islamic banks tend to be financially stronger than large Islamic banks, which may refle...

  7. Tax morale : theory and empirical analysis of tax compliance

    OpenAIRE

    Torgler, Benno

    2003-01-01

    Tax morale is puzzling in our society. Observations show that tax compliance cannot be satisfactorily explained by the level of enforcement. Other factors may well be relevant. This paper contains a short survey of important theoretical and empirical findings in the tax morale literature, focussing on personal income tax morale. The following three key topics are discussed: moral sentiments, fairness and the relationship between taxpayer and government. The survey stresses the ...

  8. An empirical analysis of after sales service and customer satisfaction

    OpenAIRE

    Hussain, Nazim; Waheed Akbar BHATTI; Azhar JILANI

    2011-01-01

    In today’s ever changing competitive environment, business cannot survive unless they satisfy their customers. The delivery of after sales service by a company is critical in satisfying customer needs and perceptions. In order to have quality after sales service a proper delivery system has to be in place. This is an empirical study on after sales quality of Pakistan’s automotive battery manufacturer. The research measured the quality of service in Atlas Battery, selling product with the bran...

  10. Credit risk determinants analysis: Empirical evidence from Chinese commercial banks

    OpenAIRE

    LU, ZONGQI

    2013-01-01

    Abstract In order to investigate the potential determinants of credit risk in Chinese commercial banks, a panel dataset comprising 342 bank-year observations from 2003 to 2012 for Chinese commercial banks is used to quantify the relationship between the selected variables and Chinese banks' credit risk. Based on several robustness tests, the empirical results suggest that the inflation rate and the loan loss provision are significantly positively related to Chinese commercial banks' credit risk; on the other hand, m...

  11. The Rules of Standard Setting Organizations: an Empirical Analysis

    OpenAIRE

    Chiao, Benjamin; Lerner, Josh; Tirole, Jean

    2006-01-01

    This paper empirically explores the procedures employed by standard-setting organizations. Consistent with Lerner-Tirole (2004), we find (a) a negative relationship between the extent to which an SSO is oriented to technology sponsors and the concession level required of sponsors and (b) a positive correlation between the sponsor-friendliness of the selected SSO and the quality of the standard. We also develop and test two extensions of the earlier model: the presence of provisions mandating ...

  12. Failure Analysis towards Reliable Performance of Aero-Engines

    Directory of Open Access Journals (Sweden)

    T. Jayakumar

    1999-10-01

    Full Text Available Aero-engines are critical components whose reliable performance decides the primary safety of an aircraft/helicopter. This is met by a rigorous maintenance schedule with periodic inspection/nondestructive testing of various engine components. In spite of these measures, failures of aero-engines do occur rather frequently in comparison to failures of other components. Systematic failure analysis helps one to identify the root cause of the failure, thus enabling remedial measures to prevent recurrence of such failures. Turbine blades made of nickel- or cobalt-based alloys are used in aero-engines. These blades are subjected to complex loading conditions at elevated temperatures. The main causes of failure of blades are attributed to creep, thermal fatigue and hot corrosion. Premature failure of blades in the combustion zone was reported in one of the aero-engines. The engine had both the compressor and the free turbine on a common shaft. Detailed failure analysis revealed the presence of creep voids in the blades that failed. Failure of turbine blades was also detected in another aero-engine operating in a coastal environment. In this failure, the protective coating on the blades was cracked at many locations. Grain boundary spikes were observed at these locations. The primary cause of this failure was hot corrosion followed by creep damage.

  13. Multi-Unit Considerations for Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    St. Germain, S.; Boring, R.; Banaseanu, G.; Akl, Y.; Chatri, H.

    2017-03-01

    This paper uses the insights from the Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H) methodology to help identify human actions currently modeled in the single unit PSA that may need to be modified to account for additional challenges imposed by a multi-unit accident, as well as to identify possible new human actions that might be modeled to more accurately characterize multi-unit risk. In identifying these potential human action impacts, the use of the SPAR-H strategy to include both errors in diagnosis and errors in action is considered, as well as identifying characteristics of a multi-unit accident scenario that may impact the selection of the performance shaping factors (PSFs) used in SPAR-H. The lessons learned from the Fukushima Daiichi reactor accident are addressed to further help identify areas where improved modeling may be required. While these multi-unit impacts may require modifications to a Level 1 PSA model, they are expected to be much more important for Level 2 modeling. There is little currently written specifically about multi-unit HRA issues. A review of related published research is presented. While this paper cannot answer all issues related to multi-unit HRA, it will hopefully serve as a starting point to generate discussion and spark additional ideas towards the proper treatment of HRA in a multi-unit PSA.
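
    The SPAR-H quantification step referenced in this record multiplies a nominal human error probability (HEP) by PSF weights, applying an adjustment factor when three or more negative PSFs are present so the result stays below 1.0. The formula follows the published SPAR-H method (NUREG/CR-6883); the multi-unit scenario weights chosen below are purely illustrative assumptions.

```python
# Sketch of SPAR-H HEP quantification with the negative-PSF adjustment.
# Nominal HEPs follow the SPAR-H method; scenario weights are illustrative.
NOMINAL_HEP = {"diagnosis": 0.01, "action": 0.001}

def spar_h_hep(task_type, psf_multipliers):
    nhep = NOMINAL_HEP[task_type]
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    negative = sum(1 for m in psf_multipliers if m > 1)
    if negative >= 3:  # adjustment factor keeps the HEP bounded below 1.0
        return nhep * composite / (nhep * (composite - 1) + 1)
    return nhep * composite

# e.g. a multi-unit accident diagnosis task with three assumed negative
# PSFs: extreme stress (x5), poor HSI (x10), high complexity (x2)
hep = spar_h_hep("diagnosis", [5, 10, 2])
print(f"adjusted diagnosis HEP = {hep:.3f}")
```

    Re-rating PSFs such as stress, complexity, and available time for the multi-unit context is precisely the kind of modification the paper argues a single-unit PSA's human actions may need.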

  14. Fuzzy Reliability Analysis of the Shaft of a Steam Turbine

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Field surveying shows that failure of the steam turbine's coupling is due to fatigue caused by compound stress. Fuzzy mathematics was applied to obtain the membership function of the fatigue strength rule. A formula for the fuzzy reliability of the coupling was derived and a theory of the coupling's fuzzy reliability established. The calculation method for the fuzzy reliability is explained by an illustrative example.
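
    The core idea in this record, replacing a crisp pass/fail fatigue criterion with a membership function, can be sketched with a simple linear ramp. The stress bounds below are illustrative assumptions, not the paper's fitted membership function.

```python
# Sketch: a fuzzy failure criterion. Instead of a crisp threshold at the
# fatigue limit, a membership function grades the degree of failure.
# The ramp bounds are illustrative assumptions (MPa).
def failure_membership(stress, s_lower=180.0, s_upper=220.0):
    """0 = definitely safe, 1 = definitely failed, linear ramp between."""
    if stress <= s_lower:
        return 0.0
    if stress >= s_upper:
        return 1.0
    return (stress - s_lower) / (s_upper - s_lower)

for s in (150, 200, 230):
    print(s, failure_membership(s))
```

    A fuzzy reliability formula then integrates this membership over the stress distribution rather than evaluating a single threshold exceedance probability.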

  15. A reliability generalization meta-analysis of coefficient alpha for the Reynolds Adolescent Depression Scale.

    Science.gov (United States)

    Vassar, Matt; Bradley, Greg

    2012-10-01

    The purpose of this study was to use a meta-analytic method known as reliability generalization to investigate the score reliability for a popular depression measure: The Reynolds Adolescent Depression Scale. We used the technique to provide an aggregate estimate of coefficient alpha across empirical studies that have employed the measure over time and across populations. Furthermore, we identified sample and demographic characteristics associated with variance in coefficient alpha. We discuss conditions associated with variability in coefficient alpha and alert researchers and practitioners to appropriate uses of the scale based on common reliability benchmarks.
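
    The aggregation step of a reliability generalization study, as described in this record, amounts to a weighted mean of coefficient alpha across studies. A minimal sketch with sample-size weighting; the (alpha, n) pairs are illustrative, not the meta-analysis data.

```python
# Sketch: sample-size-weighted aggregate of coefficient alpha across
# studies, as in reliability generalization. Pairs are illustrative.
studies = [(0.87, 120), (0.91, 340), (0.79, 60), (0.88, 210)]  # (alpha, n)

weighted_alpha = (sum(a * n for a, n in studies)
                  / sum(n for _, n in studies))
print(f"aggregate alpha = {weighted_alpha:.3f}")
```

    The moderator analysis mentioned in the record would then regress the per-study alphas on sample and demographic characteristics to explain the remaining variance.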

  16. Reliability of videotaped observational gait analysis in patients with orthopedic impairments

    NARCIS (Netherlands)

    Brunnekreef, J.J.; Uden, C. van; Moorsel, S. van; Kooloos, J.G.M.

    2005-01-01

    BACKGROUND: In clinical practice, visual gait observation is often used to determine gait disorders and to evaluate treatment. Several reliability studies on observational gait analysis have been described in the literature and generally showed moderate reliability. However, patients with orthopedic

  17. Wind energy Computerized Maintenance Management System (CMMS) : data collection recommendations for reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Peters, Valerie A.; Ogilvie, Alistair; Veers, Paul S.

    2009-09-01

    This report, written by Sandia National Laboratories, addresses the general data requirements for reliability analysis of fielded wind turbines and other wind plant equipment. It is intended to help the reader develop a basic understanding of what data are needed from a Computerized Maintenance Management System (CMMS) and other data systems for reliability analysis. The report provides: (1) a list of the data needed to support reliability and availability analysis; and (2) specific recommendations for a CMMS to support automated analysis. Though written for reliability analysis of wind turbines, much of the information is applicable to a wider variety of equipment and a wider variety of analysis and reporting needs.
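
    The automated analysis such a CMMS is meant to support boils down to computing reliability metrics from failure and repair records. A small sketch; the record schema and timestamps are illustrative assumptions, not the report's recommended format.

```python
# Sketch: MTBF, MTTR, and availability from CMMS-style failure records.
# Schema and timestamps are illustrative assumptions.
from datetime import datetime

# (turbine_id, failure_start, restored_to_service)
events = [
    ("T01", datetime(2009, 1, 10, 6, 0), datetime(2009, 1, 10, 18, 0)),
    ("T01", datetime(2009, 3, 2, 0, 0), datetime(2009, 3, 3, 0, 0)),
    ("T01", datetime(2009, 6, 15, 12, 0), datetime(2009, 6, 16, 0, 0)),
]

period_hours = 365 * 24
downtime = sum((end - start).total_seconds() / 3600
               for _, start, end in events)
n_failures = len(events)

mtbf = (period_hours - downtime) / n_failures   # mean time between failures
mttr = downtime / n_failures                    # mean time to repair
availability = (period_hours - downtime) / period_hours

print(f"MTBF={mtbf:.0f} h, MTTR={mttr:.1f} h, "
      f"availability={availability:.4f}")
```

    The report's point is that these computations only work if the CMMS reliably captures failure start, restoration time, and the affected component for every event.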

  18. Impact Factors of Energy Productivity in China: An Empirical Analysis

    Institute of Scientific and Technical Information of China (English)

    Wei Chu; Shen Manhong

    2007-01-01

    This article developed a decomposition model of energy productivity on the basis of the economic growth model. According to this model, four factors may influence China's energy productivity: technology improvement, resource allocation structure, industrial structure and institutional arrangement. An econometric model was then employed to test the four factors empirically on the basis of China's statistical data from 1978 to 2004. Results indicated that capital deepening contributes the most (207%) to energy efficiency improvement, while the impact of labor (13%) is the weakest among the resource factors; industrial structure (7%) and institutional innovation (9.5%) positively improve energy productivity.

  19. Reliable Classification of Geologic Surfaces Using Texture Analysis

    Science.gov (United States)

    Foil, G.; Howarth, D.; Abbey, W. J.; Bekker, D. L.; Castano, R.; Thompson, D. R.; Wagstaff, K.

    2012-12-01

    Communication delays and bandwidth constraints are major obstacles for remote exploration spacecraft. Due to such restrictions, spacecraft could make use of onboard science data analysis to maximize scientific gain, through capabilities such as the generation of bandwidth-efficient representative maps of scenes, autonomous instrument targeting to exploit targets of opportunity between communications, and downlink prioritization to ensure fast delivery of tactically-important data. Of particular importance to remote exploration is the precision of such methods and their ability to reliably reproduce consistent results in novel environments. Spacecraft resources are highly oversubscribed, so any onboard data analysis must provide a high degree of confidence in its assessment. The TextureCam project is constructing a "smart camera" that can analyze surface images to autonomously identify scientifically interesting targets and direct narrow field-of-view instruments. The TextureCam instrument incorporates onboard scene interpretation and mapping to assist these autonomous science activities. Computer vision algorithms map scenes such as those encountered during rover traverses. The approach, based on a machine learning strategy, trains a statistical model to recognize different geologic surface types and then classifies every pixel in a new scene according to these categories. We describe three methods for increasing the precision of the TextureCam instrument. The first uses ancillary data to segment challenging scenes into smaller regions having homogeneous properties. These subproblems are individually easier to solve, preventing uncertainty in one region from contaminating those that can be confidently classified. The second involves a Bayesian approach that maximizes the likelihood of correct classifications by abstaining from ambiguous ones. We evaluate these two techniques on a set of images acquired during field expeditions in the Mojave Desert. Finally, the
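
    The second technique in this record, abstaining from ambiguous classifications to maximize the likelihood of correct ones, can be sketched as a reject-option classifier: commit to a label only when the class posterior is confident enough. The class names, probabilities, and threshold below are illustrative assumptions.

```python
# Sketch: reject-option (abstaining) classification as described above.
# Classes, posteriors, and the threshold are illustrative assumptions.
def classify_with_abstain(posteriors, threshold=0.8):
    label, p = max(posteriors.items(), key=lambda kv: kv[1])
    return label if p >= threshold else "abstain"

pixels = [
    {"basalt": 0.95, "sandstone": 0.05},
    {"basalt": 0.55, "sandstone": 0.45},   # ambiguous -> abstain
]
print([classify_with_abstain(p) for p in pixels])  # prints ['basalt', 'abstain']
```

    For an oversubscribed spacecraft, abstentions are cheap compared to confidently wrong labels that could misdirect instrument targeting or downlink prioritization.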

  20. Reliability Analysis and Modeling of ZigBee Networks

    Science.gov (United States)

    Lin, Cheng-Min

    The architecture of ZigBee networks focuses on developing low-cost, low-speed ubiquitous communication between devices. The ZigBee technique is based on IEEE 802.15.4, which specifies the physical layer and medium access control (MAC) for a low-rate wireless personal area network (LR-WPAN). Currently, numerous wireless sensor networks have adopted the ZigBee open standard to develop various services to promote improved communication quality in our daily lives. The problem of system and network reliability in providing stable services has become more important because these services will be stopped if the system and network reliability is unstable. The ZigBee standard has three kinds of networks: star, tree and mesh. The paper models the ZigBee protocol stack from the physical layer to the application layer and analyzes the reliability and mean time to failure (MTTF) of each layer. Channel resource usage, device role, network topology and application objects are used to evaluate reliability in the physical, medium access control, network, and application layers, respectively. In star or tree networks, a series system and the reliability block diagram (RBD) technique can be used to solve the reliability problem. However, a division technique is applied to mesh networks because their complexity is higher than that of the other topologies. A mesh network using the division technique is classified into several non-reducible series systems and edge-parallel systems. Hence, the reliability of mesh networks is easily solved using series-parallel systems through our proposed scheme. The numerical results demonstrate that reliability increases for mesh networks when the number of edges in the parallel systems increases, while reliability drops quickly when the number of edges and the number of nodes increase, for all three networks. Greater resource usage is another factor that decreases reliability. However, lower network reliability will occur due to
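
    The series-parallel reliability block diagram (RBD) algebra this record applies to ZigBee topologies reduces to two composition rules. A minimal sketch; the per-link reliability value and the two tiny topologies are illustrative assumptions.

```python
# Sketch of series/parallel RBD algebra for network paths, assuming
# independent links. The link reliability and topologies are illustrative.
def series(*rs):
    """All blocks must work: product of reliabilities."""
    out = 1.0
    for r in rs:
        out *= r
    return out

def parallel(*rs):
    """At least one block must work: complement of all failing."""
    out = 1.0
    for r in rs:
        out *= (1 - r)
    return 1 - out

r_link = 0.95
two_hop_series = series(r_link, r_link)            # e.g. a star/tree path
with_redundancy = series(r_link, parallel(r_link, r_link))  # mesh-style

print(f"series path: {two_hop_series:.4f}, "
      f"with parallel redundancy: {with_redundancy:.4f}")
```

    This shows the record's numerical finding in miniature: adding a parallel edge raises path reliability, while adding hops in series lowers it.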

  1. Effectiveness and reliability analysis of emergency measures for flood prevention

    NARCIS (Netherlands)

    Lendering, K.T.; Jonkman, S.N.; Kok, M.

    2014-01-01

    During flood events emergency measures are used to prevent breaches in flood defences. However, there is still limited insight in their reliability and effectiveness. The objective of this paper is to develop a method to determine the reliability and effectiveness of emergency measures for flood

  3. Empirical analysis on temporal statistics of human correspondence patterns

    Science.gov (United States)

    Li, Nan-Nan; Zhang, Ning; Zhou, Tao

    2008-11-01

    Recently, extensive empirical evidence has shown that the timing of human behaviors obeys non-Poisson statistics with heavy-tailed interevent time distributions. In this paper, we empirically study the correspondence pattern of a great Chinese scientist, Hsue-Shen Tsien. Both the interevent time and response time distributions deviate from Poisson statistics, showing approximate power-law decay. The two power-law exponents are more or less the same (about 2.1), which strongly supports the hypothesis in [A. Vázquez, J.G. Oliveira, Z. Dezsö, K.-I. Goh, I. Kondor, A.-L. Barabási, Phys. Rev. E 73 (2006) 036127] that the response time distribution of the tasks could in fact drive the interevent time distribution, and that both distributions should decay with the same exponent. Our result is against the claim in the same work, which suggests that the human correspondence pattern belongs to a universality class with exponent 1.5.
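
    The exponent comparison at the heart of this record rests on estimating a power-law decay exponent from interevent times. A sketch using the standard continuous maximum-likelihood estimator on synthetic data (not Tsien's correspondence record), with the true exponent set to the paper's reported 2.1:

```python
# Sketch: ML estimation of a power-law exponent from interevent times
# (continuous approximation). The sample is synthetic, not the study data.
import math
import random

random.seed(1)
x_min, alpha_true = 1.0, 2.1
# inverse-CDF sampling from p(x) ~ x^(-alpha) for x >= x_min
sample = [x_min * (1 - random.random()) ** (-1 / (alpha_true - 1))
          for _ in range(5000)]

# continuous MLE: alpha_hat = 1 + n / sum(ln(x / x_min))
alpha_hat = 1 + len(sample) / sum(math.log(x / x_min) for x in sample)
print(f"estimated exponent: {alpha_hat:.2f}")
```

    Estimating both the interevent-time and response-time exponents this way, and checking whether they agree (as the ~2.1 values here do) or split toward 1.5, is exactly the test that separates the two hypotheses the record discusses.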

  4. Procedure for conducting a human-reliability analysis for nuclear power plants. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Bell, B.J.; Swain, A.D.

    1983-05-01

    This document describes in detail a procedure to be followed in conducting a human reliability analysis as part of a probabilistic risk assessment when such an analysis is performed according to the methods described in NUREG/CR-1278, Handbook for Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications. An overview of the procedure describing the major elements of a human reliability analysis is presented along with a detailed description of each element and an example of an actual analysis. An appendix consists of some sample human reliability analysis problems for further study.

  5. Wind turbine reliability : a database and analysis approach.

    Energy Technology Data Exchange (ETDEWEB)

    Linsday, James (ARES Corporation); Briand, Daniel; Hill, Roger Ray; Stinebaugh, Jennifer A.; Benjamin, Allan S. (ARES Corporation)

    2008-02-01

    The US wind industry has experienced remarkable growth since the turn of the century. At the same time, the physical size and electrical generation capabilities of wind turbines have also grown remarkably. As the market continues to expand, and as wind generation continues to gain a significant share of the generation portfolio, the reliability of wind turbine technology becomes increasingly important. This report addresses how operations and maintenance costs are related to unreliability - that is, the failures experienced by systems and components. Reliability tools are demonstrated, the data needed to understand and catalog failure events are described, and practical wind turbine reliability models are illustrated, including preliminary results. This report also presents a continuing process for managing industry requirements, needs, and expectations related to Reliability, Availability, Maintainability, and Safety. A simply stated goal of this process is to better understand and to improve the operable reliability of wind turbine installations.

  6. Advanced response surface method for mechanical reliability analysis

    Institute of Scientific and Technical Information of China (English)

    LÜ Zhen-zhou; ZHAO Jie; YUE Zhu-feng

    2007-01-01

    Based on the classical response surface method (RSM), a novel RSM using improved experimental points (EPs) is presented for reliability analysis. Two novel points are included in the presented method. One is the use of linear interpolation, by which the total EPs for determining the RS are selected to be closer to the actual failure surface; the other is the application of sequential linear interpolation to control the distance between the surrounding EPs and the center EP, by which the presented method ensures that the RS fits the actual failure surface in the region of maximum likelihood as the center EPs converge to the actual most probable point (MPP). Since the presented method increases the precision with which the RS fits the actual failure surface in the vicinity of the MPP, which contributes most to the probability of the failure surface being exceeded, the precision of the failure probability calculated from the RS is increased as well. Numerical examples illustrate the accuracy and efficiency of the presented method.
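
    The basic response-surface idea underlying this record can be sketched in a few lines: evaluate an "expensive" limit-state function at a small set of experimental points, fit a cheap surface through them, then estimate the failure probability on the surrogate. The limit-state function, distributions, and point layout below are illustrative assumptions (a single linear fit, without the paper's sequential interpolation refinements).

```python
# Sketch of the basic RSM loop: fit a cheap response surface to a few
# limit-state evaluations, then do Monte Carlo on the surrogate.
# Limit state and distributions are illustrative assumptions.
import random

random.seed(0)

def g_true(x1, x2):
    """'Expensive' limit-state function: failure when g < 0."""
    return 3.0 - 0.1 * x1 ** 2 - x2

# evaluate at center and axial experimental points,
# fit the linear surface g ~ g0 + b1*x1 + b2*x2 by central differences
c1, c2, h = 0.0, 0.0, 1.0
g0 = g_true(c1, c2)
b1 = (g_true(c1 + h, c2) - g_true(c1 - h, c2)) / (2 * h)
b2 = (g_true(c1, c2 + h) - g_true(c1, c2 - h)) / (2 * h)

def g_rs(x1, x2):
    """Cheap response-surface surrogate."""
    return g0 + b1 * (x1 - c1) + b2 * (x2 - c2)

N = 50_000
fails = sum(g_rs(random.gauss(0, 1), random.gauss(0, 1)) < 0
            for _ in range(N))
pf = fails / N
print(f"surrogate failure probability: {pf:.4f}")
```

    The paper's contribution addresses exactly the weakness visible here: a surface fitted around an arbitrary center point may fit poorly near the MPP, so the EPs are moved toward the actual failure surface iteratively.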

  7. Extending Failure Modes and Effects Analysis Approach for Reliability Analysis at the Software Architecture Design Level

    NARCIS (Netherlands)

    Sozer, Hasan; Tekinerdogan, Bedir; Aksit, Mehmet; Lemos, de Rogerio; Gacek, Cristina

    2007-01-01

    Several reliability engineering approaches have been proposed to identify and recover from failures. A well-known and mature approach is the Failure Mode and Effect Analysis (FMEA) method that is usually utilized together with Fault Tree Analysis (FTA) to analyze and diagnose the causes of failures.
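
    The fault tree analysis (FTA) mentioned in this record ultimately rests on combining basic-event probabilities through AND/OR gates. A minimal sketch, assuming independent events; the tree structure and probabilities are illustrative, not from the paper.

```python
# Sketch: fault-tree gate arithmetic, assuming independent basic events.
# The example tree and probabilities are illustrative assumptions.
def and_gate(*ps):
    """All inputs must occur."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def or_gate(*ps):
    """At least one input occurs."""
    out = 1.0
    for p in ps:
        out *= (1 - p)
    return 1 - out

# top event = (primary sensor fails AND backup fails) OR software fault
top = or_gate(and_gate(1e-2, 1e-2), 1e-4)
print(f"top event probability = {top:.2e}")
```

    In an FMEA-guided analysis at the architecture level, the failure modes identified per component would populate the basic events of such a tree.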

  8. Spinal appearance questionnaire: factor analysis, scoring, reliability, and validity testing.

    Science.gov (United States)

    Carreon, Leah Y; Sanders, James O; Polly, David W; Sucato, Daniel J; Parent, Stefan; Roy-Beaudry, Marjolaine; Hopkins, Jeffrey; McClung, Anna; Bratcher, Kelly R; Diamond, Beverly E

    2011-08-15

    Cross-sectional. This study presents the factor analysis and scoring of the Spinal Appearance Questionnaire (SAQ) and evaluates its psychometric properties. Although the SAQ has been administered to a large sample of patients with adolescent idiopathic scoliosis (AIS) treated surgically, its psychometric properties have not been fully evaluated. The SAQ and the Scoliosis Research Society-22 (SRS-22) were administered to AIS patients who were being observed, braced or scheduled for surgery. Standard demographic data and radiographic measures including Lenke type and curve magnitude were also collected. Of the 1802 patients, 83% were female, with a mean age of 14.8 years and mean initial Cobb angle of 55.8° (range, 0°-123°). Of the 32 items of the SAQ, 15 loaded on two factors with consistent and significant correlations across all Lenke types: an Appearance factor (items 1-10) and an Expectations factor (items 12-15). Responses are summed, giving a range of 5 to 50 for the Appearance domain and 5 to 20 for the Expectations domain. Cronbach's α was 0.88 for both domains and the Total score, with test-retest reliability of 0.81 for Appearance and 0.91 for Expectations. Correlations with major curve magnitude were higher for the SAQ Appearance and SAQ Total scores than for the SRS Appearance and SRS Total scores. The SAQ and SRS-22 scores were statistically significantly different in patients scheduled for surgery compared to those observed or braced. The SAQ is a valid measure of self-image in patients with AIS, with greater correlation to curve magnitude than the SRS Appearance and Total scores. It also discriminates patients who require surgery from those who do not.

  9. Empirical modeling and data analysis for engineers and applied scientists

    CERN Document Server

    Pardo, Scott A

    2016-01-01

    This textbook teaches advanced undergraduate and first-year graduate students in Engineering and Applied Sciences to gather and analyze empirical observations (data) in order to aid in making design decisions. While science is about discovery, the primary paradigm of engineering and "applied science" is design. Scientists are in the discovery business and want, in general, to understand the natural world rather than to alter it. In contrast, engineers and applied scientists design products, processes, and solutions to problems. That said, statistics, as a discipline, is mostly oriented toward the discovery paradigm. Young engineers come out of their degree programs having taken courses such as "Statistics for Engineers and Scientists" without any clear idea as to how they can use statistical methods to help them design products or processes. Many seem to think that statistics is only useful for demonstrating that a device or process actually does what it was designed to do. Statistics courses emphasize creati...

  10. Reliability Modeling and Analysis of SCI Topological Network

    Directory of Open Access Journals (Sweden)

    Hongzhe Xu

    2012-03-01

    Full Text Available The problem of reliability modeling of Scalable Coherent Interface (SCI) rings and topological networks is studied. The reliability models of three SCI rings are developed and the factors which influence the reliability of SCI rings are studied. By calculating the shortest-path matrix and the path-quantity matrix of different types of SCI network topologies, the communication characteristics of SCI networks are obtained. For node-damage and edge-damage situations, the survivability of the SCI topological network is studied.
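
    The shortest-path matrix this record uses to characterize SCI communication can be computed with the standard Floyd-Warshall algorithm. A small sketch on an illustrative 4-node unidirectional ring (SCI ring links carry traffic one way), not a topology taken from the paper:

```python
# Sketch: shortest-path (hop-count) matrix of a small unidirectional ring
# via Floyd-Warshall. The 4-node topology is illustrative.
INF = float("inf")

def shortest_paths(n, edges):
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for u, v in edges:
        d[u][v] = 1          # directed link, one hop
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

# 4-node unidirectional ring: 0 -> 1 -> 2 -> 3 -> 0
ring = [(0, 1), (1, 2), (2, 3), (3, 0)]
d = shortest_paths(4, ring)
print(d[0][3], d[3][1])   # prints "3 2"
```

    Recomputing this matrix after deleting a node or an edge is one direct way to quantify the node-damage and edge-damage survivability the record mentions.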

  11. System Reliability Analysis of Redundant Condition Monitoring Systems

    Institute of Scientific and Technical Information of China (English)

    YI Pengxing; HU Youming; YANG Shuzi; WU Bo; CUI Feng

    2006-01-01

    The development and application of new reliability models and methods are presented to analyze the system reliability of complex condition monitoring systems. The methods include a method for analyzing the failure modes of a type of redundant condition monitoring system (RCMS) by invoking a fault tree model, Markov modeling techniques for analyzing the system reliability of RCMS, and methods for estimating Markov model parameters. Furthermore, a computing case is investigated and conclusions from this case are summarized. Results show that the method proposed here is practical and valuable for designing condition monitoring systems and their maintenance.
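
    The Markov modeling approach in this record can be sketched with a three-state model of a duplicated (redundant) monitoring channel: both up, one up, both down. The failure and repair rates below are illustrative assumptions, and the stationary distribution is found by iterating a discretized transition matrix.

```python
# Sketch: steady-state availability of a redundant monitoring channel
# from a 3-state Markov model. Rates are illustrative assumptions.
lam, mu = 0.001, 0.1   # per-hour failure and repair rates per channel
dt = 0.1               # discretization step (hours)

# states: 0 = both channels up, 1 = one up, 2 = both down
P = [
    [1 - 2 * lam * dt, 2 * lam * dt, 0.0],
    [mu * dt, 1 - (mu + lam) * dt, lam * dt],
    [0.0, mu * dt, 1 - mu * dt],
]

pi = [1.0, 0.0, 0.0]
for _ in range(50_000):   # iterate toward the stationary distribution
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

availability = pi[0] + pi[1]   # system up if at least one channel is up
print(f"steady-state availability = {availability:.6f}")
```

    Estimating lam and mu from field failure and repair data is the parameter-estimation step the record lists alongside the Markov model itself.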

  12. Application of Reliability Analysis for Optimal Design of Monolithic Vertical Wall Breakwaters

    DEFF Research Database (Denmark)

    Burcharth, H. F.; Sørensen, John Dalsgaard; Christiani, E.

    1995-01-01

    Reliability analysis and reliability-based design of monolithic vertical wall breakwaters are considered. Probabilistic models of some of the most important failure modes are described. The failures are sliding and slip surface failure of a rubble mound and a clay foundation. Relevant design variables are identified and a reliability-based design optimization procedure is formulated. Results from an illustrative example are given.

  13. Reliability analysis of wind turbines exposed to dynamic loads

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2014-01-01

    Wind turbines are exposed to highly dynamic loads that cause fatigue and extreme load effects which are subject to significant uncertainties. Further, reduction of the cost of energy for wind turbines is very important in order to make wind energy competitive compared to other energy sources. Therefore the turbine components should be designed to have sufficient reliability with respect to both extreme and fatigue loads and also not be too costly (and safe). This paper presents models for uncertainty modeling and reliability assessment of especially the structural components such as tower and blades … the reliability of the structural components. Illustrative examples are presented considering uncertainty modeling and reliability assessment for structural wind turbine components exposed to extreme loads and fatigue, respectively.

  14. Operation of Reliability Analysis Center (FY85-87)

    Science.gov (United States)

    1988-08-01

    … environmental conditions at the time of the reported failure as well as the exact nature of the failure. The diskette format (FMDR-21A) contains … based upon the reliability and maintainability standards and tasks delineated in NAC R&M-STD-ROO010 (Reliability Program Requirements Selection). These … characteristics, environmental conditions at the time of the reported failure, and the exact nature of the failure, which has been categorized as follows …

  15. reliability reliability

    African Journals Online (AJOL)

    eobe

    The design variables for the design of the slab … The presence of uncertainty in the analysis and design of engineering … however, for certain complex elements, the methods … Standard BS EN 1990, CEN, European Committee for Standardization.

  16. Detection of Decreasing Vegetation Cover Based on Empirical Orthogonal Function and Temporal Unmixing Analysis

    OpenAIRE

    Di Xu; Ruishan Chen; Xiaoshi Xing; Wenpeng Lin

    2017-01-01

    Vegetation plays an important role in the energy exchange of the land surface, biogeochemical cycles, and hydrological cycles. MODIS (MODerate-resolution Imaging Spectroradiometer) EVI (Enhanced Vegetation Index) is considered as a quantitative indicator for examining dynamic vegetation changes. This paper applied a new method of integrated empirical orthogonal function (EOF) and temporal unmixing analysis (TUA) to detect the vegetation decreasing cover in Jiangsu Province of China. The empir...

  17. RELIABILITY ANALYSIS OF THE PRIMARY CYLINDER OF THE 10 MN HYDRAULIC PRESS

    Institute of Scientific and Technical Information of China (English)

    Zhao Jingyi; Zhuoru; Wang Yiqun

    2000-01-01

    According to the demand for high reliability of the primary cylinder of the hydraulic press, the reliability model of the primary cylinder is built after its reliability analysis. The stress of the primary cylinder is analyzed with the finite element software MARC, and the structural reliability of the cylinder is predicted based on a stress-strength model, which provides a reference for the design.
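
    As a hedged illustration of the stress-strength interference model named above (not the cited study's finite element computation), reliability is R = P(strength > stress); for independent normal strength and stress this has a closed form, which a direct simulation can sanity-check. All numbers below are invented:

```python
# Illustrative stress-strength interference sketch: R = P(strength > stress)
# for independent normal strength and stress. Parameters are invented
# assumptions, not values from the cited analysis.
import math
import random

def stress_strength_reliability(mu_s, sd_s, mu_l, sd_l):
    """Closed form: R = Phi((mu_s - mu_l) / sqrt(sd_s^2 + sd_l^2))."""
    z = (mu_s - mu_l) / math.hypot(sd_s, sd_l)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def monte_carlo_check(mu_s, sd_s, mu_l, sd_l, trials=200_000, seed=1):
    """Direct simulation of P(strength > stress) as a sanity check."""
    rng = random.Random(seed)
    wins = sum(rng.gauss(mu_s, sd_s) > rng.gauss(mu_l, sd_l)
               for _ in range(trials))
    return wins / trials
```

    With strength ~ N(600, 40²) and stress ~ N(450, 30²), both forms give R ≈ 0.9987.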

  18. Data envelopment analysis in service quality evaluation: an empirical study

    Science.gov (United States)

    Najafi, Seyedvahid; Saati, Saber; Tavana, Madjid

    2015-10-01

    Service quality is often conceptualized as the comparison between service expectations and the actual performance perceptions. It enhances customer satisfaction, decreases customer defection, and promotes customer loyalty. Substantial literature has examined the concept of service quality, its dimensions, and measurement methods. We introduce the perceived service quality index (PSQI) as a single measure for evaluating the multiple-item service quality construct based on the SERVQUAL model. A slack-based measure (SBM) of efficiency with constant inputs is used to calculate the PSQI. In addition, a non-linear programming model based on the SBM is proposed to delineate an improvement guideline and improve service quality. An empirical study is conducted to assess the applicability of the method proposed in this study. A large number of studies have used DEA as a benchmarking tool to measure service quality. These models do not propose a coherent performance evaluation construct and consequently fail to deliver improvement guidelines for improving service quality. The DEA models proposed in this study are designed to evaluate and improve service quality within a comprehensive framework and without any dependency on external data.

  19. Security Governance – An Empirical Analysis of the Norwegian Context

    Directory of Open Access Journals (Sweden)

    Martin Nøkleberg

    2016-05-01

    Full Text Available This article explores local security governance in the city of Bergen, and it thus highlights what characterizes security governance within a Norwegian context. The burgeoning policing literature suggests that we live in a pluralized and networked society; ideas of cooperation have thus been perceived as important for the effectiveness of security governance. Cooperative relations between public and private actors are the main focus of this article, and such arrangements are empirically explored in the city of Bergen on the basis of the theoretical frameworks of state-anchored pluralism and nodal governance. The key finding is that there seems to be an unfulfilled potential in the security governance of Bergen. The public police have difficulty cooperating with, and exploiting the potential possessed by, the private security industry. It is suggested that these difficulties stem from a mentality problem within the police institution, derived from nodal governance: the police are influenced by a punishment mentality and view themselves as the only actor that can and should maintain security.

  20. An Empirical Analysis on Credit Risk Models and its Application

    Directory of Open Access Journals (Sweden)

    Joocheol Kim

    2014-08-01

    Full Text Available This study intends to focus on introducing credit default risk with widely used credit risk models in an effort to empirically test whether the models hold their validity, apply to financial institutions which usually are highly levered with various types of debts, and finally reinterpret the results in computing adequate collateral level in the over-the-counter derivatives market. By calculating the distance-to-default values using historical market data for South Korean banks and brokerage firms as suggested in Merton model and KMV’s EDF model, we find that the performance of the introduced models well reflect the credit quality of the sampled financial institutions. Moreover, we suggest that in addition to the given credit ratings of different financial institutions, their distance-to-default values can be utilized in determining the sufficient level of credit support. Our suggested “smoothened” collateral level allows both contractual parties to minimize their costs caused from provision of collateral without undertaking additional credit risk and achieve efficient collateral management.

  1. Regime switching model for financial data: Empirical risk analysis

    Science.gov (United States)

    Salhi, Khaled; Deaconu, Madalina; Lejay, Antoine; Champagnat, Nicolas; Navet, Nicolas

    2016-11-01

    This paper constructs a regime switching model for univariate Value-at-Risk estimation. Extreme value theory (EVT) and hidden Markov models (HMM) are combined to estimate a hybrid model that takes volatility clustering into account. In the first stage, HMM is used to classify data into crisis and steady periods, while in the second stage, EVT is applied to the previously classified data to remove the delay between regime switches and their detection. This new model is applied to prices of numerous stocks exchanged on NYSE Euronext Paris over the period 2001-2011. We focus on daily returns, for which calibration has to be done on a small dataset. The relative performance of the regime switching model is benchmarked against other well-known modeling techniques, such as stable, power law and GARCH models. The empirical results show that the regime switching model increases the predictive performance of financial forecasting according to the number of violations and tail-loss tests. This suggests that the regime switching model is a robust forecasting variant of the power law model while remaining practical to implement for VaR measurement.
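
    The violation count mentioned in the abstract is the simplest of these backtests: a 99% VaR forecast should be exceeded on roughly 1% of days. The sketch below is illustrative only, with simulated Gaussian returns rather than the study's stock data:

```python
# Sketch of a VaR violation-count backtest. The simulated Gaussian
# returns and the fixed-volatility 99% VaR forecast are illustrative
# assumptions, not the paper's regime switching model.
import random

def violation_rate(returns, var_forecasts):
    """Fraction of days whose loss exceeds the (positive) VaR forecast."""
    hits = sum(r < -v for r, v in zip(returns, var_forecasts))
    return hits / len(returns)

rng = random.Random(42)
rets = [rng.gauss(0.0, 0.02) for _ in range(250)]   # one trading year
var99 = [2.326 * 0.02] * 250   # normal 99% VaR using the true sigma
rate = violation_rate(rets, var99)   # expected to be near 0.01
```

    A model whose observed violation rate differs significantly from the nominal 1% (by a binomial test such as Kupiec's) is rejected.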

  2. RELIABILITY ANALYSIS FOR AN AERO ENGINE TURBINE DISK UNDER LOW CYCLE FATIGUE CONDITION

    Institute of Scientific and Technical Information of China (English)

    C.L. Liu; Z.Z. Lü; Y.L. Xu

    2004-01-01

    Reliability analysis methods based on the linear damage accumulation law (LDAL) and the load-life interference model are studied in this paper. According to the equal probability rule, equivalent loads are derived, and a reliability analysis method based on the load-life interference model and a recurrence formula is constructed. In conjunction with a finite element analysis (FEA) program, the reliability of an aero engine turbine disk under low cycle fatigue (LCF) conditions has been analyzed. The results show the turbine disk is safe and the above reliability analysis methods are feasible.

  3. Reliability Analysis for the Fatigue Limit State of the ASTRID Offshore Platform

    NARCIS (Netherlands)

    Vrouwenvelder, A.C.W.M.; Gostelie, E.M.

    1986-01-01

    A reliability analysis with respect to fatigue failure was performed for a concrete gravity platform designed for the Troll field. The reliability analysis was incorporated in the practical design-loop to gain more insight into the complex fatigue problem. In the analysis several parameters relating

  4. A 16-year examination of domestic violence among Asians and Asian Americans in the empirical knowledge base: a content analysis.

    Science.gov (United States)

    Yick, Alice G; Oomen-Early, Jody

    2008-08-01

    Until recently, research studies have implied that domestic violence does not affect Asian American and immigrant communities, or even Asians abroad, because ethnicity or culture has not been addressed. In this content analysis, the authors examined trends in publications in leading scholarly journals on violence relating to Asian women and domestic violence. A coding schema was developed, with two raters coding the data with high interrater reliability. Sixty articles were published over the 16 years studied, most of them atheoretical and focused on individual levels of analysis. The terms used in discussing domestic violence reflected a feminist perspective. Three quarters of the studies were empirical, most guided by logical positivism and using quantitative designs. Most targeted specific Asian subgroups (almost a third focused on Asian Indians) rather than categorizing Asians as a general ethnic category. The concept of "Asian culture" was most often assessed by discussing Asian family structure. Future research is discussed in light of the findings.

  5. Gold price analysis based on ensemble empirical model decomposition and independent component analysis

    Science.gov (United States)

    Xian, Lu; He, Kaijian; Lai, Kin Keung

    2016-07-01

    In recent years, the increasing volatility of the gold price has received increasing attention from academia and industry alike. Due to the complexity and significant fluctuations observed in the gold market, however, most current approaches have failed to produce robust and consistent modeling and forecasting results. Ensemble Empirical Mode Decomposition (EEMD) and Independent Component Analysis (ICA) are novel data analysis methods that can deal with nonlinear and non-stationary time series. This study introduces a new methodology which combines the two methods and applies it to gold price analysis. This involves three steps: first, the original gold price series is decomposed into several Intrinsic Mode Functions (IMFs) by EEMD. Second, the IMFs are further processed, with unimportant ones re-grouped, and a new set of data called Virtual Intrinsic Mode Functions (VIMFs) is reconstructed. Finally, ICA is used to decompose the VIMFs into statistically Independent Components (ICs). The decomposition results reveal that the gold price series can be represented by a linear combination of the ICs. Furthermore, the economic meanings of the ICs are analyzed and discussed in detail according to their trends and transformation coefficients. The analyses not only explain the inner driving factors and their impacts but also provide an in-depth analysis of how these factors affect the gold price. Regression analysis has also been conducted to verify the analysis. Results from the empirical studies in the gold markets show that EEMD-ICA serves as an effective technique for gold price analysis from a new perspective.

  6. A Bivariate Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Hilger, Klaus Baggesen; Andersen, Ole Baltazar

    2002-01-01

    This paper describes the application of canonical correlations analysis to the joint analysis of global monthly mean values of 1996-1997 sea surface temperature (SST) and height (SSH) data. The SST data are considered as one set and the SSH data as another set of multivariate observations, both … as, for example, an increase in the SST will lead to an increase in the SSH. The analysis clearly shows the build-up of one of the largest El Niño events on record. Also the analysis indicates a phase lag of approximately one month between the SST and SSH fields.

  7. Methods for communication-network reliability analysis - Probabilistic graph reduction

    Science.gov (United States)

    Shooman, Andrew M.; Kershenbaum, Aaron

    The authors have designed and implemented a graph-reduction algorithm for computing the k-terminal reliability of an arbitrary network with possibly unreliable nodes. The two contributions of the present work are a version of the delta-y transformation for k-terminal reliability and an extension of Satyanarayana and Wood's polygon-to-chain transformations to handle graphs with imperfect vertices. The exact algorithm is at least as fast as that of Satyanarayana and Wood, and as the simple algorithm without delta-y and polygon-to-chain transformations, for every problem considered. The exact algorithm runs in linear time on series-parallel graphs and is faster than the above-stated algorithms for large problems, which otherwise run in exponential time. The approximate algorithms reduce the computation time for the network reliability problem by two to three orders of magnitude for large problems, while providing reasonably accurate answers in most cases.
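
    For intuition, k-terminal reliability can be computed by brute force on small graphs, which is how reduction algorithms like the one above are typically cross-checked. This sketch assumes perfect nodes for brevity (the paper also handles unreliable ones), and the example network is invented:

```python
# Brute-force k-terminal reliability by enumerating all edge states.
# Exponential in the number of edges, so only usable as a cross-check
# for small graphs. Nodes are assumed perfect here for brevity.
from itertools import product

def k_terminal_reliability(nodes, edges, terminals):
    """edges: list of (u, v, p) with independent edge success probability p.
    Returns P(all terminals lie in one connected component)."""
    total = 0.0
    for states in product((0, 1), repeat=len(edges)):
        prob = 1.0
        for up, (u, v, p) in zip(states, edges):
            prob *= p if up else 1.0 - p
        # Union-find over the surviving edges.
        parent = {node: node for node in nodes}
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x
        for up, (u, v, _) in zip(states, edges):
            if up:
                parent[find(u)] = find(v)
        if len({find(t) for t in terminals}) == 1:
            total += prob
    return total
```

    Two 0.9 edges in series between terminals a and c give 0.81; in parallel between a and b they give 0.99 (1 minus 0.1 squared).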

  8. Reliability Analysis of Random Vibration Transmission Path Systems

    Directory of Open Access Journals (Sweden)

    Wei Zhao

    2017-01-01

    Full Text Available Vibration transmission path systems are generally composed of the vibration source, the vibration transfer path, and the vibration receiving structure. The transfer path is the medium of the vibration transmission, and the randomness of the transfer path greatly influences the transfer reliability. In this paper, based on matrix calculus, the generalized second moment technique, and stochastic finite element theory, an effective approach for the transfer reliability of vibration transfer path systems is provided. The transfer reliability of a vibration transfer path system with uncertain path parameters, including path mass and path stiffness, is analyzed theoretically and computed numerically, and the corresponding mathematical expressions are derived. This provides a theoretical foundation for the dynamic design of vibration systems in practical projects, so that random path parameters can be taken into account and system resonance failure can be avoided.

  9. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    DEFF Research Database (Denmark)

    Barberis Negra, Nicola; Bak-Jensen, Birgitte; Holmstrøm, O.

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays … in a reliability model, and the generation of a windfarm is evaluated by means of sequential Monte Carlo simulation. Results are used to analyse how each mentioned factor influences the assessment, and why and when they should be included in the model.

  10. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    DEFF Research Database (Denmark)

    Negra, Nicola Barberis; Holmstrøm, Ole; Bak-Jensen, Birgitte

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays … in a reliability model, and the generation of a windfarm is evaluated by means of sequential Monte Carlo simulation. Results are used to analyse how each mentioned factor influences the assessment, and why and when they should be included in the model.

  11. Technical information report: Plasma melter operation, reliability, and maintenance analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hendrickson, D.W. [ed.

    1995-03-14

    This document provides a technical report of operability, reliability, and maintenance of a plasma melter for low-level waste vitrification, in support of the Hanford Tank Waste Remediation System (TWRS) Low-Level Waste (LLW) Vitrification Program. A process description is provided that minimizes maintenance and downtime and includes material and energy balances, equipment sizes and arrangement, startup/operation/maintenance/shutdown cycle descriptions, and a basis for scale-up to a 200 metric ton/day production facility. Operational requirements are provided including utilities, feeds, labor, and maintenance. Equipment reliability estimates and maintenance requirements are provided, including a list of failure modes, responses, and consequences.

  12. Reliability modeling and analysis of smart power systems

    CERN Document Server

    Karki, Rajesh; Verma, Ajit Kumar

    2014-01-01

    The volume presents research work in understanding, modeling and quantifying the risks associated with different ways of implementing smart grid technology in power systems, in order to plan and operate a modern power system with an acceptable level of reliability. Power systems throughout the world are undergoing significant changes, creating new challenges to system planning and operation in order to provide reliable and efficient use of electrical energy. The appropriate use of smart grid technology is an important driver in mitigating these problems and requires considerable research acti

  13. Embedded mechatronic systems 1 analysis of failures, predictive reliability

    CERN Document Server

    El Hami, Abdelkhalak

    2015-01-01

    In operation, mechatronic embedded systems are stressed by loads of different causes: climate (temperature, humidity), vibration, electrical and electromagnetic. The stresses in components that induce failure mechanisms should be identified and modeled for better control. AUDACE is a collaborative project of the cluster Mov'eo that addresses issues specific to the reliability of embedded mechatronic systems, by analyzing the causes of failure of components of onboard mechatronic systems. The goal of the project is to optimize the design of mechatronic devices for reliability. The projec

  14. Empirical analysis between industrial structure and economic growth of china

    Institute of Scientific and Technical Information of China (English)

    姚西龙; 尤津; 郝鹏飞

    2008-01-01

    In recent years, the relationship between industrial structure and economic growth has attracted more and more attention from scholars. According to the theory of industrial structure and economic development, this article uses regression analysis to estimate the three major industries' contributions to Chinese economic growth, applies cluster analysis methods, and then discusses how to optimize the industrial structure.

  15. Cost of Illness and Cost Containment Analysis Using Empirical Antibiotic Therapy in Sepsis Patients in Bandung

    Directory of Open Access Journals (Sweden)

    Rano K. Sinuraya

    2012-12-01

    Full Text Available The aims of this study were to analyze the cost of illness (COI) and perform a cost containment analysis for empirical antibiotic therapy in sepsis patients with respiratory infection in a hospital in Bandung. A cross-sectional study was conducted retrospectively. Data were collected from medical records of inpatient sepsis patients with respiratory infections receiving empirical antibiotic therapy with ceftazidime-levofloxacin or cefotaxime-erythromycin. Direct and indirect costs were calculated and analyzed in this study. The results showed that the average COI for patients on the ceftazidime-levofloxacin combination was 13,369,055 IDR, whereas for the cefotaxime-erythromycin combination it was 22,250,495 IDR. In summary, the COI of empirical antibiotic therapy with ceftazidime-levofloxacin was lower than with cefotaxime-erythromycin. The cost containment achieved by using ceftazidime-levofloxacin, without reducing service quality, was 8,881,440 IDR.

  16. Exploring Advertising in Higher Education: An Empirical Analysis in North America, Europe, and Japan

    Science.gov (United States)

    Papadimitriou, Antigoni; Blanco Ramírez, Gerardo

    2015-01-01

    This empirical study explores higher education advertising campaigns displayed in five world cities: Boston, New York, Oslo, Tokyo, and Toronto. The study follows a mixed-methods research design relying on content analysis and multimodal semiotic analysis and employs a conceptual framework based on the knowledge triangle of education, research,…

  18. Architecture-Based Reliability Analysis of Web Services

    Science.gov (United States)

    Rahmani, Cobra Mariam

    2012-01-01

    In a Service Oriented Architecture (SOA), the hierarchical complexity of Web Services (WS) and their interactions with the underlying Application Server (AS) create new challenges in providing a realistic estimate of WS performance and reliability. The current approaches often treat the entire WS environment as a black-box. Thus, the sensitivity…

  19. Windfarm generation assessment for reliability analysis of power systems

    DEFF Research Database (Denmark)

    Negra, N.B.; Holmstrøm, O.; Bak-Jensen, B.;

    2007-01-01

    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays...

  20. Reliability analysis of common hazardous waste treatment processes

    Energy Technology Data Exchange (ETDEWEB)

    Waters, R.D. [Vanderbilt Univ., Nashville, TN (United States)

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.
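
    A rough sketch of the kind of Monte Carlo experiment described, relating a deterministic safety factor to a reliability level for one treatment unit. The lognormal load model and its coefficient of variation are invented assumptions, not the study's data:

```python
# Monte Carlo link between safety factor and reliability for a single
# treatment unit. Load is modeled as lognormal with median 1 and an
# assumed coefficient of variation; capacity is sf * median load.
import math
import random

def reliability_for_safety_factor(sf, cov_load=0.3, trials=100_000, seed=7):
    """P(capacity >= load) when capacity = sf * median load (median = 1)."""
    rng = random.Random(seed)
    sigma = math.sqrt(math.log(1.0 + cov_load ** 2))  # lognormal shape
    hits = sum(rng.lognormvariate(0.0, sigma) <= sf for _ in range(trials))
    return hits / trials
```

    Under these assumptions a safety factor of 1 gives roughly 50% reliability against the median load, and raising it to 2 pushes reliability above 98%.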

  1. Fiber Access Networks: Reliability Analysis and Swedish Broadband Market

    Science.gov (United States)

    Wosinska, Lena; Chen, Jiajia; Larsen, Claus Popp

    Fiber access network architectures such as active optical networks (AONs) and passive optical networks (PONs) have been developed to support the growing bandwidth demand. Whereas Swedish operators in particular prefer AON, this may not be the case for operators in other countries. The choice depends on a combination of technical requirements, practical constraints, business models, and cost. Due to the increasing importance of reliable access to network services, connection availability is becoming one of the most crucial issues for access networks, which should be reflected in the network owner's architecture decision. In many cases protection against failures is realized by adding backup resources. However, there is a trade-off between the cost of protection and the level of service reliability, since improving reliability performance by duplication of network resources (and capital expenditures, CAPEX) may be too expensive. In this paper we present the evolution of fiber access networks and compare reliability performance in relation to investment and management cost for some representative cases. We consider both standard and novel architectures for deployment in both sparsely and densely populated areas. While some recent works have focused on PON protection schemes with reduced CAPEX, current and future effort should be put on minimizing the operational expenditures (OPEX) during the access network lifetime.

  2. Statistical Analysis of Human Reliability of Armored Equipment

    Institute of Scientific and Technical Information of China (English)

    LIU Wei-ping; CAO Wei-guo; REN Jing

    2007-01-01

    Human errors in seven types of armored equipment, occurring during field tests, are statistically analyzed. The ratio of human errors to armored equipment failures is obtained. The causes of human errors are analyzed and the distribution law of human errors is acquired. The ratio of human errors and the human reliability index are also calculated.

  3. Exploratory factor analysis and reliability analysis with missing data: A simple method for SPSS users

    Directory of Open Access Journals (Sweden)

    Bruce Weaver

    2014-09-01

    Full Text Available Missing data is a frequent problem for researchers conducting exploratory factor analysis (EFA) or reliability analysis. The SPSS FACTOR procedure allows users to select listwise deletion, pairwise deletion or mean substitution as a method for dealing with missing data. The shortcomings of these methods are well known. Graham (2009) argues that a much better way to deal with missing data in this context is to use a matrix of expectation maximization (EM) covariances (or correlations) as input for the analysis. SPSS users who have the Missing Values Analysis add-on module can obtain vectors of EM means and standard deviations plus EM correlation and covariance matrices via the MVA procedure. But unfortunately, MVA has no /MATRIX subcommand, and therefore cannot write the EM correlations directly to a matrix dataset of the type needed as input to the FACTOR and RELIABILITY procedures. We describe two macros that (in conjunction with an intervening MVA command) carry out the data management steps needed to create two matrix datasets, one containing EM correlations and the other EM covariances. Either of those matrix datasets can then be used as input to the FACTOR procedure, and the EM correlations can also be used as input to RELIABILITY. We provide an example that illustrates the use of the two macros to generate the matrix datasets and how to use those datasets as input to the FACTOR and RELIABILITY procedures. We hope that this simple method for handling missing data will prove useful to both students and researchers who are conducting EFA or reliability analysis.

  4. Institutionalism and the Commission's Executive Discretion: an Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Fabio Franchino

    1998-07-01

    Full Text Available Theory: The adoption of EC secondary legislation can be analyzed from the perspective of agency theory, whereby Member States and the Parliament delegate policy authority to the Commission and design ex-post control procedures (i.e., Comitology). Rational choice and sociological institutionalisms differ in their predictions on the way rules and norms affect the extent of executive discretion. Hypothesis: Three institutionalist hypotheses are tested. The rationalist one derives from a Bayesian game developed by the author. It posits that the Commission's executive discretion in non-amending secondary legislation is a function of: (1) formal legislative procedure, (2) information asymmetry and (3) distribution of principals' preferences. A fourth variable, legislative instrument, is also included. The diluted rationalist hypothesis substitutes formal with informal procedure in one policy area. The socio-rational hypothesis adds two new variables, namely the opinions of the Parliament and the Economic and Social Committee. A final co-graduation test is conducted on whether more discretion leads to more stringent ex-post control. Methods: Given the bimodal error structure of the regression model, I have bootstrapped the regression coefficients and computed the 95% confidence intervals of the null hypothesis. Bootstrapping has also been used to test the role of the European Parliament, of opinions, and the co-graduation between discretion and ex-post control. A stratified sample of non-amending secondary legislation adopted from 1987 to 1993 has been drawn to test the hypotheses. Results: The diluted rationalist hypothesis is the most accurate. Information asymmetry, informal legislative procedures and legislative instruments are statistically and substantively relevant in explaining executive discretion. Distribution of preferences has weak explanatory power, probably because of the lack of reliable data and appropriate measurement.
    The Parliament and opinions do

  5. Institutionalism and Commissions Executive Discretion: an Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Fabio Franchino

    1998-07-01

    Full Text Available Theory: The adoption of EC secondary legislation can be analyzed from the perspective of agency theory whereby Member States and the Parliament delegate policy authority to the Commission and design ex-post control procedures (i.e. Comitology. Rational choice and sociological institutionalisms differ in their predictions on the way rules and norms affect the extent of executive discretion. Hypothesis: Three institutionalist hypotheses are tested. The rationalist one derives from a Bayesian game developed by the author. It posits that Commissions executive discretion in non amending secondary legislation is a function of: 1 formal legislative procedure, 2 information asymmetry and 3 distribution of principals preferences. A fourth variable, legislative instrument, is also included. The diluted rationalist hypothesis substitutes formal with informal procedure in one policy area. The socio-rational hypothesis adds two new variables, that is the opinions of the Parliament and the Economic and Social Committee. A final co-graduation test is conducted on whether more discretion leads to more stringent ex-post control. Methods: Given the bimodal error structure of the regression model, I have bootstrapped the regression coefficients and computed the 95% confidence intervals of the null hypothesis. Bootstrapping has also been used to test the role of the European Parliament, of opinions and the co-graduation between discretion and ex-post control. A stratified sample of non amending secondary legislation adopted from 1987 to 1993 has been drawn to test the hypotheses. Results: The diluted rationalist hypothesis is the most accurate. Information asymmetry, informal legislative procedures and legislative instruments are statistically and substantively relevant in explaining executive discretion. Distribution of preferences has weak explanatory power probably because of the lack of reliable data and appropriate measurement. 
The Parliament and opinions do
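The bootstrapping of regression coefficients described above can be sketched as a percentile bootstrap over resampled observations. This is a minimal sketch on synthetic data; the regressors (an "information asymmetry" variable and an irrelevant control) are hypothetical stand-ins, not the study's dataset:

```python
import numpy as np

def bootstrap_coef_ci(X, y, n_boot=2000, seed=0):
    """Percentile bootstrap 95% confidence intervals for OLS
    coefficients, resampling observations with replacement."""
    rng = np.random.default_rng(seed)
    n = len(y)
    betas = np.empty((n_boot, X.shape[1]))
    for i in range(n_boot):
        idx = rng.integers(0, n, n)
        betas[i], *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    return np.percentile(betas, [2.5, 97.5], axis=0)

# synthetic "discretion" data: one informative regressor, one noise term
rng = np.random.default_rng(7)
n = 120
x1 = rng.standard_normal(n)   # e.g. information asymmetry (hypothetical)
x2 = rng.standard_normal(n)   # irrelevant control
y = 1.0 + 0.8 * x1 + rng.normal(0, 0.5, n)
X = np.column_stack([np.ones(n), x1, x2])
ci = bootstrap_coef_ci(X, y)  # rows: 2.5% and 97.5% bounds per coefficient
```

The percentile bootstrap makes no normality assumption on the errors, which is the point of using it under a bimodal error structure.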

  6. An empirical EEG analysis in brain death diagnosis for adults.

    Science.gov (United States)

    Chen, Zhe; Cao, Jianting; Cao, Yang; Zhang, Yue; Gu, Fanji; Zhu, Guoxian; Hong, Zhen; Wang, Bin; Cichocki, Andrzej

    2008-09-01

    Electroencephalogram (EEG) is often used in the confirmatory test for brain death diagnosis in clinical practice. Because EEG recording and monitoring is relatively safe for patients in deep coma, it is believed to be valuable for either reducing the risk of brain death diagnosis (compared with other tests such as the apnea test) or preventing mistaken diagnosis. The objective of this paper is to study several statistical methods for quantitative EEG analysis in order to help bedside or ambulatory monitoring or diagnosis. We apply signal processing and quantitative statistical analysis to the EEG recordings of 32 adult patients. For EEG signal processing, independent component analysis (ICA) was applied to separate the independent source components, followed by Fourier and time-frequency analysis. For quantitative EEG analysis, we apply several statistical complexity measures to the EEG signals and evaluate the differences between two groups of patients: subjects in deep coma, and subjects categorized as brain dead. We report statistically significant differences of quantitative statistics with real-life EEG recordings in this clinical study, and we also present interpretation and discussion of the preliminary experimental results.
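As one illustration of the kind of statistical complexity measure applied to such EEG recordings, a normalized spectral entropy can be computed with NumPy alone. This is a generic stand-in, not the authors' exact statistic; the signals below are synthetic:

```python
import numpy as np

def spectral_entropy(x):
    """Normalized spectral entropy in [0, 1]: low for a strongly
    rhythmic signal, high for broadband activity."""
    psd = np.abs(np.fft.rfft(x - x.mean())) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum() / np.log(len(psd)))

fs = 250.0                                     # sampling rate, Hz
t = np.arange(0, 4.0, 1.0 / fs)
rng = np.random.default_rng(0)
rhythmic = np.sin(2 * np.pi * 10 * t)          # strong 10 Hz rhythm
noise = rng.standard_normal(t.size)            # broadband activity
h_rhythmic = spectral_entropy(rhythmic)        # near 0
h_noise = spectral_entropy(noise)              # near 1
```

A narrowband rhythm concentrates power in one frequency bin (entropy near 0), while broadband activity spreads it out (entropy near 1); the contrast is the kind of separable feature such group comparisons rely on.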

  7. An Empirical Kaiser Criterion.

    Science.gov (United States)

    Braeken, Johan; van Assen, Marcel A L M

    2016-03-31

    In exploratory factor analysis (EFA), the most popular methods for dimensionality assessment, such as the scree plot, the Kaiser criterion, or the current gold standard, parallel analysis, are based on eigenvalues of the correlation matrix. To further the understanding and development of factor retention methods, results on population and sample eigenvalue distributions are introduced based on random matrix theory and Monte Carlo simulations. These results are used to develop a new factor retention method, the Empirical Kaiser Criterion. The performance of the Empirical Kaiser Criterion and parallel analysis is examined in typical research settings with multiple scales that are desired to be relatively short but still reliable. Theoretical and simulation results illustrate that the new Empirical Kaiser Criterion performs as well as parallel analysis in typical research settings with uncorrelated scales, but much better when scales are both correlated and short. We conclude that the Empirical Kaiser Criterion is a powerful and promising factor retention method, because it is based on distribution theory of eigenvalues, shows good performance, is easily visualized and computed, and is useful for power analysis and sample size planning for EFA.
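The eigenvalue-based logic shared by parallel analysis and the Empirical Kaiser Criterion can be illustrated with a minimal sketch of Horn's parallel analysis: retain the factors whose observed correlation-matrix eigenvalues exceed the mean eigenvalues of random data of the same shape. The two-factor dataset is synthetic:

```python
import numpy as np

def parallel_analysis(data, n_sims=200, seed=0):
    """Horn's parallel analysis: count leading eigenvalues of the
    observed correlation matrix that exceed the mean eigenvalues of
    random normal data of the same shape."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rand_eig = np.zeros(p)
    for _ in range(n_sims):
        r = rng.standard_normal((n, p))
        rand_eig += np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]
    rand_eig /= n_sims
    above = obs_eig > rand_eig
    # index of the first eigenvalue NOT above the random-data mean
    return p if np.all(above) else int(np.argmin(above))

# two correlated blocks of three indicators each -> two factors expected
rng = np.random.default_rng(1)
f = rng.standard_normal((500, 2))
data = np.hstack([f[:, [0]]] * 3 + [f[:, [1]]] * 3) + 0.5 * rng.standard_normal((500, 6))
n_factors = parallel_analysis(data)
```

The Empirical Kaiser Criterion replaces the simulated reference eigenvalues with reference values derived analytically from random matrix theory, which is what makes it cheap to compute and visualize.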

  8. Using Confirmatory Factor Analysis for Construct Validation: An Empirical Review

    Science.gov (United States)

    DiStefano, Christine; Hess, Brian

    2005-01-01

    This study investigated the psychological assessment literature to determine what applied researchers are using and reporting from confirmatory factor analysis (CFA) studies for evidence of construct validation. One hundred and one articles published in four major psychological assessment journals between 1990 and 2002 were systematically…

  9. INVESTIGATING THE "COMPLEMENTARITY HYPOTHESIS" IN GREEK AGRICULTURE: AN EMPIRICAL ANALYSIS

    OpenAIRE

    Katrakilidis, Constantinos P.; Tabakis, Nikolaos M.

    2001-01-01

    This study investigates determinants of private capital formation in Greek agriculture and tests the "complementarity" against the "crowding out" hypothesis using multivariate cointegration techniques and ECVAR modeling in conjunction with variance decomposition and impulse response analysis. The results provide evidence of a significant positive causal effect of government spending on private capital formation, thus supporting the "complementarity" hypothesis for Greek agriculture.

  10. 188 An Empirical Investigation of Value-Chain Analysis and ...

    African Journals Online (AJOL)

    User

    This research work was designed to examine the impact of the Value-Chain Analysis on Competitive ... market create economic value and when few competing firms are engaging in .... Trace costs of activities – The company needs an accounting ... margins, return on assets, benchmarking, and capital budgeting. When a.

  11. Preventive Replacement Decisions for Dragline Components Using Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Nuray Demirel

    2016-05-01

    Full Text Available Reliability-based maintenance policies allow qualitative and quantitative evaluation of system downtimes by revealing the main causes of breakdowns and discussing the preventive activities required against failures. Application of preventive maintenance is especially important for mining machinery, since production is highly affected by machinery breakdowns. Overburden stripping operations are one of the integral parts of surface coal mine production. Draglines are extensively utilized in overburden stripping operations, and they achieve earthmoving activities with bucket capacities up to 168 m3. The massive structure and operational severity of these machines increase the importance of performance awareness for individual working components. Research on draglines is rarely observed in the literature, and maintenance studies for these earthmovers have generally been ignored. On this basis, this paper offers a comprehensive reliability assessment for two draglines currently operating in the Tunçbilek coal mine and discusses preventive replacement for wear-out components of the draglines considering cost factors.
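Preventive replacement of a wear-out component under cost factors is commonly formulated as the classical age-replacement problem: choose the replacement age T minimizing the expected cost per unit time. A minimal sketch under an assumed Weibull failure law with hypothetical costs (not the paper's data):

```python
import math

def cost_rate(T, shape, scale, cp, cf, n_grid=2000):
    """Expected cost per unit time for age replacement at T under a
    Weibull(shape, scale) failure law:
    (cp * R(T) + cf * F(T)) / E[min(life, T)]."""
    R = lambda t: math.exp(-((t / scale) ** shape))
    # E[min(life, T)] = integral of R(t) over [0, T] (trapezoidal rule)
    h = T / n_grid
    area = 0.5 * (R(0) + R(T)) * h + sum(R(i * h) for i in range(1, n_grid)) * h
    return (cp * R(T) + cf * (1.0 - R(T))) / area

# hypothetical wear-out component: Weibull(shape=3, scale=1000 h),
# preventive replacement cost 1 unit, failure (corrective) cost 10 units
shape, scale, cp, cf = 3.0, 1000.0, 1.0, 10.0
grid = range(100, 2001, 25)
T_opt = min(grid, key=lambda T: cost_rate(T, shape, scale, cp, cf))
```

Replacement only pays off for wear-out components (shape parameter above 1); for a constant failure rate, early replacement cannot reduce the cost rate.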

  12. Reliability Analysis and Standardization of Spacecraft Command Generation Processes

    Science.gov (United States)

    Meshkat, Leila; Grenander, Sven; Evensen, Ken

    2011-01-01

    • In order to reduce commanding errors that are caused by humans, we create an approach and corresponding artifacts for standardizing the command generation process and conducting risk management during the design and assurance of such processes.
    • The literature review conducted during the standardization process revealed that very few atomic-level human activities are associated with even a broad set of missions.
    • Applicable human reliability metrics for performing these atomic-level tasks are available.
    • The process for building a "Periodic Table" of Command and Control Functions as well as Probabilistic Risk Assessment (PRA) models is demonstrated.
    • The PRA models are executed using data from human reliability data banks.
    • The Periodic Table is related to the PRA models via Fault Links.

  13. Analysis on Operation Reliability of Generating Units in 2005

    Institute of Scientific and Technical Information of China (English)

    Zuo Xiaowen; Chu Xue

    2007-01-01

    The weighted average equivalent availability factor of thermal power units in 2005 was 92.34%, an increase of 0.64 percentage points as compared to that in 2004. The average equivalent availability factor in 2005 was 92.22%, a decrease of 0.95 percentage points as compared to that in 2004. The nationwide operation reliability of generating units in 2005 was analyzed completely in this paper.

  14. Reliability Analysis for Tunnel Supports System by Using Finite Element Method

    Directory of Open Access Journals (Sweden)

    E. Bukaçi

    2016-09-01

    Full Text Available Reliability analysis is a method that can be used in almost any geotechnical engineering problem. Using this method requires knowledge of parameter uncertainties, which can be expressed by their standard deviation values. By performing reliability analysis on tunnel support design, a range of safety factors can be obtained, and from them the probability of failure can be calculated. The problem becomes more complex when this analysis is performed with numerical methods such as the Finite Element Method. This paper shows how reliability analysis can be performed in the design of tunnel supports, using the Point Estimate Method to calculate the reliability index. As a case study, one of the energy tunnels at the Fan hydropower plant in Rrëshen, Albania, is chosen. As results, values of the factor of safety and probability of failure are calculated. Some suggestions on using reliability analysis with numerical methods are also given.

  15. Reliability importance analysis of Markovian systems at steady state using perturbation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Phuc Do Van [Institut Charles Delaunay - FRE CNRS 2848, Systems Modeling and Dependability Group, Universite de technologie de Troyes, 12, rue Marie Curie, BP 2060-10010 Troyes cedex (France); Barros, Anne [Institut Charles Delaunay - FRE CNRS 2848, Systems Modeling and Dependability Group, Universite de technologie de Troyes, 12, rue Marie Curie, BP 2060-10010 Troyes cedex (France)], E-mail: anne.barros@utt.fr; Berenguer, Christophe [Institut Charles Delaunay - FRE CNRS 2848, Systems Modeling and Dependability Group, Universite de technologie de Troyes, 12, rue Marie Curie, BP 2060-10010 Troyes cedex (France)

    2008-11-15

    Sensitivity analysis has been primarily defined for static systems, i.e. systems described by combinatorial reliability models (fault or event trees). Several structural and probabilistic measures have been proposed to assess the components importance. For dynamic systems including inter-component and functional dependencies (cold spare, shared load, shared resources, etc.), and described by Markov models or, more generally, by discrete events dynamic systems models, the problem of sensitivity analysis remains widely open. In this paper, the perturbation method is used to estimate an importance factor, called multi-directional sensitivity measure, in the framework of Markovian systems. Some numerical examples are introduced to show why this method offers a promising tool for steady-state sensitivity analysis of Markov processes in reliability studies.
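The steady-state distribution of a Markov reliability model, and a finite-difference stand-in for the sensitivity measure discussed above, can be sketched for a two-state repairable machine (the rates are hypothetical):

```python
import numpy as np

def steady_state(Q):
    """Steady-state distribution pi of a CTMC with generator Q:
    solve pi Q = 0 with sum(pi) = 1 by replacing one equation
    with the normalization constraint."""
    n = Q.shape[0]
    A = Q.T.copy()
    A[-1, :] = 1.0
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)

# two-state machine: up (0) <-> down (1), failure rate lam, repair rate mu
lam, mu = 0.01, 0.5
Q = np.array([[-lam, lam],
              [mu, -mu]])
pi = steady_state(Q)
availability = pi[0]            # analytic value: mu / (lam + mu)

# finite-difference sensitivity of availability to the repair rate mu
h = 1e-6
Qh = np.array([[-lam, lam], [mu + h, -(mu + h)]])
sens_mu = (steady_state(Qh)[0] - availability) / h
```

The perturbation approach of the paper estimates such derivatives directly from the model structure rather than by re-solving a perturbed system; the finite difference above is only a numerical check of what the sensitivity measure quantifies.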

  16. Actuarial and actual analysis of surgical results: empirical validation.

    Science.gov (United States)

    Grunkemeier, G L; Anderson, R P; Starr, A

    2001-06-01

    This report validates the use of the Kaplan-Meier (actuarial) method of computing survival curves by comparing 12-year estimates published in 1978 with current assessments. It also contrasts cumulative incidence curves, referred to as "actual" analysis in the cardiac-related literature, with Kaplan-Meier curves for thromboembolism, and demonstrates that the former estimates the percentage of events that will actually occur.
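The Kaplan-Meier estimator itself is straightforward to compute: at each observed event time, the survival estimate is multiplied by (1 - deaths/at-risk). A minimal sketch with a small hypothetical censored sample:

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.  `events` is 1 for an observed
    event, 0 for a censored observation.  Returns (time, S(t)) pairs."""
    times = np.asarray(times, float)
    events = np.asarray(events, int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    surv, s = [], 1.0
    for t in np.unique(times[events == 1]):
        at_risk = np.sum(times >= t)
        d = np.sum((times == t) & (events == 1))
        s *= 1.0 - d / at_risk
        surv.append((float(t), s))
    return surv

# 6 hypothetical patients; observations at times 2 and 5 are censored
times  = [1, 2, 3, 4, 5, 6]
events = [1, 0, 1, 1, 0, 1]
curve = kaplan_meier(times, events)
```

The contrast drawn in the abstract is that Kaplan-Meier treats competing risks (e.g. death before thromboembolism) as censoring, whereas cumulative incidence ("actual") curves do not, so the latter estimate the fraction of patients who will actually experience the event.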

  17. Dissecting Situational Strength: Theoretical Analysis and Empirical Tests

    Science.gov (United States)

    2012-09-01

    approaches such as difference scores and profile similarity indices (see Edwards, 2007; Shanock, Baran, Gentry, Pattison, & Heggestad, 2010). In addition...and (2) via analysis of indirect, actual measures of fit through polynomial regression and response surfaces (Edwards, 2007; Shanock et al., 2010...Thoresen, Bono, & Patton, 2001; see also Herman, 1973; Smith, 1977) suggests that job attitudes are related to job performance more strongly in situations

  18. Empirical Analysis of Religiosity as Predictor of Social Media Addiction

    Directory of Open Access Journals (Sweden)

    Jamal J Almenayes

    2015-10-01

    Full Text Available This study sought to examine the dimensions of social media addiction and its relationship to religiosity. To investigate the matter, the present research utilized a well-known Internet addiction scale and modified it to fit social media (Young, 1996). Based on factor analysis of items generated by a sample of 1326 participants, three addiction factors were apparent: "Social Consequences", "Time Displacement" and "Compulsive Feelings". These factors were later regressed on a scale of religiosity, which contained a single factor based on factor analysis. The relationship between religiosity and social media addiction was then examined using linear regression. The results indicated that only two of the addiction factors were significantly related to religiosity. Future research should address the operationalization of the concept of religiosity to account for multiple dimensions.

  19. Calculation of Empirical and True Maintenance Coefficients by Flux Balance Analysis

    Institute of Scientific and Technical Information of China (English)

    MaHongwu; ZhaoXueming; 等

    2002-01-01

    The stoichiometric matrix of a simplified metabolic network in Bacillus subtilis was constructed from the flux balance equations, which were used for reconciliation of the measured rates and determination of the inner metabolic rates. Thus more reliable results for the true and empirical maintenance coefficients were obtained. The true maintenance coefficient is linearly related to the specific growth rate and changes with the P/O ratio. The measured biomass yield of adenosine triphosphate (ATP) is also linearly related to the P/O ratio.

  20. Reliability Analysis of Bearing Capacity of Large-Diameter Piles under Osterberg Test

    Directory of Open Access Journals (Sweden)

    Lei Nie

    2013-05-01

    Full Text Available This study presents a reliability analysis of the bearing capacity of large-diameter piles under the Osterberg test. The limit state equation of dimensionless random variables is utilized in the reliability analysis of the vertical bearing capacity of large-diameter piles based on Osterberg loading tests, and the reliability index and the resistance partial coefficient under the current specifications are calculated using the calibration method. The results show that the reliability index of large-diameter piles is correlated with the load effect ratio and is smaller than that of ordinary piles; a resistance partial coefficient of 1.53 is proper in the design of large-diameter piles.
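A reliability-index computation of this kind can be illustrated with the simplest case, the Cornell index for a linear limit state g = R - S with independent normal resistance and load effect; the pile numbers below are hypothetical, not the paper's calibration data:

```python
import math

def reliability_index(mu_R, sig_R, mu_S, sig_S):
    """Cornell reliability index for the limit state g = R - S with
    independent normal resistance R and load effect S."""
    return (mu_R - mu_S) / math.sqrt(sig_R**2 + sig_S**2)

def failure_probability(beta):
    """P_f = Phi(-beta), via the complementary error function."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

# hypothetical pile: resistance 1200 kN (std 180), load 600 kN (std 120)
beta = reliability_index(1200.0, 180.0, 600.0, 120.0)
pf = failure_probability(beta)
```

Calibration then works in the other direction: partial coefficients in the design code are tuned so that designs satisfying the code achieve a target reliability index.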

  1. Analysis of System Reliability in Manufacturing Cell Based on Triangular Fuzzy Number

    Institute of Scientific and Technical Information of China (English)

    ZHANG Caibo; HAN Botang; SUN Changsen; XU Chunjie

    2006-01-01

    Test data and field data are scarce in reliability research during the design stage of a manufacturing cell system, which increases the difficulty of such research. In order to deal with the deficient data and the uncertainty arising from analysis and judgment, this paper discusses a method for studying the reliability of a manufacturing cell system through fuzzy fault tree analysis based on triangular fuzzy numbers. Finally, a calculation case indicates that the method is of great significance for ascertaining reliability indexes and for establishing maintenance strategies for manufacturing cell systems.
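Fuzzy fault tree gates over triangular fuzzy numbers are commonly evaluated with component-wise approximations of the crisp gate formulas. A minimal sketch with hypothetical basic events (not the paper's case data):

```python
def tfn_and(*gates):
    """AND gate: component-wise product approximation for triangular
    fuzzy probabilities (a, m, b) = (lower, modal, upper)."""
    a = m = b = 1.0
    for (x, y, z) in gates:
        a, m, b = a * x, m * y, b * z
    return (a, m, b)

def tfn_or(*gates):
    """OR gate: 1 - prod(1 - p), applied component-wise."""
    a = m = b = 1.0
    for (x, y, z) in gates:
        a, m, b = a * (1 - x), m * (1 - y), b * (1 - z)
    return (1 - a, 1 - m, 1 - b)

# hypothetical basic events of a manufacturing-cell fault tree
tool_wear = (0.01, 0.02, 0.04)
control_fault = (0.005, 0.01, 0.02)
top = tfn_or(tool_wear, control_fault)   # fuzzy top-event probability
```

The appeal during the design stage is that each (lower, modal, upper) triple can encode expert judgment instead of scarce test or field data, while the gate arithmetic stays as simple as in the crisp case.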

  2. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eom, H.S.; Kim, J.H.; Lee, J.C.; Choi, Y.R.; Moon, S.S

    2000-12-01

    Reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system which is under development in our current project. The contents of this report are: 1. Survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries including the nuclear industry, and we selected a few which are suitable for our robot system: fault tree analysis, failure mode and effect analysis, reliability block diagram, Markov model, combinational method, and simulation method. 2. Survey of the characteristics of robot systems which distinguish them from other systems and which are important to the analysis. 3. Survey of the nuclear environmental factors which affect the reliability and safety analysis of robot systems. 4. Collection of case studies of robot reliability and safety analysis performed in foreign countries. The analysis results of this survey will be applied to the improvement of reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.

  3. IFRS and Stock Returns: An Empirical Analysis in Brazil

    Directory of Open Access Journals (Sweden)

    Rodrigo F. Malaquias

    2016-09-01

    Full Text Available In recent years, the convergence of accounting standards has been an issue that has motivated new studies in the accounting field. It is expected that the convergence provides users, especially external users of accounting information, with comparable reports among different economies. Considering this scenario, this article was developed in order to compare the effect of accounting numbers on the stock market before and after the accounting convergence in Brazil. The sample of the study involved Brazilian companies listed at BM&FBOVESPA that had American Depository Receipts (levels II and III) at the New York Stock Exchange (NYSE). For data analysis, descriptive statistics and graphic analysis were employed in order to analyze the behavior of stock returns around the publication dates. The main results indicate that the stock market reacts to the accounting reports. Therefore, the accounting numbers contain relevant information for the decision making of investors in the stock market. Moreover, it is observed that after the accounting convergence, the stock returns of the companies seem to present lower volatility.

  4. Tourism Competitiveness Index – An Empirical Analysis Romania vs. Bulgaria

    Directory of Open Access Journals (Sweden)

    Mihai CROITORU

    2011-09-01

    Full Text Available In the conditions of the current economic downturn, many specialists consider tourism one of the sectors with the greatest potential to provide worldwide economic growth and development. A growing tourism sector can contribute effectively to employment, increase national income, and can also make a decisive mark on the balance of payments. Thus, tourism can be an important driving force for growth and prosperity, especially in emerging economies, being a key element in reducing poverty and regional disparities. Despite its contribution to economic growth, tourism sector development can be undermined by a series of economic and legislative barriers that can affect the competitiveness of this sector. In this context, the World Economic Forum proposes, via the Tourism Competitiveness Index (TCI), in addition to a methodology to identify key factors that contribute to increasing tourism competitiveness, tools for the analysis and evaluation of these factors. Against this background, this paper aims to analyze the underlying determinants of the TCI from the perspective of two directly competing states, Romania and Bulgaria, in order to highlight the effects of communication on the competitiveness of the tourism sector. The purpose of this analysis is to provide some answers, especially in terms of communication strategies, which may explain the completely different performances of the two national economies in the tourism sector.

  5. Hybrid modeling and empirical analysis of automobile supply chain network

    Science.gov (United States)

    Sun, Jun-yan; Tang, Jian-ming; Fu, Wei-ping; Wu, Bing-ying

    2017-05-01

    Based on the connection mechanism of nodes which automatically select upstream and downstream agents, a simulation model for dynamic evolutionary process of consumer-driven automobile supply chain is established by integrating ABM and discrete modeling in the GIS-based map. Firstly, the rationality is proved by analyzing the consistency of sales and changes in various agent parameters between the simulation model and a real automobile supply chain. Second, through complex network theory, hierarchical structures of the model and relationships of networks at different levels are analyzed to calculate various characteristic parameters such as mean distance, mean clustering coefficients, and degree distributions. By doing so, it verifies that the model is a typical scale-free network and small-world network. Finally, the motion law of this model is analyzed from the perspective of complex self-adaptive systems. The chaotic state of the simulation system is verified, which suggests that this system has typical nonlinear characteristics. This model not only macroscopically illustrates the dynamic evolution of complex networks of automobile supply chain but also microcosmically reflects the business process of each agent. Moreover, the model construction and simulation of the system by means of combining CAS theory and complex networks supplies a novel method for supply chain analysis, as well as theory bases and experience for supply chain analysis of auto companies.

  6. Acquisition and statistical analysis of reliability data for I and C parts in plant protection system

    Energy Technology Data Exchange (ETDEWEB)

    Lim, T. J.; Byun, S. S.; Han, S. H.; Lee, H. J.; Lim, J. S.; Oh, S. J.; Park, K. Y.; Song, H. S. [Soongsil Univ., Seoul (Korea)

    2001-04-01

    This project has been performed in order to construct I and C part reliability databases for detailed analysis of the plant protection system and to develop a methodology for analysing trip set point drifts. A reliability database for the I and C parts of the plant protection system is required to perform the detailed analysis. First, we developed an electronic part reliability prediction code based on MIL-HDBK-217F. Then we collected generic reliability data for the I and C parts in the plant protection system. A statistical analysis procedure has been developed to process the data, and the generic reliability database has been constructed. We also collected plant-specific reliability data for the I and C parts in the plant protection system for the YGN 3,4 and UCN 3,4 units. The plant-specific reliability database for I and C parts has been developed by the Bayesian procedure. We also developed a statistical analysis procedure for set point drift, and performed an analysis of drift effects for trip set points. The basis for the detailed analysis can be provided from the reliability database for the PPS I and C parts. The safety of the KSNP and succeeding NPPs can be proved by reducing the uncertainty of PSA. Economic and efficient operation of NPPs can be made possible by optimizing the test period to reduce the utility's burden. 14 refs., 215 figs., 137 tabs. (Author)
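Bayesian procedures for blending generic and plant-specific failure data are commonly conjugate Gamma-Poisson updates of a failure rate; this sketch uses that standard scheme with hypothetical numbers, not the project's actual prior or evidence:

```python
def gamma_poisson_update(alpha0, beta0, failures, exposure_hours):
    """Conjugate Bayesian update of a failure rate lambda ~ Gamma:
    posterior alpha = alpha0 + observed failures,
    posterior beta  = beta0  + observed exposure time."""
    return alpha0 + failures, beta0 + exposure_hours

# generic prior (hypothetical): mean 1e-5 /h encoded as Gamma(0.5, 5e4)
alpha0, beta0 = 0.5, 5.0e4
# plant-specific evidence (hypothetical): 2 failures in 3e5 component-hours
alpha, beta = gamma_poisson_update(alpha0, beta0, 2, 3.0e5)
posterior_mean = alpha / beta   # updated failure-rate estimate, per hour
```

The posterior mean sits between the generic prior mean and the raw plant-specific rate, weighted by the relative amounts of prior pseudo-evidence and observed exposure, which is exactly why the approach suits sparse plant-specific data.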

  7. Empirical Analysis of Urban Residents’ Perceived Climatic Change Risks

    Institute of Scientific and Technical Information of China (English)

    Peihui DAI; Lingling HUANG

    2014-01-01

    The impact of climate change on human survival and security and on urban development is profound, and it receives more and more attention. To explore urban residents' perception of climate change risks and put forward corresponding countermeasures and suggestions, taking Wuhan as an example and starting from the microscopic point of view of urban residents, we use factor analysis to classify the perceived risks and recognized risk reduction measures, cluster analysis to divide the urban residents into five groups, and variance analysis to explore differences in the choice of measures between the cluster groups. We draw the following conclusions: the risk of deterioration of the ecological environment, the risk of economic damage, the risk of damage to mental health, the risk of damage to physical health and the risk of damage to political harmony are the main risks of climate change for urban residents; individuals and families developing good habits, businesses and governments strengthening energy conservation, schools and other agencies carrying out propaganda and education, carrying out multi-agent environment improvement, and learning from the West are their recognized risk reduction measures. Depending on the perceived risk, the urban residents are clustered into five groups: those concerned about the body and politics, those concerned about mental health, those concerned about economic development, those concerned about ecological safety, and those who ignore climate change. On the roles of the individual and the family, business and government in environmental protection, the different groups have unanimous views, while for other measures, the groups have different understandings.
    It is concluded that individuals and families should develop environmentally friendly habits, the government should strengthen regulation, businesses should take environmental responsibility, schools should strengthen publicity and education, and exploring

  8. Reliability analysis of a wastewater treatment plant using fault tree analysis and Monte Carlo simulation.

    Science.gov (United States)

    Taheriyoun, Masoud; Moradinejad, Saber

    2015-01-01

    The reliability of a wastewater treatment plant is a critical issue when the effluent is reused or discharged to water resources. The main factors affecting the performance of a wastewater treatment plant are the variation of the influent, inherent variability in the treatment processes, deficiencies in design, mechanical equipment, and operational failures. Thus, meeting the established reuse/discharge criteria requires assessment of plant reliability. Among the many techniques developed in system reliability analysis, fault tree analysis (FTA) is one of the most popular and efficient methods. FTA is a top-down, deductive failure analysis in which an undesired state of a system is analyzed. In this study, the problem of reliability was studied on the Tehran West Town wastewater treatment plant. This plant is a conventional activated sludge process, and the effluent is reused in landscape irrigation. The fault tree diagram was established with the violation of allowable effluent BOD as the top event, and the deficiencies of the system were identified based on the developed model. Some basic events are operator mistakes, physical damage, and design problems. The analytical methods are minimal cut sets (based on numerical probability) and Monte Carlo simulation. Basic event probabilities were calculated according to available data and experts' opinions. The results showed that human factors, especially human error, had a great effect on top event occurrence. The mechanical, climate, and sewer system factors were in the subsequent tier. The literature shows that FTA has seldom been used in past wastewater treatment plant (WWTP) risk analysis studies. Thus, the FTA model developed in this study considerably improves the insight into causal failure analysis of a WWTP. It provides an efficient tool for WWTP operators and decision makers to achieve the standard limits in wastewater reuse and discharge to the environment.
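The minimal-cut-set and Monte Carlo evaluation of a top event can be sketched as follows. The basic events, probabilities, and cut sets below are hypothetical illustrations, not the Tehran plant's model:

```python
import random

# hypothetical minimal cut sets for an "effluent BOD violation" top event;
# each basic event has an annual occurrence probability
probs = {"operator_error": 0.05, "aeration_failure": 0.02,
         "clarifier_damage": 0.01, "design_deficiency": 0.005}
cut_sets = [{"operator_error", "aeration_failure"},
            {"clarifier_damage"},
            {"operator_error", "design_deficiency"}]

def top_event_mc(probs, cut_sets, n=200_000, seed=42):
    """Monte Carlo estimate of the top-event probability: the top event
    occurs when every basic event of at least one minimal cut set occurs."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        state = {e: rng.random() < p for e, p in probs.items()}
        if any(all(state[e] for e in cs) for cs in cut_sets):
            hits += 1
    return hits / n

p_top = top_event_mc(probs, cut_sets)
```

Simulation correctly handles basic events shared between cut sets (here, operator error), where the simple rare-event sum of cut-set probabilities would double-count.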

  9. Non-probabilistic fuzzy reliability analysis of pile foundation stability by interval theory

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Randomness and fuzziness are among the attributes of the influential factors for stability assessment of pile foundations. According to these two characteristics, the triangular fuzzy number analysis approach was introduced to determine the probability-distributed function of mechanical parameters. The functional function for reliability analysis was then constructed based on the study of the bearing mechanism of the pile foundation, and a way to calculate interval values of the functional function was developed using an improved interval-truncation approach and operation rules of interval numbers. Afterwards, the non-probabilistic fuzzy reliability analysis method was applied to assess the pile foundation, yielding a method for non-probabilistic fuzzy reliability analysis of pile foundation stability by interval theory. Finally, the probability distribution curve of the non-probabilistic fuzzy reliability indexes of a practical pile foundation was obtained. Its failure possibility is 0.91%, which shows that the pile foundation is stable and reliable.
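Operation rules of interval numbers and a non-probabilistic reliability index can be sketched as follows. The resistance/load intervals are hypothetical, and eta = center/radius is one common interval reliability index, not necessarily the paper's exact formulation:

```python
def interval_add(x, y):
    """[a, b] + [c, d] = [a + c, b + d]"""
    return (x[0] + y[0], x[1] + y[1])

def interval_sub(x, y):
    """[a, b] - [c, d] = [a - d, b - c]"""
    return (x[0] - y[1], x[1] - y[0])

def interval_mul(x, y):
    """[a, b] * [c, d]: min and max over all endpoint products."""
    ps = [a * b for a in x for b in y]
    return (min(ps), max(ps))

# hypothetical limit state g = R - S for a pile, with resistance R and
# load effect S known only as intervals (kN)
R = (900.0, 1200.0)
S = (500.0, 700.0)
g = interval_sub(R, S)

# non-probabilistic reliability index eta = center / radius of g:
# eta > 1 means the whole interval of g lies above zero
center = 0.5 * (g[0] + g[1])
radius = 0.5 * (g[1] - g[0])
eta = center / radius
```

Note the subtraction rule pairs opposite endpoints; this endpoint bookkeeping (plus truncation to control interval growth) is what the abstract's "improved interval-truncation approach" addresses.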

  10. Structural Reliability Analysis for Implicit Performance with Legendre Orthogonal Neural Network Method

    Institute of Scientific and Technical Information of China (English)

    Lirong Sha; Tongyu Wang

    2016-01-01

    In order to evaluate the failure probability of a complicated structure, the structural responses usually need to be estimated by some numerical analysis method such as the finite element method (FEM). The response surface method (RSM) can be used to reduce the computational effort required for reliability analysis when the performance functions are implicit. However, the conventional RSM is time-consuming or cumbersome if the number of random variables is large. This paper proposes a Legendre orthogonal neural network (LONN)-based RSM to estimate the structural reliability. In this method, the relationship between the random variables and structural responses is established by a LONN model. Then the LONN model is connected to a reliability analysis method, i.e. the first-order reliability method (FORM), to calculate the failure probability of the structure. Numerical examples show that the proposed approach is applicable to structural reliability analysis, as well as to structures with implicit performance functions.

  11. Reliability and Sensitivity Analysis of Cast Iron Water Pipes for Agricultural Food Irrigation

    Directory of Open Access Journals (Sweden)

    Yanling Ni

    2014-07-01

    Full Text Available This study aims to investigate the reliability and sensitivity of cast iron water pipes for agricultural food irrigation. The Monte Carlo simulation method is used for fracture assessment and reliability analysis of cast iron pipes for agricultural food irrigation. Fracture toughness is considered as a limit state function for corrosion-affected cast iron pipes. Then the influence of failure mode on the probability of pipe failure is discussed. Sensitivity analysis is also carried out to show the effect of changing basic parameters on the reliability and lifetime of the pipe. The analysis results show that the applied methodology can consider different random variables for estimating the lifetime of the pipe and can also provide scientific guidance for rehabilitation and maintenance plans for agricultural food irrigation. In addition, the results of the failure and reliability analysis in this study can be useful for the design of more reliable new pipeline systems for agricultural food irrigation.

  12. Financial development and economic growth. An empirical analysis for Ireland

    Directory of Open Access Journals (Sweden)

    Antonios Adamopoulos

    2010-07-01

    Full Text Available This study investigated the relationship between financial development and economic growth for Ireland for the period 1965-2007 using a vector error correction model (VECM). Questions were raised whether financial development causes economic growth or reversely, taking into account the positive effect of the industrial production index. Financial market development is estimated by the effect of credit market development and stock market development on economic growth. The objective of this study was to examine the long-run relationship between these variables applying the Johansen cointegration analysis, taking into account the maximum eigenvalue and trace statistics tests. Granger causality tests indicated that economic growth causes credit market development, while there is a bilateral causal relationship between stock market development and economic growth. Therefore, it can be inferred that economic growth has a positive effect on stock market development and credit market development, taking into account the positive effect of industrial production growth on economic growth for Ireland.

  13. Empirical analysis of the effects of cyber security incidents.

    Science.gov (United States)

    Davis, Ginger; Garcia, Alfredo; Zhang, Weide

    2009-09-01

    We analyze the time series associated with web traffic for a representative set of online businesses that have suffered widely reported cyber security incidents. Our working hypothesis is that cyber security incidents may prompt (security conscious) online customers to opt out and conduct their business elsewhere or, at the very least, to refrain from accessing online services. For companies relying almost exclusively on online channels, this presents an important business risk. We test for structural changes in these time series that may have been caused by these cyber security incidents. Our results consistently indicate that cyber security incidents do not affect the structure of web traffic for the set of online businesses studied. We discuss various public policy considerations stemming from our analysis.
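
The abstract tests for structural changes without naming a test; one classical option for a known candidate break date is a Chow test, which compares the fit of one pooled regression against separate regressions before and after the break. The sketch below is a generic illustration with invented data, not the authors' method:

```python
def _rss(xs, ys):
    # residual sum of squares of a simple linear regression y = a + b*x
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

def chow_f(xs, ys, brk, k=2):
    """Chow F-statistic for a structural break at index brk (k = params per model)."""
    rss_pooled = _rss(xs, ys)
    rss_split = _rss(xs[:brk], ys[:brk]) + _rss(xs[brk:], ys[brk:])
    df2 = len(ys) - 2 * k
    return ((rss_pooled - rss_split) / k) / (rss_split / df2)
```

Here a large F at the incident date would indicate a structural change in the traffic series; the abstract reports they found none.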

  14. Modeling for Determinants of Human Trafficking: An Empirical Analysis

    Directory of Open Access Journals (Sweden)

    Seo-Young Cho

    2015-02-01

    Full Text Available This study aims to identify robust push and pull factors of human trafficking. I test for the robustness of 70 push and 63 pull factors suggested in the literature. In doing so, I employ an extreme bound analysis, running more than two million regressions with all possible combinations of variables for up to 153 countries during the period of 1995–2010. My results show that crime prevalence robustly explains human trafficking both in destination and origin countries. Income level also has a robust impact, suggesting that the cause of human trafficking shares that of economic migration. Law enforcement matters more in origin countries than destination countries. Interestingly, a very low level of gender equality may have constraining effects on human trafficking outflow, possibly because gender discrimination limits female mobility that is necessary for the occurrence of human trafficking.
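
An extreme bound analysis of the kind described regresses the outcome on a focus variable together with every combination of control variables, then inspects the range of the focus coefficient; if the bounds stay on one side of zero, the variable is called robust. A minimal plain-OLS sketch with simulated data (`extreme_bounds` and all values are illustrative):

```python
import itertools
import numpy as np

def extreme_bounds(y, X, focus, controls, subset_size=2):
    """Leamer-style EBA sketch: regress y on the focus column of X plus every
    subset of control columns; return (min, max) of the focus coefficient."""
    lo, hi = np.inf, -np.inf
    for combo in itertools.combinations(controls, subset_size):
        Z = np.column_stack([np.ones(len(y)), X[:, focus]]
                            + [X[:, j] for j in combo])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        lo, hi = min(lo, beta[1]), max(hi, beta[1])
    return lo, hi
```

With 70 push factors the full study runs millions of such regressions; the loop structure is the same.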

  15. Empirical Analysis of Agricultural Production Efficiency in Shaanxi Province

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    This article analyses the agricultural production efficiency of all cities and areas in Shaanxi Province in the period 2006-2009 using the data envelopment analysis method, and compares the agricultural production efficiency between all cities and areas. The results show that the agricultural production efficiency and scale efficiency of agriculture of Shaanxi Province are high on the whole, but the efficiency of agricultural technology is very low; agricultural development still relies on factor inputs, and the driving role of technological progress is not conspicuous. Finally, the following countermeasures are put forward to promote agricultural productivity in Shaanxi Province: improve the construction of agricultural infrastructure, and increase agricultural input; accelerate the project of extending agricultural technology into households, and promote the conversion and use rate of agricultural scientific and technological achievements; establish and improve the industrial system of agriculture, and speed up the building of various agricultural cooperative economic organizations.

  16. Empirical Analysis of Bagged SVM Classifier for Data Mining Applications

    Directory of Open Access Journals (Sweden)

    M.Govindarajan

    2013-11-01

    Full Text Available Data mining is the use of algorithms to extract information and patterns derived by the knowledge discovery in databases process. Classification maps data into predefined groups or classes. It is often referred to as supervised learning because the classes are determined before examining the data. The feasibility and the benefits of the proposed approaches are demonstrated by means of data mining applications like intrusion detection, direct marketing, and signature verification. A variety of techniques have been employed for analysis, ranging from traditional statistical methods to data mining approaches. Bagging and boosting are two relatively new but popular methods for producing ensembles. In this work, bagging is evaluated on real and benchmark data sets of intrusion detection, direct marketing, and signature verification in conjunction with SVM as the base learner. The proposed bagged SVM is superior to the individual approach for data mining applications in terms of classification accuracy.
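
Bagging trains each ensemble member on a bootstrap resample of the training set and combines predictions by majority vote. To keep the sketch dependency-free, a one-dimensional decision stump stands in for the SVM base learner; the resample-and-vote structure is what the abstract describes:

```python
import random

def stump_fit(data):
    """Fit a decision stump to data = [(x, label)] with scalar x, labels {0, 1}."""
    best = None
    for thr in sorted(set(x for x, _ in data)):
        for sign in (1, -1):
            acc = sum(1 for x, y in data
                      if (1 if sign * (x - thr) > 0 else 0) == y)
            if best is None or acc > best[0]:
                best = (acc, thr, sign)
    _, thr, sign = best
    return lambda x: 1 if sign * (x - thr) > 0 else 0

def bagged_fit(data, n_models=25, seed=7):
    """Bagging: train each stump on a bootstrap resample, predict by majority vote."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        boot = [rng.choice(data) for _ in data]
        models.append(stump_fit(boot))
    return lambda x: 1 if sum(m(x) for m in models) * 2 > len(models) else 0
```

Swapping the stump for any stronger learner (an SVM, in the paper's case) changes nothing in the bagging loop itself.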

  17. Empirical Analysis II: Business Cycles and Inward FDI in China

    Directory of Open Access Journals (Sweden)

    Yahya Sharahili

    2008-01-01

    Full Text Available It is well known that the current speeding-up of globalization has been, on one hand, spreading macroeconomic effects around the world, while, on the other, fueling firms' activities of crossing national borders. Are there any links between these two influences? As we concluded in previous research that inward FDI and business cycle development are pro-cyclically related in the Granger sense, this paper further discusses how they react to each other. Again, we chose China as the subject and employed the official annual statistical data for 1983-2004. By constructing an endogenous growth model, we, after performing correlation analysis and testing the coefficient significance of each variable, identified the original momentum of Chinese economic growth and explored whether a long-term relationship exists through the Johansen co-integration test.

  18. STOCK MARKET DEVELOPMENT AND ECONOMIC GROWTH AN EMPIRICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Vazakidis Athanasios

    2012-01-01

    Full Text Available This study investigated the causal relationship between stock market development and economic growth for Greece for the period 1978-2007 using a Vector Error Correction Model (VECM). Questions were raised whether stock market development causes economic growth, taking into account the negative effect of the interest rate on stock market development. The purpose of this study was to investigate the short-run and the long-run relationship between the examined variables applying the Johansen co-integration analysis. To achieve this objective, unit root tests were carried out for all time series data in their levels and their first differences. Johansen co-integration analysis was applied to examine whether the variables are co-integrated of the same order, taking into account the maximum eigenvalue and trace statistics tests. Finally, a vector error correction model was selected to investigate the long-run relationship between stock market development and economic growth. A short-run increase of economic growth of 1% induced an increase of the stock market index of 0.41% in Greece, while an increase of the interest rate of 1% induced a decrease of the stock market index of 1.42% in Greece. The estimated coefficient of the error correction term was statistically significant and had a negative sign, confirming the long-run equilibrium relationship between the examined variables. The results of the Granger causality tests indicated a unidirectional causality between stock market development and economic growth, with direction from economic growth to stock market development, and a unidirectional causal relationship between economic growth and the interest rate, with direction from economic growth to the interest rate. Therefore, it can be inferred that economic growth has a direct positive effect on stock market development, while the interest rate has a negative effect on stock market development and economic growth respectively.

  19. Reliability of the ATD Angle in Dermatoglyphic Analysis.

    Science.gov (United States)

    Brunson, Emily K; Hohnan, Darryl J; Giovas, Christina M

    2015-09-01

    The "ATD" angle is a dermatoglyphic trait formed by drawing lines between the triradii below the first and last digits and the most proximal triradius on the hypothenar region of the palm. This trait has been widely used in dermatoglyphic studies, but several researchers have questioned its utility, specifically whether or not it can be measured reliably. The purpose of this research was to examine the measurement reliability of this trait. Finger and palm prints were taken using the carbon paper and tape method from the right and left hands of 100 individuals. Each "ATD" angle was read twice, at different times, by Reader A, using a goniometer and a magnifying glass, and three times by Reader B, using Adobe Photoshop. Intraclass correlation coefficients were estimated for the intra- and inter-reader measurements of the "ATD" angles. Reader A was able to quantify ATD angles on 149 out of 200 prints (74.5%), and Reader B on 179 out of 200 prints (89.5%). Both readers agreed on whether an angle existed on a print 89.8% of the time for the right hand and 78.0% for the left. Intra-reader correlations were 0.97 or greater for both readers. Inter-reader correlations for "ATD" angles measured by both readers ranged from 0.92 to 0.96. These results suggest that the "ATD" angle can be measured reliably, and further imply that measurement using a software program may provide an advantage over other methods.
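
Inter-reader agreement of the kind reported here is typically quantified with an intraclass correlation. Below is a minimal ICC(2,1) computation (two-way random effects, absolute agreement, single rater) from the classical ANOVA mean squares; the ratings matrix is invented, not the study's data:

```python
def icc2_1(ratings):
    """ICC(2,1) for a subjects-by-raters matrix of ratings."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    col_means = [sum(ratings[i][j] for i in range(n)) / n for j in range(k)]
    ms_r = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)   # subjects
    ms_c = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)   # raters
    sse = sum((ratings[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k))
    ms_e = sse / ((n - 1) * (k - 1))                                # residual
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)
```

Values near 1, like the 0.92-0.97 range reported above, indicate that reader differences are small relative to between-subject variation.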

  20. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    DEFF Research Database (Denmark)

    Barberis Negra, Nicola; Bak-Jensen, Birgitte; Holmstrøm, O.

    2007-01-01

    a significant role in this assessment and different models have been created for it, but a representation which includes all of them has not been developed yet. This paper deals with this issue. First, a list of nine influencing Factors is presented and discussed. Secondly, these Factors are included...... in a reliability model and the generation of a windfarm is evaluated by means of sequential Monte Carlo simulation. Results are used to analyse how each mentioned Factor influences the assessment, and why and when they should be included in the model....
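
Sequential Monte Carlo simulation of generation availability, as used in the paper, can be sketched by alternating exponentially distributed up and down times for a single unit over a mission horizon. The MTTF/MTTR figures below are placeholders, not values from the paper:

```python
import random

def simulate_availability(mttf=1500.0, mttr=80.0, horizon=8760.0, seed=3):
    """Sequential Monte Carlo of one turbine's up/down cycle over a year (hours).
    mttf/mttr are illustrative mean time to failure / repair, in hours."""
    rng = random.Random(seed)
    t, up_time, state_up = 0.0, 0.0, True
    while t < horizon:
        mean = mttf if state_up else mttr
        dur = min(rng.expovariate(1.0 / mean), horizon - t)
        if state_up:
            up_time += dur
        t += dur
        state_up = not state_up
    return up_time / horizon
```

Averaging many such yearly histories converges toward the steady-state availability mttf / (mttf + mttr); the paper layers additional influencing factors (wind, grid, wakes) on top of this chronological core.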

  2. Reliability Analysis of Timber Structures through NDT Data Upgrading

    DEFF Research Database (Denmark)

    Sousa, Hélder; Sørensen, John Dalsgaard; Kirkegaard, Poul Henning

    for reliability calculation. In chapter 4, updating methods are conceptualized and defined. Special attention is drawn upon Bayesian methods and its implementation. Also a topic for updating based in inspection of deterioration is provided. State of the art definitions and proposed measurement indices......The first part of this document presents, in chapter 2, a description of timber characteristics and common used NDT and MDT for timber elements. Stochastic models for timber properties and damage accumulation models are also referred. According to timber’s properties a framework is proposed...

  3. A disjoint algorithm for seismic reliability analysis of lifeline networks

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The algorithm is based on constructing a disjoint set of the minimal paths in a network system. In this paper, cubic notation was used to describe the logic function of a network in a well-balanced state, and then the sharp-product operation was used to construct the disjoint minimal path set of the network. A computer program has been developed, and when combined with decomposition technology, the reliability of a general lifeline network can be effectively and automatically calculated.
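
Once a minimal path set is known, system reliability follows from the paths; for a small system it can even be computed directly by inclusion-exclusion over path unions (the disjoint-products construction in the paper avoids the exponential term count). A small sketch with invented component reliabilities:

```python
from itertools import combinations

def path_set_reliability(paths, p):
    """System reliability from minimal path sets via inclusion-exclusion.
    paths: list of sets of component ids; p: dict id -> component reliability."""
    total = 0.0
    for r in range(1, len(paths) + 1):
        for combo in combinations(paths, r):
            union = set().union(*combo)   # components that must all work
            term = 1.0
            for c in union:
                term *= p[c]
            total += (-1) ** (r + 1) * term
    return total
```

For two parallel single-component paths with reliability 0.9 each, this gives 1 - 0.1² = 0.99, as expected.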

  4. Reliability and maintenance analysis of the CERN PS booster

    CERN Document Server

    Staff, P S B

    1977-01-01

    The PS Booster Synchrotron being a complex accelerator with four superposed rings and substantial additional equipment for beam splitting and recombination, doubts were expressed at the time of project authorization as to its likely operational reliability. For 1975 and 1976, the average down time was 3.2% (at least one ring off) or 1.5% (all four rings off). The items analysed are: operational record, design features, maintenance, spare parts policy, operating temperature, effects of thunderstorms, fault diagnostics, role of operations staff and action by experts. (15 refs).

  5. Reliability analysis of the bulk cargo loading system including dependent components

    Science.gov (United States)

    Blokus-Roszkowska, Agnieszka

    2016-06-01

    In the paper an innovative approach to the reliability analysis of multistate series-parallel systems assuming their components' dependency is presented. The reliability function of a multistate series system with components dependent according to the local load sharing rule is determined. Linking these results for series systems with results for parallel systems with independent components, we obtain the reliability function of a multistate series-parallel system assuming dependence of components' departures from the reliability states subsets in the series subsystems and assuming independence between these subsystems. As a particular case, the reliability function of a multistate series-parallel system composed of dependent components having exponential reliability functions is fixed. The theoretical results are applied to the reliability evaluation of a bulk cargo transportation system, whose main task is to load bulk cargo on board ships. The reliability function and other reliability characteristics of the loading system are determined in the case where its components have exponential reliability functions with interdependent departure rates from the subsets of their reliability states. Finally, the obtained results are compared with results for the bulk cargo transportation system composed of independent components.
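
For the baseline case of independent exponential components, the series and parallel building blocks reduce to closed forms: R_series(t) = exp(-Σλᵢ·t) and R_parallel(t) = 1 - Π(1 - exp(-λᵢ·t)). The sketch below composes them for a series-parallel structure; the failure rates are placeholders, and the paper's dependent-component (load-sharing) model is considerably more involved:

```python
import math

def series_reliability(rates, t):
    # independent exponential components in series: R(t) = exp(-sum(lambda_i) * t)
    return math.exp(-sum(rates) * t)

def parallel_reliability(rates, t):
    # independent exponential components in parallel:
    # R(t) = 1 - prod(1 - exp(-lambda_i * t))
    prod = 1.0
    for lam in rates:
        prod *= 1.0 - math.exp(-lam * t)
    return 1.0 - prod

def series_parallel_reliability(subsystems, t):
    """Series connection of parallel subsystems, each a list of failure rates."""
    r = 1.0
    for rates in subsystems:
        r *= parallel_reliability(rates, t)
    return r
```

Comparing such independent-component results with the dependent-component model is exactly the final comparison the abstract describes.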

  6. The Effect of Shocks: An Empirical Analysis of Ethiopia

    Directory of Open Access Journals (Sweden)

    Yilebes Addisu Damtie

    2015-07-01

    Full Text Available Besides striving for the increase of production and development, it is also necessary to reduce the losses created by shocks. The people of Ethiopia are exposed to the impact of both natural and man-made shocks. Following this, policy makers, governmental and non-governmental organizations need to identify the important shocks and their effects and use them as an input. This study was conducted to identify the food insecurity shocks and to estimate their effects based on the conceptual framework developed in Ethiopia, Amhara National Regional State of Libo Kemkem District. Descriptive statistical analysis, multiple regression, binary logistic regression, chi-squared and independent sample t-tests were used as data analysis techniques. The results showed eight shocks affecting households: weather variability, weed infestation, plant insect and pest infestation, soil fertility problems, animal disease and epidemics, human disease and epidemics, price fluctuation problems and conflict. Weather variability, plant insect and pest infestation, weed infestation, and animal disease and epidemics created mean losses of 3,821.38, 886.06, 508.04 and 1,418.32 Birr, respectively. In addition, human disease and epidemics, price fluctuation problems and conflict affected 68.11%, 88.11% and 14.59% of households, respectively. Among the sample households 28.1% were not able to meet their food needs throughout the year while 71.9% could. The multiple regression models revealed that weed existence (β = –0.142, p < 0.05), plant insect and pest infestation (β = –0.279, p < 0.01) and soil fertility problems (β = –0.321, p < 0.01) had significant effects on income. Assets were found to be significantly affected by plant insect and pest infestation (β = –0.229, p < 0.01), human disease and epidemics (β = 0.145, p < 0.05), and soil fertility problems (β = –0.317, p < 0.01), while food production was affected by soil fertility problems (β = –0.314, p < 0.01). Binary logistic

  7. Using a Hybrid Cost-FMEA Analysis for Wind Turbine Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Nacef Tazi

    2017-02-01

    Full Text Available Failure mode and effects analysis (FMEA) has been proven to be an effective methodology to improve system design reliability. However, the standard approach reveals some weaknesses when applied to wind turbine systems. The conventional criticality assessment method has been criticized as having many limitations, such as the weighting of severity and detection factors. In this paper, we aim to overcome these drawbacks and develop a hybrid cost-FMEA by integrating cost factors into the criticality assessment; these costs vary from replacement costs to expected failure costs. Then, a quantitative comparative study is carried out to point out average failure rate, main cause of failure, expected failure costs and failure detection techniques. A special reliability analysis of the gearbox and rotor blades is presented.
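
The hybrid cost-FMEA idea of replacing severity/detection weightings with money can be sketched as an expected annual cost per failure mode, rate × (replacement + downtime cost), used for ranking. All names and figures below are invented for illustration, not the paper's data:

```python
def cost_criticality(modes):
    """Rank failure modes by expected annual cost = rate * (repair + downtime).
    modes: list of dicts with keys name, rate (failures/yr), repair_cost, downtime_cost."""
    def expected_cost(m):
        return m["rate"] * (m["repair_cost"] + m["downtime_cost"])
    ranked = sorted(modes, key=expected_cost, reverse=True)
    return [(m["name"], expected_cost(m)) for m in ranked]
```

A mode with a low failure rate but very expensive consequences (the gearbox, in wind turbine practice) can outrank frequent but cheap failures, which is the behavior the cost-based criticality is meant to capture.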

  8. An empirical study of tourist preferences using conjoint analysis

    Directory of Open Access Journals (Sweden)

    Tripathi, S.N.

    2010-01-01

    Full Text Available Tourism and hospitality have become key global economic activities as expectations with regard to our use of leisure time have evolved, attributing greater meaning to our free time. While the growth in tourism has been impressive, India's share in total global tourism arrivals and earnings is quite insignificant. It is an accepted fact that India has tremendous potential for the development of tourism. This anomaly and the various underlying factors responsible for it are the focus of our study. The objective is to determine customer preferences for multi-attribute hybrid services like tourism, so as to enable the state tourism board to deliver a desired combination of intrinsic attributes, helping it to create a sustainable competitive advantage, leading to greater customer satisfaction and positive word of mouth. Conjoint analysis has been used for this purpose, which estimates the structure of a consumer's preferences, given his/her overall evaluations of a set of alternatives that are pre-specified in terms of levels of different attributes.

  9. EMPIRICAL ANALYSIS OF EMPLOYEES WITH TERTIARY EDUCATION OCCUPATIONAL IMBALANCES

    Directory of Open Access Journals (Sweden)

    Andrei B. Ankudinov

    2013-01-01

    Full Text Available A high percentage of graduates with university degrees (among the highest in the world) combined with an unacceptably low level of utilization of acquired qualifications and the generally low quality of education in the majority of Russian universities result in huge structural imbalances. The article presents quantitative estimates of the disproportions between the educational levels of employees with higher education and their professional occupations for different branches of the economy. A logit-model-based analysis is performed of how well the professional functions performed by employees match their professional qualifications and levels of education, and of the extent to which their previously acquired expertise and skills are put into use. The sample used represents the working population of Russia with tertiary education. The results obtained lead to the conclusion that the worst disproportions between education levels attained and job requirements are observed in the trade and services sector, transport and communications, housing and utilities, and the consumer goods and food industries. Matters are compounded by an inert and informationally inefficient labor market, incapable of sending proper signals to the national system of tertiary education.

  10. Empirical Analysis on CSR Communication in Romania: Transparency and Participation

    Directory of Open Access Journals (Sweden)

    Irina-Eugenia Iamandi

    2012-12-01

    Full Text Available In the specific field of corporate social responsibility (CSR), the participation of companies in supporting social and environmental issues is mainly analysed and/or measured based on their CSR communication policy; in this way, the transparency of CSR reporting procedures is one of the most pressing challenges for researchers and practitioners in the field. The main research objective of the present paper is to distinguish between different types of CSR participation by identifying the reasons behind CSR communication for a series of companies acting on the Romanian market. The descriptive analysis, conducted both at integrated and corporate level for the Romanian companies, took into account five main CSR communication related issues: CSR site, CSR report, CSR listing, CSR budget and CSR survey. The results highlight both the declarative/prescriptive and practical/descriptive perspectives of CSR communication in Romania, showing that the Romanian CSR market is reaching its full maturity. In more specific terms, the majority of the investigated companies are already using different types of CSR participation, marking the transition from CSR just for commercial purposes to CSR for long-term strategic use. The achieved results are broadly analysed in the paper and specific conclusions are emphasized.

  11. An empirical analysis of the Ebola outbreak in West Africa

    Science.gov (United States)

    Khaleque, Abdul; Sen, Parongama

    2017-02-01

    The data for the Ebola outbreak that occurred in 2014-2016 in three countries of West Africa are analysed within a common framework. The analysis is made using the results of an agent based Susceptible-Infected-Removed (SIR) model on a Euclidean network, where nodes at a distance l are connected with probability P(l) ∝ l^(-δ), with δ determining the range of the interaction, in addition to nearest neighbors. The cumulative (total) density of infected population here has the form R(t) = a exp(t/T)/[1 + c exp(t/T)], where the parameters depend on δ and the infection probability q. This form is seen to fit well with the data. Using the best fitting parameters, the time at which the peak is reached is estimated and is shown to be consistent with the data. We also show that in the Euclidean model, one can choose δ and q values which reproduce the data for the three countries qualitatively. These choices are correlated with population density, control schemes and other factors. Comparing the real data and the results from the model one can also estimate the size of the actual population susceptible to the disease. Rescaling the real data, a reasonably good quantitative agreement with the simulation results is obtained.

  12. An empirical analysis of the Ebola outbreak in West Africa

    CERN Document Server

    Khaleque, Abdul

    2016-01-01

    The data for the Ebola outbreak that occurred in 2014-2016 in three countries of West Africa are analysed within a common framework. The analysis is made using the results of an agent based Susceptible-Infected-Removed (SIR) model on a Euclidean network, where nodes at a distance $l$ are connected with probability $P(l) \\propto l^{-\\delta }$ in addition to nearest neighbors. The cumulative density of infected population here has the form $R(t) = \\frac{a\\exp(t/T)}{1+c\\exp(t/T)}$, where the parameters depend on $\\delta$ and the infection probability $q$. This form is seen to fit well with the data. Using the best fitting parameters, the time at which the peak is reached is estimated and is shown to be consistent with the data. We also show that in the Euclidean model, one can choose $\\delta$ and $q$ values which reproduce the data for the three countries qualitatively. These choices are correlated with population density, control schemes and other factors.
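
A small consequence of the closed form above: differentiating R(t) = a·exp(t/T)/(1 + c·exp(t/T)) shows the growth rate dR/dt peaks when c·exp(t/T) = 1, i.e. at t* = T·ln(1/c), which is how a fitted curve yields a peak-time estimate. The sketch below checks this numerically; the parameter values are arbitrary, not the fitted ones from the paper:

```python
import math

def R(t, a, c, T):
    """Cumulative infected density R(t) = a*exp(t/T) / (1 + c*exp(t/T))."""
    e = math.exp(t / T)
    return a * e / (1 + c * e)

def peak_time(c, T):
    # dR/dt is maximal when c*exp(t/T) == 1, i.e. t* = T * ln(1/c)
    return T * math.log(1.0 / c)
```

As t grows, R(t) saturates at a/c, the total size of the affected population implied by the fit.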

  13. An empirical analysis of ERP adoption by oil and gas firms

    Science.gov (United States)

    Romero, Jorge

    2005-07-01

    Despite the growing popularity of enterprise-resource-planning (ERP) systems for the information technology infrastructure of large and medium-sized businesses, there is limited empirical evidence on the competitive benefits of ERP implementations. Case studies of individual firms provide insights but do not provide sufficient evidence to draw reliable inferences and cross-sectional studies of firms in multiple industries provide a broad-brush perspective of the performance effects associated with ERP installations. To narrow the focus to a specific competitive arena, I analyze the impact of ERP adoption on various dimensions of performance for firms in the Oil and Gas Industry. I selected the Oil and Gas Industry because several companies installed a specific type of ERP system, SAP R/3, during the period from 1990 to 2002. In fact, SAP was the dominant provider of enterprise software to oil and gas companies during this period. I evaluate performance of firms that implemented SAP R/3 relative to firms that did not adopt ERP systems in the pre-implementation, implementation and post-implementation periods. My analysis takes two different approaches, the first from a financial perspective and the second from a strategic perspective. Using the Sloan (General Motors) model commonly applied in financial statement analysis, I examine changes in performance for ERP-adopting firms versus non-adopting firms along the dimensions of asset utilization and return on sales. Asset utilization is more closely aligned with changes in leanness of operations, and return on sales is more closely aligned with customer-value-added. I test hypotheses related to the timing and magnitude of the impact of ERP implementation with respect to leanness of operations and customer value added. I find that SAP-adopting companies performed relatively better in terms of asset turnover than non-SAP-adopting companies during both the implementation and post-implementation periods and that SAP

  14. Investigation for Ensuring the Reliability of the MELCOR Analysis Results

    Energy Technology Data Exchange (ETDEWEB)

    Sung, Joonyoung; Maeng, Yunhwan; Lee, Jaeyoung [Handong Global Univ., Pohang (Korea, Republic of)

    2015-10-15

    Flow rate could also be a main factor to be proven, because it plays a role in maintaining thermal balance through heat transfer inside the fuel assembly. Some concerns about the reliability of MELCOR results were raised in the 2nd technical report of the NSRC project. In order to confirm whether the MELCOR results are dependable, experimental data from phase 1 of the Sandia Fuel Project were used as a reference. In a Spent Fuel Pool (SFP) severe accident, especially in the cases of boil-off, partial loss of coolant accident, and complete loss of coolant accident, heat source and flow rate are the main points in analyzing the MELCOR results. The heat source is composed of decay heat and oxidation heat. Because an accumulating heat source can continuously raise the cladding temperature, generate oxidation heat, and ultimately lead to a zirconium fire, it is a main factor to be confirmed. This work investigates the reliability of MELCOR results in order to confirm the physical phenomena occurring in an SFP severe accident. Most results showed that MELCOR output differed significantly under minute changes of the main parameters in identical conditions. Therefore, it is necessary that oxidation coefficients be chosen so as to delineate the real phenomena as closely as possible.

  15. Reliability analysis on a shell and tube heat exchanger

    Science.gov (United States)

    Lingeswara, S.; Omar, R.; Mohd Ghazi, T. I.

    2016-06-01

    A reliability study of a shell and tube heat exchanger was conducted using past history data from a carbon black manufacturing plant. Heat exchanger reliability studies are vital in all related industries, as inappropriate maintenance and operation of the heat exchanger will lead to major Process Safety Events (PSE) and loss of production. The overall heat exchanger coefficient/effectiveness (Uo) and Mean Time Between Failures (MTBF) were analyzed and calculated. The Aspen and downtime data were taken from a typical carbon black shell and tube heat exchanger manufacturing plant. From the Uo calculated and analyzed, it was observed that Uo declined over a period, caused by severe fouling and heat exchanger limitations. This limitation also requires a further burn-out period, which leads to loss of production. The MTBF calculated is 649.35 hours, which is very low compared to the standard 6000 hours for good operation of a shell and tube heat exchanger. Guidelines on heat exchanger repair and preventive and predictive maintenance were identified and highlighted for better heat exchanger inspection and repair in the future. Fouling of the heat exchanger and the production loss will continue if proper heat exchanger operation and repair using standard operating procedures are not followed.

  16. Does risk management contribute to IT project success? A meta-analysis of empirical evidence

    NARCIS (Netherlands)

    de Bakker, K.F.C.; Boonstra, A.; Wortmann, J.C.

    2010-01-01

    The question whether risk management contributes to IT project success has long been considered relevant by people from both the academic and practitioners' communities. This paper presents a meta-analysis of the empirical evidence that either supports or opposes the claim that risk management contributes to IT project success.

  17. Some Empirical Issues in Research on Academic Departments: Homogeneity, Aggregation, and Levels of Analysis.

    Science.gov (United States)

    Ramsey, V. Jean; Dodge, L. Delf

    1983-01-01

    The appropriateness of using academic departments as a level of analysis of organizational administration is examined. Factors analyzed include homogeneity of faculty responses to measures of organizational structure, environmental uncertainty, and task routineness. Results were mixed, demonstrating the importance of empirically testing rather…

  18. An ACE-based Nonlinear Extension to Traditional Empirical Orthogonal Function Analysis

    DEFF Research Database (Denmark)

    Hilger, Klaus Baggesen; Nielsen, Allan Aasbjerg; Andersen, Ole;

    2001-01-01

    This paper shows the application of the empirical orthogonal functions/principal component transformation on global sea surface height and temperature data from 1996 and 1997. A nonlinear correlation analysis of the transformed data is proposed and performed by applying the alternating conditional expectations (ACE) algorithm.

  19. Extended Analysis of Empirical Citations with Skinner's "Verbal Behavior": 1984-2004

    Science.gov (United States)

    Dixon, Mark R.; Small, Stacey L.; Rosales, Rocio

    2007-01-01

    The present paper comments on and extends the citation analysis of verbal operant publications based on Skinner's "Verbal Behavior" (1957) by Dymond, O'Hora, Whelan, and O'Donovan (2006). Variations in population parameters were evaluated for only those studies that Dymond et al. categorized as empirical. Preliminary results indicate that the…

  20. An empirical analysis of the relationship between the consumption of alcohol and liver cirrhosis mortality

    DEFF Research Database (Denmark)

    Bentzen, Jan Børsen; Smith, Valdemar

    The question whether intake of alcohol is associated with liver cirrhosis mortality is analyzed using aggregate data for alcohol consumption, alcohol related diseases and alcohol policies of 16 European countries. The empirical analysis gives support to a close association between cirrhosis...

  1. Image Retrieval: Theoretical Analysis and Empirical User Studies on Accessing Information in Images.

    Science.gov (United States)

    Ornager, Susanne

    1997-01-01

    Discusses indexing and retrieval for effective searches of digitized images. Reports on an empirical study about criteria for analysis and indexing digitized images, and the different types of user queries done in newspaper image archives in Denmark. Concludes that it is necessary that the indexing represent both a factual and an expressional…

  2. Steering the Ship through Uncertain Waters: Empirical Analysis and the Future of Evangelical Higher Education

    Science.gov (United States)

    Rine, P. Jesse; Guthrie, David S.

    2016-01-01

    Leaders of evangelical Christian colleges must navigate a challenging environment shaped by public concern about college costs and educational quality, federal inclinations toward increased regulation, and lingering fallout from the Great Recession. Proceeding from the premise that empirical analysis empowers institutional actors to lead well in…

  3. An Empirical Analysis of Life Jacket Effectiveness in Recreational Boating.

    Science.gov (United States)

    Viauroux, Christelle; Gungor, Ali

    2016-02-01

    This article gives a measure of life jacket (LJ) effectiveness in U.S. recreational boating. Using the U.S. Coast Guard's Boating Accident Report Database from 2008 to 2011, we find that LJ wear is one of the most important determinants influencing the number of recreational boating fatalities, together with the number of vessels involved, and the type and engine of the vessel(s). We estimate a decrease in the number of deceased per vessel of about 80% when the operator wears their LJs compared to when they do not. The odds of dying are 86% higher than average if the accident involves a canoe or kayak, but 80% lower than average when more than one vessel is involved in the accident and 34% lower than average when the operator involved in the accident has more than 100 hours of boating experience. Interestingly, we find that LJ effectiveness decreases significantly as the length of the boat increases and decreases slightly as water temperature increases. However, it increases slightly as the operator's age increases. We find that between 2008 and 2011, an LJ regulation that requires all operators to wear their LJs-representing a 20% increase in wear rate-would have saved 1,721 (out of 3,047) boaters or 1,234 out of 2,185 drowning victims. The same policy restricted to boats 16-30 feet in length would have saved approximately 778 victims. Finally, we find that such a policy would reduce the percentage of drowning victims compared to other causes of death. © 2015 Society for Risk Analysis.

  4. An Empirical Analysis of Rough Set Categorical Clustering Techniques

    Science.gov (United States)

    2017-01-01

    Clustering a set of objects into homogeneous groups is a fundamental operation in data mining. Recently, much attention has been paid to categorical data clustering, where data objects are made up of non-numerical attributes. For categorical data clustering, rough set based approaches such as Maximum Dependency Attribute (MDA) and Maximum Significance Attribute (MSA) have outperformed their predecessor approaches such as Bi-Clustering (BC), Total Roughness (TR) and Min-Min Roughness (MMR). This paper presents the limitations and issues of the MDA and MSA techniques on special types of data sets where both techniques fail to select, or face difficulty in selecting, their best clustering attribute. This analysis motivates the need for a better and more generalized rough set theory approach that can cope with the issues of MDA and MSA. Hence, an alternative technique named Maximum Indiscernible Attribute (MIA), which clusters categorical data using rough set indiscernibility relations, is proposed. The novelty of the proposed approach is that, unlike other rough set theory techniques, it uses the domain knowledge of the data set. It is based on the concept of the indiscernibility relation combined with a number of clusters. To show the significance of the proposed approach, the effect of the number of clusters on rough accuracy, purity and entropy is described in the form of propositions. Moreover, ten different data sets from previously utilized research cases and the UCI repository are used for experiments. The results, produced in tabular and graphical forms, show that the proposed MIA technique provides better performance in selecting the clustering attribute in terms of purity, entropy, iterations, time, accuracy and rough accuracy. PMID:28068344

  5. An Empirical Analysis of Rough Set Categorical Clustering Techniques.

    Science.gov (United States)

    Uddin, Jamal; Ghazali, Rozaida; Deris, Mustafa Mat

    2017-01-01

    Clustering a set of objects into homogeneous groups is a fundamental operation in data mining. Recently, much attention has been paid to categorical data clustering, where data objects are made up of non-numerical attributes. For categorical data clustering, rough set based approaches such as Maximum Dependency Attribute (MDA) and Maximum Significance Attribute (MSA) have outperformed their predecessor approaches such as Bi-Clustering (BC), Total Roughness (TR) and Min-Min Roughness (MMR). This paper presents the limitations and issues of the MDA and MSA techniques on special types of data sets where both techniques fail to select, or face difficulty in selecting, their best clustering attribute. This analysis motivates the need for a better and more generalized rough set theory approach that can cope with the issues of MDA and MSA. Hence, an alternative technique named Maximum Indiscernible Attribute (MIA), which clusters categorical data using rough set indiscernibility relations, is proposed. The novelty of the proposed approach is that, unlike other rough set theory techniques, it uses the domain knowledge of the data set. It is based on the concept of the indiscernibility relation combined with a number of clusters. To show the significance of the proposed approach, the effect of the number of clusters on rough accuracy, purity and entropy is described in the form of propositions. Moreover, ten different data sets from previously utilized research cases and the UCI repository are used for experiments. The results, produced in tabular and graphical forms, show that the proposed MIA technique provides better performance in selecting the clustering attribute in terms of purity, entropy, iterations, time, accuracy and rough accuracy.
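
    The rough-set notions these techniques build on can be sketched on a toy example (the data, attribute names, and target set below are invented, and this is not the MIA algorithm itself): indiscernibility classes with respect to an attribute, and the rough accuracy of approximating a target set by its lower and upper approximations.

    ```python
    from collections import defaultdict

    def indiscernibility_classes(objects, attribute):
        """Partition object names by their value of one categorical attribute."""
        classes = defaultdict(set)
        for name, attrs in objects.items():
            classes[attrs[attribute]].add(name)
        return list(classes.values())

    def rough_accuracy(classes, target):
        """|lower approximation| / |upper approximation| of target."""
        lower = set().union(*[c for c in classes if c <= target])
        upper = set().union(*[c for c in classes if c & target])
        return len(lower) / len(upper)

    objects = {
        "o1": {"color": "red",  "size": "big"},
        "o2": {"color": "red",  "size": "small"},
        "o3": {"color": "blue", "size": "big"},
        "o4": {"color": "blue", "size": "big"},
    }
    classes = indiscernibility_classes(objects, "color")   # {o1,o2}, {o3,o4}
    acc = rough_accuracy(classes, {"o1", "o3", "o4"})      # lower={o3,o4}, upper=all
    ```

    Here the target set is only roughly definable by `color` (accuracy 0.5); attribute-selection techniques such as MDA, MSA or MIA score candidate clustering attributes with measures of this kind.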

  6. An Empirical Analysis of the Default Rate of Informal Lending—Evidence from Yiwu, China

    Science.gov (United States)

    Lu, Wei; Yu, Xiaobo; Du, Juan; Ji, Feng

    This study empirically analyzes the underlying factors contributing to the default rate of informal lending. The paper adopts snowball-sampling interviews to collect data and uses a logistic regression model to explore the specific factors. The results of these analyses validate the explanation of how informal lending differs from commercial loans. Factors that contribute to the default rate have particular attributes, while sharing some similarities with commercial bank or FICO credit scoring indices. Finally, our concluding remarks draw some inferences from the empirical analysis and speculate as to what this may imply for the roles of the formal and informal financial sectors.
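
    A logistic regression of the kind described can be sketched as follows; this is a minimal illustration on synthetic borrower features (the features and coefficients are invented, not the paper's data), fitted by gradient descent on the log-loss:

    ```python
    import numpy as np

    def fit_logistic(X, y, lr=0.1, steps=2000):
        """Fit P(default=1 | x) = sigmoid(w0 + w·x) by gradient descent."""
        X1 = np.hstack([np.ones((len(X), 1)), X])   # add intercept column
        w = np.zeros(X1.shape[1])
        for _ in range(steps):
            p = 1.0 / (1.0 + np.exp(-X1 @ w))       # predicted default probability
            w -= lr * X1.T @ (p - y) / len(y)       # gradient of mean log-loss
        return w

    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 2))               # e.g. loan size, loan duration
    true_w = np.array([-1.0, 2.0, 0.0])             # invented "true" coefficients
    logits = true_w[0] + X @ true_w[1:]
    y = (rng.random(200) < 1.0 / (1.0 + np.exp(-logits))).astype(float)
    w = fit_logistic(X, y)
    ```

    The sign and magnitude of each fitted coefficient indicate how the corresponding factor shifts the odds of default, which is how such a model identifies the contributing factors.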

  7. Social vulnerability as criteria used in student assistance policy: an analysis of conceptual and empirical

    Directory of Open Access Journals (Sweden)

    Junia Zacour del Giúdice

    2014-12-01

    Full Text Available The criterion of social vulnerability is embedded in different programs for the analysis of poverty and social exclusion. In the present study, we sought to examine, conceptually and empirically, the criterion of social vulnerability adopted in student assistance at the Federal University of Viçosa / MG for the selection of students to be benefited, by means of a literature review and a questionnaire. We related social vulnerability to social exclusion and risk, and addressed the students' perception of the subject. We conclude that the empirical approach reflects the conceptual one, with social vulnerability concentrated on economic and social indicators, represented by income, ownership of assets, work, health and family structure.

  8. Methodology for reliability allocation based on fault tree analysis and dualistic contrast

    Institute of Scientific and Technical Information of China (English)

    TONG Lili; CAO Xuewu

    2008-01-01

    Reliability allocation is a difficult multi-objective optimization problem. This paper presents a methodology for reliability allocation that can be applied to determine the reliability characteristics of reactor systems or subsystems. The dualistic contrast, known as one of the most powerful tools for optimization problems, is applied to the reliability allocation model of a typical system in this article, and fault tree analysis, deemed to be one of the most effective methods of reliability analysis, is also adopted. Thus a failure rate allocation model based on fault tree analysis and dualistic contrast is achieved. An application to the emergency diesel generator in a nuclear power plant is given to illustrate the proposed method.
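
    The basic allocation idea can be illustrated with a much simpler weighting scheme than the paper's dualistic-contrast model (this is an ARINC-style proportional allocation on invented numbers, shown only to make the concept concrete): for a series system the subsystem failure rates add, so a system-level target can be split among subsystems by relative weights.

    ```python
    def allocate_failure_rate(system_target, predicted_rates):
        """Split a system failure-rate target proportionally to predicted
        subsystem failure rates (series system: rates are additive)."""
        total = sum(predicted_rates)
        weights = [r / total for r in predicted_rates]
        return [w * system_target for w in weights]

    # Invented example: target 1e-3 failures/hour over three subsystems
    allocated = allocate_failure_rate(1e-3, [2e-4, 3e-4, 5e-4])
    ```

    More refined methods, such as the fault-tree-based model in the paper, replace the simple proportional weights with measures derived from the system's failure logic.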

  9. Rockfall travel distance analysis by using empirical models (Solà d'Andorra la Vella, Central Pyrenees)

    Science.gov (United States)

    Copons, R.; Vilaplana, J. M.; Linares, R.

    2009-12-01

    The prediction of rockfall travel distance below a rock cliff is an indispensable activity in rockfall susceptibility, hazard and risk assessment. Although the size of the detached rock mass may differ considerably at each specific rock cliff, small rockfalls (<100 m³) are the most frequent process. Empirical models may provide suitable information for predicting the travel distance of small rockfalls over an extensive area at a medium scale (1:100 000–1:25 000). "Solà d'Andorra la Vella" is a rocky slope located close to the town of Andorra la Vella, where the government has been documenting rockfalls since 1999. This documentation consists of mapping the release point and the individual fallen blocks immediately after each event. The documentation of historical rockfalls by morphological analysis, eye-witness accounts and historical images serves to increase the available information. In total, data from twenty small rockfalls have been gathered, which comprise about a hundred individual fallen rock blocks. The data acquired have been used to check the reliability of the main empirical models widely adopted (the reach and shadow angle models) and to analyse the influence of parameters affecting the travel distance (rockfall size, height of fall along the rock cliff, and volume of the individual fallen rock block). For predicting travel distances on maps at medium scales, a method has been proposed based on the "reach probability" concept. The accuracy of the results has been tested against the line containing the farthest fallen boulders, which represents the maximum travel distance of past rockfalls. The paper concludes with a discussion of the application of both empirical models to other study areas.

  10. Reliability analysis of gravity dams by response surface method

    Science.gov (United States)

    Humar, Nina; Kryžanowski, Andrej; Brilly, Mitja; Schnabl, Simon

    2013-04-01

    A dam failure is one of the most important problems in the dam industry. Since the mechanical behavior of dams is usually a complex phenomenon, existing classical mathematical models are generally insufficient to adequately predict dam failure and thus the safety of dams. Therefore, numerical reliability methods are often used to model such complex mechanical phenomena. The main purpose of the present paper is to present the response surface method as a powerful mathematical tool used to study and foresee dam safety considering a set of collected monitoring data. The derived mathematical model is applied to a case study, the Moste dam, which is the highest concrete gravity dam in Slovenia. Based on the derived model, the ambient/state variables are correlated with the dam deformation in order to gain a forecasting tool able to define the critical thresholds for dam management.
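
    A response surface of this kind can be sketched as a quadratic polynomial fitted to monitoring-style data by least squares. This is a hedged illustration on synthetic data, not the Moste dam model; the variable names and coefficients are invented.

    ```python
    import numpy as np

    def fit_response_surface(x, y, z):
        """Fit z = b0 + b1*x + b2*y + b3*x^2 + b4*x*y + b5*y^2 by least squares."""
        A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
        coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
        return coeffs

    def predict(coeffs, x, y):
        return coeffs @ np.array([1.0, x, y, x**2, x * y, y**2])

    rng = np.random.default_rng(2)
    x = rng.uniform(-1, 1, 50)          # e.g. normalized reservoir level
    y = rng.uniform(-1, 1, 50)          # e.g. normalized ambient temperature
    z = 1.0 + 0.5 * x - 0.3 * y + 0.2 * x**2 + rng.normal(0, 0.01, 50)  # deformation
    c = fit_response_surface(x, y, z)
    ```

    Once fitted, the surrogate replaces the expensive physical model: `predict(c, x0, y0)` forecasts deformation at a new ambient/state point, and critical thresholds can be read off the fitted surface.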

  11. Cost Minimization Analysis of Empiric Antibiotic Used by Sepsis Patient Respiratory Infection Source

    Directory of Open Access Journals (Sweden)

    Okky S. Purwanti

    2014-03-01

    Full Text Available Empirical antibiotics play an important role in the therapy of sepsis. The aim of this study was to estimate and compare the cost of treating inpatient sepsis with a respiratory infection source using cefotaxime-metronidazole or cefotaxime-erythromycin antibiotics. An observational cost minimization analysis was conducted on retrospective data from 2010 until 2012. Data were collected from the medical records of inpatients with sepsis from a respiratory infection source who received empirical cefotaxime-metronidazole or cefotaxime-erythromycin therapy, and from the treatment pricelist of the department of accounting. Direct medical cost was calculated from empirical antibiotic costs, costs of medical treatment, medical expenses, hospitalization costs, and administrative costs. The study considered costs from admission for sepsis until the patient had fully recovered from sepsis. Cefotaxime-metronidazole and cefotaxime-erythromycin are assumed to have equivalent efficacy. Patients on empirical cefotaxime-metronidazole were found to have a longer length of stay (25 versus 11) but a lower average total cost of treatment (16,641,112.04 IDR versus 21,641,678.02 IDR). The findings demonstrate that the empirical antibiotic combination of cefotaxime-metronidazole is more efficient than cefotaxime-erythromycin.

  12. Reliability of three-dimensional gait analysis in cervical spondylotic myelopathy.

    LENUS (Irish Health Repository)

    McDermott, Ailish

    2010-10-01

    Gait impairment is one of the primary symptoms of cervical spondylotic myelopathy (CSM). Detailed assessment is possible using three-dimensional gait analysis (3DGA), however the reliability of 3DGA for this population has not been established. The aim of this study was to evaluate the test-retest reliability of temporal-spatial, kinematic and kinetic parameters in a CSM population.

  13. RELIABILITY ANALYSIS OF A SYSTEM OF BOILER USED IN READYMADE GARMENT INDUSTRY

    Directory of Open Access Journals (Sweden)

    R.K. Agnihotri

    2008-01-01

    Full Text Available The present paper deals with the reliability analysis of a system of boiler used in the garment industry. The system consists of a single unit of boiler which plays an important role in the garment industry. Using the regenerative point technique with a Markov renewal process, various reliability characteristics of interest are obtained.

  14. Convergence among Data Sources, Response Bias, and Reliability and Validity of a Structured Job Analysis Questionnaire.

    Science.gov (United States)

    Smith, Jack E.; Hakel, Milton D.

    1979-01-01

    Examined are questions pertinent to the use of the Position Analysis Questionnaire: Who can use the PAQ reliably and validly? Must one rely on trained job analysts? Can people having no direct contact with the job use the PAQ reliably and validly? Do response biases influence PAQ responses? (Author/KC)

  15. Risk and reliability analysis theory and applications : in honor of Prof. Armen Der Kiureghian

    CERN Document Server

    2017-01-01

    This book presents a unique collection of contributions from some of the foremost scholars in the field of risk and reliability analysis. Combining the most advanced analysis techniques with practical applications, it is one of the most comprehensive and up-to-date books available on risk-based engineering. All the fundamental concepts needed to conduct risk and reliability assessments are covered in detail, providing readers with a sound understanding of the field and making the book a powerful tool for students and researchers alike. This book was prepared in honor of Professor Armen Der Kiureghian, one of the fathers of modern risk and reliability analysis.

  16. Reliability analysis of repairable systems using system dynamics modeling and simulation

    Science.gov (United States)

    Srinivasa Rao, M.; Naikan, V. N. A.

    2014-07-01

    The study and analysis of repairable standby systems is an important topic in reliability. Analytical techniques become very complicated and unrealistic, especially for modern complex systems. There have been attempts in the literature to evolve more realistic techniques using a simulation approach for the reliability analysis of systems. This paper proposes a hybrid approach called the Markov system dynamics (MSD) approach, which combines the Markov approach with system dynamics simulation for reliability analysis and for studying the dynamic behavior of systems. This approach has the advantages of both the Markov and system dynamics methodologies. The proposed framework is illustrated for a standby system with repair. The results of the simulation, when compared with those obtained by traditional Markov analysis, clearly validate the MSD approach as an alternative approach for reliability analysis.
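
    The Markov side of such an analysis can be sketched for the simplest repairable case (a single unit, not the paper's MSD framework; the rates are invented): the steady-state availability follows from the generator matrix, and for this two-state chain the closed form is mu/(lam+mu).

    ```python
    import numpy as np

    def steady_state_availability(lam, mu):
        """Steady-state availability of one repairable unit:
        failure rate lam (up -> down), repair rate mu (down -> up)."""
        Q = np.array([[-lam, lam],
                      [mu, -mu]])              # states: 0 = up, 1 = down
        # Solve pi Q = 0 together with sum(pi) = 1 as a least-squares system
        A = np.vstack([Q.T, np.ones(2)])
        b = np.array([0.0, 0.0, 1.0])
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pi[0]                           # probability of the "up" state

    avail = steady_state_availability(lam=0.01, mu=0.5)   # invented rates, per hour
    ```

    Larger repairable standby systems extend the same generator-matrix construction to more states (e.g. standby, switching, repair queues), which is where analytical treatment becomes unwieldy and simulation approaches such as MSD become attractive.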

  17. RELIABILITY-BASED DESIGN AND ANALYSIS ON HYDRAULIC SYSTEM FOR SYNTHETIC RUBBER PRESS

    Institute of Scientific and Technical Information of China (English)

    Yao Chengyu; Zhao Jingyi

    2005-01-01

    To overcome the design limitations of the traditional hydraulic control system for synthetic rubber presses, and such faults as a high fault rate, low reliability and high energy consumption, which always led to shutdowns of the post-treatment product line for synthetic rubber, a brand-new hydraulic system combining PC control and two-way cartridge valves is developed for the press, and its reliability is analyzed. A reliability model of the hydraulic system for the press is established by analyzing the processing steps, and reliability simulation of each step and of the whole system is carried out with the software MATLAB, which is verified through a reliability test. The fixed-time test has proved not only that the theoretical analysis is sound, but also that the system is reasonably designed and highly reliable, and can lower the required power supply and operational energy cost.

  18. Rockfall travel distance analysis by using empirical models (Solà d'Andorra la Vella, Central Pyrenees)

    Directory of Open Access Journals (Sweden)

    R. Copons

    2009-12-01

    Full Text Available The prediction of rockfall travel distance below a rock cliff is an indispensable activity in rockfall susceptibility, hazard and risk assessment. Although the size of the detached rock mass may differ considerably at each specific rock cliff, small rockfalls (<100 m³) are the most frequent process. Empirical models may provide us with suitable information for predicting the travel distance of small rockfalls over an extensive area at a medium scale (1:100 000–1:25 000). "Solà d'Andorra la Vella" is a rocky slope located close to the town of Andorra la Vella, where the government has been documenting rockfalls since 1999. This documentation consists of mapping the release point and the individual fallen blocks immediately after each event. The documentation of historical rockfalls by morphological analysis, eye-witness accounts and historical images serves to increase the available information. In total, data from twenty small rockfalls have been gathered, which comprise about a hundred individual fallen rock blocks. The data acquired have been used to check the reliability of the main empirical models widely adopted (the reach and shadow angle models) and to analyse the influence of parameters affecting the travel distance (rockfall size, height of fall along the rock cliff, and volume of the individual fallen rock block). For predicting travel distances on maps at medium scales, a method has been proposed based on the "reach probability" concept. The accuracy of the results has been tested against the line containing the farthest fallen boulders, which represents the maximum travel distance of past rockfalls. The paper concludes with a discussion of the application of both empirical models to other study areas.
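
    The reach angle model mentioned in the abstract has a simple geometric core, which can be sketched as follows (the fall height and angle below are invented illustration values, not the study's calibrated parameters): for a fall height H and an empirical minimum reach angle, the maximum horizontal travel distance is L = H / tan(angle).

    ```python
    import math

    def max_travel_distance(fall_height_m, reach_angle_deg):
        """Maximum horizontal travel distance from the release point,
        given the empirical (minimum) reach angle of past rockfalls."""
        return fall_height_m / math.tan(math.radians(reach_angle_deg))

    # Invented example: 100 m fall height, 32 degree reach angle
    L = max_travel_distance(fall_height_m=100.0, reach_angle_deg=32.0)
    ```

    The shadow angle model works the same way but measures height from the talus apex rather than the release point; calibrating the angle against mapped farthest boulders is what the paper's "reach probability" approach formalizes.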

  19. Determinants of Human Development Index: A Cross-Country Empirical Analysis

    OpenAIRE

    Shah, Smit

    2016-01-01

    The Human Development Index is a statistical tool used to measure a country's overall achievement in its social and economic dimensions. This paper tries to identify the major factors affecting the Human Development Index, such as the health index, education index and income index. The objective of this study is to present empirical findings and trends of human development across countries, a regression analysis of the determinant factors, and a region-wise analysis of the Human Development Index.

  20. Low Carbon-Oriented Optimal Reliability Design with Interval Product Failure Analysis and Grey Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Yixiong Feng

    2017-03-01

    Full Text Available The problem of large amounts of carbon emissions causes wide concern across the world, and it has become a serious threat to the sustainable development of the manufacturing industry. Intensive research into technologies and methodologies for green product design has significant theoretical meaning and practical value in reducing the emissions of the manufacturing industry. Therefore, a low carbon-oriented product reliability optimal design model is proposed in this paper: (1) the related expert evaluation information was prepared as interval numbers; (2) an improved product failure analysis considering the uncertain carbon emissions of the subsystems was performed to obtain the subsystem weights taking carbon emissions into consideration, and an interval grey correlation analysis was conducted to obtain the subsystem weights taking the uncertain correlations inside the product into consideration; using these two kinds of subsystem weights and different caution indicators of the decision maker, a series of product reliability design schemes becomes available; (3) interval-valued intuitionistic fuzzy sets (IVIFSs) were employed to select the optimal reliability design scheme based on three attributes, namely low carbon, correlation and functions, and economic cost. The case study of a vertical CNC lathe proves the superiority and rationality of the proposed method.
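
    The grey correlation step can be illustrated, in crisp (non-interval) form, with a toy grey relational analysis; all values below are invented, and the paper's interval-valued version adds interval arithmetic on top of this idea.

    ```python
    import numpy as np

    def grey_relational_grades(alternatives, reference, rho=0.5):
        """Score each alternative (row) against a reference sequence via
        grey relational coefficients; rho is the distinguishing coefficient."""
        alts = np.asarray(alternatives, dtype=float)
        ref = np.asarray(reference, dtype=float)
        diff = np.abs(alts - ref)
        dmin, dmax = diff.min(), diff.max()
        coeff = (dmin + rho * dmax) / (diff + rho * dmax)
        return coeff.mean(axis=1)          # grade = mean coefficient per row

    # rows: candidate design schemes; columns: normalized attribute values
    grades = grey_relational_grades([[0.9, 0.7, 0.8],
                                     [0.6, 0.9, 0.5]],
                                    reference=[1.0, 1.0, 1.0])
    ```

    The scheme with the higher grade is the one more closely correlated with the ideal reference sequence, which is the role such weights play in the selection step.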

  1. Pop-culture as a factor of Socialization: the Opportunities for Empirical Analysis

    Directory of Open Access Journals (Sweden)

    I V Trotsuk

    2010-03-01

    Full Text Available The article offers the systematic analysis of the issues related to the sociological research of popular culture at theoretical and empirical levels. The former level of analysis deals with the definitions of pop-culture in comparison with the closely related concept of mass culture as well as interdisciplinary endeavours to conceptualize the range of topics related to popular culture. Furthermore, the functions of popular culture as well as its socialization opportunities (both positive and negative are outlined. As far as the empirical analysis is concerned, the above-mentioned issues have yet received little attention. The sociological analysis tools usually comprise nothing but youth leisure preferences and value orientations theme-based modules which are not infrequently supposed to confirm the negative influence of television on socialization. The authors put forward another approach to the empirical study of the impact of popular culture with the focus on the analysis of identification models represented in «texts» of popular culture. An example illustrating the application of the given approach (the content-analysis of the youth magazine «Molotok» is provided in this very item.

  2. Reactor scram experience for shutdown system reliability analysis. [BWR; PWR

    Energy Technology Data Exchange (ETDEWEB)

    Edison, G.E.; Pugliese, S.L.; Sacramo, R.F.

    1976-06-01

    Scram experience in a number of operating light water reactors has been reviewed. The date and reactor power of each scram were compiled from monthly operating reports and personal communications with operating plant personnel. The average scram frequency from "significant" power (defined as P_trip/P_max greater than approximately 20 percent) was determined as a function of operating life. This relationship was then used to estimate the total number of reactor trips from above approximately 20 percent of full power expected to occur during the life of a nuclear power plant. The shape of the scram frequency vs. operating life curve resembles a typical reliability bathtub curve (failure rate vs. time), but without a rising "wearout" phase, due to the lack of operating data near the end of plant design life. In this case the failures are represented by "bugs" in the plant system design, construction, and operation which lead to scram. The number of scrams appears to level out at an average of around three per year; the standard deviations from the mean value indicate an uncertainty of about 50 percent. The total number of scrams from significant power that could be expected in a plant designed for a 40-year life would be about 130 if no wearout phase develops near the end of life.

  3. Pharmacoeconomic analysis of voriconazole vs. caspofungin in the empirical antifungal therapy of febrile neutropenia in Australia.

    Science.gov (United States)

    Al-Badriyeh, Daoud; Liew, Danny; Stewart, Kay; Kong, David C M

    2012-05-01

    In two major clinical trials, voriconazole and caspofungin were recommended as alternatives to liposomal amphotericin B for empirical use in febrile neutropenia. This study investigated the health economic impact of using voriconazole vs. caspofungin in patients with febrile neutropenia. A decision analytic model was developed to measure the downstream consequences of empirical antifungal therapy. The clinical outcomes measured were success, breakthrough infection, persistent baseline infection, persistent fever, premature discontinuation and death. Treatment transition probabilities and patterns were directly derived from data in two relevant randomised controlled trials. Resource use was estimated using an expert clinical panel. Cost inputs were obtained from the latest Australian sources. The analysis adopted the perspective of the Australian hospital system. The use of caspofungin led to a lower expected mean cost per patient than voriconazole (AU$40,558 vs. AU$41,356), with a net cost saving of AU$798 (1.9%) per patient. Results were most sensitive to the duration of therapy and the alternative therapy used post-discontinuation. In the uncertainty analysis, the cost associated with caspofungin was less than that with voriconazole in 65.5% of cases. This is the first economic evaluation of voriconazole vs. caspofungin for empirical therapy. Caspofungin appears to have a higher probability of yielding cost savings than voriconazole for empirical therapy; however, the difference between the two medications does not appear to be statistically significant.
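
    The decision-analytic core of such a comparison can be sketched as a probability-weighted sum over outcome branches. The probabilities and costs below are invented for illustration only; they are not the trial-derived inputs of this study.

    ```python
    def expected_cost(branches):
        """branches: list of (probability, cost) tuples over mutually
        exclusive clinical outcomes; probabilities must sum to 1."""
        assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9
        return sum(p * c for p, c in branches)

    # Invented outcome branches: (success, breakthrough infection, death)
    caspofungin = expected_cost([(0.60, 35000.0), (0.30, 45000.0), (0.10, 60000.0)])
    voriconazole = expected_cost([(0.55, 36000.0), (0.35, 46000.0), (0.10, 62000.0)])
    saving = voriconazole - caspofungin
    ```

    Sensitivity and uncertainty analyses then re-evaluate this expected cost while varying the branch probabilities and costs over their plausible ranges.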

  4. Intraobserver and intermethod reliability for using two different computer programs in preoperative lower limb alignment analysis

    Directory of Open Access Journals (Sweden)

    Mohamed Kenawey

    2016-12-01

    Conclusion: Computer-assisted lower limb alignment analysis is reliable whether using a graphics editing program or specialized planning software. However, slightly higher variability can be expected for angles away from the knee joint.

  5. Reliability of 3D upper limb motion analysis in children with obstetric brachial plexus palsy.

    Science.gov (United States)

    Mahon, Judy; Malone, Ailish; Kiernan, Damien; Meldrum, Dara

    2017-03-01

    Kinematics, measured by 3D upper limb motion analysis (3D-ULMA), can potentially increase understanding of movement patterns by quantifying individual joint contributions. Reliability in children with obstetric brachial plexus palsy (OBPP) has not been established.

  6. Analysis methods for structure reliability of piping components

    Energy Technology Data Exchange (ETDEWEB)

    Schimpfke, T.; Grebner, H.; Sievers, J. [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Koeln (Germany)

    2004-07-01

    In the frame of the German reactor safety research program of the Federal Ministry of Economics and Labour (BMWA) GRS has started to develop an analysis code named PROST (PRObabilistic STructure analysis) for estimating the leak and break probabilities of piping systems in nuclear power plants. The long-term objective of this development is to provide failure probabilities of passive components for probabilistic safety analysis of nuclear power plants. Up to now the code can be used for calculating fatigue problems. The paper mentions the main capabilities and theoretical background of the present PROST development and presents some of the results of a benchmark analysis in the frame of the European project NURBIM (Nuclear Risk Based Inspection Methodologies for Passive Components). (orig.)

  7. Use of Fault Tree Analysis for Automotive Reliability and Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lambert, H

    2003-09-24

    Fault tree analysis (FTA) evolved from the aerospace industry in the 1960s. A fault tree is a deductive logic model that is generated with a top undesired event in mind. FTA answers the question "how can something occur?", as opposed to failure modes and effects analysis (FMEA), which is inductive and answers the question "what if?" FTA is used in risk, reliability and safety assessments, and is currently being used by several industries such as nuclear power and chemical processing. The automotive industry typically uses failure modes and effects analysis, such as design FMEAs and process FMEAs, but the use of FTA has spread to the automotive industry as well. This paper discusses the use of FTA for automotive applications. With the addition of automotive electronics for various applications in systems such as engine/power control, cruise control and braking/traction, FTA is well suited to address failure modes within these systems. FTA can determine the importance of these failure modes from various perspectives such as cost, reliability and safety. A fault tree analysis of a car starting system is presented as an example.
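
    The quantitative side of a fault tree can be sketched with OR and AND gates combining basic-event probabilities. The tree structure and probabilities below are invented for illustration (they are not the paper's car-starting-system tree), and independence of basic events is assumed.

    ```python
    def or_gate(*probs):
        """P(at least one event occurs), assuming independence."""
        q = 1.0
        for p in probs:
            q *= (1.0 - p)
        return 1.0 - q

    def and_gate(*probs):
        """P(all events occur), assuming independence."""
        q = 1.0
        for p in probs:
            q *= p
        return q

    # Invented basic-event probabilities for a toy car starting system
    battery_dead = 0.01
    starter_fails = 0.005
    both_ignition_paths_fail = and_gate(0.02, 0.02)   # redundant ignition paths
    car_wont_start = or_gate(battery_dead, starter_fails, both_ignition_paths_fail)
    ```

    Note how the AND gate captures redundancy: the two ignition paths contribute only 4e-4 to the top event, far less than either single-point failure.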

  8. Simulation and Non-Simulation Based Human Reliability Analysis Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Shirley, Rachel Elizabeth [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-12-01

    Part of the U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk model. In this report, we review simulation-based and non-simulation-based human reliability assessment (HRA) methods. Chapter 2 surveys non-simulation-based HRA methods. Conventional HRA methods target static Probabilistic Risk Assessments for Level 1 events. These methods would require significant modification for use in dynamic simulation of Level 2 and Level 3 events. Chapter 3 is a review of human performance models. A variety of methods and models simulate dynamic human performance; however, most of these human performance models were developed outside the risk domain and have not been used for HRA. The exception is the ADS-IDAC model, which can be thought of as a virtual operator program. This model is resource-intensive but provides a detailed model of every operator action in a given scenario, along with models of numerous factors that can influence operator performance. Finally, Chapter 4 reviews the treatment of timing of operator actions in HRA methods. This chapter is an example of one of the critical gaps between existing HRA methods and the needs of dynamic HRA. This report summarizes the foundational information needed to develop a feasible approach to modeling human interactions in the RISMC simulations.

  9. Uncertainty analysis with reliability techniques of fluvial hydraulic simulations

    Science.gov (United States)

    Oubennaceur, K.; Chokmani, K.; Nastev, M.

    2016-12-01

    Flood inundation models are commonly used to simulate hydraulic and floodplain inundation processes, a prerequisite to successful floodplain management and preparation of appropriate flood risk mitigation plans. Selecting statistically significant ranges of the variables involved in the inundation modelling is crucial for the model performance. This involves various levels of uncertainty which, due to their cumulative nature, can lead to considerable uncertainty in the final results. Therefore, in addition to the validation of the model results, there is a need for a clear understanding and identification of the sources of uncertainty and for measuring the model uncertainty. A reliability approach called the Point Estimate Method is presented to quantify the uncertainty effects of the input data and to calculate the propagation of uncertainty through the inundation modelling process. The Point Estimate Method is a special case of numerical quadrature based on orthogonal polynomials. It allows evaluation of the low-order moments of performance functions of independent random variables, such as the water depth. The variables considered in the analyses include elevation data, flow rate and Manning's roughness coefficient n, each given with its own probability distribution. The approach is applied to a 45 km reach of the Richelieu River, Canada, between Rouses Point and Fryers Rapids. The finite element hydrodynamic model H2D2 was used to solve the shallow water equations (SWE) and provide maps of expected water depths and the associated spatial distributions of standard deviations as a measure of uncertainty. The results indicate that for the simulated flow rates of 1113, 1206, and 1282, the uncertainties in water depths have a range of 25 cm, 30 cm, and 60 cm, respectively. This kind of information is useful for decision-making and risk management in the context of flood risk assessment.
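
    The two-point estimate scheme named above can be sketched in a few lines. The depth function, channel parameters, and input distributions below are hypothetical placeholders, not the H2D2 model or the Richelieu data; the sketch only illustrates how 2^n evaluations at mean ± one standard deviation yield approximate output moments.

```python
import itertools
import math

def depth(q, n_manning):
    # Hypothetical wide-channel Manning relation solved for depth:
    # h = (n*q / (sqrt(S)*B))**(3/5), with assumed slope S and width B.
    S, B = 0.0005, 150.0
    return (n_manning * q / (math.sqrt(S) * B)) ** (3.0 / 5.0)

def rosenblueth(func, means, stds):
    """Two-point estimate: evaluate func at mean +/- one standard deviation
    for every variable (2**n combinations, equal weights for symmetric
    input distributions) and return the output mean and std deviation."""
    vals = []
    for signs in itertools.product((-1.0, 1.0), repeat=len(means)):
        x = [m + s * sd for m, sd, s in zip(means, stds, signs)]
        vals.append(func(*x))
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    return mean, math.sqrt(var)

# Illustrative moments for flow rate and Manning's n (not the study's data)
m, sd = rosenblueth(depth, means=[1113.0, 0.030], stds=[50.0, 0.004])
print(round(m, 2), round(sd, 2))
```

    With four function evaluations this recovers an approximate standard deviation of the water depth, which is exactly the kind of per-cell uncertainty measure the record describes mapping spatially.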

  10. Parametric and semiparametric models with applications to reliability, survival analysis, and quality of life

    CERN Document Server

    Nikulin, M; Mesbah, M; Limnios, N

    2004-01-01

    Parametric and semiparametric models are tools with a wide range of applications to reliability, survival analysis, and quality of life. This self-contained volume examines these tools in survey articles written by experts currently working on the development and evaluation of models and methods. While a number of chapters deal with general theory, several explore more specific connections and recent results in "real-world" reliability theory, survival analysis, and related fields.

  11. Reliability analysis of a gravity-based foundation for wind turbines

    DEFF Research Database (Denmark)

    Vahdatirad, Mohammad Javad; Griffiths, D. V.; Andersen, Lars Vabbersgaard

    2014-01-01

    Deterministic code-based designs proposed for wind turbine foundations are typically biased on the conservative side and overestimate the probability of failure, which can lead to higher than necessary construction costs. In this study, reliability analysis of a gravity-based foundation concerning...... technique to perform the reliability analysis. The calibrated code-based design approach leads to savings of up to 20% in the concrete foundation volume, depending on the target annual reliability level. The study can form the basis for future optimization of deterministic-based designs for wind turbine...... foundations....

  12. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

    The importance of human reliability analysis (HRA), which predicts the possibility of an error's occurrence in quantitative and qualitative manners, is gradually increasing because of the effects of human errors on system safety. HRA needs a task analysis as a prerequisite step, but extant task analysis techniques have the problem that the collection of information about the situation in which the human error occurs depends entirely on the HRA analyzers. This problem makes the results of the task analysis inconsistent and unreliable. To address this problem, KAERI developed the structural information analysis (SIA), which helps to analyze task structures and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed to support performing HRA using the SIA method. Additionally, by applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the database of the CASIA. The CASIA is expected to help HRA analyzers perform the analysis more easily and consistently. As more analyses are performed and more data are accumulated in the CASIA's database, HRA analyzers can freely share and smoothly spread their analysis experiences, and thereby the quality of the HRA analysis will be improved. 35 refs., 38 figs., 25 tabs. (Author)

  13. A Review: Passive System Reliability Analysis – Accomplishments and Unresolved Issues

    Directory of Open Access Journals (Sweden)

    ARUN KUMAR NAYAK

    2014-10-01

    Full Text Available Reliability assessment of passive safety systems is one of the important issues, since the safety of advanced nuclear reactors relies on several passive features. In this context, a few methodologies such as Reliability Evaluation of Passive Safety System (REPAS), Reliability Methods for Passive Safety Functions (RMPS) and Analysis of Passive Systems ReliAbility (APSRA) have been developed in the past. These methodologies have been used to assess the reliability of various passive safety systems. While these methodologies have certain features in common, they differ in their treatment of certain issues, for example, the treatment of model uncertainties and the deviation of geometric and process parameters from their nominal values. This paper presents the state of the art on passive system reliability assessment methodologies, the accomplishments and the remaining issues. In this review, three critical issues pertaining to passive system performance and reliability have been identified. The first issue is the applicability of best estimate codes and model uncertainty. Best-estimate-code-based phenomenological simulations of natural convection passive systems can carry a significant amount of uncertainty, and these uncertainties must be incorporated in an appropriate manner in the performance and reliability analysis of such systems. The second issue is the treatment of the dynamic failure characteristics of components of passive systems. The REPAS, RMPS and APSRA methodologies do not consider dynamic failures of components or processes, which may have a strong influence on the failure of passive systems. The influence of the dynamic failure characteristics of components on the system failure probability is presented with the help of a dynamic reliability methodology based on Monte Carlo simulation. The analysis of a benchmark problem of a hold-up tank shows the error in failure probability estimation when the dynamism of components is not considered. It is thus suggested that dynamic reliability

  14. Stochastic Response and Reliability Analysis of Hysteretic Structures

    DEFF Research Database (Denmark)

    Mørk, Kim Jørgensen

    During the last 30 years, response analysis of structures under random excitation has been studied in detail. These studies are motivated by the fact that most of nature's excitations, such as earthquakes, wind and wave loads, exhibit randomly fluctuating characters. For safety reasons this randomness...

  15. reliability analysis of a two span floor designed according to ...

    African Journals Online (AJOL)

    user

    The structural analysis and design of the timber floor system was carried out using a deterministic approach ... The cell structure of hardwoods is more complex than ..... [12] BS EN 1995-1-1: Eurocode 5: Design of Timber Structures, Part 1-1.

  16. Detection of Decreasing Vegetation Cover Based on Empirical Orthogonal Function and Temporal Unmixing Analysis

    Directory of Open Access Journals (Sweden)

    Di Xu

    2017-01-01

    Full Text Available Vegetation plays an important role in the energy exchange of the land surface, biogeochemical cycles, and hydrological cycles. The MODIS (MODerate-resolution Imaging Spectroradiometer) EVI (Enhanced Vegetation Index) is considered a quantitative indicator for examining dynamic vegetation changes. This paper applied a new method integrating empirical orthogonal function (EOF) and temporal unmixing analysis (TUA) to detect decreasing vegetation cover in Jiangsu Province of China. The empirical orthogonal function (EOF) statistical results provide the vegetation decreasing/increasing trend as prior information for the temporal unmixing analysis. The temporal unmixing analysis (TUA) results reveal the dominant spatial distribution of decreasing vegetation. The results showed that decreasing vegetation areas in Jiangsu are distributed in the suburbs and newly constructed areas. For validation, the decreasing vegetation cover is revealed by linear spectral mixture analysis of Landsat data in three selected cities. Pixels of decreasing vegetation areas are also calculated from land use maps of 2000 and 2010. The accuracy of the integrated empirical orthogonal function and temporal unmixing analysis method is about 83.14%. This method can be applied to detect vegetation change in large, rapidly urbanizing areas.

  17. Reliability analysis of the control system of large-scale vertical mixing equipment

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The control system of the vertical mixing equipment is a centralized-distributed monitoring system (CDMS). A reliability analysis model was built and its analysis was conducted based on reliability modeling theories such as graph theory, Markov processes, and redundancy theory. Analysis and operational results show that the control system can meet all technical requirements for high energy composite solid propellant manufacturing. The reliability performance of the control system can be considerably improved by adopting a control strategy combining hot-spare redundancy of the primary system with cold-spare redundancy of the emergency one. The reliability performance of the control system can also be improved by adopting the redundancy strategy or by improving the quality of each component and cable of the system.
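
    The effect of the hot and cold spare arrangements described above can be illustrated with textbook redundancy formulas. The failure rate and mission time below are arbitrary illustrative values, not parameters of the mixing equipment's control system.

```python
import math

def hot_spare(r):
    """Reliability of two identical units in hot (active) redundancy:
    the pair fails only if both units fail."""
    return 1.0 - (1.0 - r) ** 2

def cold_spare(lam, t):
    """Two exponential units in cold standby with perfect switching:
    R(t) = exp(-lam*t) * (1 + lam*t)."""
    return math.exp(-lam * t) * (1.0 + lam * t)

lam, t = 0.001, 1000.0                     # failure rate 1e-3/h, 1000 h mission
r_single = math.exp(-lam * t)              # single exponential unit
print(round(r_single, 3))                  # ~0.368
print(round(hot_spare(r_single), 3))       # ~0.600
print(round(cold_spare(lam, t), 3))        # ~0.736
```

    The cold spare outperforms the hot spare here because the standby unit accumulates no operating time before switchover, which matches the record's observation that the mixed hot/cold strategy raises system reliability.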

  18. Structured information analysis for human reliability analysis of emergency tasks in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kim, Jae Whan; Park, Jin Kyun; Ha, Jae Joo [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-02-01

    More than twenty HRA (Human Reliability Analysis) methodologies have been developed and used for safety analysis in the nuclear field during the past two decades. However, no methodology appears to have been universally accepted, as various limitations have been raised for the more widely used ones. One of the most important limitations of conventional HRA is insufficient analysis of the task structure and problem space. To resolve this problem, we suggest SIA (Structured Information Analysis) for HRA. The proposed SIA consists of three parts. The first part is the scenario analysis, which investigates the contextual information related to the given task on the basis of selected scenarios. The second is the goals-means analysis, which defines the relations between the cognitive goals and task steps. The third is the cognitive function analysis module, which identifies the cognitive patterns and information flows involved in the task. Through the three-part analysis, systematic investigation is made possible, from macroscopic information on the tasks to microscopic information on the specific cognitive processes. It is expected that analysts can attain a structured set of information that helps to predict the types and likelihood of human error in the given task. 48 refs., 12 figs., 11 tabs. (Author)

  20. New method for nonlinear and nonstationary time series analysis: empirical mode decomposition and Hilbert spectral analysis

    Science.gov (United States)

    Huang, Norden E.

    2000-04-01

    A new method for analyzing nonlinear and nonstationary data has been developed. The key part of the method is the Empirical Mode Decomposition method, with which any complicated data set can be decomposed into a finite and often small number of Intrinsic Mode Functions (IMF). An IMF is defined as any function having the same number of zero-crossings and extrema, and also having symmetric envelopes defined by the local maxima and minima, respectively. The IMF also admits a well-behaved Hilbert transform. This decomposition method is adaptive and, therefore, highly efficient. Since the decomposition is based on the local characteristic time scale of the data, it is applicable to nonlinear and nonstationary processes. With the Hilbert transform, the IMFs yield instantaneous frequencies as functions of time that give sharp identifications of embedded structures. The final presentation of the result is an energy-frequency-time distribution, designated as the Hilbert Spectrum. Comparisons with wavelet and windowed Fourier analysis show that the new method offers much better temporal and frequency resolutions.
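
    A full EMD sifting loop is beyond a short sketch, but the Hilbert step that turns an IMF into an instantaneous frequency can be shown compactly. The numpy-only analytic-signal construction below (zeroing negative FFT frequencies) is a standard textbook formulation, and the 50 Hz test tone is an arbitrary example standing in for an IMF.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via FFT: keep DC, double positive frequencies,
    zero negative frequencies, then inverse-transform."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(X * h)

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 50 * t)             # a 50 Hz tone as a stand-in IMF
z = analytic_signal(x)
phase = np.unwrap(np.angle(z))
inst_freq = np.diff(phase) * fs / (2 * np.pi)
print(round(float(np.median(inst_freq)), 1))   # ≈ 50.0 for a pure tone
```

    For a genuine IMF the instantaneous frequency varies with time, which is what populates the energy-frequency-time Hilbert Spectrum the abstract describes.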

  1. Multidisciplinary Inverse Reliability Analysis Based on Collaborative Optimization with Combination of Linear Approximations

    Directory of Open Access Journals (Sweden)

    Xin-Jia Meng

    2015-01-01

    Full Text Available Multidisciplinary reliability is an important part of reliability-based multidisciplinary design optimization (RBMDO). However, it usually involves a considerable amount of computation. The purpose of this paper is to improve the computational efficiency of multidisciplinary inverse reliability analysis. A multidisciplinary inverse reliability analysis method based on collaborative optimization with a combination of linear approximations (CLA-CO) is proposed in this paper. In the proposed method, the multidisciplinary reliability assessment problem is first transformed into a problem of most probable failure point (MPP) search of inverse reliability, and then the process of searching for the MPP of multidisciplinary inverse reliability is performed within the framework of CLA-CO. This method improves the MPP searching process through two elements. One is treating the discipline analyses as equality constraints in the subsystem optimization, and the other is using linear approximations corresponding to subsystem responses as the replacement of the consistency equality constraint in the system optimization. With these two elements, the proposed method realizes the parallel analysis of each discipline and achieves a higher computational efficiency. Additionally, there is no difficulty in applying the proposed method to problems with non-normal distribution variables. One mathematical test problem and an electronic packaging problem are used to demonstrate the effectiveness of the proposed method.
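
    The MPP search that the method builds on can be illustrated with the classical HL-RF iteration in standard normal space. This is a generic sketch of MPP search, not the CLA-CO procedure itself, and the linear limit state is a made-up test case chosen so the answer is known in closed form.

```python
import numpy as np

def hlrf(g, grad, u0, tol=1e-8, iters=100):
    """HL-RF iteration: find the most probable failure point (MPP) u*
    minimizing ||u|| subject to g(u) = 0 in standard normal space.
    Update: u_{k+1} = (grad.u_k - g(u_k)) * grad / ||grad||^2."""
    u = np.asarray(u0, dtype=float)
    for _ in range(iters):
        gv, gr = g(u), grad(u)
        u_new = (gr @ u - gv) * gr / (gr @ gr)
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new
    return u, float(np.linalg.norm(u))      # MPP and reliability index beta

# Linear limit state g(u) = u1 + u2 - 3, so beta = 3/sqrt(2) exactly
g = lambda u: u[0] + u[1] - 3.0
grad = lambda u: np.array([1.0, 1.0])
u_star, beta = hlrf(g, grad, [0.0, 0.0])
print(round(beta, 4))   # 2.1213
```

    In the multidisciplinary setting the record describes, each evaluation of g would itself require coupled discipline analyses, which is why CLA-CO replaces them with equality constraints and linear approximations.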

  2. Automated migration analysis based on cell texture: method & reliability

    Directory of Open Access Journals (Sweden)

    Chittenden Thomas W

    2005-03-01

    Full Text Available Abstract Background In this paper, we present and validate a way to measure automatically the extent of cell migration based on automated examination of a series of digital photographs. It was designed specifically to identify the impact of Second Hand Smoke (SHS) on endothelial cell migration but has broader applications. The analysis has two stages: (1) preprocessing of image texture, and (2) migration analysis. Results The output is a graphic overlay that indicates the front lines of cell migration superimposed on each original image, with automated reporting of the distance traversed vs. time. An expert preference comparison with manual placement of the leading edge shows complete equivalence of automated vs. manual leading edge definition for cell migration measurement. Conclusion Our method is indistinguishable from careful manual determinations of cell front lines, with the advantages of full automation, objectivity, and speed.

  3. Sensitivity analysis for reliable design verification of nuclear turbosets

    Energy Technology Data Exchange (ETDEWEB)

    Zentner, Irmela, E-mail: irmela.zentner@edf.f [Lamsid-Laboratory for Mechanics of Aging Industrial Structures, UMR CNRS/EDF, 1, avenue Du General de Gaulle, 92141 Clamart (France); EDF R and D-Structural Mechanics and Acoustics Department, 1, avenue Du General de Gaulle, 92141 Clamart (France); Tarantola, Stefano [Joint Research Centre of the European Commission-Institute for Protection and Security of the Citizen, T.P. 361, 21027 Ispra (Italy); Rocquigny, E. de [Ecole Centrale Paris-Applied Mathematics and Systems Department (MAS), Grande Voie des Vignes, 92 295 Chatenay-Malabry (France)

    2011-03-15

    In this paper, we present an application of sensitivity analysis for the design verification of nuclear turbosets. Before the acquisition of a turbogenerator, energy operators perform an independent design assessment in order to assure safe operating conditions of the new machine in its environment. Variables of interest are related to the vibration behaviour of the machine: its eigenfrequencies and dynamic sensitivity to unbalance. In the framework of design verification, epistemic uncertainties are preponderant. This lack of knowledge is due to nonexistent or imprecise information about the design, as well as to the interaction of the rotating machinery with supporting structures and sub-structures. Sensitivity analysis enables the analyst to rank sources of uncertainty with respect to their importance and, possibly, to screen out insignificant sources of uncertainty. Further studies, if necessary, can then focus on the predominant parameters. In particular, the constructor can be asked for detailed information only about the most significant parameters.

  4. Combining forecasts in short term load forecasting: Empirical analysis and identification of robust forecaster

    Indian Academy of Sciences (India)

    YOGESH K BICHPURIYA; S A SOMAN; A SUBRAMANYAM

    2016-10-01

    We present an empirical analysis to show that the combination of short term load forecasts leads to better accuracy. We also discuss other aspects of combination, i.e., the distribution of weights, the effect of variation in the historical window, and the distribution of forecast errors. The distribution of forecast errors is analyzed in order to obtain a robust forecast. We define a robust forecaster as one which has consistency in forecast accuracy, fewer shocks (outliers), and a lower standard deviation in the distribution of forecast errors. We propose a composite ranking (CRank) scheme based on a composite score which considers three performance measures: the standard deviation and kurtosis of the distribution of forecast errors, and the accuracy of the forecasts. The CRank helps in the identification of a robust forecast given a choice of individual and combined forecasters. The empirical analysis has been done with real-life data sets of two distribution companies in India.
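
    A common baseline for such combinations weights each forecaster by the inverse of its historical mean squared error. The synthetic load series and forecaster noise levels below are stand-ins for the real utility data, which the record does not publish.

```python
import numpy as np

def inverse_mse_weights(errors):
    """Combination weights proportional to 1/MSE of each forecaster,
    normalized to sum to one."""
    mse = np.mean(np.asarray(errors) ** 2, axis=1)
    w = 1.0 / mse
    return w / w.sum()

rng = np.random.default_rng(0)
actual = 100 + 10 * rng.standard_normal(200)    # synthetic load series
f1 = actual + 2.0 * rng.standard_normal(200)    # accurate forecaster
f2 = actual + 5.0 * rng.standard_normal(200)    # noisier forecaster

w = inverse_mse_weights([f1 - actual, f2 - actual])
combo = w[0] * f1 + w[1] * f2

rmse = lambda f: float(np.sqrt(np.mean((f - actual) ** 2)))
print(w.round(2), round(rmse(f1), 2), round(rmse(f2), 2), round(rmse(combo), 2))
```

    With independent errors the combined forecast's error variance is at most that of the best individual forecaster, which is the accuracy gain the empirical analysis reports; the error-distribution statistics feeding CRank (standard deviation, kurtosis) can be computed from the same residual arrays.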

  5. A Reliable Method for Rhythm Analysis during Cardiopulmonary Resuscitation

    Directory of Open Access Journals (Sweden)

    U. Ayala

    2014-01-01

    Full Text Available Interruptions in cardiopulmonary resuscitation (CPR compromise defibrillation success. However, CPR must be interrupted to analyze the rhythm because although current methods for rhythm analysis during CPR have high sensitivity for shockable rhythms, the specificity for nonshockable rhythms is still too low. This paper introduces a new approach to rhythm analysis during CPR that combines two strategies: a state-of-the-art CPR artifact suppression filter and a shock advice algorithm (SAA designed to optimally classify the filtered signal. Emphasis is on designing an algorithm with high specificity. The SAA includes a detector for low electrical activity rhythms to increase the specificity, and a shock/no-shock decision algorithm based on a support vector machine classifier using slope and frequency features. For this study, 1185 shockable and 6482 nonshockable 9-s segments corrupted by CPR artifacts were obtained from 247 patients suffering out-of-hospital cardiac arrest. The segments were split into a training and a test set. For the test set, the sensitivity and specificity for rhythm analysis during CPR were 91.0% and 96.6%, respectively. This new approach shows an important increase in specificity without compromising the sensitivity when compared to previous studies.
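
    The sensitivity and specificity figures quoted above follow from simple counts over the shock/no-shock decisions. A minimal sketch, with toy labels in place of the study's 9-s ECG segments:

```python
def shock_advice_metrics(decisions, truths):
    """Sensitivity = shockable rhythms correctly advised a shock;
    specificity = nonshockable rhythms correctly given no-shock advice."""
    tp = sum(d and t for d, t in zip(decisions, truths))
    tn = sum((not d) and (not t) for d, t in zip(decisions, truths))
    fn = sum((not d) and t for d, t in zip(decisions, truths))
    fp = sum(d and (not t) for d, t in zip(decisions, truths))
    return tp / (tp + fn), tn / (tn + fp)

# Toy labels: True = shockable rhythm; decision True = advise shock.
truths    = [True, True, True, True, False, False, False, False, False]
decisions = [True, True, True, False, False, False, False, False, True]
sens, spec = shock_advice_metrics(decisions, truths)
print(round(sens, 2), round(spec, 2))   # 0.75 0.8
```

    In the study's terms, raising specificity means reducing the false-shock count on CPR-corrupted nonshockable segments without letting the false-negative count on shockable segments grow.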

  6. A Reliable Method for Rhythm Analysis during Cardiopulmonary Resuscitation

    Science.gov (United States)

    Ayala, U.; Irusta, U.; Ruiz, J.; Eftestøl, T.; Kramer-Johansen, J.; Alonso-Atienza, F.; Alonso, E.; González-Otero, D.

    2014-01-01

    Interruptions in cardiopulmonary resuscitation (CPR) compromise defibrillation success. However, CPR must be interrupted to analyze the rhythm because although current methods for rhythm analysis during CPR have high sensitivity for shockable rhythms, the specificity for nonshockable rhythms is still too low. This paper introduces a new approach to rhythm analysis during CPR that combines two strategies: a state-of-the-art CPR artifact suppression filter and a shock advice algorithm (SAA) designed to optimally classify the filtered signal. Emphasis is on designing an algorithm with high specificity. The SAA includes a detector for low electrical activity rhythms to increase the specificity, and a shock/no-shock decision algorithm based on a support vector machine classifier using slope and frequency features. For this study, 1185 shockable and 6482 nonshockable 9-s segments corrupted by CPR artifacts were obtained from 247 patients suffering out-of-hospital cardiac arrest. The segments were split into a training and a test set. For the test set, the sensitivity and specificity for rhythm analysis during CPR were 91.0% and 96.6%, respectively. This new approach shows an important increase in specificity without compromising the sensitivity when compared to previous studies. PMID:24895621

  7. Probability maps as a measure of reliability for intervisibility analysis

    Directory of Open Access Journals (Sweden)

    Joksić Dušan

    2005-01-01

    Full Text Available Digital terrain models (DTMs) represent segments of spatial databases related to the presentation of terrain features and landforms. Square grid elevation models (DEMs) have emerged as the most widely used structure during the past decade because of their simplicity and simple computer implementation. They have become an important segment of Topographic Information Systems (TIS), storing natural and artificial landscapes in the form of digital models. This kind of data structure is especially suitable for morphometric terrain evaluation and analysis, which is very important in environmental and urban planning and Earth surface modeling applications. One of the most often used functionalities of geographical information system software packages is intervisibility, or viewshed, analysis of terrain. Intervisibility determination from analog topographic maps may be very exhausting because of the large number of profiles that have to be extracted and compared. Terrain representation in the form of DEM databases facilitates this task. This paper describes a simple algorithm for terrain viewshed analysis using DEM database structures, taking into consideration the influence of the uncertainties of such data on the results obtained. The concept of probability maps is introduced as a means for the evaluation of results, and is presented as a thematic display.
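
    The profile-comparison idea behind viewshed analysis reduces to a running-maximum slope test along each profile extracted from the DEM. The sketch below uses a made-up elevation profile and observer height, and it ignores the elevation uncertainty whose propagation motivates the probability maps.

```python
def visible(profile, observer_h=1.7):
    """Line-of-sight along a terrain profile: cell i is visible from cell 0
    if its sightline slope is not below the maximum slope of any
    intervening cell."""
    eye = profile[0] + observer_h
    out = [True] * len(profile)
    max_slope = float("-inf")
    for i in range(1, len(profile)):
        slope = (profile[i] - eye) / i      # slope per cell of spacing
        out[i] = slope >= max_slope
        max_slope = max(max_slope, slope)
    return out

# Elevations (m) along a hypothetical DEM profile row
prof = [100, 102, 110, 104, 103, 115, 101]
print(visible(prof))
```

    Repeating this test over profiles radiating from the observer cell yields the viewshed; rerunning it with perturbed elevations and counting how often each cell stays visible yields exactly the per-cell probabilities the record proposes mapping.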

  8. Finite State Machine Based Evaluation Model for Web Service Reliability Analysis

    CERN Document Server

    M, Thirumaran; Abarna, S; P, Lakshmi

    2011-01-01

    Nowadays there is great pressure to make changes within ever shorter times, since reaction-time requirements keep decreasing. The Business Logic Evaluation Model (BLEM) is the proposed solution targeting business logic automation, enabling business experts to write sophisticated business rules and complex calculations without costly custom programming. BLEM is powerful enough to handle service manageability issues by analyzing and evaluating the computability, traceability and other criteria of modified business logic at run time. Web service quality of service (QoS) depends heavily on the reliability of the service. Hence today's service providers regard reliability as the major factor, and any problem in the reliability of the service should be overcome promptly in order to achieve the expected reliability level. In our paper we propose a business logic evaluation model for web service reliability analysis using a Finite State Machine (FSM), where the FSM will be extended to analy...

  9. Reliability analysis and risk-based methods for planning of operation & maintenance of offshore wind turbines

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    2017-01-01

    Reliability analysis and probabilistic models for wind turbines are considered with special focus on structural components and application for reliability-based calibration of partial safety factors. The main design load cases to be considered in design of wind turbine components are presented...... including the effects of the control system and possible faults due to failure of electrical / mechanical components. Considerations are presented on the target reliability level for wind turbine structural components. Application is shown for reliability-based calibrations of partial safety factors...... for extreme and fatigue limit states. Operation & Maintenance planning often follows corrective and preventive strategies based on information from condition monitoring and structural health monitoring systems. A reliability- and risk-based approach is presented where a life-cycle approach......

  10. Reliability analysis of M/G/1 queues with general retrial times and server breakdowns

    Institute of Scientific and Technical Information of China (English)

    WANG Jinting

    2006-01-01

    This paper concerns the reliability issues as well as queueing analysis of M/G/1 retrial queues with general retrial times and server subject to breakdowns and repairs. We assume that the server is unreliable and customers who find the server busy or down are queued in the retrial orbit in accordance with a first-come-first-served discipline. Only the customer at the head of the orbit queue is allowed for access to the server. The necessary and sufficient condition for the system to be stable is given. Using a supplementary variable method, we obtain the Laplace-Stieltjes transform of the reliability function of the server and a steady state solution for both queueing and reliability measures of interest. Some main reliability indexes, such as the availability, failure frequency, and the reliability function of the server, are obtained.

  11. Fatigue damage reliability analysis for Nanjing Yangtze river bridge using structural health monitoring data

    Institute of Scientific and Technical Information of China (English)

    HE Xu-hui; CHEN Zheng-qing; YU Zhi-wu; HUANG Fang-lin

    2006-01-01

    To evaluate the fatigue damage reliability of critical members of the Nanjing Yangtze river bridge, the corresponding expressions for calculating the structural fatigue damage reliability were derived according to the stress-number (S-N) curve and Miner's rule. Fatigue damage reliability analysis of some critical members of the Nanjing Yangtze river bridge was carried out using the strain-time histories measured by the structural health monitoring system of the bridge. The corresponding stress spectra were obtained by the real-time rain-flow counting method. Fatigue damage results were calculated by the reliability method at different reliability levels and compared with Miner's rule. The results show that the fatigue damage of critical members of the Nanjing Yangtze river bridge is very small due to its low live-load stress level.
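
    Once rain-flow counting has produced a stress spectrum, Miner's damage sum is a one-liner. The S-N constants and the spectrum below are illustrative placeholder values, not data from the Nanjing bridge monitoring system.

```python
def miner_damage(stress_cycles, C=2.0e12, m=3.0):
    """Palmgren-Miner cumulative damage D = sum(n_i / N_i), with the S-N
    curve N = C / S**m giving cycles-to-failure at stress range S (MPa).
    C and m are illustrative detail-category values."""
    return sum(n / (C / s ** m) for s, n in stress_cycles)

# (stress range in MPa, annual cycle count) pairs from a rain-flow count
spectrum = [(20.0, 2.0e6), (40.0, 2.0e5), (80.0, 1.0e4)]
d_year = miner_damage(spectrum)
print(round(d_year, 4), round(1.0 / d_year, 1))   # annual damage, life in years
```

    Failure is predicted when the accumulated damage reaches 1 (or a reliability-calibrated threshold below 1); a low live-load stress level shrinks every S**m term, which is why the record finds very small damage.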

  12. An Empirical Study Based on the SPSS Variance Analysis of College Teachers' Sports Participation and Satisfaction

    OpenAIRE

    Yunqiu Liang

    2013-01-01

    This empirical study of the relationship between university teachers' sports participation and their job satisfaction mainly takes groups' participation in sports activities as the object of study, with the survey data analyzed statistically in SPSS. Results show that job satisfaction is higher in groups that participate in sports than in groups that do not; with different levels of sports participation, job satisfaction also differs. Recommendations for college teachers to address...

  13. Establishment of Grain Farmers' Supply Response Model and Empirical Analysis under Minimum Grain Purchase Price Policy

    OpenAIRE

    Zhang, Shuang

    2012-01-01

    Based on farmers' supply behavior theory and price expectations theory, this paper establishes a supply response model for grain farmers covering two major grain varieties (early indica rice and mixed wheat) in the major producing areas, to test whether the minimum grain purchase price policy can have a price-oriented effect on grain production and supply in the major producing areas. Empirical analysis shows that the minimum purchase price published annually by the government has a significant positive imp...

  14. EMPIRICAL VERIFICATION OF ANISOTROPIC HYDRODYNAMIC TRAFFIC MODEL IN TRAFFIC ANALYSIS AT INTERSECTIONS IN URBAN AREA

    Institute of Scientific and Technical Information of China (English)

    WEI Yan-fang; GUO Si-ling; XUE Yu

    2007-01-01

    In this article, the traffic hydrodynamic model considering the driver's reaction time was applied to traffic analysis at intersections on real roads. In the numerical simulation with the model, the pinch effect of the right-turning vehicle flow was found, which mainly leads to traffic jamming in the straight lane. All of the results, in accordance with the empirical data, confirm the applicability of this model.

  15. SEARCH COST REDUCTION INCREASES VARIATION IN HOTELS OCCUPANCY RATE: A THEORETICAL AND EMPIRICAL ANALYSIS

    OpenAIRE

    Marianna Succurro; Federico Boffa

    2010-01-01

    This study explores how direct online booking affects the variation in hotel bed-places occupancy rate between peak and off-peak periods, thereby contributing to three strands of literature, respectively the determinants of seasonality, the tourist information acquisition process and the impact of the internet on tourism. The empirical analysis, covering 18 countries over the 1997-2007 period, investigates the impact of an increase in the use of the internet by consumers on the seasonal varia...

  16. Strategy for Synthesis of Flexible Heat Exchanger Networks Embedded with System Reliability Analysis

    Institute of Scientific and Technical Information of China (English)

    YI Dake; HAN Zhizhong; WANG Kefeng; YAO Pingjing

    2013-01-01

    System reliability can strongly influence the performance of a heat exchanger network (HEN). In this paper, an optimization method with system reliability analysis for flexible HENs using genetic/simulated annealing algorithms (GA/SA) is presented. Initial flexible arrangements of the HEN are obtained from the pseudo-temperature enthalpy diagram. To determine the system reliability of the HEN, the connections of heat exchangers (HEXs) and independent subsystems in the HEN are analyzed by the connection sequence matrix (CSM), and the system reliability is measured by the independent subsystem containing the maximum number of HEXs in the HEN. For a HEN that does not meet the system reliability criterion, HEN decoupling is applied: the independent subsystems in the HEN are changed by removing the decoupling HEX, and thus the system reliability is raised. After that, heat duty redistribution based on the relevant elements of the heat load loops and the HEX areas is optimized in GA/SA. Then the favorable network configuration, which matches both the most economical cost and the system reliability criterion, is located. Moreover, particular features of suitable decoupling HEXs are extracted from the calculations. A numerical example is presented to verify that the proposed strategy is effective in formulating an optimal flexible HEN with system reliability measurement.

  17. The Revised Child Anxiety and Depression Scale: A systematic review and reliability generalization meta-analysis.

    Science.gov (United States)

    Piqueras, Jose A; Martín-Vivar, María; Sandin, Bonifacio; San Luis, Concepción; Pineda, David

    2017-08-15

    Anxiety and depression are among the most common mental disorders during childhood and adolescence. Among the instruments for the brief screening assessment of symptoms of anxiety and depression, the Revised Child Anxiety and Depression Scale (RCADS) is one of the more widely used. Previous studies have demonstrated the reliability of the RCADS for different assessment settings and different versions. The aims of this study were to examine the mean reliability of the RCADS and the influence of the moderators on the RCADS reliability. We searched in EBSCO, PsycINFO, Google Scholar, Web of Science, and NCBI databases and other articles manually from lists of references of extracted articles. A total of 146 studies were included in our meta-analysis. The RCADS showed robust internal consistency reliability in different assessment settings, countries, and languages. We only found that reliability of the RCADS was significantly moderated by the version of RCADS. However, these differences in reliability between different versions of the RCADS were slight and can be due to the number of items. We did not examine factor structure, factorial invariance across gender, age, or country, and test-retest reliability of the RCADS. The RCADS is a reliable instrument for cross-cultural use, with the advantage of providing more information with a low number of items in the assessment of both anxiety and depression symptoms in children and adolescents.

  18. A Reliability-Based Analysis of Bicyclist Red-Light Running Behavior at Urban Intersections

    Directory of Open Access Journals (Sweden)

    Mei Huan

    2015-01-01

    Full Text Available This paper describes the red-light running behavior of bicyclists at urban intersections based on a reliability analysis approach. Bicyclists' crossing behavior was collected by video recording. Four proportional hazard models using the Cox, exponential, Weibull, and Gompertz distributions were proposed to analyze the covariate effects on safety crossing reliability. The influential variables include personal characteristics, movement information, and situation factors. The results indicate that the Cox hazard model gives the best description of bicyclists' red-light running behavior. Bicyclists' safety crossing reliabilities decrease as their waiting times increase. About 15.5% of bicyclists have negligible waiting times; they are at high risk of red-light running and have very low safety crossing reliabilities. The proposed reliability models can capture the covariates' effects on bicyclists' crossing behavior at signalized intersections. Both personal characteristics and traffic conditions have significant effects on bicyclists' safety crossing reliability. A bicyclist is more likely to have low safety crossing reliability and high violation risk when more riders are crossing against the red light and when the bicyclist waits closer to the motorized lane. These findings provide valuable insights into understanding bicyclists' violation behavior, and their implications for assessing bicyclists' safety crossing reliability are discussed.
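
The parametric-hazard idea above can be illustrated with a Weibull survival function for the waiting time before a rider crosses against the red. The shape and scale parameters below are invented for illustration, not the values fitted in the study.

```python
import math

# Hedged sketch: probability that a bicyclist is still waiting (has not
# yet run the red light) after t seconds, modeled with a Weibull survival
# function.  Shape k and scale lam are illustrative, not fitted values.
def weibull_survival(t, k=1.5, lam=40.0):
    return math.exp(-((t / lam) ** k))

for t in (10, 30, 60):
    print(f"P(still waiting at {t:2d} s) = {weibull_survival(t):.3f}")
```

The monotone decrease with waiting time mirrors the paper's finding that safety crossing reliability drops as waiting time grows; a full proportional-hazards model would scale this baseline by covariate effects.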

  19. Report on the analysis of field data relating to the reliability of solar hot water systems.

    Energy Technology Data Exchange (ETDEWEB)

    Menicucci, David F. (Building Specialists, Inc., Albuquerque, NM)

    2011-07-01

    Utilities are overseeing the installations of thousands of solar hot water (SHW) systems. Utility planners have begun to ask for quantitative measures of the expected lifetimes of these systems so that they can properly forecast their loads. This report, which augments a 2009 reliability analysis effort by Sandia National Laboratories (SNL), addresses this need. Additional reliability data have been collected, added to the existing database, and analyzed. The results are presented. Additionally, formal reliability theory is described, including the bathtub curve, which is the most common model for characterizing the lifetime reliability of systems and for predicting failures in the field. Reliability theory is used to assess the SNL reliability database. This assessment shows that the database is heavily weighted with data that describe the reliability of SHW systems early in their lives, during the warranty period. But it contains few measured data that describe the ends of SHW systems' lives. End-of-life data are the most critical for sufficiently defining the reliability of SHW systems in order to answer the questions that the utilities pose. Several ideas are presented for collecting the required data, including photometric analysis of aerial photographs of installed collectors, statistical and neural network analysis of energy bills from solar homes, and the development of simple algorithms to allow conventional SHW controllers to announce system failures and record the details of the event, similar to how aircraft black box recorders perform. Some information is also presented about public expectations for the longevity of a SHW system, information that is useful in developing reliability goals.
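
The bathtub curve mentioned above is commonly pieced together from Weibull hazard functions. The sketch below, with made-up scale and shape values, shows the three regimes: shape k < 1 gives a decreasing (infant-mortality) hazard, k = 1 a constant useful-life hazard, and k > 1 an increasing wear-out hazard.

```python
# Hedged sketch: Weibull hazard h(t) = (k/lam) * (t/lam)**(k-1) for the
# three phases of the bathtub curve.  lam and the k values are illustrative.
def weibull_hazard(t, k, lam=10.0):
    return (k / lam) * (t / lam) ** (k - 1)

for k, phase in ((0.5, "infant mortality"), (1.0, "useful life"), (3.0, "wear-out")):
    print(phase, [round(weibull_hazard(t, k), 4) for t in (1, 5, 9)])
```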

  20. Reliability and life-cycle analysis of deteriorating systems

    CERN Document Server

    Sánchez-Silva, Mauricio

    2016-01-01

    This book compiles and critically discusses modern engineering system degradation models and their impact on engineering decisions. In particular, the authors focus on modeling the uncertain nature of degradation, considering both conceptual discussions and formal mathematical formulations. It also describes the basic concepts and the various modeling aspects of life-cycle analysis (LCA). It highlights the role of degradation in LCA and defines optimum design and operation parameters. Given the relationship between operational decisions and the performance of the system's condition over time, maintenance models are also discussed. The concepts and models presented have applications in a large variety of engineering fields such as Civil, Environmental, Industrial, Electrical and Mechanical engineering. However, special emphasis is given to problems related to large infrastructure systems. The book is intended to be used both as a reference resource for researchers and practitioners and as an academic text ...

  1. Reliability Analysis of Repairable Systems Using Stochastic Point Processes

    Institute of Scientific and Technical Information of China (English)

    TAN Fu-rong; JIANG Zhi-bin; BAI Tong-shuo

    2008-01-01

    In order to analyze the failure data from repairable systems, the homogeneous Poisson process (HPP) is usually used. In general, HPP cannot be applied to analyze the entire life cycle of a complex, repairable system because the rate of occurrence of failures (ROCOF) of the system changes over time rather than remains stable. However, from a practical point of view, it is always preferred to apply the simplest method to address problems and to obtain useful practical results. Therefore, we attempted to use the HPP model to analyze the failure data from real repairable systems. A graphic method and the Laplace test were also used in the analysis. Results of numerical applications show that the HPP model may be a useful tool for the entire life cycle of repairable systems.
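
The Laplace test mentioned above checks whether failure times show a trend; under an HPP the statistic is approximately standard normal. The failure times and observation window below are illustrative, not data from the paper.

```python
import math

# Hedged sketch: Laplace trend test for a time-truncated observation
# window (0, T].  Under an HPP (no trend), U is approximately N(0, 1);
# |U| < 1.96 means no significant trend at the 5% level.
def laplace_u(failure_times, observation_end):
    n = len(failure_times)
    return (sum(failure_times) / n - observation_end / 2.0) / (
        observation_end * math.sqrt(1.0 / (12.0 * n)))

times = [55, 166, 205, 341, 488, 567, 731, 1308, 2050, 2453]  # hours (invented)
u = laplace_u(times, observation_end=2500.0)
print(f"U = {u:.2f}  (|U| < 1.96 -> HPP assumption not rejected at 5%)")
```

A clearly negative U would suggest reliability growth (failures early) and a positive U deterioration; values inside the band support using the simple HPP model as the abstract argues.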

  2. Mechanical system reliability analysis using a combination of graph theory and Boolean function

    Energy Technology Data Exchange (ETDEWEB)

    Tang, J

    2001-04-01

    A new method based on graph theory and Boolean functions for assessing the reliability of mechanical systems is proposed. The procedure for this approach consists of two parts. Using graph theory, the formula for the reliability of a mechanical system that considers the interrelations of subsystems or components is generated. Using the Boolean function to examine the failure interactions of two particular elements of the system, and demonstrating how to incorporate such failure dependencies into the analysis of larger systems, a constructive algorithm for quantifying the genuine interconnections between the subsystems or components is provided. The combination of graph theory and Boolean functions provides an effective way to evaluate the reliability of a large, complex mechanical system. A numerical example demonstrates that this method is an effective approach to system reliability analysis.
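
The Boolean-function idea can be sketched by enumerating component up/down states, evaluating a Boolean structure function, and summing the probabilities of the working states. The series-parallel structure function and component reliabilities below are illustrative, not the paper's example; this brute-force enumeration also ignores the failure dependencies the paper handles.

```python
from itertools import product

# Hedged sketch: exact system reliability from a Boolean structure
# function by enumerating all 2**n independent component states.
p = {"a": 0.9, "b": 0.8, "c": 0.85, "d": 0.95}  # illustrative reliabilities

def system_works(s):
    # structure function: (a parallel b) in series with (c parallel d)
    return (s["a"] or s["b"]) and (s["c"] or s["d"])

names = list(p)
R = 0.0
for states in product((True, False), repeat=len(names)):
    s = dict(zip(names, states))
    prob = 1.0
    for n in names:
        prob *= p[n] if s[n] else 1.0 - p[n]
    if system_works(s):
        R += prob
print(f"system reliability R = {R:.5f}")
```

For this structure the closed form is (1 - 0.1 * 0.2) * (1 - 0.15 * 0.05), which the enumeration reproduces; the graph-theoretic part of the paper is precisely about generating such formulas without full enumeration.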

  3. Analysis and Application of Mechanical System Reliability Model Based on Copula Function

    Directory of Open Access Journals (Sweden)

    An Hai

    2016-10-01

    Full Text Available There are complicated correlations in mechanical systems. Exploiting the advantages of the copula function in handling such correlations, this paper proposes a mechanical system reliability model based on the copula function, makes a detailed study of the serial and parallel mechanical system models, and obtains their respective reliability functions. Finally, application research is carried out on the serial mechanical system reliability model, and its validity is proved by example. Using copula theory for mechanical system reliability modeling, and studying the distribution of the random variables (the marginal distributions of the mechanical products' lives) and the associated structure of the variables separately, can reduce the difficulty of multivariate probabilistic modeling and analysis and make the modeling and analysis process clearer.
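
The separation of margins and dependence can be sketched with a two-component series system whose marginal survival probabilities are coupled through a Clayton copula. The copula family, the dependence parameter theta, and the margins are all illustrative choices, not the model from the paper.

```python
# Hedged sketch: joint survival of a two-component series system.  The
# marginal survival probabilities R1, R2 are combined with a Clayton
# copula; theta > 0 encodes positive dependence between the lifetimes.
def clayton(u, v, theta=2.0):
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

R1, R2 = 0.95, 0.90
independent = R1 * R2          # the usual independence assumption
dependent = clayton(R1, R2)    # positive dependence raises joint survival
print(f"independent: {independent:.4f}  Clayton-coupled: {dependent:.4f}")
```

The gap between the two numbers is the point of the copula approach: assuming independence when component lives are positively correlated understates series-system reliability.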

  4. Technology development of maintenance optimization and reliability analysis for safety features in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Tae Woon; Choi, Seong Soo; Lee, Dong Gue; Kim, Young Il

    1999-12-01

    The reliability data management system (RDMS) for safety systems of PHWR-type plants has been developed and utilized in the reliability analysis of the special safety systems of Wolsong Units 1 and 2 as the plant overhaul period was lengthened. The RDMS was developed for periodic, efficient reliability analysis of the safety systems of Wolsong Units 1 and 2. In addition, this system provides the function of analyzing the effects on safety system unavailability if the test period of a test procedure changes, as well as the function of optimizing the test periods of safety-related test procedures. The RDMS can be utilized to respond actively to requests from the regulatory body with regard to the reliability validation of safety systems. (author)

  5. Methodological Approach for Performing Human Reliability and Error Analysis in Railway Transportation System

    Directory of Open Access Journals (Sweden)

    Fabio De Felice

    2011-10-01

    Full Text Available Today, billions of dollars are being spent annually worldwide to develop, manufacture, and operate transportation systems such as trains, ships, aircraft, and motor vehicles. Around 70 to 90 percent of transportation crashes are, directly or indirectly, the result of human error. In fact, with the development of technology, system reliability has increased dramatically during the past decades, while human reliability has remained unchanged over the same period. Accordingly, human error is now considered the most significant source of accidents or incidents in safety-critical systems. The aim of this paper is to propose a methodological approach to improve transportation system reliability, and in particular railway transportation system reliability. The methodology presented is based on Failure Modes, Effects and Criticality Analysis (FMECA) and Human Reliability Analysis (HRA).
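
The FMECA step in such a methodology typically ranks failure modes by a Risk Priority Number. The sketch below shows the standard RPN = severity x occurrence x detection ranking; the railway failure modes and their 1-10 ratings are invented for illustration.

```python
# Hedged sketch of a FMECA ranking step.  Each entry is
# (failure mode, severity, occurrence, detection), all ratings invented.
failure_modes = [
    ("signal misread by driver",  9, 4, 3),
    ("brake actuator degraded",   8, 3, 5),
    ("track circuit false clear", 10, 2, 6),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"RPN {s * o * d:4d}  {name}")
```

Modes at the top of the list would then be passed to the HRA stage when the dominant cause is a human action rather than a hardware fault.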

  6. Tensile reliability analysis for gravity dam foundation surface based on FEM and response surface method

    Institute of Scientific and Technical Information of China (English)

    Tong-chun LI; Dan-dan LI; Zhi-qiang WANG

    2010-01-01

    In this study, the limit state equation for tensile reliability analysis of the foundation surface of a gravity dam was established. The possible crack length was set as the action effect and the allowable crack length was set as the resistance in the limit state. The nonlinear FEM was used to obtain the crack length of the foundation surface of the gravity dam, and the linear response surface method based on the orthogonal test design method was used to calculate the reliability, providing a reasonable and simple method for calculating the reliability of the serviceability limit state. The Longtan RCC gravity dam was chosen as an example. An orthogonal test, including eleven factors and two levels, was conducted, and the tensile reliability was calculated. The analysis shows that this method is reasonable.

  7. Analysis of whisker-toughened CMC structural components using an interactive reliability model

    Science.gov (United States)

    Duffy, Stephen F.; Palko, Joseph L.

    1992-01-01

    Realizing wider utilization of ceramic matrix composites (CMC) requires the development of advanced structural analysis technologies. This article focuses on the use of interactive reliability models to predict component probability of failure. The deterministic William-Warnke failure criterion serves as the theoretical basis for the reliability model presented here. The model has been implemented into a test-bed software program. This computer program has been coupled to a general-purpose finite element program. A simple structural problem is presented to illustrate the reliability model and the computer algorithm.

  8. Reliability analysis of tunnel surrounding rock stability by Monte-Carlo method

    Institute of Scientific and Technical Information of China (English)

    XI Jia-mi; YANG Geng-she

    2008-01-01

    The advantages of the improved Monte-Carlo method and the feasibility of applying the proposed approach to the reliability analysis of tunnel surrounding rock stability are discussed. On the basis of a deterministic analysis of the tunnel surrounding rock, a reliability computing method for surrounding rock stability was derived from the improved Monte-Carlo method. The computing method treats the related parameters as random variables and accounts for the correlations among them. The proposed method can reasonably determine the reliability of surrounding rock stability. Calculation results show that this method is a scientific approach to discriminating and checking surrounding rock stability.
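
The core Monte-Carlo step can be sketched as sampling a limit-state function g = R - S (capacity minus demand) and estimating the failure probability as the fraction of samples with g < 0. The normal distributions and their parameters below are illustrative placeholders, not rock-mechanics data, and the sketch uses plain (not improved) Monte Carlo sampling.

```python
import math
import random

# Hedged sketch: crude Monte-Carlo estimate of P_f = P(R - S < 0) with
# independent normal capacity R and demand S.  All parameters invented.
random.seed(42)

def pf_estimate(n=200_000, mu_r=100.0, sd_r=10.0, mu_s=70.0, sd_s=12.0):
    fails = sum(
        1 for _ in range(n)
        if random.gauss(mu_r, sd_r) - random.gauss(mu_s, sd_s) < 0.0
    )
    return fails / n

pf = pf_estimate()
beta = (100.0 - 70.0) / math.sqrt(10.0**2 + 12.0**2)  # exact reliability index
pf_exact = 0.5 * math.erfc(beta / math.sqrt(2.0))
print(f"estimated P_f = {pf:.4f}  (exact for this toy case: {pf_exact:.4f})")
```

For this linear-normal toy case the exact answer is available as a check; the value of the Monte-Carlo approach is that it still works when the limit state is nonlinear and the parameters are correlated, as in the tunnel problem.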

  9. Latency Analysis of Systems with Multiple Interfaces for Ultra-Reliable M2M Communication

    DEFF Research Database (Denmark)

    Nielsen, Jimmy Jessen; Popovski, Petar

    2016-01-01

    One of the ways to satisfy the requirements of ultra-reliable low latency communication for mission critical Machine-type Communications (MTC) applications is to integrate multiple communication interfaces. In order to estimate the performance in terms of latency and reliability of such an integrated communication system, we propose an analysis framework that combines traditional reliability models with technology-specific latency probability distributions. In our proposed model we demonstrate how failure correlation between technologies can be taken into account. We show for the considered...
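
The interface-integration idea can be sketched in its simplest baseline form: if each interface independently delivers within the deadline with probability p_i, using all of them in parallel gives 1 - prod(1 - p_i). Note that the paper's contribution is precisely to go beyond this by modeling failure correlation between technologies, which this independence sketch ignores; the per-interface probabilities are invented.

```python
# Hedged sketch: deadline-meeting probability of parallel independent
# interfaces.  Independence is an optimistic simplification; the cited
# framework also handles correlated failures.
def combined_reliability(p_list):
    miss = 1.0
    for p in p_list:
        miss *= 1.0 - p  # probability that every interface misses the deadline
    return 1.0 - miss

lte, wifi = 0.999, 0.99  # illustrative per-interface success probabilities
print(f"LTE alone: {lte}  LTE+WiFi: {combined_reliability([lte, wifi]):.6f}")
```

Correlated failures (e.g. a common power outage) make the true combined reliability worse than this independent-case figure, which is why the correlation modeling matters.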

  10. Reliability and error analysis on xenon/CT CBF

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Z. [Diversified Diagnostic Products, Inc., Houston, TX (United States)

    2000-02-01

    This article provides a quantitative error analysis of a simulation model of xenon/CT CBF in order to investigate the behavior and effect of different types of errors, such as CT noise, motion artifacts, lower percentage of xenon supply, lower tissue enhancements, etc. A mathematical model is built to simulate these errors. By adjusting the initial parameters of the simulation model, we can scale the Gaussian noise, control the percentage of xenon supply, and change the tissue enhancement with different kVp settings. The motion artifact is treated separately by geometrically shifting the sequential CT images. The input function is chosen from an end-tidal xenon curve of a practical study. Four levels of cerebral blood flow, 10, 20, 50, and 80 cc/100 g/min, are examined under different error environments, and the corresponding CT images are generated following the currently popular timing protocol. The simulated studies are fed to a regular xenon/CT CBF system for calculation and evaluation. A quantitative comparison is given to reveal the behavior and effect of individual error sources. Mixed error testing is also provided to inspect the combined effect of errors. The experiment shows that CT noise is still a major error source. The motion artifact affects the CBF results more geometrically than quantitatively. Lower xenon supply has a lesser effect on the results, but will reduce the signal/noise ratio. Lower xenon enhancement will lower the flow values in all areas of the brain. (author)

  11. Reliability of Foundation Pile Based on Settlement and a Parameter Sensitivity Analysis

    OpenAIRE

    Shujun Zhang; Luo Zhong; Zhijun Xu

    2016-01-01

    Based on an uncertainty analysis of the settlement calculation model, the formula for the reliability index of a foundation pile is derived. Based on this formula, the influence on reliability of the coefficient of variation of the calculated settlement at the pile head, the coefficient of variation of the permissible settlement limit, the coefficient of variation of the measured settlement, the safety coefficient, and the mean value of the calculation model coefficient is analyzed. The results indicate that (1) hig...

  12. The revised NEUROGES-ELAN system: An objective and reliable interdisciplinary analysis tool for nonverbal behavior and gesture.

    Science.gov (United States)

    Lausberg, Hedda; Sloetjes, Han

    2016-09-01

    As visual media spread to all domains of public and scientific life, nonverbal behavior is taking its place as an important form of communication alongside the written and spoken word. An objective and reliable method of analysis for hand movement behavior and gesture is therefore currently required in various scientific disciplines, including psychology, medicine, linguistics, anthropology, sociology, and computer science. However, no adequate common methodological standards have been developed thus far. Many behavioral gesture-coding systems lack objectivity and reliability, and automated methods that register specific movement parameters often fail to show validity with regard to psychological and social functions. To address these deficits, we have combined two methods, an elaborated behavioral coding system and an annotation tool for video and audio data. The NEUROGES-ELAN system is an effective and user-friendly research tool for the analysis of hand movement behavior, including gesture, self-touch, shifts, and actions. Since its first publication in 2009 in Behavior Research Methods, the tool has been used in interdisciplinary research projects to analyze a total of 467 individuals from different cultures, including subjects with mental disease and brain damage. Partly on the basis of new insights from these studies, the system has been revised methodologically and conceptually. The article presents the revised version of the system, including a detailed study of reliability. The improved reproducibility of the revised version makes NEUROGES-ELAN a suitable system for basic empirical research into the relation between hand movement behavior and gesture and cognitive, emotional, and interactive processes and for the development of automated movement behavior recognition methods.

  13. Investigation of Common Symptoms of Cancer and Reliability Analysis

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Objective: To identify cancer distribution and treatment requirements, a questionnaire on cancer patients was conducted. It was our objective to validate a series of symptoms commonly used in traditional Chinese medicine (TCM). Methods: The M. D. Anderson Symptom Assessment Inventory (MDASI) was used with 10 more TCM items added. Questions regarding TCM application requested in cancer care were also asked. A multi-center, cross-sectional study was conducted in 340 patients from 4 hospitals in Beijing and Dalian. SPSS and Excel software were adopted for statistical analysis. The questionnaire was self-evaluated with the Cronbach's alpha score. Results: The most common symptoms were fatigue 89.4%, sleep disturbance 74.4%, dry mouth 72.9%, poor appetite 72.9%, and difficulty remembering 71.2%. These symptoms affected work (89.8%), mood (82.6%), and activity (76.8%), resulting in poor quality of life. Eighty percent of the patients wanted to regulate the body with TCM. Almost 100% of the patients were interested in acquiring knowledge regarding the integration of traditional Chinese medicine (TCM) and Western medicine (WM) in the treatment and rehabilitation of cancer. Cronbach's alpha score indicated that there was acceptable internal consistency within both the MDASI and TCM items, 0.86 for MDASI, 0.78 for TCM, and 0.90 for MDASI-TCM (23 items). Conclusions: Fatigue, sleep disturbance, dry mouth, poor appetite, and difficulty remembering are the most common symptoms in cancer patients. These greatly affect the quality of life for these patients. Patients expressed a strong desire for TCM holistic regulation. The MDASI and its TCM-adapted model could be a critical tool for the quantitative study of TCM symptoms.
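
Cronbach's alpha, the internal-consistency statistic reported above (0.86 for MDASI, 0.78 for the TCM items), can be computed from a respondents-by-items score matrix. The 4-item, 6-respondent matrix below is invented for illustration and does not reproduce the study's values.

```python
# Hedged sketch: Cronbach's alpha
#   alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
# computed on an invented score matrix (one list of scores per item).
def cronbach_alpha(items):
    k = len(items)          # number of items
    n = len(items[0])       # number of respondents

    def var(xs):            # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1.0 - sum(var(i) for i in items) / var(totals))

scores = [
    [3, 4, 3, 5, 4, 4],
    [2, 4, 3, 5, 4, 3],
    [3, 5, 4, 5, 5, 4],
    [2, 3, 3, 4, 4, 3],
]
print(f"alpha = {cronbach_alpha(scores):.3f}")
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, which is the criterion the abstract applies to the 0.78 TCM subscale.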

  14. Application analysis of empirical mode decomposition and phase space reconstruction in dam time-varying characteristic

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    In view of shortcomings in how time-varying characteristics are processed in the analysis of dam deformation, this paper proposes a new method to analyze the dam time-varying characteristic based on empirical mode decomposition and phase space reconstruction theory. First, to reduce the influence of human factors on the traditional statistical model and to assure the analysis accuracy, response variables of the time-varying characteristic are obtained by empirical mode decomposition; then, a phase plane of those variables is reconstructed to investigate their evolution rules. These methods have already been applied to an actual project, and the results showed that data interpretation with the assistance of empirical mode decomposition and phase space reconstruction is effective in analyzing the perturbations of response variables, explicit in reflecting the entire development process, and valid for obtaining the evolution rules of the time-varying characteristic. This methodology is a powerful technical support for further mastering the rules of dam operation.

  15. Development of phonological awareness in Down syndrome: A meta-analysis and empirical study.

    Science.gov (United States)

    Næss, Kari-Anne B

    2016-02-01

    Phonological awareness (PA) is the knowledge and understanding of the sound structure of language and is believed to be an important skill for the development of reading. This study explored PA skills in children with Down syndrome and matched typically developing (TD) controls using a dual approach: a meta-analysis of the existing international literature and a longitudinal empirical study. The results from both the meta-analysis and the empirical study showed that the children with Down syndrome initially had weaker PA skills compared to the controls; in particular, the awareness of rhyme was delayed. The longitudinal empirical data indicated that, as a result of formal education, the children with Down syndrome exhibited greater improvement on all PA measures compared with the controls who had not yet entered school. The results reach significance for rhyme awareness. With respect to dimensionality, the performance of the children with Down syndrome loaded on 1 factor, whereas the performance of the younger TD controls was multidimensional. In sum, these findings underline the need for studies that compare interventions designed especially to stimulate development of PA in this group of children and to provide insight into the underlying causes of the developmental profile of children with Down syndrome.

  16. Reliability and Validity of Quantitative Video Analysis of Baseball Pitching Motion.

    Science.gov (United States)

    Oyama, Sakiko; Sosa, Araceli; Campbell, Rebekah; Correa, Alexandra

    2017-02-01

    Video recordings are used to quantitatively analyze pitchers' techniques. However, reliability and validity of such analysis is unknown. The purpose of the study was to investigate the reliability and validity of joint and segment angles identified during a pitching motion using video analysis. Thirty high school baseball pitchers participated. The pitching motion was captured using 2 high-speed video cameras and a motion capture system. Two raters reviewed the videos to digitize the body segments to calculate 2-dimensional angles. The corresponding 3-dimensional angles were calculated from the motion capture data. Intrarater reliability, interrater reliability, and validity of the 2-dimensional angles were determined. The intrarater and interrater reliability of the 2-dimensional angles were high for most variables. The trunk contralateral flexion at maximum external rotation was the only variable with high validity. Trunk contralateral flexion at ball release, trunk forward flexion at foot contact and ball release, shoulder elevation angle at foot contact, and maximum shoulder external rotation had moderate validity. Two-dimensional angles at the shoulder, elbow, and trunk could be measured with high reliability. However, the angles are not necessarily anatomically correct, and thus use of quantitative video analysis should be limited to angles that can be measured with good validity.
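
The 2-D digitizing step described above amounts to computing joint angles from landmark coordinates identified in video frames. The sketch below computes an elbow angle from three digitized points; the pixel coordinates are invented for illustration.

```python
import math

# Hedged sketch: 2-D joint angle at vertex b from three digitized
# landmarks a, b, c (pixel coordinates, invented values).
def joint_angle(a, b, c):
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    cross = v1[0] * v2[1] - v1[1] * v2[0]
    return abs(math.degrees(math.atan2(cross, dot)))

shoulder, elbow, wrist = (320, 180), (400, 240), (470, 170)
print(f"elbow angle = {joint_angle(shoulder, elbow, wrist):.1f} deg")
```

The study's validity caveat applies directly here: such a projected 2-D angle equals the anatomical 3-D angle only when the segments lie close to the camera's image plane.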

  17. Stochastic data-flow graph models for the reliability analysis of communication networks and computer systems

    Energy Technology Data Exchange (ETDEWEB)

    Chen, D.J.

    1988-01-01

    The literature is abundant with combinatorial reliability analyses of communication networks and fault-tolerant computer systems. However, it is very difficult to formulate reliability indexes using combinatorial methods. These limitations have led to the development of time-dependent reliability analysis using stochastic processes. In this research, time-dependent reliability-analysis techniques using Dataflow Graphs (DFG) are developed. The chief advantages of DFG models over other models are their compactness, structural correspondence with the systems, and general amenability to direct interpretation. This makes it possible to verify the correspondence of the data-flow graph representation to the actual system. Several DFG models are developed and used to analyze the reliability of communication networks and computer systems. Specifically, Stochastic Dataflow Graphs (SDFG), in both discrete-time and continuous-time versions, are developed and used to compute the time-dependent reliability of communication networks and computer systems. The repair and coverage phenomena of communication networks are also analyzed using SDFG models.
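
The simplest discrete-time stochastic model underlying such repairable-system analyses is a two-state (up/down) Markov chain with per-step failure and repair probabilities; this is a baseline sketch, not the SDFG formalism itself, and the probabilities are invented.

```python
# Hedged sketch: discrete-time two-state Markov chain for one repairable
# unit.  lam = per-step failure probability, mu = per-step repair
# probability (both illustrative); iterate the up-state probability to
# its steady-state value, the availability.
def steady_availability(lam=0.01, mu=0.2, steps=10_000):
    up = 1.0  # start in the up state
    for _ in range(steps):
        up = up * (1.0 - lam) + (1.0 - up) * mu
    return up

a = steady_availability()
print(f"availability = {a:.4f}  (closed form mu/(lam+mu) = {0.2 / 0.21:.4f})")
```

The iteration converges to the closed-form mu/(lam+mu); time-dependent reliability, as in the SDFG work, corresponds to reading off the up-state probability at finite step counts rather than at steady state.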

  18. Multiobject Reliability Analysis of Turbine Blisk with Multidiscipline under Multiphysical Field Interaction

    Directory of Open Access Journals (Sweden)

    Chun-Yi Zhang

    2015-01-01

    Full Text Available To study accurately the influence of the deformation, stress, and strain of a turbine blisk on the performance of an aeroengine, a comprehensive reliability analysis of the turbine blisk with multiple disciplines and multiple objects was performed based on the multiple response surface method (MRSM) and the fluid-thermal-solid coupling technique. Firstly, the basic idea of the MRSM was introduced. Then the mathematical model of the MRSM was established with quadratic polynomials. Finally, the multiple reliability analyses of deformation, stress, and strain of the turbine blisk were completed under multiphysical field coupling by the MRSM, and the comprehensive performance of the turbine blisk was evaluated. The reliability analysis demonstrates that the reliability degrees of the deformation, stress, and strain for the turbine blisk are 0.9942, 0.9935, 0.9954, and 0.9919, respectively, when the allowable deformation, stress, and strain are 3.7 × 10⁻³ m, 1.07 × 10⁹ Pa, and 1.12 × 10⁻² m/m, respectively; besides, the comprehensive reliability degree of the turbine blisk is 0.9919, which basically satisfies the engineering requirement of the aeroengine. The efforts of this paper provide a promising approach for multidiscipline, multiobject reliability analysis.

  19. Segmental analysis of indocyanine green pharmacokinetics for the reliable diagnosis of functional vascular insufficiency

    Science.gov (United States)

    Kang, Yujung; Lee, Jungsul; An, Yuri; Jeon, Jongwook; Choi, Chulhee

    2011-03-01

    Accurate and reliable diagnosis of functional insufficiency of the peripheral vasculature is essential, since Raynaud phenomenon (RP), the most common form of peripheral vascular insufficiency, is commonly associated with systemic vascular disorders. We have previously demonstrated that dynamic imaging of the near-infrared fluorophore indocyanine green (ICG) can be a noninvasive and sensitive tool to measure tissue perfusion. In the present study, we demonstrated that combined analysis of multiple parameters, especially onset time and modified Tmax (the time from the onset of ICG fluorescence to Tmax), can be used as a reliable diagnostic tool for RP. To validate the method, we performed conventional thermographic analysis combined with cold challenge and rewarming, along with ICG dynamic imaging and segmental analysis. A case-control analysis demonstrated that the segmental pattern of ICG dynamics in both hands differed significantly between normal subjects and RP patients, suggesting the possibility of clinical application of this novel method for the convenient and reliable diagnosis of RP.
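A sketch of how the two key parameters, onset time and modified Tmax, might be extracted from a fluorescence time course; the 10%-of-peak detection threshold is an assumption for illustration, not the authors' exact criterion:

```python
import numpy as np

def icg_parameters(t, intensity, baseline_frac=0.1):
    """Onset time and 'modified Tmax' (onset -> peak) from an ICG
    fluorescence curve. baseline_frac is an assumed detection threshold
    (fraction of the peak rise above baseline)."""
    base = intensity[0]
    peak_idx = int(np.argmax(intensity))
    thresh = base + baseline_frac * (intensity[peak_idx] - base)
    onset_idx = int(np.argmax(intensity >= thresh))  # first threshold crossing
    onset_time = t[onset_idx]
    modified_tmax = t[peak_idx] - onset_time
    return onset_time, modified_tmax

# Synthetic perfusion curve: delayed rise to a peak at t = 15, then washout.
t = np.linspace(0, 60, 601)
curve = np.where(t < 5, 0.0, (t - 5) * np.exp(-(t - 5) / 10.0))
onset, mtmax = icg_parameters(t, curve)
print(onset, mtmax)
```

Comparing such parameters segment by segment (finger by finger) is what turns a single perfusion curve into the segmental analysis the abstract describes.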

  20. Canonical Least-Squares Monte Carlo Valuation of American Options: Convergence and Empirical Pricing Analysis

    Directory of Open Access Journals (Sweden)

    Xisheng Yu

    2014-01-01

    Full Text Available The paper by Liu (2010) introduces a method termed canonical least-squares Monte Carlo (CLM), which combines a martingale-constrained entropy model and a least-squares Monte Carlo algorithm to price American options. In this paper, we first provide the convergence results of CLM and numerically examine its convergence properties. Then, a comparative analysis is conducted empirically using a large sample of S&P 100 Index (OEX) puts and IBM puts. The results on convergence show that choosing shifted Legendre polynomials with four regressors is more appropriate considering pricing accuracy and computational cost. With this choice, the CLM method is empirically demonstrated to be superior to the benchmark methods of the binomial tree and finite differences with historical volatilities.
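For readers unfamiliar with least-squares Monte Carlo pricing, the following is a minimal value-iteration variant with a quadratic regression basis (close in spirit to Longstaff-Schwartz), not the canonical entropy-calibrated CLM of the paper; all parameters are illustrative:

```python
import numpy as np

def lsm_american_put(S0, K, r, sigma, T, steps=50, paths=50_000, seed=1):
    """Least-squares Monte Carlo price of an American put (simplified
    value-iteration variant, quadratic basis on in-the-money paths)."""
    rng = np.random.default_rng(seed)
    dt = T / steps
    # Simulate GBM paths under the risk-neutral measure.
    z = rng.standard_normal((paths, steps))
    incr = (r - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z
    S = S0 * np.exp(np.cumsum(incr, axis=1))
    payoff = np.maximum(K - S[:, -1], 0.0)           # exercise value at maturity
    for tstep in range(steps - 2, -1, -1):
        St = S[:, tstep]
        disc = np.exp(-r * dt) * payoff              # discounted continuation value
        itm = K - St > 0                             # regress on in-the-money paths only
        if itm.sum() > 3:
            beta = np.polyfit(St[itm], disc[itm], 2)
            cont = np.polyval(beta, St[itm])
            exercise = K - St[itm]
            disc[itm] = np.where(exercise > cont, exercise, disc[itm])
        payoff = disc
    return float(np.exp(-r * dt) * payoff.mean())

price = lsm_american_put(100, 100, 0.05, 0.2, 1.0)
print(round(price, 2))
```

The paper's contribution replaces the plain risk-neutral simulation above with a martingale-constrained entropy density, but the backward-regression skeleton is the same.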

  1. Theoretical and empirical applications of petroleum production function framework for analysis of the Phenomenon of Plenty

    Directory of Open Access Journals (Sweden)

    Sergey Uzhegov

    2011-09-01

    Full Text Available The current study examines how analysis of the Phenomenon of Plenty, the paradox of economic underperformance of resource-rich nations, can benefit from theoretical and empirical application of the suggested petroleum production function framework, based on a sample of oil-abundant countries of the Commonwealth of Independent States, in particular Russia, Azerbaijan, and Kazakhstan. The proposed approach displays the capacity of an oil-economy production function to shed light on a larger scope of theoretical issues. Empirical testing of the suggested theoretical framework showed that the proxied components of the devised production function, capturing the main metrics of the Phenomenon of Plenty and additionally factoring in corruption, exert a strong impact on the majority of the twelve principal macroeconomic indicators monitored by CIS supra-national institutions, with the most pronounced influence on gross domestic product, industrial production, capital investments, and exports to CIS countries.

  2. Relative prices and economic development: an analysis of the empirical evidence

    Directory of Open Access Journals (Sweden)

    P. ERCOLANI

    2013-12-01

    Full Text Available The paper examines empirical evidence on the evolution of relative prices and the long-run differences between countries with different levels of per-capita income. The price indicators employed are derived from national accounts aggregates, with the aim of drawing useful information on the relationship between changes in the structure of production and economic development. After a brief review of the literature, long-term data for eight developed countries are examined, as well as cross-sectional data from 1975 for a large group of countries with different levels of development. Some limitations of previous analyses are then presented, and direct indications are advanced to explain the empirical evidence. Finally, the author highlights the consequences that international differences in relative prices entail for the cross-section analysis of the sectoral distribution of production.

  3. Analysis of strain gage reliability in F-100 jet engine testing at NASA Lewis Research Center

    Science.gov (United States)

    Holanda, R.

    1983-01-01

    A reliability analysis was performed on 64 strain gage systems mounted on the 3 rotor stages of the fan of a YF-100 engine. The strain gages were used in a 65 hour fan flutter research program which included about 5 hours of blade flutter. The analysis was part of a reliability improvement program. Eighty-four percent of the strain gages survived the test and performed satisfactorily. A post test analysis determined most failure causes. Five failures were caused by open circuits, three failed gages showed elevated circuit resistance, and one gage circuit was grounded. One failure was undetermined.

  4. Problems Related to Use of Some Terms in System Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Nadezda Hanusova

    2004-01-01

    Full Text Available The paper deals with problems of using the dependability terms defined in the standard STN IEC 50 (191): International electrotechnical dictionary, chapter 191: Dependability and quality of service (1993), in the dependability analysis of technical systems. The goal of the paper is to find a relation between the terms introduced in the mentioned standard and used in technical systems dependability analysis, and the rules and practices used in system analysis within systems theory. A description of the part of the system life cycle related to reliability is used as a starting point. This part of the life cycle is described by a state diagram, and the reliability-relevant terms are assigned to it.

  5. Kernel empirical orthogonal function analysis of 1992-2008 global sea surface height anomaly data

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Andersen, Ole Baltazar; Knudsen, Per

    2009-01-01

    This paper describes a kernel version of empirical orthogonal function (EOF) analysis and its application to detect patterns of interest in global monthly mean sea surface height (SSH) anomalies from satellite altimetry acquired during the last 17 years. EOF analysis like principal component...... to large scale ocean currents and particularly to the pulsing of the El Niño/Southern Oscillation. Large scale ocean events associated with the El Niño/Southern Oscillation related signals are conveniently concentrated in the first SSH EOF modes. A major difference between the classical linear EOF...

  6. A Price Index Model for Road Freight Transportation and Its Empirical analysis in China

    Directory of Open Access Journals (Sweden)

    Liu Zhishuo

    2017-01-01

    Full Text Available The aim of a price index for road freight transportation (RFT) is to reflect price changes in the road transport market. Firstly, a price index model for RFT is built based on sample data from the Alibaba logistics platform. The model is a three-level index system comprising a total index, classification indices, and individual indices, and the Laspeyres method is applied to calculate these indices. Finally, an empirical analysis of the price index for the RFT market in Zhejiang Province is performed. In order to demonstrate the correctness and validity of the index model, a comparative analysis with port throughput and the PMI index is carried out.
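The Laspeyres calculation used for such indices weights current prices by base-period quantities; a minimal sketch with hypothetical freight categories:

```python
def laspeyres_index(p0, p1, q0):
    """Laspeyres price index: current-period prices weighted by base-period
    quantities, relative to base-period cost (scaled to 100)."""
    base_cost = sum(p * q for p, q in zip(p0, q0))
    curr_cost = sum(p * q for p, q in zip(p1, q0))
    return 100.0 * curr_cost / base_cost

# Hypothetical freight categories (rates per tonne-km and base-period volumes).
p0 = [1.00, 2.50, 0.80]   # base-period prices
p1 = [1.10, 2.40, 0.90]   # current-period prices
q0 = [500, 200, 1000]     # base-period quantities
print(round(laspeyres_index(p0, p1, q0), 2))  # 107.22
```

Individual indices are computed per category, classification indices per category group, and the total index over all categories, giving the three-level system the abstract describes.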

  7. Content Analysis in Mass Communication: Assessment and Reporting of Intercoder Reliability.

    Science.gov (United States)

    Lombard, Matthew; Snyder-Duch, Jennifer; Bracken, Cheryl Campanella

    2002-01-01

    Reviews the importance of intercoder agreement for content analysis in mass communication research. Describes several indices for calculating this type of reliability (varying in appropriateness, complexity, and apparent prevalence of use). Presents a content analysis of content analyses reported in communication journals to establish how…
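Two of the simpler indices surveyed in this literature, percent agreement and Cohen's kappa, can be sketched as follows (the coding data are a toy example, not the article's):

```python
from collections import Counter

def percent_agreement(a, b):
    """Fraction of units on which two coders assign the same category."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Cohen's kappa: agreement corrected for chance, using each coder's
    own marginal category distribution."""
    n = len(a)
    po = percent_agreement(a, b)
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[k] / n * cb[k] / n for k in set(a) | set(b))
    return (po - pe) / (1 - pe)

a = [1, 1, 0, 0]   # coder A's categories (toy data)
b = [1, 0, 0, 0]   # coder B's categories
print(percent_agreement(a, b), cohens_kappa(a, b))  # 0.75 0.5
```

The gap between the two numbers is exactly the chance-agreement correction that makes kappa (and relatives such as Scott's pi and Krippendorff's alpha) preferable to raw agreement.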

  8. Dynamic Scapular Movement Analysis: Is It Feasible and Reliable in Stroke Patients during Arm Elevation?

    Science.gov (United States)

    De Baets, Liesbet; Van Deun, Sara; Desloovere, Kaat; Jaspers, Ellen

    2013-01-01

    Knowledge of three-dimensional scapular movements is essential to understand post-stroke shoulder pain. The goal of the present work is to determine the feasibility and the within and between session reliability of a movement protocol for three-dimensional scapular movement analysis in stroke patients with mild to moderate impairment, using an optoelectronic measurement system. Scapular kinematics of 10 stroke patients and 10 healthy controls was recorded on two occasions during active anteflexion and abduction from 0° to 60° and from 0° to 120°. All tasks were executed unilaterally and bilaterally. The protocol’s feasibility was first assessed, followed by within and between session reliability of scapular total range of motion (ROM), joint angles at start position and of angular waveforms. Additionally, measurement errors were calculated for all parameters. Results indicated that the protocol was generally feasible for this group of patients and assessors. Within session reliability was very good for all tasks. Between sessions, scapular angles at start position were measured reliably for most tasks, while scapular ROM was more reliable during the 120° tasks. In general, scapular angles showed higher reliability during anteflexion compared to abduction, especially for protraction. Scapular lateral rotations resulted in smallest measurement errors. This study indicates that scapular kinematics can be measured reliably and with precision within one measurement session. In case of multiple test sessions, further methodological optimization is required for this protocol to be suitable for clinical decision-making and evaluation of treatment efficacy. PMID:24244414

  9. Dynamic scapular movement analysis: is it feasible and reliable in stroke patients during arm elevation?

    Directory of Open Access Journals (Sweden)

    Liesbet De Baets

    Full Text Available Knowledge of three-dimensional scapular movements is essential to understand post-stroke shoulder pain. The goal of the present work is to determine the feasibility and the within and between session reliability of a movement protocol for three-dimensional scapular movement analysis in stroke patients with mild to moderate impairment, using an optoelectronic measurement system. Scapular kinematics of 10 stroke patients and 10 healthy controls was recorded on two occasions during active anteflexion and abduction from 0° to 60° and from 0° to 120°. All tasks were executed unilaterally and bilaterally. The protocol's feasibility was first assessed, followed by within and between session reliability of scapular total range of motion (ROM), joint angles at start position and of angular waveforms. Additionally, measurement errors were calculated for all parameters. Results indicated that the protocol was generally feasible for this group of patients and assessors. Within session reliability was very good for all tasks. Between sessions, scapular angles at start position were measured reliably for most tasks, while scapular ROM was more reliable during the 120° tasks. In general, scapular angles showed higher reliability during anteflexion compared to abduction, especially for protraction. Scapular lateral rotations resulted in smallest measurement errors. This study indicates that scapular kinematics can be measured reliably and with precision within one measurement session. In case of multiple test sessions, further methodological optimization is required for this protocol to be suitable for clinical decision-making and evaluation of treatment efficacy.
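Within- and between-session reliability in studies of this kind is typically quantified with an intraclass correlation coefficient. A sketch of ICC(2,1) (two-way random effects, absolute agreement, single measure, following the standard Shrout-Fleiss ANOVA decomposition), with invented ratings:

```python
import numpy as np

def icc_2_1(Y):
    """ICC(2,1) from a subjects x sessions (or raters) matrix Y."""
    Y = np.asarray(Y, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    MSR = k * np.sum((Y.mean(axis=1) - grand) ** 2) / (n - 1)  # between subjects
    MSC = n * np.sum((Y.mean(axis=0) - grand) ** 2) / (k - 1)  # between sessions
    SSE = np.sum((Y - grand) ** 2) - (n - 1) * MSR - (k - 1) * MSC
    MSE = SSE / ((n - 1) * (k - 1))
    return (MSR - MSE) / (MSR + (k - 1) * MSE + k * (MSC - MSE) / n)

# Two sessions measuring the same 5 subjects: high but imperfect agreement
# (invented scapular ROM values, degrees).
Y = [[10, 11], [14, 15], [20, 19], [8, 8], [17, 18]]
print(round(icc_2_1(Y), 3))
```

Values near 1 indicate that between-session measurement error is small relative to between-subject variability, which is the sense in which the protocol above is "reliable".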

  10. Estimating Reliability of Disturbances in Satellite Time Series Data Based on Statistical Analysis

    Science.gov (United States)

    Zhou, Z.-G.; Tang, P.; Zhou, M.

    2016-06-01

    Normally, the status of land cover is inherently dynamic, changing continuously on a temporal scale. However, disturbances or abnormal changes of land cover (caused, for example, by forest fire, flood, deforestation, or plant diseases) occur worldwide at unknown times and locations, and their timely detection and characterization is important for land cover monitoring. Recently, many time-series-analysis methods have been developed for near real-time or online disturbance detection using satellite image time series. However, most present methods only label the detection results with "change/no change", while few focus on estimating the reliability (or confidence level) of the detected disturbances. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series, through analysis of the full temporal information in the time series data. The method consists of three main steps: (1) segmenting and modelling historical time series data based on Breaks for Additive Seasonal and Trend (BFAST); (2) forecasting and detecting disturbances in new time series data; (3) estimating the reliability of each detected disturbance using statistical analysis based on confidence intervals (CI) and confidence levels (CL). The method was validated by estimating the reliability of disturbance regions caused by a recent severe flood that occurred around the border of Russia and China. Results demonstrated that the method can estimate the reliability of disturbances detected in satellite image time series with an estimation error of less than 5% and an overall accuracy of up to 90%.
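The forecast-and-flag step can be sketched with a plain linear-trend model in place of BFAST's season-plus-trend segmentation: a new observation counts as a disturbance when it falls outside the prediction interval at the chosen confidence level. The series and parameters below are synthetic:

```python
import numpy as np

def flag_disturbances(history, new_obs, confidence_z=1.96):
    """Flag new observations outside a ~95% prediction interval built
    from a linear trend fitted to the history (a simplified stand-in
    for the BFAST-based modelling/forecasting steps)."""
    t = np.arange(len(history))
    slope, intercept = np.polyfit(t, history, 1)
    resid_std = np.std(history - (slope * t + intercept), ddof=2)
    t_new = np.arange(len(history), len(history) + len(new_obs))
    forecast = slope * t_new + intercept
    return np.abs(new_obs - forecast) > confidence_z * resid_std

rng = np.random.default_rng(3)
history = 0.5 + 0.001 * np.arange(100) + rng.normal(0, 0.01, 100)  # stable NDVI-like series
new_obs = np.array([0.60, 0.61, 0.20, 0.19])  # sudden drop, e.g. flooding
print(flag_disturbances(history, new_obs))
```

Repeating the test at several confidence levels, and reporting the highest level at which an observation is still flagged, gives a per-disturbance reliability in the spirit of step (3).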

  11. Reliability reallocation models as a support tools in traffic safety analysis.

    Science.gov (United States)

    Bačkalić, Svetlana; Jovanović, Dragan; Bačkalić, Todor

    2014-04-01

    One of the essential questions placed before a road authority is where to act first, i.e. which road sections should be treated in order to achieve the desired level of reliability of a particular road; this question is the subject of this research. The paper shows how reliability reallocation theory can be applied in the safety analysis of a road consisting of sections. The model has been successfully tested using two apportionment techniques: ARINC and the minimum effort algorithm. The given methods were applied in the traffic safety analysis as a basic step toward achieving a higher level of reliability. Previous methods for selecting hazardous locations do not provide precise values for the required frequency of accidents, i.e. the time period between the occurrences of two accidents. In other words, they do not allow a connection to be established between a precise demand for increased reliability (expressed as a percentage) and the selection of particular road sections for further analysis. The paper shows that reallocation models can also be applied in road safety analysis, or more precisely, as part of the measures for increasing the level of road safety. A tool has been developed for selecting road sections for treatment on the basis of a precisely defined increase in the level of reliability of a particular road, i.e. the mean time between the occurrences of two accidents.
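The ARINC apportionment technique mentioned above allocates a system-level goal to subsystems in proportion to their current failure rates (here, accident frequencies); the numbers are illustrative:

```python
def arinc_apportionment(current_rates, goal_rate):
    """ARINC apportionment: allocate a system failure-rate goal to
    subsystems in proportion to their current failure rates
    (weights w_i = lambda_i / sum(lambda))."""
    total = sum(current_rates)
    return [goal_rate * lam / total for lam in current_rates]

# Road sections' current accident rates (per year, illustrative) and a
# target system rate reflecting the desired reliability increase.
current = [0.8, 0.5, 0.2]   # section accident frequencies
goal = 1.2                  # required total (a 20% reduction from 1.5)
allocated = arinc_apportionment(current, goal)
print([round(x, 3) for x in allocated])  # [0.64, 0.4, 0.16]
```

Each section's allocated rate translates directly into a required mean time between accidents (its reciprocal), which is the selection criterion the abstract describes.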

  12. Reliability analysis of supporting pressure in tunnels based on three-dimensional failure mechanism

    Institute of Scientific and Technical Information of China (English)

    罗卫华; 李闻韬

    2016-01-01

    Based on a nonlinear failure criterion, a three-dimensional failure mechanism for the possible collapse of a deep tunnel is presented with limit analysis theory. Support pressure is taken into consideration in the virtual work equation formulated under the upper bound theorem. It should be pointed out that the properties of the surrounding rock mass play a vital role in the shape of the collapsing rock mass. The first-order reliability method and the Monte Carlo simulation method are then employed to analyze the stability of the presented mechanism. Different rock parameters are treated as random variables to evaluate the corresponding reliability index under an increasing applied support pressure. The reliability indexes calculated by the two methods are in good agreement. A sensitivity analysis was performed and the influence of the coefficient of variation of the rock parameters was discussed. It is shown that the tensile strength plays a much more important role in the reliability index than the dimensionless parameter, and that small changes in the coefficient of variation have a great influence on the reliability index. Thus, significant attention should be paid to the properties of the surrounding rock mass, and the support pressure required to maintain the stability of the tunnel can be determined for a given reliability index.
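The two methods named in the abstract can be compared on a toy linear limit state g = R - S with independent normal variables, where the first-order reliability index is exact; the means and standard deviations are invented, not the paper's rock parameters:

```python
import random
from statistics import NormalDist

# Limit state g = R - S: failure when resistance R falls below load effect S.
muR, sdR = 10.0, 1.0   # e.g. a strength parameter (illustrative units)
muS, sdS = 6.0, 1.5    # e.g. a driving stress (illustrative units)

# First-order reliability index; exact for a linear g with normal variables.
beta = (muR - muS) / (sdR**2 + sdS**2) ** 0.5
pf_form = NormalDist().cdf(-beta)

# Monte Carlo check of the failure probability.
random.seed(42)
N = 200_000
fails = sum(random.gauss(muR, sdR) - random.gauss(muS, sdS) < 0 for _ in range(N))
pf_mc = fails / N

print(round(beta, 3), pf_form, pf_mc)
```

For the paper's nonlinear limit state the FORM index is found iteratively at the design point, and Monte Carlo serves the same cross-checking role as here.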

  13. The Monte Carlo Simulation Method for System Reliability and Risk Analysis

    CERN Document Server

    Zio, Enrico

    2013-01-01

    Monte Carlo simulation is one of the best tools for performing realistic analysis of complex systems as it allows most of the limiting assumptions on system behavior to be relaxed. The Monte Carlo Simulation Method for System Reliability and Risk Analysis comprehensively illustrates the Monte Carlo simulation method and its application to reliability and system engineering. Readers are given a sound understanding of the fundamentals of Monte Carlo sampling and simulation and its application for realistic system modeling.   Whilst many of the topics rely on a high-level understanding of calculus, probability and statistics, simple academic examples will be provided in support to the explanation of the theoretical foundations to facilitate comprehension of the subject matter. Case studies will be introduced to provide the practical value of the most advanced techniques.   This detailed approach makes The Monte Carlo Simulation Method for System Reliability and Risk Analysis a key reference for senior undergra...

  14. A NEW TWO-POINT ADAPTIVE NONLINEAR APPROXIMATION METHOD FOR RELIABILITY ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    Liu Shutian

    2004-01-01

    A two-point adaptive nonlinear approximation (referred to as TANA4) suitable for reliability analysis is proposed. Transformed and normalized random variables in probabilistic analysis could become negative and pose a challenge to the earlier developed two-point approximations; thus a suitable method that can address this issue is needed. In the method proposed, the nonlinearity indices of intervening variables are limited to integers. Then, on the basis of the present method, an improved sequential approximation of the limit state surface for reliability analysis is presented. With the gradient projection method, the data points for the limit state surface approximation are selected on the original limit state surface, which effectively represents the nature of the original response function. On the basis of this new approximation, the reliability is estimated using a first-order second-moment method. Various examples, including both structural and non-structural ones, are presented to show the effectiveness of the method proposed.

  15. An Evidential Reasoning-Based CREAM to Human Reliability Analysis in Maritime Accident Process.

    Science.gov (United States)

    Wu, Bing; Yan, Xinping; Wang, Yang; Soares, C Guedes

    2017-01-09

    This article proposes a modified cognitive reliability and error analysis method (CREAM) for estimating the human error probability in the maritime accident process on the basis of an evidential reasoning approach. This modified CREAM is developed to precisely quantify the linguistic variables of the common performance conditions and to overcome the problem of ignoring the uncertainty caused by incomplete information in the existing CREAM models. Moreover, this article views maritime accident development from a sequential perspective, in which a scenario- and barrier-based framework is proposed to describe the maritime accident process. This evidential reasoning-based CREAM approach, together with the proposed accident development framework, is applied to human reliability analysis of a ship capsizing accident. It will facilitate subjective human reliability analysis in different engineering systems where uncertainty exists in practice.

  16. Asymptotic Sampling for Reliability Analysis of Adhesive Bonded Stepped Lap Composite Joints

    DEFF Research Database (Denmark)

    Kimiaeifar, Amin; Lund, Erik; Thomsen, Ole Thybo

    2013-01-01

    Reliability analysis coupled with finite element analysis (FEA) of composite structures is computationally very demanding and requires a large number of simulations to achieve an accurate prediction of the probability of failure with a small standard error. In this paper Asymptotic Sampling, which....... Three dimensional (3D) FEA is used for the structural analysis together with a design equation that is associated with a deterministic code-based design equation where reliability is secured by partial safety factors. The Tsai-Wu and the maximum principal stress failure criteria are used to predict...... failure in the composite and adhesive layers, respectively, and the results are compared with the target reliability level implicitly used in the wind turbine standard IEC 61400-1. The accuracy and efficiency of Asymptotic Sampling is investigated by comparing the results with predictions obtained using...

  17. Structure buckling and non-probabilistic reliability analysis of supercavitating vehicles

    Institute of Scientific and Technical Information of China (English)

    AN Wei-guang; ZHOU Ling; AN Hai

    2009-01-01

    To perform structural buckling and reliability analysis of supercavitating vehicles moving at high velocity underwater, supercavitating vehicles were first simplified as variable cross-section beams. Structural buckling analysis of supercavitating vehicles with or without engine thrust was then conducted, and the structural buckling safety margin equation of supercavitating vehicles was established. The indefinite information was described by interval sets and the structural reliability analysis was performed using a non-probabilistic reliability method. Considering the interval variables as random variables satisfying a uniform distribution, the Monte Carlo method was used to calculate the non-probabilistic failure degree. Numerical examples of supercavitating vehicles are presented. Under different ratios of base diameter to cavitator diameter, the trend of the non-probabilistic failure degree of structural buckling of supercavitating vehicles with or without engine thrust was studied as a function of speed.
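The sampling scheme described, treating interval variables as uniformly distributed over their bounds and counting failures by Monte Carlo, can be sketched as follows; the safety-margin function and interval bounds are hypothetical, not the paper's buckling model:

```python
import random

def nonprob_failure_degree(margin, intervals, n=100_000, seed=7):
    """Monte Carlo estimate of the failure degree when interval variables
    are treated as uniform over their bounds. `margin` is a safety-margin
    function; failure occurs when margin(x) < 0."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        x = [rng.uniform(lo, hi) for lo, hi in intervals]
        fails += margin(x) < 0
    return fails / n

# Hypothetical buckling margin: critical load minus applied load, both
# known only as intervals (invented values for illustration).
margin = lambda x: x[0] - x[1]        # P_crit - P_applied
intervals = [(90, 110), (70, 105)]    # interval bounds for the two variables
fd = nonprob_failure_degree(margin, intervals)
print(fd)
```

The failure degree is the fraction of the interval "box" lying in the failure region; for these bounds the exact value is about 0.161, so the estimate should land nearby.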

  18. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    Science.gov (United States)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.

  19. Reliability analysis of shoulder balance measures: comparison of the 4 available methods.

    Science.gov (United States)

    Hong, Jae-Young; Suh, Seung-Woo; Yang, Jae-Hyuk; Park, Si-Young; Han, Ji-Hoon

    2013-12-15

    Observational study with 3 examiners, conducted to compare the reliability of shoulder balance measurement methods. Several measurement methods for shoulder balance exist, yet no reliability analysis had been performed despite the clinical importance of this measurement. Whole-spine posteroanterior radiographs (n = 270) were collected to compare the reliability of the 4 shoulder balance measures in patients with adolescent idiopathic scoliosis. Each radiograph was measured twice by each of the 3 examiners using the 4 measurement methods, and the data were analyzed statistically to determine inter- and intraobserver reliability. Overall, the 4 radiographical methods showed excellent intraclass correlation coefficients (>0.904) in intraobserver comparisons regardless of severity, and the mean absolute difference values of all methods were low and comparatively similar regardless of severity; the mean absolute difference values of the clavicular angle method were among the lowest, supporting the clavicular angle as a clinically practical shoulder balance measurement method.

  20. A LONGITUDINAL ANALYSIS REGARDING THE EVOLUTION OF PROFIT TAX REGULATIONS IN ROMANIA AN EMPIRICAL VIEW

    Directory of Open Access Journals (Sweden)

    Albu Nadia

    2011-07-01

    Full Text Available The study conducted a longitudinal analysis of Romanian profit tax regulations. Beginning with the first profit tax regulation implemented in 1991 and continuing until now, we analyzed, based on an empirical approach, all changes that have occurred over time in the Romanian accounting environment. The motivation of the study was the strong relationship between accounting and taxation in the Romanian accounting environment over time, the profit tax being one of the main items of this relation. The study is divided into five sections. After a short introduction presenting the motivation of the study (section 1), in section 2 we conduct a literature review based on international and national studies regarding profit tax regulations through the relationship between accounting and taxation. Section 3 presents a brief review of the main Romanian regulations concerning the profit tax and the most important changes that have occurred over time. In section 4 we conduct the empirical analysis, comprising a series of analyses of the following: (1) the total number of regulations that have amended the main regulations presented in the previous section; (2) the type of amendments implemented (abolishment, text amendment, or the addition of new articles or alignments); (3) the total number of amendments approved by law without modifications, and the total number of amendments issued in the Official Journal through Government Ordinance or Emergency Ordinance and not approved by law. The empirical analysis documented that the main shortcoming associated with profit tax regulation is the multiple changes to which the 5 main profit tax regulations have been subject. The last section (section 5) presents the conclusions of the study.
As main conclusion, the profit tax regulation is stable only in terms of the small number of main regulations; the large number of amendments undermines this stability.

  1. Biofuels policy and the US market for motor fuels: Empirical analysis of ethanol splashing

    Energy Technology Data Exchange (ETDEWEB)

    Walls, W.D., E-mail: wdwalls@ucalgary.ca [Department of Economics, University of Calgary, 2500 University Drive NW, Calgary, Alberta, T2N 1N4 (Canada); Rusco, Frank; Kendix, Michael [US GAO (United States)

    2011-07-15

    Low ethanol prices relative to the price of gasoline blendstock, together with tax credits, have resulted in discretionary blending of ethanol into fuel supplies at wholesale terminals above required levels, a practice known in industry parlance as ethanol splashing. No one knows precisely where or in what volume ethanol is being blended with gasoline, and this has important implications for motor fuels markets: because refiners cannot perfectly predict where ethanol will be blended with finished gasoline by wholesalers, they cannot know when to produce, and where to ship, a blendstock that, when mixed with ethanol at 10%, would create the most economically efficient finished motor gasoline that meets engine standards and has evaporative emissions comparable to conventional gasoline without ethanol blending. In contrast to previous empirical analyses of biofuels that have relied on highly aggregated data, our analysis is disaggregated to the level of individual wholesale fuel terminals or racks (of which there are about 350 in the US). We incorporate the price of ethanol as well as the blendstock price to model the wholesaler's decision of whether or not to blend additional ethanol into gasoline at any particular wholesale city-terminal. The empirical analysis illustrates how ethanol and gasoline prices affect ethanol usage, controlling for fuel specifications, blend attributes, and city-terminal-specific effects that, among other things, control for differential costs of delivering ethanol from bio-refinery to wholesale rack. - Research Highlights: > Low ethanol prices and tax credits have resulted in discretionary blending of ethanol into fuel supplies above required levels. > This has important implications for motor fuels markets and vehicular emissions. > Our analysis incorporates the price of ethanol as well as the blendstock price to model the wholesaler's decision of whether or not to blend additional ethanol into gasoline at any particular wholesale city-terminal.

  2. Kuhn-Tucker optimization based reliability analysis for probabilistic finite elements

    Science.gov (United States)

    Liu, W. K.; Besterfield, G.; Lawrence, M.; Belytschko, T.

    1988-01-01

    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for fracture mechanics is considered. Reliability analysis with specific application to fracture mechanics is presented, and computational procedures are discussed. Explicit expressions for the optimization procedure with regard to fracture mechanics are given. The results show that the PFEM is a very powerful tool for determining second-moment statistics. The method can determine the probability of failure or fracture subject to randomness in load, material properties, and crack length, orientation, and location.

  3. Reliability analysis of production ships with emphasis on load combination and ultimate strength

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Xiaozhi

    1995-05-01

    This thesis deals with ultimate strength and reliability analysis of offshore production ships, accounting for stochastic load combinations, using a typical North Sea production ship for reference. A review of methods for structural reliability analysis is presented. Probabilistic methods are established for the still water and vertical wave bending moments. Linear stress analysis of a midships transverse frame is carried out, and four different finite element models are assessed. Upon verification of the general finite element code ABAQUS with a typical ship transverse girder example, for which test results are available, ultimate strength analysis of the reference transverse frame is made to obtain the ultimate load factors associated with the specified pressure loads in Det norske Veritas Classification rules for ships and rules for production vessels. Reliability analysis is performed to develop appropriate design criteria for the transverse structure. It is found that the transverse frame failure mode does not seem to contribute to the system collapse. Ultimate strength analysis of the longitudinally stiffened panels is performed, accounting for combined biaxial and lateral loading. Reliability-based design of the longitudinally stiffened bottom and deck panels is accomplished for the collapse mode under combined biaxial and lateral loads. 107 refs., 76 figs., 37 tabs.

  4. Assessing the Reliability of Digitalized Cephalometric Analysis in Comparison with Manual Cephalometric Analysis

    Science.gov (United States)

    Farooq, Mohammed Umar; Khan, Mohd. Asadullah; Imran, Shahid; Qureshi, Arshad; Ahmed, Syed Afroz; Kumar, Sujan; Rahman, Mohd. Aziz Ur

    2016-01-01

    Introduction: For more than seven decades, orthodontists have used cephalometric analysis as one of their main diagnostic tools; it can be performed manually or by software. The use of computers in treatment planning is expected to avoid errors and make the process less time consuming, with effective evaluation and high reproducibility. Aim: This study was done to evaluate and compare the accuracy and reliability of cephalometric measurements between the computerized method using direct digital radiographs and conventional tracing. Materials and Methods: Digital and conventional hand-tracing cephalometric analyses of 50 patients were done. Thirty anatomical landmarks were defined on each radiograph by a single investigator, and 5 skeletal analyses (Steiner, Wits, Tweed, McNamara, Rakosi Jarabak) and 28 variables were calculated. Results: The variables showed consistency between the two methods except for the 1-NA, Y-axis and interincisal angle measurements, which were higher in manual tracing, and the facial axis angle, which was higher in digital tracing. Conclusion: Most of the commonly used measurements were accurate, except for some differences between digital tracing with FACAD® and the manual method. The advantages of digital imaging, such as enhancement, transmission, archiving and low radiation dosage, make it preferable to the conventional method in daily use. PMID:27891451

  5. Reliability analysis and prediction of mixed mode load using Markov Chain Model

    Science.gov (United States)

    Nikabdullah, N.; Singh, S. S. K.; Alebrahim, R.; Azizi, M. A.; K, Elwaleed A.; Noorani, M. S. M.

    2014-06-01

    The aim of this paper is to present a reliability analysis and prediction of mixed-mode loading using a simple two-state Markov Chain Model for an automotive crankshaft. Reliability analysis and prediction for any automotive component or structure is important for analysing and measuring failures in order to increase the design life and to eliminate or reduce the likelihood of failures and safety risks. The mechanical failures of the crankshaft are due to high bending and torsion stress concentrations arising from high-cycle rotating bending and torsional stresses. The Markov Chain was used to model two states based on the probability of failure due to bending and torsion stress. Most investigations reveal that bending stress is much more severe than torsional stress; therefore, the probability criterion for the bending state would be higher than for the torsion state. A statistical comparison between the developed Markov Chain Model and field data was made to observe the percentage of error. The reliability analysis and prediction derived from the Markov Chain Model are illustrated through the Weibull probability density and cumulative distribution functions, the hazard rate and reliability curves, and the bathtub curve. It can be concluded that the Markov Chain Model is able to generate data close to the field data with a minimal percentage of error, and for practical application the proposed model provides good accuracy in determining the reliability of the crankshaft under mixed-mode loading.
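A two-state chain of this kind can be sketched in a few lines. The transition probabilities and per-cycle damage increments below are invented for illustration (with bending given the larger damage share, as the abstract suggests); the paper's own model and field data are not reproduced here:

```python
import random

# Two-state Markov chain sketch of mixed-mode loading: each cycle the
# crankshaft is in a "bending" or "torsion" state.  Transition
# probabilities and per-cycle damage increments are illustrative.

random.seed(0)

P = {"bending": {"bending": 0.7, "torsion": 0.3},
     "torsion": {"bending": 0.6, "torsion": 0.4}}
damage_per_cycle = {"bending": 2.0e-3, "torsion": 0.8e-3}

def cycles_to_failure(start="bending", threshold=1.0):
    """Count load cycles until accumulated damage reaches the threshold."""
    state, damage, n = start, 0.0, 0
    while damage < threshold:
        damage += damage_per_cycle[state]
        n += 1
        state = "bending" if random.random() < P[state]["bending"] else "torsion"
    return n

# A small sample of simulated lives; their spread is what a Weibull fit
# would then summarize (pdf, cdf, hazard, bathtub curve).
lives = [cycles_to_failure() for _ in range(200)]
mean_life = sum(lives) / len(lives)
```

With these placeholder numbers every life falls between the all-bending floor (500 cycles) and the all-torsion ceiling (1250 cycles).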

  6. Reliability analysis and prediction of mixed mode load using Markov Chain Model

    Energy Technology Data Exchange (ETDEWEB)

    Nikabdullah, N. [Department of Mechanical and Materials Engineering, Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia and Institute of Space Science (ANGKASA), Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia (Malaysia); Singh, S. S. K.; Alebrahim, R.; Azizi, M. A. [Department of Mechanical and Materials Engineering, Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia (Malaysia); K, Elwaleed A. [Institute of Space Science (ANGKASA), Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia (Malaysia); Noorani, M. S. M. [School of Mathematical Sciences, Faculty of Science and Technology, Universiti Kebangsaan Malaysia (Malaysia)

    2014-06-19

    The aim of this paper is to present a reliability analysis and prediction of mixed-mode loading using a simple two-state Markov Chain Model for an automotive crankshaft. Reliability analysis and prediction for any automotive component or structure is important for analysing and measuring failures in order to increase the design life and to eliminate or reduce the likelihood of failures and safety risks. The mechanical failures of the crankshaft are due to high bending and torsion stress concentrations arising from high-cycle rotating bending and torsional stresses. The Markov Chain was used to model two states based on the probability of failure due to bending and torsion stress. Most investigations reveal that bending stress is much more severe than torsional stress; therefore, the probability criterion for the bending state would be higher than for the torsion state. A statistical comparison between the developed Markov Chain Model and field data was made to observe the percentage of error. The reliability analysis and prediction derived from the Markov Chain Model are illustrated through the Weibull probability density and cumulative distribution functions, the hazard rate and reliability curves, and the bathtub curve. It can be concluded that the Markov Chain Model is able to generate data close to the field data with a minimal percentage of error, and for practical application the proposed model provides good accuracy in determining the reliability of the crankshaft under mixed-mode loading.

  7. Application of FTA Method to Reliability Analysis of Vacuum Resin Shot Dosing Equipment

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Faults of vacuum resin shot dosing equipment are studied systematically, and the fault tree of the system is constructed using the fault tree analysis (FTA) method. Qualitative and quantitative analyses of the tree are then carried out, and according to the results of the analyses, measures to improve the system are worked out and implemented. As a result, the reliability of the equipment is greatly enhanced.
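The quantitative step of an FTA reduces, for independent basic events, to combining probabilities through AND/OR gates. The gate layout and basic-event probabilities below are hypothetical, not taken from the dosing-equipment study:

```python
# Minimal fault-tree quantification sketch: minimal cut sets come from the
# qualitative analysis; the quantitative step combines independent
# basic-event probabilities through AND/OR gates.  Layout and numbers are
# hypothetical.

def and_gate(*probs):
    """All inputs must occur (independent events)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(*probs):
    """At least one input occurs (independent events)."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

# Hypothetical basic events for a resin-dosing fault tree.
p_valve, p_sensor, p_pump, p_ctrl = 0.01, 0.02, 0.005, 0.001

# Top event: (valve stuck AND sensor drift) OR pump wear OR controller fault.
p_top = or_gate(and_gate(p_valve, p_sensor), p_pump, p_ctrl)
```

The OR-gate formula avoids the double counting that a naive sum of probabilities would introduce.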

  8. An Empirical Analysis of the Changing Role of the German Bundesbank after 1983

    DEFF Research Database (Denmark)

    Juselius, Katarina

    A cointegrated VAR model describing a small macroeconomic system consisting of money, income, prices, and interest rates is estimated on split sample data before and after 1983. The monetary mechanisms were found to be significantly different. Before 1983, the money supply seemed controllable... and expansion or contraction of the money supply had the expected effect on prices, income, and interest rates. After 1983, the conventional mechanisms no longer seemed to work. The empirical analysis pointed to the crucial role of the bond rate in the system, particularly for the more recent period...

  9. What is a Leading Case in EU law? An empirical analysis

    DEFF Research Database (Denmark)

    Sadl, Urska; Panagis, Yannis

    2015-01-01

    Lawyers generally explain legal development by looking at explicit amendments to statutory law and modifications in judicial practice. As far as the latter are concerned, leading cases occupy a special place. This article empirically studies the process in which certain cases become leading cases... Our analysis focuses on Les Verts, a case of considerable fame in EU law, closely scrutinising whether it contains inherent leading case material. We show how the legal relevance of a case can become “embedded” in a long process of reinterpretation by legal actors, and we demonstrate that the actual...

  10. Empirical Requirements Analysis for Mars Surface Operations Using the Flashline Mars Arctic Research Station

    Science.gov (United States)

    Clancey, William J.; Lee, Pascal; Sierhuis, Maarten; Norvig, Peter (Technical Monitor)

    2001-01-01

    Living and working on Mars will require model-based computer systems for maintaining and controlling complex life support, communication, transportation, and power systems. This technology must work properly on the first three-year mission, augmenting human autonomy without adding yet more complexity to be diagnosed and repaired. One design method is to work with scientists in an analog (Mars-like) setting to understand how they prefer to work, what constraints will be imposed by the Mars environment, and how to ameliorate difficulties. We describe how we are using empirical requirements analysis to prototype model-based tools at a research station in the Canadian High Arctic.

  11. An Empirical Analysis of Farmers’ Rabbit Breeds Purchase and Its Influencing Factors

    Institute of Scientific and Technical Information of China (English)

    Yuhe; SONG; Laping; WU

    2014-01-01

    In this paper, based on survey data on farmers in 14 provinces and cities nationwide provided by the China Rabbit Research System, we analyze farmers' rabbit breed selection, purchase channels, and demand for new rabbit varieties, as well as the problems encountered in the course of rabbit usage. We make an empirical analysis of the factors influencing farmers' demand for rabbits, and put forward recommendations on farmers' rabbit breed usage and on improving the promotion of new rabbit varieties.

  12. Aviation Fuel System Reliability and Fail-Safety Analysis. Promising Alternative Ways for Improving the Fuel System Reliability

    Directory of Open Access Journals (Sweden)

    I. S. Shumilov

    2017-01-01

    Full Text Available The paper deals with design requirements for an aviation fuel system (AFS): basic design requirements, reliability, and design precautions to avoid AFS failure. It compares the reliability and fail-safety of the AFS and the aircraft hydraulic system (AHS), considers promising alternative ways to raise the reliability of fuel systems, and elaborates recommendations to improve the reliability of pipeline system components and pipeline systems in general, based on the selection of design solutions. It is extremely advisable to design the AFS and AHS in accordance with Aviation Regulations АП25 and the Accident Prevention Guidelines of ICAO (International Civil Aviation Organization), which will reduce the risk of emergency situations and in some cases even avoid heavy disasters. AFS and AHS designs should be based on uniform principles to ensure the highest reliability and safety. Currently, however, this principle is not sufficiently observed, and the AFS loses in reliability and fail-safety compared with the AHS. For the examined failures (single failures and their combinations), the guidelines for ensuring AFS operability should be the same as those adopted in Regulations АП25 for the AHS. This will significantly increase the reliability and fail-safety of fuel systems and of aircraft flights in general, despite a slight increase in AFS mass. The proposed improvements, through redundancy of fuel system components, will greatly raise the reliability of the fuel system of a passenger aircraft, which will then withstand up to 2 failures without serious consequences for the flight; its reliability and fail-safety will be similar to those of the AHS, although the above improvement measures will lead to a slightly increased total mass of the fuel system. It is advisable to set a second pump on the engine in parallel with the first one. It will run in case the first one fails for some reasons. The second pump, like the first pump, can be driven from the
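The effect of the recommended parallel second pump can be sketched with the standard series/parallel reliability formulas. The single-pump mission reliability figure below is illustrative, not from the paper:

```python
# Reliability gain from a second engine pump, sketched for an
# active-parallel pair of identical, independent pumps.  The single-pump
# mission reliability figure is illustrative.

def parallel(*rs):
    """System survives if at least one unit survives."""
    p_all_fail = 1.0
    for r in rs:
        p_all_fail *= (1.0 - r)
    return 1.0 - p_all_fail

r_pump = 0.995                       # one pump's mission reliability
r_single = r_pump                    # baseline architecture
r_dual = parallel(r_pump, r_pump)    # with the parallel second pump

# Factor by which the failure probability drops when the pump is added.
improvement = (1.0 - r_single) / (1.0 - r_dual)
```

With these numbers the failure probability drops by a factor of 200, which is the kind of gain that motivates the slight mass penalty the abstract mentions.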

  13. Analysis of the Kinematic Accuracy Reliability of a 3-DOF Parallel Robot Manipulator

    Directory of Open Access Journals (Sweden)

    Guohua Cui

    2015-02-01

    Full Text Available Kinematic accuracy reliability is an important performance index in the evaluation of mechanism quality. Using a 3-DOF 3-PUU parallel robot manipulator as the research object, the position and orientation error model was derived by mapping the relation between the input and output of the mechanism. Three error sensitivity indexes that evaluate the kinematic accuracy of the parallel robot manipulator were obtained by applying the singular value decomposition of the error translation matrix. Considering the influence of controllable and uncontrollable factors on the kinematic accuracy, a mathematical model of reliability based on random probability was employed. A measurement and calculation method for evaluating the mechanism's kinematic reliability level is also provided. By analysing the mechanism's errors and reliability, the sensitivity of the surface error to the location and structure parameters was obtained. The kinematic reliability of the parallel robot manipulator was statistically computed on the basis of the Monte Carlo simulation method. The reliability analysis of kinematic accuracy provides a theoretical basis for design optimization and error compensation.
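The Monte Carlo step described above can be sketched as follows. The 3x3 sensitivity matrix, error magnitudes, and tolerance are invented stand-ins for the paper's 3-PUU error model:

```python
import math, random

# Monte Carlo sketch of kinematic-accuracy reliability: perturb the
# inputs with random joint errors, map them to an end-effector error,
# and count how often the error stays inside the tolerance.  All numbers
# are hypothetical.

random.seed(42)

J = [(0.8, 0.1, 0.1),     # hypothetical error-sensitivity matrix rows
     (0.1, 0.9, 0.05),
     (0.05, 0.1, 0.7)]

def end_effector_error(dq):
    """Euclidean norm of the mapped end-effector error."""
    e = [sum(Jij * dqj for Jij, dqj in zip(row, dq)) for row in J]
    return math.sqrt(sum(x * x for x in e))

def kinematic_reliability(n=20000, sigma=0.05, tol=0.12):
    """Fraction of sampled joint-error vectors within the tolerance."""
    ok = 0
    for _ in range(n):
        dq = [random.gauss(0.0, sigma) for _ in range(3)]
        if end_effector_error(dq) <= tol:
            ok += 1
    return ok / n

R = kinematic_reliability()
```

The reliability estimate is simply the hit fraction; the paper's error model would replace the linear mapping used here.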

  14. Markov Chain Modelling of Reliability Analysis and Prediction under Mixed Mode Loading

    Institute of Scientific and Technical Information of China (English)

    SINGH Salvinder; ABDULLAH Shahrum; NIK MOHAMED Nik Abdullah; MOHD NOORANI Mohd Salmi

    2015-01-01

    The reliability assessment of an automobile crankshaft provides an important understanding in dealing with the design life of the component in order to eliminate or reduce the likelihood of failure and safety risks. The failures of crankshafts are considered catastrophic failures that lead to severe failure of the engine block and its other connecting subcomponents. The reliability of an automotive crankshaft under mixed-mode loading is studied using the Markov Chain Model. The Markov Chain is modelled using a two-state condition to represent the bending and torsion loads that occur on the crankshaft. The automotive crankshaft represents a good case study of a component under mixed-mode loading due to the rotating bending and torsion stresses. An estimate of the Weibull shape parameter is used to obtain the probability density function, cumulative distribution function, hazard and reliability rate functions, the bathtub curve and the mean time to failure. The use of various shape-parameter values to model the failure characteristics through the bathtub curve is shown. Likewise, an understanding of the patterns of the hazard rate of the component can be used to improve the design and increase the life cycle based on the reliability and dependability of the component. The proposed reliability assessment provides an accurate, efficient, fast and cost-effective reliability analysis in contrast to costly and lengthy experimental techniques.
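The Weibull quantities listed in the abstract (pdf, cdf, hazard, reliability, MTTF) all follow directly from the shape and scale parameters. The parameter values below are illustrative, not the paper's estimates:

```python
import math

# Weibull life model of the kind used for a bathtub-curve analysis: from
# a shape parameter k and scale lam, derive pdf, cdf, hazard, reliability
# and mean time to failure.  Parameter values are illustrative.

def weibull_pdf(t, k, lam):
    return (k / lam) * (t / lam) ** (k - 1) * math.exp(-((t / lam) ** k))

def weibull_cdf(t, k, lam):
    return 1.0 - math.exp(-((t / lam) ** k))

def reliability(t, k, lam):
    return math.exp(-((t / lam) ** k))

def hazard(t, k, lam):                 # pdf / reliability
    return (k / lam) * (t / lam) ** (k - 1)

def mttf(k, lam):                      # lam * Gamma(1 + 1/k)
    return lam * math.gamma(1.0 + 1.0 / k)

k, lam = 2.2, 1.0e5                    # wear-out regime (k > 1), cycles
R_design = reliability(5.0e4, k, lam)  # survival probability at 50k cycles
```

A shape parameter below 1 gives the decreasing (infant-mortality) branch of the bathtub curve, k near 1 the flat branch, and k above 1 the increasing wear-out branch shown here.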

  15. Reliability Analysis of Distributed Grid-connected Photovoltaic System Monitoring Network

    Directory of Open Access Journals (Sweden)

    Fu Zhixin

    2016-01-01

    Full Text Available The large number of distributed grid-connected photovoltaic (PV) systems has brought new challenges to the dispatching of the power network. Real-time monitoring of PV systems can help improve the ability of the power network to accept and control distributed PV systems, and thus mitigate the impact on the power network imposed by the uncertainty of their power output. In studying the reliability of a distributed PV monitoring network, it is of great significance to find a method for building a highly reliable monitoring system and to analyse the weak links and key nodes of its monitoring performance. First, a reliability model of the PV monitoring network was constructed based on wireless sensor network (WSN) technology. Then, in view of the dynamic characteristics of the network's reliability, fault tree analysis was used to identify the possible causes of network failure and the logical relationships between them. Finally, the reliability of the monitoring network was analysed to identify the weak links and key nodes. This paper provides guidance for building a stable and reliable monitoring network for a distributed PV system.

  16. Reduced Expanding Load Method for Simulation-Based Structural System Reliability Analysis

    Institute of Scientific and Technical Information of China (English)

    远方; 宋丽娜; 方江生

    2004-01-01

    The current situation and difficulties of structural system reliability analysis are discussed. On the basis of the Monte Carlo method and computer simulation, a new analysis method, the reduced expanding load method (RELM), is presented, which can be used to solve structural reliability problems effectively and conveniently. In this method, the uncertainties of loads, structural material properties and dimensions can be fully considered. If the statistical parameters of the stochastic variables are known, the probability of failure can be estimated rather accurately using this method. In contrast with traditional approaches, the RELM method gives a much better understanding of structural failure frequency, and its reliability index β is more meaningful. To illustrate this new idea, a specific example is given.
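As a baseline for methods like RELM, a plain Monte Carlo estimate of the failure probability for the limit state g = R - S can be sketched as follows; the distribution parameters are illustrative, and for normal R and S the exact answer is available for comparison:

```python
import math, random

# Plain Monte Carlo baseline for a structural reliability problem:
# limit state g = R - S (resistance minus load effect), failure when
# g <= 0.  RELM refines this kind of sampling; the parameters below
# are illustrative.

random.seed(7)

MU_R, SD_R = 300.0, 30.0    # resistance
MU_S, SD_S = 200.0, 40.0    # load effect

def mc_failure_probability(n=100000):
    fails = 0
    for _ in range(n):
        g = random.gauss(MU_R, SD_R) - random.gauss(MU_S, SD_S)
        if g <= 0.0:
            fails += 1
    return fails / n

pf_mc = mc_failure_probability()

# For normal R and S the exact result exists via the reliability index.
beta = (MU_R - MU_S) / math.sqrt(SD_R**2 + SD_S**2)     # = 2.0 here
pf_exact = 0.5 * math.erfc(beta / math.sqrt(2.0))
```

The gap between pf_mc and pf_exact shrinks as n grows, which is exactly the sampling cost that variance-reduction schemes try to cut.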

  17. Vibration reliability analysis for aeroengine compressor blade based on support vector machine response surface method

    Institute of Scientific and Technical Information of China (English)

    GAO Hai-feng; BAI Guang-chen

    2015-01-01

    To improve the efficiency of reliability analysis for aeroengine components such as the compressor blade, a support vector machine response surface method (SRSM) is proposed. SRSM integrates the advantages of the support vector machine (SVM) and the traditional response surface method (RSM), using experimental samples to construct a suitable response surface function (RSF) that replaces the complicated and abstract finite element model. Moreover, the randomness of material parameters, structural dimensions and operating conditions is considered when extracting data, so that the response surface function agrees better with the practical model. The results indicate that, based on the same experimental data, SRSM approximates the Monte Carlo method (MCM) more closely than RSM, while SRSM (17.296 s) needs far less running time than MCM (10958 s) and RSM (9840 s). Therefore, under the same simulation conditions, SRSM has the highest analysis efficiency and can be considered a feasible and valid method for analysing structural reliability.
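The surrogate idea behind SRSM can be sketched without any SVM library: fit a cheap response surface to a few evaluations of an "expensive" performance function, then run the Monte Carlo loop on the surrogate only. Here a one-dimensional quadratic least-squares fit stands in for the SVM regressor, and the performance function is invented:

```python
import random

# Surrogate-based reliability sketch: fit a cheap response surface to a
# few samples of an "expensive" performance function, then do Monte
# Carlo on the surrogate.  An SVM regressor would play the surrogate
# role in SRSM; a quadratic fit keeps this example dependency-free.

random.seed(1)

def expensive_g(x):           # stands in for a finite-element response
    return 3.0 - 0.5 * x - 0.1 * x * x

def fit_quadratic(xs, ys):
    """Least-squares fit y ~ a + b*x + c*x^2 via the 3x3 normal equations."""
    S = [sum(x ** k for x in xs) for k in range(5)]
    T = [sum(y * (x ** k) for x, y in zip(xs, ys)) for k in range(3)]
    A = [[S[0], S[1], S[2], T[0]],
         [S[1], S[2], S[3], T[1]],
         [S[2], S[3], S[4], T[2]]]
    for i in range(3):                      # Gaussian elimination
        piv = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            for c in range(i, 4):
                A[r][c] -= f * A[i][c]
    coef = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                     # back substitution
        coef[i] = (A[i][3] - sum(A[i][c] * coef[c]
                                 for c in range(i + 1, 3))) / A[i][i]
    return coef  # a, b, c

xs = [-3.0 + 0.5 * i for i in range(13)]    # design-of-experiments points
ys = [expensive_g(x) for x in xs]
a, b, c = fit_quadratic(xs, ys)

def surrogate_g(x):
    return a + b * x + c * x * x

# Monte Carlo runs on the cheap surrogate only.
n, fails = 50000, 0
for _ in range(n):
    x = random.gauss(2.0, 1.5)
    if surrogate_g(x) <= 0.0:
        fails += 1
pf = fails / n
```

Because the 13 fitting samples replace 50 000 expensive evaluations, the speedup mirrors the SRSM-versus-MCM running-time gap the abstract reports.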

  18. Method and Application for Reliability Analysis of Measurement Data in Nuclear Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    Yun, Hun; Hwang, Kyeongmo; Lee, Hyoseoung [KEPCO E and C, Seoungnam (Korea, Republic of); Moon, Seungjae [Hanyang University, Seoul (Korea, Republic of)

    2015-02-15

    Pipe wall-thinning by flow-accelerated corrosion and various types of erosion is significant damage in the secondary-system piping of nuclear power plants (NPPs). All NPPs in Korea have management programs to ensure pipe integrity against degradation mechanisms. Ultrasonic testing (UT) is widely used for pipe wall-thickness measurement, and numerous UT measurements have been performed during scheduled outages. Wall-thinning rates are determined conservatively according to several evaluation methods developed by the Electric Power Research Institute (EPRI). The issue of reliability caused by measurement error should be considered in the evaluation process. A reliability analysis method was developed for single and multiple measurement data in previous research. This paper describes the application of the reliability analysis method to real measurement data from a scheduled outage and demonstrates its benefits.
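The measurement-error issue can be illustrated with a small calculation: a thinning rate inferred from two sets of repeated UT readings, compared against the standard error contributed by measurement scatter. The readings, interval, and 2-sigma screen below are invented and are not one of the EPRI evaluation methods:

```python
import statistics

# Sketch of the measurement-reliability question: repeated UT readings
# carry an error that can dominate a small thinning rate.  The readings
# are invented; the point is comparing the rate to its uncertainty.

# Thickness readings (mm) at the same location in two successive outages,
# separated by 1.5 effective operating years.
outage_1 = [8.42, 8.39, 8.44, 8.40, 8.41]
outage_2 = [8.31, 8.35, 8.30, 8.33, 8.34]
dt_years = 1.5

m1, m2 = statistics.mean(outage_1), statistics.mean(outage_2)
s1, s2 = statistics.stdev(outage_1), statistics.stdev(outage_2)

rate = (m1 - m2) / dt_years                       # mm / year
# Standard error of the rate from the two sample means.
se_rate = ((s1**2 / len(outage_1) + s2**2 / len(outage_2)) ** 0.5) / dt_years

# Crude 2-sigma screen: is the apparent thinning larger than the noise?
significant = rate > 2.0 * se_rate
```

When the screen fails, the apparent thinning is indistinguishable from measurement scatter, which is precisely when a conservative rate assumption matters.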

  19. An Efficient Approach for the Reliability Analysis of Phased-Mission Systems with Dependent Failures

    Science.gov (United States)

    Xing, Liudong; Meshkat, Leila; Donahue, Susan K.

    2006-01-01

    We consider the reliability analysis of phased-mission systems with common-cause failures in this paper. Phased-mission systems (PMS) are systems supporting missions characterized by multiple, consecutive, and nonoverlapping phases of operation. System components may be subject to different stresses as well as different reliability requirements throughout the course of the mission. As a result, component behavior and relationships may need to be modeled differently from phase to phase when performing a system-level reliability analysis. This consideration poses unique challenges to existing analysis methods. The challenges increase when common-cause failures (CCF) are incorporated in the model. CCF are multiple dependent component failures within a system that are a direct result of a shared root cause, such as sabotage, flood, earthquake, power outage, or human error. Many reliability studies have shown that CCF tend to increase a system's joint failure probabilities and thus contribute significantly to the overall unreliability of systems subject to CCF. We propose a separable phase-modular approach to the reliability analysis of phased-mission systems with dependent common-cause failures as one way to meet the above challenges in an efficient and elegant manner. Our methodology is twofold: first, we separate the effects of CCF from the PMS analysis using the total probability theorem and a common-cause event space developed from the elementary common causes; next, we apply an efficient phase-modular approach to analyze the reliability of the PMS. The phase-modular approach employs both combinatorial binary decision diagram and Markov-chain solution methods as appropriate. We provide an example of a reliability analysis of a PMS with both static and dynamic phases as well as CCF as an illustration of our proposed approach. The example is based on information extracted from a Mars orbiter project. The reliability model for this orbiter considers
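The first half of the methodology, separating CCF effects with the total probability theorem, can be sketched with a two-phase placeholder model. The scenarios, probabilities, and conditional phase reliabilities are invented, not the Mars-orbiter figures:

```python
# Total-probability decomposition sketch for a CCF-aware analysis:
# condition the mission reliability on each elementary common-cause
# scenario and recombine.  All numbers are invented placeholders.

def series(*rs):
    """Reliability of consecutive phases (all must succeed)."""
    out = 1.0
    for r in rs:
        out *= r
    return out

# Elementary common-cause scenarios and their occurrence probabilities
# ("none" = no common cause during the mission); they partition the space.
cc_scenarios = {"none": 0.97, "power_transient": 0.02, "radiation_event": 0.01}

# Mission reliability (phase1 * phase2) conditioned on each scenario.
conditional_R = {
    "none":            series(0.999, 0.995),
    "power_transient": series(0.95, 0.90),
    "radiation_event": series(0.90, 0.85),
}

# Total probability theorem: R = sum over scenarios of P(s) * R|s.
R_mission = sum(p * conditional_R[s] for s, p in cc_scenarios.items())
```

Each conditional term is a CCF-free PMS analysis, which is what makes the decomposition "separable": the phase-modular solver never has to model the dependence itself.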

  20. Reliability Index for Reinforced Concrete Frames using Nonlinear Pushover and Dynamic Analysis

    Directory of Open Access Journals (Sweden)

    Ahmad A. Fallah

    2009-12-01

    Full Text Available In conventional design and analysis methods, influencing parameters such as loads and material strengths are not treated as random variables. Safety factors in the current codes and standards are usually obtained on the basis of judgment and experience, which may be improper or uneconomical. In the technical literature, a method based on nonlinear static analysis is suggested to establish a reliability index for the strength of structural systems. In this paper, a method based on nonlinear dynamic analysis with rising acceleration (Incremental Dynamic Analysis) is introduced, the results of which are compared with those of the previous method (static pushover analysis), and two concepts, namely Redundancy Strength and Redundancy Variation, are proposed as indices of these effects. The Redundancy Variation Factor and Redundancy Strength Factor indices for reinforced concrete frames with varying numbers of bays and stories and different ductility potentials are computed, and ultimately the Reliability Index is determined using these two indices.