Sample records for reliability comparative analysis

  1. Comparative reliability studies and analysis of Au, Pd-coated Cu and Pd-doped Cu wire in microelectronics packaging.

    Chong Leong, Gan; Uda, Hashim


    This paper compares and discusses the wearout reliability of Gold (Au), Palladium (Pd) coated Cu and Pd-doped Cu wires used in fine-line Ball Grid Array (BGA) packages. Intermetallic compound (IMC) thickness measurements were carried out to estimate the coefficient of diffusion (Do) of the different bonding wires under various aging conditions. Wire pull and ball bond shear strengths were analyzed, and smaller variation was found in Pd-doped Cu wire than in Au and Pd-coated Cu wires. Au bonds were identified as having faster IMC formation, compared with the slower IMC growth of Cu. The obtained Weibull slopes, β, of the three bonding wires are greater than 1.0, placing the data in the wearout region. Pd-doped Cu wire exhibits longer time-to-failure and more cycles-to-failure in both wearout reliability tests, the Highly Accelerated Temperature and Humidity Stress Test (HAST) and the Temperature Cycling (TC) test. This indicates that Pd-doped Cu wire has greater potential and a higher reliability margin than Au and Pd-coated Cu wires.
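
    The Weibull slope β used above to classify wearout can be estimated by median-rank regression on a Weibull probability plot; a minimal sketch (the failure times below are invented, not the paper's data):

```python
import math

def weibull_fit(times):
    """Estimate Weibull shape (beta) and scale (eta) by median-rank regression.

    Sorted failure times get median-rank plotting positions
    F_i = (i - 0.3) / (n + 0.4); regressing ln(-ln(1 - F)) on ln(t)
    gives a line whose slope is the shape parameter beta.
    """
    t = sorted(times)
    n = len(t)
    xs = [math.log(ti) for ti in t]
    ys = [math.log(-math.log(1.0 - (i + 1 - 0.3) / (n + 0.4))) for i in range(n)]
    mx = sum(xs) / n
    my = sum(ys) / n
    beta = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    eta = math.exp(mx - my / beta)  # intercept recovered as scale parameter
    return beta, eta

# Hypothetical HAST times-to-failure (hours); beta > 1 signals wearout.
failures = [320, 410, 480, 560, 640, 700, 810, 950]
beta, eta = weibull_fit(failures)
print(beta > 1.0)
```

    A β near 1 would instead indicate random (constant-rate) failures, and β below 1 infant mortality.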

  2. Power electronics reliability analysis.

    Smith, Mark A.; Atcitty, Stanley


    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
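
    The fault-tree approach demonstrated on the fictitious device can be sketched as a recursive gate evaluation under the assumption of independent basic events (the tree and probabilities below are invented):

```python
def ft_prob(node, p):
    """Evaluate top-event probability of a fault tree with independent basic events.

    node is either a basic-event name (a key into p) or a tuple
    ('AND'|'OR', [children]). An AND gate multiplies child probabilities;
    an OR gate uses 1 - prod(1 - p_child).
    """
    if isinstance(node, str):
        return p[node]
    gate, children = node
    probs = [ft_prob(c, p) for c in children]
    out = 1.0
    if gate == 'AND':
        for q in probs:
            out *= q
        return out
    for q in probs:
        out *= (1.0 - q)
    return 1.0 - out

# Hypothetical converter: fails if the controller fails OR both
# redundant switching legs fail.
tree = ('OR', ['controller', ('AND', ['leg_a', 'leg_b'])])
p = {'controller': 0.01, 'leg_a': 0.05, 'leg_b': 0.05}
print(ft_prob(tree, p))
```

    The baseline model the report recommends would then be re-evaluated with improved component values to rank candidate reliability investments.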

  3. Application of SAW method for multiple-criteria comparative analysis of the reliability of heat supply organizations

    Akhmetova, I. G.; Chichirova, N. D.


    The analysis of heat-supply organizations is performed using the example of the Republic of Tatarstan. The assessment system is based on the construction of comparative ratings of heat-supply organizations. A rating is an assessment of an organization's reliability, is characterized by a numerical value, and makes it possible to compare organizations engaged in the same kind of activity with each other.
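
    A SAW (Simple Additive Weighting) rating reduces to criterion-wise normalization plus a weighted sum; a minimal sketch (the organizations, criteria and weights below are invented, not the paper's):

```python
def saw_scores(matrix, weights, benefit):
    """Simple Additive Weighting: normalize each criterion, then weight-sum.

    matrix[i][j] is alternative i's raw value on criterion j; benefit[j]
    marks whether larger is better (True) or smaller is better (False).
    """
    n_alt = len(matrix)
    scores = [0.0] * n_alt
    for j, w in enumerate(weights):
        col = [matrix[i][j] for i in range(n_alt)]
        best = max(col) if benefit[j] else min(col)
        for i in range(n_alt):
            norm = col[i] / best if benefit[j] else best / col[i]
            scores[i] += w * norm
    return scores

# Hypothetical heat-supply organizations rated on failure rate
# (lower is better) and equipment renewal share (higher is better).
matrix = [[0.8, 0.30],   # org A
          [0.5, 0.20],   # org B
          [1.2, 0.45]]   # org C
weights = [0.6, 0.4]
benefit = [False, True]
scores = saw_scores(matrix, weights, benefit)
print(scores)
```

    The highest score gives the top-ranked organization; the weights would in practice come from the regulator's or analyst's criteria importance assessment.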

  4. Comparative Study on Response Surfaces for Reliability Analysis of Spatially Variable Soil Slope

    李亮; 褚雪松


    This paper focuses on the performance of second-order polynomial-based response surfaces for the reliability analysis of spatially variable soil slopes. A single response surface constructed to approximate the slope system failure performance function G(X) (called single RS) and multiple response surfaces constructed on a finite number of slip surfaces (called multiple RS) are developed, respectively. Single RS and multiple RS are applied to evaluate the system failure probability pf for a cohesive soil slope together with Monte Carlo simulation (MCS). It is found that pf calculated by single RS deviates significantly from that obtained by searching a large number of potential slip surfaces, and this deviation becomes insignificant as the number of random variables decreases or the scale of fluctuation increases. In other words, single RS cannot approximate G(X) with reasonable accuracy. The value of pf from multiple response surfaces agrees well with that obtained by searching a large number of potential slip surfaces. That is, multiple RS can estimate G(X) with reasonable accuracy.
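
    The MCS benchmark against which the response surfaces are compared can be sketched for a toy performance function (this stand-in G(X), a linear resistance-minus-load model with normal variables, is illustrative and not the slope model):

```python
import random

def mc_failure_prob(g, sample, n=200_000, seed=1):
    """Crude Monte Carlo estimate of pf = P[G(X) <= 0]."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n) if g(sample(rng)) <= 0.0)
    return fails / n

def g(x):
    r, s = x          # resistance, load
    return r - s      # failure when load exceeds resistance

def sample(rng):
    return (rng.gauss(10.0, 1.0), rng.gauss(7.0, 1.0))

pf = mc_failure_prob(g, sample)
print(pf)
```

    For this linear case the exact answer is Phi(-3/sqrt(2)), about 0.017; a response surface would replace the (here trivial) g with a fitted polynomial when each evaluation is an expensive slope-stability run.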

  5. Interrater reliability of schizoaffective disorder compared with schizophrenia, bipolar disorder, and unipolar depression - A systematic review and meta-analysis.

    Santelmann, Hanno; Franklin, Jeremy; Bußhoff, Jana; Baethge, Christopher


    Schizoaffective disorder is a common diagnosis in clinical practice but its nosological status has been subject to debate ever since it was conceptualized. Although it is key that diagnostic reliability is sufficient, schizoaffective disorder has been reported to have low interrater reliability. Evidence based on systematic review and meta-analysis methods, however, is lacking. Using a highly sensitive literature search in Medline, Embase, and PsycInfo we identified studies measuring the interrater reliability of schizoaffective disorder in comparison to schizophrenia, bipolar disorder, and unipolar disorder. Out of 4126 records screened we included 25 studies reporting on 7912 patients diagnosed by different raters. The interrater reliability of schizoaffective disorder was moderate (meta-analytic estimate of Cohen's kappa 0.57 [95% CI: 0.41-0.73]), and substantially lower than that of its main differential diagnoses (difference in kappa between 0.22 and 0.19). Although there was considerable heterogeneity, analyses revealed that the interrater reliability of schizoaffective disorder was consistently lower in the overwhelming majority of studies. The results remained robust in subgroup and sensitivity analyses (e.g., diagnostic manual used) as well as in meta-regressions (e.g., publication year) and analyses of publication bias. Clinically, the results highlight the particular importance of diagnostic re-evaluation in patients diagnosed with schizoaffective disorder. They also quantify a widely held clinical impression of lower interrater reliability and agree with earlier meta-analysis reporting low test-retest reliability.
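
    Cohen's kappa, the statistic meta-analyzed above, can be computed from an inter-rater contingency table; a minimal sketch (the counts below are invented, not study data):

```python
def cohens_kappa(table):
    """Cohen's kappa from a square inter-rater contingency table.

    table[i][j] counts cases placed in category i by rater 1 and
    category j by rater 2. Kappa is chance-corrected agreement:
    (p_observed - p_expected) / (1 - p_expected).
    """
    n = sum(sum(row) for row in table)
    k = len(table)
    po = sum(table[i][i] for i in range(k)) / n
    row = [sum(table[i][j] for j in range(k)) / n for i in range(k)]
    col = [sum(table[i][j] for i in range(k)) / n for j in range(k)]
    pe = sum(row[i] * col[i] for i in range(k))
    return (po - pe) / (1.0 - pe)

# Hypothetical 2x2 table: schizoaffective vs. other diagnosis, two raters.
table = [[20, 10],
         [5, 65]]
print(round(cohens_kappa(table), 3))
```

    Values around 0.4-0.6 are conventionally read as moderate agreement, which is the range the meta-analysis reports for schizoaffective disorder.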

  6. System Reliability Analysis: Foundations.


    Performance formulas for systems subject to preventive maintenance are given. The record also treats two-terminal network reliability, h(p), the probability that a source s can communicate with a terminal t. For undirected networks, the basic reference is A. Satyanarayana and Kevin Wood (1982); for directed networks, the basic reference is Avinash...

  7. ATLAS reliability analysis

    Bartsch, R.R.


    Key elements of the 36 MJ ATLAS capacitor bank have been evaluated for individual probabilities of failure. These have been combined to estimate system reliability which is to be greater than 95% on each experimental shot. This analysis utilizes Weibull or Weibull-like distributions with increasing probability of failure with the number of shots. For transmission line insulation, a minimum thickness is obtained and for the railgaps, a method for obtaining a maintenance interval from forthcoming life tests is suggested.

  8. Comparative measurement of collagen bundle orientation by Fourier analysis and semiquantitative evaluation: reliability and agreement in Masson's trichrome, Picrosirius red and confocal microscopy techniques.

    Marcos-Garcés, V; Harvat, M; Molina Aguilar, P; Ferrández Izquierdo, A; Ruiz-Saurí, A


    Measurement of collagen bundle orientation in histopathological samples is a widely used and useful technique in many research and clinical scenarios. Fourier analysis is the preferred method for performing this measurement, but the most appropriate staining and microscopy technique remains unclear. Some authors advocate the use of Haematoxylin-Eosin (H&E) and confocal microscopy, but there are no studies comparing this technique with other classical collagen stainings. In our study, 46 human skin samples were collected, processed for histological analysis and stained with Masson's trichrome, Picrosirius red and H&E. Five microphotographs of the reticular dermis were taken with a 200× magnification with light microscopy, polarized microscopy and confocal microscopy, respectively. Two independent observers measured collagen bundle orientation with semiautomated Fourier analysis with the Image-Pro Plus 7.0 software and three independent observers performed a semiquantitative evaluation of the same parameter. The average orientation for each case was calculated with the values of the five pictures. We analyzed the interrater reliability, the consistency between Fourier analysis and average semiquantitative evaluation and the consistency between measurements in Masson's trichrome, Picrosirius red and H&E-confocal. Statistical analysis for reliability and agreement was performed with the SPSS 22.0 software and consisted of intraclass correlation coefficient (ICC), Bland-Altman plots and limits of agreement and coefficient of variation. Interrater reliability was almost perfect (ICC > 0.8) with all three histological and microscopy techniques and always superior in Fourier analysis than in average semiquantitative evaluation. Measurements were consistent between Fourier analysis by one observer and average semiquantitative evaluation by three observers, with an almost perfect agreement with Masson's trichrome and Picrosirius red techniques (ICC > 0.8) and a strong
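
    The Bland-Altman limits of agreement used in the statistical analysis reduce to the mean paired difference plus or minus 1.96 standard deviations of the differences; a minimal sketch (the orientation angles below are invented):

```python
import math

def bland_altman_limits(x, y):
    """Mean difference (bias) and 95% limits of agreement between two raters."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean_d = sum(d) / n
    sd = math.sqrt(sum((v - mean_d) ** 2 for v in d) / (n - 1))  # sample SD
    return mean_d, mean_d - 1.96 * sd, mean_d + 1.96 * sd

# Hypothetical collagen bundle orientations (degrees) from two observers.
obs1 = [42.0, 55.5, 38.2, 61.0, 47.3]
obs2 = [40.5, 56.0, 39.0, 59.5, 46.8]
bias, lo, hi = bland_altman_limits(obs1, obs2)
print(bias, lo, hi)
```

    If the limits are narrow relative to clinically meaningful differences, the two measurement techniques can be treated as interchangeable.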

  9. Reliability Analysis of Wind Turbines

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard


    In order to minimise the total expected life-cycle costs of a wind turbine, it is important to estimate the reliability level of all components in the wind turbine. This paper deals with reliability analysis of the tower and blades of onshore wind turbines placed in a wind farm. The limit states considered are, in the ultimate limit state (ULS), extreme conditions in the standstill position and extreme conditions during operation. For wind turbines, where the magnitude of the loads is influenced by the control system, the ultimate limit state can occur in both cases. In the fatigue limit state (FLS), the reliability level for a wind turbine placed in a wind farm is considered, and wake effects from neighbouring wind turbines are taken into account. An illustrative example with calculation of the reliability for mudline bending of the tower is considered. In the example the design is determined according...

  10. Hybrid reliability model for fatigue reliability analysis of steel bridges

    曹珊珊; 雷俊卿


    A hybrid reliability model is presented to solve fatigue reliability problems of steel bridges. The cumulative damage model is one of the models used in fatigue reliability analysis; its parameter characteristics can be described as probabilistic and interval. A two-stage hybrid reliability model is given, with a theoretical foundation and a solving algorithm for hybrid reliability problems. The theoretical foundation is established by the consistency relationships between the interval reliability model and the probability reliability model with normally distributed variables. The solving process combines the definition of the interval reliability index with the probabilistic algorithm. Considering the parameter characteristics of the S-N curve, a cumulative damage model with hybrid variables is given based on standards from different countries. Lastly, a steel-structure case from the Neville Island Bridge is analyzed to verify the applicability of the hybrid reliability model in fatigue reliability analysis based on the AASHTO standard.

  11. Sensitivity Analysis of Component Reliability



    In a system, every component has a unique position and unique failure characteristics. When a component's reliability changes, the effect on system reliability differs from component to component. Component reliability sensitivity is a measure of the effect on system reliability of a change in a component's reliability. In this paper, the definition and the relative matrix of component reliability sensitivity are proposed, and some of their characteristics are analyzed. These results help to analyse and improve system reliability.
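
    The sensitivity described above corresponds to the classical Birnbaum importance, the partial derivative of system reliability with respect to a component's reliability; a finite-difference sketch on a small series-parallel system (structure and values invented):

```python
def system_rel(r):
    """Component 0 in series with a parallel pair (components 1 and 2)."""
    return r[0] * (1.0 - (1.0 - r[1]) * (1.0 - r[2]))

def birnbaum(rel_fn, r, i, h=1e-6):
    """Reliability sensitivity dR_sys/dr_i by central finite difference."""
    up = list(r); up[i] += h
    dn = list(r); dn[i] -= h
    return (rel_fn(up) - rel_fn(dn)) / (2.0 * h)

r = [0.95, 0.90, 0.80]
sens = [birnbaum(system_rel, r, i) for i in range(3)]
print(sens)
```

    As expected, the series component dominates: improving it moves system reliability far more than improving either redundant component.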

  12. Reliability Analysis of Sensor Networks

    JIN Yan; YANG Xiao-zong; WANG Ling


    To integrate the capabilities of sensing, communication, computing, and actuating, one of the compelling technological advances of recent years has been the appearance of the distributed wireless sensor network (DSN) for information-gathering tasks. In order to save energy under limited resources, multi-hop routing between the sensor nodes and the sink node is necessary. In addition, unpredictable environmental factors make the sensor nodes unreliable. In this paper, the reliability of routing designed for sensor networks and some dependability issues of DSNs, such as MTTF (mean time to failure) and the probability of connectivity between the sensor nodes and the sink node, are analyzed. An exact result cannot be obtained for an arbitrary network topology, which is a #P-hard problem, so a reliability analysis of restricted, clustering-based topologies is given. The method proposed in this paper shows a constructive idea about how to place energy-constrained sensor nodes in the network efficiently from the perspective of reliability.
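
    The #P-hardness mentioned above is why exact node-to-sink connectivity probability is only computed by enumeration on small topologies; a brute-force sketch (the five-link bridge topology and link reliability p = 0.9 are illustrative):

```python
from itertools import product

def two_terminal_reliability(links, p, s, t):
    """Exact s-t connectivity probability by enumerating all link states.

    Exponential in the number of links, so only viable for small
    topologies -- consistent with the general problem being #P-hard.
    """
    def connected(up):
        reach, frontier = {s}, [s]
        while frontier:
            u = frontier.pop()
            for (a, b) in up:
                for (x, y) in ((a, b), (b, a)):  # undirected link
                    if x == u and y not in reach:
                        reach.add(y)
                        frontier.append(y)
        return t in reach

    total = 0.0
    for states in product([0, 1], repeat=len(links)):
        prob, up = 1.0, []
        for st, link in zip(states, links):
            prob *= p if st else (1.0 - p)
            if st:
                up.append(link)
        if connected(up):
            total += prob
    return total

# Hypothetical 4-node cluster: sensor 0 to sink 3 over a bridge network.
links = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
rel = two_terminal_reliability(links, 0.9, 0, 3)
print(rel)
```

    Restricted topologies such as the clustered ones analyzed in the paper admit closed-form or polynomial-time evaluation instead of this exponential enumeration.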

  13. Reliability Analysis of Slope Stability by Central Point Method

    Li, Chunge; WU Congliang


    Given the uncertainty and variability of slope stability analysis parameters, this paper proceeds from the perspective of probability theory and statistics based on reliability theory. Through the central point method of reliability analysis, a performance function for the reliability of slope stability analysis is established. Furthermore, the central point method and conventional limit equilibrium methods are compared through a calculation example. The approach's numerical ...

  14. Support vector machine for structural reliability analysis


    LI Hong-shuang; LÜ Zhen-zhou; YUE Zhu-feng


    Support vector machine (SVM) was introduced to analyze the reliability of an implicit performance function, which is difficult to handle with classical methods such as the first-order reliability method (FORM) and Monte Carlo simulation (MCS). As a classification method employing the structural risk minimization inference rule, SVM possesses excellent learning capacity from a small amount of information and good generalization over the complete data. Hence, two approaches, SVM-based FORM and SVM-based MCS, were presented for the structural reliability analysis of an implicit limit state function. Compared to the conventional response surface method (RSM) and the artificial neural network (ANN), which are widely used to replace the implicit state function to alleviate the computational cost, the more important advantages of SVM are that it can approximate the implicit function with higher precision and better generalization from a small amount of information, and that it avoids the "curse of dimensionality". The SVM-based reliability approaches can approximate the actual performance function over the complete sampling data with a decreased number of implicit performance function analyses (usually finite element analyses), and the computational precision can satisfy engineering requirements, as demonstrated by illustrations.

  15. Creep-rupture reliability analysis

    Peralta-Duran, A.; Wirsching, P. H.


    A probabilistic approach to the correlation and extrapolation of creep-rupture data is presented. Time-temperature parameters (TTPs) are used to correlate the data, and an analytical expression for the master curve is developed. The expression provides a simple model for the statistical distribution of strength and fits neatly into a probabilistic design format. The analysis focuses on the Larson-Miller and Manson-Haferd parameters, but it can be applied to any of the TTPs. A method is developed for evaluating material-dependent constants for TTPs. It is shown that optimized constants can provide a significant improvement in the correlation of the data, thereby reducing modelling error. Attempts were made to quantify the performance of the proposed method in predicting long-term behavior. Uncertainty in predicting long-term behavior from short-term tests was derived for several sets of data. Examples are presented which illustrate the theory and demonstrate the application of state-of-the-art reliability methods to the design of components under creep.

  16. On Bayesian System Reliability Analysis

    Soerensen Ringi, M.


    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory, and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so-called frequentist school. A new model for system reliability prediction is given in two papers. The model incorporates the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non-identical environments. 85 refs.
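
    The knowledge-updating view lends itself to conjugate Bayesian updating; a minimal sketch with a Beta prior on a demand-failure probability (the prior and the counts below are invented, and this is standard Beta-Binomial updating rather than the thesis's dependent-failure model):

```python
def beta_update(alpha, beta, successes, failures):
    """Conjugate update of a Beta(alpha, beta) prior on a failure probability."""
    return alpha + failures, beta + successes

# Hypothetical: vague Beta(1, 1) prior; 2 failures observed in 50 demands.
a, b = beta_update(1.0, 1.0, successes=48, failures=2)
posterior_mean = a / (a + b)
print(posterior_mean)
```

    Each new batch of operating evidence tightens the posterior, which is exactly the "reliability changes as the state of knowledge changes" position; pooling data from similar components in other environments would enter through a hierarchical prior.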

  17. Reliability Analysis of Money Habitudes

    Delgadillo, Lucy M.; Bushman, Brittani S.


    Use of the Money Habitudes exercise has gained popularity among various financial professionals. This article reports on the reliability of this resource. A survey administered to young adults at a western state university was conducted, and each Habitude or "domain" was analyzed using Cronbach's alpha procedures. Results showed all six…

  18. How to assess and compare inter-rater reliability, agreement and correlation of ratings: an exemplary analysis of mother-father and parent-teacher expressive vocabulary rating pairs

    Margarita eStolarova


    This report has two main purposes. First, we combine well-known analytical approaches to conduct a comprehensive assessment of agreement and correlation of rating pairs and to disentangle these often confused concepts, providing a best-practice example on concrete data and a tutorial for future reference. Second, we explore whether a screening questionnaire developed for use with parents can be reliably employed with daycare teachers when assessing early expressive vocabulary. A total of 53 vocabulary rating pairs (34 parent-teacher and 19 mother-father pairs) collected for two-year-old children (12 bilingual) are evaluated. First, inter-rater reliability both within and across subgroups is assessed using the intra-class correlation coefficient (ICC). Next, based on this analysis of reliability and on the test-retest reliability of the employed tool, inter-rater agreement is analyzed, and the magnitude and direction of rating differences are considered. Finally, Pearson correlation coefficients of standardized vocabulary scores are calculated and compared across subgroups. The results underline the necessity of distinguishing between reliability measures, agreement and correlation. They also demonstrate the impact of the employed reliability on agreement evaluations. This study provides evidence that parent-teacher ratings of children's early vocabulary can achieve agreement and correlation comparable to those of mother-father ratings on the assessed vocabulary scale. Bilingualism of the evaluated child decreased the likelihood of raters' agreement. We conclude that future reports of agreement, correlation and reliability of ratings will benefit from better definition of terms and stricter methodological approaches. The methodological tutorial provided here holds the potential to increase comparability across empirical reports and can help improve research practices and knowledge transfer to educational and therapeutic settings.

  19. Combination of structural reliability and interval analysis

    Zhiping Qiu; Di Yang; Isaac Elishakoff


    In engineering applications, probabilistic reliability theory is presently the most important method; however, in many cases precise probabilistic reliability theory cannot be considered an adequate and credible model of the real state of affairs. In this paper, we develop a hybrid of probabilistic and non-probabilistic reliability theory, which describes the structural uncertain parameters as interval variables when statistical data are insufficient. Using interval analysis, a new method for calculating the interval of the structural reliability as well as the reliability index is introduced, and traditional probabilistic theory is incorporated with the interval analysis. Moreover, the new method preserves the useful part of traditional probabilistic reliability theory but removes the restriction of its strict requirements on data acquisition. An example is presented to demonstrate the feasibility and validity of the proposed theory.
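
    The interval side of such a hybrid can be sketched with minimal interval arithmetic; one common non-probabilistic reliability index is the centre-to-radius ratio of the interval of g = R - S (the interval bounds below are illustrative, not the paper's example):

```python
class Interval:
    """Minimal interval arithmetic for a monotone limit-state expression."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __sub__(self, other):
        # [a, b] - [c, d] = [a - d, b - c]
        return Interval(self.lo - other.hi, self.hi - other.lo)

# Interval resistance and load when statistical data are insufficient.
R = Interval(9.0, 11.0)
S = Interval(5.0, 7.0)
g = R - S

center = (g.hi + g.lo) / 2.0
radius = (g.hi - g.lo) / 2.0
eta = center / radius
print(eta)  # eta > 1 means g > 0 for every parameter realization
```

    In the hybrid model, variables with adequate statistics stay probabilistic while the data-poor ones are treated this way, yielding an interval on the reliability index rather than a single value.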

  20. Integrated Methodology for Software Reliability Analysis

    Marian Pompiliu CRISTESCU


    The techniques most used to ensure safety and reliability of systems are applied together as a whole, and in most cases the software components are overlooked or too little analyzed. The present paper describes the applicability of fault tree analysis to software systems, an analysis defined as Software Fault Tree Analysis (SFTA); the fault trees are evaluated using binary decision diagrams, all of these being integrated and used with help from a Java reliability library.

  1. Reliability Sensitivity Analysis for Location Scale Family

    洪东跑; 张海瑞


    Many products operate under various complex environmental conditions. To describe the dynamic influence of environmental factors on their reliability, a method of reliability sensitivity analysis is proposed. In this method, the location parameter is assumed to be a function of relevant environmental variables, while the scale parameter is assumed to be an unknown positive constant. The location parameter function is then constructed using the method of radial basis functions. Using the varied-environment test data, the log-likelihood function is transformed to a generalized linear expression by describing the indicator as a Poisson variable. With the generalized linear model, the maximum likelihood estimates of the model coefficients are obtained, and with the reliability model, the reliability sensitivity is obtained. An instance analysis shows that the method is feasible for analyzing the dynamic variation of reliability with environmental factors and is straightforward for engineering application.

  2. Space Mission Human Reliability Analysis (HRA) Project

    National Aeronautics and Space Administration — The purpose of this project is to extend current ground-based Human Reliability Analysis (HRA) techniques to a long-duration, space-based tool to more effectively...

  3. Production Facility System Reliability Analysis Report

    Dale, Crystal Buchanan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Klein, Steven Karl [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    This document describes the reliability, maintainability, and availability (RMA) modeling of the Los Alamos National Laboratory (LANL) design for the Closed Loop Helium Cooling System (CLHCS) planned for the NorthStar accelerator-based 99Mo production facility. The current analysis incorporates a conceptual helium recovery system, beam diagnostics, and prototype control system into the reliability analysis. The results from the 1000 hr blower test are addressed.

  4. Software reliability experiments data analysis and investigation

    Walker, J. Leslie; Caglayan, Alper K.


    The objectives are to investigate the fundamental reasons which cause independently developed software programs to fail dependently, and to examine fault tolerant software structures which maximize reliability gain in the presence of such dependent failure behavior. The authors used 20 redundant programs from a software reliability experiment to analyze the software errors causing coincident failures, to compare the reliability of N-version and recovery block structures composed of these programs, and to examine the impact of diversity on software reliability using subpopulations of these programs. The results indicate that both conceptually related and unrelated errors can cause coincident failures and that recovery block structures offer more reliability gain than N-version structures if acceptance checks that fail independently from the software components are available. The authors present a theory of general program checkers that have potential application for acceptance tests.

  5. How useful and reliable are disaster databases in the context of climate and global change? A comparative case study analysis in Peru

    C. Huggel


    Loss and damage caused by weather- and climate-related disasters have increased over the past decades, and growing exposure and wealth have been identified as the main drivers of this increase. Disaster databases are a primary tool for the analysis of disaster characteristics and trends at global or national scales, and they support disaster risk reduction and climate change adaptation. However, the quality, consistency and completeness of different disaster databases are highly variable. Even though such variation critically influences the outcome of any study, comparative analyses of different disaster databases are still rare to date. Furthermore, there is an unequal geographic distribution of current disaster trend studies, with developing countries being under-represented. Here, we analyze three different disaster databases for the developing-country context of Peru: a global database (EM-DAT), a regional Latin American database (DesInventar) and a national database (SINPAD). The analysis is performed across three dimensions: (1) spatial scales, from local to regional (provincial) and national scale; (2) time scales, from single events to decadal trends; and (3) disaster categories and metrics, including the number of disaster occurrences and damage metrics such as people killed and affected. Results show limited changes in disaster occurrence in the Cusco and Apurímac regions in southern Peru over the past four decades, but strong trends in people affected at the national scale. We furthermore found large variations in the disaster parameters studied over different spatial and temporal scales, depending on the disaster database analyzed. We conclude and recommend that the type, method and source of documentation should be carefully evaluated for any analysis of disaster databases; reporting criteria should be improved and documentation efforts strengthened.

  6. Reliability Analysis of DOOF for Weibull Distribution

    陈文华; 崔杰; 樊小燕; 卢献彪; 相平


    A hierarchical Bayesian method for estimating the failure probability under DOOF, taking the quasi-Beta distribution as the prior distribution, is proposed in this paper. The weighted least-squares estimation method is used to obtain formulas for computing the reliability distribution parameters and estimating the reliability characteristic values under DOOF. Taking one type of aerospace electrical connector as an example, the correctness of the above method is verified through statistical analysis of electrical connector accelerated life test data.

  7. A comparative study on the HW reliability assessment methods for digital I and C equipment

    Jung, Hoan Sung; Sung, T. Y.; Eom, H. S.; Park, J. K.; Kang, H. G.; Lee, G. Y. [Korea Atomic Energy Research Institute, Taejeon (Korea); Kim, M. C. [Korea Advanced Institute of Science and Technology, Taejeon (Korea); Jun, S. T. [KHNP, Taejeon (Korea)


    It is necessary to predict or evaluate the reliability of electronic equipment for the probabilistic safety analysis of digital instrumentation and control (I and C) equipment. However, most databases for reliability prediction have no data for up-to-date equipment, and the failure modes are not classified. Prediction results for a specific component show different values according to the methods and databases used, and for boards and systems each method likewise shows different values. This study predicts the reliability of the PDC system for Wolsong NPP1 as a digital I and C equipment case. Various reliability prediction methods and failure databases are used in calculating the reliability, to compare the sensitivity and accuracy of each model and database. Many considerations for the reliability assessment of digital systems are derived from the results of this study. 14 refs., 19 figs., 15 tabs. (Author)

  8. Reliability analysis of flood defence systems

    Steenbergen, H.M.G.M.; Lassing, B.L.; Vrouwenvelder, A.C.W.M.; Waarts, P.H.


    In recent years an advanced program for the reliability analysis of flood defence systems has been under development. This paper describes the global data requirements for the application and the setup of the models. The analysis generates the probability of system failure and the contribution of ea

  9. Reliability Analysis of High Rockfill Dam Stability

    Ping Yi


    A program 3DSTAB combining slope stability analysis and reliability analysis is developed and validated. In this program, the limit equilibrium method is utilized to calculate safety factors of critical slip surfaces. The first-order reliability method is used to compute reliability indexes corresponding to critical probabilistic surfaces. When derivatives of the performance function are calculated by finite difference method, the previous iteration’s critical slip surface is saved and used. This sequential approximation strategy notably improves efficiency. Using this program, the stability reliability analyses of concrete faced rockfill dams and earth core rockfill dams with different heights and different slope ratios are performed. The results show that both safety factors and reliability indexes decrease as the dam’s slope increases at a constant height and as the dam’s height increases at a constant slope. They decrease dramatically as the dam height increases from 100 m to 200 m while they decrease slowly once the dam height exceeds 250 m, which deserves attention. Additionally, both safety factors and reliability indexes of the upstream slope of earth core rockfill dams are higher than that of the downstream slope. Thus, the downstream slope stability is the key failure mode for earth core rockfill dams.
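
    The first-order reliability index used above can be illustrated for the simplest case, a linear limit state g = R - S with independent normal resisting and driving terms (the means and standard deviations below are invented, not dam values):

```python
import math

def form_beta(mu_r, sd_r, mu_s, sd_s):
    """First-order reliability index for linear g = R - S with
    independent normal R and S; the failure probability is Phi(-beta)."""
    beta = (mu_r - mu_s) / math.sqrt(sd_r ** 2 + sd_s ** 2)
    pf = 0.5 * math.erfc(beta / math.sqrt(2.0))  # standard normal CDF at -beta
    return beta, pf

# Hypothetical resisting vs. driving moment for one slip surface.
beta, pf = form_beta(mu_r=1500.0, sd_r=150.0, mu_s=1000.0, sd_s=200.0)
print(beta, pf)
```

    For nonlinear performance functions, FORM instead searches for the design point iteratively, which is where 3DSTAB's saved-slip-surface strategy speeds up the finite-difference derivatives.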

  10. Culture Representation in Human Reliability Analysis

    David Gertman; Julie Marble; Steven Novack


    Understanding human-system response is critical to being able to plan and predict mission success in the modern battlespace. Commonly, human reliability analysis has been used to predict failures of human performance in complex, critical systems. However, most human reliability methods fail to take culture into account. This paper takes an easily understood state of the art human reliability analysis method and extends that method to account for the influence of culture, including acceptance of new technology, upon performance. The cultural parameters used to modify the human reliability analysis were determined from two standard industry approaches to cultural assessment: Hofstede’s (1991) cultural factors and Davis’ (1989) technology acceptance model (TAM). The result is called the Culture Adjustment Method (CAM). An example is presented that (1) reviews human reliability assessment with and without cultural attributes for a Supervisory Control and Data Acquisition (SCADA) system attack, (2) demonstrates how country specific information can be used to increase the realism of HRA modeling, and (3) discusses the differences in human error probability estimates arising from cultural differences.
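
    The multiplicative adjustment pattern that SPAR-H-style methods apply to performance shaping factors, and that a cultural adjustment would extend with additional factors, can be sketched as follows (the nominal HEP and multipliers are invented, and the "technology acceptance" factor is an assumption for illustration, not CAM's published value):

```python
def adjusted_hep(nominal_hep, psf_multipliers):
    """Scale a nominal human error probability (HEP) by performance
    shaping factor multipliers, capping the result at 1.0."""
    hep = nominal_hep
    for m in psf_multipliers:
        hep *= m
    return min(hep, 1.0)

# Hypothetical SCADA operator action: nominal HEP 0.001, degraded by
# high stress (x2) and low technology acceptance (x5, assumed factor).
print(adjusted_hep(0.001, [2.0, 5.0]))
```

    Country-specific cultural data would then set the extra multipliers per task, which is how the paper's example produces different error probability estimates across cultures.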

  11. Reliability Analysis of a Steel Frame

    M. Sýkora


    A steel frame with haunches is designed according to Eurocodes. The frame is exposed to self-weight, snow, and wind actions. Lateral-torsional buckling appears to be the most critical criterion and is taken as the basis for the limit state function. In the reliability analysis, the probabilistic models proposed by the Joint Committee on Structural Safety (JCSS) are used for the basic variables. The uncertainty model coefficients take into account the inaccuracy of the resistance model for the haunched girder and of the action effect model. The time-invariant reliability analysis is based on Turkstra's rule for combinations of snow and wind actions, while the time-variant analysis describes snow and wind actions by jump processes with intermittencies. Assuming a 50-year lifetime, the obtained values of the reliability index β vary from 3.95 to 5.56. The cross-section IPE 330 designed according to Eurocodes therefore seems adequate. It appears that the time-invariant reliability analysis based on Turkstra's rule provides considerably lower values of β than the time-variant analysis.

  12. Discrete event simulation versus conventional system reliability analysis approaches

    Kozine, Igor


    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to the conventional human reliability analysis models...... and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper describes briefly the author’s experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...

  13. Event/Time/Availability/Reliability-Analysis Program

    Viterna, L. A.; Hoffman, D. J.; Carr, Thomas


    ETARA is an interactive, menu-driven program that performs simulations for the analysis of reliability, availability, and maintainability. It was written to evaluate the performance of the electrical power system of Space Station Freedom, but the methodology and software can be applied to any system that can be represented by a block diagram. The program is written in IBM APL.

  14. Reliability analysis of ship structure system with multi-defects


    This paper analyzes the influence of multiple defects, including initial distortions, welding residual stresses, cracks and local dents, on the ultimate strength of the plate element, and derives expressions for the reliability calculation and sensitivity analysis of the plate element. A reliability analysis is then made for a system of plate elements with multiple defects. The failure mechanism, failure paths and the approach to calculating the global reliability index are also worked out. After plate elements with multiple defects fail, a formula for the reverse node forces acting on the residual structure is deduced, as are the sensitivity expressions of the system reliability index. This ensures the accuracy and rationality of the reliability analysis and makes it convenient to find the weak plate elements that govern the reliability of the structural system. Finally, to validate the proposed approach, a numerical example of a ship cabin is used to compare the reliability and sensitivity analysis of the structural system with multiple defects against those of the system without defects. The approach has implications for structural design, rational maintenance and renewal strategy.

  15. Reliability analysis of DOOF for Weibull distribution

    陈文华; 崔杰; 樊晓燕; 卢献彪; 相平


    A hierarchical Bayesian method for estimating the failure probability pi under DOOF, taking the quasi-Beta distribution B(p_{i-1}, 1, 1, b) as the prior distribution, is proposed in this paper. The weighted least squares estimation method was used to obtain formulas for computing the reliability distribution parameters and estimating the reliability characteristic values under DOOF. Taking one type of aerospace electrical connector as an example, the correctness of the above method was verified through statistical analysis of accelerated life test data for the connector.
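
    The least-squares route from failure data to Weibull reliability estimates can be sketched as follows. To keep the example minimal it uses ordinary rather than weighted least squares, and the failure times are invented, not the paper's connector data:

```python
import math

# Illustrative failure times (hours)
t = [150.0, 320.0, 520.0, 780.0, 1100.0, 1600.0]
n = len(t)

# Median-rank plotting positions (Benard's approximation)
F = [(i - 0.3) / (n + 0.4) for i in range((1), n + 1)]

# Linearize the Weibull CDF: ln(-ln(1-F)) = beta*ln(t) - beta*ln(eta)
x = [math.log(ti) for ti in t]
y = [math.log(-math.log(1 - Fi)) for Fi in F]

# Ordinary least squares: slope = shape parameter beta
xbar, ybar = sum(x) / n, sum(y) / n
beta = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
        / sum((xi - xbar) ** 2 for xi in x))
eta = math.exp(xbar - ybar / beta)  # scale parameter from the intercept

# Reliability at a mission time: R(t) = exp(-(t/eta)**beta)
R1000 = math.exp(-(1000.0 / eta) ** beta)
print(round(beta, 2), round(eta, 1), round(R1000, 3))
```

    A weighted fit, as in the paper, would simply replace the ordinary least-squares sums with weighted sums.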

  16. Comparative reliability of cheiloscopy and palatoscopy in human identification

    Sharma Preeti


    Background: Establishing a person's identity in postmortem scenarios can be a very difficult process. Dental records, fingerprint and DNA comparisons are probably the most common techniques used in this context, allowing fast and reliable identification. However, under certain circumstances they cannot always be used; sometimes it is necessary to apply different and less known techniques. In forensic identification, lip prints and palatal rugae patterns can provide important information and help in a person's identification. This study aims to ascertain the use of lip prints and palatal rugae patterns in identification and sex differentiation. Materials and Methods: A total of 100 subjects, 50 males and 50 females, were selected from among the students of Subharti Dental College, Meerut. The materials used to record lip prints were lipstick, bond paper, cellophane tape, a brush for applying the lipstick, and a magnifying lens. To study palatal rugae, alginate impressions were taken and the dental casts analyzed for their various patterns. Results: Statistical analysis (applying the Z-test for proportions) showed a significant difference for type I, I′, IV and V lip patterns (P < 0.05) between males and females, while no significant difference was observed in the palatal rugae patterns (P > 0.05). Conclusion: This study showed not only that palatal rugae and lip prints are unique to an individual, but also that lip prints are more reliable for recognition of the sex of an individual.

  17. Requalification of offshore structures. Reliability analysis of platform

    Bloch, A.; Dalsgaard Soerensen, J. [Aalborg Univ. (Denmark)]


    A preliminary reliability analysis has been performed for an example platform. In order to model the structural response such that it is possible to calculate reliability indices, approximate quadratic response surfaces have been determined for cross-sectional forces. Based on a deterministic, code-based analysis the elements and joints which can be expected to be the most critical are selected and response surfaces are established for the cross-sectional forces in those. A stochastic model is established for the uncertain variables. The reliability analysis shows that with this stochastic model the smallest reliability indices for elements are about 3.9. The reliability index for collapse (pushover) is estimated to 6.7 and the reliability index for fatigue failure using a crude model is for the expected most critical detail estimated to 3.2, corresponding to the accumulated damage during the design lifetime of the platform. These reliability indices are considered to be reasonable compared with values recommended by e.g. ISO. The most important stochastic variables are found to be the wave height and the drag coefficient (including the model uncertainty related to estimation of wave forces on the platform). (au)

  18. Human reliability analysis of control room operators

    Santos, Isaac J.A.L.; Carvalho, Paulo Victor R.; Grecco, Claudio H.S. [Instituto de Engenharia Nuclear (IEN), Rio de Janeiro, RJ (Brazil)]


    Human reliability is the probability that a person correctly performs a system-required action in a required time period and performs no extraneous action that can degrade the system. Human reliability analysis (HRA) is the analysis, prediction and evaluation of work-oriented human performance using indices such as human error likelihood and probability of task accomplishment. Significant progress has been made in the HRA field in recent years, mainly in the nuclear area. First-generation HRA methods were developed, such as THERP (Technique for Human Error Rate Prediction). Now an array of so-called second-generation methods is emerging as alternatives, for instance ATHEANA (A Technique for Human Event Analysis). The ergonomics approach uses ergonomic work analysis as its tool. It focuses on the study of the operator's activities in physical and mental form, considering at the same time the observed characteristics of the operator and the elements of the work environment as they are presented to and perceived by the operators. The aim of this paper is to propose a methodology to analyze the human reliability of industrial plant control room operators, using a framework that combines the approaches used by ATHEANA and THERP with ergonomic work analysis. (author)

  19. Bridging Resilience Engineering and Human Reliability Analysis

    Ronald L. Boring


    There has been strong interest in the new and emerging field called resilience engineering. This field has been quick to align itself with many existing safety disciplines, but it has also distanced itself from the field of human reliability analysis. To date, the discussion has been somewhat one-sided, with much discussion about the new insights afforded by resilience engineering. This paper presents an attempt to address resilience engineering from the perspective of human reliability analysis (HRA). It is argued that HRA shares much in common with resilience engineering and that, in fact, it can help strengthen nascent ideas in resilience engineering. This paper seeks to clarify and ultimately refute the arguments that have served to divide HRA and resilience engineering.

  20. Reliability analysis of wastewater treatment plants.

    Oliveira, Sílvia C; Von Sperling, Marcos


    This article presents a reliability analysis of 166 full-scale wastewater treatment plants operating in Brazil. Six different processes have been investigated: septic tank+anaerobic filter, facultative pond, anaerobic pond+facultative pond, activated sludge, upflow anaerobic sludge blanket (UASB) reactors alone, and UASB reactors followed by post-treatment. A methodology developed by Niku et al. [1979. Performance of activated sludge process and reliability-based design. J. Water Pollut. Control Assoc., 51(12), 2841-2857] is used for determining the coefficients of reliability (COR), in terms of the compliance of effluent biochemical oxygen demand (BOD), chemical oxygen demand (COD), total suspended solids (TSS), total nitrogen (TN), total phosphorus (TP) and fecal or thermotolerant coliforms (FC) with discharge standards. The design concentrations necessary to meet the prevailing discharge standards and the expected compliance percentages have been calculated from the COR obtained. The results showed that few plants, under the observed operating conditions, would be able to present reliable performance with respect to the analyzed standards. The article also discusses the importance of understanding the lognormal behavior of the data when setting discharge standards, interpreting monitoring results and assessing compliance with the legislation.
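
    Niku's coefficient of reliability for lognormally distributed effluent data can be computed directly. The CV, exceedance probability and discharge standard below are illustrative values, not numbers from the article's data set:

```python
import math
from statistics import NormalDist

def coefficient_of_reliability(cv, alpha):
    """COR per Niku et al. (1979) for lognormal effluent concentrations.

    cv    : coefficient of variation of the effluent concentration
    alpha : allowed probability of exceeding the discharge standard
    """
    z = NormalDist().inv_cdf(1 - alpha)       # standard-normal quantile
    s2 = math.log(1 + cv ** 2)                # variance of ln(concentration)
    return math.sqrt(1 + cv ** 2) * math.exp(-z * math.sqrt(s2))

# Hypothetical plant: BOD standard of 60 mg/L to be met 95% of the time,
# with CV = 0.6.  The plant must then be designed for a mean effluent
# concentration of COR * 60.
cor = coefficient_of_reliability(cv=0.6, alpha=0.05)
design_mean = cor * 60.0
print(round(cor, 3), round(design_mean, 1))
```

    The COR is below one, so the required design mean sits well under the numeric standard; the larger the CV, the larger that safety margin must be.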

  1. The quantitative failure of human reliability analysis

    Bennett, C.T.


    This philosophical treatise argues the merits of human reliability analysis (HRA) in the context of the nuclear power industry. The author attacks historic and current HRA as having failed to inform policy makers who make decisions based on the risk that humans contribute to system performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, which advocates a systems analysis process that employs cogent heuristics when using opinion, and tempers itself with a rational debate over the weight given to subjective and empirical probabilities.

  2. Small nuclear power reactor emergency electric power supply system reliability comparative analysis; Analise da confiabilidade do sistema de suprimento de energia eletrica de emergencia de um reator nuclear de pequeno porte

    Bonfietti, Gerson


    This work presents an analysis of the reliability of the emergency power supply system of a small nuclear power reactor. Three different configurations are investigated and their reliability analyzed. The fault tree method is used as the main tool of analysis. The work includes a bibliographic review of emergency diesel generator reliability and a discussion of the design requirements applicable to emergency electrical systems. The influence of common cause failures is considered using the beta factor model, and operator action is considered using human failure probabilities. A parametric analysis shows the strong dependence between reactor safety and the loss of the offsite electric power supply. It is also shown that common cause failures can be a major contributor to system unreliability. (author)

  3. Reliability analysis of ceramic matrix composite laminates

    Thomas, David J.; Wetherhold, Robert C.


    At a macroscopic level, a composite lamina may be considered as a homogeneous orthotropic solid whose directional strengths are random variables. Incorporation of these random variable strengths into failure models, either interactive or non-interactive, allows for the evaluation of the lamina reliability under a given stress state. Using a non-interactive criterion for demonstration purposes, laminate reliabilities are calculated assuming previously established load sharing rules for the redistribution of load as the failure of laminae occur. The matrix cracking predicted by ACK theory is modeled to allow a loss of stiffness in the fiber direction. The subsequent failure in the fiber direction is controlled by a modified bundle theory. Results using this modified bundle model are compared with previous models which did not permit separate consideration of matrix cracking, as well as to results obtained from experimental data.

  4. Reliability of photographic posture analysis of adolescents.

    Hazar, Zeynep; Karabicak, Gul Oznur; Tiftikci, Ugur


    [Purpose] Postural problems of adolescents need to be evaluated accurately because they may lead to greater problems in the musculoskeletal system as the adolescents develop. Although photographic posture analysis has been frequently used, more simple and accessible methods are still needed. The purpose of this study was to investigate the inter- and intra-rater reliability of photographic posture analysis using MB-Ruler software. [Subjects and Methods] Subjects were 30 adolescents (15 girls and 15 boys, mean age: 16.4±0.4 years, mean height 166.3±6.7 cm, mean weight 63.8±15.1 kg), and photographs of their habitual standing posture were taken in the sagittal plane. For the evaluation of postural angles, reflective markers were placed on anatomical landmarks. For angular measurements, MB-Ruler (Markus Bader - MB Software Solutions, triangular screen ruler) was used. Photographic evaluations were performed by two observers and repeated after a week. Test-retest and inter-rater reliability were calculated using intra-class correlation coefficients (ICC). [Results] Inter-rater (ICC>0.972) and test-retest (ICC>0.774) reliability were found to be in the range of acceptable to excellent. [Conclusion] Reference angles for postural evaluation were found to be reliable and repeatable. The present method was found to be easy and non-invasive, and it may be utilized by researchers in search of an alternative method for photographic postural assessments.
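
    The intra-class correlation used for such inter-rater comparisons can be computed from two-way ANOVA mean squares. A minimal ICC(2,1) (two-way random, absolute agreement) sketch for two raters, with invented angle measurements rather than the study's data, might look like:

```python
import statistics

# Hypothetical postural angles (degrees) scored by two raters
rater1 = [48.2, 51.0, 45.5, 50.3, 47.8, 52.1]
rater2 = [47.9, 51.4, 45.1, 49.8, 48.3, 51.7]

n, k = len(rater1), 2
rows = list(zip(rater1, rater2))
grand = statistics.mean(rater1 + rater2)
row_means = [statistics.mean(r) for r in rows]           # per-subject means
col_means = [statistics.mean(rater1), statistics.mean(rater2)]  # per-rater means

# Two-way ANOVA sums of squares
ss_rows = k * sum((m - grand) ** 2 for m in row_means)
ss_cols = n * sum((m - grand) ** 2 for m in col_means)
ss_total = sum((x - grand) ** 2 for pair in rows for x in pair)
ss_err = ss_total - ss_rows - ss_cols

msr = ss_rows / (n - 1)             # between-subjects mean square
msc = ss_cols / (k - 1)             # between-raters mean square
mse = ss_err / ((n - 1) * (k - 1))  # residual mean square

# ICC(2,1), absolute agreement, single measures
icc = (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
print(round(icc, 3))
```

    With the close inter-rater values above the ICC lands near 1, in the "excellent" band the abstract reports for its inter-rater results.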

  5. Representative Sampling for reliable data analysis

    Petersen, Lars; Esbensen, Kim Harry


    The Theory of Sampling (TOS) provides a description of all errors involved in sampling of heterogeneous materials as well as all necessary tools for their evaluation, elimination and/or minimization. This tutorial elaborates on—and illustrates—selected central aspects of TOS. The theoretical...... regime in order to secure the necessary reliability of: samples (which must be representative, from the primary sampling onwards), analysis (which will not mean anything outside the miniscule analytical volume without representativity ruling all mass reductions involved, also in the laboratory) and data...

  6. Reliability Analysis of Adhesive Bonded Scarf Joints

    Kimiaeifar, Amin; Toft, Henrik Stensgaard; Lund, Erik;


    A probabilistic model for the reliability analysis of adhesive bonded scarfed lap joints subjected to static loading is developed. It is representative for the main laminate in a wind turbine blade subjected to flapwise bending. The structural analysis is based on a three dimensional (3D) finite...... the FEA model, and a sensitivity analysis on the influence of various geometrical parameters and material properties on the maximum stress is conducted. Because the yield behavior of many polymeric structural adhesives is dependent on both deviatoric and hydrostatic stress components, different ratios...... of the compressive to tensile adhesive yield stresses in the failure criterion are considered. It is shown that the chosen failure criterion, the scarf angle and the load are significant for the assessment of the probability of failure....


  7. RELAV: Reliability/Availability Analysis Program

    Bowerman, P. N.


    RELAV (Reliability/Availability Analysis Program) is a comprehensive analytical tool to determine the reliability or availability of any general system which can be modeled as embedded k-out-of-n groups of items (components) and/or subgroups. Both ground and flight systems at NASA's Jet Propulsion Laboratory have utilized this program. RELAV can assess current system performance during the later testing phases of a system design, as well as model candidate designs/architectures or validate and form predictions during the early phases of a design. Systems are commonly modeled as System Block Diagrams (SBDs). RELAV calculates the success probability of each group of items and/or subgroups within the system assuming k-out-of-n operating rules apply for each group. The program operates on a folding basis; i.e. it works its way towards the system level from the most embedded level by folding related groups into single components. The entire folding process involves probabilities; therefore, availability problems are performed in terms of the probability of success, and reliability problems are performed for specific mission lengths. An enhanced cumulative binomial algorithm is used for groups where all probabilities are equal, while a fast algorithm based upon "Computing k-out-of-n System Reliability", Barlow & Heidtmann, IEEE TRANSACTIONS ON RELIABILITY, October 1984, is used for groups with unequal probabilities. Inputs to the program include a description of the system and any one of the following: 1) availabilities of the items, 2) mean time between failures and mean time to repairs for the items from which availabilities are calculated, 3) mean time between failures and mission length(s) from which reliabilities are calculated, or 4) failure rates and mission length(s) from which reliabilities are calculated. The results are probabilities of success of each group and the system in the given configuration. RELAV assumes exponential failure distributions for
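
    The equal-probability case that RELAV handles with a cumulative binomial algorithm reduces to the following generic k-out-of-n formula (a plain sketch, not RELAV's enhanced implementation), together with one "folding" step of the kind the program performs:

```python
from math import comb

def k_out_of_n_reliability(k, n, p):
    """Probability that at least k of n identical, independent items,
    each working with probability p, are working (cumulative binomial)."""
    return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k, n + 1))

# A 2-out-of-3 pump group with item reliability 0.9:
group = k_out_of_n_reliability(2, 3, 0.9)

# "Folding": treat the group as a single component and combine it in
# series (1-out-of-1 at each stage) with a 1-out-of-2 redundant pair.
other = k_out_of_n_reliability(1, 2, 0.8)
system = group * other
print(round(group, 4), round(system, 4))  # prints 0.972 0.9331
```

    Repeating this fold from the most deeply embedded groups outward yields the system-level success probability, exactly the bottom-up process the abstract describes.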

  8. Reliability of the Emergency Severity Index: Meta-analysis

    Amir Mirhaghi


    Objectives: Although triage systems based on the Emergency Severity Index (ESI) have many advantages in terms of simplicity and clarity, previous research has questioned their reliability in practice. Therefore, the aim of this meta-analysis was to determine the reliability of ESI triage scales. Methods: This meta-analysis was performed in March 2014. Electronic research databases were searched and articles conforming to the Guidelines for Reporting Reliability and Agreement Studies were selected. Two researchers independently examined the selected abstracts. Data were extracted in the following categories: version of scale (latest/older), participants (adult/paediatric), raters (nurse, physician or expert), method of reliability (intra/inter-rater), reliability statistics (weighted/unweighted kappa), and the origin and publication year of the study. The effect size was obtained by the Z-transformation of reliability coefficients. Data were pooled with random-effects models and a meta-regression was performed based on the method of moments estimator. Results: A total of 19 studies from six countries were included in the analysis. The pooled coefficient for the ESI triage scales was substantial at 0.791 (95% confidence interval: 0.787-0.795). Agreement was higher with the latest and adult versions of the scale and among expert raters, compared to agreement with older and paediatric versions of the scales and with other groups of raters, respectively. Conclusion: ESI triage scales showed an acceptable level of overall reliability. However, ESI scales require more development in order to see full agreement from all rater groups. Further studies concentrating on other aspects of reliability assessment are needed.
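
    The pooling step described, Z-transformation of reliability coefficients followed by weighted averaging and back-transformation, can be sketched in a fixed-effect variant. The coefficients, sample sizes and the 1/(n-3) variance approximation below are hypothetical illustrations, not the meta-analysis' data or exact model:

```python
import math

# Hypothetical per-study agreement coefficients and sample sizes
kappas = [0.75, 0.82, 0.79, 0.86]
sizes = [120, 85, 200, 60]

# Fisher z-transformation stabilizes the variance of each coefficient
zs = [0.5 * math.log((1 + k) / (1 - k)) for k in kappas]

# Inverse-variance weights; var(z) is approximately 1/(n - 3)
weights = [n - 3 for n in sizes]

z_pooled = sum(w * z for w, z in zip(weights, zs)) / sum(weights)
pooled = math.tanh(z_pooled)  # back-transform to the coefficient scale
print(round(pooled, 3))
```

    A random-effects pooling, as used in the abstract, would additionally add a between-study variance component to each weight.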

  9. Integrated Reliability and Risk Analysis System (IRRAS)

    Russell, K.D.; McKay, M.K.; Sattison, M.B.; Skinner, N.L.; Wood, S.T. [EG and G Idaho, Inc., Idaho Falls, ID (United States)]; Rasmuson, D.M. [Nuclear Regulatory Commission, Washington, DC (United States)]


    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance.

  10. Reliability Analysis of Tubular Joints in Offshore Structures

    Thoft-Christensen, Palle; Sørensen, John Dalsgaard


    Reliability analysis of single tubular joints and of offshore platforms with tubular joints is presented. The failure modes considered are yielding, punching, buckling and fatigue failure. Element reliability as well as systems reliability approaches are used and illustrated by several examples....... Finally, optimal design of tubular joints with reliability constraints is discussed and illustrated by an example....

  11. Reliability analysis of retaining walls with multiple failure modes

    张道兵; 孙志彬; 朱川曲


    In order to reduce errors in the reliability analysis of retaining wall structures arising from the formulation of the performance function, parameter estimation and the algorithm, two new reliability and stability models for anti-sliding and anti-overturning, based on the upper-bound theory of limit analysis, are first established, and the two failure modes are treated as a series system with multiple correlated failure modes. Then, the statistical characteristics of the parameters of the retaining wall structure are inferred by the maximum entropy principle. Finally, the structural reliabilities of the single failure modes and of the multiple-failure-mode system are calculated by the Monte Carlo method in MATLAB, and the results are compared and analyzed for sensitivity. The method, which attains high precision, is not only easy to program and quick in calculation, but is also free of the limits of nonlinear performance functions and non-normal random variables. The results obtained by combining limit analysis theory, the maximum entropy principle and the Monte Carlo method are more scientific, accurate and reliable than those calculated by the traditional method.
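
    The Monte Carlo treatment of a series system with correlated failure modes can be sketched as follows. The limit state expressions, coefficients and distributions are invented for illustration and are not the paper's upper-bound expressions:

```python
import random

random.seed(1)

# Toy limit states for a retaining wall; g <= 0 means that mode fails.
def g_sliding(friction, thrust):
    return 800.0 * friction - thrust

def g_overturn(weight, thrust):
    return 0.9 * weight - 2.5 * thrust

N = 200_000
failures = 0
for _ in range(N):
    friction = random.gauss(0.55, 0.05)
    weight = random.gauss(1100.0, 80.0)
    thrust = random.gauss(300.0, 40.0)  # shared load correlates the two modes
    # series system: failure of either mode fails the wall
    if g_sliding(friction, thrust) <= 0 or g_overturn(weight, thrust) <= 0:
        failures += 1

pf = failures / N  # system failure probability estimate
print(pf)
```

    Because both modes share the earth-pressure thrust, the system failure probability sits between the larger single-mode probability and the sum of the two, which is exactly what treating the modes as an uncorrelated series system would miss.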

  12. Reliability Analysis of a Green Roof Under Different Storm Scenarios

    William, R. K.; Stillwell, A. S.


    Urban environments continue to face the challenges of localized flooding and decreased water quality brought on by the increasing amount of impervious area in the built environment. Green infrastructure provides an alternative to conventional storm sewer design by using natural processes to filter and store stormwater at its source. However, there are currently few consistent standards available in North America to ensure that installed green infrastructure is performing as expected. This analysis offers a method for characterizing green roof failure using a visual aid commonly used in earthquake engineering: fragility curves. We adapted the concept of the fragility curve based on the efficiency in runoff reduction provided by a green roof compared to a conventional roof under different storm scenarios. We then used the 2D distributed surface water-groundwater coupled model MIKE SHE to model the impact that a real green roof might have on runoff in different storm events. We then employed a multiple regression analysis to generate an algebraic demand model that was input into the Matlab-based reliability analysis model FERUM, which was then used to calculate the probability of failure. The use of reliability analysis as a part of green infrastructure design code can provide insights into green roof weaknesses and areas for improvement. It also supports the design of code that is more resilient than current standards and is easily testable for failure. Finally, the understanding of reliability of a single green roof module under different scenarios can support holistic testing of system reliability.

  13. Reliability Analysis of Elasto-Plastic Structures

    Thoft-Christensen, Palle; Sørensen, John Dalsgaard


    This paper only deals with framed and trussed structures which can be modelled as systems with ductile elements. The elements are all assumed to be linear-elastic perfectly plastic. The loading is assumed to be concentrated and time-independent. The strength of the elements and the loads....... Failure of this type of system is defined either as formation of a mechanism or by failure of a prescribed number of elements. In the first case failure is independent of the order in which the elements fail, but this is not so by the second definition. The reliability analysis consists of two parts...... of the significant failure modes. The significant failure modes are as usual modelled as elements in a series system (see e.g. Thoft-Christensen & Baker [2)). Several methods to perform this estimate are presented including upper- and lower-bound estimates. Upper bounds for the failure probability estimate...


  14. Human Reliability Analysis for Computerized Procedures

    Ronald L. Boring; David I. Gertman; Katya Le Blanc


    This paper provides a characterization of human reliability analysis (HRA) issues for computerized procedures in nuclear power plant control rooms. It is beyond the scope of this paper to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper provides a review of HRA as applied to traditional paper-based procedures, followed by a discussion of what specific factors should additionally be considered in HRAs for computerized procedures. Performance shaping factors and failure modes unique to computerized procedures are highlighted. Since there is no definitive guide to HRA for paper-based procedures, this paper also serves to clarify the existing guidance on paper-based procedures before delving into the unique aspects of computerized procedures.

  15. Human Reliability Analysis for Small Modular Reactors

    Ronald L. Boring; David I. Gertman


    Because no human reliability analysis (HRA) method was specifically developed for small modular reactors (SMRs), the application of any current HRA method to SMRs involves tradeoffs. A first-generation HRA method like THERP provides clearly defined activity types, but these activity types do not map to the human-system interface or concept of operations confronting SMR operators. A second-generation HRA method like ATHEANA is flexible enough to be used for SMR applications, but there is currently insufficient guidance for the analyst, requiring considerably more first-of-a-kind analyses and extensive SMR expertise in order to complete a quality HRA. Although no current HRA method is optimized for SMRs, it is possible to use existing HRA methods to identify errors, incorporate them as human failure events in the probabilistic risk assessment (PRA), and quantify them. In this paper, we provide preliminary guidance to assist the human reliability analyst and reviewer in understanding how to apply current HRA methods to the domain of SMRs. While it is possible to perform a satisfactory HRA using existing HRA methods, ultimately it is desirable to formally incorporate SMR considerations into the methods. This may require the development of new HRA methods. More practicably, existing methods need to be adapted to incorporate SMRs. Such adaptations may take the form of guidance on the complex mapping between conventional light water reactors and small modular reactors. While many behaviors and activities are shared between current plants and SMRs, the methods must adapt if they are to perform a valid and accurate analysis of plant personnel performance in SMRs.

  16. Task Decomposition in Human Reliability Analysis

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory


    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down, defined as a subset of the PSA, whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up, derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  17. [Qualitative analysis: theory, steps and reliability].

    Minayo, Maria Cecília de Souza


    This essay seeks to conduct in-depth analysis of qualitative research, based on benchmark authors and the author's own experience. The hypothesis is that in order for an analysis to be considered reliable, it needs to be based on structuring terms of qualitative research, namely the verbs 'comprehend' and 'interpret', and the nouns 'experience', 'common sense' and 'social action'. The 10 steps begin with the construction of the scientific object by its inclusion on the national and international agenda; the development of tools that make the theoretical concepts tangible; conducting field work that involves the researcher empathetically with the participants in the use of various techniques and approaches, making it possible to build relationships, observations and a narrative with perspective. Finally, the author deals with the analysis proper, showing how the object, which has already been studied in all the previous steps, should become a second-order construct, in which the logic of the actors in their diversity and not merely their speech predominates. The final report must be a theoretic, contextual, concise and clear narrative.

  18. Sensitivity Analysis for the System Reliability Function


    reliabilities. The unique feature of the approach is that sample data collected on K independent replications using a specified component reliability … Carlo method. The polynomial-time algorithm of Agrawal and Satyanarayana (104) for the exact reliability computation for series-parallel systems exemplifies … consideration. As an example for the s-t connectedness problem, let … denote edge-disjoint minimal s-t paths of G and let V … denote edge-disjoint

  19. Reliability analysis of an associated system

    陈长杰; 魏一鸣; 蔡嗣经


    Based on the engineering reliability of large complex systems and the distinct characteristics of soft systems, new concepts and theory concerning medium elements and the associated system are developed, and a reliability logic model of the associated system is provided. In this paper, through a field investigation of the trial operation, the engineering reliability of the paste fill system in No.2 mine of Jinchuan Non-ferrous Metallic Corporation is analyzed using the theory of the associated system.

  20. Reliability Analysis and Optimal Design of Monolithic Vertical Wall Breakwaters

    Sørensen, John Dalsgaard; Burcharth, Hans F.; Christiani, E.


    Reliability analysis and reliability-based design of monolithic vertical wall breakwaters are considered. Probabilistic models of the most important failure modes, namely sliding failure, failure of the foundation and overturning failure, are described. Relevant design variables are identified …

  1. A Novel Two-Terminal Reliability Analysis for MANET

    Xibin Zhao; Zhiyang You; Hai Wan


    Mobile ad hoc network (MANET) is a dynamic wireless communication network. Because of the dynamic and infrastructureless characteristics, MANET is vulnerable in reliability. This paper presents a novel reliability analysis for MANET. The node mobility effect and the node reliability based on a real MANET platform are modeled and analyzed. An effective Monte Carlo method for reliability analysis is proposed. A detailed evaluation is performed in terms of the experiment results.
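    The abstract's ingredients, link-state sampling plus a connectivity check, can be sketched in a few lines. This is a generic two-terminal Monte Carlo estimator, not the authors' MANET platform model; the ring topology, the uniform link reliability of 0.9, and all names are illustrative assumptions.

```python
import random
from collections import deque

def two_terminal_reliability(nodes, links, s, t, p_link, trials=20000, seed=1):
    """Monte Carlo estimate of the probability that s can reach t
    when each link is independently up with probability p_link."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # Sample the state of every link, then BFS over the surviving graph
        up = [e for e in links if rng.random() < p_link]
        adj = {n: [] for n in nodes}
        for a, b in up:
            adj[a].append(b)
            adj[b].append(a)
        seen, queue = {s}, deque([s])
        while queue:
            n = queue.popleft()
            for m in adj[n]:
                if m not in seen:
                    seen.add(m)
                    queue.append(m)
        hits += t in seen
    return hits / trials

# A 4-node ring: two disjoint paths from node 0 to node 2
links = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(two_terminal_reliability(range(4), links, 0, 2, 0.9))
```

    For this ring there are two disjoint 0-to-2 paths, so the exact value is 1 - (1 - 0.9^2)^2 ≈ 0.964, which the estimate should approach.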


  3. Reliability test and failure analysis of high power LED packages*

    Chen Zhaohui; Zhang Qin; Wang Kai; Luo Xiaobing; Liu Sheng


    A new application-specific LED package (ASLP) with a freeform polycarbonate lens for street lighting is developed, whose manufacturing processes are compatible with a typical LED packaging process. The reliability test methods and failure criteria of different vendors are reviewed and compared; they are found to differ widely, so rapid reliability assessment standards are urgently needed in the LED industry. An 85 °C/85% RH test at 700 mA is run for 1000 h on our LED modules and those of three other vendors: our modules show no visible degradation in optical performance, while the modules of two other vendors degrade significantly. Failure analysis methods such as C-SAM, nano X-ray CT and optical microscopy are applied to the LED packages, and failure mechanisms such as delamination and cracking are detected after the accelerated reliability testing. Finite element simulation is helpful both for failure analysis and for designing for reliability in LED packaging. One example shows that a module currently used in industry is vulnerable and may not easily pass harsh thermal cycle testing.

  4. Software architecture reliability analysis using failure scenarios

    Tekinerdogan, Bedir; Sozer, Hasan; Aksit, Mehmet


    With the increasing size and complexity of software in embedded systems, software has become a primary threat to reliability. Several mature conventional reliability engineering techniques exist in the literature, but traditionally these have primarily addressed failures in hardware components a…

  5. Reliability in Cross-National Content Analysis.

    Peter, Jochen; Lauf, Edmund


    Investigates how coder characteristics such as language skills, political knowledge, coding experience, and coding certainty affected inter-coder and coder-training reliability. Shows that language skills influenced both reliability types. Suggests that cross-national researchers should pay more attention to cross-national assessments of…

  6. Individual Differences in Human Reliability Analysis

    Jeffrey C. Joe; Ronald L. Boring


    While human reliability analysis (HRA) methods include uncertainty in quantification, the nominal model of human error in HRA typically assumes that operator performance does not vary significantly when they are given the same initiating event, indicators, procedures, and training, and that any differences in operator performance are simply aleatory (i.e., random). While this assumption generally holds true when performing routine actions, variability in operator response has been observed in multiple studies, especially in complex situations that go beyond training and procedures. As such, complexity can lead to differences in operator performance (e.g., operator understanding and decision-making). Furthermore, psychological research has shown that there are a number of known antecedents (i.e., attributable causes) that consistently contribute to observable and systematically measurable (i.e., not random) differences in behavior. This paper reviews examples of individual differences taken from operational experience and the psychological literature. The impact of these differences in human behavior and their implications for HRA are then discussed. We propose that individual differences should not be treated as aleatory, but rather as epistemic. Ultimately, by understanding the sources of individual differences, it is possible to remove some epistemic uncertainty from analyses.




    RHIC has been successfully operated for 5 years as a collider for different species, ranging from heavy ions (including gold and copper) to polarized protons. We present a critical analysis of reliability data for RHIC that not only identifies the principal factors limiting availability but also evaluates critical choices made at design time and assesses their impact on present machine performance. RHIC availability data are typical when compared with similar high-energy colliders. The critical analysis of operations data is the basis for studies and plans to improve RHIC machine availability beyond the 50-60% typical of high-energy colliders.

  8. Reliability Analysis on English Writing Test of SHSEE in Shanghai

    黄玉麒; 黄芳


    As a subjective test, the validity of a writing test is acceptable, but what about its reliability? The writing test occupies a special position in the senior high school entrance examination (SHSEE for short), so it is important to ensure its reliability. Through an analysis of recent years' English writing items in the SHSEE, the author offers suggestions on how to guarantee the reliability of writing tests.

  9. The comparability and reliability of five health-state valuation methods.

    Krabbe, P F; Essink-Bot, M L; Bonsel, G J


    The objective of the study was to consider five methods for valuing health states with respect to their comparability (convergent validity, value functions) and reliability. Valuation tasks were performed by 104 student volunteers using five frequently used valuation methods: standard gamble (SG), time trade-off (TTO), rating scale (RS), willingness-to-pay (WTP) and the paired comparisons method (PC). Throughout the study, the EuroQol classification system was used to construct 13 health-state descriptions. Validity was investigated using the multitrait-multimethod (MTMM) methodology. The extent to which results of one method could be predicted by another was examined by transformations. Reliability of the methods was studied parametrically with Generalisability Theory (an ANOVA extension), as well as non-parametrically. Mean values for SG were slightly higher than TTO values. The RS could be distinguished from the other methods. After a simple power transformation, the RS values were found to be close to SG and TTO. Mean values of WTP were linearly related to SG and TTO, except at the extremes of the scale. However, the reliability of WTP was low and the number of inconsistencies substantial. Valuations made by the RS proved to be the most reliable. Paired comparisons did not provide stable results. In conclusion, the results of the parametric transformation function between RS and SG/TTO provide evidence to justify the current use of RS (with transformations) not only for reasons of feasibility and reliability but also for reasons of comparability. A definite judgement on PC requires data of a complete design. Due to the specific structure of the correlation matrix which is inherent in valuing health states, we believe that full MTMM is not applicable for the standard analysis of health-state valuations.

  10. Analysis on Some of Software Reliability Models


    The software reliability and maintainability evaluation tool (SRMET 3.0), developed by the Software Evaluation and Test Center of China Aerospace Mechanical Corporation, is introduced in detail in this paper. SRMET 3.0 supports seven software reliability models and four software maintainability models. The numerical characteristics of all these models are studied in depth, and the corresponding numerical algorithms for each model are also given.
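    One of the NHPP-type growth models such a tool typically includes is the Goel-Okumoto model, whose mean value function is m(t) = a(1 - e^(-bt)). The sketch below fits it to synthetic weekly failure counts by a crude grid search; real tools such as SRMET use maximum likelihood estimation, so the fitting routine, the data, and all names here are illustrative assumptions.

```python
import math

def goel_okumoto(t, a, b):
    """Expected cumulative failures by time t under the Goel-Okumoto
    NHPP model: m(t) = a * (1 - exp(-b * t))."""
    return a * (1.0 - math.exp(-b * t))

def fit_grid(times, counts):
    """Crude grid-search least-squares fit of (a, b); only meant to
    illustrate the model shape, not a production estimator."""
    best = None
    for a in range(max(counts), 3 * max(counts)):
        for bi in range(1, 200):
            b = bi / 1000.0
            sse = sum((goel_okumoto(t, a, b) - c) ** 2
                      for t, c in zip(times, counts))
            if best is None or sse < best[0]:
                best = (sse, a, b)
    return best[1], best[2]

# Synthetic data: cumulative failures observed at the end of each week
times = [1, 2, 3, 4, 5, 6, 7, 8]
counts = [9, 17, 23, 28, 32, 35, 37, 39]
a, b = fit_grid(times, counts)
print(a, b)  # 'a' estimates the total number of latent faults
```

    The fitted `a` is the model's estimate of the total fault content, which is what release-decision analyses based on such models ultimately use.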

  11. System reliability analysis for kinematic performance of planar mechanisms

    ZHANG YiMin; HUANG XianZhen; ZHANG XuFang; HE XiangDong; WEN BangChun


    Based on the reliability and mechanism kinematic accuracy theories, we propose a general methodology for system reliability analysis of kinematic performance of planar mechanisms. The loop closure equations are used to estimate the kinematic performance errors of planar mechanisms. Reliability and system reliability theories are introduced to develop the limit state functions (LSF) for failure of kinematic performance qualities. The statistical fourth moment method and the Edgeworth series technique are used on system reliability analysis for kinematic performance of planar mechanisms, which relax the restrictions of probability distribution of design variables. Finally, the practicality, efficiency and accuracy of the proposed method are demonstrated by numerical examples.

  12. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    Ronald Laurids Boring


    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  13. Issues in benchmarking human reliability analysis methods : a literature review.

    Lois, Erasmia (US Nuclear Regulatory Commission); Forester, John Alan; Tran, Tuan Q. (Idaho National Laboratory, Idaho Falls, ID); Hendrickson, Stacey M. Langfitt; Boring, Ronald L. (Idaho National Laboratory, Idaho Falls, ID)


    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  14. Using the Reliability Theory for Assessing the Decision Confidence Probability for Comparative Life Cycle Assessments.

    Wei, Wei; Larrey-Lassalle, Pyrène; Faure, Thierry; Dumoulin, Nicolas; Roux, Philippe; Mathias, Jean-Denis


    Comparative decision making is widely used to identify which option (system, product, service, etc.) has the smaller environmental footprint and to provide recommendations that help stakeholders take future decisions. However, uncertainty complicates the comparison and the decision making. Probability-based decision support in LCA helps stakeholders by calculating the decision confidence probability, that is, the probability that one option has a smaller environmental impact than another. Here we apply reliability theory to approximate the decision confidence probability, comparing the traditional Monte Carlo method with the first-order reliability method (FORM). The Monte Carlo method needs substantial computational time to calculate the decision confidence probability; FORM approximates it with far fewer simulations by approximating the response surface. Moreover, FORM yields importance factors that correspond to a sensitivity analysis with respect to the probability, allowing stakeholders to determine which factors influence their decision. Our results clearly show that the reliability method provides additional useful information to stakeholders while reducing computational time.
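    For a linear limit state with normally distributed inputs, FORM reproduces the decision confidence probability exactly, which makes a tiny comparison against Monte Carlo easy to sketch. The footprint distributions below are invented for illustration and are not from the paper.

```python
import math, random

def phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Hypothetical footprints (e.g. kg CO2-eq) of two options, assumed normal
mu_a, sd_a = 100.0, 10.0
mu_b, sd_b = 110.0, 12.0

# FORM: for the linear limit state g = B - A, the reliability index is
# beta = E[g] / std(g), and the decision confidence is Phi(beta).
beta = (mu_b - mu_a) / math.sqrt(sd_a**2 + sd_b**2)
p_form = phi(beta)

# Monte Carlo check: fraction of draws in which option A beats option B
rng = random.Random(0)
n = 50000
p_mc = sum(rng.gauss(mu_a, sd_a) < rng.gauss(mu_b, sd_b)
           for _ in range(n)) / n
print(round(p_form, 3), round(p_mc, 3))
```

    The two estimates agree to a few tenths of a percent, but FORM needed one closed-form evaluation versus 50,000 samples; with nonlinear limit states FORM becomes an approximation and the gap in cost is what motivates it.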

  15. Analysis on testing and operational reliability of software

    ZHAO Jing; LIU Hong-wei; CUI Gang; WANG Hui-qiang


    Software reliability was estimated based on NHPP software reliability growth models. Testing reliability and operational reliability may be essentially different. On the basis of analyzing similarities and differences of the testing phase and the operational phase, using the concept of operational reliability and the testing reliability, different forms of the comparison between the operational failure ratio and the predicted testing failure ratio were conducted, and the mathematical discussion and analysis were performed in detail. Finally, software optimal release was studied using software failure data. The results show that two kinds of conclusions can be derived by applying this method, one conclusion is to continue testing to meet the required reliability level of users, and the other is that testing stops when the required operational reliability is met, thus the testing cost can be reduced.

  16. Reliability estimation in a multilevel confirmatory factor analysis framework.

    Geldhof, G John; Preacher, Kristopher J; Zyphur, Michael J


    Scales with varying degrees of measurement reliability are often used in the context of multistage sampling, where variance exists at multiple levels of analysis (e.g., individual and group). Because methodological guidance on assessing and reporting reliability at multiple levels of analysis is currently lacking, we discuss the importance of examining level-specific reliability. We present a simulation study and an applied example showing different methods for estimating multilevel reliability using multilevel confirmatory factor analysis and provide supporting Mplus program code. We conclude that (a) single-level estimates will not reflect a scale's actual reliability unless reliability is identical at each level of analysis, (b) 2-level alpha and composite reliability (omega) perform relatively well in most settings, (c) estimates of maximal reliability (H) were more biased when estimated using multilevel data than either alpha or omega, and (d) small cluster size can lead to overestimates of reliability at the between level of analysis. We also show that Monte Carlo confidence intervals and Bayesian credible intervals closely reflect the sampling distribution of reliability estimates under most conditions. We discuss the estimation of credible intervals using Mplus and provide R code for computing Monte Carlo confidence intervals.
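    As a point of reference for the reliability coefficients discussed above, single-level Cronbach's alpha can be computed directly from item scores. The multilevel estimators in the paper require SEM software (the authors use Mplus); this stdlib-only sketch with made-up data merely illustrates the basic quantity.

```python
def cronbach_alpha(items):
    """Cronbach's alpha from a list of item-score columns:
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - item_vars / var(totals))

# Three items scored by five respondents (hypothetical data)
items = [[4, 3, 5, 2, 4],
         [5, 3, 4, 2, 5],
         [4, 2, 5, 3, 4]]
print(round(cronbach_alpha(items), 3))
```

    The paper's point is precisely that such a single-level estimate conflates within-cluster and between-cluster reliability when the data are nested.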

  17. Mechanical reliability analysis of tubes intended for hydrocarbons

    Nahal, Mourad; Khelif, Rabia [Badji Mokhtar University, Annaba (Algeria)


    Reliability analysis constitutes an essential phase of any reliability study. Many manufacturers evaluate and improve the reliability of their products throughout the development cycle, from design to start-up (design, manufacture and exploitation), to develop their knowledge of the cost/reliability ratio and to control sources of failure. In this study, we report results of hardness, tensile and hydrostatic tests carried out on steel tubes for transporting hydrocarbons, followed by statistical analysis. The results allow us to conduct a reliability study based on the resistance-demand (stress-strength) relationship: the reliability index is calculated and the importance of the variables related to the tube is presented. A reliability-based assessment of residual stress effects is applied to underground pipelines under a roadway, with and without active corrosion. Residual stress is found to greatly increase the probability of failure, especially in the early stages of pipe lifetime.
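    The kind of resistance-demand reliability index the abstract mentions can be illustrated with a normal stress-strength interference model. The strength and stress moments below are invented placeholders, not the paper's measured tube data.

```python
import math

# Stress-strength interference: strength R and load S assumed normal.
# Illustrative numbers only, not measured values for any real pipe.
mu_r, sd_r = 520.0, 30.0   # yield strength, MPa
mu_s, sd_s = 400.0, 25.0   # operating stress, MPa

# Reliability index for the limit state g = R - S (independent normals)
beta = (mu_r - mu_s) / math.sqrt(sd_r**2 + sd_s**2)

# Failure probability Pf = 1 - Phi(beta)
pf = 0.5 * (1.0 - math.erf(beta / math.sqrt(2.0)))
print(round(beta, 2), pf)
```

    Effects such as residual stress or corrosion enter this picture by shifting the stress mean upward or the strength mean downward, which directly lowers beta and raises the failure probability.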

  18. Fault Diagnosis and Reliability Analysis Using Fuzzy Logic Method

    Miao Zhinong; Xu Yang; Zhao Xiangyu


    A new fuzzy logic fault diagnosis method is proposed. In this method, fuzzy equations are employed to estimate the component states of a system from the measured system performance and from the relationship between component state and system performance, which is called the "performance-parameter" knowledge base and is constructed by experts. Compared with traditional fault diagnosis methods, this fuzzy logic method can use human intuitive knowledge and does not need a precise mapping between system performance and component state. Simulation proves its effectiveness in fault diagnosis. The reliability analysis is then performed based on the fuzzy logic method.

  19. Analysis of Reliability of CET Band4



    CET Band 4 has been administered for more than a decade. It has become so large-scale, so popular and so influential that many testing experts and foreign language teachers are willing to do research on it. In this paper, I mainly analyse its reliability from the perspective of the writing test and the speaking test.

  20. Bypassing BDD Construction for Reliability Analysis

    Williams, Poul Frederick; Nikolskaia, Macha; Rauzy, Antoine


    In this note, we propose a Boolean Expression Diagram (BED)-based algorithm to compute the minimal p-cuts of Boolean reliability models such as fault trees. BEDs make it possible to bypass the Binary Decision Diagram (BDD) construction, which is the main cost of fault tree assessment…
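    To make the object being computed concrete: for a small fault tree, minimal cut sets can be enumerated by brute force. The BED/BDD machinery of the paper exists precisely to avoid this exponential enumeration on large models; the tree below is a toy example and the encoding is an assumption of this sketch.

```python
from itertools import combinations

def top_event(failed, tree):
    """Evaluate a fault tree: a gate is ('AND'|'OR', children);
    leaves are basic-event names (strings)."""
    kind, children = tree
    vals = [(c in failed) if isinstance(c, str) else top_event(failed, c)
            for c in children]
    return all(vals) if kind == 'AND' else any(vals)

def minimal_cut_sets(tree, events):
    """Brute force: smallest sets of basic events that trigger the top event.
    Supersets of an already-found cut set are skipped, so results are minimal."""
    cuts = []
    for r in range(1, len(events) + 1):
        for combo in combinations(events, r):
            s = set(combo)
            if any(c <= s for c in cuts):
                continue  # contains a smaller cut set: not minimal
            if top_event(s, tree):
                cuts.append(s)
    return cuts

# TOP = (A AND B) OR C
tree = ('OR', [('AND', ['A', 'B']), 'C'])
print(minimal_cut_sets(tree, ['A', 'B', 'C']))
```

    Here the minimal cut sets are {C} and {A, B}; BDD- or BED-based algorithms derive the same sets symbolically without iterating over all 2^n event combinations.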

  1. Reliability sensitivity-based correlation coefficient calculation in structural reliability analysis

    Yang, Zhou; Zhang, Yimin; Zhang, Xufang; Huang, Xianzhen


    The correlation coefficients of the random variables of mechanical structures are generally chosen from experience or even ignored, which cannot properly reflect the effects of parameter uncertainties on reliability. To address the selection of correlation coefficients from the reliability-based sensitivity point of view, the theoretical principle of the problem is established from the reliability sensitivity results, and a criterion for correlation among random variables is given. The values of the correlation coefficients are obtained according to the proposed principle and the reliability sensitivity problem is discussed. Numerical studies show the following: (1) If the sensitivity of the reliability to a correlation coefficient ρ is very small (on the order of 0.00001), the correlation can be ignored, which simplifies the procedure without introducing additional error. (2) If the difference between ρs, the coefficient to which the reliability is most sensitive, and ρR, the coefficient giving the smallest reliability, is less than 0.001, ρs is suggested for modelling the dependency of the random variables; this ensures robust quality of the system without loss of the safety requirement. (3) When |Eabs| > 0.001 and |Erel| > 0.001, ρR should be employed to quantify the correlation among random variables in order to ensure the accuracy of the reliability analysis. The proposed approach provides a practical routine for mechanical design and manufacture to study the reliability and reliability-based sensitivity of basic design variables in mechanical reliability analysis and design.

  2. DFTCalc: reliability centered maintenance via fault tree analysis (tool paper)

    Guck, Dennis; Spel, Jip; Stoelinga, Mariëlle; Butler, Michael; Conchon, Sylvain; Zaïdi, Fatiha


    Reliability, availability, maintenance and safety (RAMS) analysis is essential in the evaluation of safety critical systems like nuclear power plants and the railway infrastructure. A widely used methodology within RAMS analysis are fault trees, representing failure propagations throughout a system.


    Hong-Zhong Huang


    Engineering design under uncertainty has gained considerable attention in recent years. A great multitude of new design optimization methodologies and reliability analysis approaches have been put forth with the aim of accommodating various uncertainties. Uncertainties in practical engineering applications are commonly classified into two categories: aleatory uncertainty and epistemic uncertainty. Aleatory uncertainty arises from unpredictable variation in the performance and processes of systems; it is irreducible even when more data or knowledge is added. Epistemic uncertainty, on the other hand, stems from lack of knowledge of the system due to limited data, measurement limitations, or simplified approximations in modeling system behavior, and it can be reduced by obtaining more data or knowledge. More specifically, aleatory uncertainty is naturally represented by a statistical distribution whose parameters can be characterized from sufficient data. If, however, the data is limited and can be quantified only in a statistical sense, epistemic uncertainty can be considered as an alternative tool in such a situation. Of the several optional treatments for epistemic uncertainty, possibility theory and evidence theory have proved to be the most computationally efficient and stable for reliability analysis and engineering design optimization. This study first attempts to provide a better understanding of uncertainty in engineering design by giving a comprehensive overview of its classifications, theories and design considerations. A review is then conducted of general topics such as the foundations and applications of possibility theory and evidence theory, including the most recent results from theoretical research, computational developments and performance improvement, with an emphasis on revealing their capability and characteristics in quantifying uncertainty from different perspectives.

  4. Reliability Analysis of Existing Vertical Wall Breakwaters

    Burcharth, H. F.; Sørensen, John Dalsgaard


    Vertical wall breakwaters are used under quite different conditions, where failure of the breakwater or a part of it can have very different consequences. Further, a number of existing vertical wall breakwaters have been subjected to significant wave loads which have caused partial failures of the structures. The main objective of this paper is to describe how the reliability of existing breakwater structures within the expected remaining lifetime can be estimated taking into account the available information.

  5. Reliability analysis of PLC safety equipment

    Yu, J.; Kim, J. Y. [Chungnam Nat. Univ., Daejeon (Korea, Republic of)


    This work covers FMEA for the nuclear safety-grade PLC, failure rate prediction for the nuclear safety-grade PLC, sensitivity analysis for the component failure rates of the nuclear safety-grade PLC, and unavailability analysis support for the nuclear safety system.

  6. Earth slope reliability analysis under seismic loadings using neural network

    PENG Huai-sheng; DENG Jian; GU De-sheng


    A new method was proposed to cope with the earth slope reliability problem under seismic loadings. The algorithm integrates the concepts of artificial neural networks, the first-order second-moment reliability method and the deterministic stability analysis of earth slopes. The performance function and its derivatives in slope stability analysis under seismic loadings were approximated by a trained multi-layer feed-forward neural network with differentiable transfer functions. The statistical moments calculated from the performance function values and the corresponding gradients obtained from the neural network were then used in the first-order second-moment method to calculate the reliability index in slope safety analysis. Two earth slope examples were presented to illustrate the applicability of the proposed approach. The new method is effective in slope reliability analysis and has potential application to other reliability problems of complicated engineering structures with a considerably large number of random variables.
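    The first-order second-moment step of such an approach can be sketched with finite-difference gradients. The paper evaluates the performance function through a trained neural network surrogate; here a toy analytic safety margin stands in for it, and all numbers and names are illustrative assumptions.

```python
import math

def fosm_beta(g, mu, sd, h=1e-6):
    """First-order second-moment reliability index: linearize the
    performance function g at the mean point and combine variances."""
    g0 = g(mu)
    grads = []
    for i in range(len(mu)):
        x = list(mu)
        x[i] += h
        grads.append((g(x) - g0) / h)  # finite-difference partial derivative
    var_g = sum((gi * si) ** 2 for gi, si in zip(grads, sd))
    return g0 / math.sqrt(var_g)

# Toy safety margin g = c + f*N - S (cohesion, friction term, seismic load);
# in the paper, a trained neural network plays the role of g.
g = lambda x: x[0] + 0.6 * x[1] - x[2]
mu = [30.0, 80.0, 50.0]   # means of the random variables
sd = [4.0, 10.0, 8.0]     # standard deviations
print(round(fosm_beta(g, mu, sd), 2))
```

    The surrogate's job is to make `g` and its gradients cheap to evaluate when the true performance function requires a full deterministic slope stability analysis.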

  7. Design and Analysis for Reliability of Wireless Sensor Network

    Yongxian Song


    Reliability is an important performance indicator of wireless sensor networks; for application fields with high reliability demands, it is particularly important to ensure network reliability. At present there are many research findings on wireless sensor network reliability at home and abroad, but they mainly improve network reliability through network topology, reliable protocols and application-layer fault correction, and comprehensive consideration of reliability from both the hardware and software sides is much rarer. This paper adopts bionic hardware to implement bionic reconfiguration of wireless sensor network nodes, so that the nodes are able to change their structure and behavior autonomously and dynamically when part of the hardware fails, realizing bionic self-healing. Secondly, Markov state diagrams and probability analysis are adopted to solve the functional model for reliability, establish the relationship between reliability and the characteristic parameters of sink nodes, and analyze the sink node reliability model, so as to determine reasonable model parameters and ensure the reliability of sink nodes.
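    The simplest instance of the Markov modelling mentioned above is a two-state up/down model of a self-healing sink node, whose steady-state availability follows from the balance equation. The failure and recovery rates below are assumptions for illustration, not values from the paper.

```python
# Two-state Markov availability model for a sink node: failures occur at
# rate lam; bionic self-healing restores the node at rate mu.
lam = 0.01   # failures per hour (assumed)
mu = 0.5     # recoveries per hour (assumed)

# Steady-state balance: lam * P_up = mu * P_down, with P_up + P_down = 1,
# gives the classic availability formula P_up = mu / (lam + mu).
availability = mu / (lam + mu)
print(round(availability, 4))
```

    Richer models of the kind the paper builds add states for partial hardware failure and reconfiguration, but they are solved from the same balance-equation machinery.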

  8. Statistical analysis on reliability and serviceability of caterpillar tractor

    WANG Jinwu; LIU Jiafu; XU Zhongxiang


    To further the understanding of tractor reliability and serviceability and to furnish scientific and technical bases for their promotion and application, experiments and statistical analysis on the reliability (MTBF) and serviceability (MTTR) of the Dongfanghong-1002 and Dongfanghong-802 tractors were conducted. The results showed that the mean times between failures of the two tractors were 182.62 h and 160.2 h, respectively, and that their weakest assembly was the engine.
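    MTBF and MTTR statistics of this kind come from straightforward averages over field records, which can be sketched as follows; the operating records below are hypothetical, not the tractors' data.

```python
def mtbf_mttr(uptimes, downtimes):
    """Mean time between failures, mean time to repair, and the
    steady-state availability implied by them."""
    mtbf = sum(uptimes) / len(uptimes)
    mttr = sum(downtimes) / len(downtimes)
    availability = mtbf / (mtbf + mttr)
    return mtbf, mttr, availability

# Hypothetical field records (hours of operation between failures,
# hours of repair after each failure)
up = [150, 200, 170, 190]
down = [4, 6, 5, 5]
print(mtbf_mttr(up, down))
```

    Weakest-assembly findings, such as the engine in the abstract above, come from tabulating the same failure records by subsystem before averaging.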

  9. Reliability-Analysis of Offshore Structures using Directional Loads

    Sørensen, John Dalsgaard; Bloch, Allan; Sterndorff, M. J.


    Reliability analyses of offshore structures such as steel jacket platforms are usually performed using stochastic models for the wave loads based on the omnidirectional wave height. However, reliability analyses with respect to structural failure modes such as total collapse of a structure … heights from the central part of the North Sea. It is described how the stochastic model for the directional wave heights can be used in a reliability analysis where total collapse of offshore steel jacket platforms is considered…

  10. Reliability analysis of large, complex systems using ASSIST

    Johnson, Sally C.


    The SURE reliability analysis program is discussed as well as the ASSIST model generation program. It is found that semi-Markov modeling using model reduction strategies with the ASSIST program can be used to accurately solve problems at least as complex as other reliability analysis tools can solve. Moreover, semi-Markov analysis provides the flexibility needed for modeling realistic fault-tolerant systems.

  11. Evaluating some Reliability Analysis Methodologies in Seismic Design

    A. E. Ghoulbzouri


    Problem statement: Accounting for the uncertainties present in the geometric and material data of reinforced concrete buildings is performed in this study within the context of performance-based seismic engineering design. Approach: The reliability of the expected performance state is assessed using various methodologies based on finite element nonlinear static pushover analysis and a specialized reliability software package. The reliability approaches considered included full coupling with an external finite element code and response-surface-based methods in conjunction with either the first-order reliability method or the importance sampling method. Various types of probability distribution functions modelling the parameter uncertainties were introduced. Results: The probability of failure according to the reliability analysis method used and to the selected probability distributions was obtained. Convergence analysis of the importance sampling method was performed, and the required duration of analysis as a function of the reliability method used was evaluated. Conclusion/Recommendations: Reliability results were found to be sensitive to the reliability analysis method and to the selected probability distributions. Durations of analysis for coupling methods were higher than those associated with response-surface-based methods, though one should also include the time needed to derive the latter. For the reinforced concrete building considered in this study, significant variations were found between all the considered reliability methodologies. The fully coupled importance sampling method is recommended, but the first-order reliability method applied to a response surface model can be used with good accuracy. Finally, the probability distributions should be carefully identified, since giving only the mean and the standard deviation was found to be insufficient.

  12. Reliability Distribution of Numerical Control Lathe Based on Correlation Analysis

    Xiaoyan Qi; Guixiang Shen; Yingzhi Zhang; Shuguang Sun; Bingkun Chen


    Combining reliability distribution with correlation analysis, a new method is proposed for allocating reliability that considers the structural correlation and failure correlation of subsystems. First, the subsystems are ranked by means of TOPSIS, which covers the traditional considerations of reliability allocation; a copula connecting function is then introduced to establish a distribution model based on structural, failure, and target correlation, and the reliability target region of each subsystem is obtained with Matlab. In this method, not only the traditional distribution considerations but also correlation influences are taken into account, supplementing the available information and optimizing the distribution.

  13. Reliability and safety analysis of redundant vehicle management computer system

    Shi Jian; Meng Yixuan; Wang Shaoping; Bian Mengmeng; Yan Dungong


    Redundant techniques are widely adopted in vehicle management computers (VMC) to ensure high reliability and safety. At the same time, they give the VMC special characteristics, e.g., failure correlation, event simultaneity, and failure self-recovery, which make reliability and safety analysis of a redundant VMC system (RVMCS) more difficult. Aimed at the difficulties in RVMCS reliability modeling, this paper adopts generalized stochastic Petri nets to establish the reliability and safety models of an RVMCS, and then analyzes RVMCS operating states and potential threats to the flight control system. It is verified by simulation that the reliability of a VMC is not the product of hardware reliability and software reliability, and that interactions between hardware and software faults can obviously reduce the real reliability of the VMC. Furthermore, failure-undetected states and false-alarm states inevitably exist in an RVMCS because of limited fault-monitoring coverage and the false-alarm probability of fault monitoring devices (FMD). An RVMCS operating in some failure-undetected states poses fatal threats to the safety of the flight control system, while operation in some false-alarm states obviously reduces the utility of the RVMCS. The results of this paper can guide reliable VMC and efficient FMD designs, and the methods adopted can also be used to analyze the reliability of other intelligent systems.

  14. Seismic reliability analysis of large electric power systems

    He Jun; Li Jie


    Based on De Morgan's laws and Boolean simplification, a recursive decomposition method is introduced in this paper to identify the main exclusive safe paths and failed paths of a network. The reliability, or a reliability bound, of the network can then be conveniently expressed as the sum of the joint probabilities of these paths. Under the multivariate normal distribution assumption, a conditioned reliability index method is developed to evaluate the joint probabilities of the various exclusive safe and failed paths and, finally, the seismic reliability or reliability bound of an electric power system. Examples given in the paper show that the method is very simple and provides accurate results in seismic reliability analysis.

  15. Theories of Comparative Analysis


    model of a projectile fired from a cannon in a uniform gravitational field serves to demonstrate the problems due to qualitative arithmetic. Nei...recently demonstrated the qualitative Gauss rule, a type of algebraic manipulation that is solution preserving. While it cannot eliminate all...projectile fired from a cannon illustrates this point. Given an increase in muzzle velocity, Vft, as a perturbation, DQ analysis predicts that apogee

  16. Simulation Approach to Mission Risk and Reliability Analysis Project

    National Aeronautics and Space Administration — It is proposed to develop and demonstrate an integrated total-system risk and reliability analysis approach that is based on dynamic, probabilistic simulation. This...

  17. Reliability, Validity, Comparability and Practical Utility of Cybercrime-Related Data, Metrics, and Information

    Nir Kshetri


    With the increasing pervasiveness, prevalence and severity of cybercrimes, various metrics, measures and statistics have been developed and used to quantify different aspects of this phenomenon. Cybercrime-related data, metrics, and information, however, pose important and difficult dilemmas regarding reliability, validity, comparability and practical utility. While many issues of the cybercrime economy are shared with other underground and underworld industries, this economy also has various unique aspects; for one thing, it suffers from a problem partly rooted in the incredibly broad definition of the term "cybercrime". This article seeks to provide insights into and analysis of this phenomenon, and thereby to advance our understanding of cybercrime-related information.

  18. Reliability Analysis of OMEGA Network and Its Variants

    Suman Lata


    The performance of a computer system depends directly on the time required to perform a basic operation and the number of such basic operations that can be performed concurrently. High-performance computing systems can be designed using parallel processing, in which two or more processors or computers communicate with each other to solve a given problem. Multistage interconnection networks (MINs) provide a better way for communication between different processors or memory modules, with less complexity, fast communication, good fault tolerance, high reliability and low cost. The reliability of a system is the probability that it will successfully perform its intended operations for a given time under stated operating conditions. From the reliability analysis it has been observed that adding one stage to an Omega network provides higher terminal reliability than adding two stages to the corresponding network.

  19. Seismic reliability analysis of urban water distribution network

    Li Jie; Wei Shulin; Liu Wei


    An approach to analyzing the seismic reliability of water distribution networks by combining hydraulic analysis with a first-order reliability method (FORM) is proposed in this paper. The hydraulic analysis method for normal conditions is modified to accommodate the special conditions of a seismic hydraulic analysis. To calculate the leakage area and leakage flow of the pipelines in the hydraulic analysis, a new leakage model derived from the seismic response analysis of buried pipelines is presented. To validate the proposed approach, a network with 17 nodes and 24 pipelines is investigated in detail. The approach is also applied to an actual project consisting of 463 nodes and 767 pipelines. The results show that the proposed approach achieves satisfactory results in analyzing the seismic reliability of large-scale water distribution networks.

  20. Reliability Analysis of Dynamic Stability in Waves

    Søborg, Anders Veldt


    exhibit sufficient characteristics with respect to slope at zero heel (GM value), maximum leverarm, positive range of stability and area below the leverarm curve. The rule-based requirements to calm water leverarm curves are entirely based on experience obtained from vessels in operation and recorded......-4 per ship year such brute force Monte-Carlo simulations are not always feasible due to the required computational resources. Previous studies of dynamic stability of ships in waves typically focused on the capsizing event. In this study the objective is to establish a procedure that can identify...... the distribution of the exceedance probability may be established by an estimation of the out-crossing rate of the "safe set" defined by the utility function. This out-crossing rate will be established using the so-called Madsen's Formula. A bi-product of this analysis is a set of short wave time series...

  1. Reliability analysis of the bulk cargo loading system including dependent components

    Blokus-Roszkowska, Agnieszka


    In this paper an innovative approach to the reliability analysis of multistate series-parallel systems that assumes component dependency is presented. The reliability function of a multistate series system with components dependent according to the local load-sharing rule is determined. Linking these results for series systems with results for parallel systems with independent components, we obtain the reliability function of a multistate series-parallel system, assuming dependence of components' departures from the reliability state subsets within each series subsystem and independence between these subsystems. As a particular case, the reliability function of a multistate series-parallel system composed of dependent components having exponential reliability functions is determined. The theoretical results are applied to the reliability evaluation of a bulk cargo transportation system whose main function is to load bulk cargo on board ships. The reliability function and other reliability characteristics of the loading system are determined for the case in which its components have exponential reliability functions with interdependent departure rates from the subsets of their reliability states. Finally, the obtained results are compared with results for a bulk cargo transportation system composed of independent components.
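    The independent-component baseline against which the paper compares can be sketched numerically. The following is a minimal sketch, assuming exponential component reliabilities and purely hypothetical failure rates; it computes the reliability of a series system of parallel subsystems without the load-sharing dependency the paper introduces:

    ```python
    import math

    def exp_rel(lam, t):
        """Exponential component reliability R(t) = exp(-lam * t)."""
        return math.exp(-lam * t)

    def parallel(rels):
        """Parallel subsystem: fails only when every component has failed."""
        fail = 1.0
        for r in rels:
            fail *= (1.0 - r)
        return 1.0 - fail

    def series(rels):
        """Series arrangement: fails as soon as any element fails."""
        rel = 1.0
        for r in rels:
            rel *= r
        return rel

    def series_parallel(rates, t):
        """rates: one list of component failure rates per parallel subsystem;
        the subsystems are connected in series."""
        return series([parallel([exp_rel(lam, t) for lam in sub]) for sub in rates])

    # Hypothetical loading system: two parallel subsystems in series, rates per hour
    R = series_parallel([[0.01, 0.02], [0.015, 0.015]], t=10.0)
    ```

    Introducing the local load-sharing rule would replace the independent `parallel` term with a subsystem reliability whose component departure rates change as neighbours fail.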

  2. Test-retest reliability of trunk accelerometric gait analysis

    Henriksen, Marius; Lund, Hans; Moe-Nilssen, R


    The purpose of this study was to determine the test-retest reliability of a trunk accelerometric gait analysis in healthy subjects. Accelerations were measured during walking using a triaxial accelerometer mounted on the lumbar spine of the subjects. Six men and 14 women (mean age 35.2; range 18...... then computed and interpolated using quadratic curve fits and point estimates were calculated at a standardised walking speed of 1.35 m/s. Relative reliability was determined using two models of intraclass correlation coefficients (ICC(1,1) and ICC(3,1)) to assess any systematic shifts and absolute reliability...
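    The two ICC models mentioned can be computed from a subjects-by-sessions table of measurements. A minimal pure-Python sketch using the Shrout-Fleiss definitions (the data values below are hypothetical, not from the study):

    ```python
    def icc(data):
        """Intraclass correlation coefficients ICC(1,1) and ICC(3,1)
        (Shrout & Fleiss) for a subjects-by-sessions table."""
        n, k = len(data), len(data[0])
        grand = sum(sum(row) for row in data) / (n * k)
        row_means = [sum(row) / k for row in data]
        col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
        ss_total = sum((x - grand) ** 2 for row in data for x in row)
        ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between subjects
        ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between sessions
        bms = ss_rows / (n - 1)                                  # between-subjects mean square
        wms = (ss_total - ss_rows) / (n * (k - 1))               # within-subjects mean square
        ems = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))  # residual mean square
        icc11 = (bms - wms) / (bms + (k - 1) * wms)
        icc31 = (bms - ems) / (bms + (k - 1) * ems)
        return icc11, icc31

    # Hypothetical test-retest data: 4 subjects measured in 2 sessions
    icc11, icc31 = icc([[10.1, 10.3], [12.0, 11.8], [9.5, 9.6], [11.2, 11.4]])
    ```

    ICC(1,1) treats session effects as random error, while ICC(3,1) removes any systematic shift between sessions, which is why the study reports both.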

  3. Analysis on Operation Reliability of Generating Units in 2009



    This paper presents data on operation reliability indices, with relevant analyses, for China's conventional power generating units in 2009. The units included in the statistical analysis are thermal generating units of 100 MW or above, hydro generating units of 40 MW or above, and all nuclear generating units. The reliability indices covered include utilization hours, times and hours of scheduled outages, times and hours of unscheduled outages, equivalent forced outage rate and equivalent availability factor.

  4. Coverage Modeling and Reliability Analysis Using Multi-state Function


    Fault tree analysis is an effective method for predicting the reliability of a system. It gives a pictorial representation and logical framework for analyzing reliability, and it has long been used for the quantitative and qualitative analysis of the failure modes of critical systems. In this paper, we propose a new general coverage model (GCM) based on hardware independent faults. Using this model, an effective software tool can be constructed to detect, locate and recover from faults in the system. The model can also be applied to identify the key components whose failure can cause the failure of the system, using failure mode and effects analysis (FMEA).

  5. Reliability analysis of repairable systems using system dynamics modeling and simulation

    Srinivasa Rao, M.; Naikan, V. N. A.


    The study and analysis of repairable standby systems is an important topic in reliability. Analytical techniques become very complicated and unrealistic, especially for modern complex systems, and there have been attempts in the literature to evolve more realistic techniques using simulation approaches for the reliability analysis of systems. This paper proposes a hybrid approach, called the Markov system dynamics (MSD) approach, which combines the Markov approach with system dynamics simulation to perform reliability analysis and to study the dynamic behavior of systems. This approach has the advantages of both the Markov and system dynamics methodologies. The proposed framework is illustrated for a standby system with repair. The simulation results, when compared with those obtained by traditional Markov analysis, clearly validate the MSD approach as an alternative approach for reliability analysis.
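    The Markov half of such a hybrid can be illustrated on a small standby system with repair. A sketch, assuming a two-unit cold-standby system with one repair crew and hypothetical constant failure and repair rates, comparing the analytic steady-state availability of the CTMC against a crude event-driven simulation of the same chain:

    ```python
    import random

    def standby_availability(lam, mu):
        """Analytic steady-state availability of a two-unit cold-standby system
        with one repair crew (CTMC states = number of failed units: 0, 1, 2)."""
        rho = lam / mu
        p0 = 1.0 / (1.0 + rho + rho ** 2)
        return 1.0 - rho ** 2 * p0          # down only when both units have failed

    def simulate_availability(lam, mu, horizon, seed=1):
        """Crude event-driven simulation of the same CTMC for cross-checking."""
        rng = random.Random(seed)
        t, state, up_time = 0.0, 0, 0.0
        while t < horizon:
            rate = (lam if state < 2 else 0.0) + (mu if state > 0 else 0.0)
            dt = rng.expovariate(rate)
            if state < 2:
                up_time += min(dt, horizon - t)
            t += dt
            if t >= horizon:
                break
            if state == 0 or (state < 2 and rng.random() < lam / rate):
                state += 1      # a failure occurred
            else:
                state -= 1      # a repair completed
        return up_time / horizon

    A_markov = standby_availability(0.1, 1.0)        # hypothetical rates, per hour
    A_sim = simulate_availability(0.1, 1.0, 20000.0)
    ```

    The MSD approach described in the abstract would replace the hand-written event loop with a system dynamics model, while retaining the Markov chain as the validation reference.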

  6. Reliability analysis and initial requirements for FC systems and stacks

    Åström, K.; Fontell, E.; Virtanen, S.

    In the year 2000, Wärtsilä Corporation started an R&D program to develop SOFC systems for CHP applications. The program aims to bring to market highly efficient, clean and cost-competitive fuel cell systems with rated power outputs in the range of 50-250 kW for distributed generation and marine applications, with Wärtsilä focusing on system integration and development. System reliability and availability are key issues determining the competitiveness of SOFC technology. At Wärtsilä, methods have been implemented for analysing the system with respect to reliability and safety, as well as for defining reliability requirements for system components. A fault tree representation is used as the basis for reliability prediction analysis, and a dynamic simulation technique has been developed to allow for non-static properties in the fault tree logic modelling. Special emphasis has been placed on reliability analysis of the fuel cell stacks in the system. A method has been developed for assessing the reliability and critical-failure-predictability requirements for fuel cell stacks in a system consisting of several stacks. The method is based on a qualitative model of the stack configuration in which each stack can be in a functional, partially failed or critically failed state, each state having different failure rates and effects on system behaviour. The main purpose of the method is to understand the effect of stack reliability, critical failure predictability and operating strategy on system reliability and availability. An example configuration, consisting of 5 × 5 stacks (a series of 5 sets of 5 parallel stacks), is analysed with respect to stack reliability requirements as a function of the predictability of critical failures and the Weibull shape factor of the failure rate distributions.

  7. Reliability analysis of flood defence systems in the Netherlands

    Lassing, B.L.; Vrouwenvelder, A.C.W.M.; Waarts, P.H.


    In recent years an advanced program for reliability analysis of dike systems has been under development in the Netherlands. This paper describes the global data requirements for application and the set-up of the models in the Netherlands. The analysis generates an estimate of the probability of sys




    The introduction of pervasive and mobile devices has led to immense growth in real-time distributed processing. In such a context, the reliability of the computing environment is very important. Reliability is the probability that the devices, links, processes, programs and files work efficiently for a specified period of time and under specified conditions. Distributed systems are available as conventional ring networks, clusters and agent-based systems, and the reliability of such systems is the focus here. These networks are heterogeneous and scalable in nature. Several factors must be considered in reliability estimation: application-related factors such as algorithms, data-set sizes, memory usage patterns, input-output, communication patterns, task granularity and load balancing; hardware-related factors such as processor architecture, memory hierarchy, input-output configuration and network; and software-related factors such as the operating system, compiler, communication protocols, libraries and preprocessor performance. In estimating the reliability of a system, performance estimation is an important aspect, and reliability analysis is approached using probability.

  9. On reliability analysis of multi-categorical forecasts

    J. Bröcker


    Reliability analysis of probabilistic forecasts, in particular through the rank histogram or Talagrand diagram, is revisited. Two shortcomings are pointed out: firstly, a uniform rank histogram is only a necessary condition for reliability; secondly, if the forecast is assumed to be reliable, an indication is needed of how far a histogram is expected to deviate from uniformity merely due to randomness. Concerning the first shortcoming, it is suggested that forecasts be grouped or stratified along suitable criteria, and that reliability be analyzed individually for each forecast stratum. A reliable forecast should have uniform histograms for all individual forecast strata, not only for all forecasts as a whole. As to the second shortcoming, instead of the observed frequencies, the probability of the observed frequency is plotted, providing an indication of the likelihood of the result under the hypothesis that the forecast is reliable. Furthermore, a goodness-of-fit statistic is discussed, which is essentially the reliability term of the Ignorance score. The discussed tools are applied to medium-range forecasts of 2 m temperature anomalies at several locations and lead times. The forecasts are stratified along the expected ranked probability score. Those forecasts which feature a high expected score turn out to be particularly unreliable.
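    A rank histogram of the kind discussed is simple to construct: for each forecast case, count how many ensemble members fall below the observation. A sketch with synthetic data (a deliberately reliable forecast, in which observations are drawn from the same distribution as the ensemble members, so the histogram should come out close to uniform):

    ```python
    import random

    def rank_histogram(ensembles, observations):
        """For each case, the rank = number of ensemble members below the
        observation (0..m for an m-member ensemble); ties ignored in this sketch."""
        m = len(ensembles[0])
        counts = [0] * (m + 1)
        for ens, obs in zip(ensembles, observations):
            counts[sum(1 for x in ens if x < obs)] += 1
        return counts

    # Synthetic, deliberately reliable forecast: observation and ensemble members
    # are drawn from the same distribution, so ranks are uniform on 0..m.
    rng = random.Random(0)
    ensembles = [[rng.gauss(0.0, 1.0) for _ in range(9)] for _ in range(5000)]
    observations = [rng.gauss(0.0, 1.0) for _ in range(5000)]
    counts = rank_histogram(ensembles, observations)
    ```

    The paper's second point is visible even here: with 5000 cases and 10 bins, sampling noise alone makes the bin counts scatter around 500, so uniformity must be judged against that expected deviation rather than by eye.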

  10. The development of a reliable amateur boxing performance analysis template.

    Thomson, Edward; Lamb, Kevin; Nicholas, Ceri


    The aim of this study was to devise a valid performance analysis system for the assessment of the movement characteristics associated with competitive amateur boxing and assess its reliability using analysts of varying experience of the sport and performance analysis. Key performance indicators to characterise the demands of an amateur contest (offensive, defensive and feinting) were developed and notated using a computerised notational analysis system. Data were subjected to intra- and inter-observer reliability assessment using median sign tests and calculating the proportion of agreement within predetermined limits of error. For all performance indicators, intra-observer reliability revealed non-significant differences between observations (P > 0.05) and high agreement was established (80-100%) regardless of whether exact or the reference value of ±1 was applied. Inter-observer reliability was less impressive for both analysts (amateur boxer and experienced analyst), with the proportion of agreement ranging from 33-100%. Nonetheless, there was no systematic bias between observations for any indicator (P > 0.05), and the proportion of agreement within the reference range (±1) was 100%. A reliable performance analysis template has been developed for the assessment of amateur boxing performance and is available for use by researchers, coaches and athletes to classify and quantify the movement characteristics of amateur boxing.

  11. Reliability analysis of cluster-based ad-hoc networks

    Cook, Jason L. [Quality Engineering and System Assurance, Armament Research Development Engineering Center, Picatinny Arsenal, NJ (United States); Ramirez-Marquez, Jose Emmanuel [School of Systems and Enterprises, Stevens Institute of Technology, Castle Point on Hudson, Hoboken, NJ 07030 (United States)]


    The mobile ad-hoc wireless network (MAWN) is a new and emerging network scheme that is being employed in a variety of applications. The MAWN varies from traditional networks because it is a self-forming and dynamic network. The MAWN is free of infrastructure and, as such, only the mobile nodes comprise the network. Pairs of nodes communicate either directly or through other nodes. To do so, each node acts, in turn, as a source, destination, and relay of messages. The virtue of a MAWN is the flexibility this provides; however, the challenge for reliability analyses is also brought about by this unique feature. The variability and volatility of the MAWN configuration makes typical reliability methods (e.g. reliability block diagram) inappropriate because no single structure or configuration represents all manifestations of a MAWN. For this reason, new methods are being developed to analyze the reliability of this new networking technology. New published methods adapt to this feature by treating the configuration probabilistically or by inclusion of embedded mobility models. This paper joins both methods together and expands upon these works by modifying the problem formulation to address the reliability analysis of a cluster-based MAWN. The cluster-based MAWN is deployed in applications with constraints on networking resources such as bandwidth and energy. This paper presents the problem's formulation, a discussion of applicable reliability metrics for the MAWN, and illustration of a Monte Carlo simulation method through the analysis of several example networks.
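    A Monte Carlo estimate of two-terminal reliability for a single network snapshot can be sketched as follows (the topology and node-availability probability are hypothetical; a full MAWN analysis would additionally sample the configuration itself from a mobility model, as the paper describes):

    ```python
    import random
    from collections import deque

    def connected(n_nodes, up, edges, s, t):
        """Breadth-first search restricted to operational nodes."""
        if not (up[s] and up[t]):
            return False
        adj = [[] for _ in range(n_nodes)]
        for a, b in edges:
            if up[a] and up[b]:
                adj[a].append(b)
                adj[b].append(a)
        seen, queue = {s}, deque([s])
        while queue:
            v = queue.popleft()
            if v == t:
                return True
            for w in adj[v]:
                if w not in seen:
                    seen.add(w)
                    queue.append(w)
        return False

    def two_terminal_reliability(edges, n_nodes, p_up, s, t, trials=20000, seed=42):
        """Monte Carlo: fraction of sampled node-failure states where s reaches t."""
        rng = random.Random(seed)
        hits = 0
        for _ in range(trials):
            up = [rng.random() < p_up for _ in range(n_nodes)]
            hits += connected(n_nodes, up, edges, s, t)
        return hits / trials

    # Hypothetical 4-node snapshot: source 0, sink 3, two disjoint relay paths
    edges = [(0, 1), (1, 3), (0, 2), (2, 3)]
    R = two_terminal_reliability(edges, 4, 0.9, 0, 3)
    ```

    For this small example the exact answer is 0.9² × (1 − 0.1²) = 0.8019, which the simulation approaches; the value of the Monte Carlo route is that it still works when no single structure represents the network.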

  12. Reliability analysis of wind turbines exposed to dynamic loads

    Sørensen, John Dalsgaard


    . Therefore the turbine components should be designed to have sufficient reliability with respect to both extreme and fatigue loads also not be too costly (and safe). This paper presents models for uncertainty modeling and reliability assessment of especially the structural components such as tower, blades...... the reliability of the structural components. Illustrative examples are presented considering uncertainty modeling and reliability assessment for structural wind turbine components exposed to extreme loads and fatigue, respectively.......Wind turbines are exposed to highly dynamic loads that cause fatigue and extreme load effects which are subject to significant uncertainties. Further, reduction of cost of energy for wind turbines are very important in order to make wind energy competitive compared to other energy sources...

  13. Reliability Analysis of Wireless Sensor Networks Using Markovian Model

    Jin Zhu


    This paper investigates the reliability of wireless sensor networks whose topology switches among possible connections governed by a Markov chain. We give quantized relations between network topology, data acquisition rate, nodes' computational ability, and network reliability. By applying the Lyapunov method, sufficient conditions for network reliability are proposed for such topology-switching networks with constant or varying data acquisition rates. When these conditions are satisfied, the quantity of data transported over a wireless network node will not exceed the node capacity, so that reliability is ensured. Our theoretical work helps to provide a deeper understanding of real-world wireless sensor networks and may find application in network design and topology control.

  14. Statistical models and methods for reliability and survival analysis

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Limnios, Nikolaos; Gerville-Reache, Leo


    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  15. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William


    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both: A broad perspective on data analysis collection and evaluation issues. A narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining a risk informed decision making environment that is being sought by NASA requirements and procedures such as 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
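    The simplest instance of the Bayesian data-assessment methods that such guidelines cover is the conjugate Beta-Binomial update for a per-demand failure probability. A sketch with a Jeffreys prior and hypothetical demand data (the prior choice and counts are illustrative, not from the document):

    ```python
    def beta_binomial_update(alpha, beta, failures, demands):
        """Conjugate Bayesian update of a Beta(alpha, beta) prior on a
        per-demand failure probability, given binomial demand data."""
        return alpha + failures, beta + (demands - failures)

    def beta_mean(alpha, beta):
        """Posterior point estimate of the failure probability."""
        return alpha / (alpha + beta)

    # Jeffreys prior Beta(0.5, 0.5); hypothetical data: 2 failures in 100 demands
    a, b = beta_binomial_update(0.5, 0.5, 2, 100)
    posterior_mean = beta_mean(a, b)
    ```

    The same conjugate pattern (Gamma-Poisson for failure rates, Beta-Binomial for demand failures) underlies much of the hands-on material such a guide presents.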

  16. Notes on numerical reliability of several statistical analysis programs

    Landwehr, J.M.; Tasker, Gary D.


    This report presents a benchmark analysis of several statistical analysis programs currently in use in the USGS. The benchmark consists of a comparison between the values provided by a statistical analysis program for variables in the reference data set ANASTY and their known or calculated theoretical values. The ANASTY data set is an amendment of the Wilkinson NASTY data set that has been used in the statistical literature to assess the reliability (computational correctness) of calculated analytical results.

  17. Distribution System Reliability Analysis for Smart Grid Applications

    Aljohani, Tawfiq Masad

    Reliability of power systems is a key aspect in modern power system planning, design, and operation. The ascendance of the smart grid concept has provided high hopes of developing an intelligent network that is capable of being a self-healing grid, offering the ability to overcome the interruption problems that face the utility and cost it tens of millions in repair and loss. To address its reliability concerns, the power utilities and interested parties have spent extensive amount of time and effort to analyze and study the reliability of the generation and transmission sectors of the power grid. Only recently has attention shifted to be focused on improving the reliability of the distribution network, the connection joint between the power providers and the consumers where most of the electricity problems occur. In this work, we will examine the effect of the smart grid applications in improving the reliability of the power distribution networks. The test system used in conducting this thesis is the IEEE 34 node test feeder, released in 2003 by the Distribution System Analysis Subcommittee of the IEEE Power Engineering Society. The objective is to analyze the feeder for the optimal placement of the automatic switching devices and quantify their proper installation based on the performance of the distribution system. The measures will be the changes in the reliability system indices including SAIDI, SAIFI, and EUE. The goal is to design and simulate the effect of the installation of the Distributed Generators (DGs) on the utility's distribution system and measure the potential improvement of its reliability. The software used in this work is DISREL, which is intelligent power distribution software that is developed by General Reliability Co.
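    The indices named (SAIDI, SAIFI) are simple ratios over an outage log, following the standard IEEE 1366 definitions. A sketch with hypothetical interruption data:

    ```python
    def saifi(events, total_customers):
        """System Average Interruption Frequency Index:
        total customer interruptions / total customers served."""
        return sum(cust for cust, _ in events) / total_customers

    def saidi(events, total_customers):
        """System Average Interruption Duration Index:
        total customer-hours of interruption / total customers served."""
        return sum(cust * hours for cust, hours in events) / total_customers

    # Hypothetical outage log: (customers affected, duration in hours)
    events = [(500, 1.5), (200, 0.5)]
    total_customers = 1000
    freq = saifi(events, total_customers)   # interruptions per customer per period
    dur = saidi(events, total_customers)    # outage hours per customer per period
    ```

    Placing an automatic switch or a DG unit so that the second event affects fewer customers directly lowers both indices, which is how the thesis quantifies the benefit of each candidate placement.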

  18. Semigroup Method for a Mathematical Model in Reliability Analysis

    Geni Gupur; Li Xue-zhi


    The system consisting of a reliable machine, an unreliable machine and a storage buffer with infinitely many workpieces is studied. The existence of a unique positive time-dependent solution of the model corresponding to the system is obtained by using the C0-semigroup theory of linear operators in functional analysis.

  19. Reliability-Based Robustness Analysis for a Croatian Sports Hall

    Čizmar, Dean; Kirkegaard, Poul Henning; Sørensen, John Dalsgaard


    This paper presents a probabilistic approach for structural robustness assessment for a timber structure built a few years ago. The robustness analysis is based on a structural reliability based framework for robustness and a simplified mechanical system modelling of a timber truss system. A comp...

  20. Analysis on Operation Reliability of Generating Units in 2005

    Zuo Xiaowen; Chu Xue


    The weighted average equivalent availability factor of thermal power units in 2005 was 92.34%, an increase of 0.64 percentage points over 2004. The average equivalent availability factor in 2005 was 92.22%, a decrease of 0.95 percentage points from 2004. The nationwide operation reliability of generating units in 2005 is analyzed in detail in this paper.

  1. National Launch System comparative economic analysis

    Prince, A.


    Results are presented from an analysis of economic benefits (or losses), in the form of the life cycle cost savings, resulting from the development of the National Launch System (NLS) family of launch vehicles. The analysis was carried out by comparing various NLS-based architectures with the current Shuttle/Titan IV fleet. The basic methodology behind this NLS analysis was to develop a set of annual payload requirements for the Space Station Freedom and LEO, to design launch vehicle architectures around these requirements, and to perform life-cycle cost analyses on all of the architectures. A SEI requirement was included. Launch failure costs were estimated and combined with the relative reliability assumptions to measure the effects of losses. Based on the analysis, a Shuttle/NLS architecture evolving into a pressurized-logistics-carrier/NLS architecture appears to offer the best long-term cost benefit.

  2. Classification using least squares support vector machine for reliability analysis

    Zhi-wei GUO; Guang-chen BAI


    In order to improve the efficiency of the support vector machine (SVM) for classification when dealing with a large number of samples, the least squares support vector machine (LSSVM) for classification is introduced into reliability analysis. To reduce the computational cost, the solution of the SVM is transformed from a quadratic programming problem to a group of linear equations. The numerical results indicate that the reliability method based on the LSSVM for classification has higher accuracy and requires less computational cost than the SVM method.
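    The transformation the abstract describes, from a quadratic program to a set of linear equations, is the defining feature of the LSSVM. A self-contained pure-Python sketch (RBF kernel; the toy data and regularization value are hypothetical, not from the paper):

    ```python
    import math

    def solve(A, b):
        """Gaussian elimination with partial pivoting (pure Python)."""
        n = len(A)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[piv] = M[piv], M[col]
            for r in range(col + 1, n):
                f = M[r][col] / M[col][col]
                for c in range(col, n + 1):
                    M[r][c] -= f * M[col][c]
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
        return x

    def rbf(u, v, sigma=1.0):
        return math.exp(-sum((a - b) ** 2 for a, b in zip(u, v)) / (2.0 * sigma ** 2))

    def lssvm_train(X, y, gamma=10.0):
        """Solve the LSSVM KKT system [[0, y^T], [y, Omega + I/gamma]][b; alpha] = [0; 1]
        with Omega_ij = y_i y_j K(x_i, x_j): one linear solve instead of a QP."""
        n = len(X)
        A = [[0.0] * (n + 1) for _ in range(n + 1)]
        rhs = [0.0] + [1.0] * n
        for i in range(n):
            A[0][i + 1] = y[i]
            A[i + 1][0] = y[i]
            for j in range(n):
                A[i + 1][j + 1] = y[i] * y[j] * rbf(X[i], X[j])
            A[i + 1][i + 1] += 1.0 / gamma
        sol = solve(A, rhs)
        return sol[0], sol[1:]              # bias b, multipliers alpha

    def lssvm_predict(X, y, b, alpha, x):
        s = b + sum(alpha[i] * y[i] * rbf(X[i], x) for i in range(len(X)))
        return 1 if s >= 0.0 else -1

    # Toy separable data (hypothetical limit state: roughly x0 + x1 > 0 => class +1)
    X = [[-1.0, -1.0], [-1.0, 0.5], [1.0, 1.0], [0.5, 1.5]]
    y = [-1, -1, 1, 1]
    b, alpha = lssvm_train(X, y)
    ```

    In a reliability setting the classifier approximates the sign of the limit-state function, so that failure probabilities can be estimated by sampling the cheap surrogate instead of the full model.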

  3. Human Reliability Analysis for Digital Human-Machine Interfaces

    Ronald L. Boring


    This paper addresses the fact that existing human reliability analysis (HRA) methods do not provide guidance on digital human-machine interfaces (HMIs). Digital HMIs are becoming ubiquitous in nuclear power operations, whether through control room modernization or new-build control rooms. Legacy analog technologies like instrumentation and control (I&C) systems are costly to support, and vendors no longer develop or support analog technology, which is considered technologically obsolete. Yet, despite the inevitability of digital HMI, no current HRA method provides guidance on how to treat human reliability considerations for digital technologies.

  4. Modelling application for cognitive reliability and error analysis method

    Fabio De Felice


    The automation of production systems has delegated the execution of highly repetitive and standardized tasks to machines. In the last decade, however, the failure of the fully automatic factory model has led to partially automated configurations of production systems. In this scenario, the centrality and responsibility of the role entrusted to human operators are heightened, because that role requires problem-solving and decision-making ability. The human operator is thus the core of a cognitive process that leads to decisions and influences the safety of the whole system as a function of operator reliability. The aim of this paper is to propose a modelling application for the cognitive reliability and error analysis method.

  5. Fatigue damage reliability analysis for Nanjing Yangtze river bridge using structural health monitoring data

    HE Xu-hui; CHEN Zheng-qing; YU Zhi-wu; HUANG Fang-lin


    To evaluate the fatigue damage reliability of critical members of the Nanjing Yangtze river bridge, the corresponding expressions for calculating structural fatigue damage reliability were derived from the stress-number (S-N) curve and Miner's rule. Fatigue damage reliability analysis of some critical members of the bridge was carried out using the strain-time histories measured by the structural health monitoring system of the bridge, with the corresponding stress spectra obtained by the real-time rain-flow counting method. Fatigue damage was calculated by the reliability method at different reliability levels and compared with Miner's rule. The results show that the fatigue damage of critical members of the Nanjing Yangtze river bridge is very small due to its low live-load stress level.
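
    The damage-accumulation step described above can be illustrated with a minimal Palmgren-Miner calculation. The S-N curve form N = C/S^m and all numerical values below are illustrative assumptions, not the bridge's measured parameters; in the paper, the stress spectrum would come from rain-flow counting of the monitored strain histories.

```python
def miner_damage(spectrum, C=1e12, m=3.0):
    """Palmgren-Miner cumulative damage for a stress spectrum.

    spectrum: list of (stress_range_MPa, n_cycles) pairs, e.g. obtained
    from rain-flow counting of measured strain-time histories.
    S-N curve assumed in the form N = C / S**m (C and m are hypothetical).
    Failure is predicted when the damage sum reaches 1.0.
    """
    return sum(n / (C / s ** m) for s, n in spectrum)

# Illustrative spectrum: (stress range in MPa, cycle count)
spectrum = [(40.0, 2.0e5), (60.0, 5.0e4), (80.0, 1.0e4)]
D = miner_damage(spectrum)   # damage well below 1.0 for low stress ranges
```

    A small damage sum like this one is consistent with the abstract's conclusion that low live-load stress levels produce very little fatigue damage.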

  6. Generating function approach to reliability analysis of structural systems


    The generating function approach is an important tool for performance assessment of multi-state systems. Aiming at strength reliability analysis of structural systems, the generating function approach is introduced and developed here. Static reliability models of statically determinate and indeterminate systems, as well as fatigue reliability models, are built by constructing special generating functions, which describe the probability distributions of strength (resistance), stress (load) and fatigue life, and by defining composite operators of generating functions and their performance structure functions. When composition operators are executed, computational cost can be substantially reduced by collecting like terms. Theoretical analysis and numerical simulation show that the generating function approach can be widely used for probability modeling of large complex systems with hierarchical structures, owing to its unified form, compact expression, ease of computer implementation and high generality. Because the new method considers twin loads giving rise to component failure dependency, it can provide a theoretical reference and a powerful tool for static and dynamic reliability analysis of civil engineering structures and mechanical equipment systems with multi-mode damage coupling.
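
    The core idea, representing discrete strength and stress distributions as generating functions and combining them with a composite operator that collects like terms, can be sketched as follows. The PMFs and the structure function are illustrative assumptions, not values from the paper.

```python
from collections import defaultdict

def compose(gf_strength, gf_stress, structure):
    """Composite operator on two generating functions.

    Each gf maps a state value to its probability; `structure` maps a
    (strength, stress) pair to a performance state.  Accumulating into a
    dict collects like terms, which is what keeps the computation cheap.
    """
    out = defaultdict(float)
    for r, pr in gf_strength.items():
        for s, ps in gf_stress.items():
            out[structure(r, s)] += pr * ps
    return dict(out)

strength = {100: 0.7, 80: 0.3}   # resistance PMF (illustrative)
stress = {90: 0.5, 60: 0.5}      # load PMF (illustrative)

# Performance structure function: survive (1) when strength exceeds stress
gf = compose(strength, stress, lambda r, s: int(r > s))
reliability = gf[1]              # probability of the surviving state
```

    Nested compositions of this operator build up hierarchical system models; collecting like terms at each level keeps the number of retained states small.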

  7. Accident Sequence Evaluation Program: Human reliability analysis procedure

    Swain, A.D.


    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the "ASEP HRA Procedure," is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples) and more detailed definitions of some of the terms. 42 refs.

  8. Strength Reliability Analysis of Turbine Blade Using Surrogate Models

    Wei Duan


    There are many stochastic parameters that affect the reliability of steam turbine blade performance in practical operation. In order to improve the reliability of blade design, it is necessary to take these stochastic parameters into account. In this study, a variable cross-section twisted blade is investigated, and geometrical parameters, material parameters and load parameters are considered as random variables. A reliability analysis method combining the finite element method (FEM), a surrogate model and Monte Carlo simulation (MCS) is applied to the blade reliability analysis. Based on the blade finite element parametric model and an experimental design, two kinds of surrogate models, a polynomial response surface (PRS) and an artificial neural network (ANN), are applied to construct approximate analytical expressions between the blade responses (maximum stress and deflection) and the random input variables; these act as a surrogate for the finite element solver and drastically reduce the number of simulations required. The surrogate is then used for most of the samples needed in the Monte Carlo method, and the statistical parameters and cumulative distribution functions of the maximum stress and deflection are obtained by Monte Carlo simulation. Finally, a probabilistic sensitivity analysis, which combines the magnitude of the gradient and the width of the scatter range of the random input variables, is applied to evaluate how much the maximum stress and deflection of the blade are influenced by the random nature of the input parameters.
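
    The PRS-plus-MCS workflow can be sketched in one dimension. Here an analytic function stands in for the expensive FEM solver, and the distributions and threshold are illustrative assumptions; the point is that the polynomial surrogate is fitted to a handful of solver calls, after which the Monte Carlo loop runs on the surrogate alone.

```python
import numpy as np

rng = np.random.default_rng(0)

def fem_response(x):
    # Stand-in for the FEM solver (assumption; e.g. a stress response)
    return 1.0 + 2.0 * x + 0.5 * x ** 2

# Design of experiments: a few "expensive" solver evaluations
x_doe = np.linspace(-2.0, 2.0, 9)
y_doe = fem_response(x_doe)

# Quadratic polynomial response surface (PRS) fitted to the DOE points
coeffs = np.polyfit(x_doe, y_doe, 2)
surrogate = np.poly1d(coeffs)

# Monte Carlo on the cheap surrogate instead of the solver
x_mc = rng.normal(0.0, 1.0, 100_000)
y_mc = surrogate(x_mc)
p_exceed = np.mean(y_mc > 4.0)   # probability the response exceeds a limit
```

    In the paper the same role is played by a multivariate PRS or an ANN trained on FEM runs; the surrogate is what makes the 100,000-sample Monte Carlo loop affordable.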

  9. Using a Hybrid Cost-FMEA Analysis for Wind Turbine Reliability Analysis

    Nacef Tazi


    Failure mode and effects analysis (FMEA) has been proven to be an effective methodology for improving system design reliability. However, the standard approach reveals some weaknesses when applied to wind turbine systems. The conventional criticality assessment method has been criticized as having many limitations, such as the weighting of severity and detection factors. In this paper, we aim to overcome these drawbacks and develop a hybrid cost-FMEA that integrates cost factors into the criticality assessment; these costs range from replacement costs to expected failure costs. A quantitative comparative study is then carried out to identify the average failure rate, main causes of failure, expected failure costs, and failure detection techniques. A dedicated reliability analysis of the gearbox and rotor blades is presented.

  10. Asymptotic Sampling for Reliability Analysis of Adhesive Bonded Stepped Lap Composite Joints

    Kimiaeifar, Amin; Lund, Erik; Thomsen, Ole Thybo;


    Reliability analysis coupled with finite element analysis (FEA) of composite structures is computationally very demanding and requires a large number of simulations to achieve an accurate prediction of the probability of failure with a small standard error. In this paper Asymptotic Sampling, which...... failure in the composite and adhesive layers, respectively, and the results are compared with the target reliability level implicitly used in the wind turbine standard IEC 61400-1. The accuracy and efficiency of Asymptotic Sampling is investigated by comparing the results with predictions obtained using...

  11. Modeling and Analysis of Component Faults and Reliability

    Le Guilly, Thibaut; Olsen, Petur; Ravn, Anders Peter;


    that are automatically generated. The stochastic information on the faults is used to estimate the reliability of the fault affected system. The reliability is given with respect to properties of the system state space. We illustrate the process on a concrete example using the Uppaal model checker for validating...... the ideal system model and the fault modeling. Then the statistical version of the tool, UppaalSMC, is used to find reliability estimates.......This chapter presents a process to design and validate models of reactive systems in the form of communicating timed automata. The models are extended with faults associated with probabilities of occurrence. This enables a fault tree analysis of the system using minimal cut sets...

  12. Reliability Analysis of Free Jet Scour Below Dams

    Chuanqi Li


    Current formulas for calculating scour depth below a free overfall are mostly deterministic in nature and do not adequately consider the uncertainties of the various scouring parameters. A reliability-based assessment of scour, taking into account the uncertainties of the parameters and coefficients involved, should therefore be performed. This paper studies the reliability of a dam foundation under the threat of scour. A model for calculating the reliability of scour and estimating the probability of failure of the dam foundation subjected to scour is presented. The maximum entropy method is applied to construct the probability density function (PDF) of the performance function subject to the moment constraints, and Monte Carlo simulation (MCS) is applied for uncertainty analysis. An example is considered: the reliability against scour is computed, and the influence of the various random variables on the failure probability is analyzed.
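
    A Monte Carlo version of such an assessment, reduced to its essentials, is sketched below. The scour formula, the distributions and the allowable depth are all illustrative assumptions (the paper's maximum-entropy PDF construction is not reproduced here); the sketch only shows how a failure probability falls out of sampling the uncertain parameters.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Illustrative random inputs (distributions are assumptions)
q = rng.normal(10.0, 1.5, n)    # unit discharge
H = rng.normal(20.0, 2.0, n)    # head drop
k = rng.normal(1.0, 0.1, n)     # empirical scour coefficient

# Simplified power-law scour-depth formula (hypothetical exponents)
scour = k * q ** 0.6 * H ** 0.3

allowable = 14.0                # foundation tolerance (assumed)
# Performance function g = allowable - scour; failure when g < 0
p_f = np.mean(scour > allowable)
```

    Each sampled parameter set gives one realization of the performance function; the failure probability is simply the fraction of realizations in which the computed scour exceeds the allowable depth.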

  13. Identifying Sources of Difference in Reliability in Content Analysis

    Elizabeth Murphy


    This paper reports on a case study which identifies and illustrates sources of difference in agreement, in relation to reliability, in a context of quantitative content analysis of a transcript of an online asynchronous discussion (OAD). Transcripts of 10 students in a month-long online asynchronous discussion were coded by two coders using an instrument with two categories, five processes, and 19 indicators of Problem Formulation and Resolution (PFR). Sources of difference were identified in relation to coders, tasks, and students. Reliability values were calculated at the levels of categories, processes, and indicators. At the most detailed level of coding, that of the indicator, findings revealed that the overall level of reliability between coders was .591 when measured with Cohen's kappa. The difference between tasks at the same level ranged from .349 to .664, and the difference between participants ranged from .390 to .907. Implications for training and research are discussed.
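
    Cohen's kappa, the agreement statistic used in the study, corrects observed agreement for the agreement expected by chance. A minimal implementation with hypothetical indicator codes (the labels below are invented, not the study's instrument):

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two coders' nominal codes of the same units:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
    p_e is chance agreement from the coders' marginal code frequencies."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    p_o = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a = Counter(codes_a)
    freq_b = Counter(codes_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes assigned by two coders to the same six units
coder1 = ["PF1", "PF1", "PF2", "PF3", "PF2", "PF1"]
coder2 = ["PF1", "PF2", "PF2", "PF3", "PF2", "PF1"]
kappa = cohens_kappa(coder1, coder2)
```

    With 19 indicators, as in the study, chance agreement p_e is small, so kappa stays close to raw agreement; with few codes the chance correction matters much more.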

  14. Reliability analysis of two unit parallel repairable industrial system

    Mohit Kumar Kakkar


    The aim of this work is to present a reliability and profit analysis of a two-dissimilar-unit parallel system, under the assumptions that an operative unit cannot fail after post-repair inspection and replacement and that there is only one repair facility. Failure and repair times of each unit are assumed to be uncorrelated. Using the regenerative point technique, various reliability characteristics of the system model with repair, inspection and post-repair are obtained which are useful to system designers and industrial managers. The graphical behaviour of the mean time to system failure (MTSF) and of the profit function has also been studied.
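
    For orientation, a much simpler special case than the paper's regenerative-point model has a closed form: two dissimilar parallel units with constant failure rates and no repair give MTSF = 1/λ1 + 1/λ2 − 1/(λ1 + λ2). The rates below are illustrative.

```python
def mtsf_parallel(lam1, lam2):
    """Mean time to system failure for two dissimilar parallel units with
    constant failure rates and no repair: the system survives until the
    later of the two unit failures (a simpler special case than the
    repairable model analysed in the paper)."""
    return 1.0 / lam1 + 1.0 / lam2 - 1.0 / (lam1 + lam2)

# Illustrative failure rates (per hour)
mtsf = mtsf_parallel(0.01, 0.02)
```

    Repair, inspection and post-repair states extend this expression considerably, which is why the paper resorts to the regenerative point technique rather than a closed form.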

  15. Reliability and maintainability analysis of electrical system of drum shearers

    SEYED Hadi Hoseinie; MOHAMMAD Ataei; REZA Khalokakaie; UDAY Kumar


    The reliability and maintainability of the electrical system of the drum shearer at the Parvade.l Coal Mine in central Iran were analyzed. The maintenance and failure data were collected during 19 months of shearer operation. According to trend and serial correlation tests, the data were independent and identically distributed (iid), and therefore statistical techniques were used for modeling. The data analysis shows that the time between failures (TBF) and time to repair (TTR) data obey the lognormal and three-parameter Weibull distributions, respectively. Reliability-based preventive maintenance time intervals for the electrical system of the drum shearer were calculated from the reliability plot. The reliability-based maintenance intervals for the 90%, 80%, 70% and 50% reliability levels are 9.91, 17.96, 27.56 and 56.1 h, respectively. The calculations also show that the time to repair (TTR) of this system varies in the range 0.17-4 h, with a mean time to repair (MTTR) of 1.002 h. There is an 80% chance that a repair of the shearer's electrical system will be accomplished within 1.45 h.
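
    The reliability-based interval calculation reported above inverts the fitted TBF distribution at the target reliability level. For a three-parameter Weibull fit, the interval at reliability R is t_R = γ + η(−ln R)^(1/β). The shape, scale and location values below are illustrative assumptions, not the fitted mine data.

```python
import math

def pm_interval(reliability, beta, eta, gamma=0.0):
    """Time at which a three-parameter Weibull reliability function
    R(t) = exp(-((t - gamma)/eta)**beta) drops to the target level."""
    return gamma + eta * (-math.log(reliability)) ** (1.0 / beta)

# Illustrative parameters: beta (shape), eta (scale, hours), gamma = 0
t90 = pm_interval(0.90, beta=1.3, eta=50.0)
t50 = pm_interval(0.50, beta=1.3, eta=50.0)
```

    As in the paper's table of intervals, lowering the target reliability level lengthens the allowable preventive maintenance interval.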

  16. Analysis of the Reliability of the "Alternator- Alternator Belt" System

    Ivan Mavrin


    Before starting and also during the exploitation of various systems, it is very important to know how the system and its parts will behave during operation regarding breakdowns, i.e. failures. It is possible to predict the service behaviour of a system by determining the functions of reliability, as well as the frequency and intensity of failures. The paper considers the theoretical basics of the functions of reliability, frequency and intensity of failures for the two main approaches: one using 6 equal intervals and the other 13 unequal intervals, for a concrete case taken from practice. The reliability of the "alternator - alternator belt" system installed in buses has been analysed according to the empirical data on failures. The empirical data on failures provide empirical functions of reliability and of the frequency and intensity of failures, which are presented in tables and graphically. The first analysis, performed by dividing the mean time between failures into 6 equal time intervals, has given forms of the empirical functions of failure frequency and intensity that approximately correspond to the typical functions. By dividing the failure phase into 13 unequal intervals with two failures in each interval, these functions indicate explicit transitions from the early failure interval into the random failure interval, i.e. into the ageing interval. The functions thus obtained are more accurate and represent a better solution for the given case. In order to estimate the reliability of these systems with greater accuracy, a greater number of failures needs to be analysed.

  17. Reliability analysis method for slope stability based on sample weight

    Zhi-gang YANG


    The single safety factor criterion for slope stability evaluation, derived from the rigid limit equilibrium method or the finite element method (FEM), may not capture some important information, especially for steep slopes with complex geological conditions. This paper presents a new reliability method that uses sample weight analysis. Based on the distribution characteristics of the random variables, a minimal sample set for every random variable is extracted according to a small-sample t-distribution under a given expected value, and the weight coefficient of each extracted sample is taken as its contribution to the random variable. Then, the weight coefficients of the random sample combinations are determined using the Bayes formula, and different sample combinations are taken as the input for slope stability analysis. Based on the one-to-one mapping between an input sample combination and the output safety factor, the reliability index of slope stability can be obtained with the multiplication principle. Slope stability analysis of the left bank of the Baihetan Project is used as an example, and the results show that the present method is reasonable and practicable for the reliability analysis of steep slopes with complex geological conditions.

  18. Semantic Web for Reliable Citation Analysis in Scholarly Publishing

    Ruben Tous


    Analysis of the impact of scholarly artifacts is constrained by current unreliable practices in cross-referencing, citation discovery, and citation indexing and analysis, which have not kept pace with the technological advances occurring in areas such as knowledge management and security. Because citation analysis has become the primary component in scholarly impact factor calculation, and considering the relevance of this metric within both the scholarly publishing value chain and (especially important) the professional curriculum evaluation of scholarly professionals, we argue that current practices need to be revised. This paper describes a reference architecture that aims to provide openness and reliability to the citation-tracking lifecycle. The solution relies on the use of digitally signed semantic metadata in the different stages of the scholarly publishing workflow, in such a manner that authors, publishers, repositories, and citation-analysis systems have access to independent, reliable evidence that is resistant to forgery, impersonation, and repudiation. As far as we know, this is the first paper to combine Semantic Web technologies and public-key cryptography to achieve reliable citation analysis in scholarly publishing.

  19. A comparative analysis of reliability, maintainability and availability for two alternatives of the production submarine systems: ANM and submarine ducts versus BOP and a subsea well testing tree; Analise comparativa da confiabilidade, mantenabilidade e disponibilidade para duas alternativas de sistemas submarino de producao: ANM e dutos submarinos versus BOP e arvore submarina de teste

    Souza, Arlindo Antonio de; Polillo Filho, Adolfo; Santos, Otto Luiz Alcantara [PETROBRAS, Rio de Janeiro, RJ (Brazil)


    This technical article presents a study using the concepts of reliability engineering and risk analysis, with the objective of making a comparative evaluation of the reliability of two alternative production systems for a marine well: one composed of a wet christmas tree (ANM) producing through underwater ducts (flow lines), and another, usually used in long-duration tests, using a subsea BOP and a subsea well testing tree (AST). The central point of the work was the evaluation of the probability of occurrence of an event considered critical, denominated 'critical failure', during the well's production life. The work uses one of the procedures and methodologies adopted in Well Construction Engineering, GERISK, together with four computer applications for data treatment, generation of failure distribution curves and times of repair, modelling, and Monte Carlo simulation. The adopted strategy was to start from existing reports, assume an interval for the possible real value of the relevant parameters, and then establish the scenarios (most probable, optimistic and pessimistic). The simulations were made based on those scenarios, the assumed premises, the modelling, and the reliabilities obtained for each one of the variables. As results, the mean availability, the MTTFF (Mean Time To First Failure), the number of failures and the expected costs are presented. The work also presents a sensitivity analysis with respect to the production time of the well. (author)



    S.A. Mohammadi


    Mean shift algorithms are among the most functional tracking methods; they are accurate and computationally fairly simple. Different versions of this algorithm have been developed, which differ in template updating and in their window sizes. To measure the reliability and accuracy of these methods, one would normally rely on visual results or on the number of iterations. In this paper we introduce two new parameters which can be used to compare the algorithms, especially when their results are close to each other.

  2. Fatigue Reliability Analysis of a Mono-Tower Platform

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard; Brincker, Rune


    In this paper, a fatigue reliability analysis of a Mono-tower platform is presented. The failure mode, fatigue failure in the butt welds, is investigated with two different models. The one with the fatigue strength expressed through SN relations, the other with the fatigue strength expressed thro...... of the natural period, damping ratio, current, stress spectrum and parameters describing the fatigue strength. Further, soil damping is shown to be significant for the Mono-tower.......In this paper, a fatigue reliability analysis of a Mono-tower platform is presented. The failure mode, fatigue failure in the butt welds, is investigated with two different models. The one with the fatigue strength expressed through SN relations, the other with the fatigue strength expressed...

  3. Analysis of Gumbel Model for Software Reliability Using Bayesian Paradigm

    Raj Kumar


    In this paper, we illustrate the suitability of the Gumbel model for software reliability data. The model parameters are estimated using likelihood-based inferential procedures, both classical and Bayesian. The quasi Newton-Raphson algorithm is applied to obtain the maximum likelihood estimates and associated probability intervals. The Bayesian estimates of the parameters of the Gumbel model are obtained using the Markov Chain Monte Carlo (MCMC) simulation method in OpenBUGS (established software for Bayesian analysis using Markov Chain Monte Carlo methods). R functions are developed to study the statistical properties, model validation and comparison tools of the model, and to analyse the output of the MCMC samples generated from OpenBUGS. Details of applying MCMC to parameter estimation for the Gumbel model are elaborated, and a real software reliability data set is considered to illustrate the methods of inference discussed in this paper.
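
    As a small illustration of the estimation step, the method-of-moments formulas commonly used to seed a quasi-Newton search for the Gumbel maximum likelihood estimates can be coded directly. The synthetic data below are an assumption, not the paper's software reliability data set.

```python
import math, random, statistics

def gumbel_moment_estimates(data):
    """Method-of-moments estimates for the Gumbel (largest extreme value)
    location and scale: scale = s*sqrt(6)/pi, loc = mean - 0.5772*scale.
    Useful as starting values for a quasi-Newton MLE search."""
    mean = statistics.fmean(data)
    sd = statistics.stdev(data)
    scale = sd * math.sqrt(6.0) / math.pi
    loc = mean - 0.5772156649 * scale   # Euler-Mascheroni constant
    return loc, scale

# Synthetic Gumbel(loc=10, scale=2) sample via the inverse CDF
# F(x) = exp(-exp(-(x - loc)/scale))  =>  x = loc - scale*ln(-ln(u))
random.seed(1)
data = [10.0 - 2.0 * math.log(-math.log(random.random()))
        for _ in range(20_000)]
loc_hat, scale_hat = gumbel_moment_estimates(data)
```

    In practice these moment estimates are refined by the quasi Newton-Raphson iteration on the log-likelihood, or serve as initial values for the MCMC chains.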

  4. Reliability analysis method applied in slope stability: slope prediction and forecast on stability analysis

    Wenjuan ZHANG; Li CHEN; Ning QU; Hai'an LIANG


    Landslides are a kind of geologic hazard that occurs all over the world and brings huge losses to human life and property; researching them is therefore very important. This study focused on combining single and regional landslide analysis, and on combining the traditional slope stability analysis method with the reliability analysis method. Methods for slope prediction and forecasting and for reliability analysis were also discussed.


    彭世济; 卢明银; 张达贤


    The Chinese national document "The Economical Appraisal Methods for Construction Projects" stipulates that dynamic analysis should dominate project economic appraisal methods. This paper sets up a dynamic investment forecast model for the Yuanbaoshan Surface Coal Mine. Based on this model, the investment reliability has been analysed using simulation and analytic methods, and the probability that the designed internal rate of return can reach 8.4% has also been studied from an economic point of view.
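
    The kind of probability statement made above, the chance that the internal rate of return reaches 8.4%, can be estimated by simulation. The cash-flow model and distributions below are illustrative assumptions, not the Yuanbaoshan data; the IRR is found by bisection on the net present value.

```python
import random

def irr(cashflows, lo=-0.9, hi=1.0, tol=1e-8):
    """Internal rate of return by bisection on NPV.
    Assumes an investment-type profile (one sign change), so NPV
    decreases monotonically in the discount rate over [lo, hi]."""
    def npv(r):
        return sum(cf / (1.0 + r) ** t for t, cf in enumerate(cashflows))
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if npv(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

random.seed(0)
target, hits, trials = 0.084, 0, 2000
for _ in range(trials):
    capex = random.gauss(1000.0, 100.0)   # uncertain initial investment
    annual = random.gauss(180.0, 20.0)    # uncertain annual net revenue
    flows = [-capex] + [annual] * 10      # 10 years of production
    if irr(flows) >= target:
        hits += 1
p_reach = hits / trials   # estimated probability IRR >= 8.4%
```

    Each trial draws one possible investment outcome; the fraction of trials whose IRR clears the 8.4% threshold estimates the reliability of the designed rate of return.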

  6. Analysis and Reliability Performance Comparison of Different Facial Image Features

    J. Madhavan


    This study performs reliability analysis of different facial features, with weighted retrieval accuracy, on facial databases of increasing size. Many of the methods analyzed in existing papers, as mentioned in the literature review, use facial databases of constant size, and little work has been carried out to study performance in terms of reliability or to examine how a method performs as the size of the database increases. In this study, certain feature extraction methods were analyzed on the regular performance measure, and the performance measures were also modified to fit real-time requirements by assigning weights to the closer matches. Four facial feature extraction methods are evaluated: DWT with PCA, LWT with PCA, HMM with SVD, and Gabor wavelet with HMM. The reliability of these methods is analyzed and reported. Among these methods, Gabor wavelet with HMM gives higher reliability than the other three. Experiments are carried out to evaluate the proposed approach on the Olivetti Research Laboratory (ORL) face database.

  7. Reliability analysis for new technology-based transmitters

    Brissaud, Florent, E-mail: florent.brissaud.2007@utt.f [Institut National de l' Environnement Industriel et des Risques (INERIS), Parc Technologique Alata, BP 2, 60550 Verneuil-en-Halatte (France); Universite de Technologie de Troyes (UTT), Institut Charles Delaunay (ICD) and STMR UMR CNRS 6279, 12 rue Marie Curie, BP 2060, 10010 Troyes cedex (France); Barros, Anne; Berenguer, Christophe [Universite de Technologie de Troyes (UTT), Institut Charles Delaunay (ICD) and STMR UMR CNRS 6279, 12 rue Marie Curie, BP 2060, 10010 Troyes cedex (France); Charpentier, Dominique [Institut National de l' Environnement Industriel et des Risques (INERIS), Parc Technologique Alata, BP 2, 60550 Verneuil-en-Halatte (France)


    The reliability analysis of new technology-based transmitters has to deal with specific issues: various interactions between both material elements and functions, undefined behaviours under faulty conditions, several transmitted data, and little reliability feedback. To handle these particularities, a '3-step' model is proposed, based on goal tree-success tree (GTST) approaches to represent both the functional and material aspects, and includes the faults and failures as a third part for supporting reliability analyses. The behavioural aspects are provided by relationship matrices, also denoted master logic diagrams (MLD), with stochastic values which represent direct relationships between system elements. Relationship analyses are then proposed to assess the effect of any fault or failure on any material element or function. Taking these relationships into account, the probabilities of malfunction and failure modes are evaluated according to time. Furthermore, uncertainty analyses tend to show that even if the input data and system behaviour are not well known, these previous results can be obtained in a relatively precise way. An illustration is provided by a case study on an infrared gas transmitter. These properties make the proposed model and corresponding reliability analyses especially suitable for intelligent transmitters (or 'smart sensors').

  8. Familiarization, reliability, and comparability of a 40-m maximal shuttle run test.

    Glaister, Mark; Hauck, Hanna; Abraham, Corinne S; Merry, Kevin L; Beaver, Dean; Woods, Bernadette; McInnes, Gillian


    The aims of this study were to examine familiarization and reliability associated with a 40-m maximal shuttle run test (40-m MST), and to compare performance measures from the test with those of a typical unidirectional multiple sprint running test (UMSRT). 12 men and 4 women completed four trials of the 40-m MST (8 × 40-m; 20 s rest periods), followed by one trial of a UMSRT (12 × 30-m; repeated every 35 s), with seven days between trials. All trials were conducted indoors and performance times were recorded via twin-beam photocells. Significant between-trial differences in mean 40-m MST times were indicative of learning effects between trials 1 and 2. Test-retest reliability across the remaining trials, as determined by coefficient of variation (CV) and intraclass correlation coefficient (ICC), revealed: a) very good reliability for measures of fastest and mean shuttle time (CV = 1.1 - 1.3%; ICC = 0.91 - 0.92); b) good reliability for measures of blood lactate (CV = 10.1 - 23.9%; ICC = 0.74 - 0.82) and ratings of perceived exertion (CV = 5.3 - 7.6%; ICC = 0.79 - 0.84); and c) poor reliability for measures of fatigue (CV = 38.7%; ICC = 0.59). Comparisons between performance indices of the 40-m MST and the UMSRT revealed significant correlations between all measures, except pre-test blood lactate concentration (r = 0.47). Whilst the 40-m MST does not appear to provide more information than can be gleaned from a typical UMSRT, following the completion of a familiarization trial the 40-m MST provides an alternative and, except for fatigue measures, reliable means of evaluating repeated sprint ability. Key points: Tests of multiple sprint performance are a popular means of evaluating repeated sprint ability. Multiple sprint tests incorporating changes of direction may be more ecologically valid than unidirectional protocols. The 40-m maximal shuttle run test is a reliable way of evaluating repeated sprint ability following the completion of one familiarization trial.

  9. Reliability of fixed and jack-up structures: a comparative study

    Morandi, A.C.; Frieze, P.A. [MSL Engineering, Sunninghill, Ascot (United Kingdom); Birkinshaw, M. [Health and Safety Executive, London (United Kingdom); Smith, D.; Dixon, A.T. [Health and Safety Executive, Bootle (United Kingdom)


    This paper presents the results of a comparison between the structural reliability of a jacket designed to the limit of API RP 2A-LRFD and of a jack-up designed to the limit of the SNAME T and R Bulletin 5-5A. Both platforms were assumed to be operating at the same location and water depth when evaluating metocean and geotechnical data. Component strength was evaluated on the basis of the latest ISO formulations, and system strength was evaluated by pushover analysis using CAP/SeaStar. Reliability was evaluated using the response surface method. It was found that, for both platforms, the failure probability at system level was about an order of magnitude smaller than at component level. The jack-up critical failure probabilities tended to be about an order of magnitude greater than the corresponding jacket results. (Author)

  10. Mutation Analysis Approach to Develop Reliable Object-Oriented Software

    Monalisa Sarma


    In general, modern programs are large and complex, and it is essential that they be highly reliable in applications. To support the development of highly reliable software, the Java programming language provides a rich set of exceptions and exception handling mechanisms, which are intended to help developers build robust programs. Given a program with exception handling constructs, for effective testing we need to detect whether all possible exceptions are raised and caught or not. However, complex exception handling constructs make it tedious to trace which exceptions are handled where, and which exceptions are passed on. In this paper, we address this problem and propose a mutation analysis approach to develop reliable object-oriented programs. We have applied a number of mutation operators to create a large set of mutant programs with different types of faults. We then generate test cases and test data to uncover exception-related faults. The test suite so obtained is applied to the mutant programs, measuring the mutation score and hence verifying the effectiveness of the test suite. We have tested our approach with a number of case studies to substantiate the efficacy of the proposed mutation analysis technique.
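
    The mutation-score measurement described above can be reduced to a few lines. The Java programs of the paper are replaced here by a toy Python function and hand-made mutants (all invented for illustration); note the deliberately equivalent mutant, which no test can kill.

```python
def mutation_score(original, mutants, tests):
    """Mutation score = killed mutants / total mutants.  A mutant is
    killed when at least one test observes a different outcome on the
    mutant than on the original program."""
    killed = sum(
        any(t(mutant) != t(original) for t in tests) for mutant in mutants
    )
    return killed / len(mutants)

def add(a, b):                   # program under test
    return a + b

# Hand-made mutants: operator replacements, plus one equivalent mutant
mutants = [
    lambda a, b: a - b,          # '+' replaced by '-'
    lambda a, b: a * b,          # '+' replaced by '*'
    lambda a, b: b + a,          # equivalent mutant: can never be killed
]
tests = [lambda f: f(2, 3), lambda f: f(0, 1)]
score = mutation_score(add, mutants, tests)   # 2 of 3 mutants killed
```

    A low score signals that the test suite misses faults of the injected kinds; in the paper, the same measurement is used to judge how well generated tests expose exception-related faults.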

  11. Strength Reliability Analysis of Stiffened Cylindrical Shells Considering Failure Correlation

    Xu Bai; Liping Sun; Wei Qin; Yongkun Lv


The stiffened cylindrical shell is commonly used for the pressure hulls of submersibles and the legs of offshore platforms. Various failure modes arise because of uncertainty in structural dimensions and material properties, uncertainty in the calculation model, and machining errors. Correlations among failure modes must therefore be considered in the structural reliability analysis of stiffened cylindrical shells. However, the traditional method cannot consider these correlations effectively. The aim of this study is to present a reliability analysis method for stiffened cylindrical shells that considers the correlations among failure modes. Firstly, the joint failure probability of two related failure modes is derived using the 2D joint probability density function. Secondly, the full probability formula for the tandem (series) structural system is given with consideration of the correlations among failure modes. Finally, the accuracy of the system reliability calculation is verified by Monte Carlo simulation. The analysis shows that the failure probability of stiffened cylindrical shells can be obtained by summing the failure probabilities of the individual modes.
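The Monte Carlo verification step can be sketched as follows, using two hypothetical linear limit-state functions on correlated standard-normal variables (the thresholds and correlation are illustrative only, not the paper's shell model):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Two correlated standard-normal variables driving two failure modes.
rho = 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])
u = rng.multivariate_normal([0.0, 0.0], cov, size=n)

# Hypothetical limit-state functions: g_i <= 0 means mode i fails.
g1 = 2.5 - u[:, 0]
g2 = 3.0 - u[:, 1]

fail1 = g1 <= 0
fail2 = g2 <= 0

p1 = fail1.mean()
p2 = fail2.mean()
p_joint = (fail1 & fail2).mean()
# Series (tandem) system: the system fails if ANY mode fails.
p_sys = (fail1 | fail2).mean()

# Full-probability identity for two correlated modes:
# P(sys) = P1 + P2 - P(1 and 2)
assert abs(p_sys - (p1 + p2 - p_joint)) < 1e-12
print(p1, p2, p_joint, p_sys)
```

The positive correlation makes the joint term non-negligible, which is exactly why summing mode probabilities without it would overestimate system failure probability.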

  12. Comparative Analysis of Multistage Interconnection Networks.


increasing-failure-rate (IFR) lifetime distributions. Exact Reliability Analysis: Let r_SE(t) be the time-dependent reliability of the basic...elements in series. Hence, the reliability of an N x N SEN is given by R_SEN(t) = [r_SE(t)]^(NL/2) (5.4), where L = log2(N) is the number of stages. For the 4 x 4 SEN, it is clear that R_SEN(t) = [r_SE(t)]^4 (5.5), since there are four identical SEs. The 4 x 4 SEN+ has six SEs; two in each of three stages. The four SEs which comprise
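Assuming exponentially distributed switching-element lifetimes (a constant failure rate, used here purely for illustration; IFR distributions would need a different r_SE(t)), the series-system formula gives a one-line reliability function:

```python
import math

def sen_reliability(t, lam, n):
    """Reliability of an N x N shuffle-exchange network whose
    (N/2)*log2(N) identical switching elements fail independently
    with exponential lifetimes: R_SEN(t) = r_SE(t) ** num_se."""
    stages = int(math.log2(n))      # L = log2(N) stages
    num_se = (n // 2) * stages      # NL/2 switching elements in series
    r_se = math.exp(-lam * t)       # reliability of one SE at time t
    return r_se ** num_se

# 4 x 4 SEN: four SEs, so R_SEN = r_SE^4, matching equation (5.5).
r = sen_reliability(t=100.0, lam=1e-3, n=4)
print(round(r, 4))  # → 0.6703, i.e. exp(-0.4)
```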


    Batrancea Ioan


Full Text Available Banks in Romania offer their customers a wide range of products, which also involves risk taking. Researchers therefore seek to build rating models to help bank managers assess the risk of non-recovery of loans and interest. In the following we highlight the ratings of Raiffeisen Bank, BCR-ERSTE Bank and Transilvania Bank based on the CAAMPL and Stickney models, making a comparative analysis of the two rating models.

  14. Reliability Analysis of Penetration Systems Using Nondeterministic Methods



    Device penetration into media such as metal and soil is an application of some engineering interest. Often, these devices contain internal components and it is of paramount importance that all significant components survive the severe environment that accompanies the penetration event. In addition, the system must be robust to perturbations in its operating environment, some of which exhibit behavior which can only be quantified to within some level of uncertainty. In the analysis discussed herein, methods to address the reliability of internal components for a specific application system are discussed. The shock response spectrum (SRS) is utilized in conjunction with the Advanced Mean Value (AMV) and Response Surface methods to make probabilistic statements regarding the predicted reliability of internal components. Monte Carlo simulation methods are also explored.

  15. Dynamic Scapular Movement Analysis: Is It Feasible and Reliable in Stroke Patients during Arm Elevation?

    De Baets, Liesbet; Van Deun, Sara; Desloovere, Kaat; Jaspers, Ellen


    Knowledge of three-dimensional scapular movements is essential to understand post-stroke shoulder pain. The goal of the present work is to determine the feasibility and the within and between session reliability of a movement protocol for three-dimensional scapular movement analysis in stroke patients with mild to moderate impairment, using an optoelectronic measurement system. Scapular kinematics of 10 stroke patients and 10 healthy controls was recorded on two occasions during active anteflexion and abduction from 0° to 60° and from 0° to 120°. All tasks were executed unilaterally and bilaterally. The protocol’s feasibility was first assessed, followed by within and between session reliability of scapular total range of motion (ROM), joint angles at start position and of angular waveforms. Additionally, measurement errors were calculated for all parameters. Results indicated that the protocol was generally feasible for this group of patients and assessors. Within session reliability was very good for all tasks. Between sessions, scapular angles at start position were measured reliably for most tasks, while scapular ROM was more reliable during the 120° tasks. In general, scapular angles showed higher reliability during anteflexion compared to abduction, especially for protraction. Scapular lateral rotations resulted in smallest measurement errors. This study indicates that scapular kinematics can be measured reliably and with precision within one measurement session. In case of multiple test sessions, further methodological optimization is required for this protocol to be suitable for clinical decision-making and evaluation of treatment efficacy. PMID:24244414

  16. Dynamic scapular movement analysis: is it feasible and reliable in stroke patients during arm elevation?

    Liesbet De Baets

Full Text Available Knowledge of three-dimensional scapular movements is essential to understand post-stroke shoulder pain. The goal of the present work is to determine the feasibility and the within and between session reliability of a movement protocol for three-dimensional scapular movement analysis in stroke patients with mild to moderate impairment, using an optoelectronic measurement system. Scapular kinematics of 10 stroke patients and 10 healthy controls was recorded on two occasions during active anteflexion and abduction from 0° to 60° and from 0° to 120°. All tasks were executed unilaterally and bilaterally. The protocol's feasibility was first assessed, followed by within and between session reliability of scapular total range of motion (ROM), joint angles at start position and of angular waveforms. Additionally, measurement errors were calculated for all parameters. Results indicated that the protocol was generally feasible for this group of patients and assessors. Within session reliability was very good for all tasks. Between sessions, scapular angles at start position were measured reliably for most tasks, while scapular ROM was more reliable during the 120° tasks. In general, scapular angles showed higher reliability during anteflexion compared to abduction, especially for protraction. Scapular lateral rotations resulted in smallest measurement errors. This study indicates that scapular kinematics can be measured reliably and with precision within one measurement session. In case of multiple test sessions, further methodological optimization is required for this protocol to be suitable for clinical decision-making and evaluation of treatment efficacy.
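Between-session reliability of this kind is typically summarized with an intraclass correlation coefficient and a standard error of measurement; a minimal sketch, using hypothetical ROM values rather than the study's data, might look like:

```python
import numpy as np

def icc_2_1(x):
    """Two-way random-effects, absolute-agreement ICC(2,1).
    x is an (n subjects, k sessions) array of scores."""
    n, k = x.shape
    grand = x.mean()
    row_m = x.mean(axis=1)                    # per-subject means
    col_m = x.mean(axis=0)                    # per-session means
    ssr = k * ((row_m - grand) ** 2).sum()    # between-subject sum of squares
    ssc = n * ((col_m - grand) ** 2).sum()    # between-session sum of squares
    sse = ((x - grand) ** 2).sum() - ssr - ssc
    msr = ssr / (n - 1)
    msc = ssc / (k - 1)
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical scapular ROM values (degrees): 5 subjects x 2 sessions.
rom = np.array([[34.1, 35.0],
                [28.7, 29.5],
                [41.2, 40.1],
                [30.3, 31.4],
                [37.8, 38.2]])
icc = icc_2_1(rom)
# Standard error of measurement from the session-to-session differences.
sem = float(np.std(rom[:, 0] - rom[:, 1], ddof=1) / np.sqrt(2))
print(round(icc, 3), round(sem, 2))  # → 0.983 0.63
```

Here the between-subject spread dwarfs the session-to-session differences, so the ICC is high and the SEM small; the reverse pattern would flag poor between-session reliability.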

  17. Reliability and risk analysis data base development: an historical perspective

    Fragola, Joseph R


    Collection of empirical data and data base development for use in the prediction of the probability of future events has a long history. Dating back at least to the 17th century, safe passage events and mortality events were collected and analyzed to uncover prospective underlying classes and associated class attributes. Tabulations of these developed classes and associated attributes formed the underwriting basis for the fledgling insurance industry. Much earlier, master masons and architects used design rules of thumb to capture the experience of the ages and thereby produce structures of incredible longevity and reliability (Antona, E., Fragola, J. and Galvagni, R. Risk based decision analysis in design. Fourth SRA Europe Conference Proceedings, Rome, Italy, 18-20 October 1993). These rules served so well in producing robust designs that it was not until almost the 19th century that the analysis (Charlton, T.M., A History Of Theory Of Structures In The 19th Century, Cambridge University Press, Cambridge, UK, 1982) of masonry voussoir arches, begun by Galileo some two centuries earlier (Galilei, G. Discorsi e dimostrazioni mathematiche intorno a due nuove science, (Discourses and mathematical demonstrations concerning two new sciences, Leiden, The Netherlands, 1638), was placed on a sound scientific basis. Still, with the introduction of new materials (such as wrought iron and steel) and the lack of theoretical knowledge and computational facilities, approximate methods of structural design abounded well into the second half of the 20th century. To this day structural designers account for material variations and gaps in theoretical knowledge by employing factors of safety (Benvenuto, E., An Introduction to the History of Structural Mechanics, Part II: Vaulted Structures and Elastic Systems, Springer-Verlag, NY, 1991) or codes of practice (ASME Boiler and Pressure Vessel Code, ASME, New York) originally developed in the 19th century (Antona, E., Fragola, J. 
and


    Dustin Lawrence


Full Text Available The purpose of this study was to inform decision makers at state and local levels, as well as property owners, about the amount of water that can be supplied by rainwater harvesting systems in Texas, so that it may be included in any future planning. The reliability of a rainwater tank is important because people want to know that a source of water can be depended on. Performance analyses were conducted on rainwater harvesting tanks for three Texas cities under different rainfall conditions and multiple scenarios to demonstrate the importance of optimizing rainwater tank design. Reliability curves were produced and reflect the percentage of days in a year that water can be supplied by a tank. Operational thresholds were reached in all scenarios and mark the point at which reliability increases by only 2% or less with an increase in tank size. A payback period analysis was conducted on tank sizes to estimate the amount of time it would take to recoup the cost of installing a rainwater harvesting system.
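A daily water-balance simulation of the kind behind such reliability curves can be sketched as follows (roof area, demand, runoff coefficient, and the synthetic rainfall series are all hypothetical):

```python
import random

def tank_reliability(daily_rain_m, roof_area_m2, tank_cap_l, demand_l,
                     runoff_coeff=0.85):
    """Fraction of days on which the tank fully meets daily demand.
    Daily mass balance: inflow (litres) = rain * roof area * runoff * 1000."""
    storage, days_met = 0.0, 0
    for rain in daily_rain_m:
        storage = min(tank_cap_l,
                      storage + rain * roof_area_m2 * runoff_coeff * 1000.0)
        if storage >= demand_l:
            storage -= demand_l
            days_met += 1
        else:
            storage = 0.0          # day of partial supply is not counted
    return days_met / len(daily_rain_m)

# Synthetic rainfall: ~25% wet days, ~10 mm per wet day (about 0.9 m/yr).
random.seed(1)
rain = [random.expovariate(1 / 0.01) if random.random() < 0.25 else 0.0
        for _ in range(365)]

# Reliability grows with tank size but flattens past an operational threshold.
rels = [tank_reliability(rain, 100.0, cap, 200.0)
        for cap in (1000, 5000, 10000, 20000)]
print([round(r, 2) for r in rels])
```

Sweeping tank capacities this way reproduces the shape of a reliability curve; the point where a further size increment adds roughly 2% reliability or less corresponds to the operational threshold described in the abstract.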

  19. A Bayesian Framework for Reliability Analysis of Spacecraft Deployments

    Evans, John W.; Gallo, Luis; Kaminsky, Mark


Deployable subsystems are essential to mission success of most spacecraft. These subsystems enable critical functions including power, communications and thermal control. The loss of any of these functions will generally result in loss of the mission. These subsystems and their components often consist of unique designs and applications for which various standardized data sources are not applicable for estimating reliability and for assessing risks. In this study, a two-stage sequential Bayesian framework for reliability estimation of spacecraft deployments was developed for this purpose. This process was then applied to the James Webb Space Telescope (JWST) Sunshield subsystem, a unique design intended for thermal control of the Optical Telescope Element. Initially, detailed studies of NASA deployment history, "heritage information", were conducted, extending over 45 years of spacecraft launches. This information was then coupled to a non-informative prior and a binomial likelihood function to create a posterior distribution for deployments of various subsystems using Markov chain Monte Carlo (MCMC) sampling. Select distributions were then coupled to a subsequent analysis, using test data and anomaly occurrences on successive ground test deployments of scale model test articles of JWST hardware, to update the NASA heritage data. This allowed for a realistic prediction of the reliability of the complex Sunshield deployment, with credibility limits, within this two-stage Bayesian framework.
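With a binomial likelihood and a conjugate Beta prior, a two-stage update of this general shape can be written in closed form, which makes for a compact sketch of the framework's logic (the success counts below are hypothetical, not JWST data; the actual study used MCMC sampling rather than conjugate updating):

```python
import random

# Stage 1: "heritage" deployment data, e.g. 198 successes in 200 attempts
# (hypothetical counts), updating a Jeffreys non-informative prior
# Beta(0.5, 0.5) through a binomial likelihood.
a0, b0 = 0.5, 0.5
succ1, n1 = 198, 200
a1, b1 = a0 + succ1, b0 + (n1 - succ1)

# Stage 2: ground-test deployments of the scale-model article
# (again hypothetical counts) update the stage-1 posterior.
succ2, n2 = 24, 25
a2, b2 = a1 + succ2, b1 + (n2 - succ2)

post_mean = a2 / (a2 + b2)

# Credibility limits by sampling the posterior Beta distribution.
random.seed(0)
draws = sorted(random.betavariate(a2, b2) for _ in range(20_000))
lo, hi = draws[int(0.05 * len(draws))], draws[int(0.95 * len(draws))]
print(f"reliability ~ {post_mean:.4f}, 90% credible interval "
      f"({lo:.4f}, {hi:.4f})")
```

The stage-1 posterior simply becomes the stage-2 prior, which is the essence of the sequential scheme described above.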

  20. A Research Roadmap for Computation-Based Human Reliability Analysis

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)


The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  1. Reliability and risk analysis using artificial neural networks

    Robinson, D.G. [Sandia National Labs., Albuquerque, NM (United States)


    This paper discusses preliminary research at Sandia National Laboratories into the application of artificial neural networks for reliability and risk analysis. The goal of this effort is to develop a reliability based methodology that captures the complex relationship between uncertainty in material properties and manufacturing processes and the resulting uncertainty in life prediction estimates. The inputs to the neural network model are probability density functions describing system characteristics and the output is a statistical description of system performance. The most recent application of this methodology involves the comparison of various low-residue, lead-free soldering processes with the desire to minimize the associated waste streams with no reduction in product reliability. Model inputs include statistical descriptions of various material properties such as the coefficients of thermal expansion of solder and substrate. Consideration is also given to stochastic variation in the operational environment to which the electronic components might be exposed. Model output includes a probabilistic characterization of the fatigue life of the surface mounted component.

  2. Limits of reliability of optical properties of commercial glass in Mexico, a comparative analysis with experimental results; Limites de confiabilidad de propiedades opticas de vidrios comerciales en mexico, analisis comparativo con resultados experimentales

    Barrios Rodriguez, Pilar; Dorantes Rodriguez, Ruben J. [Universidad Autonoma Metropolitana Azcapotzalco, Mexico, D.F. (Mexico)


Heat transfer through buildings has increased with the intensive use of glass in the building envelope; this situation has increased the demand for electrical energy to compensate for heat gains or losses. Energy use in buildings is responsible for some 50% of CO{sub 2} emissions in many countries, so a thermal design of the envelope that makes rational use of energy is necessary. Glass is an important material in the building envelope, so precise values of its properties are essential for envelope design. The optical properties of several building glasses were evaluated experimentally and the measured values were compared with the manufacturers' reported values and tolerance limits. [Spanish] Solar heat gain inside buildings has increased with the current trend toward intensive and extensive use of glass in the envelope, which has increased the need for electrical energy to compensate, through artificial climate control, for thermal gains and/or losses in interior spaces. Energy use in buildings is responsible for about 50% of CO{sub 2} emissions in several countries, so a thermal design of the envelope that contemplates rational use of energy is necessary. Given the thermal importance of glass in the building envelope and the need for property values as precise as possible for its design, the thermal behavior of several glasses with both actual and potential use in the buildings of our country was evaluated experimentally, and the values obtained were compared with the values reported by the manufacturers within tolerance or reliability limits.

  3. Fifty Years of THERP and Human Reliability Analysis

    Ronald L. Boring


    In 1962 at a Human Factors Society symposium, Alan Swain presented a paper introducing a Technique for Human Error Rate Prediction (THERP). This was followed in 1963 by a Sandia Laboratories monograph outlining basic human error quantification using THERP and, in 1964, by a special journal edition of Human Factors on quantification of human performance. Throughout the 1960s, Swain and his colleagues focused on collecting human performance data for the Sandia Human Error Rate Bank (SHERB), primarily in connection with supporting the reliability of nuclear weapons assembly in the US. In 1969, Swain met with Jens Rasmussen of Risø National Laboratory and discussed the applicability of THERP to nuclear power applications. By 1975, in WASH-1400, Swain had articulated the use of THERP for nuclear power applications, and the approach was finalized in the watershed publication of the NUREG/CR-1278 in 1983. THERP is now 50 years old, and remains the most well known and most widely used HRA method. In this paper, the author discusses the history of THERP, based on published reports and personal communication and interviews with Swain. The author also outlines the significance of THERP. The foundations of human reliability analysis are found in THERP: human failure events, task analysis, performance shaping factors, human error probabilities, dependence, event trees, recovery, and pre- and post-initiating events were all introduced in THERP. While THERP is not without its detractors, and it is showing signs of its age in the face of newer technological applications, the longevity of THERP is a testament of its tremendous significance. THERP started the field of human reliability analysis. This paper concludes with a discussion of THERP in the context of newer methods, which can be seen as extensions of or departures from Swain’s pioneering work.

  4. Reliability and Robustness Analysis of the Masinga Dam under Uncertainty

    Hayden Postle-Floyd


Full Text Available Kenya’s water abstraction must meet the projected growth in municipal and irrigation demand by the end of 2030 in order to achieve the country’s industrial and economic development plan. The Masinga dam, on the Tana River, is the key to meeting this goal, satisfying the growing demands whilst also continuing to provide hydroelectric power generation. This study quantitatively assesses the reliability and robustness of the Masinga dam system under uncertain future supply and demand using probabilistic climate and population projections, and examines how long-term planning may improve the longevity of the dam. River flow and demand projections are used alongside each other as inputs to the dam system simulation model linked to an optimisation engine to maximise water availability. Water availability after demand satisfaction is assessed for future years, and the projected reliability of the system is calculated for selected years. The analysis shows that maximising power generation on a short-term year-by-year basis achieves 80%, 50% and 1% reliability by 2020, 2025 and 2030 onwards, respectively. Longer-term optimal planning, however, increases system reliability to up to 95% in 2020, 80% in 2025, and more than 40% in 2030 onwards. In addition, increasing the capacity of the reservoir by around 25% can significantly improve the robustness of the system for all future time periods. This study provides a platform for analysing the implications of different planning and management options for the Masinga dam and suggests that careful consideration should be given to growing municipal needs and irrigation schemes both in the immediate area and in the wider Tana River basin.

  5. Human Performance Modeling for Dynamic Human Reliability Analysis

    Boring, Ronald Laurids [Idaho National Laboratory; Joe, Jeffrey Clark [Idaho National Laboratory; Mandelli, Diego [Idaho National Laboratory


Part of the U.S. Department of Energy's (DOE's) Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk framework. In this paper, we review simulation-based and non-simulation-based human reliability analysis (HRA) methods. This paper summarizes the foundational information needed to develop a feasible approach to modeling human interactions in RISMC simulations.

  6. Reliability Analysis of a Mono-Tower Platform

    Kirkegaard, Poul Henning; Enevoldsen, I.; Sørensen, John Dalsgaard;

In this paper a reliability analysis of a Mono-tower platform is presented. The failure modes considered are yielding in the tube cross-sections and fatigue failure in the butt welds. The fatigue failure mode is investigated with a fatigue model, where the fatigue strength is expressed through SN...... for the fatigue limit state is a significant failure mode for the Mono-tower platform. Further, it is shown for the fatigue failure mode that the largest contributions to the overall uncertainty are due to the damping ratio, the inertia coefficient, the stress concentration factor, the model uncertainties...

  7. Reliability Analysis of a Mono-Tower Platform

    Kirkegaard, Poul Henning; Enevoldsen, I.; Sørensen, John Dalsgaard;


In this paper, a reliability analysis of a Mono-tower platform is presented. The failure modes considered are yielding in the tube cross sections and fatigue failure in the butt welds. The fatigue failure mode is investigated with a fatigue model, where the fatigue strength is expressed through SN...... that the fatigue limit state is a significant failure mode for the Mono-tower platform. Further, it is shown for the fatigue failure mode that the largest contributions to the overall uncertainty are due to the damping ratio, the inertia coefficient, the stress concentration factor, the model uncertainties...


    G. W. Parry; J.A Forester; V.N. Dang; S. M. L. Hendrickson; M. Presley; E. Lois; J. Xing


    This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System) that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA) that is based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), and the development of the associated time-line to identify the critical tasks, i.e. those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.

  9. Integration of human reliability analysis into the high consequence process

    Houghton, F.K.; Morzinski, J.


When performing a hazards analysis (HA) for a high consequence process, human error often plays a significant role. In order to integrate human error into the hazards analysis, a human reliability analysis (HRA) is performed. Human reliability is the probability that a person will correctly perform a system-required activity in a required time period and will perform no extraneous activity that will affect the correct performance. Even though human error is a very complex subject that can only approximately be addressed in risk assessment, an attempt must be made to estimate the effect of human errors. The HRA provides data that can be incorporated in the hazard analysis event. This paper will discuss the integration of HRA into an HA for the disassembly of a high explosive component. The process was designed to use a retaining fixture to hold the high explosive in place during a rotation of the component. This tool was designed as a redundant safety feature to help prevent a drop of the explosive. This paper will use the retaining fixture to demonstrate the following HRA methodology's phases. The first phase is to perform a task analysis. The second phase is the identification of the potential human functions, both cognitive and psychomotor, performed by the worker. During the last phase the human errors are quantified. In reality, the HRA process is an iterative process in which the stages overlap and information gathered in one stage may be used to refine a previous stage. The rationale for the decision to use or not use the retaining fixture, and the role the HRA played in that decision, will be discussed.

  10. Reliability Index for Reinforced Concrete Frames using Nonlinear Pushover and Dynamic Analysis

    Ahmad A. Fallah


Full Text Available In conventional design and analysis methods, the governing parameters (loads, material strengths, etc.) are not treated as random variables. Safety factors in the current codes and standards are usually obtained on the basis of judgment and experience, which may be improper or uneconomical. In the technical literature, a method based on nonlinear static analysis has been suggested to establish a reliability index for the strength of structural systems. In this paper, a method based on nonlinear dynamic analysis with rising acceleration (or Incremental Dynamic Analysis) is introduced, the results of which are compared with those of the previous (Static Pushover Analysis) method, and two concepts, namely Redundancy Strength and Redundancy Variations, are proposed as indices of these effects. The Redundancy Variation Factor and Redundancy Strength Factor indices for reinforced concrete frames with varying numbers of bays and stories and different ductility potentials are computed and, ultimately, the Reliability Index is determined using these two indices.
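In its simplest textbook form, a reliability index for a strength limit state assumes independent normal resistance and load effect; the capacity and demand numbers below are hypothetical, chosen only to show the calculation:

```python
from math import erf, sqrt

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def reliability_index(mu_r, sd_r, mu_s, sd_s):
    """First-order reliability index for independent normal
    resistance R and load effect S with limit state g = R - S:
    beta = (mu_R - mu_S) / sqrt(sd_R^2 + sd_S^2)."""
    return (mu_r - mu_s) / sqrt(sd_r**2 + sd_s**2)

# Hypothetical frame capacity and demand (e.g. base shear, kN).
beta = reliability_index(mu_r=900.0, sd_r=90.0, mu_s=600.0, sd_s=120.0)
p_f = phi(-beta)                    # failure probability implied by beta
print(round(beta, 2), f"{p_f:.2e}")  # → 2.0 2.28e-02
```

The paper's contribution is, in effect, to feed capacities obtained from pushover or incremental dynamic analysis into an index of this kind rather than assuming them outright.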

  11. Toward reliable estimates of abundance: comparing index methods to assess the abundance of a Mammalian predator.

    Denise Güthlin

Full Text Available Due to time and financial constraints, indices are often used to obtain landscape-scale estimates of relative species abundance. Using two different field methods and comparing the results can help to detect possible bias or a non-monotonic relationship between the index and the true abundance, providing more reliable results. We used data obtained from camera traps and feces counts to independently estimate the relative abundance of red foxes in the Black Forest, a forested landscape in southern Germany. Applying negative binomial regression models, we identified landscape parameters that influence red fox abundance, which we then used to predict relative red fox abundance. We compared the estimated regression coefficients of the landscape parameters and the predicted abundance of the two methods. Further, we compared the costs and the precision of the two field methods. The predicted relative abundances were similar between the two methods, suggesting that the two indices were closely related to the true abundance of red foxes. For both methods, landscape diversity and edge density best described differences in the indices and had positive estimated effects on relative fox abundance. In our study the costs of each method were of similar magnitude, but the sample size obtained from the feces counts (262 transects) was larger than the camera-trap sample size (88 camera locations). The precision of the camera traps was lower than the precision of the feces counts. The approach we applied can be used as a framework to compare and combine the results of two or more different field methods to estimate abundance and thereby enhance the reliability of the result.
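A monotonic relationship between two such indices can be checked with a rank correlation across sites; the per-site counts below are hypothetical, not the study's data:

```python
def ranks(values):
    """Average ranks (1-based); ties get the mean of their rank positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical per-site indices: camera-trap detections vs. feces counts.
camera = [3, 7, 1, 9, 4, 6, 2, 8]
feces  = [5, 12, 2, 15, 9, 10, 3, 13]
print(round(spearman(camera, feces), 3))  # → 1.0 (perfectly concordant ranks)
```

A high rank correlation is consistent with both indices tracking the same underlying abundance; a low or negative value would flag the bias or non-monotonicity the abstract warns about.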

  12. Tailoring a Human Reliability Analysis to Your Industry Needs

    DeMott, D. L.


    Companies at risk of accidents caused by human error that result in catastrophic consequences include: airline industry mishaps, medical malpractice, medication mistakes, aerospace failures, major oil spills, transportation mishaps, power production failures and manufacturing facility incidents. Human Reliability Assessment (HRA) is used to analyze the inherent risk of human behavior or actions introducing errors into the operation of a system or process. These assessments can be used to identify where errors are most likely to arise and the potential risks involved if they do occur. Using the basic concepts of HRA, an evolving group of methodologies are used to meet various industry needs. Determining which methodology or combination of techniques will provide a quality human reliability assessment is a key element to developing effective strategies for understanding and dealing with risks caused by human errors. There are a number of concerns and difficulties in "tailoring" a Human Reliability Assessment (HRA) for different industries. Although a variety of HRA methodologies are available to analyze human error events, determining the most appropriate tools to provide the most useful results can depend on industry specific cultures and requirements. Methodology selection may be based on a variety of factors that include: 1) how people act and react in different industries, 2) expectations based on industry standards, 3) factors that influence how the human errors could occur such as tasks, tools, environment, workplace, support, training and procedure, 4) type and availability of data, 5) how the industry views risk & reliability, and 6) types of emergencies, contingencies and routine tasks. Other considerations for methodology selection should be based on what information is needed from the assessment. 
If the principal concern is determination of the primary risk factors contributing to the potential human error, a more detailed analysis method may be employed

  13. Reliability analysis and prediction of mixed mode load using Markov Chain Model

    Nikabdullah, N. [Department of Mechanical and Materials Engineering, Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia and Institute of Space Science (ANGKASA), Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia (Malaysia); Singh, S. S. K.; Alebrahim, R.; Azizi, M. A. [Department of Mechanical and Materials Engineering, Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia (Malaysia); K, Elwaleed A. [Institute of Space Science (ANGKASA), Faculty of Engineering and Built Environment, Universiti Kebangsaan Malaysia (Malaysia); Noorani, M. S. M. [School of Mathematical Sciences, Faculty of Science and Technology, Universiti Kebangsaan Malaysia (Malaysia)


    The aim of this paper is to present the reliability analysis and prediction of mixed-mode loading for an automotive crankshaft using a simple two-state Markov Chain Model. Reliability analysis and prediction for any automotive component or structure is important for analyzing and measuring failure in order to increase the design life, eliminate or reduce the likelihood of failures, and lower safety risk. Mechanical failures of the crankshaft are due to high stress concentrations under high-cycle rotating bending and torsional loading. The Markov Chain was used to model the two states based on the probability of failure due to bending and torsional stress. Most investigations have revealed that bending stress is more severe than torsional stress, so the probability criterion for the bending state is set higher than for the torsion state. A statistical comparison between the developed Markov Chain Model and field data was performed to observe the percentage error. The reliability analysis and prediction derived from the Markov Chain Model are illustrated through the Weibull probability and cumulative distribution functions, the hazard rate and reliability curves, and the bathtub curve. It can be concluded that the Markov Chain Model is able to generate data close to the field data with a minimal percentage of error and that, for practical application, the proposed model provides good accuracy in determining the reliability of the crankshaft under mixed-mode loading.
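
    A two-state load model with an absorbing failure state can be sketched in a few lines. The transition probabilities below are purely illustrative assumptions (chosen so that the bending state carries the higher per-cycle failure probability, as the abstract suggests), not values from the paper:

```python
# Hypothetical three-state Markov sketch: two load states (bending, torsion)
# plus an absorbing "failed" state. All probabilities are illustrative.

def step(dist, P):
    """One Markov transition: row vector `dist` times matrix `P`."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# States: 0 = bending-dominated, 1 = torsion-dominated, 2 = failed (absorbing).
P = [
    [0.90, 0.05, 0.05],  # from bending: stay, switch, fail
    [0.10, 0.88, 0.02],  # from torsion: switch, stay, fail
    [0.00, 0.00, 1.00],  # failed is absorbing
]

dist = [1.0, 0.0, 0.0]           # start in the bending state
reliability = []                 # R(n) = P(not failed after n cycles)
for cycle in range(50):
    dist = step(dist, P)
    reliability.append(1.0 - dist[2])

print(round(reliability[0], 4))   # -> 0.95 (after one cycle)
print(round(reliability[-1], 4))  # reliability after 50 cycles
```

    Because the failure state is absorbing, R(n) is non-increasing, which is what makes the model usable for reliability-curve and time-to-failure estimates.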

  14. Inclusion of fatigue effects in human reliability analysis

    Griffith, Candice D. [Vanderbilt University, Nashville, TN (United States); Mahadevan, Sankaran, E-mail: [Vanderbilt University, Nashville, TN (United States)


    The effect of fatigue on human performance has been observed to be an important factor in many industrial accidents. However, defining and measuring fatigue is not easily accomplished. This creates difficulties in including fatigue effects in probabilistic risk assessments (PRA) of complex engineering systems that seek to include human reliability analysis (HRA). Thus the objectives of this paper are to discuss (1) the importance of the effects of fatigue on performance, (2) the difficulties associated with defining and measuring fatigue, (3) the current status of inclusion of fatigue in HRA methods, and (4) the future directions and challenges for the inclusion of fatigue, specifically sleep deprivation, in HRA. - Highlights: > We highlight the need for fatigue and sleep deprivation effects on performance to be included in human reliability analysis (HRA) methods. > Current methods do not explicitly include sleep deprivation effects. > We discuss the difficulties in defining and measuring fatigue. > We review sleep deprivation research, and discuss the limitations and future needs of the current HRA methods.

  15. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    Ronald L. Boring


    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management, eliminating the need to update hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  16. Transient Reliability Analysis Capability Developed for CARES/Life

    Nemeth, Noel N.


    The CARES/Life software developed at the NASA Glenn Research Center provides a general-purpose design tool that predicts the probability of failure of a ceramic component as a function of its time in service. This award-winning software has been widely used by U.S. industry to establish the reliability and life of brittle material (e.g., ceramic, intermetallic, and graphite) structures in a wide variety of 21st century applications. Present capabilities of the NASA CARES/Life code include probabilistic life prediction of ceramic components subjected to fast fracture, slow crack growth (stress corrosion), and cyclic fatigue failure modes. Currently, this code can compute the time-dependent reliability of ceramic structures subjected to simple time-dependent loading. For example, in slow crack growth failure conditions CARES/Life can handle sustained and linearly increasing time-dependent loads, whereas in cyclic fatigue applications various types of repetitive constant-amplitude loads can be accounted for. However, in real applications, applied loads are rarely that simple but vary with time in more complex ways, such as during engine startup, shutdown, and dynamic and vibrational loading. In addition, when a given component is subjected to transient environmental and/or thermal conditions, the material properties also vary with time. A methodology has now been developed to allow the CARES/Life computer code to perform reliability analysis of ceramic components undergoing transient thermal and mechanical loading. This means that CARES/Life will be able to analyze finite element models of ceramic components that simulate dynamic engine operating conditions. The methodology developed is generalized to account for material property variation (in strength distribution and fatigue) as a function of temperature. This allows CARES/Life to analyze components undergoing rapid temperature change, in other words, components undergoing thermal shock. In addition, the capability has

  17. Productivity enhancement and reliability through AutoAnalysis

    Garetto, Anthony; Rademacher, Thomas; Schulz, Kristian


    The decreasing size and increasing complexity of photomask features, driven by the push to ever smaller technology nodes, place more and more challenges on the mask house, particularly in terms of yield management and cost reduction. Particularly challenging for mask shops is the inspection, repair and review cycle, which requires more time and skill from operators due to the higher number of masks required per technology node and larger nuisance defect counts. While the measurement throughput of the AIMS™ platform has been improved in order to keep pace with these trends, the analysis of aerial images has seen little advancement and remains largely a manual process. This manual analysis of aerial images is time consuming, depends on the skill level of the operator, and contributes significantly to the overall mask manufacturing process flow. AutoAnalysis, the first application available for the FAVOR® platform, offers a solution to these problems by providing fully automated analysis of AIMS™ aerial images. Direct communication with the AIMS™ system allows automated data transfer and analysis in parallel with the measurements. User-defined report templates allow the relevant data to be output in a manner that can be tailored to various internal needs and to support customer requests. Productivity is significantly improved due to the fast analysis, operator time is saved and made available for other tasks, and reliability is no longer a concern as the most defective region is always and consistently captured. In this paper the concept and approach of AutoAnalysis will be presented as well as an update on the status of the project. The benefits arising from the use of AutoAnalysis will be discussed in more detail and a study will be performed in order to demonstrate.

  18. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    Clayson, Peter E; Miller, Gregory A


    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue.
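
    The core quantity behind this kind of trial-count planning can be illustrated with a one-facet (persons x trials) generalizability coefficient. The variance components below are hypothetical; in real use they would be estimated from data (e.g., via the ERA Toolbox):

```python
# Illustrative sketch of the G-theory idea: dependability of a score
# averaged over n trials, given (assumed) variance components.

def g_coefficient(var_person, var_residual, n_trials):
    """Generalizability coefficient for a one-facet persons-x-trials design:
    true (person) variance over itself plus error variance shrunk by
    the number of trials averaged together."""
    return var_person / (var_person + var_residual / n_trials)

# Hypothetical variance components (arbitrary units, not from the toolbox).
var_p, var_e = 4.0, 12.0
for n in (1, 8, 32):
    print(n, round(g_coefficient(var_p, var_e, n), 3))
```

    The coefficient rises toward 1 as more trials are retained for averaging, which is why trial counts matter so much for ERP score reliability.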

  19. Reliability analysis on a shell and tube heat exchanger

    Lingeswara, S.; Omar, R.; Mohd Ghazi, T. I.


    A reliability study of a shell and tube heat exchanger was performed using historical data from a carbon black manufacturing plant. Heat exchanger reliability studies are vital in all related industries, as inappropriate maintenance and operation of the heat exchanger will lead to major Process Safety Events (PSE) and loss of production. The overall heat transfer coefficient/effectiveness (Uo) and Mean Time Between Failures (MTBF) were calculated and analyzed. The Aspen and downtime data were taken from a typical shell and tube heat exchanger in a carbon black manufacturing plant. From the calculated Uo values, it was observed that Uo declined over time due to severe fouling and heat exchanger limitations. This limitation also requires an additional burn-out period, which leads to loss of production. The calculated MTBF is 649.35 hours, which is very low compared to the standard 6000 hours for good operation of a shell and tube heat exchanger. Guidelines on heat exchanger repair and on preventive and predictive maintenance were identified and highlighted for better heat exchanger inspection and repair in the future. Fouling of the heat exchanger and the associated production loss will continue if proper heat exchanger operation and repair using standard operating procedures are not followed.
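
    The MTBF figure quoted above is simply the mean operating time between successive failures. A minimal sketch, with hypothetical run lengths rather than the plant's actual history:

```python
# Minimal MTBF sketch from operating-interval records (hypothetical data).

def mtbf(uptimes_hours):
    """Mean Time Between Failures: average operating time between failures."""
    return sum(uptimes_hours) / len(uptimes_hours)

# Assumed run lengths (hours) between successive exchanger failures.
runs = [520, 710, 640, 805, 572]
print(round(mtbf(runs), 2))  # -> 649.4
```

    Comparing the resulting figure against a benchmark (the abstract cites 6000 hours for good operation) is what flags the equipment as a maintenance priority.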

  20. Investigation for Ensuring the Reliability of the MELCOR Analysis Results

    Sung, Joonyoung; Maeng, Yunhwan; Lee, Jaeyoung [Handong Global Univ., Pohang (Korea, Republic of)


    Flow rate can also be a main factor to be examined, because it plays a role in maintaining thermal balance through heat transfer inside the fuel assembly. Some questions about the reliability of MELCOR results were raised in the 2nd technical report of the NSRC project. To confirm whether the MELCOR results are dependable, experimental data from Phase 1 of the Sandia Fuel Project were used as a reference for comparison. In a Spent Fuel Pool (SFP) severe accident, especially in the cases of boil-off, partial loss-of-coolant accident, and complete loss-of-coolant accident, heat source and flow rate are the main points for analyzing the MELCOR results. The heat source is composed of decay heat and oxidation heat. Because the heat source can lead to a zirconium fire if heat accumulates in the spent fuel rods and the cladding temperature then rises continuously until oxidation heat is generated, this is a main factor to be confirmed. This work investigates the reliability of MELCOR results in order to confirm the physical phenomena occurring in an SFP severe accident. Almost all results showed that MELCOR outputs differed significantly under minute changes of a main parameter in identical conditions. It is therefore necessary that the oxidation coefficients be chosen so as to delineate the real phenomena as closely as possible.

  1. Reliability analysis and updating of deteriorating systems with subset simulation

    Schneider, Ronald; Thöns, Sebastian; Straub, Daniel


    Bayesian updating of the system deterioration model. The updated system reliability is then obtained through coupling the updated deterioration model with a probabilistic structural model. The underlying high-dimensional structural reliability problems are solved using subset simulation, which...

  2. Time-dependent reliability analysis and condition assessment of structures

    Ellingwood, B.R. [Johns Hopkins Univ., Baltimore, MD (United States)


    Structures generally play a passive role in assurance of safety in nuclear plant operation, but are important if the plant is to withstand the effect of extreme environmental or abnormal events. Relative to mechanical and electrical components, structural systems and components would be difficult and costly to replace. While the performance of steel or reinforced concrete structures in service generally has been very good, their strengths may deteriorate during an extended service life as a result of changes brought on by an aggressive environment, excessive loading, or accidental loading. Quantitative tools for condition assessment of aging structures can be developed using time-dependent structural reliability analysis methods. Such methods provide a framework for addressing the uncertainties attendant to aging in the decision process.

  3. New Mathematical Derivations Applicable to Safety and Reliability Analysis

    Cooper, J.A.; Ferson, S.


    Boolean logic expressions are often derived in safety and reliability analysis. Since the values of the operands are rarely exact, accounting for uncertainty with the tightest justifiable bounds is important. Accurate determination of result bounds is difficult when the inputs have constraints. One example of a constraint is that an uncertain variable that appears multiple times in a Boolean expression must always have the same value, although the value cannot be exactly specified. A solution for this repeated variable problem is demonstrated for two Boolean classes. The classes, termed functions with unate variables (including, but not limited to unate functions), and exclusive-or functions, frequently appear in Boolean equations for uncertain outcomes portrayed by logic trees (event trees and fault trees).
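
    The repeated-variable problem can be illustrated with the tautology f = a OR (NOT a): treating the two occurrences of a as independent unknowns yields loose bounds, while forcing the repeated variable to take one consistent value everywhere recovers the exact result. A small sketch with a hypothetical expression, not one of the paper's specific Boolean classes:

```python
# Demonstration of the repeated-variable problem in Boolean bounds.
from itertools import product

def f(a, na):
    # f = a OR (NOT a); `na` stands for the second occurrence, NOT a.
    return a or na

# Naive bounds: treat the two occurrences of `a` as independent unknowns.
naive = {f(a1, 1 - a2) for a1, a2 in product([0, 1], repeat=2)}

# Consistent bounds: the repeated variable takes one value everywhere.
tight = {f(a, 1 - a) for a in [0, 1]}

print(sorted(naive))  # -> [0, 1]: naive bounds are loose
print(sorted(tight))  # -> [1]: the expression is always true
```

    For unate functions the same consistency requirement is what allows tight bounds to be found by evaluating only at interval endpoints.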

  4. Reliability analysis for the quench detection in the LHC machine

    Denz, R; Vergara-Fernández, A


    The Large Hadron Collider (LHC) will incorporate a large number of superconducting elements that require protection in case of a quench. Key elements in the quench protection system are the electronic quench detectors. Their reliability will have an important impact on the downtime as well as on the operational cost of the collider. The expected rates of both false and missed quenches have been computed for several redundant detection schemes. The developed model takes account of the maintainability of the system in order to optimise the frequency of foreseen checks and to evaluate their influence on the performance of different detection topologies. Given the uncertainty in the failure rates of the components, combined with the LHC tunnel environment, the study has been complemented with a sensitivity analysis of the results. The chosen detection scheme and the maintainability strategy for each detector family are given.

  5. A reliability analysis of the revised competitiveness index.

    Harris, Paul B; Houston, John M


    This study examined the reliability of the Revised Competitiveness Index by investigating the test-retest reliability, inter-item reliability, and factor structure of the measure based on a sample of 280 undergraduates (200 women, 80 men) ranging in age from 18 to 28 years (M = 20.1, SD = 2.1). The findings indicate that the Revised Competitiveness Index has high test-retest reliability, high inter-item reliability, and a stable factor structure. The results support the assertion that the Revised Competitiveness Index assesses competitiveness as a stable trait rather than a dynamic state.

  6. Higher reliability of triple-phase bone scintigraphy in cementless total hip arthroplasty compared to cementless bipolar hemiarthroplasty

    Burak Yoldas


    Conclusions: Due to the higher sensitivity, specificity and accuracy, TPBS has a more reliable diagnostic value for cementless THA in the diagnosis of periprosthetic infection compared to cementless BHA.

  7. Probabilistic durability assessment of concrete structures in marine environments: Reliability and sensitivity analysis

    Yu, Bo; Ning, Chao-lie; Li, Bing


    A probabilistic framework for durability assessment of concrete structures in marine environments was proposed in terms of reliability and sensitivity analysis, which takes into account the uncertainties under the environmental, material, structural and executional conditions. A time-dependent probabilistic model of chloride ingress was established first to consider the variations in various governing parameters, such as the chloride concentration, chloride diffusion coefficient, and age factor. Then the Nataf transformation was adopted to transform the non-normal random variables from the original physical space into the independent standard Normal space. After that the durability limit state function and its gradient vector with respect to the original physical parameters were derived analytically, based on which the first-order reliability method was adopted to analyze the time-dependent reliability and parametric sensitivity of concrete structures in marine environments. The accuracy of the proposed method was verified by comparing with the second-order reliability method and the Monte Carlo simulation. Finally, the influences of environmental conditions, material properties, structural parameters and execution conditions on the time-dependent reliability of concrete structures in marine environments were also investigated. The proposed probabilistic framework can be implemented in the decision-making algorithm for the maintenance and repair of deteriorating concrete structures in marine environments.
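
    The kind of Monte Carlo check the authors use to verify their first-order results can be sketched for a simplified limit state built on the erfc solution of Fick's second law. All distributions and parameter values below are illustrative assumptions, not the paper's calibrated model:

```python
# Hedged Monte Carlo sketch of a time-dependent durability check:
# failure occurs when the chloride content at the rebar depth exceeds
# a critical level. All parameter distributions are assumptions.
import math
import random

random.seed(1)

def chloride(cs, D, depth_m, t_s):
    """Fick's-law profile: C(x, t) = Cs * erfc(x / (2*sqrt(D*t)))."""
    return cs * math.erfc(depth_m / (2.0 * math.sqrt(D * t_s)))

def pf(t_years, n=20000):
    """Crude Monte Carlo estimate of P(chloride at cover depth > critical)."""
    t_s = t_years * 365.25 * 24 * 3600.0
    fails = 0
    for _ in range(n):
        cs = random.gauss(3.5, 0.7)        # surface chloride (% binder)
        D = random.gauss(2e-12, 5e-13)     # diffusion coefficient (m^2/s)
        cover = random.gauss(0.05, 0.008)  # concrete cover depth (m)
        ccrit = random.gauss(0.9, 0.15)    # critical content (% binder)
        if D <= 0 or cover <= 0:           # discard non-physical samples
            continue
        if chloride(cs, D, cover, t_s) > ccrit:
            fails += 1
    return fails / n

p10, p50 = pf(10), pf(50)
print("Pf(10 yr) =", p10)
print("Pf(50 yr) =", p50)
```

    The failure probability grows with exposure time, which is the time-dependent behavior the first-order reliability method approximates analytically.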

  8. Reliability of videotaped observational gait analysis in patients with orthopedic impairments

    Brunnekreef, J.J.; Uden, C. van; Moorsel, S. van; Kooloos, J.G.M.


    BACKGROUND: In clinical practice, visual gait observation is often used to determine gait disorders and to evaluate treatment. Several reliability studies on observational gait analysis have been described in the literature and generally showed moderate reliability. However, patients with orthopedic

  9. Failure Analysis towards Reliable Performance of Aero-Engines

    T. Jayakumar


    Full Text Available Aero-engines are critical components whose reliable performance decides the primary safety of an aircraft/helicopter. This is met by a rigorous maintenance schedule with periodic inspection/nondestructive testing of various engine components. In spite of these measures, failures of aero-engines do occur rather frequently in comparison to failures of other components. Systematic failure analysis helps one to identify the root cause of the failure, thus enabling remedial measures to prevent recurrence of such failures. Turbine blades made of nickel- or cobalt-based alloys are used in aero-engines. These blades are subjected to complex loading conditions at elevated temperatures. The main causes of blade failure are attributed to creep, thermal fatigue and hot corrosion. Premature failure of blades in the combustion zone was reported in one of the aero-engines. The engine had both the compressor and the free turbine on a common shaft. Detailed failure analysis revealed the presence of creep voids in the blades that failed. Failure of turbine blades was also detected in another aero-engine operating in a coastal environment. In this failure, the protective coating on the blades was cracked at many locations. Grain boundary spikes were observed at these locations. The primary cause of this failure was hot corrosion followed by creep damage

  10. Wind energy Computerized Maintenance Management System (CMMS) : data collection recommendations for reliability analysis.

    Peters, Valerie A.; Ogilvie, Alistair; Veers, Paul S.


    This report addresses the general data requirements for reliability analysis of fielded wind turbines and other wind plant equipment, and is intended to help the reader develop a basic understanding of what data are needed from a Computerized Maintenance Management System (CMMS) and other data systems for reliability analysis. Written by Sandia National Laboratories, the report provides: (1) a list of the data needed to support reliability and availability analysis; and (2) specific recommendations for a CMMS to support automated analysis. Though written for reliability analysis of wind turbines, much of the information is applicable to a wider variety of equipment and a wider variety of analysis and reporting needs.

  11. Fuzzy Reliability Analysis of the Shaft of a Steam Turbine


    Field surveying shows that failure of the steam turbine's coupling is due to fatigue caused by compound stress. Fuzzy mathematics was applied to obtain the membership function of the fatigue strength rule. A formula for the fuzzy reliability of the coupling was derived, and a theory of the coupling's fuzzy reliability was established. The method for calculating the fuzzy reliability is explained with an illustrative example.

  12. An Intelligent Method for Structural Reliability Analysis Based on Response Surface

    桂劲松; 刘红; 康海贵


    As water depth increases, the structural safety and reliability of a system become more important and more challenging. Therefore, structural reliability methods must be applied in ocean engineering design, such as offshore platform design. If the performance function is known in a structural reliability analysis, the first-order second-moment method is often used. If the performance function cannot be explicitly expressed, the response surface method is usually used because it has a very clear train of thought and simple programming. However, the traditional response surface method fits a response surface of quadratic polynomials, and its accuracy problem cannot be solved because the true limit state surface can be fitted well only in the area near the checking point. In this paper, an intelligent computing method based on the whole response surface is proposed, which can be used when the performance function cannot be explicitly expressed in structural reliability analysis. In this method, a response surface of the fuzzy neural network for the whole area is constructed first, and then the structural reliability is calculated by a genetic algorithm. In the proposed method, all the sample points for training the network come from the whole area, so the true limit state surface in the whole area can be fitted. Through calculated examples and comparative analysis, it can be seen that the proposed method is much better than the traditional response surface method of quadratic polynomials: the amount of finite element analysis computation is largely reduced, the accuracy of calculation is improved, and the true limit state surface can be fitted very well over the whole area. Thus, the method proposed in this paper is suitable for engineering application.

  13. A comparative study of reliability of self report of tobacco use among patients with bipolar and somatoform disorders

    Yatan Pal Singh Balhara


    Full Text Available Objective: To compare the use and reliability of self-reported tobacco use (both smoked and smokeless) among patients with bipolar disorder and somatoform disorders. Materials and Methods: The study was conducted at the psychiatry out-patient department of a tertiary care hospital. A total of 50 consecutive patients were recruited. The subjects were asked about their use of tobacco products (smoked as well as smokeless) over the past one week. Those responding affirmatively were assessed using the Fagerstrom Test for Nicotine Dependence (FTND) scales. Quantitative urinary cotinine levels were assessed using enzyme-linked immunosorbent assay (ELISA). Results: Calculation of Cohen's kappa using cross tabulation revealed discordance between the self-reported use of smoked as well as smokeless tobacco products in both groups. Analysis using the lower cut-off of 50 ng/ml also revealed discordance between self-reported tobacco use (smoked as well as smokeless) for both groups. Conclusions: The reliability of self-report is questionable in both these groups for smoked as well as smokeless tobacco products.

  14. Reliable Classification of Geologic Surfaces Using Texture Analysis

    Foil, G.; Howarth, D.; Abbey, W. J.; Bekker, D. L.; Castano, R.; Thompson, D. R.; Wagstaff, K.


    Communication delays and bandwidth constraints are major obstacles for remote exploration spacecraft. Due to such restrictions, spacecraft could make use of onboard science data analysis to maximize scientific gain, through capabilities such as the generation of bandwidth-efficient representative maps of scenes, autonomous instrument targeting to exploit targets of opportunity between communications, and downlink prioritization to ensure fast delivery of tactically-important data. Of particular importance to remote exploration is the precision of such methods and their ability to reliably reproduce consistent results in novel environments. Spacecraft resources are highly oversubscribed, so any onboard data analysis must provide a high degree of confidence in its assessment. The TextureCam project is constructing a "smart camera" that can analyze surface images to autonomously identify scientifically interesting targets and direct narrow field-of-view instruments. The TextureCam instrument incorporates onboard scene interpretation and mapping to assist these autonomous science activities. Computer vision algorithms map scenes such as those encountered during rover traverses. The approach, based on a machine learning strategy, trains a statistical model to recognize different geologic surface types and then classifies every pixel in a new scene according to these categories. We describe three methods for increasing the precision of the TextureCam instrument. The first uses ancillary data to segment challenging scenes into smaller regions having homogeneous properties. These subproblems are individually easier to solve, preventing uncertainty in one region from contaminating those that can be confidently classified. The second involves a Bayesian approach that maximizes the likelihood of correct classifications by abstaining from ambiguous ones. We evaluate these two techniques on a set of images acquired during field expeditions in the Mojave Desert. Finally, the

  15. Reliability Analysis and Modeling of ZigBee Networks

    Lin, Cheng-Min

    The architecture of ZigBee networks focuses on developing low-cost, low-speed ubiquitous communication between devices. The ZigBee technique is based on IEEE 802.15.4, which specifies the physical layer and medium access control (MAC) for a low-rate wireless personal area network (LR-WPAN). Currently, numerous wireless sensor networks have adopted the ZigBee open standard to develop various services to promote improved communication quality in our daily lives. The problem of system and network reliability in providing stable services has become more important, because these services will stop if the system and network reliability is unstable. The ZigBee standard has three kinds of networks: star, tree and mesh. The paper models the ZigBee protocol stack from the physical layer to the application layer and analyzes each layer's reliability and mean time to failure (MTTF). Channel resource usage, device role, network topology and application objects are used to evaluate reliability in the physical, medium access control, network, and application layers, respectively. For the star and tree networks, a series system and the reliability block diagram (RBD) technique can be used to solve the reliability problem. However, a division technique is applied to the mesh case because its network complexity is higher than that of the others. A mesh network using division is classified into several non-reducible series systems and edge-parallel systems. Hence, the reliability of mesh networks is easily solved using series-parallel systems through the proposed scheme. The numerical results demonstrate that the reliability of mesh networks increases as the number of edges in the parallel systems increases, while reliability drops quickly as the number of edges and the number of nodes increase for all three networks. Greater resource usage is another factor that decreases reliability. However, lower network reliability will occur due to
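
    The series-parallel reliability-block-diagram algebra used for these topologies reduces to two small formulas. The component reliabilities below are illustrative assumptions, not values from the paper:

```python
# Series-parallel RBD algebra sketch; reliabilities are illustrative.

def series(rs):
    """All blocks must work: product of reliabilities."""
    out = 1.0
    for r in rs:
        out *= r
    return out

def parallel(rs):
    """Any block may work: 1 minus product of failure probabilities."""
    out = 1.0
    for r in rs:
        out *= (1.0 - r)
    return 1.0 - out

# Star fragment: coordinator in series with one end device's link.
star = series([0.99, 0.95])
# Mesh fragment: two redundant routes in parallel, in series with the node.
mesh = series([0.99, parallel([0.95, 0.95])])

print(round(star, 4))  # -> 0.9405
print(round(mesh, 4))  # -> 0.9875
```

    The comparison shows the effect the abstract describes: adding parallel edges (redundant routes) raises reliability above the purely serial star case.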

  16. Procedure for conducting a human-reliability analysis for nuclear power plants. Final report

    Bell, B.J.; Swain, A.D.


    This document describes in detail a procedure to be followed in conducting a human reliability analysis as part of a probabilistic risk assessment when such an analysis is performed according to the methods described in NUREG/CR-1278, Handbook for Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications. An overview of the procedure describing the major elements of a human reliability analysis is presented along with a detailed description of each element and an example of an actual analysis. An appendix consists of some sample human reliability analysis problems for further study.

  17. Aviation Fuel System Reliability and Fail-Safety Analysis. Promising Alternative Ways for Improving the Fuel System Reliability

    I. S. Shumilov


    Full Text Available The paper deals with design requirements for an aviation fuel system (AFS): basic AFS design requirements, reliability, and design precautions to avoid AFS failure. It compares the reliability and fail-safety of the AFS and the aircraft hydraulic system (AHS), considers promising alternative ways to raise the reliability of fuel systems, and elaborates recommendations to improve the reliability of pipeline system components and pipeline systems in general, based on the selection of design solutions. It is extremely advisable to design the AFS and AHS in accordance with Aviation Regulations АП25 and the Accident Prevention Guidelines of ICAO (International Civil Aviation Organization), which will reduce the risk of emergency situations and in some cases even avoid heavy disasters. AFS and AHS designs should be based on uniform principles to ensure the highest reliability and safety. Currently, however, this principle is not fully observed, and the AFS loses in reliability and fail-safety as compared with the AHS. For the examined failures (single failures and their combinations), the guidelines for ensuring AFS operability should be the same as those adopted in Regulations АП25 for the AHS. This will significantly increase the reliability and fail-safety of fuel systems and of aircraft flights in general, despite a slight increase in AFS mass. The proposed improvements, through redundancy of fuel system components, will greatly raise the reliability of the fuel system of a passenger aircraft, which will then withstand up to 2 failures without serious consequences for the flight; its reliability and fail-safety will be similar to those of the AHS, although the above improvement measures will lead to a slightly increased total mass of the fuel system. It is advisable to install a second pump on the engine in parallel with the first one, to run if the first fails for some reason. The second pump, like the first pump, can be driven from the

  18. Application of Support Vector Machine to Reliability Analysis of Engine Systems

    Zhang Xinfeng


    Reliability analysis plays a very important role in assessing the performance and making maintenance plans of engine systems. This research presents a comparative study of the predictive performance of support vector machines (SVM), least squares support vector machines (LSSVM) and neural network time series models for forecasting failures and reliability in engine systems. Further, the reliability indexes of engine systems are computed by Weibull probability paper programmed with Matlab. The results show that the probability distribution of the forecasting outcomes is consistent with the distribution of the actual data, both following the Weibull distribution, and that the predictions by SVM and LSSVM provide accurate estimates of the characteristic life. SVM and LSSVM are thus both viable choices for engine system reliability analysis. Moreover, the predictive precision of the LSSVM-based method is higher than that of SVM. For small samples the LSSVM prediction is preferable, because its computation cost is lower and its precision is more satisfactory.
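The Weibull probability-paper step mentioned above can be sketched in a few lines. The following Python sketch is illustrative only (the paper used Matlab, and the failure times here are synthetic): failure times are ranked, Bernard's median-rank approximation supplies plotting positions, and a least-squares line on the linearized CDF, ln(-ln(1-F)) = β·ln t - β·ln η, recovers the shape β and characteristic life η.

```python
import math
import random

def weibull_probability_paper_fit(times):
    """Estimate Weibull shape (beta) and scale (eta) by the
    probability-paper method: regress ln(-ln(1-F)) on ln(t),
    with F estimated by Bernard's median-rank approximation."""
    n = len(times)
    ts = sorted(times)
    xs = [math.log(t) for t in ts]
    # median rank of the (i+1)-th ordered failure: (i+0.7)/(n+0.4)
    ys = [math.log(-math.log(1.0 - (i + 0.7) / (n + 0.4))) for i in range(n)]
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    beta = slope                        # shape: beta > 1 indicates wearout
    eta = math.exp(-intercept / beta)   # scale (characteristic life)
    return beta, eta

random.seed(1)
# Synthetic failure times from a Weibull with beta = 2.0, eta = 100
sample = [100.0 * (-math.log(1.0 - random.random())) ** 0.5 for _ in range(500)]
beta, eta = weibull_probability_paper_fit(sample)
print(round(beta, 2), round(eta, 1))
```

A fitted β greater than 1.0, as reported in several of the abstracts on this page, is the signature of wearout-dominated failure data.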

  19. Moment Method Based on Fuzzy Reliability Sensitivity Analysis for a Degradable Structural System

    Song Jun; Lu Zhenzhou


    For a degradable structural system with a fuzzy failure region, a moment method based on a fuzzy reliability sensitivity algorithm is presented. According to the values of the performance function, the integral region for calculating the fuzzy failure probability is first split into a series of subregions in which the membership function values of the performance function within the fuzzy failure region can be approximated by a set of constants. The fuzzy failure probability is then transformed into a sum of products of the random failure probabilities and the approximate constants of the membership function in the subregions. Furthermore, the fuzzy reliability sensitivity analysis is transformed into a series of random reliability sensitivity analyses, and the random reliability sensitivity can be obtained by the constructed moment method. The primary advantages of the presented method are higher efficiency for implicit performance functions of low and medium dimensionality and wide applicability to multiple failure modes and non-normal basic random variables. Its limitation is that the required computational effort grows exponentially with the dimensionality of the basic random variables; hence, it is not suitable for high-dimensionality problems. Compared with the available methods, the presented one is quite competitive when the dimensionality is lower than 10. The presented examples are used to verify the advantages and indicate the limitations.

  20. Effectiveness and reliability analysis of emergency measures for flood prevention

    Lendering, K.T.; Jonkman, S.N.; Kok, M.


    During flood events emergency measures are used to prevent breaches in flood defences. However, there is still limited insight into their reliability and effectiveness. The objective of this paper is to develop a method to determine the reliability and effectiveness of emergency measures for flood defences.

  1. Wind turbine reliability: a database and analysis approach.

    Linsday, James (ARES Corporation); Briand, Daniel; Hill, Roger Ray; Stinebaugh, Jennifer A.; Benjamin, Allan S. (ARES Corporation)


    The US wind industry has experienced remarkable growth since the turn of the century. At the same time, the physical size and electrical generation capabilities of wind turbines have also grown remarkably. As the market continues to expand, and as wind generation continues to gain a significant share of the generation portfolio, the reliability of wind turbine technology becomes increasingly important. This report addresses how operations and maintenance costs are related to unreliability, that is, the failures experienced by systems and components. Reliability tools are demonstrated, the data needed to understand and catalog failure events are described, and practical wind turbine reliability models are illustrated, including preliminary results. This report also presents a continuing process for controlling industry requirements, needs, and expectations related to Reliability, Availability, Maintainability, and Safety. A simply stated goal of this process is to better understand and improve the operable reliability of wind turbine installations.

  2. Advanced response surface method for mechanical reliability analysis

    L(U) Zhen-zhou; ZHAO Jie; YUE Zhu-feng


    Based on the classical response surface method (RSM), a novel RSM using improved experimental points (EPs) is presented for reliability analysis. The presented method includes two novel points. One is the use of linear interpolation, by which the total EPs for determining the RS are selected to be closer to the actual failure surface; the other is the application of sequential linear interpolation to control the distance between the surrounding EPs and the center EP, by which the presented method ensures that the RS fits the actual failure surface in the region of maximum likelihood as the center EPs converge to the actual most probable point (MPP). Since the presented method increases the fitting precision of the RS to the actual failure surface in the vicinity of the MPP, which contributes significantly to the failure probability, the precision of the failure probability calculated from the RS is increased as well. Numerical examples illustrate the accuracy and efficiency of the presented method.

  3. Sociological analysis and comparative education

    Woock, Roger R.


    It is argued that comparative education is essentially a derivative field of study, in that it borrows theories and methods from academic disciplines. After a brief humanistic phase, in which history and philosophy were central for comparative education, sociology became an important source. In the mid-50's and 60's, sociology in the United States was characterised by Structural Functionalism as a theory, and Social Survey as a dominant methodology. Both were incorporated into the development of comparative education. Increasingly in the 70's, and certainly today, the new developments in sociology are characterised by an attack on Positivism, which is seen as the philosophical position underlying both functionalism and survey methods. New or re-discovered theories with their attendant methodologies included Marxism, Phenomenological Sociology, Critical Theory, and Historical Social Science. The current relationship between comparative education and social science is one of uncertainty, but since social science is seen to be returning to its European roots, the hope is held out for the development of an integrated social theory and method which will provide a much stronger basis for developments in comparative education.

  4. Constellation Ground Systems Launch Availability Analysis: Enhancing Highly Reliable Launch Systems Design

    Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.


    Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, in a very limited time period. The reliability and maintainability of flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability and maintainability to ensure that ground subsystems complied with allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation, testing results, and other information. Where appropriate, actual performance history was used for legacy subsystems or comparative components that will support Constellation. The results of the verification analysis will be used to verify compliance with requirements and to highlight design or performance shortcomings for further decision-making. This case study will discuss the subsystem requirements allocation process, describe the ground systems methodology for completing quantitative reliability, availability and maintainability analysis, and present findings and observations based on analysis leading to the Ground Systems Preliminary Design Review milestone.

  5. Extending Failure Modes and Effects Analysis Approach for Reliability Analysis at the Software Architecture Design Level

    Sozer, Hasan; Tekinerdogan, Bedir; Aksit, Mehmet; Lemos, de Rogerio; Gacek, Cristina


    Several reliability engineering approaches have been proposed to identify and recover from failures. A well-known and mature approach is the Failure Mode and Effect Analysis (FMEA) method that is usually utilized together with Fault Tree Analysis (FTA) to analyze and diagnose the causes of failures.

  6. Assessing the Reliability of Digitalized Cephalometric Analysis in Comparison with Manual Cephalometric Analysis

    Farooq, Mohammed Umar; Khan, Mohd. Asadullah; Imran, Shahid; Qureshi, Arshad; Ahmed, Syed Afroz; Kumar, Sujan; Rahman, Mohd. Aziz Ur


    Introduction: For more than seven decades orthodontists have used cephalometric analysis as one of their main diagnostic tools; it can be performed manually or by software. The use of computers in treatment planning is expected to avoid errors and make it less time consuming, with effective evaluation and high reproducibility. Aim: This study was done to evaluate and compare the accuracy and reliability of cephalometric measurements between the computerized method using direct digital radiographs and conventional tracing. Materials and Methods: Digital and conventional hand-tracing cephalometric analyses of 50 patients were done. Thirty anatomical landmarks were defined on each radiograph by a single investigator, and 5 skeletal analyses (Steiner, Wits, Tweed, McNamara, Rakosi Jarabak) and 28 variables were calculated. Results: The variables showed consistency between the two methods except for the 1-NA, Y-axis and interincisal angle measurements, which were higher in manual tracing, and the facial axis angle, which was higher in digital tracing. Conclusion: Most of the commonly used measurements were accurate, with some differences between digital tracing with FACAD® and manual methods. The advantages of digital imaging, such as enhancement, transmission, archiving and low radiation dosage, make it preferable over the conventional method in daily use. PMID:27891451

  7. Suitability review of FMEA and reliability analysis for digital plant protection system and digital engineered safety features actuation system

    Kim, I. S.; Kim, T. K.; Kim, M. C.; Kim, B. S.; Hwang, S. W.; Ryu, K. C. [Hanyang Univ., Seoul (Korea, Republic of)


    Of the many items that should be checked during the review stage of the licensing application for the I and C system of Ulchin units 5 and 6, this report relates to a suitability review of the reliability analysis of the Digital Plant Protection System (DPPS) and Digital Engineered Safety Features Actuation System (DESFAS). In the reliability analysis performed by the system designer, ABB-CE, fault tree analysis was used as the main method along with Failure Modes and Effects Analysis (FMEA). However, the present regulatory technique does not allow the system reliability analysis and its results to be appropriately evaluated. Hence, this study was carried out focusing on the following four items: development of general review items by which to check the validity of a reliability analysis, and the subsequent review of the suitability of the reliability analysis for the Ulchin 5 and 6 DPPS and DESFAS; development of detailed review items by which to check the validity of an FMEA, and the subsequent review of the suitability of the FMEA for the Ulchin 5 and 6 DPPS and DESFAS; development of detailed review items by which to check the validity of a fault tree analysis, and the subsequent review of the suitability of the fault tree for the Ulchin 5 and 6 DPPS and DESFAS; and an integrated review of the safety and reliability of the Ulchin 5 and 6 DPPS and DESFAS based on the results of the various reviews above and on a reliability comparison between the digital systems and the comparable analog systems, i.e., an analog Plant Protection System (PPS) and an analog Engineered Safety Features Actuation System (ESFAS). According to the review mentioned above, the reliability analysis of the Ulchin 5 and 6 DPPS and DESFAS generally satisfies the review requirements. However, some shortcomings of the analysis were identified in our review, such that the assumed test periods for several pieces of equipment were not properly incorporated in the analysis, and failures of some equipment were not included in the

  8. System Reliability Analysis of Redundant Condition Monitoring Systems

    YI Pengxing; HU Youming; YANG Shuzi; WU Bo; CUI Feng


    The development and application of new reliability models and methods are presented to analyze the system reliability of complex condition monitoring systems. The methods include a method for analyzing the failure modes of a type of redundant condition monitoring system (RCMS) using a fault tree model, Markov modeling techniques for analyzing the system reliability of RCMS, and methods for estimating the Markov model parameters. Furthermore, a computing case is investigated and conclusions drawn from this case are summarized. Results show that the method proposed here is practical and valuable for designing condition monitoring systems and their maintenance.

  9. Reliability Modeling and Analysis of SCI Topological Network

    Hongzhe Xu


    The problem of reliability modeling of Scalable Coherent Interface (SCI) rings and topological networks is studied. The reliability models of three SCI rings are developed and the factors that influence the reliability of SCI rings are studied. By calculating the shortest path matrix and the path quantity matrix of different types of SCI network topologies, the communication characteristics of the SCI network are obtained. For situations of node damage and edge damage, the survivability of the SCI topological network is studied.
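The shortest-path-matrix computation the abstract refers to can be illustrated with a small sketch. The example below is hypothetical (a 4-node unidirectional ring, the simplest SCI-like topology, not the paper's actual models) and uses Floyd-Warshall to obtain all-pairs hop counts:

```python
def shortest_path_matrix(adj):
    """Floyd-Warshall all-pairs shortest hop counts for a directed
    graph given as a 0/1 adjacency matrix."""
    n = len(adj)
    INF = float("inf")
    d = [[0 if i == j else (1 if adj[i][j] else INF) for j in range(n)]
         for i in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

# A 4-node unidirectional ring: each node forwards only to its successor
n = 4
ring = [[1 if (i + 1) % n == j else 0 for j in range(n)] for i in range(n)]
dist = shortest_path_matrix(ring)
print(dist[0])  # hop counts from node 0 to nodes 0..3
```

In a unidirectional ring the asymmetry of the matrix (e.g. one hop from node 3 to node 0, but three hops back) is exactly the kind of communication characteristic the abstract derives from such matrices.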

  10. Application of Reliability Analysis for Optimal Design of Monolithic Vertical Wall Breakwaters

    Burcharth, H. F.; Sørensen, John Dalsgaard; Christiani, E.


    Reliability analysis and reliability-based design of monolithic vertical wall breakwaters are considered. Probabilistic models of some of the most important failure modes are described. The failures are sliding and slip surface failure of a rubble mound and a clay foundation. Relevant design variables are identified and a reliability-based design optimization procedure is formulated. Results from an illustrative example are given.

  11. Reliability and comparability of psychosis patients' retrospective reports of childhood abuse.

    Fisher, Helen L; Craig, Thomas K; Fearon, Paul; Morgan, Kevin; Dazzan, Paola; Lappin, Julia; Hutchinson, Gerard; Doody, Gillian A; Jones, Peter B; McGuffin, Peter; Murray, Robin M; Leff, Julian; Morgan, Craig


    An increasing number of studies are demonstrating an association between childhood abuse and psychosis. However, the majority of these rely on retrospective self-reports in adulthood that may be unduly influenced by current psychopathology. We therefore set out to explore the reliability and comparability of first-presentation psychosis patients' reports of childhood abuse. Psychosis case subjects were drawn from the Aetiology and Ethnicity of Schizophrenia and Other Psychoses (ÆSOP) epidemiological study and completed the Childhood Experience of Care and Abuse Questionnaire to elicit abusive experiences that occurred prior to 16 years of age. High levels of concurrent validity were demonstrated with the Parental Bonding Instrument (antipathy: r(s)=0.350-0.737, P<.001; neglect: r(s)=0.688-0.715, P<.001), and good convergent validity was shown with clinical case notes (sexual abuse: κ=0.526, P<.001; physical abuse: κ=0.394, P<.001). Psychosis patients' reports were also reasonably stable over a 7-year period (sexual abuse: κ=0.590, P<.01; physical abuse: κ=0.634, P<.001; antipathy: κ=0.492, P<.01; neglect: κ=0.432, P<.05). Additionally, their reports of childhood abuse were not associated with current severity of psychotic symptoms (sexual abuse: U=1768.5, P=.998; physical abuse: U=2167.5, P=.815; antipathy: U=2216.5, P=.988; neglect: U=1906.0, P=.835) or depressed mood (sexual abuse: χ(2)=0.634, P=.277; physical abuse: χ(2)=0.159, P=.419; antipathy: χ(2)=0.868, P=.229; neglect: χ(2)=0.639, P=.274). These findings provide justification for the use in future studies of retrospective reports of childhood abuse obtained from individuals with psychotic disorders.
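The stability and convergent-validity figures quoted above are kappa coefficients. A minimal Python sketch of unweighted Cohen's kappa follows (the study also used weighted kappas, which are not shown); the 0/1 "baseline" and "follow-up" codings below are entirely invented for illustration:

```python
def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical labels:
    chance-corrected agreement (po - pe) / (1 - pe)."""
    assert len(r1) == len(r2)
    n = len(r1)
    cats = sorted(set(r1) | set(r2))
    po = sum(a == b for a, b in zip(r1, r2)) / n            # observed agreement
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)  # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical presence/absence codings of a report at two time points
baseline  = [1, 1, 0, 0, 1, 0, 1, 0, 0, 0]
follow_up = [1, 1, 0, 0, 0, 0, 1, 1, 0, 0]
print(round(cohens_kappa(baseline, follow_up), 3))
```

Values in the 0.4-0.6 range, like several of those reported in the abstract, are conventionally read as moderate agreement.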

  12. Reliability Analysis of Phased Mission Systems Considering Sensitivity Analysis, Uncertainty and Common Cause Failure Analysis Using the GO-FLOW Methodology

    Muhammad Hashim


    Reliability is the probability that a device will perform its required function under stated conditions for a specified period of time. Common Cause Failures (CCFs), i.e., multiple failures, have long been recognized (U.S. NRC, 1975) as an important issue in Probabilistic Safety Assessment (PSA), and uncertainty and sensitivity analyses provide important information for the evaluation of system reliability. In this study two cases have been considered. In the first case, the author analyzed the reliability of a PWR safety system by the GO-FLOW methodology as an alternative to Fault Tree Analysis and Event Trees, because it is a success-oriented system analysis technique and comparatively easy to apply to the reliability analysis of complex systems. In the second case, sensitivity analysis was performed to prioritize the parameters with the largest contribution to system reliability, along with common cause failure analysis and uncertainty analysis. As an example of a phased mission system, the PWR containment spray system has been considered.

  13. Operation of Reliability Analysis Center (FY85-87)


    environmental conditions at the time of the reported failure as well as the exact nature of the failure. The diskette format (FMDR-21A) contains ... based upon the reliability and maintainability standards and tasks delineated in NAC R&M-STD-ROO010 (Reliability Program Requirements Selection). These ... characteristics, environmental conditions at the time of the reported failure, and the exact nature of the failure, which has been categorized as follows


    Zhao Jingyi; Zhuoru; Wang Yiqun


    According to the demand for high reliability of the primary cylinder of a hydraulic press, the reliability model of the primary cylinder is built after its reliability analysis. The stress of the primary cylinder is analyzed with the finite element software MARC, and the structural reliability of the cylinder based on a stress-strength model is predicted, which provides a reference for the design.
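The stress-strength interference model mentioned above has a simple closed form when stress and strength are independent and normally distributed. The sketch below illustrates that textbook case only; the numbers are hypothetical and not taken from the paper:

```python
import math

def stress_strength_reliability(mu_s, sd_s, mu_r, sd_r):
    """Stress-strength interference with independent normal stress
    (mu_s, sd_s) and strength (mu_r, sd_r):
        R = P(strength > stress) = Phi(beta),
        beta = (mu_r - mu_s) / sqrt(sd_r**2 + sd_s**2)."""
    beta = (mu_r - mu_s) / math.sqrt(sd_r ** 2 + sd_s ** 2)
    # standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(beta / math.sqrt(2.0)))

# Hypothetical cylinder-wall figures in MPa (illustration only)
print(round(stress_strength_reliability(300.0, 30.0, 450.0, 40.0), 4))
```

Here beta plays the role of a reliability index: the larger the strength margin relative to the combined scatter, the closer R is to 1.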

  15. Reliability and Security Analysis on Two-Cell Dynamic Redundant System

    Hongsheng Su


    The reliability and security of three types of two-cell dynamic redundant systems, which have been widely applied in modern railway signal systems, are analyzed, and their isomorphic Markov models are established. During modeling, several important factors, including common-cause failure, the coverage of diagnostic systems, online maintainability, and periodic inspection maintenance, as well as many failure modes, were considered, which makes the established model more credible. Through analysis and calculation of the reliability and security indexes of the three types of two-module dynamic redundant structures, a significant conclusion is reached: the safety and reliability of this kind of structure have an upper limit and cannot be improved without bound through hardware and software comparison methods when the failure and repair rates are fixed. Finally, simulations are performed, the calculation results of the three redundant systems are compared, the advantages, disadvantages and application scope of each are analyzed, which provides theoretical and technical support for railway signal equipment selection.

  16. Development of Markov model of emergency diesel generator for dynamic reliability analysis

    Jin, Young Ho; Choi, Sun Yeong; Yang, Joon Eon [Korea Atomic Energy Research Institute, Taejon (Korea)


    The EDG (Emergency Diesel Generator) of a nuclear power plant is one of the most important pieces of equipment in mitigating accidents. The FT (Fault Tree) method is widely used to assess the reliability of safety systems like an EDG in a nuclear power plant. This method, however, has limitations in exactly modeling the dynamic features of safety systems. We have therefore developed a Markov model to represent the stochastic process of dynamic systems whose states change over time. The Markov model enables us to develop a dynamic reliability model of the EDG. This model can represent all possible states of the EDG, in contrast to the FRANTIC code developed by the U.S. NRC for the reliability analysis of standby systems. To assess the regulation policy for the test interval, we performed two simulations based on generic data and plant-specific data of YGN 3, respectively, using the developed model. We also estimate the effects of various repair rates and of the fraction of starting failures due to demand shock on the reliability of the EDG. Finally, the aging effect is analyzed. (author). 23 refs., 19 figs., 9 tabs.
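The simplest instance of such a Markov model is a two-state up/down process, which has a closed-form availability; the paper's EDG model has many more states, but the two-state case already shows how failure rate λ and repair rate μ enter. The rates below are hypothetical:

```python
import math

def edg_availability(t, lam, mu):
    """Time-dependent availability of a two-state Markov model
    (up <-> down) with constant failure rate lam and repair rate mu,
    starting in the up state:
        A(t) = mu/(lam+mu) + lam/(lam+mu) * exp(-(lam+mu)*t)
    """
    s = lam + mu
    return mu / s + (lam / s) * math.exp(-s * t)

lam = 1e-3   # failures per hour (hypothetical)
mu = 0.1     # repairs per hour (hypothetical)
print(round(edg_availability(0.0, lam, mu), 4))   # fully available at t = 0
print(round(edg_availability(1e6, lam, mu), 4))   # approaches mu/(lam+mu)
```

As t grows, A(t) decays from 1 toward the steady-state value μ/(λ+μ), which is why increasing the repair rate (or shortening the test interval that detects latent failures) raises long-run availability.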

  17. Reliability Analysis for the Fatigue Limit State of the ASTRID Offshore Platform

    Vrouwenvelder, A.C.W.M.; Gostelie, E.M.


    A reliability analysis with respect to fatigue failure was performed for a concrete gravity platform designed for the Troll field. The reliability analysis was incorporated in the practical design-loop to gain more insight into the complex fatigue problem. In the analysis several parameters relating


    C.L. Liu; Z.Z. Lü; Y.L. Xu


    Reliability analysis methods based on the linear damage accumulation law (LDAL) and the load-life interference model are studied in this paper. According to the equal probability rule, the equivalent loads are derived, and a reliability analysis method based on the load-life interference model and a recurrence formula is constructed. In conjunction with a finite element analysis (FEA) program, the reliability of an aero engine turbine disk under low cycle fatigue (LCF) conditions has been analyzed. The results show that the turbine disk is safe and that the above reliability analysis methods are feasible.

  19. The European COPHES/DEMOCOPHES project: towards transnational comparability and reliability of human biomonitoring results.

    Schindler, Birgit Karin; Esteban, Marta; Koch, Holger Martin; Castano, Argelia; Koslitz, Stephan; Cañas, Ana; Casteleyn, Ludwine; Kolossa-Gehring, Marike; Schwedler, Gerda; Schoeters, Greet; Hond, Elly Den; Sepai, Ovnair; Exley, Karen; Bloemen, Louis; Horvat, Milena; Knudsen, Lisbeth E; Joas, Anke; Joas, Reinhard; Biot, Pierre; Aerts, Dominique; Lopez, Ana; Huetos, Olga; Katsonouri, Andromachi; Maurer-Chronakis, Katja; Kasparova, Lucie; Vrbík, Karel; Rudnai, Peter; Naray, Miklos; Guignard, Cedric; Fischer, Marc E; Ligocka, Danuta; Janasik, Beata; Reis, M Fátima; Namorado, Sónia; Pop, Cristian; Dumitrascu, Irina; Halzlova, Katarina; Fabianova, Eleonora; Mazej, Darja; Tratnik, Janja Snoj; Berglund, Marika; Jönsson, Bo; Lehmann, Andrea; Crettaz, Pierre; Frederiksen, Hanne; Nielsen, Flemming; McGrath, Helena; Nesbitt, Ian; De Cremer, Koen; Vanermen, Guido; Koppen, Gudrun; Wilhelm, Michael; Becker, Kerstin; Angerer, Jürgen


    COPHES/DEMOCOPHES has its origins in the European Environment and Health Action Plan of 2004 to "develop a coherent approach on human biomonitoring (HBM) in Europe". Within this twin-project it was targeted to collect specimens from 120 mother-child-pairs in each of the 17 participating European countries. These specimens were investigated for six biomarkers (mercury in hair; creatinine, cotinine, cadmium, phthalate metabolites and bisphenol A in urine). The results for mercury in hair are described in a separate paper. Each participating member state was requested to contract laboratories, for capacity building reasons ideally within its borders, carrying out the chemical analyses. To ensure comparability of analytical data a Quality Assurance Unit (QAU) was established which provided the participating laboratories with standard operating procedures (SOP) and with control material. This material was specially prepared from native, non-spiked, pooled urine samples and was tested for homogeneity and stability. Four external quality assessment exercises were carried out. Highly esteemed laboratories from all over the world served as reference laboratories. Web conferences after each external quality assessment exercise functioned as a new and effective tool to improve analytical performance, to build capacity and to educate less experienced laboratories. Of the 38 laboratories participating in the quality assurance exercises 14 laboratories qualified for cadmium, 14 for creatinine, 9 for cotinine, 7 for phthalate metabolites and 5 for bisphenol A in urine. In the last of the four external quality assessment exercises the laboratories that qualified for DEMOCOPHES performed the determinations in urine with relative standard deviations (low/high concentration) of 18.0/2.1% for cotinine, 14.8/5.1% for cadmium, 4.7/3.4% for creatinine. Relative standard deviations for the newly emerging biomarkers were higher, with values between 13.5 and 20.5% for bisphenol A and

  20. Methods for communication-network reliability analysis - Probabilistic graph reduction

    Shooman, Andrew M.; Kershenbaum, Aaron

    The authors have designed and implemented a graph-reduction algorithm for computing the k-terminal reliability of an arbitrary network with possibly unreliable nodes. The two contributions of the present work are a version of the delta-y transformation for k-terminal reliability and an extension of Satyanarayana and Wood's polygon-to-chain transformations to handle graphs with imperfect vertices. The exact algorithm is at least as fast as that of Satyanarayana and Wood, and as the simple algorithm without delta-y and polygon-to-chain transformations, for every problem considered. The exact algorithm runs in linear time on series-parallel graphs and is faster than the above-stated algorithms for large problems, which otherwise require exponential time. The approximate algorithms reduce the computation time for the network reliability problem by two to three orders of magnitude for large problems, while providing reasonably accurate answers in most cases.
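Graph reduction of this kind is built on repeatedly collapsing series and parallel structures. The sketch below shows just those two base reductions on independent component reliabilities; it is not the authors' full k-terminal algorithm (no delta-y or polygon-to-chain transformations, and nodes are assumed perfect):

```python
def series(*ps):
    """Series reduction: the path works only if every component works."""
    r = 1.0
    for p in ps:
        r *= p
    return r

def parallel(*ps):
    """Parallel reduction: the connection works if any component works."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

# Two-terminal reliability of a small series-parallel network:
# source --[two redundant 0.9 links in parallel]--[one 0.95 link]-- sink
r = series(parallel(0.9, 0.9), 0.95)
print(round(r, 4))
```

On a series-parallel graph these two rules alone reduce the network to a single equivalent edge, which is why the exact algorithm achieves linear time on that graph class.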

  1. Reliability modeling and analysis of smart power systems

    Karki, Rajesh; Verma, Ajit Kumar


    The volume presents the research work in understanding, modeling and quantifying the risks associated with different ways of implementing smart grid technology in power systems in order to plan and operate a modern power system with an acceptable level of reliability. Power systems throughout the world are undergoing significant changes creating new challenges to system planning and operation in order to provide reliable and efficient use of electrical energy. The appropriate use of smart grid technology is an important drive in mitigating these problems and requires considerable research acti

  2. Windfarm Generation Assessment for Reliability Analysis of Power Systems

    Barberis Negra, Nicola; Bak-Jensen, Birgitte; Holmstrøm, O.


    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays ...... in a reliability model and the generation of a windfarm is evaluated by means of sequential Monte Carlo simulation. Results are used to analyse how each of these factors influences the assessment, and why and when they should be included in the model....
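Sequential Monte Carlo simulation, as used in the abstract, builds a chronological history of component states. A minimal single-component sketch follows, with hypothetical turbine MTBF/MTTR figures (the paper's windfarm model is far richer, covering wind speed and multiple turbines):

```python
import random

def simulate_availability(mtbf, mttr, horizon, seed=0):
    """Sequential Monte Carlo sketch: draw alternating exponentially
    distributed up-times (mean mtbf) and down-times (mean mttr) and
    return the fraction of the horizon spent in the up state."""
    rng = random.Random(seed)
    t = up_time = 0.0
    up = True
    while t < horizon:
        dur = rng.expovariate(1.0 / (mtbf if up else mttr))
        dur = min(dur, horizon - t)   # clip the last interval at the horizon
        if up:
            up_time += dur
        t += dur
        up = not up
    return up_time / horizon

# Hypothetical turbine: MTBF 1900 h, MTTR 100 h -> availability near 0.95
print(round(simulate_availability(1900.0, 100.0, 1e7), 3))
```

The simulated fraction converges to the analytic value MTBF/(MTBF+MTTR) as the horizon grows; the chronological state sequence, rather than this summary number, is what a sequential method feeds into the system-level reliability model.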

  4. Technical information report: Plasma melter operation, reliability, and maintenance analysis

    Hendrickson, D.W. [ed.


    This document provides a technical report on the operability, reliability, and maintenance of a plasma melter for low-level waste vitrification, in support of the Hanford Tank Waste Remediation System (TWRS) Low-Level Waste (LLW) Vitrification Program. A process description is provided that minimizes maintenance and downtime and includes material and energy balances, equipment sizes and arrangement, startup/operation/maintenance/shutdown cycle descriptions, and the basis for scale-up to a 200 metric ton/day production facility. Operational requirements are provided, including utilities, feeds, labor, and maintenance. Equipment reliability estimates and maintenance requirements are provided, including a list of failure modes, responses, and consequences.

  5. Embedded mechatronic systems 1 analysis of failures, predictive reliability

    El Hami, Abdelkhalak


    In operation, embedded mechatronic systems are stressed by loads of different kinds: climatic (temperature, humidity), vibration, electrical and electromagnetic. The failure mechanisms these stresses induce in components should be identified and modeled for better control. AUDACE is a collaborative project of the cluster Mov'eo that addresses issues specific to the reliability of embedded mechatronic systems by analyzing the causes of failure of the components of onboard mechatronic systems. The goal of the project is to optimize the design of mechatronic devices for reliability. The projec

  6. Comparing Criteria for Attachment Disorders: Establishing Reliability and Validity in High-Risk Samples.

    Boris, Neil W.; Hinshaw-Fuselier, Sarah S.; Smyke, Anna T.; Scheeringa, Michael S.; Heller, Sherryl S.; Zeanah, Charles H.


    Objective: To determine whether published subtypes of attachment disorder can be reliably identified by trained clinicians reviewing data from high-risk populations and to investigate the relationship between disorder classification and standardized measures of attachment behavior. Method: Twenty or more children aged 18 to 48 months and their…

  7. Evaluating Written Patient Information for Eczema in German: Comparing the Reliability of Two Instruments, DISCERN and EQIP.

    Megan E McCool

    Patients actively seek information about how to cope with their health problems, but the quality of the information available varies. A number of instruments have been developed to assess the quality of patient information, though primarily in English. Little is known about the reliability of these instruments when applied to patient information in German. The objective of our study was to investigate and compare the reliability of two validated instruments, DISCERN and EQIP, in order to determine which of these instruments is better suited for a further study pertaining to the quality of information available to German patients with eczema. Two independent raters evaluated a random sample of 20 informational brochures in German. All the brochures addressed eczema as a disorder and/or therapy options and care. Intra-rater and inter-rater reliability were assessed by calculating intra-class correlation coefficients, agreement was tested with weighted kappas, and the correlation of the raters' scores for each instrument was measured with Pearson's correlation coefficient. DISCERN demonstrated substantial intra- and inter-rater reliability. It also showed slightly better agreement than EQIP. There was a strong correlation of the raters' scores for both instruments. The findings of this study support the reliability of both DISCERN and EQIP. However, based on the results of the inter-rater reliability, agreement and correlation analyses, we consider DISCERN to be the more precise tool for our project on patient information concerning the treatment and care of eczema.
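The weighted kappa statistic used above can be computed without specialist software. A minimal sketch of Cohen's linearly weighted kappa for two raters on an ordinal scale (the example ratings are invented for illustration, not the study's data):

```python
from collections import Counter

def weighted_kappa(rater_a, rater_b, categories):
    """Cohen's kappa with linear weights for two raters on an ordinal scale."""
    n = len(rater_a)
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    # Observed joint proportions
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater_a, rater_b):
        obs[idx[a]][idx[b]] += 1.0 / n
    pa = Counter(rater_a)
    pb = Counter(rater_b)
    # Linear disagreement weights: w_ij = |i - j| / (k - 1)
    num = den = 0.0
    for i, ci in enumerate(categories):
        for j, cj in enumerate(categories):
            w = abs(i - j) / (k - 1)
            num += w * obs[i][j]                     # observed disagreement
            den += w * (pa[ci] / n) * (pb[cj] / n)   # chance disagreement
    return 1.0 - num / den

# Two raters scoring five brochures on a hypothetical 1-3 quality scale
a = [1, 2, 3, 2, 1]
b = [1, 2, 3, 3, 1]
kappa = weighted_kappa(a, b, [1, 2, 3])
```

Perfect agreement yields kappa = 1; agreement no better than chance yields 0.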

  8. Structural characterization of genomes by large scale sequence-structure threading: application of reliability analysis in structural genomics

    Brunham Robert C


    Background We establish that the occurrence of protein folds among genomes can be accurately described with a Weibull function. Systems which exhibit Weibull character can be interpreted with reliability theory commonly used in engineering analysis. For instance, Weibull distributions are widely used in reliability, maintainability and safety work to model time-to-failure of mechanical devices, mechanisms, building constructions and equipment. Results We have found that the Weibull function describes protein fold distribution within and among genomes more accurately than conventional power functions which have been used in a number of structural genomic studies reported to date. It has also been found that the Weibull reliability parameter β for protein fold distributions varies between genomes and may reflect differences in rates of gene duplication in evolutionary history of organisms. Conclusions The results of this work demonstrate that reliability analysis can provide useful insights and testable predictions in the fields of comparative and structural genomics.
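As a rough illustration of estimating a Weibull shape parameter β from occurrence data, here is a median-rank regression sketch (one of several standard estimators; the paper does not specify its fitting method, and the synthetic sample below is invented):

```python
import math
import random

def fit_weibull(data):
    """Estimate Weibull shape (beta) and scale (eta) by median-rank regression:
    ln(-ln(1 - F_i)) is linear in ln(x) with slope beta."""
    xs = sorted(data)
    n = len(xs)
    pts = [(math.log(x), math.log(-math.log(1.0 - (i + 0.5) / n)))
           for i, x in enumerate(xs)]
    mx = sum(px for px, _ in pts) / n
    my = sum(py for _, py in pts) / n
    sxy = sum((px - mx) * (py - my) for px, py in pts)
    sxx = sum((px - mx) ** 2 for px, _ in pts)
    beta = sxy / sxx
    eta = math.exp(mx - my / beta)   # intercept of the fit is -beta * ln(eta)
    return beta, eta

# Recover known parameters (beta=2, eta=100) from a synthetic sample
# drawn via the inverse Weibull CDF
rng = random.Random(42)
sample = [100.0 * (-math.log(1.0 - rng.random())) ** 0.5 for _ in range(2000)]
beta, eta = fit_weibull(sample)
```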

  9. Probability maps as a measure of reliability for visibility analysis

    Joksić Dušan


    Digital terrain models (DTMs) represent segments of spatial databases related to the presentation of terrain features and landforms. Square grid elevation models (DEMs) have emerged as the most widely used structure during the past decade because of their simplicity and simple computer implementation. They have become an important segment of Topographic Information Systems (TIS), storing natural and artificial landscape in the form of digital models. This kind of data structure is especially suitable for morphometric terrain evaluation and analysis, which is very important in environmental and urban planning and Earth surface modeling applications. One of the most often used functionalities of Geographical Information Systems software packages is visibility, or viewshed, analysis of terrain. Visibility determination from analog topographic maps may be very exhausting, because of the large number of profiles that have to be extracted and compared. Terrain representation in the form of DEM databases facilitates this task. This paper describes a simple algorithm for terrain viewshed analysis using DEM database structures, taking into consideration the influence of uncertainties in such data on the results obtained. The concept of probability maps is introduced as a means for evaluating results, and is presented as a thematic display.
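A viewshed query of the kind described reduces to repeated line-of-sight profile tests over the DEM grid. A deliberately simplified sketch (nearest-cell profile sampling, no Earth-curvature or refraction correction, invented elevations):

```python
def visible(dem, observer, target, eye_height=1.7):
    """Line-of-sight test on a square-grid DEM: the target cell is visible
    if no intermediate cell rises above the straight sight line."""
    (r0, c0), (r1, c1) = observer, target
    z0 = dem[r0][c0] + eye_height
    z1 = dem[r1][c1]
    steps = max(abs(r1 - r0), abs(c1 - c0))
    for s in range(1, steps):
        t = s / steps
        r = round(r0 + t * (r1 - r0))   # nearest-cell sampling along the profile
        c = round(c0 + t * (c1 - c0))
        if dem[r][c] > z0 + t * (z1 - z0):
            return False                 # terrain pierces the sight line
    return True

dem = [
    [10, 10, 10, 10],
    [10, 10, 30, 10],   # a ridge cell on the middle row
    [10, 10, 10, 10],
]
blocked = visible(dem, (1, 0), (1, 3))   # ray passes through the ridge cell
clear = visible(dem, (0, 0), (0, 3))     # ray along the flat top row
```

A full viewshed repeats this test from one observer cell to every cell in range; propagating DEM elevation uncertainty through such tests is what yields the probability maps discussed above.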

  10. A Reliable Method for Rhythm Analysis during Cardiopulmonary Resuscitation

    U. Ayala


    Interruptions in cardiopulmonary resuscitation (CPR) compromise defibrillation success. However, CPR must be interrupted to analyze the rhythm because although current methods for rhythm analysis during CPR have high sensitivity for shockable rhythms, the specificity for nonshockable rhythms is still too low. This paper introduces a new approach to rhythm analysis during CPR that combines two strategies: a state-of-the-art CPR artifact suppression filter and a shock advice algorithm (SAA) designed to optimally classify the filtered signal. Emphasis is on designing an algorithm with high specificity. The SAA includes a detector for low electrical activity rhythms to increase the specificity, and a shock/no-shock decision algorithm based on a support vector machine classifier using slope and frequency features. For this study, 1185 shockable and 6482 nonshockable 9-s segments corrupted by CPR artifacts were obtained from 247 patients suffering out-of-hospital cardiac arrest. The segments were split into a training and a test set. For the test set, the sensitivity and specificity for rhythm analysis during CPR were 91.0% and 96.6%, respectively. This new approach shows an important increase in specificity without compromising the sensitivity when compared to previous studies.

  11. A Reliable Method for Rhythm Analysis during Cardiopulmonary Resuscitation

    Ayala, U.; Irusta, U.; Ruiz, J.; Eftestøl, T.; Kramer-Johansen, J.; Alonso-Atienza, F.; Alonso, E.; González-Otero, D.


    Interruptions in cardiopulmonary resuscitation (CPR) compromise defibrillation success. However, CPR must be interrupted to analyze the rhythm because although current methods for rhythm analysis during CPR have high sensitivity for shockable rhythms, the specificity for nonshockable rhythms is still too low. This paper introduces a new approach to rhythm analysis during CPR that combines two strategies: a state-of-the-art CPR artifact suppression filter and a shock advice algorithm (SAA) designed to optimally classify the filtered signal. Emphasis is on designing an algorithm with high specificity. The SAA includes a detector for low electrical activity rhythms to increase the specificity, and a shock/no-shock decision algorithm based on a support vector machine classifier using slope and frequency features. For this study, 1185 shockable and 6482 nonshockable 9-s segments corrupted by CPR artifacts were obtained from 247 patients suffering out-of-hospital cardiac arrest. The segments were split into a training and a test set. For the test set, the sensitivity and specificity for rhythm analysis during CPR were 91.0% and 96.6%, respectively. This new approach shows an important increase in specificity without compromising the sensitivity when compared to previous studies. PMID:24895621

  12. Resource allocation: sequential data collection for reliability analysis involving systems and component level data

    Anderson-Cook, Christine M [Los Alamos National Laboratory]


    In analyzing the reliability of complex systems, several types of data from full-system tests to component level tests are commonly available and are used. After a preliminary analysis, additional resources may be available to collect new data. The goal of resource allocation is to identify the best new data to collect to maximally improve the prediction of system reliability. While several definitions of 'maximally improve' are possible, we focus on reducing the uncertainty, or the width of the uncertainty interval, for the prediction of system reliability at a user-specified age(s). In this paper, we present an algorithm that allows us to estimate the anticipated improvement to the analysis with the addition of new data, based on the current understanding of all of the statistical model parameters. This quantitative assessment of the anticipated improvement can be helpful to justify the benefits of collecting new data. Additionally, by comparing different potential allocations, it is possible to determine what new data should be collected to improve our understanding of the response. This optimization takes into account the relative cost of different data types and can be based on flexible allocation options, or subject to logistical constraints.

  13. Reliability analysis of common hazardous waste treatment processes

    Waters, R.D. [Vanderbilt Univ., Nashville, TN (United States)


    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.
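The link between process safety factors and reliability levels can be sketched with a generic Monte Carlo experiment. The lognormal safety-factor model and the numbers below are illustrative assumptions, not the distributions used in the paper:

```python
import math
import random

def failure_probability(mean_sf, cov, n=200_000, seed=1):
    """Monte Carlo estimate of P(failure) when the realized safety factor
    is lognormal and failure means the safety factor drops below 1."""
    # Lognormal parameters from the mean and coefficient of variation
    sigma = math.sqrt(math.log(1.0 + cov ** 2))
    mu = math.log(mean_sf) - 0.5 * sigma ** 2
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n) if rng.lognormvariate(mu, sigma) < 1.0)
    return fails / n

# A nominal safety factor of 2 with 40% variability still fails roughly
# 5% of the time, far from what the deterministic factor alone suggests
pf = failure_probability(mean_sf=2.0, cov=0.4)
```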

  14. Windfarm generation assessment for reliability analysis of power systems

    Negra, N.B.; Holmstrøm, O.; Bak-Jensen, B.;


    Due to the fast development of wind generation in the past ten years, increasing interest has been paid to techniques for assessing different aspects of power systems with a large amount of installed wind generation. One of these aspects concerns power system reliability. Windfarm modelling plays...

  15. Architecture-Based Reliability Analysis of Web Services

    Rahmani, Cobra Mariam


    In a Service Oriented Architecture (SOA), the hierarchical complexity of Web Services (WS) and their interactions with the underlying Application Server (AS) create new challenges in providing a realistic estimate of WS performance and reliability. The current approaches often treat the entire WS environment as a black-box. Thus, the sensitivity…

  16. Statistical Analysis of Human Reliability of Armored Equipment

    LIU Wei-ping; CAO Wei-guo; REN Jing


    Human errors of seven types of armored equipment, which occur during the course of field test, are statistically analyzed. The human error-to-armored equipment failure ratio is obtained. The causes of human errors are analyzed. The distribution law of human errors is acquired. The ratio of human errors and human reliability index are also calculated.

  17. Exploratory factor analysis and reliability analysis with missing data: A simple method for SPSS users

    Bruce Weaver


    Missing data is a frequent problem for researchers conducting exploratory factor analysis (EFA) or reliability analysis. The SPSS FACTOR procedure allows users to select listwise deletion, pairwise deletion or mean substitution as a method for dealing with missing data. The shortcomings of these methods are well-known. Graham (2009) argues that a much better way to deal with missing data in this context is to use a matrix of expectation maximization (EM) covariances (or correlations) as input for the analysis. SPSS users who have the Missing Values Analysis add-on module can obtain vectors of EM means and standard deviations plus EM correlation and covariance matrices via the MVA procedure. But unfortunately, MVA has no /MATRIX subcommand, and therefore cannot write the EM correlations directly to a matrix dataset of the type needed as input to the FACTOR and RELIABILITY procedures. We describe two macros that (in conjunction with an intervening MVA command) carry out the data management steps needed to create two matrix datasets, one containing EM correlations and the other EM covariances. Either of those matrix datasets can then be used as input to the FACTOR procedure, and the EM correlations can also be used as input to RELIABILITY. We provide an example that illustrates the use of the two macros to generate the matrix datasets and how to use those datasets as input to the FACTOR and RELIABILITY procedures. We hope that this simple method for handling missing data will prove useful to both students and researchers who are conducting EFA or reliability analysis.

  18. The Barthel Index: comparing inter-rater reliability between nurses and doctors in an older adult rehabilitation unit.

    Hartigan, Irene


    To ensure accuracy in recording the Barthel Index (BI) in older people, it is essential to determine who is best placed to administer the index. The aim of this study was to compare doctors' and nurses' reliability in scoring the BI.

  19. Learning effect of isokinetic measurements in healthy subjects, and reliability and comparability of Biodex and Lido dynamometers

    Lund, Hans; Søndergaard, K; Zachariassen, T


    The aim of this study was to examine the learning effect during a set of isokinetic measurements, to evaluate the reliability of the Biodex System 3 PRO dynamometer, and to compare the Biodex System 3 PRO and the Lido Active dynamometers on both extension and flexion over the elbow and the knee...... extension (P = 0.18) and elbow extension (P = 0.63). However, elbow flexion showed a 14.8% (95% CI: 11.2-18.4%; P = 0.0001) higher peak torque on Biodex. In conclusion, no learning effect was observed and the Biodex proved to be a highly reliable isokinetic dynamometer. A difference was observed when...

  20. Reliability Analysis for Tunnel Supports System by Using Finite Element Method

    E. Bukaçi


    Reliability analysis is a method that can be used in almost any geotechnical engineering problem. Using this method requires knowledge of parameter uncertainties, which can be expressed by their standard deviation values. Applying reliability analysis to tunnel support design yields a range of safety factors, from which the probability of failure can be calculated. The problem becomes more complex when this analysis is performed with numerical methods such as the Finite Element Method. This paper shows how reliability analysis can be performed for tunnel support design, using the Point Estimate Method to calculate the reliability index. As a case study, one of the energy tunnels at the Fan Hydropower plant in Rrëshen, Albania, is chosen. Values of the factor of safety and the probability of failure are calculated, and some suggestions for using reliability analysis with numerical methods are given.
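The Point Estimate Method mentioned above (in Rosenblueth's two-point form) can be sketched as follows; the resistance/load performance function and its statistics are invented for illustration:

```python
import itertools
import math

def pem_reliability_index(g, means, stds):
    """Rosenblueth's two-point estimate method: evaluate the performance
    function g at the 2^n corners mu_i +/- sigma_i (equal weights), then
    return beta = mu_g / sigma_g."""
    n = len(means)
    vals = []
    for signs in itertools.product((-1.0, 1.0), repeat=n):
        x = [m + s * sd for m, s, sd in zip(means, signs, stds)]
        vals.append(g(x))
    mu = sum(vals) / len(vals)
    var = sum((v - mu) ** 2 for v in vals) / len(vals)
    return mu / math.sqrt(var)

# Hypothetical performance function g = R - S (support resistance minus load)
beta = pem_reliability_index(lambda x: x[0] - x[1],
                             means=[300.0, 200.0], stds=[30.0, 40.0])
```

For a linear g the method is exact: beta = 100 / sqrt(30² + 40²) = 2.0. In the FEM setting, g(x) would instead be a call to the numerical model, which is why the method's 2^n evaluations matter.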

  1. Reliability Analysis and Standardization of Spacecraft Command Generation Processes

    Meshkat, Leila; Grenander, Sven; Evensen, Ken


    • In order to reduce commanding errors that are caused by humans, we create an approach and corresponding artifacts for standardizing the command generation process and conducting risk management during the design and assurance of such processes. • The literature review conducted during the standardization process revealed that very few atomic level human activities are associated with even a broad set of missions. • Applicable human reliability metrics for performing these atomic level tasks are available. • The process for building a "Periodic Table" of Command and Control Functions as well as Probabilistic Risk Assessment (PRA) models is demonstrated. • The PRA models are executed using data from human reliability data banks. • The Periodic Table is related to the PRA models via Fault Links.

  2. Mapping Green Spaces in Bishkek—How Reliable can Spatial Analysis Be?

    Peter Hofmann


    Within urban areas, green spaces play a critically important role in the quality of life. They have remarkable impact on the local microclimate and the regional climate of the city. Quantifying the ‘greenness’ of urban areas allows comparing urban areas at several levels, as well as monitoring the evolution of green spaces in urban areas, thus serving as a tool for urban and developmental planning. Different categories of vegetation have different impacts on recreation potential and microclimate, as well as on the individual perception of green spaces. However, when quantifying the ‘greenness’ of urban areas the reliability of the underlying information is important in order to qualify analysis results. The reliability of geo-information derived from remote sensing data is usually assessed by ground truth validation or by comparison with other reference data. When applying methods of object based image analysis (OBIA) and fuzzy classification, the degrees of fuzzy membership per object in general describe to what degree an object fits (prototypical) class descriptions. Thus, analyzing the fuzzy membership degrees can contribute to the estimation of reliability and stability of classification results, even when no reference data are available. This paper presents an object based method using fuzzy class assignments to outline and classify three different classes of vegetation from GeoEye imagery. The classification result, its reliability and stability are evaluated using the reference-free parameters Best Classification Result and Classification Stability as introduced by Benz et al. in 2004 and implemented in the software package eCognition. To demonstrate the application potentials of results a scenario for quantifying urban ‘greenness’ is presented.

  3. Reliability importance analysis of Markovian systems at steady state using perturbation analysis

    Phuc Do Van; Barros, Anne; Berenguer, Christophe [Institut Charles Delaunay - FRE CNRS 2848, Systems Modeling and Dependability Group, Universite de technologie de Troyes, 12, rue Marie Curie, BP 2060-10010 Troyes cedex (France)]


    Sensitivity analysis has been primarily defined for static systems, i.e. systems described by combinatorial reliability models (fault or event trees). Several structural and probabilistic measures have been proposed to assess the components' importance. For dynamic systems including inter-component and functional dependencies (cold spare, shared load, shared resources, etc.), and described by Markov models or, more generally, by discrete event dynamic systems models, the problem of sensitivity analysis remains widely open. In this paper, the perturbation method is used to estimate an importance factor, called the multi-directional sensitivity measure, in the framework of Markovian systems. Some numerical examples are introduced to show why this method offers a promising tool for steady-state sensitivity analysis of Markov processes in reliability studies.
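The idea of steady-state sensitivity for a Markovian system can be illustrated on the smallest case, a two-state repairable component, by perturbing the generator and recomputing the stationary distribution (a generic finite-difference sketch, not the paper's multi-directional estimator):

```python
def steady_state(Q):
    """Stationary distribution of a CTMC generator Q (pi Q = 0, sum pi = 1),
    solved by Gaussian elimination on Q^T with a normalization row."""
    n = len(Q)
    A = [[Q[j][i] for j in range(n)] for i in range(n)]  # transpose of Q
    A[-1] = [1.0] * n                                    # replace last row: sum(pi) = 1
    b = [0.0] * (n - 1) + [1.0]
    for col in range(n):                                 # elimination with pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):                       # back substitution
        x[r] = (b[r] - sum(A[r][c] * x[c] for c in range(r + 1, n))) / A[r][r]
    return x

def availability(lam, mu):
    # Two-state repairable component: state 0 = up, state 1 = down
    Q = [[-lam, lam], [mu, -mu]]
    return steady_state(Q)[0]

lam, mu, h = 0.01, 0.5, 1e-6
A = availability(lam, mu)                    # analytic: mu / (lam + mu)
dA_dlam = (availability(lam + h, mu) -       # finite-difference sensitivity
           availability(lam - h, mu)) / (2 * h)
```

For this model the closed form is dA/dλ = -μ/(λ+μ)², so the numerical perturbation can be checked directly; the same recipe scales to larger Markov models where no closed form exists.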

  4. Analysis of System Reliability in Manufacturing Cell Based on Triangular Fuzzy Number

    ZHANG Caibo; HAN Botang; SUN Changsen; XU Chunjie


    Due to the lack of test data and field data in reliability research during the design stage of a manufacturing cell system, the difficulty of studying its reliability is increased. In order to deal with the deficient data and the uncertainty arising from analysis and judgment, this paper discusses a method for studying the reliability of a manufacturing cell system through fuzzy fault tree analysis based on triangular fuzzy numbers. Finally, a calculation case indicates that the method has great significance for ascertaining reliability indexes and establishing maintenance strategies for manufacturing cell systems.

  5. Reliability Analysis of Bearing Capacity of Large-Diameter Piles under Osterberg Test

    Lei Nie


    This study gives a reliability analysis of the bearing capacity of large-diameter piles under the Osterberg test. The limit state equation of dimensionless random variables is utilized in the reliability analysis of the vertical bearing capacity of large-diameter piles based on Osterberg loading tests. The reliability index and the resistance partial coefficient under the current specifications are calculated using the calibration method. The results show that the reliability index of large-diameter piles is correlated with the load effect ratio and is smaller than that of ordinary piles; a resistance partial coefficient of 1.53 is proper in the design of large-diameter piles.

  6. Local probabilistic sensitivity measures for comparing FORM and Monte Carlo calculations illustrated with dike ring reliability calculations

    Cooke, Roger M.; van Noortwijk, Jan M.


    We define local probabilistic sensitivity measures as proportional to ∂E(X_i | Z = z)/∂z, where Z is a function of random variables X_1, …, X_n. These measures are local in that they depend only on the neighborhood of Z = z, but unlike other local sensitivity measures, the local probabilistic sensitivity of X_i does not depend on values of other input variables. For the independent linear normal model, or indeed for any model for which X_i has linear regression on Z, the above measure equals σ_{X_i} ρ(Z, X_i)/σ_Z. When linear regression does not hold, the new sensitivity measures can be compared with the correlation coefficients to indicate the degree of departure from linearity. We say that Z is probabilistically dissonant in X_i at Z = z if Z is increasing (decreasing) in X_i at z, but probabilistically decreasing (increasing) at z. Probabilistic dissonance is rather common in complicated models. The new measures are able to pick up this probabilistic dissonance. These notions are illustrated with data from an ongoing uncertainty analysis of dike ring reliability.
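For the independent linear normal model mentioned in the abstract, the measure has a closed form that can be cross-checked against a sampled regression slope. A sketch under the assumption Z = Σ a_j X_j with independent zero-mean normal inputs (coefficients and variances invented):

```python
import random

def local_prob_sensitivity(a, sigma, i):
    """For Z = sum_j a_j X_j with independent normal X_j,
    d E(X_i | Z = z)/dz = a_i * sigma_i^2 / sigma_Z^2,
    which equals sigma_{X_i} * rho(Z, X_i) / sigma_Z."""
    var_z = sum((aj * sj) ** 2 for aj, sj in zip(a, sigma))
    return a[i] * sigma[i] ** 2 / var_z

# Cross-check the closed form against an empirical regression slope
rng = random.Random(7)
a, sigma = [2.0, -1.0, 0.5], [1.0, 3.0, 2.0]
xs, zs = [], []
for _ in range(50_000):
    x = [rng.gauss(0.0, s) for s in sigma]
    xs.append(x[1])
    zs.append(sum(aj * xj for aj, xj in zip(a, x)))
mz = sum(zs) / len(zs)
mx = sum(xs) / len(xs)
cov = sum((z - mz) * (x - mx) for z, x in zip(zs, xs)) / len(zs)
var = sum((z - mz) ** 2 for z in zs) / len(zs)
slope = cov / var   # empirical d E(X_1 | Z = z)/dz, constant because Z is linear
```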

  7. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    Eom, H.S.; Kim, J.H.; Lee, J.C.; Choi, Y.R.; Moon, S.S


    Reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system which is under development in our current project. The contents of this report are: 1. Survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries including the nuclear industry, and we selected a few which are suitable for our robot system: fault tree analysis, failure mode and effect analysis, reliability block diagram, Markov model, combinational method, and simulation method. 2. Survey of the characteristics of robot systems which distinguish them from other systems and which are important to the analysis. 3. Survey of the nuclear environmental factors which affect the reliability and safety analysis of robot systems. 4. Collection of case studies of robot reliability and safety analysis performed in foreign countries. The analysis results of this survey will be applied to the improvement of reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.

  8. Acquisition and statistical analysis of reliability data for I and C parts in plant protection system

    Lim, T. J.; Byun, S. S.; Han, S. H.; Lee, H. J.; Lim, J. S.; Oh, S. J.; Park, K. Y.; Song, H. S. [Soongsil Univ., Seoul (Korea)


    This project has been performed in order to construct I and C part reliability databases for detailed analysis of the plant protection system and to develop a methodology for analysing trip set point drifts. A reliability database for the I and C parts of the plant protection system is required to perform the detailed analysis. First, we have developed an electronic part reliability prediction code based on MIL-HDBK-217F. Then we have collected generic reliability data for the I and C parts in the plant protection system. A statistical analysis procedure has been developed to process the data, and the generic reliability database has been constructed. We have also collected plant-specific reliability data for the I and C parts in the plant protection system for the YGN 3,4 and UCN 3,4 units. A plant-specific reliability database for I and C parts has been developed by the Bayesian procedure. We have also developed a statistical analysis procedure for set point drift, and performed analysis of drift effects for the trip set point. The basis for the detailed analysis can be provided from the reliability database for the PPS I and C parts. The safety of the KSNP and succeeding NPPs can be proved by reducing the uncertainty of PSA. Economic and efficient operation of NPPs can be achieved by optimizing the test period to reduce the utility's burden. 14 refs., 215 figs., 137 tabs. (Author)

  9. Mathematical modeling and reliability analysis of a 3D Li-ion battery



    The three-dimensional (3D) Li-ion battery presents an effective solution to issues affecting its two-dimensional counterparts, as it is able to attain high energy capacities for the same areal footprint without sacrificing power density. A 3D battery has key structural features extending in and fully utilizing 3D space, allowing it to achieve greater reliability and longevity. This study applies an electrochemical-thermal coupled model to a checkerboard array of alternating positive and negative electrodes in a 3D architecture with either square or circular electrodes. The mathematical model comprises the transient conservation of charge, species, and energy together with electroneutrality, constitutive relations and relevant initial and boundary conditions. A reliability analysis carried out to simulate malfunctioning of either a positive or negative electrode reveals that although there are deviations in electrochemical and thermal behavior for electrodes adjacent to the malfunctioning electrode as compared to that in a fully-functioning array, there is little effect on electrodes further away, demonstrating the redundancy that a 3D electrode array provides. The results demonstrate that implementation of 3D batteries allows them to reliably and safely deliver power even if a component malfunctions, a strong advantage over conventional 2D batteries.

  10. Reliability and availability analysis of dependent-dynamic systems with DRBDs

    Distefano, Salvatore; Puliafito, Antonio [University of Messina, Department of Mathematics, Engineering Faculty, Contrada di Dio, S. Agata, 98166 Messina (Italy)]


    Reliability/availability evaluation is an important, often indispensable, step in designing and analyzing (critical) systems, whose importance is constantly growing. When the complexity of a system is high, dynamic effects can arise or become significant. The system might be affected by dependent, cascade, on-demand and/or common cause failures, its units could interfere (load sharing, inter/sequence-dependency), and so on. It is also of great interest to evaluate redundancy and maintenance policies but, since dynamic behaviors usually do not satisfy the stochastic independence assumption, notations such as reliability block diagrams (RBDs), fault trees (FTs) or reliability graphs (RGs) become approximated/simplified techniques, unable to capture dynamic-dependent behaviors. To overcome this problem we developed a new formalism derived from RBDs: the dynamic RBDs (DRBDs). In this paper we explain how the DRBD notation is able to adequately model and therefore analyze dynamic-dependent behaviors and complex systems. Particular emphasis is given to the modeling and the analysis phases, from both the theoretical and the practical points of view. Several case studies of dynamic-dependent systems, selected from the literature and related to different application fields, are proposed. In this way we also compare the DRBD approach with other methodologies, demonstrating its effectiveness.

  11. Reliability and Sensitivity Analysis of Cast Iron Water Pipes for Agricultural Food Irrigation

    Yanling Ni


    This study aims to investigate the reliability and sensitivity of cast iron water pipes for agricultural food irrigation. The Monte Carlo simulation method is used for fracture assessment and reliability analysis of cast iron pipes for agricultural food irrigation. Fracture toughness is considered as a limit state function for corrosion-affected cast iron pipes. Then the influence of failure mode on the probability of pipe failure is discussed. Sensitivity analysis is also carried out to show the effect of changing basic parameters on the reliability and lifetime of the pipe. The analysis results show that the applied methodology can consider different random variables for estimating the lifetime of the pipe, and it can also provide scientific guidance for rehabilitation and maintenance plans for agricultural food irrigation. In addition, the results of the failure and reliability analysis in this study can be useful for the design of more reliable new pipeline systems for agricultural food irrigation.

  12. Structural Reliability Analysis for Implicit Performance with Legendre Orthogonal Neural Network Method

    Lirong Sha; Tongyu Wang


    In order to evaluate the failure probability of a complicated structure, the structural responses usually need to be estimated by some numerical analysis methods such as the finite element method (FEM). The response surface method (RSM) can be used to reduce the computational effort required for reliability analysis when the performance functions are implicit. However, the conventional RSM is time-consuming or cumbersome if the number of random variables is large. This paper proposes a Legendre orthogonal neural network (LONN)-based RSM to estimate the structural reliability. In this method, the relationship between the random variables and structural responses is established by a LONN model. Then the LONN model is connected to a reliability analysis method, i.e. the first-order reliability method (FORM), to calculate the failure probability of the structure. Numerical examples show that the proposed approach is applicable to structural reliability analysis, as well as to structures with implicit performance functions.

  13. Non-probabilistic fuzzy reliability analysis of pile foundation stability by interval theory


    Randomness and fuzziness are among the attributes of the influential factors for stability assessment of pile foundations. According to these two characteristics, the triangular fuzzy number analysis approach was introduced to determine the probability distribution function of mechanical parameters. Then the performance function for reliability analysis was constructed based on the study of the bearing mechanism of pile foundations, and a way to calculate interval values of the performance function was developed using an improved interval-truncation approach and the operation rules of interval numbers. Afterwards, the non-probabilistic fuzzy reliability analysis method was applied to assessing the pile foundation, from which a method was presented for non-probabilistic fuzzy reliability analysis of pile foundation stability by interval theory. Finally, the probability distribution curve of non-probabilistic fuzzy reliability indexes of a practical pile foundation was obtained. Its failure possibility is 0.91%, which shows that the pile foundation is stable and reliable.

  14. Reliability analysis of a wastewater treatment plant using fault tree analysis and Monte Carlo simulation.

    Taheriyoun, Masoud; Moradinejad, Saber


    The reliability of a wastewater treatment plant is a critical issue when the effluent is reused or discharged to water resources. The main factors affecting the performance of a wastewater treatment plant are variation of the influent, inherent variability in the treatment processes, deficiencies in design, mechanical equipment, and operational failures. Thus, meeting the established reuse/discharge criteria requires assessment of plant reliability. Among the many techniques developed in system reliability analysis, fault tree analysis (FTA) is one of the popular and efficient methods. FTA is a top-down, deductive failure analysis in which an undesired state of a system is analyzed. In this study, the problem of reliability was studied for the Tehran West Town wastewater treatment plant. This plant is a conventional activated sludge process, and the effluent is reused in landscape irrigation. The fault tree diagram was established with the violation of the allowable effluent BOD as the top event, and the deficiencies of the system were identified based on the developed model. Some basic events are operator's mistakes, physical damage, and design problems. The analytical methods are minimal cut sets (based on numerical probability) and Monte Carlo simulation. Basic event probabilities were calculated according to available data and experts' opinions. The results showed that human factors, especially human error, had a great effect on top event occurrence. The mechanical, climate, and sewer system factors were in the subsequent tier. The literature shows that FTA has seldom been applied in past wastewater treatment plant (WWTP) risk analysis studies. Thus, the FTA model developed in this study considerably improves insight into causal failure analysis of a WWTP. It provides an efficient tool for WWTP operators and decision makers to achieve the standard limits in wastewater reuse and discharge to the environment.
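The minimal-cut-set calculation referred to above can be sketched for independent basic events via inclusion-exclusion. The cut sets and probabilities below are hypothetical, loosely echoing the failure causes named in the abstract:

```python
from itertools import combinations

def top_event_probability(cut_sets, p):
    """Exact top-event probability from minimal cut sets of independent
    basic events, via inclusion-exclusion over the cut sets."""
    total = 0.0
    for k in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, k):
            events = set().union(*combo)        # union of the chosen cut sets
            term = 1.0
            for e in events:
                term *= p[e]
            total += (-1) ** (k + 1) * term     # alternating-sign correction
    return total

# Hypothetical fault tree: operator error alone, or physical damage
# combined with a design flaw, violates the effluent BOD limit
cut_sets = [{"operator_error"}, {"physical_damage", "design_flaw"}]
p = {"operator_error": 0.05, "physical_damage": 0.1, "design_flaw": 0.2}
top = top_event_probability(cut_sets, p)   # 0.05 + 0.02 - 0.001 = 0.069
```

For large trees the exact expansion grows exponentially, which is why practical tools truncate it (rare-event approximation) or switch to Monte Carlo simulation, as the study does.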

  15. Reliability of the ATD Angle in Dermatoglyphic Analysis.

    Brunson, Emily K; Hohnan, Darryl J; Giovas, Christina M


    The "ATD" angle is a dermatoglyphic trait formed by drawing lines between the triradii below the first and last digits and the most proximal triradius on the hypothenar region of the palm. This trait has been widely used in dermatoglyphic studies, but several researchers have questioned its utility, specifically whether or not it can be measured reliably. The purpose of this research was to examine the measurement reliability of this trait. Finger and palm prints were taken using the carbon paper and tape method from the right and left hands of 100 individuals. Each "ATD" angle was read twice, at different times, by Reader A, using a goniometer and a magnifying glass, and three times by Reader B, using Adobe Photoshop. Intraclass correlation coefficients were estimated for the intra- and inter-reader measurements of the "ATD" angles. Reader A was able to quantify ATD angles on 149 out of 200 prints (74.5%), and Reader B on 179 out of 200 prints (89.5%). Both readers agreed on whether an angle existed on a print 89.8% of the time for the right hand and 78.0% for the left. Intra-reader correlations were 0.97 or greater for both readers. Inter-reader correlations for "ATD" angles measured by both readers ranged from 0.92 to 0.96. These results suggest that the "ATD" angle can be measured reliably, and further imply that measurement using a software program may provide an advantage over other methods.

  16. Comparative genomic analysis of eutherian interferon-γ-inducible GTPases.

    Premzl, Marko


    The interferon-γ-inducible GTPases, IFGGs, are intracellular proteins involved in the immune response against pathogens. A comprehensive comparative genomic review and analysis of eutherian IFGGs was carried out using public genomic sequences. The 64 eutherian IFGG genes were examined in detail and annotated. The eutherian IFGG promoter types were first catalogued, followed by a phylogenetic analysis of eutherian IFGGs, which described five major IFGG clusters. The patterns of differential gene expansions and protein regions that may regulate IFGG catalytic features suggested a new classification of eutherian IFGGs. This mini-review has also provided new tests of the reliability of public genomic sequences, as well as tests of protein molecular evolution.

  17. Markovian reliability analysis under uncertainty with an application on the shutdown system of the Clinch River Breeder Reactor

    Papazoglou, I A; Gyftopoulos, E P


    A methodology for the assessment of the uncertainties about the reliability of nuclear reactor systems described by Markov models is developed, and the uncertainties about the probability of loss of coolable core geometry (LCG) of the Clinch River Breeder Reactor (CRBR) due to shutdown system failures are assessed. Uncertainties are expressed by treating the failure rates, the repair rates, and all other input variables of the reliability analysis as random variables distributed according to known probability density functions (pdf). The pdf of the reliability is then calculated by the moment-matching technique. Two methods have been employed for determining the moments of the reliability: Monte Carlo simulation and Taylor-series expansion. These methods are adapted to Markovian problems and compared for accuracy and efficiency.
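
    The moment-matching idea can be illustrated on the simplest possible Markov model: a two-state (working/failed) component with reliability R(t) = exp(-λt), where the failure rate λ is itself a random variable. The rates and mission time below are illustrative, not CRBR data; the sketch compares the two moment-determination methods the record names.

```python
import math
import random

# Hypothetical inputs: mean failure rate (1/h), its std dev, mission time (h).
mu, sigma, t = 1.0e-3, 2.0e-4, 1000.0

def monte_carlo_moments(n=200_000, seed=7):
    """Sample lambda, evaluate R(t) = exp(-lambda*t), return mean and variance."""
    rng = random.Random(seed)
    vals = [math.exp(-max(rng.gauss(mu, sigma), 0.0) * t) for _ in range(n)]
    mean = sum(vals) / n
    var = sum((v - mean) ** 2 for v in vals) / (n - 1)
    return mean, var

def taylor_moments():
    """Second-order Taylor expansion of f(lambda) = exp(-lambda*t) about mu."""
    r = math.exp(-mu * t)
    mean = r * (1.0 + 0.5 * (sigma * t) ** 2)   # f(mu) + 0.5*f''(mu)*sigma^2
    var = (r * t * sigma) ** 2                  # f'(mu)^2 * sigma^2
    return mean, var

mc_mean, mc_var = monte_carlo_moments()
ty_mean, ty_var = taylor_moments()
print(f"mean: MC={mc_mean:.4f} Taylor={ty_mean:.4f}; var: MC={mc_var:.5f} Taylor={ty_var:.5f}")
```

    For small coefficients of variation the two methods agree closely; as the spread of λ grows, the truncated Taylor series degrades while Monte Carlo remains exact up to sampling error, which is the accuracy/efficiency trade-off the record refers to.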

  18. A disjoint algorithm for seismic reliability analysis of lifeline networks


    The algorithm is based on constructing a disjoint set of the minimal paths in a network system. In this paper, cubic notation was used to describe the logic function of a network in a well-balanced state, and the sharp-product operation was then used to construct the disjoint minimal path set of the network. A computer program has been developed, and when combined with decomposition technology, the reliability of a general lifeline network can be effectively and automatically calculated.
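
    Path-set network reliability can be sketched on a toy example. The network below is the classic five-edge bridge network, not one of the paper's lifeline networks, and the sharp-product disjointing machinery is replaced here by plain inclusion-exclusion over the minimal paths, which yields the same union probability; brute-force state enumeration serves as a check.

```python
from itertools import combinations, product

# Toy bridge network: edges a..e, each up independently with probability 0.9.
p = {"a": 0.9, "b": 0.9, "c": 0.9, "d": 0.9, "e": 0.9}
# Minimal source-to-sink paths of the bridge network.
min_paths = [{"a", "b"}, {"c", "d"}, {"a", "e", "d"}, {"c", "e", "b"}]

def reliability_enumeration():
    """Exact reliability by enumerating all 2^5 edge states."""
    edges = sorted(p)
    total = 0.0
    for states in product([True, False], repeat=len(edges)):
        up = {e for e, s in zip(edges, states) if s}
        prob = 1.0
        for e, s in zip(edges, states):
            prob *= p[e] if s else 1.0 - p[e]
        if any(path <= up for path in min_paths):
            total += prob
    return total

def reliability_inclusion_exclusion():
    """Union of the path events via inclusion-exclusion over path subsets."""
    total = 0.0
    for k in range(1, len(min_paths) + 1):
        for subset in combinations(min_paths, k):
            union = set().union(*subset)
            term = 1.0
            for e in union:
                term *= p[e]
            total += (-1) ** (k + 1) * term
    return total

print(f"{reliability_enumeration():.6f} {reliability_inclusion_exclusion():.6f}")
```

    Disjoint-products algorithms such as the one in this record exist precisely because inclusion-exclusion blows up combinatorially on large networks: a disjoint path set needs only one additive term per path.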

  19. Reliability Analysis of Timber Structures through NDT Data Upgrading

    Sousa, Hélder; Sørensen, John Dalsgaard; Kirkegaard, Poul Henning

    The first part of this document presents, in chapter 2, a description of timber characteristics and commonly used NDT and MDT for timber elements. Stochastic models for timber properties and damage accumulation models are also referred to. According to timber's properties, a framework is proposed... for reliability calculation. In chapter 4, updating methods are conceptualized and defined. Special attention is drawn to Bayesian methods and their implementation. A topic on updating based on inspection of deterioration is also provided. State of the art definitions and proposed measurement indices...

  20. A Comparative Study on Error Analysis

    Wu, Xiaoli; Zhang, Chun


    Title: A Comparative Study on Error Analysis. Subtitle: Belgian (L1) and Danish (L1) learners’ use of Chinese (L2) comparative sentences in written production. Abstract: Making errors is an inevitable and necessary part of learning. The collection, classification and analysis... the occurrence of errors either in linguistic or pedagogical terms. The purpose of the current study is to demonstrate the theoretical and practical relevance of the error analysis approach in CFL by investigating two cases - (1) Belgian (L1) learners’ use of Chinese (L2) comparative sentences in written production... Finally, pedagogical implications for CFL are discussed and future research is suggested. Keywords: error analysis, comparative sentences, comparative structure ‘bǐ - 比’, Chinese as a foreign language (CFL), written production...

  1. Methodology for reliability allocation based on fault tree analysis and dualistic contrast

    TONG Lili; CAO Xuewu


    Reliability allocation is a difficult multi-objective optimization problem. This paper presents a methodology for reliability allocation that can be applied to determine the reliability characteristics of reactor systems or subsystems. The dualistic contrast, known as one of the most powerful tools for optimization problems, is applied to the reliability allocation model of a typical system in this article, and fault tree analysis, deemed one of the effective methods of reliability analysis, is also adopted. Thus a failure rate allocation model based on fault tree analysis and dualistic contrast is achieved. An application to the emergency diesel generator in a nuclear power plant is given to illustrate the proposed method.


    R.K. Agnihotri


    Full Text Available The present paper deals with the reliability analysis of a boiler system used in the garment industry. The system consists of a single boiler unit, which plays an important role in the garment industry. Using the regenerative point technique with a Markov renewal process, various reliability characteristics of interest are obtained.

  3. Reliability of three-dimensional gait analysis in cervical spondylotic myelopathy.

    McDermott, Ailish


    Gait impairment is one of the primary symptoms of cervical spondylotic myelopathy (CSM). Detailed assessment is possible using three-dimensional gait analysis (3DGA); however, the reliability of 3DGA for this population has not been established. The aim of this study was to evaluate the test-retest reliability of temporal-spatial, kinematic and kinetic parameters in a CSM population.

  4. Convergence among Data Sources, Response Bias, and Reliability and Validity of a Structured Job Analysis Questionnaire.

    Smith, Jack E.; Hakel, Milton D.


    Examined are questions pertinent to the use of the Position Analysis Questionnaire: Who can use the PAQ reliably and validly? Must one rely on trained job analysts? Can people having no direct contact with the job use the PAQ reliably and validly? Do response biases influence PAQ responses? (Author/KC)

  5. Risk and reliability analysis theory and applications : in honor of Prof. Armen Der Kiureghian


    This book presents a unique collection of contributions from some of the foremost scholars in the field of risk and reliability analysis. Combining the most advanced analysis techniques with practical applications, it is one of the most comprehensive and up-to-date books available on risk-based engineering. All the fundamental concepts needed to conduct risk and reliability assessments are covered in detail, providing readers with a sound understanding of the field and making the book a powerful tool for students and researchers alike. This book was prepared in honor of Professor Armen Der Kiureghian, one of the fathers of modern risk and reliability analysis.

  6. Reliability analysis of gravity dams by response surface method

    Humar, Nina; Kryžanowski, Andrej; Brilly, Mitja; Schnabl, Simon


    A dam failure is one of the most important problems in the dam industry. Since the mechanical behavior of dams is usually a complex phenomenon, existing classical mathematical models are generally insufficient to adequately predict dam failure and thus the safety of dams. Therefore, numerical reliability methods are often used to model such complex mechanical phenomena. The main purpose of the present paper is thus to present the response surface method as a powerful mathematical tool for studying and forecasting dam safety based on a set of collected monitoring data. The derived mathematical model is applied to a case study, the Moste dam, which is the highest concrete gravity dam in Slovenia. Based on the derived model, the ambient/state variables are correlated with the dam deformation in order to obtain a forecasting tool able to define the critical thresholds for dam management.
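
    A response surface of the kind described here is, at its core, a polynomial regression of the monitored response on the ambient/state variables. The sketch below fits a quadratic surface to synthetic data, not the Moste dam monitoring records; the variable names (reservoir level h, temperature T) and coefficients are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
h = rng.uniform(0.0, 1.0, 200)    # normalised reservoir level (synthetic)
T = rng.uniform(-1.0, 1.0, 200)   # normalised temperature (synthetic)
# "True" deformation used to generate observations, plus measurement noise.
y = 2.0 + 1.5 * h - 0.8 * T + 0.5 * h**2 + 0.05 * rng.standard_normal(200)

# Design matrix of the quadratic response surface and its least-squares fit.
X = np.column_stack([np.ones_like(h), h, T, h**2, T**2, h * T])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(h_new, T_new):
    """Predicted deformation at a new ambient/state point."""
    x = np.array([1.0, h_new, T_new, h_new**2, T_new**2, h_new * T_new])
    return float(x @ coef)

print(coef.round(2))
```

    Once fitted, the surface can be evaluated over the plausible range of ambient variables to define critical thresholds, which is how a monitoring-based model becomes a forecasting tool for dam management.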

  7. Creating comparability among reliability coefficients: the case of Cronbach alpha and Cohen kappa.

    Becker, G


    Cronbach alpha and Cohen kappa were compared and found to differ along two major facets. A fourfold classification system based on these facets clarifies the double contrast and produces a common metric allowing direct comparability. A new estimator, coefficient beta, is introduced in the process and is presented as a complement to coefficient alpha in estimating the psychometric properties of test scores and ratings.
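
    Both coefficients compared in this record are straightforward to compute directly. The item-score matrix and rater labels below are toy data, purely for illustration.

```python
def cronbach_alpha(scores):
    """Cronbach's alpha; scores is a list of subjects, each a list of item scores."""
    k = len(scores[0])
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [var([row[j] for row in scores]) for j in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1.0 - sum(item_vars) / total_var)

def cohen_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical judgments."""
    cats = sorted(set(r1) | set(r2))
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n          # observed agreement
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)  # chance agreement
    return (po - pe) / (1.0 - pe)

scores = [[4, 5, 4], [2, 3, 2], [5, 5, 4], [3, 3, 3], [1, 2, 2]]  # 5 subjects x 3 items
r1 = ["y", "y", "n", "y", "n", "n", "y", "n"]
r2 = ["y", "n", "n", "y", "n", "y", "y", "n"]
print(f"alpha={cronbach_alpha(scores):.3f}  kappa={cohen_kappa(r1, r2):.3f}")
```

    The formulas make the paper's contrast concrete: alpha is built from variances of continuous item scores, while kappa corrects categorical agreement for chance, which is why a common metric between them is non-trivial.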


    Yao Chengyu; Zhao Jingyi


    To overcome the design limitations of the traditional hydraulic control system for a synthetic rubber press, including a high fault rate, low reliability, and high energy consumption, which often led to shutdown of the post-treatment product line for synthetic rubber, a brand-new hydraulic system combining PC control with two-way cartridge valves was developed for the press. Its reliability was analyzed: a reliability model of the press's hydraulic system was established by analyzing the processing steps, and reliability simulation of each step and of the whole system was carried out in MATLAB and verified through a reliability test. The fixed-time test proved not only that the theoretical analysis is sound, but also that the system is reasonably designed and highly reliable, and can lower the required power supply and operational energy cost.

  9. Reactor scram experience for shutdown system reliability analysis. [BWR; PWR

    Edison, G.E.; Pugliese, S.L.; Sacramo, R.F.


    Scram experience in a number of operating light water reactors has been reviewed. The date and reactor power of each scram were compiled from monthly operating reports and personal communications with the operating plant personnel. The average scram frequency from "significant" power (defined as P_trip/P_max greater than approximately 20 percent) was determined as a function of operating life. This relationship was then used to estimate the total number of reactor trips from above approximately 20 percent of full power expected to occur during the life of a nuclear power plant. The shape of the scram frequency vs. operating life curve resembles a typical reliability bathtub curve (failure rate vs. time), but without a rising "wearout" phase, due to the lack of operating data near the end of plant design life. Here the failures are represented by "bugs" in the plant system design, construction, and operation which lead to scram. The number of scrams appears to level out at an average of around three per year; the standard deviations from the mean value indicate an uncertainty of about 50 percent. The total number of scrams from significant power that could be expected in a plant designed for a 40-year life would be about 130 if no wearout phase develops near the end of life.

  10. Reliability of 3D upper limb motion analysis in children with obstetric brachial plexus palsy.

    Mahon, Judy; Malone, Ailish; Kiernan, Damien; Meldrum, Dara


    Kinematics, measured by 3D upper limb motion analysis (3D-ULMA), can potentially increase understanding of movement patterns by quantifying individual joint contributions. Reliability in children with obstetric brachial plexus palsy (OBPP) has not been established.

  11. Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD): Development of Image Analysis Criteria and Examiner Reliability for Image Analysis

    Ahmad, Mansur; Hollender, Lars; Odont; Anderson, Quentin; Kartha, Krishnan; Ohrbach, Richard K.; Truelove, Edmond L.; John, Mike T.; Schiffman, Eric L.


    Introduction As a part of a multi-site RDC/TMD Validation Project, comprehensive TMJ diagnostic criteria were developed for image analysis using panoramic radiography, magnetic resonance imaging (MRI), and computed tomography (CT). Methods Inter-examiner reliability was estimated using the kappa (k) statistic, and agreement between rater pairs was characterized by overall, positive, and negative percent agreement. CT was the reference standard for assessing validity of other imaging modalities for detecting osteoarthritis (OA). Results For the radiological diagnosis of OA, reliability of the three examiners was poor for panoramic radiography (k = 0.16), fair for MRI (k = 0.46), and close to the threshold for excellent for CT (k = 0.71). Using MRI, reliability was excellent for diagnosing disc displacements (DD) with reduction (k = 0.78) and for DD without reduction (k = 0.94), and was good for effusion (k = 0.64). Overall percent agreement for pair-wise ratings was ≥ 82% for all conditions. Positive percent agreement for diagnosing OA was 19% for panoramic radiography, 59% for MRI, and 84% for CT. Using MRI, positive percent agreement for diagnoses of any DD was 95% and for effusion was 81%. Negative percent agreement was ≥ 88% for all conditions. Compared to CT, panoramic radiography and MRI had poor to marginal sensitivity, respectively, but excellent specificity, in detecting OA. Conclusion Comprehensive image analysis criteria for RDC/TMD Validation Project were developed, which can reliably be employed for assessing OA using CT, and for disc position and effusion using MRI. PMID:19464658

  12. Analysis methods for structure reliability of piping components

    Schimpfke, T.; Grebner, H.; Sievers, J. [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Koeln (Germany)


    In the frame of the German reactor safety research program of the Federal Ministry of Economics and Labour (BMWA) GRS has started to develop an analysis code named PROST (PRObabilistic STructure analysis) for estimating the leak and break probabilities of piping systems in nuclear power plants. The long-term objective of this development is to provide failure probabilities of passive components for probabilistic safety analysis of nuclear power plants. Up to now the code can be used for calculating fatigue problems. The paper mentions the main capabilities and theoretical background of the present PROST development and presents some of the results of a benchmark analysis in the frame of the European project NURBIM (Nuclear Risk Based Inspection Methodologies for Passive Components). (orig.)

  13. Parametric and semiparametric models with applications to reliability, survival analysis, and quality of life

    Nikulin, M; Mesbah, M; Limnios, N


    Parametric and semiparametric models are tools with a wide range of applications to reliability, survival analysis, and quality of life. This self-contained volume examines these tools in survey articles written by experts currently working on the development and evaluation of models and methods. While a number of chapters deal with general theory, several explore more specific connections and recent results in "real-world" reliability theory, survival analysis, and related fields.

  14. Reliability analysis of a gravity-based foundation for wind turbines

    Vahdatirad, Mohammad Javad; Griffiths, D. V.; Andersen, Lars Vabbersgaard


    Deterministic code-based designs proposed for wind turbine foundations are typically biased on the conservative side and overestimate the probability of failure, which can lead to higher than necessary construction costs. In this study, reliability analysis of a gravity-based foundation concerning... technique to perform the reliability analysis. The calibrated code-based design approach leads to savings of up to 20% in the concrete foundation volume, depending on the target annual reliability level. The study can form the basis for future optimization of deterministic-based designs for wind turbine... foundations...

  15. Use of Fault Tree Analysis for Automotive Reliability and Safety Analysis

    Lambert, H


    Fault tree analysis (FTA) evolved from the aerospace industry in the 1960s. A fault tree is a deductive logic model that is generated with a top undesired event in mind. FTA answers the question "how can something occur?", as opposed to failure modes and effects analysis (FMEA), which is inductive and answers the question "what if?" FTA is used in risk, reliability and safety assessments. FTA is currently used by several industries, such as nuclear power and chemical processing. Typically, the automotive industry uses FMEA, such as design FMEAs and process FMEAs. The use of FTA has now spread to the automotive industry. This paper discusses the use of FTA for automotive applications. With the addition of automotive electronics for applications such as engine/power control, cruise control and braking/traction, FTA is well suited to address failure modes within these systems. FTA can determine the importance of these failure modes from various perspectives such as cost, reliability and safety. A fault tree analysis of a car starting system is presented as an example.
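
    The deductive gate logic can be illustrated with a toy car-starting fault tree. The tree structure, the backup relay, and every probability below are hypothetical, not taken from the paper's example.

```python
def or_gate(*probs):
    """P(A or B or ...) for independent events."""
    q = 1.0
    for p in probs:
        q *= 1.0 - p
    return 1.0 - q

def and_gate(*probs):
    """P(A and B and ...) for independent events."""
    r = 1.0
    for p in probs:
        r *= p
    return r

# Hypothetical basic-event probabilities per year.
p_battery_dead = 0.02
p_starter_fail = 0.01
p_no_fuel = 0.005
p_ignition_fail = 0.008
p_backup_relay_fail = 0.1   # hypothetical redundant relay path

# "No crank": battery dead, OR starter circuit fails AND the backup relay
# path also fails (the AND gate models the assumed redundancy).
p_no_crank = or_gate(p_battery_dead, and_gate(p_starter_fail, p_backup_relay_fail))
# Top event: engine fails to start (OR over the intermediate causes).
p_top = or_gate(p_no_crank, p_no_fuel, p_ignition_fail)
print(f"P(engine fails to start) = {p_top:.5f}")
```

    Importance measures of the kind the paper mentions follow directly: recompute the top-event probability with each basic event set to 0 or 1 and compare.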

  16. Live versus Video Observations: Comparing the Reliability and Validity of Two Methods of Assessing Classroom Quality

    Curby, Timothy W.; Johnson, Price; Mashburn, Andrew J.; Carlis, Lydia


    When conducting classroom observations, researchers are often confronted with the decision of whether to conduct observations live or by using pre-recorded video. The present study focuses on comparing and contrasting observations of live and video administrations of the Classroom Assessment Scoring System-PreK (CLASS-PreK). Associations between…

  17. Simulation and Non-Simulation Based Human Reliability Analysis Approaches

    Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Shirley, Rachel Elizabeth [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States)


    Part of the U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk model. In this report, we review simulation-based and non-simulation-based human reliability assessment (HRA) methods. Chapter 2 surveys non-simulation-based HRA methods. Conventional HRA methods target static Probabilistic Risk Assessments for Level 1 events. These methods would require significant modification for use in dynamic simulation of Level 2 and Level 3 events. Chapter 3 is a review of human performance models. A variety of methods and models simulate dynamic human performance; however, most of these human performance models were developed outside the risk domain and have not been used for HRA. The exception is the ADS-IDAC model, which can be thought of as a virtual operator program. This model is resource-intensive but provides a detailed model of every operator action in a given scenario, along with models of numerous factors that can influence operator performance. Finally, Chapter 4 reviews the treatment of timing of operator actions in HRA methods. This chapter is an example of one of the critical gaps between existing HRA methods and the needs of dynamic HRA. This report summarizes the foundational information needed to develop a feasible approach to modeling human interactions in the RISMC simulations.

  18. Comparative Analysis of Uncertainties in Urban Surface Runoff Modelling

    Thorndahl, Søren; Schaarup-Jensen, Kjeld


    In the present paper a comparison between three different surface runoff models, in the numerical urban drainage tool MOUSE, is conducted. Analysing parameter uncertainty, it is shown that the models are very sensitive with regard to the choice of hydrological parameters when combined overflow... volumes are compared, especially when the models are uncalibrated. The occurrences of flooding and surcharge are highly dependent on both hydrological and hydrodynamic parameters. Thus, the conclusion of the paper is that if the use of model simulations is to be a reliable tool for drainage system... analysis, further research in improved parameter assessment for surface runoff models is needed...

  19. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)


    The importance of human reliability analysis (HRA), which predicts the possibility of error occurrence in quantitative and qualitative manners, is gradually increasing because of the effects of human errors on system safety. HRA needs task analysis as a prerequisite step, but extant task analysis techniques have the problem that the collection of information about the situation in which the human error occurs depends entirely on the HRA analyzers. This problem makes the results of task analysis inconsistent and unreliable. To remedy this problem, KAERI developed the structural information analysis (SIA) method, which helps to analyze a task's structure and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed to support performing HRA using the SIA method. Additionally, by applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the database of the CASIA. The CASIA is expected to help HRA analyzers perform the analysis more easily and consistently. As more analyses are performed and more data accumulated in the CASIA's database, HRA analyzers can freely share and smoothly disseminate their analysis experience, and thereby the quality of HRA will be improved. 35 refs., 38 figs., 25 tabs. (Author)

  20. A Review: Passive System Reliability Analysis – Accomplishments and Unresolved Issues



    Full Text Available Reliability assessment of passive safety systems is an important issue, since the safety of advanced nuclear reactors relies on several passive features. In this context, a few methodologies such as Reliability Evaluation of Passive Safety System (REPAS, Reliability Methods for Passive Safety Functions (RMPS and Analysis of Passive Systems ReliAbility (APSRA have been developed in the past. These methodologies have been used to assess the reliability of various passive safety systems. While these methodologies have certain features in common, they differ in their treatment of certain issues, for example, model uncertainties and the deviation of geometric and process parameters from their nominal values. This paper presents the state of the art on passive system reliability assessment methodologies, the accomplishments and the remaining issues. In this review three critical issues pertaining to passive system performance and reliability have been identified. The first issue is the applicability of best estimate codes and model uncertainty. Best-estimate-code-based phenomenological simulations of natural convection passive systems can carry significant uncertainties, which must be incorporated in an appropriate manner in the performance and reliability analysis of such systems. The second issue is the treatment of dynamic failure characteristics of components of passive systems. The REPAS, RMPS and APSRA methodologies do not consider dynamic failures of components or processes, which may have a strong influence on the failure of passive systems. The influence of dynamic failure characteristics of components on system failure probability is presented with the help of a dynamic reliability methodology based on Monte Carlo simulation. The analysis of a benchmark problem of a hold-up tank shows the error in failure probability estimation when the dynamic behavior of components is not considered. It is thus suggested that dynamic reliability

  1. Stochastic Response and Reliability Analysis of Hysteretic Structures

    Mørk, Kim Jørgensen

    During the last 30 years, response analysis of structures under random excitation has been studied in detail. These studies are motivated by the fact that most of nature's excitations, such as earthquakes, wind and wave loads, exhibit randomly fluctuating characteristics. For safety reasons this randomness...

  2. Reliability analysis of the control system of large-scale vertical mixing equipment


    The control system of vertical mixing equipment is a concentrated distributed monitoring system (CDMS). A reliability analysis model was built and its analysis was conducted based on reliability modeling theories such as graph theory, Markov processes, and redundancy theory. Analysis and operational results show that the control system can meet all technical requirements for high energy composite solid propellant manufacturing. The reliability performance of the control system can be considerably improved by adopting a control strategy that combines hot-spare redundancy of the primary system with cold-spare redundancy of the emergency one. The reliability performance of the control system can also be improved by adopting the redundancy strategy or by improving the quality of each component and cable of the system.
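
    The hot-spare versus cold-spare distinction drawn in this record can be made concrete for a single component with constant failure rate; the rate and mission time below are illustrative, not values from the control system studied.

```python
import math

lam, t = 1e-4, 5000.0   # failure rate (1/h) and mission time (h), illustrative

def r_single(lam, t):
    """Reliability of one unit with constant failure rate."""
    return math.exp(-lam * t)

def r_hot_spare(lam, t):
    """Two identical units energised in parallel: fails only if both fail."""
    return 1.0 - (1.0 - r_single(lam, t)) ** 2

def r_cold_spare(lam, t):
    """Unpowered standby with perfect switching: Erlang(2) survival function."""
    return math.exp(-lam * t) * (1.0 + lam * t)

print(f"single={r_single(lam, t):.4f}  hot={r_hot_spare(lam, t):.4f}  "
      f"cold={r_cold_spare(lam, t):.4f}")
```

    The cold spare outperforms the hot spare because the standby unit accumulates no failure exposure while idle, which is why the combined hot-primary/cold-emergency strategy in the record is attractive.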

  3. Multidisciplinary Inverse Reliability Analysis Based on Collaborative Optimization with Combination of Linear Approximations

    Xin-Jia Meng


    Full Text Available Multidisciplinary reliability is an important part of reliability-based multidisciplinary design optimization (RBMDO). However, it usually involves a considerable amount of computation. The purpose of this paper is to improve the computational efficiency of multidisciplinary inverse reliability analysis. A multidisciplinary inverse reliability analysis method based on collaborative optimization with combination of linear approximations (CLA-CO) is proposed in this paper. In the proposed method, the multidisciplinary reliability assessment problem is first transformed into a problem of most probable failure point (MPP) search of inverse reliability, and then the process of searching for the MPP of multidisciplinary inverse reliability is performed based on the framework of CLA-CO. This method improves the MPP searching process through two elements. One is treating the discipline analyses as equality constraints in the subsystem optimization, and the other is using linear approximations corresponding to subsystem responses as the replacement of the consistency equality constraint in system optimization. With these two elements, the proposed method realizes the parallel analysis of each discipline, and it also has a higher computational efficiency. Additionally, there are no difficulties in applying the proposed method to problems with nonnormal distribution variables. One mathematical test problem and an electronic packaging problem are used to demonstrate the effectiveness of the proposed method.

  4. Evaluation of reliability of Coats-Redfern method for kinetic analysis of non-isothermal TGA

    R. Ebrahimi-Kahrizsangi; M. H. Abbasi


    A critical examination was made of the reliability of kinetic parameters obtained from nonisothermal thermoanalytical rate measurements by the widely applied Coats-Redfern (CR) equation. For this purpose, simulated TGA curves were made for reactions with different kinetic models, including chemical, diffusion (Jander's) and mixed mechanisms, at different heating rates. The results show that, for reactions controlled kinetically by one mechanism, all solid state reaction models show linear trends when the CR method is used, so the method cannot distinguish the correct reaction model. For reactions with a mixed mechanism, the CR method shows nonlinear trends, and the reaction models and kinetic parameters cannot be extracted from CR curves. The overall conclusion from this comparative appraisal of the characteristics of the CR approach to kinetic analysis of TGA data is that the CR approach is generally unsuitable for determination of kinetic parameters.
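
    The CR linearisation itself is easy to reproduce: ln[g(α)/T²] is plotted against 1/T, and the slope gives -E/R. The sketch below does this on synthetic first-order data generated from an assumed activation energy and intercept (not the paper's simulated TGA curves), using g(α) = -ln(1 - α).

```python
import math

R = 8.314        # gas constant, J/(mol*K)
E_true = 120e3   # assumed activation energy, J/mol (illustrative)
C = 2.0          # assumed intercept ln(A*R/(beta*E)) (illustrative)

# Synthetic data obeying the CR relation ln[g(a)/T^2] = C - E/(R*T).
temps = [float(T) for T in range(600, 901, 20)]                 # K
g_vals = [T**2 * math.exp(C - E_true / (R * T)) for T in temps]  # g(alpha)
y = [math.log(g / T**2) for g, T in zip(g_vals, temps)]          # ln[g/T^2]
x = [1.0 / T for T in temps]                                     # 1/T

# Ordinary least-squares slope of y against x; E = -slope * R.
n = len(x)
mx, my = sum(x) / n, sum(y) / n
slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
E_fit = -slope * R
print(f"recovered E = {E_fit / 1000:.1f} kJ/mol")
```

    The paper's point is that this fit stays deceptively linear for many different g(α) choices on single-mechanism data, so a good straight line alone does not validate the assumed reaction model.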

  5. Structured information analysis for human reliability analysis of emergency tasks in nuclear power plants

    Jung, Won Dea; Kim, Jae Whan; Park, Jin Kyun; Ha, Jae Joo [Korea Atomic Energy Research Institute, Taejeon (Korea)


    More than twenty HRA (Human Reliability Analysis) methodologies have been developed and used for the safety analysis in nuclear field during the past two decades. However, no methodology appears to have universally been accepted, as various limitations have been raised for more widely used ones. One of the most important limitations of conventional HRA is insufficient analysis of the task structure and problem space. To resolve this problem, we suggest SIA (Structured Information Analysis) for HRA. The proposed SIA consists of three parts. The first part is the scenario analysis that investigates the contextual information related to the given task on the basis of selected scenarios. The second is the goals-means analysis to define the relations between the cognitive goal and task steps. The third is the cognitive function analysis module that identifies the cognitive patterns and information flows involved in the task. Through the three-part analysis, systematic investigation is made possible from the macroscopic information on the tasks to the microscopic information on the specific cognitive processes. It is expected that analysts can attain a structured set of information that helps to predict the types and possibility of human error in the given task. 48 refs., 12 figs., 11 tabs. (Author)

  6. Sensitivity analysis for reliable design verification of nuclear turbosets

    Zentner, Irmela, E-mail: irmela.zentner@edf.f [Lamsid-Laboratory for Mechanics of Aging Industrial Structures, UMR CNRS/EDF, 1, avenue Du General de Gaulle, 92141 Clamart (France); EDF R and D-Structural Mechanics and Acoustics Department, 1, avenue Du General de Gaulle, 92141 Clamart (France); Tarantola, Stefano [Joint Research Centre of the European Commission-Institute for Protection and Security of the Citizen, T.P. 361, 21027 Ispra (Italy); Rocquigny, E. de [Ecole Centrale Paris-Applied Mathematics and Systems Department (MAS), Grande Voie des Vignes, 92 295 Chatenay-Malabry (France)


    In this paper, we present an application of sensitivity analysis to the design verification of nuclear turbosets. Before the acquisition of a turbogenerator, power operators perform an independent design assessment in order to assure safe operating conditions of the new machine in its environment. The variables of interest are related to the vibration behaviour of the machine: its eigenfrequencies and dynamic sensitivity to unbalance. In the framework of design verification, epistemic uncertainties are preponderant. This lack of knowledge is due to inexistent or imprecise information about the design as well as to the interaction of the rotating machinery with supporting structures and sub-structures. Sensitivity analysis enables the analyst to rank sources of uncertainty by importance and, possibly, to screen out insignificant sources of uncertainty. Further studies, if necessary, can then focus on the predominant parameters. In particular, the constructor can be asked for detailed information only about the most significant parameters.

  7. Reliability analysis of M/G/1 queues with general retrial times and server breakdowns

    WANG Jinting


    This paper concerns the reliability issues as well as the queueing analysis of M/G/1 retrial queues with general retrial times and a server subject to breakdowns and repairs. We assume that the server is unreliable and that customers who find the server busy or down are queued in the retrial orbit in accordance with a first-come-first-served discipline. Only the customer at the head of the orbit queue is allowed access to the server. The necessary and sufficient condition for the system to be stable is given. Using a supplementary variable method, we obtain the Laplace-Stieltjes transform of the reliability function of the server and a steady-state solution for both queueing and reliability measures of interest. Some main reliability indexes, such as the availability, failure frequency, and the reliability function of the server, are obtained.
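The availability figures produced by such models can be sanity-checked with a simple Monte Carlo sketch. The example below is not the paper's supplementary-variable analysis; it assumes the special case of exponential up-times and repair times (hypothetical rates) and compares the simulated steady-state availability of the server with the closed form μ/(λ+μ):

```python
import random

def simulated_availability(fail_rate, repair_rate, n_cycles=200_000, seed=1):
    """Estimate steady-state availability of a server that alternates
    between exponential up-times (rate fail_rate) and exponential
    repair times (rate repair_rate)."""
    rng = random.Random(seed)
    up = down = 0.0
    for _ in range(n_cycles):
        up += rng.expovariate(fail_rate)      # time until next breakdown
        down += rng.expovariate(repair_rate)  # repair duration
    return up / (up + down)

fail_rate, repair_rate = 0.5, 2.0             # hypothetical rates
closed_form = repair_rate / (fail_rate + repair_rate)   # = 0.8
estimate = simulated_availability(fail_rate, repair_rate)
```

For general (non-exponential) service and retrial distributions the closed form no longer applies, which is where the transform-based analysis of the paper is needed.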

  8. Finite State Machine Based Evaluation Model for Web Service Reliability Analysis

    M, Thirumaran; Abarna, S; P, Lakshmi


    Reaction times demanded of businesses are decreasing, so changes to business logic must be made at ever shorter notice. The Business Logic Evaluation Model (BLEM) is the proposed solution targeting business logic automation, enabling business experts to write sophisticated business rules and complex calculations without costly custom programming. BLEM is powerful enough to handle service manageability issues by analyzing and evaluating the computability, traceability, and other criteria of modified business logic at run time. Web service QoS depends heavily on the reliability of the service. Service providers therefore treat reliability as a major factor: any problem with the reliability of the service should be overcome promptly in order to achieve the expected level of reliability. In this paper we propose a business logic evaluation model for web service reliability analysis using a Finite State Machine (FSM), where the FSM is extended to analy...

  9. A survey on the human reliability analysis methods for the design of Korean next generation reactor

    Lee, Yong Hee; Lee, J. W.; Park, J. C.; Kwack, H. Y.; Lee, K. Y.; Park, J. K.; Kim, I. S.; Jung, K. W


    Enhanced features arising from recent domestic technologies characterize the safety and efficiency of the KNGR (Korea Next Generation Reactor). Human-engineered interfaces and the control room environment are expected to benefit the human aspects of the KNGR design. However, since human reliability analysis methods have not been kept up to date since THERP/SHARP, it is hard to assess the potential for human error arising from both the positive and negative effects of the design changes in the KNGR. This is a state-of-the-art report on the human reliability analysis methods potentially available for application to the KNGR design. We surveyed the technical aspects of existing HRA methods and compared them in order to obtain the requirements for the assessment of human error potentials within the KNGR design. We categorized the more than 10 methods into the first and the second generation, following the suggestion of Dr. Hollnagel. THERP was revisited in detail. ATHEANA, proposed by the US NRC for advanced designs, and CREAM, proposed by Dr. Hollnagel, were reviewed and compared. We conclude that the key requirements include enhancements to the early steps of human error identification and to the quantification steps, with consideration of error shaping factors extended beyond PSFs (performance shaping factors). Utilizing the steps and approaches of ATHEANA and CREAM will be beneficial to the attainment of an appropriate HRA method for the KNGR. However, the steps and data from THERP will still be maintained for continuity with previous PSA activities in the KNGR design.

  10. Reliability analysis of flood embankments taking into account a stochastic distribution of hydraulic loading

    Amabile Alessia


    Flooding is a worldwide phenomenon. Over the last few decades the world has experienced a rising number of devastating flood events, and the trend in such natural disasters is increasing. Furthermore, escalations in both the probability and magnitude of flood hazards are expected as a result of climate change. Flood defence embankments are one of the major flood defence measures, and reliability assessment for these structures is therefore a very important process. Routine hydro-mechanical models for the stability of flood embankments are based on the assumptions of steady-state through-flow and zero pore pressure above the phreatic surface, i.e. negative capillary pressure (suction) is ignored. Despite common belief, these assumptions may not always lead to conservative design. In addition, hydraulic loading is stochastic in nature, and flood embankment stability should therefore be assessed in probabilistic terms. This cannot be accommodated by steady-state flow models. The paper presents an approach for reliability analysis of flood embankments taking into account transient water through-flow. The factor of safety of the embankment is assessed in probabilistic terms based on a stochastic distribution for the hydraulic loading. Two different probabilistic approaches are tested to compare and validate the results.


    姜年朝; 周光明; 张逊; 戴勇; 倪俊; 张志清


    A high-cycle fatigue reliability analysis approach for a helicopter rotor hub under a working load spectrum is proposed. Automatic calculation for the approach is implemented through purpose-written programs. In the system, modification of the geometric model of the rotor hub is controlled by several parameters, and the finite element method and the S-N curve method are then employed to solve for the fatigue life from automatically assigned parameters. A database relating assigned parameters to fatigue life is obtained via Latin Hypercube Sampling (LHS) over the tolerance zone of the rotor hub. Different data-fitting technologies are used and compared to determine the highest-precision approximation for this database. The parameters are assumed to be independent of each other and to follow normal distributions. Fatigue reliability is then computed by the Monte Carlo (MC) method and the mean-value first-order second-moment (MFOSM) method. Results show that the approach has high efficiency and precision, and is suitable for engineering application.
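The LHS-plus-MC-versus-MFOSM comparison in this abstract can be illustrated on a toy limit state. The sketch below is not the rotor-hub model; it uses a hypothetical strength/stress pair of independent normal variables, draws Latin Hypercube samples, and compares the Monte Carlo failure probability with the MFOSM estimate Φ(-β):

```python
import math
import random
from statistics import NormalDist

def lhs_normal(n, mean, std, rng):
    """Latin Hypercube sample of a normal variable:
    one draw from each of n equiprobable strata, in shuffled order."""
    u = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(u)
    d = NormalDist(mean, std)
    return [d.inv_cdf(p) for p in u]

rng = random.Random(42)
n = 20_000
# hypothetical strength R and stress S (independent normals)
R = lhs_normal(n, mean=500.0, std=40.0, rng=rng)
S = lhs_normal(n, mean=350.0, std=30.0, rng=rng)
pf_mc = sum(r - s < 0 for r, s in zip(R, S)) / n   # Monte Carlo estimate

# mean-value first-order second-moment (MFOSM) comparison
beta = (500.0 - 350.0) / math.hypot(40.0, 30.0)    # reliability index = 3.0
pf_fosm = NormalDist().cdf(-beta)
```

For a linear limit state with normal variables the two estimates agree; the value of MC over MFOSM shows up when the limit state (here, the FEM-based fatigue life) is nonlinear.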


    Bikova E.V.


    This work presents a comparative analysis of ecological indicators designed and specified for Moldova and similar indicators for the CIS countries. Some general information about the power systems of the CIS countries (installed capacities, electric power production) is given, the dynamics of emissions of the greenhouse gas CO2 and of NOx and SO2 in Moldova are analysed, and the emission levels are compared with those of the other CIS countries.

  13. The reliability of the Canadian triage and acuity scale: Meta-analysis

    Amir Mirhaghi


    Background: Although the Canadian Triage and Acuity Scale (CTAS) was developed two decades ago, its reliability has not been examined with respect to moderating variables. Aims: The study provides a meta-analytic review of the reliability of the CTAS in order to reveal to what extent the CTAS is reliable. Materials and Methods: Electronic databases were searched up to March 2014. Only studies were included that reported sample sizes, reliability coefficients, and an adequate description of the CTAS reliability assessment. The Guidelines for Reporting Reliability and Agreement Studies (GRRAS) were used. Two reviewers independently examined abstracts and extracted data. The effect size was obtained by the z-transformation of reliability coefficients. Data were pooled with random-effects models, and meta-regression was done based on the method of moments estimator. Results: Fourteen studies were included. The pooled coefficient for the CTAS was substantial at 0.672 (95% CI: 0.599-0.735). Mistriage is less than 50%. Agreement on the adult version, among nurse-physician pairs, and in nearby countries is higher than for the pediatric version, other raters, and more distant countries, respectively. Conclusion: The CTAS showed an acceptable level of overall reliability in the emergency department but needs further development to reach almost perfect agreement.
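The z-transformation pooling step described here can be sketched in a few lines. The example below uses invented study coefficients and sample sizes (not the fourteen studies of the meta-analysis) and a simplified fixed-effect weighting on Fisher's z scale; the paper itself used random-effects models:

```python
import math

# hypothetical study-level reliability coefficients and sample sizes,
# for illustration only -- not the data from the meta-analysis
studies = [(0.70, 120), (0.61, 90), (0.75, 200), (0.66, 150)]

# fixed-effect pooling on Fisher's z scale, weighted by n - 3
num = den = 0.0
for r, n in studies:
    z = math.atanh(r)           # Fisher z-transformation
    w = n - 3                   # inverse variance of z for a coefficient
    num += w * z
    den += w
pooled_z = num / den
pooled_r = math.tanh(pooled_z)  # back-transform to the coefficient scale
se = 1.0 / math.sqrt(den)
ci = (math.tanh(pooled_z - 1.96 * se), math.tanh(pooled_z + 1.96 * se))
```

A random-effects version would additionally estimate between-study variance (e.g. by the method of moments, as in the paper) and add it to each study's variance before weighting.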


    Ammar Asad


    This study undertakes a comparative analysis of celebrity and non-celebrity advertisements with respect to attitude toward the advertisement, attitude toward the brand, purchase intentions, and advertising attributes. For this purpose, a simple random sample of 200 students studying four different disciplines was taken from a private university in Lahore. Reliability analysis, descriptive analysis, and independent-sample t-tests were used to interpret the results. Our findings show no significant difference between celebrity and non-celebrity advertisements with respect to attitude toward the advertisement, attitude toward the brand, purchase intentions, or advertising attributes. The limitations and recommendations of this research are also given.

  15. A Reliability-Based Analysis of Bicyclist Red-Light Running Behavior at Urban Intersections

    Mei Huan


    This paper describes the red-light running behavior of bicyclists at urban intersections based on a reliability analysis approach. Bicyclists' crossing behavior was collected by video recording. Four proportional hazards models using the Cox, exponential, Weibull, and Gompertz distributions were proposed to analyze covariate effects on safety crossing reliability. The influential variables include personal characteristics, movement information, and situational factors. The results indicate that the Cox hazard model gives the best description of bicyclists' red-light running behavior. Bicyclists' safety crossing reliability decreases as their waiting time increases. About 15.5% of bicyclists have negligible waiting times; they are at high risk of red-light running and have very low safety crossing reliability. The proposed reliability models can capture covariate effects on bicyclists' crossing behavior at signalized intersections. Both personal characteristics and traffic conditions have significant effects on bicyclists' safety crossing reliability. A bicyclist is more likely to have low safety crossing reliability and high violation risk when more riders are crossing against the red light and when waiting closer to the motorized lane. These findings provide valuable insights into bicyclists' violation behavior, and their implications for assessing bicyclists' safety crossing reliability are discussed.
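The survival-analysis framing used here, where reliability is the probability of not yet having run the red light after waiting time t, can be sketched with a Weibull survival curve mixed with a fraction of immediate crossers. The parameters below are hypothetical, chosen only to echo the ~15.5% negligible-waiting-time group reported in the abstract, and this is not the paper's fitted Cox model:

```python
import math

def crossing_reliability(t, scale, shape, p_impatient):
    """P(no red-light violation by waiting time t): a Weibull survival
    curve mixed with a fraction who cross almost immediately."""
    if t <= 0:
        return 1.0
    weibull_surv = math.exp(-((t / scale) ** shape))
    return (1.0 - p_impatient) * weibull_surv

# hypothetical parameters, not fitted to the paper's data
scale, shape, p0 = 40.0, 1.3, 0.155
r10 = crossing_reliability(10.0, scale, shape, p0)   # after 10 s of waiting
r60 = crossing_reliability(60.0, scale, shape, p0)   # after 60 s of waiting
```

With shape > 1 the violation hazard rises with waiting time, reproducing the paper's qualitative finding that safety crossing reliability falls as bicyclists wait longer.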

  16. Strategy for Synthesis of Flexible Heat Exchanger Networks Embedded with System Reliability Analysis

    YI Dake; HAN Zhizhong; WANG Kefeng; YAO Pingjing


    System reliability can strongly influence the performance of a heat exchanger network (HEN). In this paper, an optimization method with system reliability analysis for flexible HENs using genetic/simulated annealing algorithms (GA/SA) is presented. An initial flexible arrangement of the HEN is obtained from the pseudo-temperature-enthalpy diagram. To determine the system reliability of the HEN, the connections of heat exchangers (HEXs) and the independent subsystems in the HEN are analyzed via the connection sequence matrix (CSM), and the system reliability is measured by the independent subsystem containing the maximum number of HEXs in the HEN. For a HEN that does not meet the system reliability criterion, HEN decoupling is applied: the independent subsystems in the HEN are changed by removing a decoupling HEX, and the system reliability is thereby raised. After that, heat duty redistribution based on the relevant elements of the heat load loops and the HEX areas is optimized by GA/SA. The favorable network configuration, which matches both the most economical cost and the system reliability criterion, is then located. Moreover, particular features of suitable decoupling HEXs are extracted from the calculations. A numerical example verifies that the proposed strategy is effective in formulating an optimal flexible HEN with system reliability measurement.
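The subsystem analysis described here, where reliability is judged by the size of the largest independent subsystem and decoupling splits subsystems apart, is essentially a connected-components computation. The sketch below is a generic union-find illustration on a hypothetical six-exchanger network, not the paper's CSM formalism:

```python
def components(n, edges):
    """Connected components of an undirected graph (union-find)."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    for a, b in edges:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

# hypothetical network of 6 heat exchangers, edges = shared streams
edges = [(0, 1), (1, 2), (2, 3), (4, 5)]
largest_before = max(len(c) for c in components(6, edges))      # subsystem of 4 HEXs
# "decoupling": remove the match linking HEX 1 and HEX 2
decoupled = [e for e in edges if e != (1, 2)]
largest_after = max(len(c) for c in components(6, decoupled))   # largest is now 2
```

Removing one coupling exchanger splits the four-exchanger subsystem into two pairs, lowering the size of the largest independent subsystem and thus, in the paper's metric, improving system reliability.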

  17. Report on the analysis of field data relating to the reliability of solar hot water systems.

    Menicucci, David F. (Building Specialists, Inc., Albuquerque, NM)


    Utilities are overseeing the installation of thousands of solar hot water (SHW) systems. Utility planners have begun to ask for quantitative measures of the expected lifetimes of these systems so that they can properly forecast their loads. This report, which augments a 2009 reliability analysis effort by Sandia National Laboratories (SNL), addresses this need. Additional reliability data have been collected, added to the existing database, and analyzed. The results are presented. Additionally, formal reliability theory is described, including the bathtub curve, which is the most common model used to characterize the lifetime reliability of systems and to predict failures in the field. Reliability theory is used to assess the SNL reliability database. This assessment shows that the database is heavily weighted with data describing the reliability of SHW systems early in their lives, during the warranty period, but contains few measured data describing the ends of SHW systems' lives. End-of-life data are the most critical for defining the reliability of SHW systems well enough to answer the questions that the utilities pose. Several ideas are presented for collecting the required data, including photometric analysis of aerial photographs of installed collectors, statistical and neural network analysis of energy bills from solar homes, and the development of simple algorithms to allow conventional SHW controllers to announce system failures and record the details of the event, much as aircraft black-box recorders do. Some information is also presented about public expectations for the longevity of an SHW system, information that is useful in developing reliability goals.
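The bathtub curve mentioned here is commonly built as the sum of a decreasing infant-mortality hazard, a constant random-failure rate, and an increasing wearout hazard. The sketch below uses Weibull hazards with invented parameters to illustrate the shape; it is not fitted to the SNL database:

```python
def weibull_hazard(t, shape, scale):
    """Instantaneous failure rate of a Weibull distribution."""
    return (shape / scale) * (t / scale) ** (shape - 1)

def bathtub_hazard(t):
    """Illustrative bathtub curve (hypothetical parameters):
    infant mortality (shape < 1), a constant useful-life rate,
    and wearout (shape > 1)."""
    return (weibull_hazard(t, 0.5, 10.0)    # early failures, decreasing
            + 0.002                          # constant random-failure rate
            + weibull_hazard(t, 5.0, 30.0))  # wearout, increasing

early = bathtub_hazard(0.5)    # high: infant mortality dominates
mid = bathtub_hazard(10.0)     # low: flat bottom of the bathtub
late = bathtub_hazard(29.0)    # high again: wearout dominates
```

The report's point about data coverage maps directly onto this curve: warranty-period data constrain only the left side, while the end-of-life questions utilities ask depend on the wearout term on the right.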

  18. Developing a highly reliable cae analysis model of the mechanisms that cause bolt loosening in automobiles

    Ken Hashimoto


    In this study, we developed a highly reliable CAE analysis model of the mechanisms that cause loosening of bolt fasteners, which has been a bottleneck in automobile development and design, using a technical element model for highly accurate CAE that we had previously developed, and verified its validity. Specifically, drawing on knowledge gained from our clarification of the mechanisms that cause loosening of bolt fasteners in actual machine tests, we conducted an accelerated bench test: a three-dimensional vibration load test of the loosening of bolt fasteners used in mounts and rear suspension arms, where interviews with personnel at an automaker indicated loosening was most pronounced. We then reproduced the actual machine tests with CAE analysis based on the technical element model for highly accurate CAE analysis. Based on these results, we were able to reproduce the dynamic behavior in which larger screw pitches (lead angles) lead to greater non-uniformity of surface pressure, particularly around the nut seating surface, causing loosening to occur in the areas with the lowest surface pressure. Furthermore, we achieved highly accurate CAE analysis with no error (gap) relative to the actual machine tests.

  19. Estimation of AM fungal colonization - Comparability and reliability of classical methods.

    Füzy, Anna; Biró, Ibolya; Kovács, Ramóna; Takács, Tünde


    The characterization of mycorrhizal status in hosts can be a good indicator of symbiotic associations in inoculation experiments or in ecological research. The most common microscopy-based observation methods, namely (i) the gridline intersect method, (ii) the magnified intersections method, and (iii) the five-class system of Trouvelot, were tested to find the simplest, most easily executed, effective, and objective methods and their appropriate parameters for characterizing mycorrhizal status. In a pot experiment, the white clover (Trifolium repens L.) host plant was inoculated with 6 (BEG144; syn. Rhizophagus intradices) in pumice substrate to monitor AMF colonization properties during host growth. Eleven (seven classical and four new) colonization parameters were estimated by three researchers at twelve sampling times during plant growth. Variations among methods, observers, parallels, and individual plants were determined and analysed to select the most appropriate parameters and sampling times for monitoring. The comparability of the parameters of the three methods was also tested. As a result of the experiment, classical parameters were selected for hyphal colonization: colonization frequency in the first stage or colonization density in the later period, and arbuscular richness of roots. A new parameter was recommended to determine the vesicle and spore content of colonized roots at later stages of symbiosis.

  20. Reliability and life-cycle analysis of deteriorating systems

    Sánchez-Silva, Mauricio


    This book compiles and critically discusses modern engineering system degradation models and their impact on engineering decisions. In particular, the authors focus on modeling the uncertain nature of degradation, considering both conceptual discussions and formal mathematical formulations. It also describes the basic concepts and the various modeling aspects of life-cycle analysis (LCA). It highlights the role of degradation in LCA and defines optimum design and operation parameters. Given the relationship between operational decisions and the performance of the system's condition over time, maintenance models are also discussed. The concepts and models presented have applications in a large variety of engineering fields such as Civil, Environmental, Industrial, Electrical and Mechanical engineering. However, special emphasis is given to problems related to large infrastructure systems. The book is intended to be used both as a reference resource for researchers and practitioners and as an academic text ...

  1. Reliability Analysis of Repairable Systems Using Stochastic Point Processes

    TAN Fu-rong; JIANG Zhi-bin; BAI Tong-shuo


    In order to analyze the failure data from repairable systems, the homogeneous Poisson process (HPP) is usually used. In general, HPP cannot be applied to analyze the entire life cycle of a complex, repairable system because the rate of occurrence of failures (ROCOF) of the system changes over time rather than remains stable. However, from a practical point of view, it is always preferred to apply the simplest method to address problems and to obtain useful practical results. Therefore, we attempted to use the HPP model to analyze the failure data from real repairable systems. A graphic method and the Laplace test were also used in the analysis. Results of numerical applications show that the HPP model may be a useful tool for the entire life cycle of repairable systems.
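The Laplace test mentioned here checks whether failure times are consistent with an HPP. For a system observed until time T with n failures at times t_i, the statistic U = (Σt_i/n − T/2) / (T·√(1/(12n))) is approximately standard normal under the HPP hypothesis. A minimal sketch with an invented failure log:

```python
import math

def laplace_test(failure_times, observation_end):
    """Laplace trend statistic for a time-truncated repairable system.
    U near 0 supports a homogeneous Poisson process (no ROCOF trend);
    U >> 0 indicates deterioration, U << 0 indicates improvement."""
    n = len(failure_times)
    mean_t = sum(failure_times) / n
    return (mean_t - observation_end / 2) / (
        observation_end * math.sqrt(1.0 / (12 * n)))

# hypothetical failure log (hours); later failures cluster together,
# suggesting a rising ROCOF (deterioration)
times = [200, 580, 820, 950, 1030, 1090, 1130, 1160]
u = laplace_test(times, observation_end=1200)
```

Here U exceeds 1.96, so at the 5% level the HPP assumption would be rejected for this (invented) data set; when U falls inside ±1.96, the simple HPP model the paper advocates is statistically defensible.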

  2. Implementation and Analysis of Probabilistic Methods for Gate-Level Circuit Reliability Estimation

    WANG Zhen; JIANG Jianhui; YANG Guang


    The development of VLSI technology has dramatically improved the performance of integrated circuits. However, it brings new challenges for reliability: integrated circuits become more susceptible to soft errors. It is therefore imperative to study the reliability of circuits under soft errors. This paper implements three probabilistic methods (two-pass, error propagation probability, and probabilistic transfer matrix) for estimating gate-level circuit reliability on a PC. The functions and performance of these methods are compared in experiments using ISCAS85 and 74-series circuits.
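The probabilistic transfer matrix (PTM) idea can be shown on a tiny example. In a PTM, rows index input values and columns index output values, each entry giving the probability of that output; cascaded gates multiply their matrices. The sketch below is a generic illustration on two cascaded inverters (not one of the paper's benchmark circuits), with a single assumed output-flip probability p:

```python
import numpy as np

def gate_ptm(itm, p_err):
    """Probabilistic transfer matrix for a single-output gate: each row
    of the ideal transfer matrix (ITM) keeps probability 1 - p on the
    correct output and puts p on the flipped one."""
    return (1 - p_err) * itm + p_err * (1 - itm)

inv_itm = np.array([[0., 1.],   # input 0 -> output 1
                    [1., 0.]])  # input 1 -> output 0
p = 0.1
inv_ptm = gate_ptm(inv_itm, p)

# two cascaded inverters: the circuit PTM is the matrix product
buf_ptm = inv_ptm @ inv_ptm
# correct buffer output for input 0 is 0; note that errors can cancel,
# so reliability exceeds the naive (1 - p)**2
reliability = buf_ptm[0, 0]
```

This error-cancellation effect, invisible to a simple per-gate product of success probabilities, is exactly what matrix-based methods like the PTM capture.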

  3. Tensile reliability analysis for gravity dam foundation surface based on FEM and response surface method

    Tong-chun LI; Dan-dan LI; Zhi-qiang WANG


    In this study, the limit state equation for tensile reliability analysis of the foundation surface of a gravity dam was established. The possible crack length was set as the action effect and the allowable crack length was set as the resistance in the limit state. Nonlinear FEM was used to obtain the crack length of the foundation surface of the gravity dam, and the linear response surface method based on the orthogonal test design method was used to calculate the reliability, providing a reasonable and simple method for calculating the reliability of the serviceability limit state. The Longtan RCC gravity dam was chosen as an example. An orthogonal test, including eleven factors and two levels, was conducted, and the tensile reliability was calculated. The analysis shows that this method is reasonable.

  4. Methodological Approach for Performing Human Reliability and Error Analysis in Railway Transportation System

    Fabio De Felice


    Today, billions of dollars are being spent annually worldwide to develop, manufacture, and operate transportation systems such as trains, ships, aircraft, and motor vehicles. Around 70 to 90 percent of transportation crashes are, directly or indirectly, the result of human error. In fact, with the development of technology, system reliability has increased dramatically during the past decades, while human reliability has remained unchanged over the same period. Accordingly, human error is now considered the most significant source of accidents or incidents in safety-critical systems. The aim of this paper is to propose a methodological approach to improve transportation system reliability, in particular for railway transportation. The methodology presented is based on Failure Modes, Effects and Criticality Analysis (FMECA) and Human Reliability Analysis (HRA).

  5. Mechanical system reliability analysis using a combination of graph theory and Boolean function

    Tang, J


    A new method based on graph theory and Boolean functions for assessing the reliability of mechanical systems is proposed. The procedure consists of two parts. Using graph theory, the formula for the reliability of a mechanical system that considers the interrelations of subsystems or components is generated. Using Boolean functions to examine the failure interactions of two particular elements of the system, and demonstrating how to incorporate such failure dependencies into the analysis of larger systems, a constructive algorithm for quantifying the genuine interconnections between the subsystems or components is provided. The combination of graph theory and Boolean functions provides an effective way to evaluate the reliability of a large, complex mechanical system. A numerical example demonstrates that this method is an effective approach to system reliability analysis.
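The Boolean side of such methods reduces to evaluating a structure function over component states. For small systems this can be done exactly by enumeration, as in the generic sketch below (a hypothetical series-parallel system, not the paper's example):

```python
from itertools import product

def system_reliability(structure, rels):
    """Exact reliability of a small system by enumerating component
    states; `structure` maps a tuple of booleans (True = working)
    to system success."""
    total = 0.0
    for states in product([True, False], repeat=len(rels)):
        p = 1.0
        for ok, r in zip(states, rels):
            p *= r if ok else (1.0 - r)
        if structure(states):
            total += p
    return total

# hypothetical system: component 1 in series with a parallel pair (2, 3)
struct = lambda s: s[0] and (s[1] or s[2])
r = system_reliability(struct, [0.9, 0.8, 0.8])
# closed form for comparison: 0.9 * (1 - 0.2 * 0.2) = 0.864
```

Enumeration is exponential in the number of components, which is precisely why constructive algorithms like the graph/Boolean combination proposed in the paper matter for large systems.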

  6. Technology development of maintenance optimization and reliability analysis for safety features in nuclear power plants

    Kim, Tae Woon; Choi, Seong Soo; Lee, Dong Gue; Kim, Young Il


    The reliability data management system (RDMS) for safety systems of PHWR-type plants has been developed and utilized in the reliability analysis of the special safety systems of Wolsong Units 1 and 2 in the context of a lengthened plant overhaul period. The RDMS was developed for periodic, efficient reliability analysis of the safety systems of Wolsong Units 1 and 2. In addition, this system provides the function of analyzing the effects on safety system unavailability if the test period of a test procedure changes, as well as the function of optimizing the test periods of safety-related test procedures. The RDMS can be utilized to respond actively to requests from the regulatory institute with regard to the reliability validation of safety systems. (author)

  7. Analysis and Application of Mechanical System Reliability Model Based on Copula Function

    An Hai


    There are complicated correlations in mechanical systems. Using the advantages of copula functions in treating such correlations, this paper proposes a mechanical system reliability model based on copula functions, develops the serial and parallel mechanical system models in detail, and derives their respective reliability functions. Finally, the serial mechanical system reliability model is applied to an example to demonstrate its validity. Using copula theory for mechanical system reliability modeling, and studying the marginal distributions of the random variables (the mechanical product's life) separately from the association structure of the variables, can reduce the difficulty of multivariate probabilistic modeling and analysis and make the modeling and analysis process clearer.
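The separation of marginals from dependence structure can be illustrated with a Gaussian copula, one common choice (the paper does not specify which copula family it uses). In the sketch below, the component marginals enter only through their reliabilities, while a hypothetical correlation parameter rho carries the dependence; the series-system reliability is estimated by Monte Carlo:

```python
import math
import random
from statistics import NormalDist

def series_reliability_gaussian_copula(r1, r2, rho, n=200_000, seed=7):
    """Monte Carlo reliability of a two-component series system whose
    lifetimes are linked by a Gaussian copula with correlation rho;
    marginals enter only through the component reliabilities r1, r2."""
    rng = random.Random(seed)
    nd = NormalDist()
    # component i survives iff its copula sample falls below r_i,
    # i.e. iff its latent normal falls below inv_cdf(r_i)
    t1, t2 = nd.inv_cdf(r1), nd.inv_cdf(r2)
    c = math.sqrt(1.0 - rho * rho)
    surv = 0
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + c * rng.gauss(0.0, 1.0)  # correlated normals
        if z1 < t1 and z2 < t2:
            surv += 1
    return surv / n

independent = 0.9 * 0.9   # product rule, valid only for rho = 0
correlated = series_reliability_gaussian_copula(0.9, 0.9, 0.8)
```

Positive dependence raises the series-system reliability above the independence product, which is exactly the kind of effect a copula-based model captures and a naive product rule misses.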

  8. Signal Quality Outage Analysis for Ultra-Reliable Communications in Cellular Networks

    Gerardino, Guillermo Andrés Pocovi; Alvarez, Beatriz Soret; Lauridsen, Mads


    Ultra-reliable communications over wireless will open the possibility for a wide range of novel use cases and applications. In cellular networks, achieving reliable communication is challenging due to many factors, particularly the fading of the desired signal and the interference. In this regard, we investigate the potential of several techniques to combat these main threats. The analysis shows that traditional microscopic multiple-input multiple-output schemes with 2x2 or 4x4 antenna configurations are not enough to fulfil stringent reliability requirements. It is revealed how such antenna...

  9. Latency Analysis of Systems with Multiple Interfaces for Ultra-Reliable M2M Communication

    Nielsen, Jimmy Jessen; Popovski, Petar


    One of the ways to satisfy the requirements of ultra-reliable low latency communication for mission critical Machine-type Communications (MTC) applications is to integrate multiple communication interfaces. In order to estimate the performance in terms of latency and reliability of such an integrated communication system, we propose an analysis framework that combines traditional reliability models with technology-specific latency probability distributions. In our proposed model we demonstrate how failure correlation between technologies can be taken into account. We show for the considered...
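The core benefit of interface integration is easy to quantify under independence: the message is lost only if every interface misses the deadline. The sketch below is a generic illustration with hypothetical per-interface success probabilities, plus a crude interpolation parameter rho to hint at why the failure correlation the paper models erodes the gain; it is not the paper's framework:

```python
def delivery_reliability(p_success, rho=0.0):
    """Probability that at least one interface delivers within the
    deadline. With independence this is 1 - prod(1 - p_i); the crude
    parameter rho interpolates toward fully common failures, where
    integration is no better than the single best interface."""
    prod_fail = 1.0
    for p in p_success:
        prod_fail *= (1.0 - p)
    independent = 1.0 - prod_fail
    best = max(p_success)
    return rho * best + (1.0 - rho) * independent

# hypothetical per-interface deadline-success probabilities
# (e.g. one cellular and one Wi-Fi link)
p = [0.99, 0.95]
r_indep = delivery_reliability(p)            # 1 - 0.01 * 0.05 = 0.9995
r_corr = delivery_reliability(p, rho=0.5)    # correlation erodes the gain
```

Two mediocre links combine into "three nines" only if their failures are close to independent, which is why quantifying cross-technology failure correlation is central to the analysis.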

  10. Reliability analysis of tunnel surrounding rock stability by Monte-Carlo method

    XI Jia-mi; YANG Geng-she


    The advantages of an improved Monte-Carlo method and the feasibility of applying the proposed approach to reliability analysis of tunnel surrounding rock stability are discussed. On the basis of a deterministic analysis of the tunnel surrounding rock, a reliability computing method for surrounding rock stability was derived from the improved Monte-Carlo method. The computing method considers the randomness of the related parameters and therefore satisfies the correlations among parameters. The proposed method can reasonably determine the reliability of surrounding rock stability. Calculation results show that this method is a scientific one for discriminating and checking surrounding rock stability.

  11. Personal Publications Lists Serve as a Reliable Calibration Parameter to Compare Coverage in Academic Citation Databases with Scientific Social Media

    Emma Hughes


    A Review of: Hilbert, F., Barth, J., Gremm, J., Gros, D., Haiter, J., Henkel, M., Reinhardt, W., & Stock, W.G. (2015). Coverage of academic citation databases compared with coverage of scientific social media: personal publication lists as calibration parameters. Online Information Review 39(2): 255-264. Objective – The purpose of this study was to explore coverage rates of information science publications in academic citation databases and scientific social media using a new method with personal publication lists as a calibration parameter. The research questions were: How many publications are covered in the different databases, which database has the best coverage, what institutions are represented, and what role does the language of the publication play? Design – Bibliometric analysis. Setting – Academic citation databases (Web of Science, Scopus, Google Scholar) and scientific social media (Mendeley, CiteULike, BibSonomy). Subjects – 1,017 library and information science publications produced by 76 information scientists at 5 German-speaking universities in Germany and Austria. Methods – Only documents published between 1 January 2003 and 31 December 2012 were included. In that time the 76 information scientists had produced 1,017 documents. The information scientists confirmed that their publication lists were complete, and these served as the calibration parameter for the study. The citations from the publication lists were searched in three academic databases, Google Scholar, Web of Science (WoS), and Scopus, as well as three social media citation sites, Mendeley, CiteULike, and BibSonomy, and the results were compared. The publications were searched for by author name and words from the title. Main results – None of the databases investigated had 100% coverage. In the academic databases, Google Scholar had the highest coverage with an average of 63%, Scopus an average of 31%, and

  12. Comparative genomic analysis of esophageal cancers.

    Caygill, Christine P J; Gatenby, Piers A C; Herceg, Zdenko; Lima, Sheila C S; Pinto, Luis F R; Watson, Anthony; Wu, Ming-Shiang


    The following, from the 12th OESO World Conference: Cancers of the Esophagus, includes commentaries on comparative genomic analysis of esophageal cancers: genomic polymorphisms, the genetic and epigenetic drivers in esophageal cancers, and the collection of data in the UK Barrett's Oesophagus Registry.

  13. Investigation of Common Symptoms of Cancer and Reliability Analysis


    Objective: To identify cancer symptom distribution and treatment requirements, a questionnaire was administered to cancer patients. Our objective was to validate a series of symptoms commonly used in traditional Chinese medicine (TCM). Methods: The M. D. Anderson Symptom Assessment Inventory (MDASI) was used with 10 TCM items added. Questions regarding the TCM applications requested in cancer care were also asked. A multi-center, cross-sectional study was conducted in 340 patients from 4 hospitals in Beijing and Dalian. SPSS and Excel software were adopted for statistical analysis. The questionnaire was evaluated with Cronbach's alpha. Results: The most common symptoms were fatigue (89.4%), sleep disturbance (74.4%), dry mouth (72.9%), poor appetite (72.9%), and difficulty remembering (71.2%). These symptoms affected work (89.8%), mood (82.6%), and activity (76.8%), resulting in poor quality of life. Eighty percent of the patients wanted their condition regulated with TCM. Almost 100% of the patients were interested in acquiring knowledge regarding integrated traditional Chinese medicine (TCM) and Western medicine (WM) in the treatment and rehabilitation of cancer. Cronbach's alpha scores indicated acceptable internal consistency within both the MDASI and the TCM items: 0.86 for the MDASI, 0.78 for the TCM items, and 0.90 for the combined MDASI-TCM (23 items). Conclusions: Fatigue, sleep disturbance, dry mouth, poor appetite, and difficulty remembering are the most common symptoms in cancer patients and greatly affect their quality of life. Patients expressed a strong desire for TCM holistic regulation. The MDASI and its TCM-adapted model could be a critical tool for the quantitative study of TCM symptoms.
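The Cronbach's alpha used here to validate the questionnaire is straightforward to compute from raw item scores: alpha = k/(k-1) · (1 − Σ item variances / variance of total scores). The sketch below uses a tiny invented data set, not the study's 340-patient survey:

```python
def cronbach_alpha(items):
    """Cronbach's alpha from a list of item-score columns
    (items[i][j] = score of respondent j on item i)."""
    k = len(items)
    n = len(items[0])

    def var(xs):                       # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var = sum(var(col) for col in items)
    totals = [sum(col[j] for col in items) for j in range(n)]
    return (k / (k - 1)) * (1.0 - item_var / var(totals))

# hypothetical 3-item, 5-respondent toy data (1-5 scale)
items = [[2, 4, 3, 5, 1],
         [3, 5, 3, 4, 2],
         [2, 4, 4, 5, 1]]
alpha = cronbach_alpha(items)
```

Values around 0.8-0.9, like the study's 0.86 (MDASI) and 0.90 (MDASI-TCM), indicate that the items covary strongly and measure a coherent construct.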

  14. Segmental analysis of indocyanine green pharmacokinetics for the reliable diagnosis of functional vascular insufficiency

    Kang, Yujung; Lee, Jungsul; An, Yuri; Jeon, Jongwook; Choi, Chulhee


    Accurate and reliable diagnosis of functional insufficiency of the peripheral vasculature is essential, since Raynaud phenomenon (RP), the most common form of peripheral vascular insufficiency, is commonly associated with systemic vascular disorders. We have previously demonstrated that dynamic imaging of the near-infrared fluorophore indocyanine green (ICG) can be a noninvasive and sensitive tool to measure tissue perfusion. In the present study, we demonstrated that combined analysis of multiple parameters, especially onset time and modified Tmax (the time from onset of ICG fluorescence to Tmax), can be used as a reliable diagnostic tool for RP. To validate the method, we performed conventional thermographic analysis combined with cold challenge and rewarming along with ICG dynamic imaging and segmental analysis. A case-control analysis demonstrated that the segmental pattern of ICG dynamics in both hands differed significantly between normal and RP cases, suggesting the possibility of clinical application of this novel method for the convenient and reliable diagnosis of RP.

  15. Multiobject Reliability Analysis of Turbine Blisk with Multidiscipline under Multiphysical Field Interaction

    Chun-Yi Zhang


    To accurately study the influence of the deformation, stress, and strain of a turbine blisk on the performance of an aeroengine, a comprehensive reliability analysis of the turbine blisk with multiple disciplines and multiple objects was performed based on the multiple response surface method (MRSM) and the fluid-thermal-solid coupling technique. Firstly, the basic idea of the MRSM was introduced. Then the mathematical model of the MRSM was established with quadratic polynomials. Finally, reliability analyses of the deformation, stress, and strain of the turbine blisk were completed under multiphysical field coupling by the MRSM, and the comprehensive performance of the turbine blisk was evaluated. The reliability analysis demonstrates that the reliability degrees of the deformation, stress, and strain of the turbine blisk are 0.9942, 0.9935, and 0.9954, respectively, when the allowable deformation, stress, and strain are 3.7 × 10⁻³ m, 1.07 × 10⁹ Pa, and 1.12 × 10⁻² m/m, respectively; the comprehensive reliability degree of the turbine blisk is 0.9919, which basically satisfies the engineering requirement of the aeroengine. The efforts of this paper provide a promising approach for multidiscipline, multiobject reliability analysis.

  16. Reliability and Validity of Quantitative Video Analysis of Baseball Pitching Motion.

    Oyama, Sakiko; Sosa, Araceli; Campbell, Rebekah; Correa, Alexandra


    Video recordings are used to quantitatively analyze pitchers' techniques. However, the reliability and validity of such analyses are unknown. The purpose of the study was to investigate the reliability and validity of joint and segment angles identified during the pitching motion using video analysis. Thirty high school baseball pitchers participated. The pitching motion was captured using 2 high-speed video cameras and a motion capture system. Two raters reviewed the videos and digitized the body segments to calculate 2-dimensional angles. The corresponding 3-dimensional angles were calculated from the motion capture data. Intrarater reliability, interrater reliability, and validity of the 2-dimensional angles were determined. The intrarater and interrater reliability of the 2-dimensional angles were high for most variables. The trunk contralateral flexion at maximum external rotation was the only variable with high validity. Trunk contralateral flexion at ball release, trunk forward flexion at foot contact and ball release, shoulder elevation angle at foot contact, and maximum shoulder external rotation had moderate validity. Two-dimensional angles at the shoulder, elbow, and trunk could be measured with high reliability. However, the angles are not necessarily anatomically correct, and thus the use of quantitative video analysis should be limited to angles that can be measured with good validity.

  17. Reliability evaluation and analysis of sugarcane 7000 series harvesters in sugarcane harvesting

    P Najafi


    hours were used. Usually, two methods are used for machine reliability modeling: the first is Pareto analysis and the second is statistical modeling of failure distributions (Barabadi and Kumar, 2007). For failure-distribution modeling it must be determined whether the data are independent and identically distributed (iid) or not. For this, trend tests and serial correlation tests are used. If the data show a trend, they are not iid and the model parameters are computed from the power law process. For data without a trend, a serial correlation test is performed. If the correlation coefficient is less than 0.05, the data are not iid, and the parameters are obtained via a branching Poisson process or similar methods; if the correlation coefficient is more than 0.05, the data are iid and classical statistical methods are used for reliability modeling. Trend test results are compared with the statistical parameter. The test for serial correlation is done by plotting the ith TBF against the (i-1)th TBF, i = 1, 2, ..., n. If the plotted points are randomly scattered without any pattern, it can be concluded that there is no correlation among the TBF data and the data are independent. Next, the best-fit distribution for the TBF data must be chosen. Tests that can be used to find the best-fit distribution include the chi-squared test and the Kolmogorov-Smirnov (K-S) test. The chi-squared test is not valid when there are fewer than 50 data points; therefore, when the TBF data number fewer than 50, the K-S test must be used. Hence, the K-S test can be used for any number of TBF data. When the failure distribution has been determined, the reliability model can be computed from equation (2). Results and discussion: Results of trend analysis for the TBF data of the sugarcane harvester machines showed that the calculated statistic U for all machines was greater than the chi-squared value extracted from the chi-squared table with 2(n-1) degrees of freedom and a 5 percent level of significance. Hence
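The K-S goodness-of-fit step described above can be sketched as follows; the time-between-failure values and the exponential candidate distribution are hypothetical stand-ins for the harvester data:

```python
import math

# Hypothetical time-between-failure (TBF) data in hours; illustrative only,
# not the harvester data from the study.
tbf = [12.0, 5.5, 20.1, 8.3, 15.7, 3.2, 30.4, 11.1, 7.9, 18.6,
       9.4, 25.2, 4.8, 14.0, 6.6, 22.3, 10.5, 2.9, 16.8, 13.2]

# Fit an exponential distribution by matching the mean (method of moments).
rate = 1.0 / (sum(tbf) / len(tbf))

# Kolmogorov-Smirnov statistic D: the largest gap between the empirical CDF
# and the fitted exponential CDF, checked on both sides of each step.
xs = sorted(tbf)
n = len(xs)
d = 0.0
for i, x in enumerate(xs):
    cdf = 1.0 - math.exp(-rate * x)
    d = max(d, abs(cdf - i / n), abs((i + 1) / n - cdf))

# Approximate 5% critical value (asymptotic formula 1.36 / sqrt(n)).
critical = 1.36 / math.sqrt(n)
print(f"D = {d:.3f}, critical (5%) = {critical:.3f}, "
      f"{'accept' if d < critical else 'reject'} the exponential fit")
```

With the Weibull or lognormal as candidate distributions, only the `cdf` line changes.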

  18. Analysis of strain gage reliability in F-100 jet engine testing at NASA Lewis Research Center

    Holanda, R.


    A reliability analysis was performed on 64 strain gage systems mounted on the 3 rotor stages of the fan of a YF-100 engine. The strain gages were used in a 65-hour fan flutter research program which included about 5 hours of blade flutter. The analysis was part of a reliability improvement program. Eighty-four percent of the strain gages survived the test and performed satisfactorily. A post-test analysis determined most failure causes. Five failures were caused by open circuits, three failed gages showed elevated circuit resistance, and one gage circuit was grounded. One failure cause was undetermined.

  19. Problems Related to Use of Some Terms in System Reliability Analysis

    Nadezda Hanusova


    The paper deals with problems of using dependability terms, defined in the current standard STN IEC 50(191): International electrotechnical dictionary, chapter 191: Dependability and quality of service (1993), in the dependability analysis of technical systems. The goal of the paper is to find a relation between the terms introduced in the mentioned standard and used in technical-system dependability analysis, and the rules and practices used in system analysis within systems theory. A description of the part of the system life cycle related to reliability is used as a starting point. This part of the system life cycle is described by a state diagram, and the reliability-relevant terms are assigned to it.

  20. Application of response surface method for contact fatigue reliability analysis of spur gear with consideration of EHL

    胡贇; 刘少军; 丁晟; 廖雅诗


    In order to consider the effects of elastohydrodynamic lubrication (EHL) on the contact fatigue reliability of a spur gear, an accurate and efficient method combining the response surface method (RSM) with the first order second moment method (FOSM) was developed for estimating the contact fatigue reliability of the spur gear under EHL. The mechanical model of contact stress analysis of the spur gear under EHL was established, in which the oil film pressure was mapped onto the Hertzian contact zone. Considering the randomness of EHL, material properties, and fatigue strength correction factors, the proposed method was used to analyze the contact fatigue reliability of the spur gear under EHL. Compared with the result of the traditional Monte Carlo method (MCM) with 1.5×10⁵ samples, the difference between the failure probabilities calculated by the two methods is 2.2×10⁻⁴, the relative error of the failure probability is 26.8%, and the computation time is only 0.14% of that of the traditional MCM. Sensitivity analysis results are in very good agreement with practical experience. The analysis shows that the proposed method is precise and efficient, and correctly reflects the influence of EHL on the contact fatigue reliability of the spur gear.
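For a linear limit state g = strength - stress with independent normal inputs, the FOSM calculation combined with the response surface here reduces to a one-line reliability index. A minimal sketch with illustrative numbers, not the gear data from the study:

```python
import math

# Mean-value FOSM sketch for a limit state g = strength - stress.
# Means and standard deviations below are hypothetical.

mu_strength, sd_strength = 1500.0, 120.0   # contact fatigue strength (MPa)
mu_stress,   sd_stress   = 1100.0,  90.0   # EHL-corrected contact stress (MPa)

# Reliability index: mean of g over its standard deviation
# (valid because g is linear and the inputs are independent).
beta = (mu_strength - mu_stress) / math.hypot(sd_strength, sd_stress)

# Failure probability from the standard normal CDF.
def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

pf = norm_cdf(-beta)
print(f"beta = {beta:.3f}, Pf = {pf:.2e}")
```

For a nonlinear limit state, FOSM instead linearizes g at a design point; the response surface supplies the gradient there.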

  1. An Investigation on the Reliability of Deformation Analysis at Simulated Network Depending on the Precise Point Position Technique

    Durdag, U. M.; Erdogan, B.; Hekimoglu, S.


    Deformation analysis plays an important role in human life safety; hence, investigating the reliability of the results obtained from deformation analysis is crucial. A deformation monitoring network is established and the observations are analyzed periodically. The main problem in deformation analysis is that if there is more than one displaced point in the monitoring network, the analysis methods smear the disturbing effects of the displaced points over all other points which are not displaced; therefore, only one displaced point can be detected successfully. The Precise Point Positioning (PPP) technique offers an opportunity to prevent this smearing effect. In this study, we simulated a monitoring network consisting of four object points and generated six different scenarios. The displacements were added to the points by using a device on which the GPS antenna could easily be moved horizontally, and seven hours of static GPS measurements were carried out. The measurements were analyzed using the online Automatic Precise Positioning Service (APPS) to obtain the coordinates and covariance matrices, and the APPS results were used in the deformation analysis. The detected points and the truly displaced points were compared with each other to assess the reliability of the method. According to the results, the analysis still detects stable points as displaced. As a next step, we will investigate the cause of these erroneous results and work toward obtaining more reliable ones.

  2. Content Analysis in Mass Communication: Assessment and Reporting of Intercoder Reliability.

    Lombard, Matthew; Snyder-Duch, Jennifer; Bracken, Cheryl Campanella


    Reviews the importance of intercoder agreement for content analysis in mass communication research. Describes several indices for calculating this type of reliability (varying in appropriateness, complexity, and apparent prevalence of use). Presents a content analysis of content analyses reported in communication journals to establish how…

  3. Comprehensive Reliability Allocation Method for CNC Lathes Based on Cubic Transformed Functions of Failure Mode and Effects Analysis

    YANG Zhou; ZHU Yunpeng; REN Hongrui; ZHANG Yimin


    Reliability allocation of computerized numerical controlled (CNC) lathes is very important in industry. Traditional allocation methods focus only on high-failure-rate components rather than moderate-failure-rate components, which is not applicable in some conditions. Aiming at solving the problem of reliability allocation for CNC lathes, a comprehensive reliability allocation method based on cubic transformed functions of failure mode and effects analysis (FMEA) is presented. Firstly, conventional reliability allocation methods are introduced. Then the limitations of directly combining the comprehensive allocation method with the exponential transformed FMEA method are investigated. Subsequently, a cubic transformed function is established in order to overcome these limitations. Properties of the new transformed function are discussed by considering the failure severity and the failure occurrence, and designers can choose appropriate transform amplitudes according to their requirements. Finally, a CNC lathe and a spindle system are used as examples to verify the new allocation method. Seven criteria are considered to compare the results of the new method with those of traditional methods. The allocation results indicate that the new method is more flexible than traditional methods. By employing the new cubic transformed function, the method covers a wider range of problems in CNC reliability allocation without losing the advantages of traditional methods.

  4. Comparative analysis of minor histocompatibility antigens genotyping methods

    A. S. Vdovin


    A wide range of techniques can be employed to find mismatches in minor histocompatibility antigens between transplant recipients and their donors. In the current study we compared three genotyping methods based on the polymerase chain reaction (PCR) for four minor antigens. Three of the tested methods (allele-specific PCR, restriction fragment length polymorphism, and real-time PCR with TaqMan probes) demonstrated 100% reliability when compared to Sanger sequencing for all of the studied polymorphisms. High resolution melting analysis was unsuitable for genotyping one of the tested minor antigens (HA-1) because it has a linked synonymous polymorphism. The obtained data can be used to select the strategy for large-scale clinical genotyping.

  5. Estimating Reliability of Disturbances in Satellite Time Series Data Based on Statistical Analysis

    Zhou, Z.-G.; Tang, P.; Zhou, M.


    Normally, the status of land cover is inherently dynamic and changes continuously on the temporal scale. However, disturbances or abnormal changes of land cover, caused by events such as forest fire, flood, deforestation, and plant disease, occur worldwide at unknown times and locations. Timely detection and characterization of these disturbances is important for land cover monitoring. Recently, many time-series-analysis methods have been developed for near real-time or online disturbance detection using satellite image time series. However, most present methods only label the detection results with "change/no change", while few methods focus on estimating the reliability (or confidence level) of the detected disturbances. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series, through analysis of the full temporal information contained in the time series data. The method consists of three main steps: (1) segmenting and modelling historical time series data based on Breaks for Additive Seasonal and Trend (BFAST); (2) forecasting and detecting disturbances in new time series data; (3) estimating the reliability of each detected disturbance using statistical analysis based on confidence intervals (CI) and confidence levels (CL). The method was validated by estimating the reliability of disturbance regions caused by a recent severe flood that occurred around the border of Russia and China. Results demonstrated that the method can estimate the reliability of disturbances detected in satellite imagery with an estimation error of less than 5% and an overall accuracy of up to 90%.
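Step (3) can be sketched as follows: given a model forecast and its standard error for a new acquisition date, the confidence level that the observation is a genuine disturbance follows from the standardized residual. The NDVI-like values below are hypothetical, and a normal residual is assumed:

```python
import math

# Sketch of confidence-level estimation for a detected disturbance.
# Assumes the model residual is approximately normal; values are hypothetical.

def disturbance_confidence(observed, forecast, stderr):
    """Two-sided confidence level that `observed` departs from `forecast`."""
    z = abs(observed - forecast) / stderr
    # CL = P(|Z| < z) for a standard normal residual.
    return math.erf(z / math.sqrt(2.0))

# History forecasts an NDVI of 0.72 with stderr 0.06; a flood pixel reads 0.55.
cl = disturbance_confidence(observed=0.55, forecast=0.72, stderr=0.06)
print(f"confidence level: {cl:.3f}")   # well above a 95% threshold
```

A pixel would then be flagged as a reliable disturbance when `cl` exceeds the chosen confidence level, e.g. 0.95.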

  6. Reliability reallocation models as a support tools in traffic safety analysis.

    Bačkalić, Svetlana; Jovanović, Dragan; Bačkalić, Todor


    One of the essential questions placed before a road authority is where to act first, i.e., which road sections should be treated in order to achieve the desired level of reliability of a particular road; this is the subject of this research. The paper shows how reliability reallocation theory can be applied in the safety analysis of a road consisting of sections. The model has been successfully tested using two apportionment techniques: ARINC and the minimum effort algorithm. The given methods were applied in the traffic safety analysis as a basic step toward achieving a higher level of reliability. Previous methods for selecting hazardous locations do not provide precise values for the required frequency of accidents, i.e., the time period between the occurrences of two accidents. In other words, they do not allow a connection to be established between a precise demand for increased reliability (expressed as a percentage) and the selection of particular road sections for further analysis. The paper shows that reallocation models can also be applied in road safety analysis, more precisely as part of the measures for increasing the level of safety. A tool has been developed for selecting road sections for treatment on the basis of a precisely defined increase in the level of reliability of a particular road, i.e., the mean time between the occurrences of two accidents.
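The ARINC apportionment technique mentioned above assigns each element a share of the target system failure rate proportional to its current failure rate. A minimal sketch with hypothetical per-section accident rates:

```python
# ARINC apportionment sketch: allocate an improved system failure rate to
# road sections in proportion to their current failure (accident) rates.
# The accident rates (accidents per year per section) are hypothetical.

current = {"A": 4.0, "B": 2.5, "C": 1.0, "D": 0.5}   # the road as a serial system
system_now = sum(current.values())                    # current system rate: 8.0/year
system_goal = 6.0                                     # target rate after treatment

# Weight each section by its share of the current system failure rate,
# then assign it the same share of the target rate.
allocated = {k: system_goal * v / system_now for k, v in current.items()}

for k in current:
    print(f"section {k}: {current[k]:.2f} -> {allocated[k]:.2f} accidents/year")
```

Sections with the highest current accident rates thus receive the largest absolute reductions, which is the rationale for using the technique to rank sections for treatment.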

  7. Reliability analysis of supporting pressure in tunnels based on three-dimensional failure mechanism

    罗卫华; 李闻韬


    Based on a nonlinear failure criterion, a three-dimensional failure mechanism of the possible collapse of a deep tunnel is presented with limit analysis theory. Support pressure is taken into consideration in the virtual work equation formulated under the upper bound theorem. It is necessary to point out that the properties of the surrounding rock mass play a vital role in the shape of the collapsing rock mass. The first order reliability method and the Monte Carlo simulation method are then employed to analyze the stability of the presented mechanism. Different rock parameters are treated as random variables to evaluate the corresponding reliability index under an increasing applied support pressure. The reliability indexes calculated by the two methods are in good agreement. Sensitivity analysis was performed and the influence of the coefficient of variation of the rock parameters was discussed. It is shown that the tensile strength plays a much more important role in the reliability index than the dimensionless parameter, and that small changes in the coefficient of variation greatly influence the reliability index. Thus, significant attention should be paid to the properties of the surrounding rock mass, and the applied support pressure needed to maintain the stability of the tunnel can be determined for a given reliability index.

  8. An Evidential Reasoning-Based CREAM to Human Reliability Analysis in Maritime Accident Process.

    Wu, Bing; Yan, Xinping; Wang, Yang; Soares, C Guedes


    This article proposes a modified cognitive reliability and error analysis method (CREAM) for estimating the human error probability in the maritime accident process on the basis of an evidential reasoning approach. This modified CREAM is developed to precisely quantify the linguistic variables of the common performance conditions and to overcome the problem of ignoring the uncertainty caused by incomplete information in the existing CREAM models. Moreover, this article views maritime accident development from the sequential perspective, where a scenario- and barrier-based framework is proposed to describe the maritime accident process. This evidential reasoning-based CREAM approach together with the proposed accident development framework are applied to human reliability analysis of a ship capsizing accident. It will facilitate subjective human reliability analysis in different engineering systems where uncertainty exists in practice.

  9. Structure buckling and non-probabilistic reliability analysis of supercavitating vehicles

    AN Wei-guang; ZHOU Ling; AN Hai


    To perform structural buckling and reliability analysis on supercavitating vehicles travelling at high velocity underwater, the supercavitating vehicles were first simplified as variable cross-section beams. Then structural buckling analysis of supercavitating vehicles with or without engine thrust was conducted, and the structural buckling safety margin equation of supercavitating vehicles was established. The indefinite information was described by interval sets and the structural reliability analysis was performed using the non-probabilistic reliability method. Considering the interval variables as random variables satisfying a uniform distribution, the Monte Carlo method was used to calculate the non-probabilistic failure degree. Numerical examples of supercavitating vehicles were presented. Under different ratios of base diameter to cavitator diameter, the change tendency of the non-probabilistic failure degree of structural buckling of supercavitating vehicles with or without engine thrust was studied along with the variation of speed.




    A two-point adaptive nonlinear approximation (referred to as TANA4) suitable for reliability analysis is proposed. Transformed and normalized random variables in probabilistic analysis could become negative and pose a challenge to the earlier developed two-point approximations; thus a suitable method that can address this issue is needed. In the method proposed, the nonlinearity indices of intervening variables are limited to integers. Then, on the basis of the present method, an improved sequential approximation of the limit state surface for reliability analysis is presented. With the gradient projection method, the data points for the limit state surface approximation are selected on the original limit state surface, which effectively represents the nature of the original response function. On the basis of this new approximation, the reliability is estimated using a first-order second-moment method. Various examples, including both structural and non-structural ones, are presented to show the effectiveness of the method proposed.

  11. The Monte Carlo Simulation Method for System Reliability and Risk Analysis

    Zio, Enrico


    Monte Carlo simulation is one of the best tools for performing realistic analysis of complex systems as it allows most of the limiting assumptions on system behavior to be relaxed. The Monte Carlo Simulation Method for System Reliability and Risk Analysis comprehensively illustrates the Monte Carlo simulation method and its application to reliability and system engineering. Readers are given a sound understanding of the fundamentals of Monte Carlo sampling and simulation and its application for realistic system modeling.   Whilst many of the topics rely on a high-level understanding of calculus, probability and statistics, simple academic examples will be provided in support to the explanation of the theoretical foundations to facilitate comprehension of the subject matter. Case studies will be introduced to provide the practical value of the most advanced techniques.   This detailed approach makes The Monte Carlo Simulation Method for System Reliability and Risk Analysis a key reference for senior undergra...
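As a flavor of the approach the book covers, the reliability of a simple redundant system can be estimated by sampling component lifetimes and counting mission successes. A minimal sketch with illustrative parameters (a 2-out-of-3 system of identical exponential components), not an example from the book:

```python
import random

# Monte Carlo estimate of the mission reliability of a 2-out-of-3 system.
# Failure rate and mission time are hypothetical.

random.seed(42)

FAILURE_RATE = 1e-3   # failures per hour, identical components
MISSION_TIME = 500.0  # hours
TRIALS = 100_000

successes = 0
for _ in range(TRIALS):
    # Sample three independent exponential lifetimes.
    lifetimes = [random.expovariate(FAILURE_RATE) for _ in range(3)]
    survivors = sum(t > MISSION_TIME for t in lifetimes)
    successes += survivors >= 2     # system works if at least 2 of 3 survive

estimate = successes / TRIALS
print(f"R(t={MISSION_TIME:.0f} h) = {estimate:.4f}")
```

For this simple case the analytic answer, 3p²(1-p) + p³ with p = exp(-0.5), is about 0.657, so the estimate can be checked directly; the value of the method lies in systems too complex for such closed forms.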

  12. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    Migneault, Gerard E.


    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is the difficulty, the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precision outputs are quite sensitive to errors of approximation in their input data. Use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally several examples of the application of emulation techniques are described.

  13. A Comparative Analysis of Institutional Repository Software


    This proposal outlines the design of a comparative analysis of the four institutional repository software packages that were represented at the 4th International Conference on Open Repositories held in 2009 in Atlanta, Georgia: EPrints, DSpace, Fedora and Zentity. The study includes 23 qualitative and quantitative measures taken from default installations of the four repositories on a benchmark ma...

  14. Comparative analysis of twelve Dothideomycete plant pathogens

    Ohm, Robin; Aerts, Andrea; Salamov, Asaf; Goodwin, Stephen B.; Grigoriev, Igor


    The Dothideomycetes are one of the largest and most diverse groups of fungi. Many are plant pathogens and pose a serious threat to agricultural crops grown for biofuel, food or feed. Most Dothideomycetes have only a single host and related Dothideomycete species can have very diverse host plants. Twelve Dothideomycete genomes have currently been sequenced by the Joint Genome Institute and other sequencing centers. They can be accessed via Mycocosm which has tools for comparative analysis




    In the second wave of the financial crisis, namely the sovereign debt crisis, the countries most affected by this phenomenon were Greece, Italy, Spain, Portugal and Ireland, joined last year by Cyprus. Furthermore, the 2015 crisis in Greece requires local authorities to constantly evaluate their rating in order to prevent bankruptcy. In this paper we conducted a comparative analysis using the Altman method and the Stickney method and correlated the scores with the ratings of the agencies Standard & Poor's and Moody's.

  16. Reliability analysis of production ships with emphasis on load combination and ultimate strength

    Wang, Xiaozhi


    This thesis deals with ultimate strength and reliability analysis of offshore production ships, accounting for stochastic load combinations, using a typical North Sea production ship for reference. A review of methods for structural reliability analysis is presented. Probabilistic methods are established for the still water and vertical wave bending moments. Linear stress analysis of a midships transverse frame is carried out, and four different finite element models are assessed. Upon verification of the general finite element code ABAQUS with a typical ship transverse girder example, for which test results are available, ultimate strength analysis of the reference transverse frame is performed to obtain the ultimate load factors associated with the pressure loads specified in Det norske Veritas Classification rules for ships and rules for production vessels. Reliability analysis is performed to develop appropriate design criteria for the transverse structure. It is found that the transverse frame failure mode does not seem to contribute to system collapse. Ultimate strength analysis of the longitudinally stiffened panels is performed, accounting for combined biaxial and lateral loading. Reliability-based design of the longitudinally stiffened bottom and deck panels is accomplished for the collapse mode under combined biaxial and lateral loads. 107 refs., 76 figs., 37 tabs.

  17. Kuhn-Tucker optimization based reliability analysis for probabilistic finite elements

    Liu, W. K.; Besterfield, G.; Lawrence, M.; Belytschko, T.


    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for fracture mechanics is considered. Reliability analysis with specific application to fracture mechanics is presented, and computational procedures are discussed. Explicit expressions for the optimization procedure with regard to fracture mechanics are given. The results show that the PFEM is a very powerful tool for determining second-moment statistics. The method can determine the probability of failure or fracture subject to randomness in load, material properties, and crack length, orientation, and location.

  18. MultiMetEval : Comparative and Multi-Objective Analysis of Genome-Scale Metabolic Models

    Zakrzewski, Piotr; Medema, Marnix H.; Gevorgyan, Albert; Kierzek, Andrzej M.; Breitling, Rainer; Takano, Eriko; Fong, Stephen S.


    Comparative metabolic modelling is emerging as a novel field, supported by the development of reliable and standardized approaches for constructing genome-scale metabolic models in high throughput. New software solutions are needed to allow efficient comparative analysis of multiple models in the co

  19. Reliability of functional and predictive methods to estimate the hip joint centre in human motion analysis in healthy adults.

    Kainz, Hans; Hajek, Martin; Modenese, Luca; Saxby, David J; Lloyd, David G; Carty, Christopher P


    In human motion analysis, predictive or functional methods are used to estimate the location of the hip joint centre (HJC). It has been shown that the Harrington regression equations (HRE) and the geometric sphere fit (GSF) method are the most accurate predictive and functional methods, respectively. To date, the comparative reliability of both approaches has not been assessed. The aims of this study were to (1) compare the reliability of the HRE and GSF methods, (2) analyse the impact of the number of thigh markers used in the GSF method on reliability, (3) evaluate how alterations to the movements that comprise the functional trials impact HJC estimations using the GSF method, and (4) assess the influence of the initial guess in the GSF method on the HJC estimation. Fourteen healthy adults were tested on two occasions using a three-dimensional motion capture system. Skin surface marker positions were acquired while participants performed quiet stance, perturbed and non-perturbed functional trials, and walking trials. Results showed that the HRE were more reliable in locating the HJC than the GSF method. However, comparison of inter-session hip kinematics during gait did not show any significant difference between the approaches. Different initial guesses in the GSF method did not result in significant differences in the final HJC location. The GSF method was sensitive to the functional trial performance, and therefore it is important to standardize the functional trial performance to ensure a repeatable estimate of the HJC when using the GSF method.

  20. Application of FTA Method to Reliability Analysis of Vacuum Resin Shot Dosing Equipment


    Faults of vacuum resin shot dosing equipment are studied systematically and the fault tree of the system is constructed using the fault tree analysis (FTA) method. Then the qualitative and quantitative analyses of the tree are carried out, and according to the results of the analyses, measures to improve the system are worked out and implemented. As a result, the reliability of the equipment is greatly enhanced.
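Quantitative FTA of the kind applied here propagates basic-event probabilities through the gates of the tree. A minimal sketch with a hypothetical tree and event probabilities, not taken from the study:

```python
# Minimal fault-tree quantification sketch: top-event probability from
# independent basic events through AND/OR gates. The event probabilities
# below are hypothetical, not from the resin-dosing equipment study.

def gate_and(*ps):   # output fails only if ALL inputs fail
    out = 1.0
    for p in ps:
        out *= p
    return out

def gate_or(*ps):    # output fails if ANY input fails
    ok = 1.0
    for p in ps:
        ok *= (1.0 - p)
    return 1.0 - ok

pump_fail   = 0.01
valve_stuck = 0.02
sensor_bad  = 0.05
ctrl_bad    = 0.03

# Dosing fails if the pump fails, the valve sticks, or BOTH the sensor
# and the controller fail (redundant monitoring).
top = gate_or(pump_fail, valve_stuck, gate_and(sensor_bad, ctrl_bad))
print(f"top event probability = {top:.5f}")
```

The qualitative step (finding minimal cut sets) identifies which combinations of basic events, like the sensor-controller pair above, dominate the top event; the quantitative step then shows where improvement effort pays off most.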

  1. Lessons Learned on Benchmarking from the International Human Reliability Analysis Empirical Study

    Ronald L. Boring; John A. Forester; Andreas Bye; Vinh N. Dang; Erasmia Lois


    The International Human Reliability Analysis (HRA) Empirical Study is a comparative benchmark of the prediction of HRA methods to the performance of nuclear power plant crews in a control room simulator. There are a number of unique aspects to the present study that distinguish it from previous HRA benchmarks, most notably the emphasis on a method-to-data comparison instead of a method-to-method comparison. This paper reviews seven lessons learned about HRA benchmarking from conducting the study: (1) the dual purposes of the study afforded by joining another HRA study; (2) the importance of comparing not only quantitative but also qualitative aspects of HRA; (3) consideration of both negative and positive drivers on crew performance; (4) a relatively large sample size of crews; (5) the use of multiple methods and scenarios to provide a well-rounded view of HRA performance; (6) the importance of clearly defined human failure events; and (7) the use of a common comparison language to “translate” the results of different HRA methods. These seven lessons learned highlight how the present study can serve as a useful template for future benchmarking studies.

  2. Design and Analysis of a Flexible, Reliable Deep Space Life Support System

    Jones, Harry W.


    This report describes a flexible, reliable, deep space life support system design approach that uses either storage or recycling or both together. The design goal is to provide the needed life support performance with the required ultra reliability for the minimum Equivalent System Mass (ESM). Recycling life support systems used with multiple redundancy can have sufficient reliability for deep space missions but they usually do not save mass compared to mixed storage and recycling systems. The best deep space life support system design uses water recycling with sufficient water storage to prevent loss of crew if recycling fails. Since the amount of water needed for crew survival is a small part of the total water requirement, the required amount of stored water is significantly less than the total to be consumed. Water recycling with water, oxygen, and carbon dioxide removal material storage can achieve the high reliability of full storage systems with only half the mass of full storage and with less mass than the highly redundant recycling systems needed to achieve acceptable reliability. Improved recycling systems with lower mass and higher reliability could perform better than systems using storage.

  3. Vibration reliability analysis for aeroengine compressor blade based on support vector machine response surface method

    GAO Hai-feng; BAI Guang-chen


    To improve the efficiency of reliability analysis for aeroengine components such as the compressor blade, a support vector machine response surface method (SRSM) is proposed. SRSM integrates the advantages of the support vector machine (SVM) and the traditional response surface method (RSM), and uses experimental samples to construct a response surface function (RSF) that replaces the complicated and abstract finite element model. Moreover, the randomness of material parameters, structural dimensions and operating conditions is considered when extracting data, so that the response surface function agrees better with the practical model. The results indicate that, based on the same experimental data, the reliability estimate of SRSM approximates that of the Monte Carlo method (MCM) more closely than RSM does, while SRSM (17.296 s) needs far less running time than MCM (10,958 s) and RSM (9,840 s). Therefore, under the same simulation conditions, SRSM has the highest analysis efficiency and can be considered a feasible and valid method for structural reliability analysis.

  4. Reduced Expanding Load Method for Simulation-Based Structural System Reliability Analysis

    远方; 宋丽娜; 方江生


    The current situation and difficulties of structural system reliability analysis are outlined. On the basis of the Monte Carlo method and computer simulation, a new analysis method, the reduced expanding load method (RELM), is presented, which can be used to solve structural reliability problems effectively and conveniently. In this method, the uncertainties of loads, structural material properties and dimensions can be fully considered. If the statistical parameters of the stochastic variables are known, the probability of failure can be estimated rather accurately with this method. In contrast with traditional approaches, the RELM method gives a much better understanding of structural failure frequency, and its reliability index β is more meaningful. A specific example is given to illustrate this new idea.
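    The RELM algorithm itself is not detailed in the abstract, but the underlying Monte Carlo estimate of failure probability and the reliability index β can be sketched for a simple linear limit state g = R − S. All distribution parameters below are illustrative, not from the paper's example:

```python
import random
from statistics import NormalDist

random.seed(1)

# Crude Monte Carlo (not the paper's RELM refinement) for g = R - S with
# hypothetical normal resistance R and load S; failure occurs when g < 0.
R_MEAN, R_STD = 200.0, 20.0   # illustrative resistance statistics
S_MEAN, S_STD = 120.0, 25.0   # illustrative load statistics
N = 200_000                   # number of simulated samples

failures = sum(
    1
    for _ in range(N)
    if random.gauss(R_MEAN, R_STD) - random.gauss(S_MEAN, S_STD) < 0.0
)
pf = failures / N
beta = -NormalDist().inv_cdf(pf)   # generalized reliability index
print(f"pf = {pf:.4f}, beta = {beta:.2f}")
```

    For this linear Gaussian case the exact answer is β = 80/√(20² + 25²) ≈ 2.50, so the simulation can be checked directly; variance-reduction schemes such as RELM aim to reach that accuracy with far fewer samples.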

  5. Markov Chain Modelling of Reliability Analysis and Prediction under Mixed Mode Loading

    SINGH Salvinder; ABDULLAH Shahrum; NIK MOHAMED Nik Abdullah; MOHD NOORANI Mohd Salmi


    The reliability assessment of an automobile crankshaft provides an important understanding of the design life of the component, in order to eliminate or reduce the likelihood of failure and safety risks. Failure of the crankshaft is considered catastrophic, as it leads to severe failure of the engine block and its other connected subcomponents. The reliability of an automotive crankshaft under mixed-mode loading is studied using the Markov Chain Model. The Markov Chain is modelled with a two-state condition representing the bending and torsion loads that occur on the crankshaft. The automotive crankshaft is a good case study of a component under mixed-mode loading due to its rotating bending and torsion stresses. An estimate of the Weibull shape parameter is used to obtain the probability density function, cumulative distribution function, hazard and reliability rate functions, the bathtub curve and the mean time to failure. The properties of the shape parameter are used to model the failure characteristics through the bathtub curve. Likewise, an understanding of the patterns posed by the hazard rate can be used to improve the design and increase the life cycle based on the reliability and dependability of the component. The proposed reliability assessment provides an accurate, efficient, fast and cost-effective reliability analysis, in contrast to costly and lengthy experimental techniques.
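    The Weibull quantities listed above all have simple closed forms once the shape and scale are estimated. A sketch with illustrative parameter values (not the crankshaft estimates from the study):

```python
import math

# Two-parameter Weibull reliability quantities; values are illustrative.
BETA = 1.8        # shape (>1 places the component in the wear-out region)
ETA = 1.0e5       # scale, e.g. load cycles

def reliability(t):
    return math.exp(-((t / ETA) ** BETA))

def cdf(t):
    return 1.0 - reliability(t)

def pdf(t):
    return (BETA / ETA) * (t / ETA) ** (BETA - 1.0) * reliability(t)

def hazard(t):
    # Increasing in t whenever shape > 1: the right side of the bathtub.
    return pdf(t) / reliability(t)

def mttf():
    # Mean time to failure: eta * Gamma(1 + 1/beta).
    return ETA * math.gamma(1.0 + 1.0 / BETA)

print(f"R(50e3 cycles) = {reliability(5.0e4):.3f}, MTTF = {mttf():.0f} cycles")
```

    The bathtub interpretation follows directly: a fitted shape below 1 indicates infant mortality, near 1 a constant hazard, and above 1 (as assumed here) wear-out.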

  6. Analysis of the Kinematic Accuracy Reliability of a 3-DOF Parallel Robot Manipulator

    Guohua Cui


    Kinematic accuracy reliability is an important performance index in the evaluation of mechanism quality. Using a 3-DOF 3-PUU parallel robot manipulator as the research object, the position and orientation error model was derived by mapping the relation between the input and output of the mechanism. Three error sensitivity indexes that evaluate the kinematic accuracy of the parallel robot manipulator were obtained by applying the singular value decomposition of the error translation matrix. Considering the influence of controllable and uncontrollable factors on the kinematic accuracy, a mathematical model of reliability based on random probability was employed. The measurement and calculation method for evaluating the mechanism’s kinematic reliability level was also provided. By analysing the mechanism’s errors and reliability, the law of surface error sensitivity for the location and structure parameters was obtained. The kinematic reliability of the parallel robot manipulator was statistically computed on the basis of the Monte Carlo simulation method. The reliability analysis of kinematic accuracy provides a theoretical basis for design optimization and error compensation.

  7. Reliability Analysis of Distributed Grid-connected Photovoltaic System Monitoring Network

    Fu Zhixin


    A large number of distributed grid-connected photovoltaic (PV) systems have brought new challenges to the dispatching of the power network. Real-time monitoring of the PV systems can efficiently help improve the power network’s ability to accept and control distributed PV systems, and thus mitigate the impact on the power network imposed by the uncertainty of their power output. To study the reliability of a distributed PV monitoring network, it is of great significance to find a method for building a highly reliable monitoring system and to analyze the weak links and key nodes of its monitoring performance. Firstly, a reliability model of the PV monitoring system was constructed based on WSN technology. Then, in view of the dynamic characteristics of the network’s reliability, fault tree analysis was used to identify the possible causes of network failure and the logical relationships between them. Finally, the reliability of the monitoring network was analyzed to identify the weak links and key nodes. This paper provides guidance for building a stable and reliable monitoring network for a distributed PV system.

  8. A comparative analysis of frog early development.

    del Pino, Eugenia M; Venegas-Ferrín, Michael; Romero-Carvajal, Andrés; Montenegro-Larrea, Paola; Sáenz-Ponce, Natalia; Moya, Iván M; Alarcón, Ingrid; Sudou, Norihiro; Yamamoto, Shinji; Taira, Masanori


    The current understanding of Xenopus laevis development provides a comparative background for the analysis of frog developmental modes. Our analysis of development in various frogs reveals that the mode of gastrulation is associated with developmental rate and is unrelated to egg size. In the gastrula of the rapidly developing embryos of the foam-nesting frogs Engystomops coloradorum and Engystomops randi, archenteron and notochord elongation overlapped with involution at the blastopore lip, as in X. laevis embryos. In embryos of dendrobatid frogs and in the frog without tadpoles Eleutherodactylus coqui, which develop somewhat more slowly than X. laevis, involution and archenteron elongation concomitantly occurred during gastrulation; whereas elongation of the notochord and, therefore, dorsal convergence and extension, occurred in the postgastrula. In contrast, in the slow developing embryos of the marsupial frog Gastrotheca riobambae, only involution occurred during gastrulation. The processes of archenteron and notochord elongation and convergence and extension were postgastrulation events. We produced an Ab against the homeodomain protein Lim1 from X. laevis as a tool for the comparative analysis of development. By the expression of Lim1, we were able to identify the dorsal side of the G. riobambae early gastrula, which otherwise was difficult to detect. Moreover, the Lim1 expression in the dorsal lip of the blastopore and notochord differed among the studied frogs, indicating variation in the timing of developmental events. The variation encountered gives evidence of the modular character of frog gastrulation.

  9. An Efficient Approach for the Reliability Analysis of Phased-Mission Systems with Dependent Failures

    Xing, Liudong; Meshkat, Leila; Donahue, Susan K.


    We consider the reliability analysis of phased-mission systems with common-cause failures in this paper. Phased-mission systems (PMS) are systems supporting missions characterized by multiple, consecutive, and nonoverlapping phases of operation. System components may be subject to different stresses as well as different reliability requirements throughout the course of the mission. As a result, component behavior and relationships may need to be modeled differently from phase to phase when performing a system-level reliability analysis. This consideration poses unique challenges to existing analysis methods. The challenges increase when common-cause failures (CCF) are incorporated in the model. CCF are multiple dependent component failures within a system that are a direct result of a shared root cause, such as sabotage, flood, earthquake, power outage, or human errors. It has been shown by many reliability studies that CCF tend to increase a system's joint failure probabilities and thus contribute significantly to the overall unreliability of systems subject to CCF. We propose a separable phase-modular approach to the reliability analysis of phased-mission systems with dependent common-cause failures as one way to meet the above challenges in an efficient and elegant manner. Our methodology is twofold: first, we separate the effects of CCF from the PMS analysis using the total probability theorem and the common-cause event space developed based on the elementary common-causes; next, we apply an efficient phase-modular approach to analyze the reliability of the PMS. The phase-modular approach employs both combinatorial binary decision diagram and Markov-chain solution methods as appropriate. We provide an example of a reliability analysis of a PMS with both static and dynamic phases as well as CCF as an illustration of our proposed approach. The example is based on information extracted from a Mars orbiter project. The reliability model for this orbiter considers
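    The first step of the separable approach, conditioning on the common-cause event space via the total probability theorem, can be sketched on a toy two-phase mission. The phase structure and every probability below are hypothetical, not the Mars orbiter numbers:

```python
P_CCF = 0.01   # probability the shared root cause (e.g. a flood) occurs

def mission_unreliability(ccf_occurred):
    # Toy PMS: two phases in series; phase i fails only if both of its
    # redundant units fail. A CCF takes out both units of phase 1 outright.
    p_unit = [0.02, 0.05]          # per-phase unit failure probabilities
    phase1_fail = 1.0 if ccf_occurred else p_unit[0] ** 2
    phase2_fail = p_unit[1] ** 2
    return 1.0 - (1.0 - phase1_fail) * (1.0 - phase2_fail)

# Total probability theorem over the (tiny) common-cause event space:
# P(fail) = P(fail | CCF) P(CCF) + P(fail | no CCF) P(no CCF).
p_fail = (mission_unreliability(True) * P_CCF
          + mission_unreliability(False) * (1.0 - P_CCF))
print(f"mission unreliability: {p_fail:.6f}")
```

    The conditioning removes the dependence from the remaining analysis: each conditional term involves only independent failures, so the efficient phase-modular (BDD/Markov) machinery can be applied to it directly.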

  10. Guidelines for reliability analysis of digital systems in PSA context. Phase 1 status report

    Authen, S.; Larsson, J. (Risk Pilot AB, Stockholm (Sweden)); Bjoerkman, K.; Holmberg, J.-E. (VTT, Helsingfors (Finland))


    Digital protection and control systems are appearing as upgrades in older nuclear power plants (NPPs) and are commonplace in new NPPs. To assess the risk of NPP operation and to determine the risk impact of digital system upgrades on NPPs, quantitative reliability models are needed for digital systems. Due to the many unique attributes of these systems, challenges exist in systems analysis, modeling and data collection. Currently there is no consensus on reliability analysis approaches. Traditional methods have clear limitations, but more dynamic approaches are still at the trial stage and can be difficult to apply in full-scale probabilistic safety assessments (PSA). The PSAs worldwide that include reliability models of digital I&C systems are few. A comparison of Nordic experiences and a literature review of the main international references have been performed in this pre-study project. The study shows a wide range of approaches, and also indicates that no state of the art currently exists. The study shows areas where the different PSAs agree and gives the basis for development of a common taxonomy for reliability analysis of digital systems. It is still an open matter whether software reliability needs to be explicitly modelled in the PSA. The most important issue concerning software reliability is a proper description of the impact that software-based systems have on the dependence between the safety functions and the structure of accident sequences. In general, the conventional fault tree approach seems to be sufficient for modelling functions of the reactor protection system type. The following focus areas have been identified for further activities: 1. A common taxonomy of hardware and software failure modes of digital components for common use 2. Guidelines regarding the level of detail in system analysis and screening of components, failure modes and dependencies 3. An approach for modelling of CCF between components (including software). (Author)

  11. Experimental results of fingerprint comparison validity and reliability: A review and critical analysis.

    Haber, Ralph Norman; Haber, Lyn


    Our purpose in this article is to determine whether the results of the published experiments on the accuracy and reliability of fingerprint comparison can be generalized to fingerprint laboratory casework, and/or to document the error rate of the Analysis-Comparison-Evaluation (ACE) method. We review the existing 13 published experiments on fingerprint comparison accuracy and reliability. These studies comprise the entire corpus of experimental research published on the accuracy of fingerprint comparisons since criminal courts first admitted forensic fingerprint evidence about 120 years ago. We start with the two studies by Ulery, Hicklin, Buscaglia and Roberts (2011, 2012), because they are recent, large, designed specifically to provide estimates of the accuracy and reliability of fingerprint comparisons, and to respond to the criticisms cited in the National Academy of Sciences Report (2009). Following the two Ulery et al. studies, we review and evaluate the other eleven experiments, considering problems that are unique to each. We then evaluate the 13 experiments for the problems common to all or most of them, especially with respect to the generalizability of their results to laboratory casework. Overall, we conclude that the experimental designs employed deviated from casework procedures in critical ways that preclude generalization of the results to casework. The experiments asked examiner-subjects to carry out their comparisons using different responses from those employed in casework; the experiments presented the comparisons in formats that differed from casework; the experiments enlisted highly trained examiners as experimental subjects rather than subjects drawn randomly from among all fingerprint examiners; the experiments did not use fingerprint test items known to be comparable in type and especially in difficulty to those encountered in casework; and the experiments did not require examiners to use the ACE method, nor was that method defined

  12. Saddlepoint approximation based structural reliability analysis with non-normal random variables


    The saddlepoint approximation (SA) can directly estimate the probability distribution of linear performance function in non-normal variables space. Based on the property of SA, three SA based methods are developed for the structural system reliability analysis. The first method is SA based reliability bounds theory (RBT), in which SA is employed to estimate failure probability and equivalent normal reliability index for each failure mode firstly, and then RBT is employed to obtain the upper and the lower bounds of system failure probability. The second method is SA based Nataf approximation, in which SA is used to estimate the probability density function (PDF) and cumulative distribution function (CDF) for the approximately linearized performance function of each failure mode. After the PDF of each failure mode and the correlation coefficients among approximately linearized performance functions are estimated, Nataf distribution is employed to approximate the joint PDF of multiple structural system performance functions, and then the system failure probability can be estimated directly by numerical simulation using the joint PDF. The third method is SA based line sampling (LS). The standardization transformation is needed to eliminate the dimensions of variables firstly in this case. Then LS method can express the system failure probability as an arithmetic average of a set of failure probabilities of the linear performance functions, and the probabilities of the linear performance functions can be estimated by the SA in the non-normal variables space. By comparing basic concepts, implementations and results of illustrations, the following conclusions can be drawn: (1) The first method can only obtain the bounds of system failure probability and it is only acceptable for the linear limit state function; (2) the second method can give the estimation of system failure probability, and its error mostly results from the approximation of Nataf distribution for the

  13. WHO quality of life-BREF 26 questionnaire: reliability and validity of the Persian version and compare it with Iranian diabetics quality of life questionnaire in diabetic patients.

    Jahanlou, Alireza Shahab; Karami, Nader Alishan


    There are several tools for the assessment of quality of life (QOL) in diabetes mellitus. In the current research, two standard questionnaires for evaluating QOL were selected: the World Health Organization quality of life questionnaire (WHOQOL-BREF 26) and the Iranian diabetics quality of life questionnaire (IRDQOL). The first aim of this study was to assess the reliability and validity of the Persian version of the WHOQOL-BREF 26; the second was to compare it with the IRDQOL questionnaire in diabetic patients. A random sample of Iranian adult outpatient diabetics (n=387) was selected, and they completed the WHOQOL and IRDQOL assessment instruments. In addition, HbA1c was measured in these patients by a colorimetric method. Data analysis was carried out using the t-test and Spearman and Pearson correlation coefficients. Based on the Pearson correlations, all subscales and total QOL in the two questionnaires showed highly acceptable test-retest reliability. Comparison of total QOL and similar domains in the two questionnaires showed that the physical domain score in IRDQOL was lower than in WHOQOL and the difference was significant (Pafore-mentioned domains and glycemic control.

  14. Reliability assessment of different plate theories for elastic wave propagation analysis in functionally graded plates.

    Mehrkash, Milad; Azhari, Mojtaba; Mirdamadi, Hamid Reza


    The importance of elastic wave propagation problem in plates arises from the application of ultrasonic elastic waves in non-destructive evaluation of plate-like structures. However, precise study and analysis of acoustic guided waves especially in non-homogeneous waveguides such as functionally graded plates are so complicated that exact elastodynamic methods are rarely employed in practical applications. Thus, the simple approximate plate theories have attracted much interest for the calculation of wave fields in FGM plates. Therefore, in the current research, the classical plate theory (CPT), first-order shear deformation theory (FSDT) and third-order shear deformation theory (TSDT) are used to obtain the transient responses of flexural waves in FGM plates subjected to transverse impulsive loadings. Moreover, comparing the results with those based on a well recognized hybrid numerical method (HNM), we examine the accuracy of the plate theories for several plates of various thicknesses under excitations of different frequencies. The material properties of the plate are assumed to vary across the plate thickness according to a simple power-law distribution in terms of volume fractions of constituents. In all analyses, spatial Fourier transform together with modal analysis are applied to compute displacement responses of the plates. A comparison of the results demonstrates the reliability ranges of the approximate plate theories for elastic wave propagation analysis in FGM plates. Furthermore, based on various examples, it is shown that whenever the plate theories are used within the appropriate ranges of plate thickness and frequency content, solution process in wave number-time domain based on modal analysis approach is not only sufficient but also efficient for finding the transient waveforms in FGM plates.

  15. Inpatient care in Kazakhstan: A comparative analysis

    Ainur B Kumar


    Background: Reforms in inpatient care are critical for enhancing the efficiency of health systems. Inpatient care remains the most costly sector of the health system, accounting for more than 60% of all expenditures, and inappropriate and ineffective use of the hospital infrastructure is also a big issue. We aimed to analyze statistical data on health indices and the dynamics of the hospital stock in Kazakhstan in comparison with those of developed countries. Materials and Methods: The study design is a comparative quantitative analysis of inpatient care indicators. We used information and analytical methods, content analysis, mathematical treatment, and comparative analysis of statistical data on the health system and dynamics of the hospital stock in Kazakhstan and some other countries of the world [Organization for Economic Cooperation and Development (OECD), USA, Canada, Russia, China, Japan, and Korea] over the period 2001-2011. Results: Despite substantial and continuous reductions over the past 10 years, hospitalization rates in Kazakhstan still remain high compared to some developed countries, including those of the OECD. In 2011, the average hospital stay in Kazakhstan was around 9.9 days, the hospitalization ratio was 16.3 per 100 people, and hospital bed capacity was 100 per 10,000 inhabitants. Conclusion: The decreased number of beds may adversely affect both medical organization and health system operations. Alternatives to the existing inpatient care are now being explored. The introduction of the unified national healthcare system allows shifting the primary focus to primary care organizations, which can decrease the demand for inpatient care by improving the health status of people at the primary care level.

  16. Comparative Analysis of Hand Gesture Recognition Techniques

    Arpana K. Patel


    During the past few years, human hand gesture interaction with computing devices has continued to be an active area of research. In this paper, a survey of hand gesture recognition is provided. Hand gesture recognition comprises three stages: pre-processing, feature extraction or matching, and classification or recognition. Each stage involves different methods and techniques. This paper gives a brief description of the different methods used for hand gesture recognition in existing systems, together with a comparative analysis of each method's benefits and drawbacks.

  17. Hierarchical modeling for reliability analysis using Markov models. B.S./M.S. Thesis - MIT

    Fagundo, Arturo


    Markov models represent an extremely attractive tool for the reliability analysis of many systems. However, Markov model state space grows exponentially with the number of components in a given system. Thus, for very large systems Markov modeling techniques alone become intractable in both memory and CPU time. Often a particular subsystem can be found within some larger system where the dependence of the larger system on the subsystem is of a particularly simple form. This simple dependence can be used to decompose such a system into one or more subsystems. A hierarchical technique is presented which can be used to evaluate these subsystems in such a way that their reliabilities can be combined to obtain the reliability for the full system. This hierarchical approach is unique in that it allows the subsystem model to pass multiple aggregate state information to the higher level model, allowing more general systems to be evaluated. Guidelines are developed to assist in the system decomposition. An appropriate method for determining subsystem reliability is also developed. This method gives rise to some interesting numerical issues. Numerical error due to roundoff and integration are discussed at length. Once a decomposition is chosen, the remaining analysis is straightforward but tedious. However, an approach is developed for simplifying the recombination of subsystem reliabilities. Finally, a real world system is used to illustrate the use of this technique in a more practical context.
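    The hierarchical idea above, evaluating each subsystem with its own small Markov model and recombining the results at the top level, can be sketched with non-repairable 1-out-of-2 subsystems, whose Markov models have a closed-form solution. The rates, mission time, and subsystem names below are illustrative, not from the thesis:

```python
import math

def parallel_pair_reliability(lam, t):
    # Markov chain with states {2 good} -> {1 good} -> {failed} and
    # exponential failures at rate lam per unit; the closed-form transient
    # solution is R(t) = 2*exp(-lam*t) - exp(-2*lam*t).
    return 2.0 * math.exp(-lam * t) - math.exp(-2.0 * lam * t)

def series(*reliabilities):
    # Top-level recombination: the system works only if every subsystem works.
    out = 1.0
    for r in reliabilities:
        out *= r
    return out

T = 1000.0  # mission time in hours (illustrative)
r_power = parallel_pair_reliability(1e-4, T)   # hypothetical power subsystem
r_cpu = parallel_pair_reliability(5e-5, T)     # hypothetical processing pair
print(f"system reliability at {T:.0f} h: {series(r_power, r_cpu):.4f}")
```

    Solving each 3-state chain separately and multiplying sidesteps the 9-state combined model; the thesis generalizes this by letting subsystems pass multiple aggregate states upward rather than a single reliability number.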

  18. Johnson Space Center's Risk and Reliability Analysis Group 2008 Annual Report

    Valentine, Mark; Boyer, Roger; Cross, Bob; Hamlin, Teri; Roelant, Henk; Stewart, Mike; Bigler, Mark; Winter, Scott; Reistle, Bruce; Heydorn, Dick


    The Johnson Space Center (JSC) Safety & Mission Assurance (S&MA) Directorate's Risk and Reliability Analysis Group provides both mathematical and engineering analysis expertise in the areas of Probabilistic Risk Assessment (PRA), Reliability and Maintainability (R&M) analysis, and data collection and analysis. The fundamental goal of this group is to provide National Aeronautics and Space Administration (NASA) decision makers with the necessary information to make informed decisions when evaluating personnel, flight hardware, and public safety concerns associated with current operating systems as well as with any future systems. The Analysis Group includes a staff of statistical and reliability experts with valuable backgrounds in the statistical, reliability, and engineering fields. This group includes JSC S&MA Analysis Branch personnel as well as S&MA support services contractors, such as Science Applications International Corporation (SAIC) and SoHaR. The Analysis Group's experience base includes nuclear power (both commercial and navy), manufacturing, Department of Defense, chemical, and shipping industries, as well as significant aerospace experience specifically in the Shuttle, International Space Station (ISS), and Constellation Programs. The Analysis Group partners with project and program offices, other NASA centers, NASA contractors, and universities to provide additional resources or information to the group when performing various analysis tasks. The JSC S&MA Analysis Group is recognized as a leader in risk and reliability analysis within the NASA community. Therefore, the Analysis Group is in high demand to help the Space Shuttle Program (SSP) continue to fly safely, assist in designing the next-generation spacecraft for the Constellation Program (CxP), and promote advanced analytical techniques.
The Analysis Section's tasks include teaching classes and instituting personnel qualification processes to enhance the professional abilities of our analysts

  19. A continuous-time Bayesian network reliability modeling and analysis framework

    Boudali, H.; Dugan, J.B.


    We present a continuous-time Bayesian network (CTBN) framework for dynamic systems reliability modeling and analysis. Dynamic systems exhibit complex behaviors and interactions between their components, where not only the combination of failure events matters, but so does their sequence ordering.

  20. Reliability of ¹H NMR analysis for assessment of lipid oxidation at frying temperatures

    The reliability of a method using ¹H NMR analysis for assessment of oil oxidation at a frying temperature was examined. During heating and frying at 180 °C, changes of soybean oil signals in the ¹H NMR spectrum including olefinic (5.16-5.30 ppm), bisallylic (2.70-2.88 ppm), and allylic (1.94-2.1...

  1. The Stress and Reliability Analysis of HTR’s Graphite Component

    Xiang Fang


    The high temperature gas cooled reactor (HTR) is developing rapidly in a modular, compact, and integral direction. As the main structural material, graphite plays a very important role in HTR engineering, and the reliability of the graphite components has a close relationship with the integrity of the reactor core. The graphite components are subjected to high temperature and fast neutron irradiation simultaneously during normal operation of the reactor. With the stress accumulation induced by high temperature and irradiation, the failure risk of graphite components increases constantly. Therefore it is necessary to study and simulate the mechanical behavior of graphite components under in-core working conditions and to forecast the internal stress accumulation history and the variation of reliability. The work of this paper focuses on the mechanical analysis of the pebble-bed type HTR's graphite brick. The analysis process comprises two procedures, stress analysis and reliability analysis. Three different creep models and two different reliability models are reviewed and taken into account in the simulation. The stress and failure probability calculation results are obtained and discussed. The results gained with the various models are highly consistent, and the discrepancies are acceptable.

  2. Reliability of an Automated High-Resolution Manometry Analysis Program across Expert Users, Novice Users, and Speech-Language Pathologists

    Jones, Corinne A.; Hoffman, Matthew R.; Geng, Zhixian; Abdelhalim, Suzan M.; Jiang, Jack J.; McCulloch, Timothy M.


    Purpose: The purpose of this study was to investigate inter- and intrarater reliability among expert users, novice users, and speech-language pathologists with a semiautomated high-resolution manometry analysis program. We hypothesized that all users would have high intrarater reliability and high interrater reliability. Method: Three expert…

  3. Human reliability analysis of the Tehran research reactor using the SPAR-H method

    Barati Ramin


    The purpose of this paper is to cover the human reliability analysis of the Tehran research reactor, using an appropriate method for representing human failure probabilities. In the present work, the Technique for Human Error Rate Prediction (THERP) and the Standardized Plant Analysis Risk-Human Reliability (SPAR-H) methods, applied extensively to nuclear power plants, have been utilized to quantify different categories of human errors. Human reliability analysis is an integral and significant part of probabilistic safety analysis (PSA) studies; without it, a PSA would not be a systematic and complete representation of actual plant risks. In addition, possible human errors in research reactors constitute a significant part of the associated risk of such installations, and including them in a PSA for such facilities is a complicated issue. SPAR-H can address these concerns; it is a well-documented and systematic human reliability analysis system, with tables of human performance choices prepared in consultation with experts in the domain. In this method, performance shaping factors are selected via tables, human action dependencies are accounted for, and the method is well designed for its intended use. In this study, in consultation with reactor operators, human errors were identified and adequate performance shaping factors assigned to produce proper human failure probabilities. Our importance analysis revealed that the human actions associated with the possibility of an external object falling onto the reactor core are the most significant human errors for the Tehran research reactor, and should be considered in reactor emergency operating procedures and operator training programs aimed at improving reactor safety.
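    The SPAR-H quantification step scales a nominal human error probability (HEP) by the product of performance shaping factor (PSF) multipliers, applying an adjustment formula when three or more PSFs are negative so the result stays below 1. A sketch with hypothetical PSF assignments, not the Tehran reactor values:

```python
# Nominal HEPs used by SPAR-H: 1e-2 for diagnosis tasks, 1e-3 for actions.
NOMINAL_HEP = {"diagnosis": 1.0e-2, "action": 1.0e-3}

def spar_h_hep(task_type, psf_multipliers):
    nhep = NOMINAL_HEP[task_type]
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    negative = sum(1 for m in psf_multipliers if m > 1.0)
    if negative >= 3:
        # SPAR-H adjustment formula, used when three or more PSFs are
        # negative, keeps the adjusted HEP bounded below 1.
        return nhep * composite / (nhep * (composite - 1.0) + 1.0)
    return min(nhep * composite, 1.0)

# Hypothetical operator action with degraded stress and procedure PSFs.
hep = spar_h_hep("action", [2.0, 5.0])   # stress x2, procedures x5
print(f"adjusted HEP: {hep:.4f}")        # 1e-3 * 10 = 0.01
```

    Dependency between successive human actions is handled in a separate SPAR-H step (conditional HEP tables) that is not sketched here.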

  4. An efficient hybrid reliability analysis method with random and interval variables

    Xie, Shaojun; Pan, Baisong; Du, Xiaoping


    Random and interval variables often coexist. Interval variables make reliability analysis much more computationally intensive. This work develops a new hybrid reliability analysis method so that the probability analysis (PA) loop and interval analysis (IA) loop are decomposed into two separate loops. An efficient PA algorithm is employed, and a new efficient IA method is developed. The new IA method consists of two stages. The first stage is for monotonic limit-state functions. If the limit-state function is not monotonic, the second stage is triggered. In the second stage, the limit-state function is sequentially approximated with a second order form, and the gradient projection method is applied to solve the extreme responses of the limit-state function with respect to the interval variables. The efficiency and accuracy of the proposed method are demonstrated by three examples.
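    The decoupled PA/IA loops described above can be contrasted with the brute-force baseline they replace: an outer Monte Carlo loop over the random variables with an inner sweep that bounds the limit-state function over the interval variable. The sketch below shows that baseline, not the authors' method; the limit-state function, distribution parameters, and interval bounds are invented for illustration:

```python
import random

def limit_state(x, y):
    # Illustrative limit state g(x, y); failure when g < 0.
    # x: realization of the random variable; y: interval variable value.
    return 5.0 - x - 0.5 * y

def pf_bounds(n=20000, y_lo=-1.0, y_hi=1.0, seed=0):
    """Brute-force double loop: Monte Carlo over the random variable,
    grid sweep over the interval variable to bound the failure probability."""
    rng = random.Random(seed)
    fail_all = fail_any = 0
    ys = [y_lo + (y_hi - y_lo) * k / 20 for k in range(21)]
    for _ in range(n):
        x = rng.gauss(3.0, 1.0)               # random variable ~ N(3, 1)
        gs = [limit_state(x, y) for y in ys]
        if max(gs) < 0:                        # fails for every y: lower bound
            fail_all += 1
        if min(gs) < 0:                        # fails for some y: upper bound
            fail_any += 1
    return fail_all / n, fail_any / n

lo, hi = pf_bounds()
```

The interval variable turns the single failure probability into an interval [lo, hi], which is exactly why the inner loop makes the analysis so much more expensive and why decoupling the two loops pays off.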

  5. Reliability analysis of an LCL tuned track segmented bi-directional inductive power transfer system

    Asif Iqbal, S. M.; Madawala, U. K.; Thrimawithana, D. J.;


    The Bi-directional Inductive Power Transfer (BDIPT) technique is suitable for renewable-energy-based applications such as electric vehicles (EVs), for the implementation of vehicle-to-grid (V2G) systems. Recently, more efforts have been made by researchers to improve both efficiency and reliability...... for a 1.5 kW BDIPT system in a MATLAB/Simulink environment. Reliability parameters such as failure rate and mean time between failures (MTBF) are compared between the two systems. A nonlinear programming (NP) model is developed for optimizing the charging schedule of a stationary EV. A case study of EV...

  6. A comparative analysis of academic dissertation management systems in China

    GAO; Min; JIN; Yuling; WANG; Zhengjun; DAI; Yumei; WANG; Ruiyun


    The paper presents a comparative evaluation of the four major digital theses and dissertations management systems used in mainland China: TRS, TPI, TASi and IDL-ETD. The evaluation is primarily based on systematic tests of these systems conducted by the DUT (Dalian University of Technology) Library. Special focus is placed on the distinctive features of each system, such as performance in terms of stability, reliability, openness, capacity for backdating, copyright protection, service monitoring, document conversion and release automation, log management, statistical tabulations, vendor's technical support, and so on. In addition, the authors provide statistics on the choice of academic dissertation system at most "985 Project" colleges and universities in China.

  7. Fuzzy Fatigue Reliability Analysis of Offshore Platforms in Ice-Infested Waters

    方华灿; 段梦兰; 贾星兰; 谢彬


    The calculation of fatigue stress ranges due to random waves and ice loads on offshore structures is discussed, and the corresponding cumulative fatigue damage of the structural members is evaluated. To evaluate the fatigue damage to the structures more accurately, the Miner rule is modified to account for the fuzziness of the parameters concerned, and a new model for fuzzy fatigue reliability analysis of offshore structural members is developed. Furthermore, an assessment method for predicting the dynamics of the fuzzy fatigue reliability of structural members is provided.

  8. Reliability Analysis of Piezoelectric Truss Structures Under Joint Action of Electric and Mechanical Loading

    YANG Duo-he; AN Wei-guang; ZHU Rong-rong; MIAO Han


    Based on the finite element method (FEM) for the dynamic analysis of piezoelectric truss structures, expressions for the safety margins of strength fracture and damage electric field in the structural elements are given, considering the electromechanical coupling effect under the joint action of electric and mechanical loads. By introducing the stochastic FEM, the reliability of piezoelectric truss structures is analyzed by solving for partial derivatives in the course of solving the dynamic response of the structural system with the mode-superposition method. The influence of the electromechanical coupling effect on the reliability index is then analyzed through an example.

  9. Tensile reliability analysis for gravity dam foundation surface based on FEM and response surface method

    Tong-chun LI; Li, Dan-Dan; Wang, Zhi-Qiang


    In the paper, the limit state equation for the tensile reliability of the foundation base of a gravity dam is established. The possible crack length is set as the action effect and the allowable crack length is set as the resistance in this limit state. Nonlinear FEM is applied to obtain the crack length of the foundation base of the gravity dam, and a linear response surface method based on the orthogonal test design method is used to calculate the reliability, which offers a reasonable and simple analysis method t...


    Dars, P.; Ternisien D'Ouville, T.; Mingam, H.; Merckel, G.


    Statistical analysis of asymmetry in the electrical characteristics of LDD NMOSFETs shows the influence of implantation angles on the non-overlap variation observed on devices realized on a 100 mm wafer and within the wafers of a batch. A study of the consequences of this dispersion on aging behaviour illustrates the importance of this parameter for reliability and the necessity of taking it into account for accurate analysis of stress results.

  11. A limited assessment of the ASEP human reliability analysis procedure using simulator examination results

    Gore, B.R.; Dukelow, J.S. Jr.; Mitts, T.M.; Nicholson, W.L. [Pacific Northwest Lab., Richland, WA (United States)


    This report presents a limited assessment of the conservatism of the Accident Sequence Evaluation Program (ASEP) human reliability analysis (HRA) procedure described in NUREG/CR-4772. In particular, the ASEP post-accident, post-diagnosis, nominal HRA procedure is assessed within the context of an individual's performance of critical tasks on the simulator portion of requalification examinations administered to nuclear power plant operators. An assessment of the degree to which operator performance during simulator examinations is an accurate reflection of operator performance during actual accident conditions was outside the scope of work for this project; therefore, no direct inference can be made from this report about such performance. The data for this study are derived from simulator examination reports from the NRC requalification examination cycle. A total of 4071 critical tasks were identified, of which 45 had been failed. The ASEP procedure was used to estimate human error probability (HEP) values for the critical tasks, and the HEP results were compared with the failure rates observed in the examinations. The ASEP procedure was applied by PNL operator license examiners, who supplemented the limited information in the examination reports with expert judgment based upon their extensive simulator examination experience. ASEP analyses were performed for a sample of 162 critical tasks selected randomly from the 4071, and the results were used to characterize the entire population. ASEP analyses were also performed for all 45 failed critical tasks. Two tests were performed to assess the bias of the ASEP HEPs compared with the data from the requalification examinations. The first compared the average of the ASEP HEP values with the fraction of the population actually failed, and it found a statistically significant factor-of-two bias on the average.

  12. Comparative genome analysis of Basidiomycete fungi

    Riley, Robert; Salamov, Asaf; Henrissat, Bernard; Nagy, Laszlo; Brown, Daren; Held, Benjamin; Baker, Scott; Blanchette, Robert; Boussau, Bastien; Doty, Sharon L.; Fagnan, Kirsten; Floudas, Dimitris; Levasseur, Anthony; Manning, Gerard; Martin, Francis; Morin, Emmanuelle; Otillar, Robert; Pisabarro, Antonio; Walton, Jonathan; Wolfe, Ken; Hibbett, David; Grigoriev, Igor


    Fungi of the phylum Basidiomycota (basidiomycetes) make up some 37% of the described fungi, and are important in forestry, agriculture, medicine, and bioenergy. This diverse phylum includes symbionts, pathogens, and saprotrophs, including the majority of wood-decaying and ectomycorrhizal species. To better understand the genetic diversity of this phylum we compared the genomes of 35 basidiomycetes, including 6 newly sequenced genomes. These genomes span extremes of genome size, gene number, and repeat content. Analysis of core genes reveals that some 48% of basidiomycete proteins are unique to the phylum, with nearly half of those (22%) found in only one organism. Correlations between lifestyle and certain gene families are evident. Phylogenetic patterns of plant-biomass-degrading genes in Agaricomycotina suggest a continuum rather than a dichotomy between the white rot and brown rot modes of wood decay. Based on phylogenetically informed PCA analysis of wood decay genes, we predict that Botryobasidium botryosum and Jaapia argillacea have properties similar to white rot species, although neither has typical ligninolytic class II fungal peroxidases (PODs). This prediction is supported by growth assays in which both fungi exhibit wood decay with white-rot-like characteristics. Based on this, we suggest that the white/brown rot dichotomy may be inadequate to describe the full range of wood-decaying fungi. Analysis of the rate of discovery of proteins with no or few homologs suggests the value of continued sequencing of basidiomycete fungi.

  13. A model for reliability analysis and calculation applied in an example from chemical industry

    Pejović Branko B.


    Full Text Available The subject of the paper is reliability design in polymerization processes that occur in reactors in the chemical industry. The designed model is used to determine the characteristics and indicators of reliability, which enables the determination of the basic factors that result in poor development of a process. This would reduce the anticipated losses through the ability to control them, as well as enabling the improvement of production quality, which is the major goal of the paper. The reliability analysis and calculation use a deductive method based on designing a fault tree scheme for the system, built on inductive conclusions. It involves the use of standard logical symbols and the rules of Boolean algebra and mathematical logic. The paper eventually gives the results of the work in the form of a quantitative and qualitative reliability analysis of the observed process, which serves to obtain complete information on the probability of the top event in the process, as well as to support objective decision making and alternative solutions.

  14. The design and use of reliability data base with analysis tool

    Doorepall, J.; Cooke, R.; Paulsen, J.; Hokstadt, P.


    With the advent of sophisticated computer tools, it is possible to give a distributed population of users direct access to reliability component operational histories. This allows the user greater freedom in defining statistical populations of components and selecting failure modes. However, the reliability data analyst's current analytical instrumentarium is not adequate for this purpose. The terminology used in organizing and gathering reliability data is not standardized, and the statistical methods used in analyzing the data are not always suitably chosen. This report attempts to establish a baseline with regard to terminology and analysis methods, to support the use of a new analysis tool. It builds on results obtained in several projects for ESTEC and SKI on the design of reliability databases. Starting with component socket time histories, we identify a sequence of questions which should be answered prior to the employment of analytical methods. These questions concern the homogeneity and stationarity of (possibly dependent) competing failure modes and the independence of competing failure modes. Statistical tests, some of them new, are proposed for answering these questions. Attention is given to issues of non-identifiability of competing risks and clustering of failure-repair events. These ideas have been implemented in an analysis tool for analyzing component socket time histories, and illustrative results are presented. The appendix provides background on statistical tests and competing failure modes. (au) 4 tabs., 17 ills., 61 refs.

  15. CARES/LIFE Ceramics Analysis and Reliability Evaluation of Structures Life Prediction Program

    Nemeth, Noel N.; Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.


    This manual describes the Ceramics Analysis and Reliability Evaluation of Structures Life Prediction (CARES/LIFE) computer program. The program calculates the time-dependent reliability of monolithic ceramic components subjected to thermomechanical and/or proof test loading. CARES/LIFE is an extension of the CARES (Ceramic Analysis and Reliability Evaluation of Structures) computer program. The program uses results from MSC/NASTRAN, ABAQUS, and ANSYS finite element analysis programs to evaluate component reliability due to inherent surface and/or volume type flaws. CARES/LIFE accounts for the phenomenon of subcritical crack growth (SCG) by utilizing the power law, Paris law, or Walker law. The two-parameter Weibull cumulative distribution function is used to characterize the variation in component strength. The effects of multiaxial stresses are modeled by using either the principle of independent action (PIA), the Weibull normal stress averaging method (NSA), or the Batdorf theory. Inert strength and fatigue parameters are estimated from rupture strength data of naturally flawed specimens loaded in static, dynamic, or cyclic fatigue. The probabilistic time-dependent theories used in CARES/LIFE, along with the input and output for CARES/LIFE, are described. Example problems to demonstrate various features of the program are also included.
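    The two-parameter Weibull distribution mentioned above relates an applied stress to a probability of failure through a modulus and a scale parameter. A minimal sketch of that relation (the parameter values are illustrative, not taken from CARES/LIFE):

```python
import math

def weibull_pf(stress, m, sigma0):
    """Two-parameter Weibull CDF: probability of failure at a given
    applied stress, with Weibull modulus m and scale parameter sigma0."""
    return 1.0 - math.exp(-((stress / sigma0) ** m))

def reliability(stress, m, sigma0):
    """Survival probability at the given stress level."""
    return 1.0 - weibull_pf(stress, m, sigma0)

# At stress equal to the scale parameter, survival probability is exp(-1).
r = reliability(300.0, 10.0, 300.0)
```

A larger modulus m means less scatter in strength, which is why it is the key fitted quantity when characterizing the variation in component strength.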

  16. Investigation on Thermal Contact Conductance Based on Data Analysis Method of Reliability

    WANG Zongren; YANG Jun; YANG Mingyuan; ZHANG Weifang


    A reliability-based method is proposed for the investigation of thermal contact conductance (TCC) in this study. A new definition is introduced, namely reliability thermal contact conductance (RTCC), which is defined as the TCC value that meets the reliability design requirement of the structural materials under consideration. An experimental apparatus with a compensation heater to test the TCC is introduced. A practical engineering example is utilized to demonstrate the applicability of the proposed approach. Using a statistical regression model along with experimental data obtained from the interfaces of the structural materials GH4169 and K417 used in aero-engines, the estimated values and the confidence levels of TCC and RTCC are studied and compared. The results show that the measured values of TCC increase with interface pressure, and that the proposed RTCC model matches the test results better at high interface pressure.

  17. A hybrid algorithm for reliability analysis combining Kriging and subset simulation importance sampling

    Tong, Cao; Sun, Zhili; Zhao, Qianli; Wang, Qibin [Northeastern University, Shenyang (China); Wang, Shuang [Jiangxi University of Science and Technology, Ganzhou (China)


    To reduce the large computational cost of calculating failure probability with a time-consuming numerical model, we propose an improved active-learning reliability method, called AK-SSIS, based on the AK-IS algorithm. First, an improved iterative stopping criterion for active learning is presented so that the number of iterations decreases dramatically. Second, the proposed method introduces subset simulation importance sampling (SSIS) into the active-learning reliability calculation, and a learning function suitable for SSIS is proposed. Finally, the efficiency of AK-SSIS is demonstrated with two academic examples from the literature. The results show that AK-SSIS requires fewer calls to the performance function than AK-IS, and the failure probability obtained from AK-SSIS is very robust and accurate. The method is then applied to a spur gear pair for tooth-contact fatigue reliability analysis.
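    For readers unfamiliar with the subset simulation ingredient, plain subset simulation (without the Kriging surrogate or the importance sampling refinement of the paper) can be sketched as below; the limit-state function, sample size, and level probability are invented for illustration:

```python
import random
import math

def g(u):
    # Simple linear limit state in standard normal space; failure when g < 0.
    # True failure probability is Phi(-3) ~ 1.35e-3.
    return 3.0 - u[0]

def subset_simulation(dim=2, n=1000, p0=0.1, seed=1, max_levels=10):
    """Estimate a small failure probability as a product of larger
    conditional probabilities, one per intermediate threshold level."""
    rng = random.Random(seed)
    samples = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n)]
    pf = 1.0
    for _ in range(max_levels):
        vals = sorted((g(u), tuple(u)) for u in samples)
        n_seed = int(p0 * n)
        b = vals[n_seed - 1][0]              # intermediate threshold
        if b <= 0:                           # final level reached
            return pf * sum(1 for v, _ in vals if v < 0) / n
        pf *= p0
        seeds = [list(u) for _, u in vals[:n_seed]]
        # Modified Metropolis: grow one chain per seed inside {g <= b}.
        chain_len = n // n_seed
        samples = []
        for u in seeds:
            cur = list(u)
            for _ in range(chain_len):
                cand = list(cur)
                for j in range(dim):
                    prop = cur[j] + rng.uniform(-1, 1)
                    # Component-wise accept with the standard-normal ratio.
                    if rng.random() < min(1.0, math.exp(0.5 * (cur[j]**2 - prop**2))):
                        cand[j] = prop
                if g(cand) <= b:             # reject moves that leave the subset
                    cur = cand
                samples.append(list(cur))
    return pf

pf_est = subset_simulation()
```

Replacing the expensive g with a Kriging surrogate that is refined adaptively is, roughly, what the AK-family of methods adds on top of this loop.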

  18. Reliability Analysis of Component Software in Wireless Sensor Networks Based on Transformation of Testing Data

    Chunyan Hou


    Full Text Available We develop an approach to component software reliability analysis which combines the benefits of both time-domain and structure-based approaches. This approach overcomes the deficiency of existing NHPP techniques, which fall short of addressing repair and internal system structure simultaneously. Our solution adopts a method of transforming the testing data to cover both methods, and is expected to improve reliability prediction. This paradigm accommodates component-based software testing processes that do not meet the assumptions of NHPP models, and accounts for software structure by modeling the testing process. According to the testing model, it builds the mapping relation from the testing profile to the operational profile, which enables the transformation of the testing data to build the reliability dataset required by NHPP models. Finally, an example is evaluated to validate and show the effectiveness of this approach.
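    As background to the NHPP techniques discussed above, one widely used NHPP form is the Goel-Okumoto model; the sketch below is a generic illustration of that model, not the paper's transformation method, and the parameter values are invented:

```python
import math

def go_mean_failures(t, a, b):
    """Goel-Okumoto NHPP mean value function: expected cumulative failures
    by time t, with a = total expected faults and b = fault detection rate."""
    return a * (1.0 - math.exp(-b * t))

def go_reliability(x, t, a, b):
    """Probability of observing no failure in the interval (t, t + x]
    under the NHPP, i.e. exp(-(m(t + x) - m(t)))."""
    return math.exp(-(go_mean_failures(t + x, a, b) - go_mean_failures(t, a, b)))
```

Because the mean value function saturates at a, reliability over a fixed horizon x grows as testing time t accumulates, which is the behavior NHPP-based prediction exploits.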

  19. A Study on Management Techniques of Power Telecommunication System by Reliability Analysis

    Lee, B.K.; Lee, B.S.; Woy, Y.H.; Oh, M.T.; Shin, M.T.; Kwan, O.G. [Korea Electric Power Corp. (KEPCO), Taejon (Korea, Republic of). Research Center; Kim, K.H.; Kim, Y.H.; Lee, W.T.; Park, Y.H.; Lee, J.J.; Park, H.S.; Choi, M.C.; Kim, J. [Korea Electrotechnology Research Inst., Changwon (Korea, Republic of)


    Power telecommunication networks are expanding rapidly with the growth of power facilities driven by the increasing electric power supply. The requirements of power facility and office automation, together with the importance of communication services, make these networks complex and difficult to operate, and effective operation and management are urgently needed to keep pace with changes in the power telecommunication network. Therefore, the objective of this study is to establish a total reliability analysis system based on dependability, maintainability, cost effectiveness and replenishment, in order to maintain reasonable reliability, support economical maintenance and enable reasonable planning of facility investment. This will provide an effective management and administration system and schemes for total reliability improvement. (author). 44 refs., figs.

  20. Reliable and Efficient Procedure for Steady-State Analysis of Nonautonomous and Autonomous Systems

    J. Dobes


    Full Text Available The majority of contemporary design tools still do not contain steady-state algorithms, especially for autonomous systems. This is mainly caused by insufficient accuracy of the algorithms for numerical integration, but also by the unreliability of the steady-state algorithms themselves. Therefore, in the paper, a very stable and efficient procedure for the numerical integration of nonlinear differential-algebraic systems is defined first. Afterwards, two improved methods are defined for finding the steady state, which use this integration algorithm in their iteration loops. The first is based on the idea of extrapolation, and the second utilizes nonstandard time-domain sensitivity analysis. The two steady-state algorithms are compared through analyses of a rectifier and a class-C amplifier, and the extrapolation algorithm is selected as the more reliable alternative. Finally, the extrapolation-based method, naturally cooperating with the algorithm for solving differential-algebraic systems, is thoroughly tested on various electronic circuits: Van der Pol and Colpitts oscillators, a fragment of a large bipolar logic circuit, feedback and distributed microwave oscillators, and a power amplifier. The results confirm that the extrapolation method is faster than classical plain numerical integration, especially for larger circuits with complicated transients.

  1. Design Optimization of an ESD (Emergency ShutDown) System for Offshore Processes Based on Reliability Analysis

    Bae Jeong-hoon


    Full Text Available Hydrocarbon leaks have major accident potential and can cause significant damage to people, property and the environment. To prevent these risks at the design stage, installation of an ESD system is a representative measure. Because the ESD system must operate properly at any time, it requires high reliability and incurs considerable cost. To obtain an ESD system with high reliability at reasonable cost, a specific design method needs to be found. In this study, we propose a multi-objective design optimization method and perform the optimization of the ESD system for the first-stage separation system to satisfy both high reliability and cost effectiveness. NSGA-II (Non-dominated Sorting Genetic Algorithm II) was applied, and two objective functions, system 'Reliability' and 'Cost', were defined. Six design variables related to the system configuration were set. To verify the result of the optimization, the existing design and the optimum design were compared in terms of reliability and cost. With the optimization method proposed in this study, it was possible to derive a reliable and economical design of the ESD system.


    Z.-G. Zhou


    Full Text Available Normally, the status of land cover is inherently dynamic and changes continuously on the temporal scale. However, disturbances or abnormal changes of land cover, caused for example by forest fire, flood, deforestation, and plant diseases, occur worldwide at unknown times and locations. Timely detection and characterization of these disturbances is important for land cover monitoring. Recently, many time-series analysis methods have been developed for near-real-time or online disturbance detection using satellite image time series. However, most of the present methods only label the detection results with "Change/No change", while few methods focus on estimating the reliability (or confidence level) of the detected disturbances in image time series. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series, through analysis of the full temporal information contained in the time series data. The method consists of three main steps: (1) segmenting and modelling historical time series data based on Breaks for Additive Seasonal and Trend (BFAST); (2) forecasting and detecting disturbances in new time series data; (3) estimating the reliability of each detected disturbance using statistical analysis based on confidence intervals (CI) and confidence levels (CL). The method was validated by estimating the reliability of disturbance regions caused by a recent severe flood that occurred around the border of Russia and China. The results demonstrate that the method can estimate the reliability of disturbances detected in satellite imagery with an estimation error of less than 5% and an overall accuracy of up to 90%.
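    The core of steps (2) and (3) above, flagging a disturbance when a new observation falls outside a forecast confidence interval, can be sketched with a deliberately naive mean forecaster standing in for the BFAST-based model; all names and numbers below are illustrative:

```python
def forecast_ci(history, z=1.96):
    """Naive forecast of the next value as the historical mean, with a
    Gaussian confidence interval (z = 1.96 gives a 95% CI)."""
    n = len(history)
    mean = sum(history) / n
    var = sum((v - mean) ** 2 for v in history) / (n - 1)  # sample variance
    half = z * (var ** 0.5)
    return mean - half, mean + half

def disturbance(history, new_obs, z=1.96):
    """Flag a disturbance when the new observation falls outside the CI."""
    lo, hi = forecast_ci(history, z)
    return not (lo <= new_obs <= hi)
```

Testing the same observation against CIs at several confidence levels (for example z = 1.645, 1.96, 2.576) is one simple way to grade how reliable a detected disturbance is, which is the spirit of step (3).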

  3. A Report on Simulation-Driven Reliability and Failure Analysis of Large-Scale Storage Systems

    Wan, Lipeng [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wang, Feiyi [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Oral, H. Sarp [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Vazhkudai, Sudharshan S. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Cao, Qing [Univ. of Tennessee, Knoxville, TN (United States)


    High-performance computing (HPC) storage systems provide data availability and reliability using various hardware and software fault tolerance techniques. Usually, reliability and availability are calculated at the subsystem or component level using limited metrics such as mean time to failure (MTTF) or mean time to data loss (MTTDL). This often means settling on simple and disconnected failure models (such as an exponential failure rate) to achieve tractable, closed-form solutions. However, such models have been shown to be insufficient for assessing end-to-end storage system reliability and availability. We propose a generic simulation framework aimed at analyzing the reliability and availability of storage systems at scale, and investigating what-if scenarios. The framework is designed for an end-to-end storage system, accommodating the various components and subsystems, their interconnections, and failure patterns and propagation, and performs dependency analysis to capture a wide range of failure cases. We evaluate the framework against a large-scale storage system in production and analyze its failure projections toward and beyond the end of its lifecycle. We also examine the potential operational impact by studying how different types of components affect the overall system reliability and availability, and present preliminary results.
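    The simple exponential failure models criticized above can themselves be written in a few lines, which is part of their appeal; the toy Monte Carlo below estimates the MTTF of a series system of exponential components (an invented example, not the ORNL framework):

```python
import random

def simulate_mttf(rate, n_components, n_trials=5000, seed=7):
    """Monte Carlo MTTF of a series system of identical exponential
    components: the system fails when its first component fails.
    Analytically this is 1 / (n_components * rate)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        total += min(rng.expovariate(rate) for _ in range(n_components))
    return total / n_trials

# rate = 0.001 failures/hour, 4 components in series: analytic MTTF = 250 h.
mttf = simulate_mttf(0.001, 4)
```

The framework proposed in the report goes beyond this by modeling interconnections, failure propagation, and dependencies, precisely the structure a single closed-form MTTF cannot capture.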

  4. Reliability analysis for the 220 kV Libyan high voltage communication system

    Saleh, O.S.A.; AlAthram, A.Y. [General Electric Company of Libya (Libyan Arab Jamahiriya). Development Dept.


    Electric utilities are expanding their networks to include fiber-optic communications, which offer high capacity and reliable performance at low cost. Fiber-optic networks offer a feasible technical solution for leasing excess capacity. They can be readily deployed under a wide range of network configurations and can be upgraded rapidly. This study evaluated the reliability index for the communication network of Libya's 220 kV high voltage subsystem operated by the General Electric Company of Libya (GECOL). Schematic diagrams of the communication networks were presented for both the power line carrier and fiber optics networks. A reliability analysis of the two communication networks was performed on the existing communication equipment. The reliability values revealed that the fiber optics system has several advantages, such as a large bandwidth for high-quality data transmission; immunity to electromagnetic interference; low attenuation, which allows for extended cable transmission; the ability to be used in dangerous environments; a higher degree of security; and a high capacity through existing conduits due to its light weight and small diameter. However, it was noted that although fiber optic communications may be more reliable and provide the clearest signal, the power line communication (PLC) system has more redundancy, particularly in the case of outdoor components, where the PLC has more power lines to carry the signals, while fiber optic communications depend only on the earthing wire of the high voltage transmission line. 4 refs., 8 tabs., 6 figs.

  5. Comparative Genome Analysis of Basidiomycete Fungi

    Riley, Robert; Salamov, Asaf; Morin, Emmanuelle; Nagy, Laszlo; Manning, Gerard; Baker, Scott; Brown, Daren; Henrissat, Bernard; Levasseur, Anthony; Hibbett, David; Martin, Francis; Grigoriev, Igor


    Fungi of the phylum Basidiomycota (basidiomycetes) make up some 37% of the described fungi, and are important in forestry, agriculture, medicine, and bioenergy. This diverse phylum includes the mushrooms, wood rots, symbionts, and plant and animal pathogens. To better understand the diversity of phenotypes in basidiomycetes, we performed a comparative analysis of 35 basidiomycete fungi spanning the diversity of the phylum. Phylogenetic patterns of lignocellulose-degrading genes suggest a continuum rather than a sharp dichotomy between the white rot and brown rot modes of wood decay. Patterns of secondary metabolic enzymes give additional insight into the broad array of phenotypes found in the basidiomycetes. We suggest that an organism's profile of lignocellulose-targeting genes can be used to predict its nutritional mode, and we predict Dacryopinax sp. to be a brown rot, and Botryobasidium botryosum and Jaapia argillacea to be white rots.

  6. Comparative Analysis of VNSA Complex Engineering Efforts

    Gary Ackerman


    Full Text Available The case studies undertaken in this special issue demonstrate unequivocally that, despite being forced to operate clandestinely and facing the pressures of security forces seeking to hunt them down and neutralize them, at least a subset of violent non-state actors (VNSAs are capable of some genuinely impressive feats of engineering. At the same time, success in such endeavours is not guaranteed and VNSAs will undoubtedly face a number of obstacles along the way. A comparative analysis of the cases also reveals new insights about the factors influencing the decision to pursue complex engineering efforts, the implementation of such decisions and the determinants of the ultimate outcome. These result in a set of hypotheses and indicators that, if confirmed by future research, can contribute to both operational and strategic intelligence assessments. Overall, the current study enriches our understanding of how and why VNSAs might engage in complex engineering efforts.

  7. A Comparative Analysis of Biomarker Selection Techniques

    Nicoletta Dessì


    Full Text Available Feature selection has become an essential step in biomarker discovery from high-dimensional genomics data. It is recognized that different feature selection techniques may result in different sets of biomarkers, that is, different groups of genes highly correlated with a given pathological condition, but few direct comparisons exist which quantify these differences in a systematic way. In this paper, we propose a general methodology for comparing the outcomes of different selection techniques in the context of biomarker discovery. The comparison is carried out along two dimensions: (i) measuring the similarity/dissimilarity of the selected gene sets; (ii) evaluating the implications of these differences in terms of both the predictive performance and the stability of the selected gene sets. As a case study, we considered three benchmarks deriving from DNA microarray experiments and conducted a comparative analysis among eight selection methods, representative of different classes of feature selection techniques. Our results show that the proposed approach can provide useful insight into the pattern of agreement of biomarker discovery techniques.
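    Dimension (i) above, similarity of selected gene sets, is commonly measured with the Jaccard index, and selection stability with its mean over repeated runs. A minimal sketch of both (the paper's exact metrics may differ; gene names are illustrative):

```python
def jaccard(a, b):
    """Jaccard similarity of two selected-biomarker sets:
    |intersection| / |union|, in [0, 1]."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0          # two empty selections are trivially identical
    return len(a & b) / len(a | b)

def stability(selections):
    """Mean pairwise Jaccard across repeated selection runs, a simple
    stability score (without a chance correction)."""
    k = len(selections)
    pairs = [(i, j) for i in range(k) for j in range(i + 1, k)]
    return sum(jaccard(selections[i], selections[j]) for i, j in pairs) / len(pairs)
```

For example, two runs selecting {TP53, BRCA1} and {BRCA1, EGFR} overlap on one of three distinct genes, giving a Jaccard similarity of 1/3.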

  8. Comparative Analysis of Students’ Media Competences Levels

    Alexander Fedorov


    Full Text Available This article analyzes the results of a survey of university students' media literacy competence, based on a classification of indicators of the media literacy competence of the audience, used as an effective tool for comparative analysis of the levels of development of media competence of students in the control and experimental groups: the level of media competence of students who completed a one-year training course within media literacy education courses was four times higher than the corresponding indicators in the control group. Analysis of the results of this survey confirmed the general trend of media contacts of the student audience: its orientation toward entertainment genres of audiovisual media and toward visually appealing, positive, active, unmarried, childless, educated, highly qualified characters (primarily male characters aged 19 to 35 years). These heroes are characterized by optimism, independence, intelligence, and emotion. They have an excellent command of the life situation and have a positive impact on the development of the plot of a media text.

  9. Evaluating the safety risk of roadside features for rural two-lane roads using reliability analysis.

    Jalayer, Mohammad; Zhou, Huaguo


    The severity of roadway departure crashes depends mainly on roadside features, including the sideslope, fixed-object density, offset from fixed objects, and shoulder width. Common engineering countermeasures to improve roadside safety include cross section improvements, hazard removal or modification, and delineation. It is not always feasible to maintain an object-free and smooth roadside clear zone as recommended in design guidelines. Currently, clear zone width and sideslope are used to determine roadside hazard ratings (RHRs) that quantify the roadside safety of rural two-lane roadways on a seven-point pictorial scale. Since these two variables are continuous and can be treated as random, probabilistic analysis can be applied as an alternative method to address the existing uncertainties. Specifically, using reliability analysis, it is possible to quantify roadside safety levels by treating the clear zone width and sideslope as two continuous, rather than discrete, variables. The objective of this manuscript is to present a new approach for defining the reliability index for measuring roadside safety on rural two-lane roads. To evaluate the proposed approach, we gathered five years (2009-2013) of Illinois run-off-road (ROR) crash data and identified the roadside features (i.e., clear zone widths and sideslopes) of 4,500 300-ft roadway segments. Based on the obtained results, we confirm that reliability indices can serve as indicators of safety levels, such that the greater the reliability index value, the lower the ROR crash rate.
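    A reliability index of the kind proposed can be illustrated with the simplest first-order case: a normally distributed margin between the available clear zone width and a required width. The variables and numbers below are invented for illustration, not the paper's calibration:

```python
import math

def reliability_index(mu_g, sigma_g):
    """Cornell reliability index for a normally distributed margin g:
    beta = E[g] / std[g]; failure corresponds to g < 0."""
    return mu_g / sigma_g

def ror_beta(mu_cz, sd_cz, mu_req, sd_req):
    """Margin g = available clear zone width minus required width, both
    treated as independent normal random variables."""
    mu_g = mu_cz - mu_req
    sd_g = math.hypot(sd_cz, sd_req)   # sqrt(sd_cz**2 + sd_req**2)
    return reliability_index(mu_g, sd_g)

def failure_probability(beta):
    """Phi(-beta) computed via the complementary error function."""
    return 0.5 * math.erfc(beta / math.sqrt(2))

# e.g. 30 ft available (sd 4) vs 18 ft required (sd 3): beta = 12/5 = 2.4
beta = ror_beta(30.0, 4.0, 18.0, 3.0)
```

A larger beta means a smaller probability that the required clear zone exceeds the available one, matching the paper's observation that higher reliability index values go with lower ROR crash rates.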

  10. Reliability and Sensitivity Analysis of Transonic Flutter Using Improved Line Sampling Technique

    Song Shufang; Lu Zhenzhou; Zhang Weiwei; Ye Zhengyin


    The improved line sampling (LS) technique, an effective numerical simulation method, is employed to analyze the probabilistic characteristics and reliability sensitivity of flutter with random structural parameters in transonic flow. The improved LS technique is a novel methodology for reliability and sensitivity analysis of high-dimensional, low-probability problems with implicit limit state functions, and it does not require any approximating surrogate of the implicit limit state equation. The improved LS is used to estimate the flutter reliability and sensitivity of a two-dimensional wing, in which some structural properties, such as frequency, gravity center position and mass ratio, are treated as random variables. A computational fluid dynamics (CFD) based unsteady aerodynamic reduced order model (ROM) is used to construct the aerodynamic state equations. Coupling the structural state equations with the aerodynamic state equations, the flutter safety margin is formulated using the critical flutter velocity. The results show that the improved LS technique can effectively decrease the computational cost of the random uncertainty analysis of flutter. The reliability sensitivity, defined as the partial derivative of the failure probability with respect to a distribution parameter of a random variable, can help to identify the important parameters and guide structural optimization design.
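The core of line sampling can be illustrated on a toy problem. The sketch below uses an assumed linear limit state in standard normal space (so the exact failure probability is known analytically), not a flutter model: each random sample is projected onto the hyperplane orthogonal to an assumed important direction, the limit state is solved along that direction, and each line contributes Φ(−c) to the estimate:

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

rng = np.random.default_rng(0)
dim, n_lines = 5, 100

# Hypothetical linear limit state in standard normal space:
# g < 0 means failure (flutter). Exact Pf = Phi(-3).
beta_true = 3.0
alpha = np.ones(dim) / np.sqrt(dim)      # assumed important direction
g = lambda u: beta_true - alpha @ u

pf_parts = []
for _ in range(n_lines):
    u = rng.standard_normal(dim)
    u_perp = u - (alpha @ u) * alpha     # component orthogonal to alpha
    # distance to the limit state along the line u_perp + c * alpha
    c = brentq(lambda c: g(u_perp + c * alpha), -10.0, 10.0)
    pf_parts.append(norm.cdf(-c))

pf = np.mean(pf_parts)
print(f"Estimated Pf = {pf:.3e} (exact {norm.cdf(-3):.3e})")
```

For this linear case every line returns the exact answer, which is why line sampling needs far fewer limit-state evaluations than crude Monte Carlo for small failure probabilities; in the paper each evaluation of g would be a coupled CFD/structural computation.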

  11. Reliability analysis of stochastic structural system considering static strength, stiffness and fatigue

    AN WeiGuang; ZHAO WeiTao; AN Hai


    Multiple failure modes can appear during the service life of a structural system, such as dead load (static strength) failure, fatigue failure and stiffness failure. In this paper, an expression for the residual resistance is derived from the effect of random fatigue crack propagation on the critical limit stress and section modulus. The failure modes of each element of the structural system under dead and fatigue loads are analyzed, and the influence of the correlation between failure modes on element reliability is considered. The failure mechanism and the correlation of failure modes under dead and fatigue loads are discussed, and a method of reliability analysis that accounts for static strength, fatigue and stiffness is given. A numerical example indicates that the failure probability differs with service life, and that dead and fatigue loads influence the reliability of the structural system differently. In practical engineering, this method of reliability analysis is better than methods that consider only a single factor (static strength, fatigue or stiffness alone).

  12. Reliability, risk and availability analysis and evaluation of a port oil pipeline transportation system in constant operation conditions

    Kolowrocki, Krzysztof [Gdynia Maritime University, Gdynia (Poland)]


    In this paper the multi-state approach to the analysis and evaluation of system reliability, risk and availability is applied in practice. Theoretical definitions and results are illustrated by the example of their application to the reliability, risk and availability evaluation of an oil pipeline transportation system. The pipeline transportation system is considered under operation conditions that are constant in time: the system reliability structure and its component reliability functions do not change. The system reliability structure is fixed with high accuracy, whereas the input reliability characteristics of the pipeline components are not sufficiently exact because of the lack of statistical data necessary for their estimation. The results may be considered an illustration of the applicability of the proposed methods to pipeline system reliability analysis. (author)

  13. Reliability Analysis of Brittle Material Structures - Including MEMS(?) - With the CARES/Life Program

    Nemeth, Noel N.


    Brittle materials are being used, or considered, for a wide variety of high tech applications that operate in harsh environments, including static and rotating turbine parts, thermal protection systems, dental prosthetics, fuel cells, oxygen transport membranes, radomes, and MEMS. Designing components to sustain repeated loads without fracturing while using the minimum amount of material requires the use of a probabilistic design methodology. The CARES/Life code provides a general-purpose analysis tool that predicts the probability of failure of a ceramic component as a function of its time in service. In this presentation an overview of the CARES/Life program will be provided. Emphasis will be placed on describing the latest enhancements to the code for reliability analysis with time-varying loads and temperatures (fully transient reliability analysis). Also described are early efforts in investigating the validity of using Weibull statistics, the basis of the CARES/Life program, to characterize the strength of MEMS structures, as well as the version of CARES/Life for MEMS (CARES/MEMS) being prepared, which incorporates single crystal and edge flaw reliability analysis capability. It is hoped this talk will open a dialog for potential collaboration in the area of MEMS testing and life prediction.

  14. Test-retest reliability of pain-related functional brain connectivity compared with pain self-report.

    Letzen, Janelle E; Boissoneault, Jeff; Sevel, Landrew S; Robinson, Michael E


    Test-retest reliability, or reproducibility of results over time, is poorly established for functional brain connectivity (fcMRI) during painful stimulation. As reliability informs the validity of research findings, it is imperative to examine it, especially given the recent emphasis on using functional neuroimaging as a tool for biomarker development. Although proposed pain neural signatures have been derived using complex, multivariate algorithms, even the reliability of less complex fcMRI findings has yet to be reported. This study examined the test-retest reliability of fcMRI for pain-related brain regions and of self-reported pain (through visual analogue scales [VASs]). Thirty-two healthy individuals completed 3 consecutive fMRI runs of a thermal pain task. Functional connectivity analyses were completed on pain-related brain regions. Intraclass correlations were conducted on fcMRI values and VAS scores across the fMRI runs. Intraclass correlation coefficients for fcMRI values varied widely (range = -.174 to .766), with fcMRI between the right nucleus accumbens and medial prefrontal cortex showing the highest reliability (range = .649-.766). Intraclass correlation coefficients for VAS scores ranged from .906 to .947. Overall, self-reported pain was more reliable than fcMRI data. These results highlight that fMRI findings might be less reliable than inherently assumed and have implications for future studies proposing pain markers.
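Intraclass correlations of the kind reported above are computed from a subjects-by-runs score matrix. The sketch below implements ICC(3,1) (two-way mixed effects, consistency, single measurement) from its ANOVA mean squares and applies it to synthetic data; the ICC variant, sample sizes, and noise levels are illustrative assumptions, not the paper's pipeline:

```python
import numpy as np

def icc_3_1(Y):
    """ICC(3,1): two-way mixed effects, consistency, single measurement.
    Y: (subjects x sessions) matrix of scores."""
    n, k = Y.shape
    grand = Y.mean()
    row_means = Y.mean(axis=1)   # per-subject means
    col_means = Y.mean(axis=0)   # per-session means
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ((Y - grand) ** 2).sum() - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Synthetic example: 32 "subjects", 3 "runs", strong stable subject signal
rng = np.random.default_rng(7)
subject = rng.normal(0, 2.0, size=(32, 1))            # stable individual signal
scores = subject + rng.normal(0, 0.5, size=(32, 3))   # run-to-run noise
print(f"ICC(3,1) = {icc_3_1(scores):.3f}")
```

With a large stable subject effect relative to run-to-run noise the ICC approaches 1, as for the VAS scores in the study; noisy measures such as many fcMRI values yield much lower, or even negative, coefficients.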

  15. Competing risk models in reliability systems, a weibull distribution model with bayesian analysis approach

    Iskandar, Ismed; Satria Gondokaryono, Yudi


    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that systems are described and explained as simply functioning or failed. In many real situations, failures may arise from many causes depending upon the age and the environment of the system and its components. Another problem in reliability theory is that of estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analysis is more beneficial than classical analysis in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of a prior distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample size. The simulation data are analyzed using both Bayesian and maximum likelihood analyses. The simulation results show that a change in the true value of one parameter relative to another changes the standard deviation in the opposite direction. Given perfect information on the prior distribution, the Bayesian estimation methods are better than those of maximum likelihood. The sensitivity analyses show some amount of sensitivity to shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range
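As a minimal illustration of the maximum likelihood side of the comparison, the sketch below fits a Weibull model to simulated uncensored failure times and evaluates the fitted reliability function. The parameter values and sample size are assumptions for illustration; the Bayesian analysis in the paper would additionally place a prior on these parameters:

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(1)
shape_true, scale_true = 1.8, 1000.0   # assumed "true" Weibull parameters

# Simulated failure times for one (uncensored) failure cause
t = weibull_min.rvs(shape_true, scale=scale_true, size=500, random_state=rng)

# Maximum likelihood fit (location fixed at 0, as usual for lifetimes)
shape_hat, loc, scale_hat = weibull_min.fit(t, floc=0)
print(f"shape ≈ {shape_hat:.2f}, scale ≈ {scale_hat:.1f}")

# Reliability at t = 500 h under the fitted model
R_500 = weibull_min.sf(500, shape_hat, scale=scale_hat)
print(f"R(500) ≈ {R_500:.3f}")
```

A shape parameter above 1 corresponds to wearout (increasing hazard), consistent with the competing-risk failure causes modeled in the paper.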

  16. Efficient Approximate Method of Global Reliability Analysis for Offshore Platforms in the Ice Zone


    Ice load is the dominant load in the design of offshore platforms in the ice zone, and the extreme ice load is the key factor that affects the safety of platforms. The present paper studies the statistical properties of the global resistance and the extreme responses of the jacket platforms in Bohai Bay, considering the randomness of ice load, dead load, steel elastic modulus, yield strength and structural member dimensions. Then, based on the above results, an efficient approximate method of global reliability analysis for offshore platforms is proposed, which converts the implicit nonlinear performance function of the conventional reliability analysis into an explicit linear one. Finally, numerical examples of the JZ20-2 MSW, JZ20-2NW and JZ20-2 MUQ offshore jacket platforms in the Bohai Bay demonstrate the satisfactory efficiency, accuracy and applicability of the proposed method.

  17. Microcircuit Device Reliability. Digital Evaluation and Failure Analysis Data. Parts 1 and 2, Summer 1980



  18. A Reliability Analysis of a Rainfall Harvesting System in Southern Italy

    Lorena Liuzzo; Vincenza Notaro; Gabriele Freni


    Rainwater harvesting (RWH) may be an effective alternative water supply solution in regions affected by water scarcity. It has recently become a particularly important option in arid and semi-arid areas (like Mediterranean basins), mostly because of its many benefits and affordable costs. This study provides an analysis of the reliability of using a rainwater harvesting system to supply water for toilet flushing and garden irrigation purposes, with reference to a single-family home in a resid...

  19. Application of the Simulation Based Reliability Analysis on the LBB methodology

    Pečínka L.; Švrček M.


    Guidelines on how to demonstrate the existence of Leak Before Break (LBB) have been developed in many western countries. These guidelines, partly based on NUREG/CR-6765, define the steps that should be fulfilled to get a conservative assessment of LBB acceptability. As a complement and also to help identify the key parameters that influence the resulting leakage and failure probabilities, the application of Simulation Based Reliability Analysis is under development. The used methodology will ...

  20. An evaluation of the reliability and usefulness of external-initiator PRA (probabilistic risk analysis) methodologies

    Budnitz, R.J.; Lambert, H.E. (Future Resources Associates, Inc., Berkeley, CA (USA))


    The discipline of probabilistic risk analysis (PRA) has become so mature in recent years that it is now being used routinely to assist decision-making throughout the nuclear industry. This includes decision-making that affects design, construction, operation, maintenance, and regulation. Unfortunately, not all sub-areas within the larger discipline of PRA are equally "mature," and therefore the many different types of engineering insights from PRA are not all equally reliable. 93 refs., 4 figs., 1 tab.

  1. Reliability Information Analysis Center 1st Quarter 2007, Technical Area Task (TAT) Report


    07 planning conference 14 Dec 06 II Marine Expeditionary Force (MEF) meeting with Major Smith 14 Dec 06 Gulf of Mexico Tyndall Air Force Base Missile...Restructured action item spreadsheet " Reviewed the following storyboards (functional flow, graphics and text): 1. 050101 Main Rotor System components 2... storyboards (functional flow, graphics, and text): o 050101 Main Rotor System components. Reliability Information Analysis Center 6000 Flanagan Road

  2. Comparative analysis of safety related site characteristics

    Andersson, Johan (ed.)


    This document presents a comparative analysis of site characteristics related to long-term safety for the two candidate sites for a final repository for spent nuclear fuel in Forsmark (municipality of Oesthammar) and in Laxemar (municipality of Oskarshamn) from the point of view of site selection. The analyses are based on the updated site descriptions of Forsmark /SKB 2008a/ and Laxemar /SKB 2009a/, together with associated updated repository layouts and designs /SKB 2008b and SKB 2009b/. The basis for the comparison is thus two equally and thoroughly assessed sites. However, the analyses presented here are focussed on differences between the sites rather than evaluating them in absolute terms. The document serves as a basis for the site selection, from the perspective of long-term safety, in SKB's application for a final repository. A full evaluation of safety is made for a repository at the selected site in the safety assessment SR-Site /SKB 2011/, referred to as SR-Site main report in the following

  3. Reliability analysis of the objective structured clinical examination using generalizability theory

    Trejo-Mejía, Juan Andrés; Sánchez-Mendiola, Melchor; Méndez-Ramírez, Ignacio; Martínez-González, Adrián


    Background The objective structured clinical examination (OSCE) is a widely used method for assessing clinical competence in health sciences education. Studies using this method have shown evidence of validity and reliability. There are no published studies of OSCE reliability measurement with generalizability theory (G-theory) in Latin America. The aims of this study were to assess the reliability of an OSCE in medical students using G-theory and explore its usefulness for quality improvement. Methods An observational cross-sectional study was conducted at National Autonomous University of Mexico (UNAM) Faculty of Medicine in Mexico City. A total of 278 fifth-year medical students were assessed with an 18-station OSCE in a summative end-of-career final examination. There were four exam versions. G-theory with a crossover random effects design was used to identify the main sources of variance. Examiners, standardized patients, and cases were considered as a single facet of analysis. Results The exam was applied to 278 medical students. The OSCE had a generalizability coefficient of 0.93. The major components of variance were stations, students, and residual error. The sites and the versions of the tests had minimum variance. Conclusions Our study achieved a G coefficient similar to that found in other reports, which is acceptable for summative tests. G-theory allows the estimation of the magnitude of multiple sources of error and helps decision makers to determine the number of stations, test versions, and examiners needed to obtain reliable measurements. PMID:27543188

  4. Reliability analysis of the objective structured clinical examination using generalizability theory

    Juan Andrés Trejo-Mejía


    Full Text Available Background: The objective structured clinical examination (OSCE) is a widely used method for assessing clinical competence in health sciences education. Studies using this method have shown evidence of validity and reliability. There are no published studies of OSCE reliability measurement with generalizability theory (G-theory) in Latin America. The aims of this study were to assess the reliability of an OSCE in medical students using G-theory and explore its usefulness for quality improvement. Methods: An observational cross-sectional study was conducted at National Autonomous University of Mexico (UNAM) Faculty of Medicine in Mexico City. A total of 278 fifth-year medical students were assessed with an 18-station OSCE in a summative end-of-career final examination. There were four exam versions. G-theory with a crossover random effects design was used to identify the main sources of variance. Examiners, standardized patients, and cases were considered as a single facet of analysis. Results: The exam was applied to 278 medical students. The OSCE had a generalizability coefficient of 0.93. The major components of variance were stations, students, and residual error. The sites and the versions of the tests had minimum variance. Conclusions: Our study achieved a G coefficient similar to that found in other reports, which is acceptable for summative tests. G-theory allows the estimation of the magnitude of multiple sources of error and helps decision makers to determine the number of stations, test versions, and examiners needed to obtain reliable measurements.

  5. Reliability Analysis of Aircraft Condition Monitoring Network Using an Enhanced BDD Algorithm

    ZHAO Changxiao; CHEN Yao; WANG Hailiang; XIONG Huagang


    The aircraft condition monitoring network is responsible for collecting the status of each component in the aircraft. The reliability of this network has a significant effect on the safety of the aircraft. The aircraft condition monitoring network works in a real-time manner: all the data should be transmitted within the deadline to ensure that the control center makes proper decisions in time. Connectedness between the source node and the destination alone cannot guarantee that the data are transmitted in time. In this paper, we take the time deadline into account and build a task-based reliability model. The binary decision diagram (BDD), which has the merit of efficiency in computation and storage space, is introduced when calculating the reliability of the network and identifying the essential variables. A case is analyzed using the algorithm proposed in this paper. The experimental results show that our method is efficient and appropriate for the reliability analysis of real-time networks.
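As background for the connectedness part of such an analysis, two-terminal network reliability can be computed exactly on small examples by summing over all component states; a BDD arrives at the same sum far more compactly by sharing subproblems. The topology and per-link reliabilities below are illustrative assumptions (the classic five-edge bridge network), not the aircraft network from the paper:

```python
from itertools import product

# Classic 5-edge bridge network between source S and terminal T:
#   S --- A --- T
#   S --- B --- T,  with a bridge edge A --- B
edges = [("S", "A"), ("S", "B"), ("A", "T"), ("B", "T"), ("A", "B")]
p = [0.9] * 5   # assumed per-link reliabilities

def connected(up):
    """Is T reachable from S using only working edges?"""
    reach, frontier = {"S"}, ["S"]
    while frontier:
        node = frontier.pop()
        for (u, v), ok in zip(edges, up):
            if ok:
                for a, b in ((u, v), (v, u)):
                    if a == node and b not in reach:
                        reach.add(b)
                        frontier.append(b)
    return "T" in reach

# Exhaustive state enumeration; a BDD evaluates the same sum without
# enumerating all 2^m states explicitly
R = 0.0
for state in product([0, 1], repeat=len(edges)):
    prob = 1.0
    for ok, pe in zip(state, p):
        prob *= pe if ok else 1 - pe
    if connected(state):
        R += prob
print(f"Two-terminal reliability = {R:.5f}")
```

For the bridge network with all links at reliability p, the closed form is 2p² + 2p³ − 5p⁴ + 2p⁵, which the enumeration reproduces; a task-based model like the paper's would further restrict the "success" states to those meeting the transmission deadline.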

  6. A new approach for interexaminer reliability data analysis on dental caries calibration

    Andréa Videira Assaf


    Full Text Available Objectives: (a) to evaluate the interexaminer reliability in caries detection considering different diagnostic thresholds and (b) to indicate, by using Kappa statistics, the best way of measuring interexaminer agreement during the calibration process in dental caries surveys. Methods: Eleven dentists participated in the initial training, which was divided into theoretical discussions and practical activities, and in calibration exercises performed at baseline, 3 and 6 months after the initial training. For the examinations of 6-7-year-old schoolchildren, the World Health Organization (WHO) recommendations were followed and different diagnostic thresholds were used: WHO (decayed/missing/filled teeth, DMFT index) and WHO + IL (initial lesion) diagnostic thresholds. The interexaminer reliability was calculated by Kappa statistics, according to the WHO and WHO+IL thresholds, considering: (a) the entire dentition; (b) upper/lower jaws; (c) sextants; (d) each tooth individually. Results: Interexaminer reliability was high for both diagnostic thresholds; nevertheless, it decreased in all calibration sessions when considering teeth individually. Conclusion: Interexaminer reliability could be maintained over the period of 6 months under both caries diagnosis thresholds. However, great disagreement was observed for posterior teeth, especially using the WHO+IL criteria. Analysis considering dental elements individually was the best way of detecting interexaminer disagreement during the calibration sessions.
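The agreement statistic used in such calibration exercises can be sketched directly from its definition: observed agreement corrected for chance agreement derived from the raters' marginal frequencies. The sketch below computes Cohen's kappa for two examiners; the category codes and ratings are hypothetical illustrative data, not from the study:

```python
import numpy as np

def cohens_kappa(r1, r2, categories):
    """Chance-corrected agreement between two raters over the same items."""
    idx = {c: i for i, c in enumerate(categories)}
    k = len(categories)
    confusion = np.zeros((k, k))
    for a, b in zip(r1, r2):
        confusion[idx[a], idx[b]] += 1
    n = confusion.sum()
    po = np.trace(confusion) / n                        # observed agreement
    pe = (confusion.sum(0) @ confusion.sum(1)) / n**2   # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical tooth-level codes from two examiners:
# "0" = sound, "D" = decayed, "IL" = initial lesion (illustrative data)
ex1 = ["0", "0", "D", "IL", "0", "D", "0", "IL", "0", "0", "D", "0"]
ex2 = ["0", "0", "D", "0",  "0", "D", "0", "IL", "0", "IL", "D", "0"]
print(f"kappa = {cohens_kappa(ex1, ex2, ['0', 'D', 'IL']):.3f}")
```

Computing kappa tooth-by-tooth, as the paper recommends, exposes disagreements (e.g. on posterior teeth) that a whole-dentition kappa averages away.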

  7. Reliability Analysis of a 3-Machine Power Station Using State Space Approach

    Wasiu Akande Ahmed


    Full Text Available With the advent of high-integrity fault-tolerant systems, the ability to account for repairs of partially failed (but still operational) systems becomes increasingly important. This paper presents a systematic method of determining the reliability of a 3-machine electric power station, taking into consideration the failure rates and repair rates of the individual components (machines) that make up the system. A state-space transition process for the 3-machine system with 2³ = 8 states was developed and, consequently, steady-state equations were generated based on Markov mathematical modeling of the power station. Important reliability components were deduced from this analysis. The simulation was implemented in code written in the Excel®-VBA programming environment. System reliability analysis using the state-space approach proves to be a viable and efficient technique of reliability prediction, as it is able to predict the state of the system under consideration. For neatness and easy entry of data, a graphical user interface (GUI) was designed.
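The steady-state side of such a state-space model can be sketched by building the Markov generator for three repairable machines (2**3 = 8 up/down states) and solving the balance equations πQ = 0 with Σπ = 1. The failure and repair rates below are illustrative assumptions, not values from the paper:

```python
import numpy as np
from itertools import product

lam, mu = 0.01, 0.5   # assumed per-machine failure and repair rates (1/h)

# 2^3 = 8 states: each bit = machine up (1) or down (0)
states = list(product([0, 1], repeat=3))
n = len(states)
Q = np.zeros((n, n))
for i, s in enumerate(states):
    for m in range(3):
        t = list(s)
        t[m] ^= 1                        # flip machine m
        j = states.index(tuple(t))
        rate = lam if s[m] == 1 else mu  # up -> down fails, down -> up repairs
        Q[i, j] += rate
        Q[i, i] -= rate

# Steady state: pi @ Q = 0 with sum(pi) = 1 (solved as a least-squares system)
A = np.vstack([Q.T, np.ones(n)])
b = np.append(np.zeros(n), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

p_all_up = pi[states.index((1, 1, 1))]
print(f"P(all three machines up) = {p_all_up:.4f}")
```

Because the machines here are independent, the result matches the product of per-machine availabilities (μ/(λ+μ))³, which is a convenient check on the generator construction.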

  8. Comparing the electrical characteristics and reliabilities of BJTs and MOSFETs between Pt and Ti contact silicide processes

    Liu, Kaiping; Shang, Ling


    The sub-threshold characteristics and the reliability of BJTs, using platinum contact silicide (PtSi) or titanium contact silicide (TiSi2), are compared and analyzed. During processing, it is observed that the TiSi2 process produces a higher interface state density (Dit) than the PtSi process. The increase in Dit not only leads to a higher base current in the BJTs, but also leads to a lower transconductance for the MOS transistors. The data also show that the impact on NPN and nMOS devices is more severe than the impact on PNP and pMOS devices, respectively. This can be explained by the non-symmetric interface state distribution, the re-activation of boron, and/or by substrate trap centers. The amount of interface states produced depends not only on the thickness of the titanium film deposited, but also on the temperature and duration of the titanium silicide process. The electrical data indicate that after all the Back-End-Of-The-Line processing steps, which include a forming gas anneal, Dit is still higher on wafers with TiSi2. The NPN transistor's base current increases at different rates between the two processes, but eventually levels off to the same final value. However, the PNP transistor's base current increases at approximately the same rate, but eventually levels off at different final values. These results indicate that the TiSi2 process may have modified the silicon and oxygen dangling bond structure during its high temperature step, in addition to removing the hydrogen from the passivated interface states.

  9. Comparative Analysis of Virtual Education Applications

    Mehmet KURT


    Full Text Available The research was conducted in order to make a comparative analysis of virtual education applications. The research was conducted in the survey model. The study group consists of a total of 300 institutes providing virtual education in the fall, spring and summer semesters of 2004: 246 in the USA, 10 in Australia, 3 in South Africa, 10 in India, 21 in the UK, 6 in Japan, and 4 in Turkey. The information was collected by an online questionnaire sent to the target group by e-mail. The questionnaire was developed in two information categories: personal information, and institutes and their virtual education applications. The English web design of the online questionnaire and the database was prepared with Microsoft ASP code, the script language of the Microsoft FrontPage editor, and was tested on a personal web site. The questionnaire was pre-applied in institutions providing virtual education in Australia. The English text of the questionnaire and the web site design were sent to educational technology and virtual education specialists in the countries of the study group. With the feedback received, spelling mistakes were corrected and concept and language validity were established. The application of the questionnaire took 40 weeks during March-November 2004. Only 135 institutes replied. Two of the questionnaires were discarded because they included mistaken coding of the names of the institutions and countries. The 133 valid questionnaires cover approximately 44% of the study group. Questionnaires saved in the online database were transferred to Microsoft Excel and then to SPSS by external database connection. In line with the research objectives, the collected data were analyzed on computer using the SPSS statistics package. In the data analysis, frequency (f), percentage (%) and arithmetic mean were used. In comparisons of country, institute, year, and other variables, the chi-square test and independent t

  10. Probabilistic Structural Analysis and Reliability Using NESSUS With Implemented Material Strength Degradation Model

    Bast, Callie C.; Jurena, Mark T.; Godines, Cody R.; Chamis, Christos C. (Technical Monitor)


    This project included both research and education objectives. The goal of this project was to advance innovative research and education objectives in theoretical and computational probabilistic structural analysis, reliability, and life prediction for improved reliability and safety of structural components of aerospace and aircraft propulsion systems. Research and education partners included Glenn Research Center (GRC) and Southwest Research Institute (SwRI) along with the University of Texas at San Antonio (UTSA). SwRI enhanced the NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) code and provided consulting support for NESSUS-related activities at UTSA. NASA funding supported three undergraduate students, two graduate students, a summer course instructor and the Principal Investigator. Matching funds from UTSA provided for the purchase of additional equipment for the enhancement of the Advanced Interactive Computational SGI Lab established during the first year of this Partnership Award to conduct the probabilistic finite element summer courses. The research portion of this report presents the culmination of work performed through the use of the probabilistic finite element program NESSUS and an embedded Material Strength Degradation (MSD) model. Probabilistic structural analysis provided for quantification of uncertainties associated with the design, thus enabling increased system performance and reliability. The structure examined was a Space Shuttle Main Engine (SSME) fuel turbopump blade. The blade material analyzed was Inconel 718, since the MSD model was previously calibrated for this material. Reliability analysis encompassing the effects of high temperature and high cycle fatigue yielded a reliability value of 0.99978 using a fully correlated random field for the blade thickness. The reliability did not change significantly for a change in distribution type except for a change in

  11. Reliability of automated biometrics in the analysis of enamel rod end patterns

    K Manjunath


    Full Text Available Tooth prints are enamel rod end patterns on the tooth surface. These patterns are unique to each individual tooth, both within the same individual and between different individuals. The aim of this study was to analyze the reliability and sensitivity of automated biometrics software (Verifinger® standard SDK version 5.0) in analyzing tooth prints. In the present study, enamel rod end patterns were obtained three times from a specific area on the labial surface of ten extracted teeth using the acetate peel technique. The acetate peels were subjected to analysis with the Verifinger® standard SDK version 5.0 software to obtain the enamel rod end patterns (tooth prints) and the respective minutiae scores for each tooth print. The minutiae scores obtained for each tooth print were subjected to statistical analysis using Cronbach's test for reliability. In the present study, it was found that the Verifinger® software was able to match duplicate records of the same area of the same tooth with the original records stored in the database of the software. Comparison of the minutiae scores using Cronbach's test also showed that there was no significant difference in the minutiae scores obtained (>0.6). Hence, the acetate peel technique with Verifinger® standard SDK version 5.0 is a reliable technique for the analysis of enamel rod end patterns and a forensic tool for personal identification. However, further studies are needed to verify the reliability of this technique in a clinical setting, as obtaining an acetate peel record from the same area of a tooth in vivo is difficult.

  12. A comparative analysis of influenza vaccination programs.

    Shweta Bansal


    Full Text Available BACKGROUND: The threat of avian influenza and the 2004-2005 influenza vaccine supply shortage in the United States have sparked a debate about optimal vaccination strategies to reduce the burden of morbidity and mortality caused by the influenza virus. METHODS AND FINDINGS: We present a comparative analysis of two classes of suggested vaccination strategies: mortality-based strategies that target high-risk populations and morbidity-based strategies that target high-prevalence populations. Applying the methods of contact network epidemiology to a model of disease transmission in a large urban population, we assume that vaccine supplies are limited and then evaluate the efficacy of these strategies across a wide range of viral transmission rates and for two different age-specific mortality distributions. We find that the optimal strategy depends critically on the viral transmission level (reproductive rate of the virus): morbidity-based strategies outperform mortality-based strategies for moderately transmissible strains, while the reverse is true for highly transmissible strains. These results hold for a range of mortality rates reported for prior influenza epidemics and pandemics. Furthermore, we show that vaccination delays and multiple introductions of disease into the community have a more detrimental impact on morbidity-based strategies than mortality-based strategies. CONCLUSIONS: If public health officials have reasonable estimates of the viral transmission rate and the frequency of new introductions into the community prior to an outbreak, then these methods can guide the design of optimal vaccination priorities. When such information is unreliable or not available, as is often the case, this study recommends mortality-based vaccination priorities.


    Mocanu Mihaela


    Full Text Available The present paper starts out from the challenge regarding auditor tenure launched in 2010 by the Green Paper of the European Commission, Audit Policy: Lessons from the Crisis. According to this document, the European Commission speaks both in favor of the mandatory rotation of the audit firm and in favor of the mandatory rotation of audit partners. Rotation is considered a solution to mitigate threats to independence generated by familiarity, intimidation and self-interest in the context of a long-term audit-client relationship. At international level, there are several studies on auditor rotation, both empirical (e.g. Lu and Sivaramakrishnan, 2009, Li, 2010, Kaplan and Mauldin, 2008, Jackson et al., 2008) and normative in nature (e.g. Marten et al., 2007, Muller, 2006 and Gelter, 2004). The objective of the present paper is to perform a critical and comparative analysis of the regulations on internal and external rotation in force at international level, in the European Union and in the United States of America. Moreover, arguments both in favor of and against mandatory rotation are brought into discussion. With regard to the research design, the paper has a normative approach. The main findings are, first of all, that by comparison all regulatory authorities require internal rotation at least in the case of public interest entities, while external rotation is not in the focus of the regulators. In general, the strictest and most detailed requirements are those issued by the Securities and Exchange Commission of the United States of America. Second of all, an argument in favor of mandatory rotation is that the auditor becomes more resistant in cases of divergence of opinion between him and company management, less stimulated to follow his own interest, and more scrupulous in conducting the audit. However, mandatory rotation may also have negative consequences, thus the debate on the opportunity of this regulatory measure remains open-ended.

  14. Assessing Reliability of Cellulose Hydrolysis Models to Support Biofuel Process Design – Identifiability and Uncertainty Analysis

    Sin, Gürkan; Meyer, Anne S.; Gernaey, Krist


    The reliability of cellulose hydrolysis models is studied using the NREL model. An identifiability analysis revealed that only 6 out of 26 parameters are identifiable from the available data (typical hydrolysis experiments). Attempting to identify a higher number of parameters (as done in the ori...... to analyze the uncertainty of model predictions. This allows judging the fitness of the model to the purpose under uncertainty. Hence we recommend uncertainty analysis as a proactive solution when faced with model uncertainty, which is the case for biofuel process development research....
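Identifiability screening of this kind can be sketched numerically: build a parameter-scaled sensitivity matrix of the model output and inspect its conditioning. The rate expression, sampling times and parameter values below are invented for illustration; this is not the NREL hydrolysis model.

```python
import numpy as np

# Toy saturating hydrolysis-rate model (NOT the NREL model): three parameters.
def model(theta, t):
    k, Km, alpha = theta
    return k * t / (Km + t) * np.exp(-alpha * t)

t = np.linspace(0.5, 48.0, 30)          # hypothetical sampling times [h]
theta0 = np.array([1.0, 5.0, 0.02])     # nominal parameter values (invented)

# Relative (parameter-scaled) finite-difference sensitivities
S = np.empty((t.size, theta0.size))
for j in range(theta0.size):
    h = 1e-6 * theta0[j]
    tp = theta0.copy()
    tp[j] += h
    S[:, j] = (model(tp, t) - model(theta0, t)) / h * theta0[j]

# Collinearity index of the normalized sensitivity matrix (Brun et al. style):
# large values flag parameter subsets that cannot be estimated jointly
# from the available data.
Sn = S / np.linalg.norm(S, axis=0)
gamma = 1.0 / np.sqrt(np.linalg.eigvalsh(Sn.T @ Sn).min())
print(f"collinearity index = {gamma:.2f}")
```

With 26 parameters and sparse hydrolysis data, indices computed this way grow quickly, which is one common way such "only 6 of 26 identifiable" conclusions are reached.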

  15. A reliable procedure for the analysis of multiexponential transients that arise in deep level transient spectroscopy

    Hanine, M.; Masmoudi, M.; Marcon, J. [Laboratoire Electronique Microtechnologie et Instrumentation (LEMI), University of Rouen, 76821 Mont Saint Aignan (France)]


    In this paper, a reliable procedure allowing a fine as well as robust analysis of deep defects in semiconductors is detailed. In this procedure, where capacitance transients are considered multiexponential and corrupted by Gaussian noise, our new method of analysis, Levenberg-Marquardt deep level transient spectroscopy (LM-DLTS), is associated with two other high-resolution techniques: the Matrix Pencil method, which provides an approximation of the exponential components contained in the capacitance transients, and Prony's method, recently revised by Osborne, which sets the initial parameters.
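A minimal sketch of the Levenberg-Marquardt step on a synthetic two-exponential transient; the amplitudes, emission rates and noise level are invented, and in the actual procedure the starting guess would come from the Matrix Pencil / Prony pre-estimates rather than being hand-picked.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic DLTS-like capacitance transient: two exponentials plus
# Gaussian noise (all values invented for illustration).
rng = np.random.default_rng(1)
t = np.linspace(0.0, 0.2, 400)                       # time [s]
y = 1.0 * np.exp(-50.0 * t) + 0.5 * np.exp(-200.0 * t)
y += rng.normal(0.0, 0.005, t.size)

def resid(p):
    a1, e1, a2, e2 = p
    return a1 * np.exp(-e1 * t) + a2 * np.exp(-e2 * t) - y

# Levenberg-Marquardt refinement from a rough starting point
fit = least_squares(resid, x0=[0.8, 30.0, 0.8, 150.0], method="lm")
e_slow, e_fast = sorted(fit.x[1::2])
print(f"recovered emission rates: {e_slow:.1f}, {e_fast:.1f} 1/s")
```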


    Bruyako V. N.


    A high growth rate of plantlets is an integral index of the intensity of physiological processes in rice and other cultures. Twenty typical plantlets from each of two variants were studied (in distilled water, in a thermostat at a temperature of 29 °C) by length of embryonic root and coleoptile. Comparative analysis of the traits characterizing growth rate showed a reliable advantage of the Russian rice varieties over the Italian and Chinese ones. Local varieties regionalized until the year 2000 exceed the new ones in this trait. The highest growth rates were shown by the medium-grain samples, while the white-grain and red-grain varieties excelled the other groups in plantlet height. Analysis of plantlet height in the distinguished groups showed the need to continue improving the above traits in late-ripening, long-grain, Waxy-gene and colored-grain varieties. We recommend sowing varieties of this type on well-leveled fields because of their low growth rate.

  17. Reliability analysis of stochastic structural system considering static strength, stiffness and fatigue


    Multiple failure modes can appear during the service of a structural system, such as dead-load failure, fatigue failure and stiffness failure. An expression for residual resistance is given in this paper based on the impact of random crack propagation induced by the fatigue load on the critical limit stress and section modulus. The failure modes of every element of the structural system are analyzed under dead and fatigue loads, and the influence of the correlation between failure modes on element reliability is considered. The failure mechanism and the correlation of failure modes under dead and fatigue loads are discussed, and a method of reliability analysis considering static strength, fatigue and stiffness is given. A numerical example is analyzed, which indicates that the failure probability differs with service life and that the influence of dead and fatigue loads on the reliability of the structural system differs as well. For practical engineering, this method of reliability analysis is better than methods considering only a single factor (static strength, fatigue, or stiffness).

  18. Using wavefront coding technique as an optical encryption system: reliability analysis and vulnerabilities assessment

    Konnik, Mikhail V.


    The wavefront coding paradigm can be used not only for compensation of aberrations and depth-of-field improvement but also for optical encryption. When a diffractive optical element (DOE) with a known point spread function (PSF) is placed in the optical path, an optical convolution of the image with that PSF occurs, and an optically encoded image is registered instead of the true image. Decoding of the registered image can be performed using standard digital deconvolution methods. In this class of optical-digital systems, the PSF of the DOE is used as the encryption key; the reliability and cryptographic resistance of the method therefore depend on the size and complexity of the PSF used for optical encoding. This paper gives a preliminary analysis of the reliability and possible vulnerabilities of such an encryption method. Experimental results of a brute-force attack on the optically encrypted images are presented, the reliability of optical coding based on the wavefront coding paradigm is estimated, and an analysis of possible vulnerabilities is provided.
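The encode/decode cycle can be sketched with a discrete circular-convolution model; the random PSF, test image and regularization constant below are illustrative and say nothing about the cryptographic strength of a real DOE.

```python
import numpy as np

# Optical encoding as circular convolution with a secret PSF (the key),
# decoding by Wiener-regularized inverse filtering.
rng = np.random.default_rng(0)
img = np.zeros((64, 64))
img[20:40, 25:35] = 1.0                      # the "true image"

psf = rng.random((64, 64))
psf /= psf.sum()                             # secret PSF acting as the key
H = np.fft.fft2(psf)

# Registration of the optically encoded image (convolution theorem)
encoded = np.real(np.fft.ifft2(np.fft.fft2(img) * H))

# Digital decoding with the known key
wiener = np.conj(H) / (np.abs(H) ** 2 + 1e-9)
decoded = np.real(np.fft.ifft2(np.fft.fft2(encoded) * wiener))

err = np.abs(decoded - img).max()            # small when the key is known
```

Without the key an attacker must search the PSF space, which is what makes the size and complexity of the PSF the relevant security parameter.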


    ZHAO Yongxiang; PENG Jiachun; YANG Bing


    A state-of-the-art review is given of new advances in fatigue reliability design and analysis methods for Chinese railway vehicle structures. First, the structures are subject to a complicated random fatigue stress history, and this history should be determined by combining dynamic simulation and on-line inspection. Second, the random fatigue constitutive relations are an intrinsic fatigue phenomenon, and a probabilistic model is developed to describe them with the two measures of survival probability and confidence; a similar model is presented for the random stress-life relations and extrapolated appropriately into the long-life regime. Third, the fatigue limit should be understood as the fatigue strength at a given fatigue life, and a so-called local Basquin model method is proposed for measuring the random strengths. In addition, methods for drawing and applying the Goodman-Smith diagram to characterize the random fatigue strengths integrally are established. Fourth, a reliability stress-based method is constructed with consideration of the random constitutive relations. These new advances form a new framework for railway fatigue reliability design and analysis.

  20. Based on Weibull Information Fusion Analysis Semiconductors Quality the Key Technology of Manufacturing Execution Systems Reliability

    Huang, Zhi-Hui; Tang, Ying-Chun; Dai, Kai


    Semiconductor material and product qualification rates are directly related to manufacturing costs and the survival of the enterprise. A dynamic reliability growth analysis method is applied to study manufacturing execution system reliability growth and thereby improve product quality. Referring to the classical Duane model assumptions and the TGP tracking-growth-forecast programming model, a Weibull distribution model is established from the failure data. Combining the median-rank and average-rank methods, Weibull information-fusion reliability growth curves are fitted by linear regression and least squares estimation. This model overcomes a weakness of the Duane model, namely the low accuracy of its MTBF point estimate; analysis of the failure data shows that the method and the test-and-evaluation modeling process are essentially consistent. The median rank is used in statistics to determine the distribution function of a random variable and is well suited to problems with limited sample sizes, such as complex systems. The method therefore has considerable engineering application value.
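Median-rank regression for a two-parameter Weibull can be sketched as follows, using Benard's median-rank approximation and least squares on the linearized CDF; the failure times are invented for illustration.

```python
import numpy as np

# Two-parameter Weibull fit by median-rank regression.
failures = np.sort(np.array([42.0, 68.0, 95.0, 130.0, 178.0, 240.0, 310.0, 420.0]))
n = failures.size
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # Benard's median-rank positions

x = np.log(failures)
y = np.log(-np.log(1.0 - F))                  # ln(-ln(1-F)) = beta*ln t - beta*ln eta
beta, c = np.polyfit(x, y, 1)                 # slope = shape, intercept -> scale
eta = np.exp(-c / beta)
print(f"shape beta = {beta:.2f} (beta > 1 indicates wearout), scale eta = {eta:.0f} h")
```

The same plotting positions underpin Weibull probability plots, so the regression doubles as a visual goodness-of-fit check.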

  1. Practical applications of age-dependent reliability models and analysis of operational data

    Lannoy, A.; Nitoi, M.; Backstrom, O.; Burgazzi, L.; Couallier, V.; Nikulin, M.; Derode, A.; Rodionov, A.; Atwood, C.; Fradet, F.; Antonov, A.; Berezhnoy, A.; Choi, S.Y.; Starr, F.; Dawson, J.; Palmen, H.; Clerjaud, L


    The purpose of the workshop was to present the experience of practical application of time-dependent reliability models. The program of the workshop comprised the following sessions: aging management and aging PSA (Probabilistic Safety Assessment); modeling; operating experience; and accelerated aging tests. In order to introduce the time-aging effect of a particular component into the PSA model, it has been proposed to use constant unavailability values over a short period of time (one year, for example) calculated on the basis of age-dependent reliability models. As for modeling, it appears that the problem with overly detailed statistical models is the lack of data for the required parameters. As for operating experience, several methods of operating-experience analysis were presented (algorithms for reliability data elaboration and statistical identification of aging trends). As for accelerated aging tests, it was demonstrated that a combination of operating-experience analysis with the results of accelerated aging tests of naturally aged equipment could provide a good basis for continued operation of instrumentation and control systems.

  2. The Modified Femoral Neck-Shaft Angle: Age- and Sex-Dependent Reference Values and Reliability Analysis.

    Boese, Christoph Kolja; Frink, Michael; Jostmeier, Janine; Haneder, Stefan; Dargel, Jens; Eysel, Peer; Lechler, Philipp


    Background. The femoral neck-shaft angle (NSA) is of high importance for the diagnostics and treatment of various conditions of the hip. However, rotational effects limit its precision and applicability in plain radiographs. This study introduces a novel method to measure the femoral NSA: the modified NSA (mNSA), which may be less susceptible to rotational effects than the conventional NSA. Patients and Methods. The method of measurement is described and its applicability was tested in 400 pelvis computed tomography scans (800 hips). Age- and gender-dependent reference values are given and intra- and interrater reliability are analyzed. Results. The mean age of the 400 patients (800 hips) was 54.32 years (range 18-100, SD 22.05 years). The mean mNSA was 147.0° and the 95% confidence interval was 146.7°-147.4°. Differences in the mNSA between sexes, age groups, and sides were nonsignificant. The mean absolute difference between NSA and mNSA was 16.3° (range 3-31°; SD 4.4°); the correlation was high (0.738; p < 0.001). Overall, the intra- and interrater reliability of the mNSA was excellent. Interpretation. We introduced a novel concept for the analysis of the neck-shaft angle. The high reliability of the measurement was proven and its robustness to hip rotation was demonstrated.

  3. Reliability Analysis of a Composite Wind Turbine Blade Section Using the Model Correction Factor Method: Numerical Study and Validation

    Dimitrov, Nikolay Krasimirov; Friis-Hansen, Peter; Berggreen, Christian


    Reliability analysis of fiber-reinforced composite structures is a relatively unexplored field, and it is therefore expected that engineers and researchers trying to apply such an approach will meet certain challenges until more knowledge is accumulated. While doing the analyses included...... in the present paper, the authors have experienced some of the possible pitfalls on the way to complete a precise and robust reliability analysis for layered composites. Results showed that in order to obtain accurate reliability estimates it is necessary to account for the various failure modes described...... by the composite failure criteria. Each failure mode has been considered in a separate component reliability analysis, followed by a system analysis which gives the total probability of failure of the structure. The Model Correction Factor method used in connection with FORM (First-Order Reliability Method) proved...

  4. Comparing Results from Constant Comparative and Computer Software Methods: A Reflection about Qualitative Data Analysis

    Putten, Jim Vander; Nolen, Amanda L.


    This study compared qualitative research results obtained by manual constant comparative analysis with results obtained by computer software analysis of the same data. An investigation of issues of trustworthiness and accuracy ensued. Results indicated that the inductive constant comparative data analysis generated 51 codes and two coding levels…

  5. Analysis of the reliabilities of maglev train power system with FTA method

    Long, Zhiqiang; Lu, Zhiquo; Chen, Huixing; Liu, Shaoke


    For the high safety and reliability required of a magnetic levitation train, the most fundamental rule is that under all anticipated running disturbances, faults, and other urgent conditions, and at any time, the running train can stop at a given point where all passengers can leave the train and reach a safe place. The object studied in this paper is the CMS-3 type electromagnetic suspension sample train developed at the National University of Defense Technology. Based on the method of fault tree analysis, the safety and reliability of a key subsystem of the train, the power system, are analyzed systematically, and instructive viewpoints and improvement measures are put forward.
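For independent basic events, evaluating a fault tree reduces to simple gate algebra. The tiny subtree and probabilities below are hypothetical and are not CMS-3 data.

```python
# Minimal fault-tree evaluation for a hypothetical power-system subtree:
# top event = converter fails OR both redundant supply branches fail.
P_converter = 1e-3
P_supply_A = 5e-2
P_supply_B = 5e-2

def p_and(*ps):
    # AND gate: all independent events occur
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):
    # OR gate: at least one independent event occurs
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

P_top = p_or(P_converter, p_and(P_supply_A, P_supply_B))
print(f"top-event probability = {P_top:.3e}")
```

Real FTA tools add minimal cut sets, common-cause groups and importance measures on top of this basic gate arithmetic.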

  6. Reliability Analysis of a Composite Blade Structure Using the Model Correction Factor Method

    Dimitrov, Nikolay Krasimiroy; Friis-Hansen, Peter; Berggreen, Christian


    This paper presents a reliability analysis of a composite blade profile. The so-called Model Correction Factor technique is applied as an effective alternate approach to the response surface technique. The structural reliability is determined by use of a simplified idealised analytical model which...... in a probabilistic sense is model corrected so that it, close to the design point, represents the same structural behaviour as a realistic FE model. This approach leads to considerable improvement of computational efficiency over classical response surface methods, because the numerically “cheap” idealistic model...... is used as the response surface, while the time-consuming detailed model is called only a few times until the simplified model is calibrated to the detailed model....

  7. Screen for child anxiety related emotional disorders: are subscale scores reliable? A bifactor model analysis.

    DeSousa, Diogo Araújo; Zibetti, Murilo Ricardo; Trentini, Clarissa Marceli; Koller, Silvia Helena; Manfro, Gisele Gus; Salum, Giovanni Abrahão


    The aim of this study was to investigate the utility of creating and scoring subscales for the self-report version of the Screen for Child Anxiety Related Emotional Disorders (SCARED) by examining whether subscale scores provide reliable information after accounting for a general anxiety factor in a bifactor model analysis. A total of 2420 children aged 9-18 answered the SCARED in their schools. Results suggested adequate fit of the bifactor model. The SCARED score variance was hardly influenced by the specific domains after controlling for the common variance in the general factor. The explained common variance (ECV) for the general factor was large (63.96%). After accounting for the general total score (ωh=.83), subscale scores provided very little reliable information (ωh ranged from .005 to .04). Practitioners that use the SCARED should be careful when scoring and interpreting the instrument subscales since there is more common variance to them than specific variance.

  8. Quasi-Monte Carlo Simulation-Based SFEM for Slope Reliability Analysis

    Yu Yuzhen; Xie Liquan; Zhang Bingyin


    Considering the stochastic spatial variation of geotechnical parameters over the slope, a stochastic finite element method (SFEM) is established based on the combination of the shear strength reduction (SSR) concept and quasi-Monte Carlo simulation. The shear strength reduction FEM is superior in many ways to the slice method based on limit equilibrium theory, so it is more powerful for assessing the reliability of global slope stability when combined with probability theory. To illustrate the performance of the proposed method, it is applied to a simple slope example. The simulation results show that the proposed method effectively performs the reliability analysis of global slope stability without presupposing a potential slip surface.
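The quasi-Monte Carlo part can be sketched by replacing the SSR finite-element run with a toy factor-of-safety response surface; the limit state, the distributions and all parameter values below are invented for illustration.

```python
import numpy as np
from scipy.stats import norm, qmc

# Hypothetical linear response surface standing in for the SSR finite-element
# evaluation of the factor of safety (coefficients are invented).
def factor_of_safety(c, phi_deg):
    return 0.03 * c + 0.04 * phi_deg

# Low-discrepancy (Halton) points mapped to the random soil parameters
sampler = qmc.Halton(d=2, seed=7)
u = sampler.random(4096)
c = norm.ppf(u[:, 0], loc=15.0, scale=3.0)     # cohesion [kPa], illustrative
phi = norm.ppf(u[:, 1], loc=20.0, scale=2.0)   # friction angle [deg], illustrative

pf = np.mean(factor_of_safety(c, phi) < 1.0)   # failure when FS < 1
print(f"quasi-MC failure probability estimate = {pf:.4f}")
```

Low-discrepancy sequences typically reach a given accuracy with far fewer FEM calls than plain Monte Carlo, which is the motivation for the quasi-MC variant.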

  9. A practical engineering method for fuzzy reliability analysis of mechanical structures

    Li Bing; Zhu Meilin; Xu Kai


    The application of fuzzy set theory to reliability analysis is studied. The structure stress depends on several variables, such as structure dimensions, material properties and external loads; in most cases it is difficult to express in a mathematical formula, and the related variables are not random variables but fuzzy variables or other uncertain variables that have not only randomness but also fuzziness. In this paper, a novel approach is presented that uses finite element analysis as a 'numerical experiment' tool and finds the statistical properties of the structure stress directly by the fuzzy linear regression method. Based on the fuzzy stress-random strength interference model proposed in this paper, the fuzzy reliability of a mechanical structure can be evaluated. The compressor blade of a given turbocharger is introduced as a realistic example to illustrate the approach.
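For orientation, the classical, purely random stress-strength interference baseline that the fuzzy model generalizes can be written in a few lines; the stress and strength distributions below are illustrative, and the paper's fuzzy stress term is replaced here by a random one.

```python
import numpy as np
from scipy.stats import norm

# Baseline stress-strength interference with independent normal variables.
mu_s, sd_s = 300.0, 30.0      # stress [MPa], illustrative
mu_R, sd_R = 450.0, 40.0      # strength [MPa], illustrative

# Reliability R = P(strength > stress); closed form for independent normals
beta = (mu_R - mu_s) / np.hypot(sd_R, sd_s)
R_exact = norm.cdf(beta)

# Monte Carlo cross-check of the closed-form result
rng = np.random.default_rng(0)
R_mc = np.mean(rng.normal(mu_R, sd_R, 200_000) > rng.normal(mu_s, sd_s, 200_000))
print(f"reliability index beta = {beta:.2f}, R = {R_exact:.5f} (MC {R_mc:.5f})")
```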

  10. Ceramic material life prediction: A program to translate ANSYS results to CARES/LIFE reliability analysis

    Vonhermann, Pieter; Pintz, Adam


    This manual describes the use of the ANSCARES program to prepare a neutral file of FEM stress results taken from ANSYS Release 5.0, in the format needed by CARES/LIFE ceramics reliability program. It is intended for use by experienced users of ANSYS and CARES. Knowledge of compiling and linking FORTRAN programs is also required. Maximum use is made of existing routines (from other CARES interface programs and ANSYS routines) to extract the finite element results and prepare the neutral file for input to the reliability analysis. FORTRAN and machine language routines as described are used to read the ANSYS results file. Sub-element stresses are computed and written to a neutral file using FORTRAN subroutines which are nearly identical to those used in the NASCARES (MSC/NASTRAN to CARES) interface.

  11. Reliability Analysis of Ice-Induced Fatigue and Damage in Offshore Engineering Structures


    In the Bohai Gulf, offshore installations have collapsed due to fatigue and fracture of the main supporting components in ice environments. In this paper, some results on the fatigue reliability of these structures in the Gulf are presented, based on investigating the distributions of ice parameters such as floating direction and speed, sheet thickness, compressive strength, ice forces on the structures, and hot-spot stress in the structure. Low temperature, ice-breaking modes and component fatigue failure modes are also taken into account in the analysis of the fatigue reliability of offshore structures experiencing both random ice loading and low temperatures. The results can be applied to the design and operation of offshore platforms in the Bohai Gulf.

  12. Handbook of human-reliability analysis with emphasis on nuclear power plant applications. Final report

    Swain, A D; Guttmann, H E


    The primary purpose of the Handbook is to present methods, models, and estimated human error probabilities (HEPs) to enable qualified analysts to make quantitative or qualitative assessments of occurrences of human errors in nuclear power plants (NPPs) that affect the availability or operational reliability of engineered safety features and components. The Handbook is intended to provide much of the modeling and information necessary for the performance of human reliability analysis (HRA) as a part of probabilistic risk assessment (PRA) of NPPs. Although not a design guide, a second purpose of the Handbook is to enable the user to recognize error-likely equipment design, plant policies and practices, written procedures, and other human factors problems so that improvements can be considered. The Handbook provides the methodology to identify and quantify the potential for human error in NPP tasks.


    Bing Xue


    Laminated veneer lumber (LVL) panels made from poplar (Populus ussuriensis Kom.) and birch (Betula platyphylla Suk.) veneers were tested for mechanical properties. The effects of the assembly pattern on the modulus of elasticity (MOE) and modulus of rupture (MOR) of the LVL under vertical load were investigated using three analytical methods: composite material mechanics, computer simulation, and static testing. The reliability of the different LVL assembly patterns was assessed using the Monte Carlo method. The results showed that the theoretical and ANSYS analysis results for the LVL MOE and MOR were very close to the static test results, with a largest proportional error of no more than 5%. For the same number of veneers, the strength and reliability of the LVL made with birch veneers on the top and bottom were much higher than those of the LVL made with poplar veneers there. Good assembly patterns can improve the utility value of wood.

  14. Reliability analysis of shallow foundations by means of limit analysis with random slip lines

    Pula, Wojciech; Chwała, Marcin


    In order to evaluate credible reliability measures when the bearing capacity of a shallow foundation is considered, it is reasonable to describe soil strength properties in terms of random field theory. As a next step, the selected random field can be spatially averaged by means of the procedure introduced by Vanmarcke (1977). Earlier experience has shown that, without the spatial averaging procedure, reliability computations carried out in the context of a foundation's bearing capacity give significantly low reliability indices (large probabilities of failure), even for foundations considered relatively safe. On the other hand, the size of the averaged area strongly affects the results of reliability computations; hence the selection of the averaged area constitutes a vital problem and has to depend on the failure mechanism under consideration. In the present study, local averages associated with the kinematically admissible failure mechanism proposed by Prandtl (1920) are considered. Soil strength parameters are assumed to constitute anisotropic random fields with different vertical and horizontal fluctuation scales. These fields are averaged along potential slip lines within the mechanism under consideration. Because random fluctuations of the angle of internal friction change the location of a slip line, it was necessary to solve the problem of spatial averaging of the random field along varying slip lines. To incorporate the anisotropy of the soil property random fields, the vertical correlation length was assumed to be significantly shorter than the horizontal one. Finally, reliability indices were evaluated for foundations of various widths by means of Monte Carlo simulation.
By numerical examples it is demonstrated that for reasonable proportions (from practical viewpoint) between horizontal and vertical fluctuation scales the reliability indices resulting in two
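The effect of Vanmarcke-style spatial averaging along a line can be sketched in closed form for an exponential autocorrelation; the fluctuation scales and averaging length below are illustrative, not values from the study.

```python
import numpy as np

# Variance reduction from spatial averaging: for the Markov autocorrelation
# rho(tau) = exp(-2|tau|/delta) with fluctuation scale delta, the variance
# function over an averaging length L has the closed form below (Vanmarcke).
def variance_function(L, delta):
    a = 2.0 * L / delta
    return 2.0 / a**2 * (a - 1.0 + np.exp(-a))

delta_v, delta_h = 0.5, 10.0     # vertical vs horizontal fluctuation scale [m]
L = 4.0                          # averaging length along a slip line [m]

g_v = variance_function(L, delta_v)
g_h = variance_function(L, delta_h)
print(f"variance reduction: vertical {g_v:.3f}, horizontal {g_h:.3f}")
```

The short vertical scale yields much stronger variance reduction than the long horizontal one, which is why averaging along steep versus flat parts of a slip line matters for the resulting reliability index.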

  15. Comparative naval architecture analysis of diesel submarines

    Torkelson, Kai Oscar


    Many comparative naval architecture analyses of surface ships have been performed, but few published comparative analyses of submarines exist. Of the several design concept papers, reports and studies that have been written on submarines, no exclusively diesel-submarine comparative naval architecture analyses have been published. One possible reason for the scarcity of submarine studies may be the lack of complete and accurate information regarding the naval architecture of foreign diesel subma...

  16. Reliability of sprinkler systems. Exploration and analysis of data from nuclear and non-nuclear installations

    Roenty, V.; Keski-Rahkonen, O.; Hassinen, J.P. [VTT Building and Transport, Espoo (Finland)


    Sprinkler systems are an important part of the fire safety of nuclear installations. As part of an effort to make the fire-PSA of our utilities more quantitative, a worldwide literature survey of available reliability data on sprinkler systems from open sources was carried out. Since the quantitative yield of the survey was rather poor, it was decided to mine the available original Finnish nuclear and non-nuclear data, since nuclear power plants present a rather small device population. Sprinklers are becoming a key element of fire safety in modern, open non-nuclear buildings; therefore, the study included both nuclear power plants and non-nuclear buildings protected by sprinkler installations. Data needed for estimating the reliability of sprinkler systems were collected from available sources in Finnish nuclear and non-nuclear installations. Population sizes of sprinkler system installations and their components, as well as covered floor areas, were counted individually for Finnish nuclear power plants. For non-nuclear installations, corresponding data were estimated by counting the relevant items in the drawings of 102 buildings and deriving the needed probability distributions from that sample. The total populations of sprinkler systems and components were compiled based on the available direct data and these distributions. From nuclear power plants, electronic maintenance reports were obtained; observed failures and other reliability-relevant data were selected, classified according to failure severity, and stored in spreadsheets for further analysis. A short summary of failures was made, which was hampered by the small sample size. For non-nuclear buildings, inspection statistics from the years 1985-1997 were surveyed, and observed failures were classified and stored in spreadsheets. Finally, a reliability model is proposed based on earlier formal work and on the failure frequencies obtained by preliminary data analysis in this work.
For a model utilising available information in the non

  17. Reliability of segmental accelerations measured using a new wireless gait analysis system.

    Kavanagh, Justin J; Morrison, Steven; James, Daniel A; Barrett, Rod


    The purpose of this study was to determine the inter- and intra-examiner reliability, and stride-to-stride reliability, of an accelerometer-based gait analysis system which measured 3D accelerations of the upper and lower body during self-selected slow, preferred and fast walking speeds. Eight subjects attended two testing sessions in which accelerometers were attached to the head, neck, lower trunk, and right shank. In the initial testing session, two different examiners attached the accelerometers and performed the same testing procedures. A single examiner repeated the procedure in a subsequent testing session. All data were collected using a new wireless gait analysis system, which features near real-time data transmission via a Bluetooth network. Reliability for each testing condition (4 locations, 3 directions, 3 speeds) was quantified using a waveform similarity statistic known as the coefficient of multiple determination (CMD). CMDs ranged from 0.60 to 0.98 across all test conditions and were not significantly different for inter-examiner (0.86), intra-examiner (0.87), and stride-to-stride reliability (0.86). The highest repeatabilities for the effects of location, direction and walking speed were for the shank segment (0.94), the vertical direction (0.91) and the fast walking speed (0.91), respectively. Overall, these results indicate that a high degree of waveform repeatability was obtained using the new gait system under test-retest conditions involving single and dual examiners. Furthermore, differences in acceleration waveform repeatability associated with the reapplication of accelerometers were small in relation to normal motor variability.
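A simplified, unadjusted version of such a waveform-similarity statistic can be computed as one minus the ratio of deviation about the mean waveform to deviation about the grand mean (the published CMD of Kadaba et al. additionally adjusts for degrees of freedom); the synthetic "strides" below are invented.

```python
import numpy as np

# Simplified coefficient of multiple determination for waveform
# repeatability across repeated strides, on synthetic data.
rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 101)                 # normalized gait cycle
template = np.sin(2*np.pi*t) + 0.3*np.sin(4*np.pi*t)
strides = template + rng.normal(0.0, 0.05, (10, t.size))   # 10 noisy strides

mean_wave = strides.mean(axis=0)
grand_mean = strides.mean()
ss_within = np.sum((strides - mean_wave)**2)   # scatter about the mean waveform
ss_total = np.sum((strides - grand_mean)**2)   # scatter about the grand mean
cmd = 1.0 - ss_within / ss_total
print(f"CMD = {cmd:.3f}")
```

Values near 1 indicate that the strides share the same waveform shape; uncorrelated noise-only "waveforms" would drive the statistic toward 0.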

  18. Geomorphological Dating Using an Improved Scarp Degradation Model: Is This a Reliable Approach Compared With Common Absolute Dating Methods?

    Oemisch, M.; Hergarten, S.; Neugebauer, H. J.


    Geomorphological dating of a certain landform or geomorphological structure is based on the evolution of the landscape itself. In this context it is difficult to use common absolute dating techniques such as luminescence and radiocarbon dating, because they require datable material which is often not available; additionally, these methods do not always date the time since the formation of the structures. For these reasons, geomorphological dating appears to be a reliable possibility for dating certain geomorphological features. The aim of our work is to relate the present-day shapes of fault scarps and terrace risers to their ages. The time span since scarp formation ceased is reflected by the stage of degradation as well as the rounding of the profile edges due to erosive processes. It is assumed that the average rate of downslope soil movement depends on the local slope angle and can be described by a diffusion equation. On the basis of these assumptions we present a model to simulate the temporal development of scarp degradation by erosion. A diffusivity reflecting the effects of soil erosion, surface runoff and particle detachability, as well as the present-day shapes of scarps, are included in the model. As observations of present-day scarps suggest a higher diffusivity at the toe than at the head of a slope, we propose a linear approach with diffusivity increasing in the downslope direction. First results show a better match between simulated and observed profiles of the Upper Rhine Graben in comparison to models using a constant diffusivity. To date the scarps, the model has to be calibrated; for this purpose we estimate diffusivities by fitting modelled profiles to observed profiles of known age. Field data have been collected in the area around Bonn, Germany, and in the Alps, Switzerland. It is a matter of current research to assess the quality of this dating technique and to compare the results and the applicability with some of the absolute dating
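The constant-diffusivity baseline of such a model can be sketched with an explicit finite-difference scheme for dz/dt = kappa * d2z/dx2; the diffusivity, scarp offset and time span below are illustrative, not calibrated values from the study.

```python
import numpy as np

# Explicit finite-difference diffusion of a scarp profile.
kappa = 1e-3            # diffusivity [m^2/yr], illustrative
dx, dt = 0.5, 25.0      # grid spacing [m], time step [yr]
assert kappa * dt / dx**2 <= 0.5          # explicit-scheme stability limit

x = np.arange(0.0, 40.0, dx)
z = np.where(x < 20.0, 0.0, 2.0)          # initial sharp scarp, 2 m offset

for _ in range(int(10_000 / dt)):         # evolve for 10 kyr
    z[1:-1] += kappa * dt / dx**2 * (z[2:] - 2.0*z[1:-1] + z[:-2])

mid_slope = (z[41] - z[39]) / (2.0 * dx)  # gradient near the scarp midpoint
print(f"midpoint gradient after 10 kyr = {mid_slope:.3f}")
```

Matching the simulated midpoint gradient and edge rounding to a surveyed profile, with kappa calibrated on scarps of known age, is the essence of the dating step; the paper's refinement replaces the constant kappa by one increasing downslope.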

  19. Fuzzy Reliability Analysis for Seabed Oil-Gas Pipeline Networks Under Earthquakes

    刘震; 潘斌


    The seabed oil-gas pipeline network is simplified to a network with stochastic edge weights by means of fuzzy graph theory. With the help of network analysis, fuzzy mathematics, and stochastic theory, the problem of reliability analysis for the seabed oil-gas pipeline network under earthquakes is transformed into the calculation of the transitive closure of the fuzzy matrix of the stochastic fuzzy network. In classical network reliability analysis the nodes are assumed not to fail; in this paper this premise is modified by introducing a treatment that takes possible node failure into account. A good result is obtained by means of Monte Carlo simulation analysis.
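The max-min transitive closure of a fuzzy matrix, the core computation mentioned above, can be sketched directly; the 4-node reachability matrix below is invented for illustration.

```python
import numpy as np

# Max-min transitive closure of a fuzzy reachability matrix for a small
# pipeline network; entries are illustrative membership grades in [0, 1].
R = np.array([
    [1.0, 0.8, 0.0, 0.0],
    [0.8, 1.0, 0.6, 0.0],
    [0.0, 0.6, 1.0, 0.9],
    [0.0, 0.0, 0.9, 1.0],
])

def maxmin(A, B):
    # fuzzy composition: (A o B)[i, j] = max_k min(A[i, k], B[k, j])
    return np.max(np.minimum(A[:, :, None], B[None, :, :]), axis=1)

closure = R.copy()
while True:
    nxt = np.maximum(closure, maxmin(closure, closure))
    if np.array_equal(nxt, closure):
        break
    closure = nxt

# Strongest "weakest-link" connection between nodes 0 and 3
print(closure[0, 3])
```

Each closure entry is the best achievable bottleneck grade over all paths, which is why node or edge failures can be modeled by lowering the corresponding membership grades before taking the closure.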

  20. Human Capital Development: Comparative Analysis of BRICs

    Ardichvili, Alexandre; Zavyalova, Elena; Minina, Vera


    Purpose: The goal of this article is to conduct macro-level analysis of human capital (HC) development strategies, pursued by four countries commonly referred to as BRICs (Brazil, Russia, India, and China). Design/methodology/approach: This analysis is based on comparisons of macro indices of human capital and innovativeness of the economy and a…

  1. A Reliability Analysis of a Rainfall Harvesting System in Southern Italy

    Lorena Liuzzo


    Rainwater harvesting (RWH) may be an effective alternative water supply solution in regions affected by water scarcity. It has recently become a particularly important option in arid and semi-arid areas (like Mediterranean basins), mostly because of its many benefits and affordable costs. This study provides an analysis of the reliability of using a rainwater harvesting system to supply water for toilet flushing and garden irrigation purposes, with reference to a single-family home in a residential area of Sicily (Southern Italy). A flushing water demand pattern was evaluated using water consumption data collected from a sample of residential customers during an extended measurement campaign. A daily water balance simulation of the rainwater storage tank was performed, and the yield-after-spillage algorithm was used to define the tank release rule. The model’s performance was evaluated using rainfall data from more than 100 different sites located throughout the Sicilian territory. This regional analysis provided annual reliability curves for the system as a function of mean annual precipitation, which have practical applications in this area of study. The uncertainty related to the regional model predictions was also assessed. A cost-benefit analysis highlighted that the implementation of a rainwater harvesting system in Sicily can provide environmental and economic advantages over traditional water supply methods. In particular, the regional analysis identified areas where the application of this system would be most effective.
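
    The yield-after-spillage (YAS) tank balance mentioned above is simple to state in code. In the following sketch, the roof area, tank size, runoff coefficient and daily demand are invented for illustration; only the YAS operating rule itself is taken from the abstract:

```python
def yas_simulate(daily_rain_mm, roof_area_m2, tank_m3, demand_m3_per_day,
                 runoff_coeff=0.9):
    """Daily yield-after-spillage (YAS) storage balance.

    Returns volumetric reliability: the fraction of total demand supplied.
    """
    storage = 0.0
    supplied = 0.0
    for rain_mm in daily_rain_mm:
        # YAS rule: today's yield is drawn from the storage carried over
        # from yesterday; only then is today's inflow added, with any
        # excess above tank capacity spilled.
        yield_today = min(demand_m3_per_day, storage)
        storage -= yield_today
        inflow = runoff_coeff * (rain_mm / 1000.0) * roof_area_m2  # m3
        storage = min(storage + inflow, tank_m3)
        supplied += yield_today
    return supplied / (demand_m3_per_day * len(daily_rain_mm))

# Illustrative run: rain every other day, 100 m2 roof, 2 m3 tank, 200 L/day demand
reliability = yas_simulate([20.0, 0.0] * 5, roof_area_m2=100.0,
                           tank_m3=2.0, demand_m3_per_day=0.2)
```

    Running such a balance over long daily rainfall series at each site is what yields the annual reliability curves described in the study.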

  2. RELAP5/MOD3.3 Best Estimate Analyses for Human Reliability Analysis

    Andrej Prošek


    Conservative approaches were traditionally used in probabilistic safety assessment (PSA) to estimate the success criteria time windows of operator actions. The current PSA standard recommends the use of best-estimate codes. The purpose of the study was to estimate the operator action success criteria time windows in scenarios in which human actions supplement safety system actuations, as needed for an updated human reliability analysis (HRA). The calculations used the RELAP5/MOD3.3 best-estimate thermal-hydraulic computer code and a qualified RELAP5 input model representing a Westinghouse two-loop pressurized water reactor. The results of the deterministic safety analysis were examined to determine the latest time at which the operator action can be performed while still satisfying the safety criteria. The results showed that uncertainty analysis of the realistic calculation is in general not needed for human reliability analysis when additional time is available and/or the event is not a significant contributor to the risk.

  3. Examining the Reliability and Validity of ADEPT and CELDT: Comparing Two Assessments of Oral Language Proficiency for English Language Learners

    Chavez, Gina


    Few classroom measures of English language proficiency have been evaluated for reliability and validity. This research examined the concurrent and predictive validity of an oral language test, titled A Developmental English Language Proficiency Test (ADEPT), and the relationship to the California English Language Development Test (CELDT) in the…

  4. Question analysis for Indonesian comparative question

    Saelan, A.; Purwarianti, A.; Widyantoro, D. H.


    Information seeking is one of today's basic human needs, and comparing things with a search engine takes more time than searching for a single item. In this paper, we analyze comparative questions for a comparative question answering system. A comparative question is a question that compares two or more entities. We grouped comparative questions into five types: selection between mentioned entities, selection between unmentioned entities, selection between any entities, comparison, and yes/no questions. We then extracted four types of information from comparative questions: entity, aspect, comparison, and constraint. We built classifiers for the classification task and the information extraction task. The features used for the classification task are bag-of-words, whereas for information extraction we used the word's lexical form, the lexical forms of the two previous and following words, and the previous label. We tried two scenarios: classification first and extraction first. For classification first, we used the classification result as a feature for extraction; conversely, for extraction first, we used the extraction results as features for classification. We found that results are better when extraction is done before classification. For the extraction task, classification using SMO gave the best result (88.78%), while for classification it is better to use naïve Bayes (82.35%).
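
    As a rough illustration of the classification step, the sketch below trains a multinomial naive Bayes classifier on bag-of-words features. The toy questions, labels and class names are invented for the example; the paper's richer feature set (contextual lexical features and previous labels) and its SMO classifier are not reproduced:

```python
import math
from collections import Counter, defaultdict

class NaiveBayesQuestionClassifier:
    """Multinomial naive Bayes over bag-of-words with Laplace smoothing."""

    def fit(self, docs, labels):
        self.word_counts = defaultdict(Counter)   # per-label word frequencies
        self.label_counts = Counter(labels)
        self.vocab = set()
        for doc, label in zip(docs, labels):
            for word in doc.lower().split():
                self.word_counts[label][word] += 1
                self.vocab.add(word)
        return self

    def predict(self, doc):
        n_docs = sum(self.label_counts.values())
        best_label, best_logprob = None, -math.inf
        for label, count in self.label_counts.items():
            logprob = math.log(count / n_docs)            # class prior
            total = sum(self.word_counts[label].values())
            for word in doc.lower().split():
                # Laplace-smoothed word likelihood
                logprob += math.log((self.word_counts[label][word] + 1)
                                    / (total + len(self.vocab)))
            if logprob > best_logprob:
                best_label, best_logprob = label, logprob
        return best_label

# Toy training set (invented); labels mimic two of the paper's five types
docs = ["which is better android or ios",
        "which phone is faster",
        "is android better than ios",
        "is the bus cheaper than the train"]
labels = ["selection", "selection", "yes_no", "yes_no"]
clf = NaiveBayesQuestionClassifier().fit(docs, labels)
```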

  5. Design and reliability, availability, maintainability, and safety analysis of a high availability quadruple vital computer system

    Ping TAN; Wei-ting HE; Jia LIN; Hong-ming ZHAO; Jian CHU


    With the development of high-speed railways in China, more than 2000 high-speed trains will be put into use, making the safety and efficiency of railway transportation increasingly important. We have designed a high availability quadruple vital computer (HAQVC) system based on an analysis of the architectures of the traditional double 2-out-of-2 system and the 2-out-of-3 system. The HAQVC system is a system with high availability and safety, with prominent characteristics such as a brand-new internal architecture, high efficiency, a reliable data interaction mechanism, and an operation state change mechanism. The hardware of the vital CPU is based on ARM7 with a real-time embedded safe operating system (ES-OS). The Markov modeling method is used to evaluate the reliability, availability, maintainability, and safety (RAMS) of the system. In this paper, we demonstrate that the HAQVC system is more reliable than the all voting triple modular redundancy (AVTMR) system and the double 2-out-of-2 system. Thus, the design can be used for specific application systems, such as airplane or high-speed railway systems.
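
    For intuition on why voting architectures differ in reliability, a generic k-out-of-n calculation can be sketched as below. This assumes independent, identical channels with an invented per-channel reliability; it deliberately ignores common-cause failures, fail-safe behavior and the HAQVC system's actual voting and state-change logic:

```python
from math import comb

def k_out_of_n(k, n, r):
    """Probability that at least k of n independent channels (each with
    reliability r) are working."""
    return sum(comb(n, j) * r**j * (1 - r)**(n - j) for j in range(k, n + 1))

r = 0.99  # assumed per-channel reliability, for illustration only
double_2oo2 = k_out_of_n(2, 2, r)   # both channels must work
tmr_2oo3 = k_out_of_n(2, 3, r)      # triple modular redundancy
quad_2oo4 = k_out_of_n(2, 4, r)     # a quadruple arrangement
```

    Under these simplifying assumptions the quadruple arrangement tolerates more channel failures and so scores highest, which is the qualitative point of comparing the architectures.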

  6. Investigation on design and reliability analysis of a new deployable and lockable mechanism

    Lin, Qing; Nie, Hong; Ren, Jie; Chen, Jinbao


    The traditional structure of the deployable and lockable mechanism in soft-landing gear systems is complicated and unreliable. To overcome these defects, a new deployable and lockable mechanism for planetary probes has been developed, in which the compression assembly shares a single new mechanism with the deployment assembly and the locking assembly. The new mechanism offers several advantages: steadier deployment, a simpler mechanism, and higher reliability. This paper introduces the deployment and locking principles of the new mechanism and constructs the fault tree used for qualitative and quantitative analyses. In addition, the probability importance and criticality importance of the new mechanism are derived and calculated. Reliability modeling and analysis of the mechanism are carried out in terms of the static torque margin, the torque, and the work done by the torque. The analysis shows that the probability that the new mechanism deploys successfully is 0.999334. The critical problems are the insufficient storage torque of the high-strength spring, lubrication failure between the inner and outer cylinders of the strut, and sticking of the soft-landing gear system. The paper concludes with improvement approaches and suggestions addressing these problems.

  7. Reliability Analysis of Jacket Platforms in Malaysia-Environmental Load Factors

    Nelson J. Cossa


    In recent years, there has been a significant trend toward adoption of the ISO 19902 standard for the design of fixed steel offshore structures. The implementation of this standard aims to provide a harmonized international design framework. Unlike the traditional and currently used WSD method, ISO 19902 follows the LRFD method, which uses both partial load and partial resistance factors. These partial factors are usually calibrated through reliability analysis, in which the performance of a structure is defined by the limit state function for the critical mode of failure. This paper focuses mainly on the ultimate (strength) limit state, which is directly related to the highest environmental loading. The partial environmental load factors in ISO 19902 were calibrated for the Gulf of Mexico and the UK North Sea, whose conditions are relatively harsh compared with those in Malaysia. The study presents the steps taken to determine the environmental load factor for tubular members of jacket platforms in Malaysia. The factor was determined such that the reliability of tubular members of jackets designed to the LRFD method matches the target reliability obtained with the WSD method.

  8. Reliability analysis of instrument design of noninvasive bone marrow disease detector

    Su, Yu; Li, Ting; Sun, Yunlong


    Bone marrow is an important hematopoietic organ, and bone marrow lesions (BMLs) may cause a variety of complications with a high death rate and short survival time. Early detection and follow-up care are therefore particularly important, but current diagnosis relies on bone marrow biopsy/puncture, which has significant limitations: it is invasive, complex to perform, high-risk, and cannot monitor continuously. A non-invasive, safe, easily operated, and continuous monitoring technology is highly needed. We therefore designed a device for detecting bone marrow lesions based on near-infrared spectroscopy, and fully tested its reliability, including sensitivity, specificity, signal-to-noise ratio (SNR), and stability. Here we report this sequence of reliability-test experiments, the experimental results, and the subsequent data analysis. The instrument proved very sensitive, resolving concentration differences of less than 0.002, with good linearity, stability, and a high SNR. These reliability-test data support the promise of our novel instrument for clinical diagnosis and surgery guidance in the detection of BMLs.

  9. Electric Power quality Analysis in research reactor: Impacts on nuclear safety assessment and electrical distribution reliability

    Touati, Said; Chennai, Salim; Souli, Aissa [Nuclear Research Centre of Birine, Ain Oussera, Djelfa Province (Algeria)]


    The increased requirements on supervision, control, and performance in modern power systems make power quality monitoring a common practice for utilities. Large databases are created, and automatic processing of the data is required for fast and effective use of the available information. The aim of the work presented in this paper is the development of tools for the analysis of power quality monitoring data, in particular measurements of voltage and current at various levels of the electrical power distribution system. The study is extended to evaluate the reliability of the electrical system in a nuclear plant. Power quality is a measure of how well a system supports reliable operation of its loads. A power disturbance or event can involve voltage, current, or frequency, and can originate in consumer power systems, consumer loads, or the utility. The consequence of power quality problems is loss of power supply, leading to severe damage to equipment, so we try to track and improve system reliability. The assessment focuses on the impact of short circuits on the system, harmonic distortion, power factor improvement, and the effects of transient disturbances on the electrical system during motor starting and power system fault conditions. We also review the electrical system design against the Nuclear Directorate safety assessment principles, including those extended after the Fukushima nuclear accident. The simplified configuration of the required system can be extended from this simple scheme. For these studies we used a demo version of the ETAP power station software. (authors)

  10. Reliability analysis on resonance for low-pressure compressor rotor blade based on least squares support vector machine with leave-one-out cross-validation

    Haifeng Gao


    This research article analyzes the resonance reliability of a low-pressure compressor rotor blade at a rotating speed of 6150.0 r/min, with the aim of improving the computational efficiency of reliability analysis. The study applies least squares support vector machine (LS-SVM) to predict the natural frequencies of the rotor blade. To build a more stable and reliable LS-SVM model, leave-one-out cross-validation is introduced to search for the optimal LS-SVM parameters, and the resulting model is used to analyze the resonance reliability. Additionally, the modal analysis of the rotor blade at 6150.0 r/min is treated as a tandem system to simplify the analysis and design process, and the randomness of factors influencing the frequencies, such as material properties, structural dimensions, and operating conditions, is taken into consideration. A back-propagation neural network, trained and tested on the same sets as the LS-SVM model with leave-one-out cross-validation, is used to verify the proposed approach. Finally, the statistical results show that the proposed approach is effective and feasible and can be applied to structural reliability analysis.

  11. Development of a Reliable Fuel Depletion Methodology for the HTR-10 Spent Fuel Analysis

    Chung, Kiwhan [Los Alamos National Laboratory]; Beddingfield, David H. [Los Alamos National Laboratory]; Geist, William H. [Los Alamos National Laboratory]; Lee, Sang-Yoon [unaffiliated]


    A technical working group was formed in 2007 between NNSA and CAEA to develop a reliable fuel depletion method for HTR-10 based on MCNPX and to analyze the isotopic inventory and radiation source terms of the HTR-10 spent fuel. The conclusions of this presentation are: (1) a fuel depletion methodology was established and its safeguards application demonstrated; (2) the fuel is proliferation resistant at high discharge burnup ({approx}80 GWD/MtHM), owing to unfavorable isotopics, the high number of pebbles needed, and the difficulty of reprocessing pebbles; (3) spent fuel should remain under safeguards comparable to those of LWRs; and (4) diversion scenarios were not considered, but such analysis can be performed.

  12. Feasibility, reliability, and validity of adolescent health status measurement by the Child Health Questionnaire Child Form (CHQ-CF): Internet administration compared with the standard paper version

    H. Raat (Hein); R.T. Mangunkusumo; J.M. Landgraf (Jeanne); G. Kloek (Gitte); J. Brug (Hans)


    Aims: In this study we evaluated indicators of the feasibility, reliability, and validity of the Child Health Questionnaire-Child Form (CHQ-CF). We compared the results in a subgroup of adolescents who completed the standard paper version of the CHQ-CF with the results in another subgroup…

  13. Reliability Analysis for Adhesive Bonded Composite Stepped Lap Joints Loaded in Fatigue

    Kimiaeifar, Amin; Sørensen, John Dalsgaard; Lund, Erik


    This paper describes a probabilistic approach to calculate the reliability of adhesive bonded composite stepped lap joints loaded in fatigue using three- dimensional finite element analysis (FEA). A method for progressive damage modelling is used to assess fatigue damage accumulation and residual...... strength under fully reversed cyclic loading based on stiffness/strength degradation. The FEA simulations are conducted using the commercial FEA code ANSYS 12.1. A design equation for fatigue failure of wind turbine blades is chosen based on recommendations given in the wind turbine standard IEC 61400...

  14. Common-Cause Failure Analysis for Reactor Protection System Reliability Studies

    Gentillon, C.; Rasmuson, D.; Eide, S.; Wierman, T.


    Analyses were performed of the safety-related performance of the reactor protection system (RPS) at U.S. Westinghouse and General Electric commercial reactors during the period 1984 through 1995. RPS operational data from these reactors were collected from the Nuclear Plant Reliability Data System (NPRDS) and Licensee Event Reports (LER). The common-cause failure (CCF) modeling in the fault trees developed for these studies and the analysis and use of common-cause failure data were sophisticated, state-of-the-art efforts. The overall CCF effort helped to test and expand the limits of the U.S. Nuclear Regulatory Commission's CCF methodology.
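
    A common way to quantify CCF in such fault trees is the beta-factor model, in which a fraction β of each component's failure probability is attributed to a shared cause that fails all redundant trains at once. A minimal sketch for a two-train system, with numbers assumed purely for illustration:

```python
def dual_redundant_failure(p_total, beta):
    """Failure probability of a two-train redundant system under the
    beta-factor model.

    A fraction beta of each train's failure probability p_total is common
    cause (defeats both trains together); the remainder is independent.
    """
    p_ccf = beta * p_total
    p_ind = (1.0 - beta) * p_total
    # System fails on a common-cause event, or (no CCF) both trains
    # fail independently.
    return p_ccf + (1.0 - p_ccf) * p_ind ** 2

no_ccf = dual_redundant_failure(1e-3, 0.0)     # pure independence: ~p^2
with_ccf = dual_redundant_failure(1e-3, 0.05)  # CCF term dominates
```

    Even a small β swamps the benefit of redundancy, which is why CCF modeling received such careful treatment in these RPS studies.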

  15. Statistical analysis of the reliability of complex systems for maintenance planning

    Pedersen, Thomas Espelund


    is to analyze failure and maintenance data using mathematical and statistical models in order to improve maintenance procedures in the Danish Defence. The first part of the report introduces the maintenance planning problem and presents an overview of models for reliability, failure processes, and maintenance...... planning. This overview is structured to highlight the process of choosing a proper model for a given data set, focusing on different measures of time and the data requirements for the different models. The second part of the report describes the analysis of two data sets from the Danish Defence. The data...

  16. Effect of wine dilution on the reliability of tannin analysis by protein precipitation

    Jensen, Jacob Skibsted; Werge, Hans Henrik Malmborg; Egebo, Max


    A reported analytical method for tannin quantification relies on selective precipitation of tannins with bovine serum albumin. The reliability of tannin analysis by protein precipitation on wines having variable tannin levels was evaluated by measuring the tannin concentration of various dilutions...... of five commercial red wines. Tannin concentrations of both very diluted and concentrated samples were systematically underestimated, which could be explained by a precipitation threshold and insufficient protein for precipitation, respectively. Based on these findings, we have defined a valid range...... of the tannin response in the protein precipitation-tannin assay, which suffers minimally from these problems....

  17. Reliability And Maintenance Analysis Of CCTV Systems Used In Rail Transport

    Siergiejczyk Mirosław


    CCTV systems are widely used across a plethora of industrial areas including transport, where their function is to support transport telematics systems and, among other things, to ensure travel safety. This paper presents a reliability and maintenance analysis of CCTV systems used in rail transport. A relationships graph was built, and a Chapman–Kolmogorov system of equations was derived to describe it. Drawing on those equations, relationships were derived for calculating the probability of the system staying in the state of full ability (SPZ), the state of safety hazard (SZB1), and the state of safety unreliability (SB).
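
    In steady state, the Chapman–Kolmogorov approach reduces to solving πQ = 0 with Σπ = 1 for the generator matrix Q of the state graph. The sketch below uses a three-state model (full ability, safety hazard, safety unreliability) with invented transition rates; the actual rates and graph structure of the paper are not reproduced:

```python
def steady_state(Q):
    """Solve pi @ Q = 0 with sum(pi) = 1 by Gaussian elimination."""
    n = len(Q)
    # Transpose so rows are balance equations; replace the (redundant)
    # last balance equation with the normalization constraint.
    A = [[Q[j][i] for j in range(n)] for i in range(n)]
    A[-1] = [1.0] * n
    b = [0.0] * (n - 1) + [1.0]
    for col in range(n):                      # forward elimination, pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):            # back substitution
        x[r] = (b[r] - sum(A[r][c] * x[c] for c in range(r + 1, n))) / A[r][r]
    return x

# States: 0 = full ability, 1 = safety hazard, 2 = safety unreliability.
# Illustrative failure (lam) and restoration (mu) rates in 1/h, assumed.
lam1, lam2, mu1, mu2 = 1e-4, 1e-5, 0.1, 0.05
Q = [[-(lam1 + lam2), lam1, lam2],
     [mu1, -mu1, 0.0],
     [mu2, 0.0, -mu2]]
pi = steady_state(Q)   # pi[0] is the long-run probability of full ability
```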

  18. Reliability Analysis on Data of SO2 Emissions from Thermal Power Plants


    Since the atmospheric pollutants from thermal power plants account for a large proportion of the national total, knowing the status of SO2 emissions from the power industry is of great significance for making control strategies and related environmental policies concerning SO2 and acid rain. Through introduction and analysis of some key links, such as the existing monitoring network, data sources, examining methods and procedures for statistical data, and calculating methods for total national emissions, it is concluded that the SO2 emissions data in the statistical database for the power environment are reliable and can serve as a reference for decision-making on both power development and environmental protection.

  19. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Quality Assurance Manual

    C. L. Smith; R. Nims; K. J. Kvarfordt; C. Wharton


    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment using a personal computer running the Microsoft Windows operating system. SAPHIRE is primarily funded by the U.S. Nuclear Regulatory Commission (NRC). The role of the INL in this project is that of software developer and tester. This development takes place using formal software development procedures and is subject to quality assurance (QA) processes. The purpose of this document is to describe how the SAPHIRE software QA is performed for Versions 6 and 7, what constitutes its parts, and the limitations of those processes.

  20. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Tutorial

    C. L. Smith; S. T. Beck; S. T. Wood


    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a set of computer programs that were developed to create and analyze probabilistic risk assessments (PRAs). This volume is the tutorial manual for the SAPHIRE system. In this document, a series of lessons are provided that guide the user through basic steps common to most analyses performed with SAPHIRE. The tutorial is divided into two major sections covering both basic and advanced features. The section covering basic topics contains lessons that lead the reader through development of a probabilistic hypothetical problem involving a vehicle accident, highlighting the program’s most fundamental features. The advanced features section contains additional lessons that expand on fundamental analysis features of SAPHIRE and provide insights into more complex analysis techniques. Together, these two elements provide an overview into the operation and capabilities of the SAPHIRE software.


    Josip Peko


    This study examined steel and aluminum variants of modern exhibition structures, for which the main design requirements include low weight (a high span/depth ratio), ease of transportation and construction, and durability (resistance to corrosion). This is a design situation in which the structural application of aluminum alloys can provide an extremely convenient and practical solution. The viability of an aluminum structure depends on several factors and requires a detailed analysis. The overall conclusion of the study is that aluminum can be used as a structural material and as a viable alternative to steel for Croatian snow and wind load values, and particularly in cases in which the positive properties of aluminum are required for the structural design. Furthermore, a structural fire analysis was conducted for the aluminum variant by using a zone model for realistic fire analysis. The results suggest that passive fire protection for the main structural members is not required in the event of a realistic fire with a duration of 60 min.

  2. ErpICASSO: a tool for reliability estimates of independent components in EEG event-related analysis.

    Artoni, Fiorenzo; Gemignani, Angelo; Sebastiani, Laura; Bedini, Remo; Landi, Alberto; Menicucci, Danilo


    Independent component analysis and blind source separation methods are steadily gaining popularity for separating individual brain and non-brain source signals mixed by volume conduction in electroencephalographic data. Despite advances in these techniques, determining the number of embedded sources and their reliability are still open issues. In particular, to date no method takes trial-to-trial variability into account in order to provide a reliability measure of the independent components extracted in Event Related Potential (ERP) studies. In this work we present ErpICASSO, a new method which modifies the data-driven approach named ICASSO for the analysis of trials (epochs). In addition to ICASSO, the method enables the user to estimate the number of embedded sources, and provides a quality index of each extracted ERP component by combining trial-to-trial bootstrapping and CCA projection. We applied ErpICASSO to ERPs recorded from 14 subjects presented with unpleasant and neutral pictures. We separated potentials putatively related to different systems and identified the four primary ERP independent sources. Based on the confidence intervals estimated by ErpICASSO, we were able to compare the components between the neutral and unpleasant conditions. ErpICASSO yielded encouraging results, thus providing the scientific community with a useful tool for ICA signal processing whenever dealing with trials recorded under different conditions.

  3. Advanced multiple response surface method of sensitivity analysis for turbine blisk reliability with multi-physics coupling

    Zhang Chunyi


    To reasonably implement reliability analysis and describe the significance of the influencing parameters for the multiple failure modes of a turbine blisk, an advanced multiple response surface method (AMRSM) is proposed for multi-failure-mode reliability sensitivity analysis. The mathematical model of AMRSM is established and the basic principle of multi-failure-mode sensitivity analysis for reliability with AMRSM is given. The important parameters for turbine blisk failure are obtained by the multi-failure-mode sensitivity analysis of the turbine blisk. Through reliability sensitivity analyses of the multiple failure modes (deformation, stress and strain) with the proposed method, considering fluid-thermal-solid interaction, it is shown that the comprehensive reliability of the turbine blisk is 0.9931 when the allowable deformation, stress and strain are 3.7 × 10⁻³ m, 1.0023 × 10⁹ Pa and 1.05 × 10⁻² m/m, respectively; the main impact factors of turbine blisk failure are gas velocity, gas temperature and rotational speed. As demonstrated by a comparison of methods (the Monte Carlo (MC) method, the traditional response surface method (RSM), the multiple response surface method (MRSM) and AMRSM), the proposed AMRSM improves computational efficiency with acceptable computational accuracy. This study provides the AMRSM, with its high precision and efficiency, for multi-failure-mode reliability analysis, and offers useful insight for the reliability optimization design of multi-failure-mode structures.
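
    The response-surface idea — replace an expensive model with a cheap fitted surrogate, then run Monte Carlo on the surrogate — can be sketched in a one-variable toy problem. The limit-state function, design points and input distribution below are invented; AMRSM itself fits multiple coupled response surfaces and is not reproduced here:

```python
import random

def limit_state(x):
    """Stand-in for an expensive multi-physics evaluation (assumed form)."""
    return 3.0 - x * x  # failure when g(x) < 0

def fit_quadratic(xs, ys):
    """Exact quadratic a + b*x + c*x**2 through three design points
    (Newton divided differences, expanded to monomial coefficients)."""
    (x0, x1, x2), (y0, y1, y2) = xs, ys
    c = ((y2 - y0) / (x2 - x0) - (y1 - y0) / (x1 - x0)) / (x2 - x1)
    b = (y1 - y0) / (x1 - x0) - c * (x0 + x1)
    a = y0 - b * x0 - c * x0 * x0
    return a, b, c

# Three runs of the "expensive" model serve as the design of experiments
xs = (-2.0, 0.0, 2.0)
a, b, c = fit_quadratic(xs, tuple(limit_state(x) for x in xs))

# Monte Carlo on the cheap surrogate instead of the expensive model
random.seed(42)
n = 20000
fails = 0
for _ in range(n):
    x = random.gauss(0.0, 1.0)         # assumed standard normal input
    if a + b * x + c * x * x < 0.0:    # surrogate predicts failure
        fails += 1
pf = fails / n                         # estimated failure probability
```

    Because the toy limit state is itself quadratic, the surrogate here is exact; in practice the accuracy of the response surface is precisely what methods like AMRSM try to improve.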

  4. Comparative Distributions of Hazard Modeling Analysis

    Rana Abdul Wajid


    In this paper we present a comparison among the distributions used in hazard analysis. A simulation technique is used to study the behavior of the hazard distribution models. The fundamentals of hazard issues are discussed using failure criteria. We illustrate the flexibility of hazard modeling with distributions that approach different limiting distributions.
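
    As an example of that flexibility, the Weibull family alone covers decreasing, constant and increasing hazards depending on its shape parameter β (compare the wearout slopes β > 1 reported in the wire-bond study above). A minimal sketch with assumed parameter values:

```python
def weibull_hazard(t, beta, eta):
    """Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta - 1).

    beta < 1: decreasing hazard (infant mortality)
    beta == 1: constant hazard (exponential special case)
    beta > 1: increasing hazard (wearout)
    """
    return (beta / eta) * (t / eta) ** (beta - 1)

# Illustrative values: scale eta = 100 h, evaluated early and late in life
h_early_infant = weibull_hazard(10.0, 0.5, 100.0)
h_late_infant = weibull_hazard(90.0, 0.5, 100.0)
h_early_wearout = weibull_hazard(10.0, 2.0, 100.0)
h_late_wearout = weibull_hazard(90.0, 2.0, 100.0)
```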

  5. Saddlepoint approximation based line sampling method for uncertainty propagation in fuzzy and random reliability analysis


    For a structural system with random basic variables as well as fuzzy basic variables, uncertainty propagation from the two kinds of basic variables to the response of the structure is investigated. A novel algorithm for obtaining the membership function of fuzzy reliability is presented with a saddlepoint approximation (SA) based line sampling method. In the presented method, the value domain of the fuzzy basic variables under a given membership level is first obtained according to their membership functions. In this value domain, bounds on the reliability of the structure response satisfying the safety requirement are obtained by employing the SA based line sampling method in the reduced space of the random variables. In this way the uncertainty of the basic variables is propagated to the safety measurement of the structure, and the fuzzy membership function of the reliability is obtained. Compared to the direct Monte Carlo method for propagating the uncertainties of the fuzzy and random basic variables, the presented method can considerably improve computational efficiency with acceptable precision. The presented method has wider applicability than the transformation method, because it does not limit the distribution of the variables or require an explicit expression of the performance function, and no approximation is made for the performance function during the computing process. Additionally, the presented method can easily treat performance functions with cross terms of the fuzzy and random variables, which are not suitably approximated by the existing transformation methods. Several examples are provided to illustrate the advantages of the presented method.

  6. Analysis and improvement of the reliability and longevity of diesel engines of commercial vehicles

    Brunner, F.J.; Zalud, F.H.


    The high demands on the quality, reliability and longevity of new engines require concrete data on technical-functional characteristics, reliability measures and life expectancy, which also have to be included in the performance specifications. As an example, some reliability measures are given for diesel engines installed in commercial vehicles. The conditions under which these reliability measures are valid are stated, and some problems concerning the longevity of engines and their components are discussed. The manufacturer can achieve satisfactory engine reliability by applying a reliability assurance program in which the methods of reliability engineering play an important role.

  7. Orthodoxy and reflexivity in international comparative analysis

    Lind, Jens; Valkenburg, Ben


    project, in which we have tried to deal with these consequences. Fourth, and hopefully as a result of the first three aims, we want to argue that a reflexive approach of international, comparative research is not only desirable, but attainable as well. In order to do so, we begin with a short discussion...... upon the consequences on the level of empirical research. We want to avoid that, so our second and third subject will be the practical implications of reflexivity for empirical research as well as for social policy. Our discussion on these subjects is based on the practical experiences in the INPART...... on the main issues in the so-called 'reflexive approach' and consider the main consequences of this approach for both social science and social policy. Against this background we will discuss the implications for comparative research and the experiences of the INPART project end up with a few central issues...

  8. Comparative Analysis of Validity and Reliability between Computer-Assisted Oral Tests and Face-to-Face Oral Tests



    Conducting oral English teaching and testing follows the rules of language teaching and learning, and reflects both the college English teaching reform and the demands of society. Oral English tests have been adopted by more and more colleges as an important part of comprehensive English testing; after the introduction of graded college English teaching, oral tests have been included in many institutions' final examinations. It is therefore meaningful to explore the theories and methods of computer-assisted oral testing (CAOT), which can help large-scale oral tests assess students' oral communicative competence scientifically and fairly, improve the reliability and validity of oral proficiency measurement, and thereby promote oral English teaching. This paper analyzes the advantages of computer-assisted oral tests and compares them with traditional face-to-face oral tests (FFOT) in order to find a relatively scientific computer-assisted method of evaluating students' oral English proficiency. A semi-direct, computer-assisted oral test also alleviates, to some extent, the shortage and uneven level of qualified teaching staff.

  9. Loss Given Default Modelling: Comparative Analysis

    Yashkir, Olga; Yashkir, Yuriy


    In this study we investigated several most popular Loss Given Default (LGD) models (LSM, Tobit, Three-Tiered Tobit, Beta Regression, Inflated Beta Regression, Censored Gamma Regression) in order to compare their performance. We show that for a given input data set, the quality of the model calibration depends mainly on the proper choice (and availability) of explanatory variables (model factors), but not on the fitting model. Model factors were chosen based on the amplitude of their correlati...

  10. Multi-dimensional reliability assessment of fractal signature analysis in an outpatient sports medicine population.

    Jarraya, Mohamed; Guermazi, Ali; Niu, Jingbo; Duryea, Jeffrey; Lynch, John A; Roemer, Frank W


    The aim of this study was to test the reproducibility of fractal signature analysis (FSA) in a young, active patient population, taking into account several parameters including intra- and inter-reader placement of regions of interest (ROIs) as well as various aspects of projection geometry. In total, 685 patients were included (135 athletes and 550 non-athletes, 18-36 years old). ROIs were situated beneath the medial tibial plateau. The reproducibility of texture parameters was evaluated using intraclass correlation coefficients (ICC). The multi-dimensional assessment included: (1) anterior-posterior (A.P.) vs. posterior-anterior (P.A.) (Lyon-Schuss technique) views on 102 knees; (2) unilateral (single knee) vs. bilateral (both knees) acquisition on 27 knees (acquisition technique otherwise identical; same A.P. or P.A. view); (3) repetition of the same image acquisition on 46 knees (same A.P. or P.A. view, and same unilateral or bilateral acquisition); and (4) intra- and inter-reader reliability with repeated placement of the ROIs in the subchondral bone area on 99 randomly chosen knees. ICC values for the reproducibility of texture parameters for A.P. vs. P.A. image acquisitions, for horizontal and vertical dimensions combined, were 0.72 (95% confidence interval (CI) 0.70-0.74), ranging from 0.47 to 0.81 for the different dimensions. For unilateral vs. bilateral image acquisitions, the ICCs were 0.79 (95% CI 0.76-0.82), ranging from 0.55 to 0.88. For repetition of the identical view, the ICCs were 0.82 (95% CI 0.80-0.84), ranging from 0.67 to 0.85. Intra-reader reliability was 0.93 (95% CI 0.92-0.94) and inter-observer reliability was 0.96 (95% CI 0.88-0.99). A decrease in reliability was observed with increasing voxel sizes. Our study confirms excellent intra- and inter-reader reliability for FSA; however, results seem to be affected by the acquisition technique, which has not previously been recognized.
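
    The ICCs reported above can be reproduced from a subjects-by-readers rating matrix via two-way ANOVA mean squares. Below is a minimal sketch of an ICC(2,1) computation (two-way random effects, absolute agreement, single rater); the variant choice and the function name are our assumptions, since the abstract does not state which ICC formulation was used:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings: (n_subjects, k_raters) array of measurements.
    """
    ratings = np.asarray(ratings, float)
    n, k = ratings.shape
    grand = ratings.mean()
    ss_rows = k * np.sum((ratings.mean(axis=1) - grand) ** 2)   # between subjects
    ss_cols = n * np.sum((ratings.mean(axis=0) - grand) ** 2)   # between raters
    ss_err = np.sum((ratings - grand) ** 2) - ss_rows - ss_cols
    ms_r = ss_rows / (n - 1)
    ms_c = ss_cols / (k - 1)
    ms_e = ss_err / ((n - 1) * (k - 1))
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)
```

    For the intra- and inter-reader analyses, `ratings` would hold the repeated ROI-placement measurements per knee.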

  11. Study on Performance Shaping Factors (PSFs) Quantification Method in Human Reliability Analysis (HRA)

    Kim, Ar Ryum; Jang, Inseok Jang; Seong, Poong Hyun [Korea Advanced Institute of Science and Technology, Daejon (Korea, Republic of); Park, Jinkyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Jong Hyun [KEPCO, Ulsan (Korea, Republic of)


    The purpose of HRA implementation is 1) to achieve the human factors engineering (HFE) design goal of providing operator interfaces that will minimize personnel errors, and 2) to conduct an integrated activity to support probabilistic risk assessment (PRA). For these purposes, various HRA methods have been developed, such as the technique for human error rate prediction (THERP), simplified plant analysis risk human reliability assessment (SPAR-H), and the cognitive reliability and error analysis method (CREAM). In performing HRA, the conditions that influence human performance are represented via context factors called performance shaping factors (PSFs). PSFs are aspects of the human's individual characteristics, environment, organization, or task that specifically degrade or improve human performance, thus respectively increasing or decreasing the likelihood of human errors. Most HRA methods evaluate the weightings of PSFs by expert judgment, and explicit guidance for evaluating the weightings is not provided. It is widely known that the performance of the human operator is one of the critical factors determining the safe operation of NPPs. HRA methods have been developed to identify the possibility and mechanism of human errors, and in applying them the effect of PSFs, which may increase or decrease human error, should be investigated. So far, however, the effect of PSFs has been estimated by expert judgment. Accordingly, in order to estimate the effect of PSFs objectively, a quantitative framework to estimate PSFs by using PSF profiles is introduced in this paper.
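
    As a concrete example of PSF-based quantification, the SPAR-H method mentioned above multiplies a nominal human error probability (HEP) by the product of the PSF multipliers, applying an adjustment factor when three or more negative PSFs are present so that the result stays below 1.0. A hedged sketch (in practice the multiplier values come from the SPAR-H worksheets; the function name is ours):

```python
def spar_h_hep(nominal_hep, psf_multipliers):
    """SPAR-H-style HEP quantification from a nominal HEP and PSF multipliers."""
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    # multipliers > 1 degrade performance ("negative" PSFs)
    negative = sum(1 for m in psf_multipliers if m > 1)
    if negative >= 3:
        # SPAR-H adjustment factor, keeps the HEP bounded below 1.0
        return nominal_hep * composite / (nominal_hep * (composite - 1) + 1)
    return min(nominal_hep * composite, 1.0)
```

    With all multipliers at 1 (nominal conditions) the HEP is unchanged; with several strongly negative PSFs the adjustment prevents the product from exceeding 1.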

  12. A Comprehensive Comparison of Different Clustering Methods for Reliability Analysis of Microarray Data

    Kafieh, Rahele; Mehridehnavi, Alireza


    In this study, we considered some competitive learning methods, including hard competitive learning and soft competitive learning with and without fixed network dimensionality, for reliability analysis in microarrays. In order to have a more extensive view, and keeping in mind that competitive learning methods aim at error minimization or entropy maximization (different kinds of function optimization), we also investigated the abilities of mixture decomposition schemes. This study therefore covers algorithms based on function optimization, with particular emphasis on different competitive learning methods. The goal is to find the most powerful method according to a pre-specified criterion determined with numerical methods and matrix similarity measures. Furthermore, before applying a clustering algorithm, we should provide an indication of the intrinsic ability of the dataset to form clusters. We therefore proposed the Hopkins statistic as a method for measuring the clustering tendency of a dataset. The results show the remarkable ability of the Rayleigh mixture model in comparison with other methods in the reliability analysis task. PMID:24083134
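
    The Hopkins statistic mentioned above compares the nearest-neighbour distances of uniform probe points with those of sampled data points; values near 0.5 suggest random data, values near 1 suggest a clustering tendency. A minimal sketch (function name and defaults are ours):

```python
import numpy as np

def hopkins(X, m=50, rng=None):
    """Hopkins statistic for clustering tendency of data matrix X (n x d)."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    m = min(m, n - 1)
    lo, hi = X.min(axis=0), X.max(axis=0)
    # m uniform probe points in the bounding box of the data
    U = rng.uniform(lo, hi, size=(m, d))
    # m data points sampled without replacement
    idx = rng.choice(n, size=m, replace=False)

    def nn_dist(p, pts):
        # distance from point p to its nearest neighbour in pts
        return np.sqrt(((pts - p) ** 2).sum(axis=1)).min()

    u = [nn_dist(p, X) for p in U]                               # probe -> data
    w = [nn_dist(X[i], np.delete(X, i, axis=0)) for i in idx]    # data -> data
    return sum(u) / (sum(u) + sum(w))
```

    For tightly clustered data the probe distances dominate and the statistic approaches 1; for uniform data both distance sets are comparable and it stays near 0.5.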

  13. Performance improvement of a moment method for reliability analysis using kriging metamodels

    Ju, Byeong Hyeon; Cho, Tae Min; Lee, Byung Chai [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Jung, Do Hyun [Korea Automotive Technology Institute, Chonan (Korea, Republic of)


    Many methods for reliability analysis have been studied, and one of them, the moment method, has the advantage that it does not require sensitivities of performance functions. The moment method for reliability analysis requires the first four moments of a performance function; the Pearson system is then used to obtain the probability of failure, whose accuracy greatly depends on that of the first four moments. It is generally impossible to assess the moments analytically for multidimensional functions, so numerical integration is mainly used to estimate them. However, numerical integration requires many function evaluations, and when finite element analyses are involved, the calculation of the first four moments is very time-consuming. To solve this problem, this research proposes a new method of approximating the first four moments based on a kriging metamodel. The proposed method substitutes the kriging metamodel for the performance function and can also evaluate the accuracy of the calculated moments by adjusting the approximation range. Numerical examples show the proposed method can approximate the moments accurately with fewer function evaluations and can evaluate the accuracy of the calculated moments.
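
    For a single standard-normal input, the first four moments fed to the Pearson system can be estimated by Gauss-Hermite quadrature; in the proposed method the kriging metamodel would stand in for the performance function g. A hedged sketch of the quadrature step only (names are ours):

```python
import numpy as np

def first_four_moments(g, n_nodes=10):
    """Mean, variance, skewness, kurtosis of g(X), X ~ N(0, 1),
    via probabilists' Gauss-Hermite quadrature."""
    nodes, weights = np.polynomial.hermite_e.hermegauss(n_nodes)
    w = weights / weights.sum()        # normalise to a probability measure
    vals = g(nodes)                    # g evaluated at the quadrature nodes
    mean = np.sum(w * vals)
    c = vals - mean
    var = np.sum(w * c ** 2)
    skew = np.sum(w * c ** 3) / var ** 1.5
    kurt = np.sum(w * c ** 4) / var ** 2
    return mean, var, skew, kurt
```

    With n_nodes = 10 the quadrature integrates polynomials up to degree 19 exactly, so the four moments of any low-order polynomial response are exact with only ten function (or metamodel) evaluations.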

  14. Number of iterations needed in Monte Carlo Simulation using reliability analysis for tunnel supports

    E. Bukaçi


    Full Text Available There are many methods in geotechnical engineering which could take advantage of Monte Carlo simulation to establish the probability of failure, since closed-form solutions are almost impossible to use in most cases. The problem that arises with using Monte Carlo simulation is the number of iterations needed for a particular simulation. This article shows why it is important to calculate the number of iterations needed for Monte Carlo simulation used in reliability analysis for tunnel supports with the convergence-confinement method. The number of iterations needed is calculated with two methods. In the first method, the analyst has to assume a distribution function for the performance function. The other method suggested by this article is to calculate the number of iterations based on the convergence of the quantity the analyst is interested in. Reliability analysis is performed for the diversion tunnel in Rrëshen, Albania, using both methods, and the results are compared.
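
    The two approaches to choosing the number of iterations can be sketched as follows: a closed-form sample size for a target coefficient of variation of the failure-probability estimate, and a stopping rule based on convergence of the quantity of interest. Function names, defaults, and the specific stopping rule are our assumptions:

```python
import numpy as np

def required_iterations(p_f, target_cov):
    """N such that the coefficient of variation of the Monte Carlo estimate
    of p_f is about target_cov: cov = sqrt((1 - p_f) / (N * p_f))."""
    return int(np.ceil((1 - p_f) / (target_cov ** 2 * p_f)))

def run_until_converged(is_failure, batch=1000, tol=0.01, seed=0, max_n=10**7):
    """Second approach: add batches of samples until the running estimate of
    the failure probability changes by less than tol (relative) per batch."""
    rng = np.random.default_rng(seed)
    fails = trials = 0
    prev = None
    while trials < max_n:
        fails += int(np.sum(is_failure(rng, batch)))
        trials += batch
        p = fails / trials
        if prev is not None and prev > 0 and abs(p - prev) / prev < tol:
            return p, trials
        prev = p
    return fails / trials, trials
```

    For example, estimating a failure probability of 0.01 with a 10% coefficient of variation requires on the order of ten thousand iterations, which is why small target probabilities quickly make crude Monte Carlo expensive.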

  15. [Amaranth flour: characteristics, comparative analysis, application possibilities].

    Zharkov, I M; Miroshnichenko, L A; Zviagin, A A; Bavykina, I A


    Amaranth flour, a product of amaranth seed processing, is a valuable industrial raw material that has a unique chemical composition and may be used for the nutrition of people suffering from intolerance to the proteins of traditional cereals, including celiac disease patients. The aim of the research was to study the composition of two types of amaranth flour compared with semolina, which is traditionally consumed by the Russian population, and to compare the composition of milk amaranth-flour porridge with milk semolina porridge. The composition of whole-ground amaranth flour and premium-grade amaranth flour processed from amaranth seeds grown in the Voronezh region was researched. Protein content in amaranth flour was 10.8-24.3% higher than in semolina, and its biological value and NPU coefficient were higher by 22.65 and 46.51% respectively; the lysine score in premium-grade amaranth flour protein came up to 107.54%, versus only 40.95% in semolina protein. The level of digestible carbohydrates, including starch, was lower in amaranth flour than in semolina by 2.79-12.85 and 4.76-15.85% respectively, while fiber content was 15.5- to 30-fold higher. Fat content in premium-grade amaranth flour was 2.4-fold lower than in whole-ground amaranth flour, but 45% higher than in semolina. The main advantage of amaranth flour protein compared to wheat protein is the predominance of albumins and globulins, a minimal content of prolamins and the complete absence of alpha-gliadin. The specifics of its chemical composition allow amaranth flour to be recommended for inclusion in the nutrition of both healthy children and adults and also of celiac disease patients.

  16. Nigerian Criminal Networks; A comparative analysis.

    Alkholt, Aimar


    Why is an African federation home to one of the most dominant criminal networks operating globally? Nigeria is not well known for a high level of Internet infrastructure, yet it is in a class of its own when it comes to e-fraud and 419 spam mails. It is also prominent within the drug trade and the African-European trafficking network. By comparatively analysing other forms of organized crime against the Nigerian brand, the thesis has tried to find the particulars of Nigerian Crimina...

  17. Reliability Analysis of Partially Repairable Systems



    The reliability analysis of a system with both repairable and non-repairable failures is presented. It is assumed that the system has n repairable failure modes and m non-repairable failure modes. When a repairable failure mode occurs, the system is repaired after the failure mode is detected; when a non-repairable failure mode occurs, the system never works again. A system with both repairable and non-repairable failures thus gives rise to new reliability indices. The definitions of the new reliability indices are given, and methods for calculating them are derived using probability analysis and the supplementary variable technique.
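
    The kind of index involved here (e.g. mean system life until an absorbing, non-repairable failure) can be explored with a small competing-risks simulation: exponential repairable and non-repairable failure modes, where repairable failures add a repair time and the first non-repairable failure ends the system's life. This is a hedged illustration under our own exponential assumptions; the paper's supplementary-variable derivation is analytical, not simulation-based:

```python
import numpy as np

def mean_life(rep_rates, nonrep_rates, mean_repair, n=10000, seed=0):
    """Simulated mean time until the first non-repairable failure.

    rep_rates / nonrep_rates: exponential rates of the competing failure modes.
    mean_repair: mean (exponential) repair time after a repairable failure.
    """
    rng = np.random.default_rng(seed)
    rates = np.array(list(rep_rates) + list(nonrep_rates), float)
    n_rep = len(rep_rates)
    total = rates.sum()
    lives = []
    for _ in range(n):
        t = 0.0
        while True:
            t += rng.exponential(1 / total)              # time to next failure
            mode = rng.choice(len(rates), p=rates / total)
            if mode < n_rep:
                t += rng.exponential(mean_repair)        # repair, then continue
            else:
                break                                    # absorbing failure
        lives.append(t)
    return float(np.mean(lives))
```

    With one repairable and one non-repairable mode of equal rate 1 and mean repair time 0.5, the expected life works out analytically to 1.5, which the simulation reproduces.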

  18. Comparative genomic analysis of soybean flowering genes.

    Chol-Hee Jung

    Full Text Available Flowering is an important agronomic trait that determines crop yield. Soybean is a major oilseed legume crop used for human and animal feed. Legumes have unique vegetative and floral complexities. Our understanding of the molecular basis of flower initiation and development in legumes is limited. Here, we address this by using a computational approach to examine flowering regulatory genes in the soybean genome in comparison to the most studied model plant, Arabidopsis. For this comparison, a genome-wide analysis of orthologue groups was performed, followed by an in silico gene expression analysis of the identified soybean flowering genes. Phylogenetic analyses of the gene families highlighted the evolutionary relationships among these candidates. Our study identified key flowering genes in soybean and indicates that the vernalisation and the ambient-temperature pathways seem to be the most variant in soybean. A comparison of the orthologue groups containing flowering genes indicated that, on average, each Arabidopsis flowering gene has 2-3 orthologous copies in soybean. Our analysis highlighted that the CDF3, VRN1, SVP, AP3 and PIF3 genes are paralogue-rich genes in soybean. Furthermore, the genome mapping of the soybean flowering genes showed that these genes are scattered randomly across the genome. A paralogue comparison indicated that the soybean genes comprising the largest orthologue group are clustered in a 1.4 Mb region on chromosome 16 of soybean. Furthermore, a comparison with the undomesticated soybean (Glycine soja revealed that there are hundreds of SNPs that are associated with putative soybean flowering genes and that there are structural variants that may affect the genes of the light-signalling and ambient-temperature pathways in soybean. Our study provides a framework for the soybean flowering pathway and insights into the relationship and evolution of flowering genes between a short-day soybean and the long-day plant

  19. Comparative analysis of life insurance market

    Malynych, Anna Mykolayivna


    Full Text Available The article deals with a comprehensive analysis of statistical insight into the development of the world and regional life insurance markets on the basis of macroeconomic indicators. The author located the domestic life insurance market on the global scale, analyzed its development and suggested a method to calculate a marketing life insurance index. The method was also tested on a database of 77 countries all over the world. The author also defined a national rating on the basis of the marketing life insurance index.

  20. Enforcing a system approach to composite failure criteria for reliability analysis

    Dimitrov, Nikolay Krasimirov; Friis-Hansen, Peter; Berggreen, Christian


    ...challenges with the use of failure criteria, since composite materials are a discontinuous medium, which invokes multiple failure modes. Under deterministic conditions the material properties and the stress vector are constant and will result in a single dominating failure mode. When any of these input parameters are random, multiple failure modes may be identified, which will jeopardize the FORM analysis, and a system approach should be applied to assure a correct analysis. Although crude Monte Carlo simulation may automatically account for such effects, time constraints limit its usability in problems involving advanced FEM models. When applying more computationally efficient methods based on FORM/SORM it is important to carefully account for the multiple failure modes described by the failure criterion. The present paper discusses how to handle this problem and presents examples where reliability...
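
    The system effect described here can be illustrated with crude Monte Carlo over a series system, where the composite fails as soon as any failure mode's limit state is violated. A minimal sketch (the function names and the sampling setup are our assumptions, not the paper's method):

```python
import numpy as np

def series_system_pf(limit_states, sample_inputs, n=200_000, seed=0):
    """Failure probability of a series system: fail if ANY g_i(x) <= 0.

    limit_states: list of vectorised functions g_i taking an (n, d) array.
    sample_inputs: callable (rng, n) -> (n, d) array of random inputs.
    """
    rng = np.random.default_rng(seed)
    x = sample_inputs(rng, n)
    failed = np.zeros(n, dtype=bool)
    for g in limit_states:
        failed |= g(x) <= 0        # union of the individual failure events
    return failed.mean()
```

    Note that the system failure probability can greatly exceed that of any single mode, which is exactly why a single-mode FORM analysis can be unconservative for composites.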


    Ronald L. Boring; David I. Gertman


    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no U.S. nuclear power plant has implemented CPs in its main control room. Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  2. Reliability Analysis-Based Numerical Calculation of Metal Structure of Bridge Crane

    Wenjun Meng


    Full Text Available The study introduced a finite element model of the DQ75t-28m bridge crane metal structure and performed finite element static analysis to obtain the stress response at the dangerous point of the metal structure under the most extreme condition. Simulated samples of the random variables and the stress at the dangerous point were obtained through orthogonal design. Then a BP neural network's nonlinear mapping function was trained to obtain an explicit expression of the stress response to the random variables. Combined with random perturbation theory and the first-order second-moment (FOSM) method, the study analyzed the reliability of the metal structure and its sensitivity. In conclusion, we established a novel method for accurate quantitative analysis and design of bridge crane metal structures.
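
    The FOSM step of such an analysis reduces to a mean-value reliability index computed from the gradient of the performance function; in the paper the explicit expression comes from the trained BP network, while the sketch below simply uses a finite-difference gradient on any callable. Names and the finite-difference scheme are our assumptions:

```python
import numpy as np

def fosm_beta(g, mu, sigma, h=1e-6):
    """Mean-value FOSM reliability index for limit state g(x) = 0 (fail if g < 0).

    mu, sigma: means and standard deviations of the (assumed independent) inputs.
    """
    mu = np.asarray(mu, float)
    sigma = np.asarray(sigma, float)
    # central-difference gradient of g at the mean point
    grad = np.array([(g(mu + h * e) - g(mu - h * e)) / (2 * h)
                     for e in np.eye(len(mu))])
    return g(mu) / np.sqrt(np.sum((grad * sigma) ** 2))
```

    For a linear resistance-minus-load state g = R - S with R ~ (10, 1) and S ~ (5, 1), the index is 5/sqrt(2), about 3.54, and sensitivities follow from the gradient components.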

  3. Adapting Human Reliability Analysis from Nuclear Power to Oil and Gas Applications

    Boring, Ronald Laurids [Idaho National Laboratory


    ABSTRACT: Human reliability analysis (HRA), as currently used in risk assessments, largely derives its methods and guidance from application in the nuclear energy domain. While there are many similarities between nuclear energy and other safety-critical domains such as oil and gas, there remain clear differences. This paper provides an overview of HRA state of the practice in nuclear energy and then describes areas where refinements to the methods may be necessary to capture the operational context of oil and gas. Many key distinctions important to nuclear energy HRA, such as Level 1 vs. Level 2 analysis, may prove insignificant for oil and gas applications. On the other hand, existing HRA methods may not be sensitive enough to factors like the extensive use of digital controls in oil and gas. This paper provides an overview of these considerations to assist in the adaptation of existing nuclear-centered HRA methods to the petroleum sector.

  4. Detectability and reliability analysis of the local seismic network in Pakistan


    The detectability and reliability analysis for the local seismic network is performed employing the Bungum and Husebye technique. The events were relocated using standard computer codes for hypocentral locations. The detectability levels are estimated from twenty-five years of recorded data in terms of 50%, 90% and 100% cumulative detectability thresholds, which were derived from the frequency-magnitude distribution. From this analysis, the 100% level of detectability of the network is ML=1.7 for events which occur within the network. The accuracy of the network's hypocentral solutions is investigated by considering a fixed real hypocenter within the network. The epicentral errors are found to be less than 4 km when the events occur within the network. Finally, the problems faced during continuous operation of the local network, which affect its detectability, are discussed.
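
    Detectability thresholds of this kind are derived from the frequency-magnitude distribution; above the completeness magnitude, the Gutenberg-Richter b-value can be estimated with Aki's maximum-likelihood formula. A hedged sketch (the completeness value 1.7 in the test echoes the abstract's ML=1.7; the function name is ours, and the bin-width correction is omitted):

```python
import numpy as np

def b_value_mle(mags, mc):
    """Aki maximum-likelihood b-value for magnitudes at or above completeness mc.

    b = log10(e) / (mean(M) - mc), ignoring magnitude-binning corrections.
    """
    m = np.asarray(mags, float)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - mc)
```

    The cumulative detectability curves in such studies are then read off as the magnitudes at which the observed counts fall below the extrapolated Gutenberg-Richter line.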


    Ronald L. Boring; David I. Gertman


    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room. Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  6. Analysis of the influence of input data uncertainties on determining the reliability of reservoir storage capacity

    Marton Daniel


    Full Text Available The paper contains a sensitivity analysis of the influence of uncertainties in the input hydrological, morphological and operating data required for the design of active reservoir conservation storage capacity and its achieved values. By introducing uncertainties into the considered inputs of the water management analysis of a reservoir, the analysed reservoir storage capacity is also affected by uncertainties, as are the values of water outflows from the reservoir and the hydrological reliabilities. A simulation model of reservoir behaviour has been compiled for this kind of calculation, as described below. The model allows evaluation of the solution results, taking uncertainties into consideration, contributing to a reduction in the occurrence of failure or lack of water during reservoir operation in low-water and dry periods.
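
    The hydrological reliability such a simulation model evaluates can be illustrated with a toy monthly mass-balance: the reservoir fails in any month in which stored water plus inflow cannot cover the demand. This is a deliberately simplified sketch (names, units, the occurrence-based reliability definition, and the spill handling are our assumptions):

```python
def simulate_reservoir(inflow, demand, capacity, s0=None):
    """Occurrence-based reliability of a reservoir over a monthly inflow series.

    inflow: iterable of monthly inflow volumes; demand: constant monthly demand;
    capacity: conservation storage capacity; s0: initial storage (default full).
    """
    s = capacity if s0 is None else s0
    failures = 0
    for q in inflow:
        s += q
        if s >= demand:
            s -= demand          # demand fully met this month
        else:
            failures += 1        # failure month: demand not covered
            s = 0.0
        s = min(s, capacity)     # excess water spills
    return 1 - failures / len(inflow)
```

    Repeating this simulation with inflow series and capacities perturbed by their input uncertainties yields a distribution of reliabilities rather than a single value, which is the effect the paper analyses.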

  7. The Constant Comparative Method of Qualitative Analysis

    Barney G. Glaser, Ph.D.


    Full Text Available Currently, the general approaches to the analysis of qualitative data are these: 1. If the analyst wishes to convert qualitative data into crudely quantifiable form so that he can provisionally test a hypothesis, he codes the data first and then analyzes it. He makes an effort to code "all relevant data [that] can be brought to bear on a point," and then systematically assembles, assesses and analyzes these data in a fashion that will "constitute proof for a given proposition." 2. If the analyst wishes only to generate theoretical ideas (new categories and their properties, hypotheses and interrelated hypotheses), he cannot be confined to the practice of coding first and then analyzing the data since, in generating theory, he is constantly redesigning and reintegrating his theoretical notions as he reviews his material. Explicit coding itself often seems an unnecessary, burdensome task. As a result, the analyst merely inspects his data for new properties of his theoretical categories, and writes memos on these properties. We wish to suggest a third approach...

  8. Nonlinear analysis of RED - a comparative study

    Jiang Kai; Wang Xiaofan; Xi Yugeng


    Random Early Detection (RED) is an active queue management (AQM) mechanism for routers on the Internet. In this paper, the performance of RED and Adaptive RED is compared from the viewpoint of nonlinear dynamics. In particular, we reveal the relationship between the performance of the network and its nonlinear dynamical behavior. We measure the maximal Lyapunov exponent and Hurst parameter of the average queue length of RED and Adaptive RED, as well as the throughput and packet loss rate of the aggregate traffic on the bottleneck link. Our simulation scenarios include FTP flows and Web flows, one-way and two-way traffic. In most situations, Adaptive RED has smaller maximal Lyapunov exponents, lower Hurst parameters, higher throughput and lower packet loss rate than RED. This confirms that Adaptive RED has better performance than RED.
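
    The Hurst parameter used above as a performance indicator can be estimated from the average queue length series by rescaled-range (R/S) analysis: H near 0.5 indicates short-range dependence, H near 1 strong persistence. A hedged sketch (the dyadic window scheme and names are ours; the paper does not specify its estimator):

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Hurst exponent of series x via rescaled-range (R/S) analysis."""
    x = np.asarray(x, float)
    n = len(x)
    sizes, rs = [], []
    size = min_chunk
    while size <= n // 2:                      # dyadic window sizes
        vals = []
        for start in range(0, n - size + 1, size):
            seg = x[start:start + size]
            dev = seg - seg.mean()
            z = np.cumsum(dev)                 # cumulative deviations
            r = z.max() - z.min()              # range
            s = seg.std()                      # scale
            if s > 0:
                vals.append(r / s)
        sizes.append(size)
        rs.append(np.mean(vals))
        size *= 2
    # H is the log-log slope of mean R/S against window size
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope
```

    Applied to the sampled queue-length trace, a lower estimated H for Adaptive RED would correspond to the weaker long-range dependence reported in the abstract.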

  9. Comparative analysis of some search engines

    Taiwo O. Edosomwan


    Full Text Available We compared the information retrieval performance of some popular search engines (namely Google, Yahoo, AlltheWeb, Gigablast, Zworks, AltaVista and Bing/MSN) in response to a list of ten queries varying in complexity. These queries were run on each search engine, and the precision and response time of the retrieved results were recorded. The first ten documents of each retrieval output were evaluated as 'relevant' or 'non-relevant' to evaluate the search engine's precision. To evaluate response time, normalised recall ratios were calculated at various cut-off points for each query and search engine. This study shows that Google appears to be the best search engine in terms of both average precision (70%) and average response time (2 s). Gigablast and AlltheWeb performed the worst overall in this study.

  10. Comparative analysis of Goodwin's business cycle models

    Antonova, A. O.; Reznik, S.; Todorov, M. D.


    We compare the behavior of solutions of Goodwin's business cycle equation in the form of a neutral delay differential equation with fixed delay (NDDE model) and in the form of differential equations of 3rd, 4th and 5th order (ODE models). Such ODE models (Taylor series expansions of the NDDE in powers of θ) were proposed by N. Dharmaraj and K. Vela Velupillai [6] for investigation of the short periodic sawtooth oscillations in the NDDE. We show that the ODEs of 3rd, 4th and 5th order may approximate the asymptotic behavior of only the main Goodwin mode, but not the sawtooth modes. If the order of the Taylor series expansion exceeds 5, the approximate ODE becomes unstable independently of the time lag θ.

  11. Comparative analysis of Debrecen sunspot catalogues

    Győri, L.; Ludmány, A.; Baranyi, T.


    Sunspot area data are important for studying solar activity and its long-term variations. At the Debrecen Heliophysical Observatory, we compiled three sunspot catalogues: the Debrecen Photoheliographic Data (DPD), the SDO/HMI Debrecen Data (HMIDD) and the SOHO/MDI Debrecen Data. For comparison, we also compiled an additional sunspot catalogue, the Greenwich Photoheliographic Data, from the digitized Royal Greenwich Observatory images for 1974-76. By comparing these catalogues when they overlap in time, we can investigate how various factors influence the measured area of sunspots, and, in addition, we can derive area cross-calibration factors for these catalogues. The main findings are as follows. Poorer seeing increases the individual corrected spot areas and decreases the number of small spots. Interestingly, the net result of these two effects for the total corrected spot area is zero. DPD daily total corrected sunspot areas are 5 per cent smaller than the HMIDD ones. Revised DPD daily total corrected umbra areas are 9 per cent smaller than those of HMIDD. The Greenwich photoheliographic areas are only a few per cent smaller than DPD areas. A 0.2° difference between the north directions of the DPD and MDI images is found. This value is nearly the same as was found (0.22°) by us in a previous paper comparing HMI and MDI images. The area measurement practice (spots smaller than 10 mh were not directly measured but an area of 2 mh was assigned to each) of the Solar Observing Optical Network cannot explain the large area deficit of the Solar Observing Optical Network.

  12. Comparative analysis of cystatin superfamily in platyhelminths.

    Aijiang Guo

    Full Text Available The cystatin superfamily is comprised of cysteine proteinase inhibitors and encompasses at least 3 subfamilies: stefins, cystatins and kininogens. In this study, the platyhelminth cystatin superfamily was identified and grouped into stefin and cystatin subfamilies. The conserved domain of stefins (G, QxVxG) was observed in all members of platyhelminth stefins. The three characteristics of cystatins, the cystatin-like domain (G, QxVxG, PW), a signal peptide, and one or two conserved disulfide bonds, were observed in platyhelminths, with the exception of cestodes, which lacked the conserved disulfide bond. However, it is noteworthy that cestode cystatins had two tandem repeated domains, although the second tandem repeated domain did not contain a cystatin-like domain, which has not been previously reported. Tertiary structure analysis of Taenia solium cystatin, one of the cestode cystatins, demonstrated that the N-terminus of T. solium cystatin formed a five turn α-helix, a five stranded β-pleated sheet and a hydrophobic edge, similar to the structure of chicken cystatin. Although no conserved disulfide bond was found in T. solium cystatin, the models of T. solium cystatin and chicken cystatin corresponded at the site of the first disulfide bridge of the chicken cystatin. However, the two models were not similar regarding the location of the second disulfide bridge of chicken cystatin. These results showed that T. solium cystatin and chicken cystatin had similarities and differences, suggesting that the biochemistry of T. solium cystatin could be similar to chicken cystatin in its inhibitory function and that it may have further functional roles. The same results were obtained for other cestode cystatins. Phylogenetic analysis showed that cestode cystatins constituted an independent clade and implied that cestode cystatins should be considered to have formed a new clade during evolution.

  13. Comparative analysis of cystatin superfamily in platyhelminths.

    Guo, Aijiang


    The cystatin superfamily is comprised of cysteine proteinase inhibitors and encompasses at least 3 subfamilies: stefins, cystatins and kininogens. In this study, the platyhelminth cystatin superfamily was identified and grouped into stefin and cystatin subfamilies. The conserved domain of stefins (G, QxVxG) was observed in all members of platyhelminth stefins. The three characteristics of cystatins, the cystatin-like domain (G, QxVxG, PW), a signal peptide, and one or two conserved disulfide bonds, were observed in platyhelminths, with the exception of cestodes, which lacked the conserved disulfide bond. However, it is noteworthy that cestode cystatins had two tandem repeated domains, although the second tandem repeated domain did not contain a cystatin-like domain, which has not been previously reported. Tertiary structure analysis of Taenia solium cystatin, one of the cestode cystatins, demonstrated that the N-terminus of T. solium cystatin formed a five turn α-helix, a five stranded β-pleated sheet and a hydrophobic edge, similar to the structure of chicken cystatin. Although no conserved disulfide bond was found in T. solium cystatin, the models of T. solium cystatin and chicken cystatin corresponded at the site of the first disulfide bridge of the chicken cystatin. However, the two models were not similar regarding the location of the second disulfide bridge of chicken cystatin. These results showed that T. solium cystatin and chicken cystatin had similarities and differences, suggesting that the biochemistry of T. solium cystatin could be similar to chicken cystatin in its inhibitory function and that it may have further functional roles. The same results were obtained for other cestode cystatins. Phylogenetic analysis showed that cestode cystatins constituted an independent clade and implied that cestode cystatins should be considered to have formed a new clade during evolution.

  14. Microgrid Design Analysis Using Technology Management Optimization and the Performance Reliability Model

    Stamp, Jason E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jensen, Richard P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Munoz-Ramos, Karina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)


    Microgrids are a focus of localized energy production that support resiliency, security, local control, and increased access to renewable resources (among other potential benefits). The Smart Power Infrastructure Demonstration for Energy Reliability and Security (SPIDERS) Joint Capability Technology Demonstration (JCTD) program between the Department of Defense (DOD), Department of Energy (DOE), and Department of Homeland Security (DHS) resulted in the preliminary design and deployment of three microgrids at military installations. This paper is focused on the analysis process and supporting software used to determine optimal designs for energy surety microgrids (ESMs) in the SPIDERS project. There are two key pieces of software: an existing software application developed by Sandia National Laboratories (SNL) called Technology Management Optimization (TMO) and a new simulation developed for SPIDERS called the performance reliability model (PRM). TMO is a decision support tool that performs multi-objective optimization over a mixed discrete/continuous search space for which the performance measures are unrestricted in form. The PRM is able to statistically quantify the performance and reliability of a microgrid operating in islanded mode (disconnected from any utility power source). Together, these two software applications were used as part of the ESM process to generate the preliminary designs presented by the SNL-led DOE team to the DOD. Acknowledgements: Sandia National Laboratories and the SPIDERS technical team would like to acknowledge the following for help in the project: Mike Hightower, who has been the key driving force for Energy Surety Microgrids; Juan Torres and Abbas Akhil, who developed the concept of microgrids for military installations; Merrill Smith, U.S. Department of Energy SPIDERS Program Manager; Ross Roley and Rich Trundy from U.S. Pacific Command; Bill Waugaman and Bill Beary from U.S. Northern Command; Tarek Abdallah, Melanie
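    A PRM-style islanded-mode reliability estimate can be sketched as a simple Monte Carlo simulation. The generator fleet, failure rates, critical load, and outage duration below are invented for illustration and are not SPIDERS data:

```python
import random

# Hypothetical sketch: probability that an islanded microgrid serves its
# critical load for a 72-hour utility outage. All parameters are illustrative.

GENERATORS = [  # (capacity_kW, failure_rate_per_hour)
    (500, 1e-4),
    (500, 1e-4),
    (250, 2e-4),
]
CRITICAL_LOAD_KW = 700
HOURS = 72

def survives(rng):
    """One trial: does the surviving capacity cover the load every hour?"""
    up = [True] * len(GENERATORS)
    for _ in range(HOURS):
        for i, (_, lam) in enumerate(GENERATORS):
            if up[i] and rng.random() < lam:
                up[i] = False  # generator fails; assumed unrepairable while islanded
        capacity = sum(c for (c, _), ok in zip(GENERATORS, up) if ok)
        if capacity < CRITICAL_LOAD_KW:
            return False
    return True

def estimate_reliability(trials=20000, seed=1):
    rng = random.Random(seed)
    return sum(survives(rng) for _ in range(trials)) / trials
```

    A real PRM would also model repair, fuel limits, and time-varying load; this sketch only shows the statistical quantification idea.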

  15. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) GEM Manual

    C. L. Smith; J. Schroeder; S. T. Beck


    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer running the Microsoft Windows operating system. SAPHIRE is primarily funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer and tester. A complementary program called GEM uses the SAPHIRE analysis engine and relational database. GEM has been designed to simplify the use of existing PRA analyses for activities such as the NRC's Accident Sequence Precursor program. In this report, the theoretical framework behind GEM-type calculations is discussed, in addition to providing guidance and examples for performing evaluations when using the GEM software. As part of this analysis framework, the two types of GEM analysis are outlined: initiating event assessments (where an initiator occurs) and condition assessments (where a component is failed for some length of time).
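    The condition assessment described above can be illustrated with a back-of-the-envelope calculation: re-quantify the risk model with the component's failure set to TRUE, and integrate the increase over the outage window. The frequencies below are placeholders, not SAPHIRE or NRC results:

```python
# Hypothetical sketch of a GEM-style condition assessment: the increase in
# core-damage probability when a component is known failed for a duration.
# The numeric inputs are illustrative placeholders.

def condition_assessment(base_cdf, cdf_given_failed, hours_out):
    """Conditional core-damage probability increment over the outage window.

    base_cdf, cdf_given_failed: core-damage frequencies per hour, nominal
    and re-quantified with the failed component's basic event set TRUE.
    """
    delta = (cdf_given_failed - base_cdf) * hours_out
    return max(delta, 0.0)

# Example: component out of service for 336 h (two weeks)
ccdp = condition_assessment(base_cdf=1e-9, cdf_given_failed=5e-8, hours_out=336)
```

    This linear approximation is adequate only when the probabilities involved are small; a full GEM evaluation works on the underlying fault-tree/event-tree model rather than two scalar frequencies.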


    V. M. Alpatov


    Full Text Available Abstract: The article analyses the book on political systems and processes in the East prepared by MGIMO-University authors and edited by Alexei D. Voskressenski, in order to show the differences in the approaches and methods used in linguistics and political science. The author identifies two significant differences between the present-day disciplines, stressing that, as he believes, nineteenth-century linguistics was closer to present-day political science. The first difference is the monism of the political-science approach: the book relies on a monistic scale from totalitarianism to democracy, while linguistics has abandoned the monistic view of typology. The second difference is the value-ladenness of the political-science approach. The value-free norm in linguistics presupposes setting up a single standard for all speakers in order to reach full mutual understanding, whereas in political science subjective criteria are decisive for evaluation. The article gives examples from the book to argue that political science, compared to linguistics, is not aimed at overcoming contradictions, distinguishing between terms, or avoiding unproved statements and subjective evaluations.

  17. Comparative Analysis of Frames with Varying Inertia

    Prerana Nampalli


    Full Text Available This paper presents the elastic seismic response of reinforced concrete frames with three height variations, i.e. (G+2), (G+4), and (G+6) storey models, compared for bare-frame and brick-infill structures. The models have been analyzed for gravity as well as seismic forces, and their response is studied as the geometric parameters vary, with a view to predicting the behavior of similar structures subjected to similar loads or load combinations. In this study, two different cases are selected, i.e. frames with prismatic members and frames with non-prismatic members. The structural response of various members when the geometry changes physically, as in the case of linear and parabolic haunches provided beyond the face of columns at beam-column joints, or of step variations as in stepped haunches, was also studied. Frames have been analyzed statically as well as dynamically using ETABS 9.7.4 software, referring to IS: 456-2000 and IS: 1893 (Part-1):2002, and the results so obtained are grouped into various categories.

  18. Comparative proteomics analysis of human gastric cancer

    Wei Li; Jian-Fang Li; Ying Qu; Xue-Hua Chen; Jian-Min Qin; Qin-Long Gu; Min Yan; Zheng-Gang Zhu; Bing-Ya Liu


    AIM: To isolate and identify differentially expressed proteins between cancer and normal tissues of gastric cancer by two-dimensional electrophoresis (2-DE) and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS). METHODS: Soluble-fraction proteins of gastric cancer tissues and paired normal tissues were separated by 2-DE. The differentially expressed proteins were selected and identified by MALDI-TOF-MS and database search. RESULTS: 2-DE profiles with high resolution and reproducibility were obtained. Twenty-three protein spots were excised from silver-stained gels and digested in gel by trypsin, of which fifteen protein spots were identified successfully. Among the identified proteins, ten were over-expressed and five under-expressed in stomach cancer tissues compared with normal tissues. CONCLUSION: In this study, well-resolved, reproducible 2-DE patterns of human gastric cancer tissue and paired normal tissue were established and optimized, and certain differentially expressed proteins were identified. The combined use of 2-DE and MS provides an effective approach to screening for potential tumor markers.

  19. Human reliability analysis data obtainment through fuzzy logic in nuclear plants

    Nascimento, C.S. do, E-mail: [Centro Tecnologico da Marinha em Sao Paulo (CTMSP), Av. Professor Lineu Prestes 2468, 05508-000 Sao Paulo, SP (Brazil); Mesquita, R.N. de, E-mail: [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN - SP), Av. Professor Lineu Prestes 2242, 05508-000 Sao Paulo, SP (Brazil)


    Highlights: • Human Error Probability estimates from operators' reactions to emergency situations. • Human Reliability Analysis input data obtained through fuzzy logic inference. • Evaluation of the level of influence of Performance Shaping Factors on operators' actions. - Abstract: Human error has been recognized as an important factor in the occurrence of many industrial and nuclear accidents. Human error data are scarcely available for different reasons, among which lapses in historical database registry methodology are an important one. Human Reliability Analysis (HRA) is a usual tool employed to estimate the probability that an operator will reasonably perform a required system task in the required time without degrading the system. This meta-analysis requires specific Human Error Probability estimates for most of its procedure. This work obtains Human Error Probability (HEP) estimates from operators' actions in response to hypothesized emergency situations at the IEA-R1 research reactor at IPEN, Brazil. With the proposed methodology, HRA can be performed even with a shortage of related human error statistical data. An evaluation of Performance Shaping Factors (PSFs) was also carried out in order to classify and estimate their level of influence on the operators' actions and to determine their actual state in the plant. Both the HEP estimation and the PSF evaluation were based on expert judgment using interviews and questionnaires. The expert group was established from selected IEA-R1 operators, and their evaluations were entered into a knowledge representation system that used linguistic variables and group evaluation values obtained through fuzzy logic and fuzzy set theory. The obtained HEP values show good agreement with data published in the literature, corroborating the proposed methodology as a good alternative for use in HRA.
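    The fuzzy inference step can be sketched as follows. The PSF scales, membership functions, rule base, and representative HEP values are invented for illustration and do not reproduce the authors' knowledge representation system:

```python
# Minimal Mamdani-style sketch: expert ratings of two Performance Shaping
# Factors (0-10 scales) are fuzzified, combined by rules, and defuzzified
# into a Human Error Probability. All numbers are illustrative assumptions.

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def hep_estimate(stress, training):
    """Infer an HEP from two PSF ratings (0 = best .. 10 = worst stress)."""
    # Fuzzify inputs (peaks chosen so the extremes of the scale are crisp)
    stress_high = tri(stress, 5, 10, 15)     # fully "high" at stress = 10
    training_low = tri(training, -5, 0, 5)   # fully "low" at training = 0
    # Rule activations: min for AND, max to aggregate the error-prone rules
    low_err = min(1 - stress_high, 1 - training_low)
    high_err = max(stress_high, training_low)
    # Defuzzify as a weighted average of representative HEPs per output set
    hep_low, hep_high = 1e-4, 1e-1
    total = low_err + high_err
    return (low_err * hep_low + high_err * hep_high) / total if total else hep_low
```

    A well-trained, low-stress operator maps to the nominal HEP, while degraded PSFs push the estimate toward the high-error anchor, mirroring how PSF influence levels scale the probability in the abstract's methodology.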

  20. Regression analysis of the structure function for reliability evaluation of continuous-state system

    Gamiz, M.L., E-mail: mgamiz@ugr.e [Departamento de Estadistica e I.O., Facultad de Ciencias, Universidad de Granada, Granada 18071 (Spain); Martinez Miranda, M.D. [Departamento de Estadistica e I.O., Facultad de Ciencias, Universidad de Granada, Granada 18071 (Spain)


    Technical systems are designed to perform an intended task with an admissible range of efficiency. According to this idea, it is permissible that the system runs among different levels of performance, in addition to complete failure and perfect functioning. As a consequence, reliability theory has evolved from binary-state systems to the most general case of the continuous-state system, in which the state of the system changes over time through some interval on the real number line. In this context, obtaining an expression for the structure function becomes difficult compared to the discrete case, with the difficulty increasing as the number of components of the system increases. In this work, we propose a method to build the structure function of a continuous-state system by using multivariate nonparametric regression techniques, in which certain analytical restrictions on the variable of interest must be taken into account. Once the structure function is obtained, some reliability indices of the system are estimated. We illustrate our method via several numerical examples.
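    A minimal sketch of the idea, assuming a synthetic two-component continuous-state system with a known series-type structure function and a Nadaraya-Watson kernel estimator (a standard multivariate nonparametric regression technique; the paper's actual estimator and analytical restrictions may differ):

```python
import math
import random

# Sketch: estimate a continuous-state structure function phi(x1, x2) from
# noisy observations of (component states, system state) via Gaussian-kernel
# Nadaraya-Watson regression. The true phi and the data are synthetic.

def true_phi(x1, x2):
    return min(x1, x2)  # series-like continuum system, states in [0, 1]

def make_data(n=400, seed=0):
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        x1, x2 = rng.random(), rng.random()
        y = true_phi(x1, x2) + rng.gauss(0, 0.02)        # noisy system state
        data.append((x1, x2, min(max(y, 0.0), 1.0)))     # restrict to [0, 1]
    return data

def phi_hat(x1, x2, data, h=0.1):
    """Nadaraya-Watson estimate of the structure function at (x1, x2)."""
    num = den = 0.0
    for u1, u2, y in data:
        w = math.exp(-((x1 - u1) ** 2 + (x2 - u2) ** 2) / (2 * h * h))
        num += w * y
        den += w
    return num / den
```

    With the fitted surface in hand, reliability indices such as the probability that the system state exceeds a demand level can be estimated by plugging component-state distributions into `phi_hat`.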