WorldWideScience

Sample records for highly reliable method

  1. Reliability-based design optimization via high order response surface method

    International Nuclear Information System (INIS)

    Li, Hong Shuang

    2013-01-01

    To reduce the computational effort of reliability-based design optimization (RBDO), the response surface method (RSM) has been widely used to evaluate reliability constraints. We propose an efficient methodology for solving RBDO problems based on an improved high order response surface method (HORSM) that combines an efficient sampling method, Hermite polynomials and the uncertainty contribution concept to construct a high order response surface function with cross terms for reliability analysis. The sampling method generates supporting points from Gauss-Hermite quadrature points, which can be used to approximate the response surface function without cross terms, to identify the highest order of each random variable and to determine the significant variables in conjunction with the point estimate method. Cross terms between pairs of significant random variables are then added to the response surface function to improve the approximation accuracy. Integrated with a nested strategy, the improved HORSM is applied to solving RBDO problems. Additionally, a sampling-based reliability sensitivity analysis method is employed to further reduce the computational effort when the design variables are distributional parameters of the input random variables. The proposed methodology is applied to two test problems to validate its accuracy and efficiency. It is more efficient than first order reliability method based RBDO and Monte Carlo simulation based RBDO, and enables the use of RBDO as a practical design tool.
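
The Gauss-Hermite supporting points the abstract refers to can be generated directly with NumPy. A minimal sketch for a single standard normal variable (illustrative only, not the paper's full HORSM construction):

```python
import numpy as np

def gauss_hermite_normal(n):
    """Nodes/weights so that E[g(X)] ~ sum(w * g(x)) for X ~ N(0, 1).

    Physicists' Gauss-Hermite nodes t map to normal samples via x = sqrt(2)*t,
    and the weights are normalized by sqrt(pi).
    """
    t, w = np.polynomial.hermite.hermgauss(n)
    return np.sqrt(2.0) * t, w / np.sqrt(np.pi)

x, w = gauss_hermite_normal(5)
# The rule is exact for polynomials up to degree 2n-1, so the weights
# sum to 1 and the second moment of N(0, 1) is reproduced exactly.
```

Such nodes serve as supporting points at which the performance function is evaluated when fitting a response surface in each random variable.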

  2. An accurate and efficient reliability-based design optimization using the second order reliability method and improved stability transformation method

    Science.gov (United States)

    Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo

    2018-05-01

    The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it is inaccurate in calculating the failure probability for highly nonlinear performance functions. The second order reliability method is therefore required to evaluate the reliability accurately. However, its application to RBDO is quite challenging owing to the expensive computational cost incurred by the repeated reliability evaluations and Hessian calculations of the probabilistic constraints. In this article, a new improved stability transformation method is proposed to search for the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared to existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for the RBDO of engineering structures.
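
The symmetric rank-one (SR1) Hessian update mentioned above has a compact closed form. A generic sketch, with the usual denominator safeguard (not the authors' exact implementation):

```python
import numpy as np

def sr1_update(B, s, y, tol=1e-8):
    """SR1 update of a Hessian approximation B.

    s = x_new - x_old (step), y = grad_new - grad_old (gradient change).
    The update is skipped when the denominator is numerically unsafe.
    """
    r = y - B @ s
    denom = r @ s
    if abs(denom) <= tol * np.linalg.norm(r) * np.linalg.norm(s):
        return B
    return B + np.outer(r, r) / denom
```

For a quadratic function with Hessian H, a single update enforces the secant condition B_new @ s == y, which is why SR1 can approximate the true Hessian without any explicit second-derivative evaluations.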

  3. Structural Reliability Methods

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Madsen, H. O.

    The structural reliability methods quantitatively treat the uncertainty of predicting the behaviour and properties of a structure given the uncertain properties of its geometry, materials, and the actions it is supposed to withstand. This book addresses the probabilistic methods for evaluation of structural reliability, including the theoretical basis for these methods. Partial safety factor codes under current practice are briefly introduced and discussed. A probabilistic code format for obtaining a formal reliability evaluation system that captures the most essential features of the nature of the uncertainties and their interplay is then developed, step by step. The concepts presented are illustrated by numerous examples throughout the text.

  4. System principles, mathematical models and methods to ensure high reliability of safety systems

    Science.gov (United States)

    Zaslavskyi, V.

    2017-04-01

    Modern safety and security systems are composed of a large number of components designed for the detection, localization, tracking, collection, and processing of information from monitoring, telemetry, and control systems. They are required to be highly reliable so that data aggregation, processing and analysis are performed correctly in support of subsequent decision making. During the design and construction phases of such systems, various types of components (elements, devices, and subsystems) are considered and used to ensure reliable signal detection, noise isolation, and the reduction of erroneous commands. When generating design solutions for highly reliable systems, a number of restrictions and conditions, such as the available component types and various resource constraints, should be considered. Components of different types perform identical functions but are implemented using diverse principles and approaches and have distinct technical and economic indicators, such as cost or power consumption. The systematic use of different component types increases the probability of task completion and eliminates common cause failures. We consider the type-variety principle as an engineering principle of system analysis, mathematical models based on this principle, and algorithms for solving optimization problems in the design of highly reliable safety and security systems. The mathematical models are formalized as a class of two-level discrete optimization problems of large dimension. The proposed approach, mathematical models and algorithms can be used to solve optimal redundancy problems based on a variety of methods and control devices for fault and defect detection in technical systems, telecommunication networks, and energy systems.
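
The reliability gain from redundancy with diverse component types can be illustrated with the standard independent-parallel formula (a generic sketch; the paper's two-level discrete optimization models are considerably more involved):

```python
import numpy as np

def parallel_reliability(rels):
    """Reliability of a parallel (redundant) group of independent components:
    the group fails only if every component fails."""
    r = np.asarray(rels, dtype=float)
    return 1.0 - np.prod(1.0 - r)

# Two components of reliability 0.9 each already reach 0.99 as a group.
# Independence is the optimistic case: identical components share common-cause
# failures, which is exactly why diverse component types are preferred.
group = parallel_reliability([0.9, 0.9])
```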

  5. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompasses Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
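
A minimal example of the Bayesian machinery such a course covers is the conjugate Beta-Binomial update for a pass/fail reliability test (illustrative prior and counts, not data from the proceedings):

```python
# Beta(alpha, beta) prior on the per-demand success probability,
# updated with observed successes/failures from a test campaign.
alpha, beta = 1.0, 1.0            # uniform prior
successes, failures = 48, 2       # hypothetical test results

alpha_post = alpha + successes
beta_post = beta + failures
posterior_mean = alpha_post / (alpha_post + beta_post)
# Posterior is Beta(49, 3); the mean 49/52 ~ 0.942 is the Bayes estimate
# of reliability, naturally blending the prior with the observed data.
```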

  6. Survey of industry methods for producing highly reliable software

    International Nuclear Information System (INIS)

    Lawrence, J.D.; Persons, W.L.

    1994-11-01

    The Nuclear Reactor Regulation Office of the US Nuclear Regulatory Commission is charged with assessing the safety of new instrument and control designs for nuclear power plants which may use computer-based reactor protection systems. Lawrence Livermore National Laboratory has evaluated the latest techniques in software reliability for measurement, estimation, error detection, and prediction that can be used during the software life cycle as a means of risk assessment for reactor protection systems. One aspect of this task has been a survey of the software industry to collect information to help identify the design factors used to improve the reliability and safety of software. The intent was to discover what practices really work in industry and what design factors are used by industry to achieve highly reliable software. The results of the survey are documented in this report. Three companies participated in the survey: Computer Sciences Corporation, International Business Machines (Federal Systems Company), and TRW. Discussions were also held with NASA Software Engineering Lab/University of Maryland/CSC, and the AIAA Software Reliability Project.

  7. Advanced RESTART method for the estimation of the probability of failure of highly reliable hybrid dynamic systems

    International Nuclear Information System (INIS)

    Turati, Pietro; Pedroni, Nicola; Zio, Enrico

    2016-01-01

    The efficient estimation of system reliability characteristics is of paramount importance for many engineering applications. Real world system reliability modeling calls for the capability of treating systems that are: i) dynamic, ii) complex, iii) hybrid and iv) highly reliable. Advanced Monte Carlo (MC) methods offer a way to solve these types of problems, although they can incur potentially high computational costs. In this paper, the REpetitive Simulation Trials After Reaching Thresholds (RESTART) method is employed, extending it to hybrid systems for the first time (to the authors' knowledge). The estimation accuracy and precision of RESTART depend heavily on the choice of the Importance Function (IF) indicating how close the system is to failure: in this respect, proper IFs are here originally proposed to improve the performance of RESTART for the analysis of hybrid systems. The resulting overall simulation approach is applied to estimate the probability of failure of the control system of a liquid hold-up tank and of a pump-valve subsystem subject to degradation induced by fatigue. The results are compared to those obtained by standard MC simulation and by RESTART with classical IFs available in the literature. The comparison shows the improvement in performance obtained by our approach.
    Highlights:
    • We consider the issue of estimating small failure probabilities in dynamic systems.
    • We employ the RESTART method to estimate the failure probabilities.
    • New Importance Functions (IFs) are introduced to increase the method performance.
    • We adopt two dynamic, hybrid, highly reliable systems as case studies.
    • A comparison with literature IFs proves the effectiveness of the new IFs.
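
The level-splitting idea behind RESTART — decompose a rare event through intermediate thresholds of the importance function — can be shown in a deliberately static toy setting. RESTART proper restarts dynamic trajectories at each threshold; this sketch only illustrates the underlying identity P(F) = P(L) · P(F | L) for a standard normal variable:

```python
import numpy as np

rng = np.random.default_rng(42)

a, b, n = 1.5, 3.0, 400_000       # intermediate level a, rare "failure" level b

# Stage 1: crude Monte Carlo estimate of p1 = P(X > a).
x = rng.standard_normal(n)
p1 = np.mean(x > a)

# Stage 2: "restart" from the samples that crossed level a and estimate
# p2 = P(X > b | X > a) from that conditional population only.
tail = x[x > a]
p2 = np.mean(tail > b)

p_fail = p1 * p2                  # estimates P(X > 3), true value ~1.35e-3
```

Each stage only needs to resolve a moderately rare event, which is where the variance reduction over crude Monte Carlo comes from.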

  8. Validation of Land Cover Products Using Reliability Evaluation Methods

    Directory of Open Access Journals (Sweden)

    Wenzhong Shi

    2015-06-01

    Validation of land cover products is a fundamental task prior to data applications. Current validation schemes and methods are, however, suited only for assessing classification accuracy and disregard the reliability of land cover products. The reliability evaluation of land cover products should be undertaken to provide reliable land cover information. In addition, the lack of high-quality reference data often constrains validation and affects the reliability results of land cover products. This study proposes a validation scheme to evaluate the reliability of land cover products, comprising two methods: result reliability evaluation and process reliability evaluation. Result reliability evaluation computes the reliability of land cover products using seven reliability indicators. Process reliability evaluation analyzes the reliability propagation in the data production process to obtain the reliability of land cover products. Fuzzy fault tree analysis is introduced and improved in the reliability analysis of the data production process. Research results show that the proposed reliability evaluation scheme is reasonable and can be applied to validate land cover products. Through the analysis of the seven indicators of result reliability evaluation, more information on land cover can be obtained for strategic decision-making and planning than with traditional accuracy assessment methods. Process reliability evaluation, which requires no reference data, can facilitate validation and reflect trends in reliability to some extent.

  9. High-Reliability Health Care: Getting There from Here

    Science.gov (United States)

    Chassin, Mark R; Loeb, Jerod M

    2013-01-01

    Context Despite serious and widespread efforts to improve the quality of health care, many patients still suffer preventable harm every day. Hospitals find improvement difficult to sustain, and they suffer “project fatigue” because so many problems need attention. No hospitals or health systems have achieved consistent excellence throughout their institutions. High-reliability science is the study of organizations in industries like commercial aviation and nuclear power that operate under hazardous conditions while maintaining safety levels that are far better than those of health care. Adapting and applying the lessons of this science to health care offer the promise of enabling hospitals to reach levels of quality and safety that are comparable to those of the best high-reliability organizations. Methods We combined the Joint Commission's knowledge of health care organizations with knowledge from the published literature and from experts in high-reliability industries and leading safety scholars outside health care. We developed a conceptual and practical framework for assessing hospitals’ readiness for and progress toward high reliability. By iterative testing with hospital leaders, we refined the framework and, for each of its fourteen components, defined stages of maturity through which we believe hospitals must pass to reach high reliability. Findings We discovered that the ways that high-reliability organizations generate and maintain high levels of safety cannot be directly applied to today's hospitals. We defined a series of incremental changes that hospitals should undertake to progress toward high reliability. These changes involve the leadership's commitment to achieving zero patient harm, a fully functional culture of safety throughout the organization, and the widespread deployment of highly effective process improvement tools. Conclusions Hospitals can make substantial progress toward high reliability by undertaking several specific

  10. Sequential optimization and reliability assessment method for metal forming processes

    International Nuclear Information System (INIS)

    Sahai, Atul; Schramm, Uwe; Buranathiti, Thaweepat; Chen Wei; Cao Jian; Xia, Cedric Z.

    2004-01-01

    Uncertainty is inevitable in any design process. The uncertainty could be due to the variations in geometry of the part, material properties or due to the lack of knowledge about the phenomena being modeled itself. Deterministic design optimization does not take uncertainty into account and worst case scenario assumptions lead to vastly over conservative design. Probabilistic design, such as reliability-based design and robust design, offers tools for making robust and reliable decisions under the presence of uncertainty in the design process. Probabilistic design optimization often involves double-loop procedure for optimization and iterative probabilistic assessment. This results in high computational demand. The high computational demand can be reduced by replacing computationally intensive simulation models with less costly surrogate models and by employing Sequential Optimization and reliability assessment (SORA) method. The SORA method uses a single-loop strategy with a series of cycles of deterministic optimization and reliability assessment. The deterministic optimization and reliability assessment is decoupled in each cycle. This leads to quick improvement of design from one cycle to other and increase in computational efficiency. This paper demonstrates the effectiveness of Sequential Optimization and Reliability Assessment (SORA) method when applied to designing a sheet metal flanging process. Surrogate models are used as less costly approximations to the computationally expensive Finite Element simulations
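
The SORA decoupling can be seen in a deliberately tiny one-dimensional problem (entirely illustrative, not the paper's flanging application): minimize d subject to P(X > d) ≤ Φ(−βt) with X ~ N(0, 1). Each cycle performs a reliability assessment to locate the inverse most probable point (MPP), then runs a deterministic optimization with the constraint shifted by the distance from the mean to that MPP:

```python
BETA_T = 3.0   # target reliability index

def inverse_mpp(d):
    # Reliability assessment step. For the linear limit state g(d, x) = d - x
    # with X ~ N(0, 1), the inverse MPP at the target reliability level sits
    # beta_t standard deviations into the tail, independent of d.
    return BETA_T

def sora(cycles=3):
    d = 0.0
    for _ in range(cycles):
        x_star = inverse_mpp(d)    # reliability assessment (decoupled)
        shift = x_star - 0.0       # distance from the mean of X to the MPP
        # Deterministic optimization: min d  s.t.  d - shift >= 0  =>  d = shift
        d = shift
    return d

# Converges to d = beta_t, i.e. P(X > d) = Phi(-3), without ever nesting a
# reliability loop inside the optimization loop.
```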

  11. An Evaluation Method of Equipment Reliability Configuration Management

    Science.gov (United States)

    Wang, Wei; Feng, Weijia; Zhang, Wei; Li, Yuan

    2018-01-01

    At present, many equipment development companies are aware of the great significance of reliability in equipment development. However, due to the lack of an effective management evaluation method, it is very difficult for an equipment development company to manage its own reliability work. An evaluation method for equipment reliability configuration management determines the reliability management capability of an equipment development company. Reliability is not only designed in, but also achieved through management. This paper evaluates reliability management capability using a reliability configuration capability maturity model (RCM-CMM) evaluation method.

  12. Method matters: Understanding diagnostic reliability in DSM-IV and DSM-5.

    Science.gov (United States)

    Chmielewski, Michael; Clark, Lee Anna; Bagby, R Michael; Watson, David

    2015-08-01

    Diagnostic reliability is essential for the science and practice of psychology, in part because reliability is necessary for validity. Recently, the DSM-5 field trials documented lower diagnostic reliability than past field trials and the general research literature, resulting in substantial criticism of the DSM-5 diagnostic criteria. Rather than indicating specific problems with DSM-5, however, the field trials may have revealed long-standing diagnostic issues that have been hidden due to a reliance on audio/video recordings for estimating reliability. We estimated the reliability of DSM-IV diagnoses using both the standard audio-recording method and the test-retest method used in the DSM-5 field trials, in which different clinicians conduct separate interviews. Psychiatric patients (N = 339) were diagnosed using the SCID-I/P; 218 were diagnosed a second time by an independent interviewer. Diagnostic reliability using the audio-recording method (N = 49) was "good" to "excellent" (M κ = .80) and comparable to the DSM-IV field trials estimates. Reliability using the test-retest method (N = 218) was "poor" to "fair" (M κ = .47) and similar to DSM-5 field trials' estimates. Despite low test-retest diagnostic reliability, self-reported symptoms were highly stable. Moreover, there was no association between change in self-report and change in diagnostic status. These results demonstrate the influence of method on estimates of diagnostic reliability. (c) 2015 APA, all rights reserved.
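
The kappa statistics quoted above can be reproduced from two raters' diagnoses in a few lines. A generic sketch of Cohen's kappa (the chance-corrected agreement measure, not the SCID scoring pipeline itself):

```python
import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa: agreement between two raters corrected for chance."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    p_obs = np.mean(r1 == r2)                 # observed agreement
    cats = np.union1d(r1, r2)
    # Chance agreement: product of each rater's marginal rates, summed over categories.
    p_exp = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in cats)
    return (p_obs - p_exp) / (1.0 - p_exp)

# Example: 4 patients, binary diagnosis. Observed agreement is 0.75 and
# chance agreement is 0.5, giving kappa = 0.5.
kappa = cohens_kappa([0, 0, 1, 1], [0, 1, 1, 1])
```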

  13. Reliability and validity of non-radiographic methods of thoracic kyphosis measurement: a systematic review.

    Science.gov (United States)

    Barrett, Eva; McCreesh, Karen; Lewis, Jeremy

    2014-02-01

    A wide array of instruments is available for non-invasive thoracic kyphosis measurement. Guidelines for selecting outcome measures for use in clinical and research practice recommend that properties such as validity and reliability are considered. This systematic review reports on the reliability and validity of non-invasive methods for measuring thoracic kyphosis. A systematic search of 11 electronic databases located studies assessing reliability and/or validity of non-invasive thoracic kyphosis measurement techniques. Two independent reviewers used a critical appraisal tool to assess the quality of retrieved studies. Data were extracted by the primary reviewer. The results were synthesized qualitatively using a level of evidence approach. 27 studies satisfied the eligibility criteria and were included in the review. Reliability, validity, and both reliability and validity were investigated by sixteen, two and nine studies respectively. 17/27 studies were deemed to be of high quality. In total, 15 methods of thoracic kyphosis measurement were evaluated in the retrieved studies. All investigated methods showed high (ICC ≥ .7) to very high (ICC ≥ .9) levels of reliability. The validity of the methods ranged from low to very high. The strongest levels of evidence for reliability exist in support of the Debrunner kyphometer, Spinal Mouse and Flexicurve index, and for validity in support of the arcometer and Flexicurve index. Further reliability and validity studies are required to strengthen the level of evidence for the remaining methods of measurement. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Human reliability analysis using the Cognitive Reliability and Error Analysis Method (CREAM)

    Directory of Open Access Journals (Sweden)

    Zahirah Alifia Maulida

    2015-01-01

    Workplace accidents in the grinding and welding areas have ranked highest over the last five years at PT. X. These accidents are caused by human error, which occurs under the influence of the physical and non-physical working environment. This study uses scenarios to predict and reduce the likelihood of human error with the CREAM (Cognitive Reliability and Error Analysis Method) approach. CREAM is a human reliability analysis method for obtaining the Cognitive Failure Probability (CFP), which can be computed in two ways: the basic method and the extended method. The basic method yields only an overall failure probability, whereas the extended method yields a CFP for each task. The results show that the factors influencing the occurrence of errors in grinding and welding work are the adequacy of the organization, the adequacy of the Man-Machine Interface (MMI) and operational support, the availability of procedures/plans, and the adequacy of training and experience. The cognitive aspect with the highest error value in grinding work is planning (CFP 0.3), and in welding work it is execution (CFP 0.18). To reduce cognitive error in grinding and welding work, the recommendations are to provide regular training, more detailed work instructions, and familiarization with the tools. Keywords: CREAM (cognitive reliability and error analysis method), HRA (human reliability analysis), cognitive error.

  15. A Method of Nuclear Software Reliability Estimation

    International Nuclear Information System (INIS)

    Park, Gee Yong; Eom, Heung Seop; Cheon, Se Woo; Jang, Seung Cheol

    2011-01-01

    A method of estimating software reliability for nuclear safety software is proposed. This method is based on the software reliability growth model (SRGM), in which the behavior of software failures is assumed to follow a non-homogeneous Poisson process. Several modeling schemes are presented in order to estimate and predict more precisely the number of software defects based on a small amount of software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating the software test cases into the model. It is shown that this method is capable of accurately estimating the remaining number of software defects of the on-demand type that directly affect safety trip functions. The software reliability can be estimated from a model equation, and one method of obtaining it is proposed.
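
As a concrete instance of the non-homogeneous Poisson process assumption, the Goel-Okumoto model (a standard SRGM used here for illustration; the paper's exact modeling schemes may differ) has mean value function m(t) = a(1 − e^(−bt)), so the expected number of defects still latent after testing time t is a·e^(−bt):

```python
import numpy as np

def mean_failures(t, a, b):
    """Goel-Okumoto mean value function: expected cumulative failures by time t.
    a = expected total number of defects, b = per-defect detection rate."""
    return a * (1.0 - np.exp(-b * t))

def remaining_defects(t, a, b):
    """Expected defects still latent after testing time t."""
    return a * np.exp(-b * t)

# Hypothetical parameters: a = 25 expected defects, b = 0.05 per hour.
# After 100 hours the expected residual is 25 * exp(-5), about 0.17 defects.
a, b = 25.0, 0.05
residual = remaining_defects(100.0, a, b)
```

In practice a and b are fitted to observed failure times (here via Bayesian inference), and the residual-defect estimate drives the release decision.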

  16. Note: An online testing method for lifetime projection of high power light-emitting diode under accelerated reliability test.

    Science.gov (United States)

    Chen, Qi; Chen, Quan; Luo, Xiaobing

    2014-09-01

    In recent years, due to the fast development of high power light-emitting diodes (LEDs), lifetime prediction and assessment have become a crucial issue. Although in situ measurement has been widely used for reliability testing in the laser diode community, it has not been applied commonly to LEDs. In this paper, an online testing method for LED lifetime projection under accelerated reliability testing was proposed and a prototype was built. Optical parametric data were collected. The systematic error and the measuring uncertainty were calculated to be within 0.2% and within 2%, respectively. With this online testing method, experimental data can be acquired continuously and a sufficient amount of data can be gathered. Thus, the projection fitting accuracy can be improved (r² = 0.954) and the testing duration can be shortened.

  17. Human reliability in high dose rate afterloading radiotherapy based on FMECA

    International Nuclear Information System (INIS)

    Deng Jun; Fan Yaohua; Yue Baorong; Wei Kedao; Ren Fuli

    2012-01-01

    Objective: To study human reliability in the delivery of high dose rate (HDR) afterloading radiotherapy and to put forward reasonable and feasible recommendations for the procedures with relatively high risk, so as to enhance the safety of its clinical application. Methods: Basic data were collected by on-site investigation and process analysis as well as expert evaluation. Failure mode, effect and criticality analysis (FMECA) was employed to study human reliability in the execution of HDR afterloading radiotherapy. Results: The FMECA model of human reliability for HDR afterloading radiotherapy was established, through which 25 procedures with relatively high risk indices were found, accounting for 14.1% of the total of 177 procedures. Conclusions: The FMECA method is feasible for studying human reliability in HDR afterloading radiotherapy. Countermeasures are put forward to reduce human error, providing an important basis for enhancing the safety of the clinical application of HDR afterloading radiotherapy. (authors)
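
A criticality index in the FMECA spirit is conventionally the product of severity, occurrence, and detectability scores. A generic sketch for screening high-risk procedure steps (the step names, scores, and threshold below are illustrative, not the paper's data):

```python
def rpn(severity, occurrence, detectability):
    """Risk Priority Number: each factor scored from 1 (best) to 10 (worst)."""
    return severity * occurrence * detectability

# Hypothetical procedure steps with (severity, occurrence, detectability) scores.
steps = {
    "verify source position": (9, 3, 4),
    "enter prescribed dose":  (8, 2, 2),
    "confirm patient ID":     (7, 1, 2),
}

# Steps whose index exceeds a chosen threshold are flagged for countermeasures.
high_risk = sorted(name for name, sod in steps.items() if rpn(*sod) >= 60)
```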

  18. Highly reliable electro-hydraulic control system

    International Nuclear Information System (INIS)

    Mande, Morima; Hiyama, Hiroshi; Takahashi, Makoto

    1984-01-01

    The unscheduled shutdown of nuclear power stations disturbs the power system and, by lowering the capacity factor, has a large influence on power generation cost; therefore, high reliability is required of the control systems of nuclear power stations. Toshiba Corp. has worked to improve the reliability of power station control systems, and this report describes the electro-hydraulic control system for the turbines of nuclear power stations. The main functions of the electro-hydraulic control system are the control of main steam pressure with steam regulation valves and turbine bypass valves, the control of turbine speed and load, the prevention of turbine overspeed, the protection of turbines and so on. The system is composed of pressure sensors and a speed sensor; the control board containing the electronic circuits for control computation and the protective sequence; the oil cylinders, servo valves and opening detectors of the control valves; a high pressure oil hydraulic machine and piping; the operating panel and so on. The main features are the adoption of a triplicated intermediate-value selection method, the multiplication of protection sensors with the adoption of 2-out-of-3 trip logic, redundant power sources, and improvements in the reliability of the electronic circuit hardware and the oil hydraulic system. (Kako, I.)
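
The triplicated intermediate-value selection and 2-out-of-3 trip logic described above are simple to state in code. A generic sketch of these classic fault-tolerance schemes (not Toshiba's implementation):

```python
def median_select(a, b, c):
    """Intermediate-value selection: use the median of three redundant sensors,
    so a single failed (outlying) channel cannot corrupt the control input."""
    return sorted((a, b, c))[1]

def two_out_of_three(v1, v2, v3):
    """Trip logic: trip only when at least two of three channels demand it,
    tolerating one failed channel whether it is stuck tripped or stuck normal."""
    return (v1 + v2 + v3) >= 2

# A single stuck-high pressure channel is outvoted by the other two:
value = median_select(7.02, 7.01, 99.9)        # median is 7.02
trip = two_out_of_three(True, False, False)    # a lone spurious vote does not trip
```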

  19. Evaluation of structural reliability using simulation methods

    Directory of Open Access Journals (Sweden)

    Baballëku Markel

    2015-01-01

    Eurocode describes the 'index of reliability' as a measure of structural reliability, related to the 'probability of failure'. This paper is focused on the assessment of this index for a reinforced concrete bridge pier. It is rare to explicitly use reliability concepts for design of structures, but the problems of structural engineering are better known through them. Some of the main methods for the estimation of the probability of failure are the exact analytical integration, numerical integration, approximate analytical methods and simulation methods. Monte Carlo Simulation is used in this paper, because it offers a very good tool for the estimation of probability in multivariate functions. Complicated probability and statistics problems are solved through computer aided simulations of a large number of tests. The procedures of structural reliability assessment for the bridge pier and the comparison with the partial factor method of the Eurocodes have been demonstrated in this paper.
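
A minimal Monte Carlo estimate of the probability of failure for a linear limit state g = R − S (the distributions below are illustrative, not the bridge-pier model of the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

n = 1_000_000
R = rng.normal(200.0, 20.0, n)   # resistance (assumed Gaussian)
S = rng.normal(150.0, 30.0, n)   # load effect (assumed Gaussian)
pf = np.mean(R - S < 0.0)        # estimated probability of failure

# Analytical check, available only because this case is linear Gaussian:
# beta = (200 - 150) / sqrt(20**2 + 30**2) ~ 1.387, so pf = Phi(-beta) ~ 0.083.
# The reliability index beta is the quantity Eurocode calls the 'index of
# reliability'; simulation generalizes this to arbitrary limit states.
```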

  20. A Novel Reliability Enhanced Handoff Method in Future Wireless Heterogeneous Networks

    Directory of Open Access Journals (Sweden)

    Wang YuPeng

    2016-01-01

    As demand increases, future networks will follow the trends of network variety and service flexibility, which require heterogeneous network deployment and reliable communication methods. In practice, most communication failures happen due to bad radio link quality; in particular, high-speed users suffer from radio link failures, which cause communication interruption and force radio link recovery. To make communication more reliable, especially for high-mobility users, we propose a novel handoff mechanism to reduce the occurrence of service interruptions. Computer simulations show that service reliability is greatly improved.

  1. Application of reliability methods in Ontario Hydro

    International Nuclear Information System (INIS)

    Jeppesen, R.; Ravishankar, T.J.

    1985-01-01

    Ontario Hydro has established a reliability program in support of its substantial nuclear program. The application of the reliability program to achieve both production and safety goals is described. The value of such a reliability program is evident in the record of Ontario Hydro's operating nuclear stations. The factors which have contributed to the success of the reliability program are identified as line management's commitment to reliability; selective and judicious application of reliability methods; the establishment of performance goals and monitoring of in-service performance; and the collection, distribution, review and utilization of performance information to facilitate cost-effective achievement of goals and improvements. (orig.)

  2. High-reliability computing for the smarter planet

    International Nuclear Information System (INIS)

    Quinn, Heather M.; Graham, Paul; Manuzzato, Andrea; Dehon, Andre

    2010-01-01

    The geometric rate of improvement of transistor size and integrated circuit performance, known as Moore's Law, has been an engine of growth for our economy, enabling new products and services, creating new value and wealth, increasing safety, and removing menial tasks from our daily lives. Affordable, highly integrated components have enabled both life-saving technologies and rich entertainment applications. Anti-lock brakes, insulin monitors, and GPS-enabled emergency response systems save lives. Cell phones, internet appliances, virtual worlds, realistic video games, and mp3 players enrich our lives and connect us together. Over the past 40 years of silicon scaling, the increasing capabilities of inexpensive computation have transformed our society through automation and ubiquitous communications. In this paper, we will present the concept of the smarter planet, how reliability failures affect current systems, and methods that can be used to increase the reliable adoption of new automation in the future. We will illustrate these issues using a number of different electronic devices in a couple of different scenarios. Recently IBM has been presenting the idea of a 'smarter planet.' In smarter planet documents, IBM discusses increased computer automation of roadways, banking, healthcare, and infrastructure, as automation could create more efficient systems. A necessary component of the smarter planet concept is to ensure that these new systems have very high reliability. Even extremely rare reliability problems can easily escalate to problematic scenarios when implemented at very large scales. For life-critical systems, such as automobiles, infrastructure, medical implantables, and avionic systems, unmitigated failures could be dangerous. As more automation moves into these types of critical systems, reliability failures will need to be managed. As computer automation continues to increase in our society, the need for greater radiation reliability grows.

  3. High-reliability computing for the smarter planet

    Energy Technology Data Exchange (ETDEWEB)

    Quinn, Heather M [Los Alamos National Laboratory; Graham, Paul [Los Alamos National Laboratory; Manuzzato, Andrea [UNIV OF PADOVA; Dehon, Andre [UNIV OF PENN; Carter, Nicholas [INTEL CORPORATION

    2010-01-01

    The geometric rate of improvement of transistor size and integrated circuit performance, known as Moore's Law, has been an engine of growth for our economy, enabling new products and services, creating new value and wealth, increasing safety, and removing menial tasks from our daily lives. Affordable, highly integrated components have enabled both life-saving technologies and rich entertainment applications. Anti-lock brakes, insulin monitors, and GPS-enabled emergency response systems save lives. Cell phones, internet appliances, virtual worlds, realistic video games, and mp3 players enrich our lives and connect us together. Over the past 40 years of silicon scaling, the increasing capabilities of inexpensive computation have transformed our society through automation and ubiquitous communications. In this paper, we will present the concept of the smarter planet, how reliability failures affect current systems, and methods that can be used to increase the reliable adoption of new automation in the future. We will illustrate these issues using a number of different electronic devices in a couple of different scenarios. Recently IBM has been presenting the idea of a 'smarter planet.' In smarter planet documents, IBM discusses increased computer automation of roadways, banking, healthcare, and infrastructure, as automation could create more efficient systems. A necessary component of the smarter planet concept is to ensure that these new systems have very high reliability. Even extremely rare reliability problems can easily escalate to problematic scenarios when implemented at very large scales. For life-critical systems, such as automobiles, infrastructure, medical implantables, and avionic systems, unmitigated failures could be dangerous. As more automation moves into these types of critical systems, reliability failures will need to be managed. As computer automation continues to increase in our society, the need for greater radiation reliability is necessary.

  4. Validity and Interrater Reliability of the Visual Quarter-Waste Method for Assessing Food Waste in Middle School and High School Cafeteria Settings.

    Science.gov (United States)

    Getts, Katherine M; Quinn, Emilee L; Johnson, Donna B; Otten, Jennifer J

    2017-11-01

    Measuring food waste (ie, plate waste) in school cafeterias is an important tool to evaluate the effectiveness of school nutrition policies and interventions aimed at increasing consumption of healthier meals. Visual assessment methods are frequently applied in plate waste studies because they are more convenient than weighing. The visual quarter-waste method has become a common tool in studies of school meal waste and consumption, but previous studies of its validity and reliability have used correlation coefficients, which measure association but not necessarily agreement. The aims of this study were to determine, using a statistic measuring interrater agreement, whether the visual quarter-waste method is valid and reliable for assessing food waste in a school cafeteria setting when compared with the gold standard of weighed plate waste. To evaluate validity, researchers used the visual quarter-waste method and weighed food waste from 748 trays at four middle schools and five high schools in one school district in Washington State during May 2014. To assess interrater reliability, researcher pairs independently assessed 59 of the same trays using the visual quarter-waste method. Both validity and reliability were assessed using a weighted κ coefficient. For validity, as compared with the measured weight, 45% of foods assessed using the visual quarter-waste method were in almost perfect agreement, 42% of foods were in substantial agreement, 10% were in moderate agreement, and 3% were in slight agreement. For interrater reliability between pairs of visual assessors, 46% of foods were in perfect agreement, 31% were in almost perfect agreement, 15% were in substantial agreement, and 8% were in moderate agreement. These results suggest that the visual quarter-waste method is a valid and reliable tool for measuring plate waste in school cafeteria settings. Copyright © 2017 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  5. Bearing Procurement Analysis Method by Total Cost of Ownership Analysis and Reliability Prediction

    Science.gov (United States)

    Trusaji, Wildan; Akbar, Muhammad; Sukoyo; Irianto, Dradjad

    2018-03-01

    In making a bearing procurement analysis, price and reliability must both be considered as decision criteria: price determines the direct (acquisition) cost, while bearing reliability determines indirect costs such as maintenance cost. Although the indirect cost is hard to identify and measure, it contributes substantially to the overall cost that will be incurred, so the indirect cost of reliability must be considered when making a bearing procurement analysis. This paper explains a bearing evaluation method that uses total cost of ownership analysis to treat price and maintenance cost as decision criteria. Furthermore, since failure data are scarce when the bearing evaluation phase is conducted, a reliability prediction method is used to predict bearing reliability from its dynamic load rating parameter. With this method, a bearing with a higher price but higher reliability is preferable for long-term planning, whereas for short-term planning the cheaper bearing with lower reliability is preferable. This context dependence can give rise to conflict between stakeholders. Thus, the planning horizon needs to be agreed by all stakeholders before making a procurement decision.
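
    As an illustration of the prediction step described above, the sketch below uses the standard basic rating life formula L10 = (C/P)^3 (in millions of revolutions, for ball bearings) to turn a dynamic load rating into an expected life, and folds the resulting replacement frequency into a simple total-cost-of-ownership comparison. All prices, loads and speeds are hypothetical, and the textbook L10 formula stands in for the specific prediction model of the record:

```python
def l10_hours(C, P, rpm, p=3.0):
    """Basic rating life: L10 = (C/P)**p million revolutions (p = 3 for
    ball bearings), converted to operating hours at the given speed."""
    return (C / P) ** p * 1e6 / (rpm * 60.0)

def total_cost_of_ownership(price, C, P, rpm, horizon_hours, replacement_cost):
    """Acquisition cost plus expected replacement (maintenance) cost over
    the planning horizon: a simple total-cost-of-ownership proxy."""
    expected_replacements = horizon_hours / l10_hours(C, P, rpm)
    return price + expected_replacements * replacement_cost

# Hypothetical candidates: a cheap bearing vs a pricier, higher-rated one.
long_cheap   = total_cost_of_ownership(50.0,  20_000, 5_000, 1500, 40_000, 400.0)
long_robust  = total_cost_of_ownership(120.0, 30_000, 5_000, 1500, 40_000, 400.0)
short_cheap  = total_cost_of_ownership(50.0,  20_000, 5_000, 1500, 100, 400.0)
short_robust = total_cost_of_ownership(120.0, 30_000, 5_000, 1500, 100, 400.0)
```

    With these hypothetical numbers the higher-rated bearing has the lower total cost over a 40,000-hour horizon, while the cheap bearing wins over a 100-hour horizon, which is exactly the planning-horizon dependence the abstract highlights.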

  6. Reliability Assessment of Active Distribution System Using Monte Carlo Simulation Method

    Directory of Open Access Journals (Sweden)

    Shaoyun Ge

    2014-01-01

    In this paper we treat the reliability assessment of an active distribution system at low and high DG penetration levels using the Monte Carlo simulation method. The problem is formulated as a two-case program: one simulating low penetration and one simulating high penetration. The load shedding strategy and the simulation process are introduced in detail for each FMEA process. Results indicate that the integration of DG can improve the reliability of the system if the system is operated actively.
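
    A minimal sketch of the sequential Monte Carlo idea underlying such assessments (not the paper's two-case program): a repairable component is simulated by alternating exponentially distributed up and down durations, and the estimated availability is checked against the steady-state value MTTF/(MTTF+MTTR). The failure and repair parameters are hypothetical:

```python
import random

def simulate_availability(mttf, mttr, horizon, rng):
    """Sequential Monte Carlo for a repairable two-state component:
    alternate exponential up/down durations and accumulate uptime."""
    t = up = 0.0
    while t < horizon:
        ttf = rng.expovariate(1.0 / mttf)          # time to failure
        up += min(ttf, horizon - t)
        t += ttf
        if t >= horizon:
            break
        t += rng.expovariate(1.0 / mttr)           # repair duration
    return up / horizon

rng = random.Random(42)
est = simulate_availability(mttf=1000.0, mttr=10.0, horizon=1_000_000.0, rng=rng)
analytic = 1000.0 / 1010.0   # steady-state availability MTTF/(MTTF+MTTR)
```

    Over a long enough horizon the simulated availability converges to the analytic steady-state value, which makes this a convenient sanity check before adding load shedding logic or network structure.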

  7. Exact reliability quantification of highly reliable systems with maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Bris, Radim, E-mail: radim.bris@vsb.c [VSB-Technical University Ostrava, Faculty of Electrical Engineering and Computer Science, Department of Applied Mathematics, 17. listopadu 15, 70833 Ostrava-Poruba (Czech Republic)

    2010-12-15

    When a system is composed of highly reliable elements, exact reliability quantification may be problematic because computer accuracy is limited. Inaccuracy can arise in different ways: for example, when subtracting two numbers that are very close to each other, or when summing many numbers of very different magnitudes. The basic objective of this paper is to find a procedure that eliminates errors made by a PC when calculations close to an error limit are executed. The highly reliable system is represented by a directed acyclic graph composed of terminal nodes (i.e. highly reliable input elements), internal nodes representing subsystems, and edges that bind all of these nodes. Three admissible unavailability models of terminal nodes are introduced, covering both corrective and preventive maintenance. The algorithm for exact unavailability calculation of terminal nodes is implemented in MATLAB, a high-performance language for technical computing. The system unavailability quantification procedure applied to the graph structure, which considers both independent and dependent (i.e. repeatedly occurring) terminal nodes, is based on a combinatorial principle. This principle requires summation of many very different non-negative numbers, which may be a source of inaccuracy. That is why another algorithm, for exact summation of such numbers, is designed in the paper. The summation procedure uses a special number system with base 2^32. Computational efficiency of the new computing methodology is compared with advanced simulation software. Various calculations on systems from the references are performed to emphasize the merits of the methodology.
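
    The floating-point hazard motivating the paper's exact-summation algorithm can be demonstrated in a few lines. The sketch below is not the base-2^32 procedure of the paper; it simply contrasts naive left-to-right summation with Python's math.fsum, which implements Shewchuk's exactly rounded summation:

```python
import math

# Summing numbers of very different magnitude: the naive left-to-right
# sum absorbs the small term, while math.fsum (Shewchuk's exactly
# rounded summation) preserves it.
terms = [1e16, 1.0, -1e16]
naive = sum(terms)        # 1e16 + 1.0 rounds back to 1e16, so this is 0.0
exact = math.fsum(terms)  # exactly rounded: 1.0
```

    The small term is lost entirely by the naive sum, which is precisely the kind of error that matters when the quantities being accumulated are tiny unavailabilities of highly reliable components.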

  8. The Monte Carlo Simulation Method for System Reliability and Risk Analysis

    CERN Document Server

    Zio, Enrico

    2013-01-01

    Monte Carlo simulation is one of the best tools for performing realistic analysis of complex systems, as it allows most of the limiting assumptions on system behavior to be relaxed. The Monte Carlo Simulation Method for System Reliability and Risk Analysis comprehensively illustrates the Monte Carlo simulation method and its application to reliability and system engineering. Readers are given a sound understanding of the fundamentals of Monte Carlo sampling and simulation and its application for realistic system modeling. Whilst many of the topics rely on a high-level understanding of calculus, probability and statistics, simple academic examples are provided in support of the explanation of the theoretical foundations to facilitate comprehension of the subject matter. Case studies are introduced to demonstrate the practical value of the most advanced techniques. This detailed approach makes The Monte Carlo Simulation Method for System Reliability and Risk Analysis a key reference for senior undergra...
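
    A minimal example of the crude Monte Carlo sampling such texts build on: estimating the failure probability of the simple limit state g = R - S with normal resistance R and load S, for which a closed-form answer exists for comparison. The distributions are illustrative choices, not taken from the book:

```python
import math
import random

def mc_failure_probability(n, rng):
    """Crude Monte Carlo estimate of P(g < 0) for the limit state
    g = R - S with R ~ N(5, 1) (resistance) and S ~ N(3, 1) (load)."""
    failures = sum(1 for _ in range(n)
                   if rng.gauss(5.0, 1.0) - rng.gauss(3.0, 1.0) < 0.0)
    return failures / n

rng = random.Random(1)
pf_mc = mc_failure_probability(200_000, rng)
# Closed form for comparison: pf = Phi(-(5 - 3) / sqrt(1**2 + 1**2))
pf_exact = 0.5 * (1.0 + math.erf(-math.sqrt(2.0) / math.sqrt(2.0)))
```

    For probabilities this large, crude sampling works well; the book's more advanced techniques (variance reduction, subset-style schemes) become necessary when the failure probability is orders of magnitude smaller.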

  9. Development of a highly reliable CRT processor

    International Nuclear Information System (INIS)

    Shimizu, Tomoya; Saiki, Akira; Hirai, Kenji; Jota, Masayoshi; Fujii, Mikiya

    1996-01-01

    Although CRT processors have been employed by the main control board to reduce the operator's workload during monitoring, the control systems are still operated by hardware switches. For further advancement, direct controller operation through a display device is expected. A CRT processor providing direct controller operation must be as reliable as the hardware switches are. The authors are developing a new type of highly reliable CRT processor that enables direct controller operations. In this paper, we discuss the design principles behind a highly reliable CRT processor. The principles are defined by studies of software reliability and of the functional reliability of the monitoring and operation systems. The functional configuration of an advanced CRT processor is also addressed. (author)

  10. A novel reliability evaluation method for large engineering systems

    Directory of Open Access Journals (Sweden)

    Reda Farag

    2016-06-01

    A novel reliability evaluation method for large nonlinear engineering systems excited by dynamic loading applied in the time domain is presented. For this class of problems, the performance functions are expected to be functions of time and implicit in nature. Available first- or second-order reliability methods (FORM/SORM) will be challenged to estimate the reliability of such systems. Because of its inefficiency, the classical Monte Carlo simulation (MCS) method also cannot be used for large nonlinear dynamic systems. In the proposed approach, only tens instead of hundreds or thousands of deterministic evaluations at intelligently selected points are used to extract the reliability information. A hybrid approach is proposed, consisting of the stochastic finite element method (SFEM) developed by the author and his research team using FORM, the response surface method (RSM), an interpolation scheme, and advanced factorial schemes. The method is clarified with the help of several numerical examples.

  11. Structural reliability calculation method based on the dual neural network and direct integration method.

    Science.gov (United States)

    Li, Haibin; He, Yun; Nie, Xiaobo

    2018-01-01

    Structural reliability analysis under uncertainty receives wide attention from engineers and scholars because it reflects the structural characteristics and the actual bearing situation. The direct integration method, which starts from the definition of reliability theory, is easy to understand, but mathematical difficulties remain in the calculation of the multiple integrals involved. Therefore, a dual neural network method is proposed in this paper for calculating these multiple integrals. The dual neural network consists of two neural networks: network A is used to learn the integrand function, and network B is used to simulate the original (antiderivative) function. According to the derivative relationship between the network output and the network input, network B is derived from network A. On this basis, a normalized performance function is employed in the proposed method to overcome the difficulty of multiple integration and to improve the accuracy of reliability calculations. Comparisons between the proposed method and the Monte Carlo simulation method, the Hasofer-Lind method, and the mean value first-order second moment method demonstrate that the proposed method is an efficient and accurate method for structural reliability problems.
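
    For a limit state with a known closed form, the direct integration that the paper approximates with neural networks can be carried out numerically. The sketch below integrates P(R < S) = ∫ f_S(s) F_R(s) ds with the trapezoid rule for normal R and S (illustrative distributions, not the paper's examples) and checks the result against the analytic value:

```python
import math

def norm_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def norm_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def pf_direct(mu_r, s_r, mu_s, s_s, n=4000):
    """Direct integration of the failure probability
    P(R < S) = integral of f_S(s) * F_R(s) ds, via the trapezoid rule."""
    lo, hi = mu_s - 8.0 * s_s, mu_s + 8.0 * s_s    # +-8 sigma covers the mass
    h = (hi - lo) / n
    ys = [norm_pdf(lo + i * h, mu_s, s_s) * norm_cdf(lo + i * h, mu_r, s_r)
          for i in range(n + 1)]
    return h * (0.5 * ys[0] + sum(ys[1:-1]) + 0.5 * ys[-1])

pf = pf_direct(5.0, 1.0, 3.0, 1.0)
pf_exact = norm_cdf(-(5.0 - 3.0) / math.sqrt(2.0), 0.0, 1.0)
```

    In one dimension the quadrature is trivial; the paper's difficulty, and the reason for the surrogate networks, is that the same integral over many correlated random variables becomes intractable by direct quadrature.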

  12. High-reliability health care: getting there from here.

    Science.gov (United States)

    Chassin, Mark R; Loeb, Jerod M

    2013-09-01

    Despite serious and widespread efforts to improve the quality of health care, many patients still suffer preventable harm every day. Hospitals find improvement difficult to sustain, and they suffer "project fatigue" because so many problems need attention. No hospitals or health systems have achieved consistent excellence throughout their institutions. High-reliability science is the study of organizations in industries like commercial aviation and nuclear power that operate under hazardous conditions while maintaining safety levels that are far better than those of health care. Adapting and applying the lessons of this science to health care offer the promise of enabling hospitals to reach levels of quality and safety that are comparable to those of the best high-reliability organizations. We combined the Joint Commission's knowledge of health care organizations with knowledge from the published literature and from experts in high-reliability industries and leading safety scholars outside health care. We developed a conceptual and practical framework for assessing hospitals' readiness for and progress toward high reliability. By iterative testing with hospital leaders, we refined the framework and, for each of its fourteen components, defined stages of maturity through which we believe hospitals must pass to reach high reliability. We discovered that the ways that high-reliability organizations generate and maintain high levels of safety cannot be directly applied to today's hospitals. We defined a series of incremental changes that hospitals should undertake to progress toward high reliability. These changes involve the leadership's commitment to achieving zero patient harm, a fully functional culture of safety throughout the organization, and the widespread deployment of highly effective process improvement tools. Hospitals can make substantial progress toward high reliability by undertaking several specific organizational change initiatives. Further research

  13. Structural hybrid reliability index and its convergent solving method based on random–fuzzy–interval reliability model

    OpenAIRE

    Hai An; Ling Zhou; Hui Sun

    2016-01-01

    Aiming to resolve the problems of a variety of uncertainty variables that coexist in the engineering structure reliability analysis, a new hybrid reliability index to evaluate structural hybrid reliability, based on the random–fuzzy–interval model, is proposed in this article. The convergent solving method is also presented. First, the truncated probability reliability model, the fuzzy random reliability model, and the non-probabilistic interval reliability model are introduced. Then, the new...

  14. Reliability Evaluation of Bridges Based on Nonprobabilistic Response Surface Limit Method

    Directory of Open Access Journals (Sweden)

    Xuyong Chen

    2017-01-01

    Due to the many uncertainties in the nonprobabilistic reliability assessment of bridges, the limit state function is generally unknown. The traditional nonprobabilistic response surface method involves a lengthy, oscillating iteration process that makes the nonprobabilistic reliability index difficult to solve. This article proposes a nonprobabilistic response surface limit method based on the interval model. The intention of this method is to solve for the upper and lower limits of the nonprobabilistic reliability index and to narrow its range. If the range of the reliability index reduces to an acceptable accuracy, the solution is considered convergent, and the nonprobabilistic reliability index is obtained. The case study indicates that the proposed method avoids an oscillating iteration process, makes the iteration stable and convergent, reduces the number of iteration steps significantly, and improves computational efficiency and precision significantly compared with the traditional nonprobabilistic response surface method. Finally, a nonprobabilistic reliability evaluation process for bridges is built by evaluating the reliability of a PC continuous rigid frame bridge with three spans using the proposed method, which proves simpler and more reliable when samples and parameters are lacking in the bridge nonprobabilistic reliability evaluation.
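
    The interval model underlying the method can be illustrated with the usual nonprobabilistic reliability index: interval variables are propagated through the limit state g by interval arithmetic, and the index is the midpoint of the resulting interval divided by its radius, with values above 1 indicating that g stays positive for every realization. The intervals below are hypothetical:

```python
def interval_sub(a, b):
    """Interval arithmetic: [a] - [b] = [a_lo - b_hi, a_hi - b_lo]."""
    return (a[0] - b[1], a[1] - b[0])

def eta(g):
    """Nonprobabilistic (interval) reliability index: midpoint of the
    limit-state interval divided by its radius; eta > 1 means the limit
    state is positive for every point in the interval."""
    mid = 0.5 * (g[0] + g[1])
    rad = 0.5 * (g[1] - g[0])
    return mid / rad

R = (4.0, 6.0)            # resistance interval
S = (1.0, 3.0)            # load interval
g = interval_sub(R, S)    # g = R - S lies in [1.0, 5.0]
index = eta(g)            # 3.0 / 2.0 = 1.5 > 1, so reliable
```

    For an explicit linear limit state this index is available in closed form; the response surface machinery of the article is needed when g is implicit and must itself be approximated.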

  15. Contribution to high voltage matrix switches reliability

    International Nuclear Information System (INIS)

    Lausenaz, Yvan

    2000-01-01

    Nowadays, the requirements on power electronic equipment are demanding with regard to performance, quality and reliability. On the other hand, costs have to be reduced in order to satisfy market rules. To provide low cost, reliability and performance, many standard components with mass production are developed. The construction of specific products can then be considered from two different angles: on one hand, one can produce specific components, with delays, cost overruns and possibly quality and reliability problems; on the other hand, one can use standard components in adapted topologies. The CEA of Pierrelatte has adopted the latter approach to power electronic design for the development of its high voltage pulsed power converters. The technique consists in using standard components and associating them in series and in parallel. The matrix constitutes a high voltage macro-switch in which the electrical parameters are distributed among the synchronized components. This study deals with the reliability of these structures. It brings up the high-reliability aspect of MOSFET matrix associations. Thanks to several homemade test facilities, we obtained extensive data concerning the components we use. Understanding the mechanisms of defect propagation in matrix structures has allowed us to put forward the necessity of a robust drive system, adapted clamping voltage protection, and careful geometrical construction. All these reliability considerations in matrix associations have notably allowed the construction of a new matrix structure incorporating all the solutions that ensure reliability. Reliable and robust, this product has already reached the industrial stage. (author) [fr

  16. Methods for qualification of highly reliable software - international procedure

    International Nuclear Information System (INIS)

    Kersken, M.

    1997-01-01

    Despite the advantages of computer-assisted safety technology, some uneasiness can still be observed with respect to the novel processes, resulting from the absence of a body of generally accepted and uncontentious qualification guides (regulatory provisions, standards) for the safety evaluation of the computer codes applied. Warranty of adequate protection of the population, operators or plant components is an essential aspect in this context too - as it is in general with reliability and risk assessment of novel technology - so that, with appropriate legislation still missing, there currently is a licensing risk involved in the introduction of digital safety systems. Nevertheless, there is some degree of agreement within the international community and among utility operators about which standards and measures should be applied for the qualification of software relevant to plant safety. The standard IEC 880 /IEC 86/ in particular, in its original version, or national documents based on this standard, are applied in all countries using or planning to install such systems. A novel supplement to this standard, document /IEC 96/, is in the process of finalization and defines the requirements to be met by modern methods of software engineering. (orig./DG) [de

  17. A Bayesian reliability evaluation method with integrated accelerated degradation testing and field information

    International Nuclear Information System (INIS)

    Wang, Lizhi; Pan, Rong; Li, Xiaoyang; Jiang, Tongmin

    2013-01-01

    Accelerated degradation testing (ADT) is a common approach in reliability prediction, especially for products with high reliability. However, the laboratory conditions of ADT often differ from field conditions; thus, to predict field failure, one needs to calibrate the prediction made using ADT data. In this paper a Bayesian evaluation method is proposed to integrate ADT data from the laboratory with failure data from the field. Calibration factors are introduced to calibrate the difference between the lab and field conditions so as to predict a product's actual field reliability more accurately. The information fusion and statistical inference procedure are carried out through a Bayesian approach and Markov chain Monte Carlo methods. The proposed method is demonstrated by two examples and a sensitivity analysis of the prior distribution assumption.
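
    The lab-to-field information fusion can be sketched in a much simplified conjugate form, rather than the paper's MCMC procedure: the lab ADT result is encoded as a Gamma prior on a constant field failure rate, and the observed field failure count updates it through Gamma-Poisson conjugacy. All numbers below are hypothetical:

```python
def gamma_poisson_update(alpha0, beta0, field_failures, field_time):
    """Conjugate Bayesian update for a constant failure rate:
    prior rate ~ Gamma(alpha0, beta0), e.g. encoded from lab ADT results,
    field data = `field_failures` events observed over `field_time` hours.
    Returns the posterior (alpha, beta) and the posterior mean rate."""
    alpha = alpha0 + field_failures
    beta = beta0 + field_time
    return alpha, beta, alpha / beta

# Hypothetical numbers: lab testing suggests about 2 failures per 1e4 h;
# the field then shows 3 failures over 2e4 h of operation.
a, b, mean_rate = gamma_poisson_update(2.0, 1e4, 3, 2e4)
```

    The posterior mean automatically balances the lab-based prior against the field evidence; the paper's calibration factors and MCMC generalize this idea to degradation models where no conjugate form exists.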

  18. Comparison of methods for dependency determination between human failure events within human reliability analysis

    International Nuclear Information System (INIS)

    Cepis, M.

    2007-01-01

    The Human Reliability Analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features which may decrease the subjectivity of human reliability analysis. Human reliability methods are compared with a focus on dependency, comparing the Institute Jozef Stefan - Human Reliability Analysis (IJS-HRA) and the Standardized Plant Analysis Risk Human Reliability Analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by development of more detailed guidelines for human reliability analysis with many practical examples for all steps of the process of evaluation of human performance. (author)

  19. Comparison of Methods for Dependency Determination between Human Failure Events within Human Reliability Analysis

    International Nuclear Information System (INIS)

    Cepin, M.

    2008-01-01

    The human reliability analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features, which may decrease subjectivity of human reliability analysis. Human reliability methods are compared with focus on dependency comparison between Institute Jozef Stefan human reliability analysis (IJS-HRA) and standardized plant analysis risk human reliability analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by development of more detailed guidelines for human reliability analysis with many practical examples for all steps of the process of evaluation of human performance

  20. Rapid and reliable high-throughput methods of DNA extraction for use in barcoding and molecular systematics of mushrooms.

    Science.gov (United States)

    Dentinger, Bryn T M; Margaritescu, Simona; Moncalvo, Jean-Marc

    2010-07-01

    We present two methods for DNA extraction from fresh and dried mushrooms that are adaptable to high-throughput sequencing initiatives, such as DNA barcoding. Our results show that these protocols yield ∼85% sequencing success from recently collected materials. Tests with both recent and older (up to 100 years old) specimens reveal that older collections have low success rates and may be an inefficient resource for populating a barcode database. However, our method of extracting DNA from herbarium samples using a small amount of tissue is reliable and could be used for important historical specimens. The application of these protocols greatly reduces the time, and therefore cost, of generating DNA sequences from mushrooms and other fungi vs. traditional extraction methods. The efficiency of these methods illustrates that standardization and streamlining of sample processing should be shifted from the laboratory to the field. © 2009 Blackwell Publishing Ltd.

  1. AK-SYS: An adaptation of the AK-MCS method for system reliability

    International Nuclear Information System (INIS)

    Fauriat, W.; Gayton, N.

    2014-01-01

    A great deal of research work has been produced over the last two decades on evaluating the probability of failure of a structure involving a very time-consuming mechanical model. Surrogate model approaches based on Kriging, such as the Efficient Global Reliability Analysis (EGRA) or the Active learning and Kriging-based Monte-Carlo Simulation (AK-MCS) methods, are very efficient, and each has advantages of its own. EGRA is well suited to evaluating small probabilities, as the surrogate can be used to classify any population. AK-MCS is built in relation to a given population and requires no optimization program for the active learning procedure to be performed. It is therefore easier to implement and more likely to spend computational effort on areas with a significant probability content. When assessing system reliability, analytical approaches and first-order approximations are widely used in the literature. In the present paper, however, we focus on sampling techniques and, following the recent adaptation of the EGRA method for systems, present a strategy to adapt the AK-MCS method for system reliability. The AK-SYS method, “Active learning and Kriging-based SYStem reliability method”, is presented. Its high efficiency and accuracy are illustrated via various examples.
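
    The system-level Monte Carlo that AK-SYS accelerates can be sketched without the Kriging surrogate: for a series system the system limit state is the minimum of the component limit states, and crude MCS classifies each sample by its sign. The two branches below are illustrative (a fragment of the well-known four-branch test function), not an example taken from the paper:

```python
import math
import random

def g1(x1, x2):
    return 3.0 + 0.1 * (x1 - x2) ** 2 - (x1 + x2) / math.sqrt(2.0)

def g2(x1, x2):
    return 3.0 + 0.1 * (x1 - x2) ** 2 + (x1 + x2) / math.sqrt(2.0)

def series_system_pf(n, rng):
    """Crude MCS for a two-component series system: the system fails
    whenever the minimum of the component limit states is negative."""
    fails = 0
    for _ in range(n):
        x1, x2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        if min(g1(x1, x2), g2(x1, x2)) < 0.0:
            fails += 1
    return fails / n

pf_sys = series_system_pf(200_000, random.Random(7))
```

    Every sample here costs two exact limit-state evaluations; AK-SYS replaces those calls with a Kriging surrogate and spends exact evaluations only on the points whose system-level classification is uncertain.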

  2. Management systems for high reliability organizations. Integration and effectiveness; Managementsysteme fuer Hochzuverlaessigkeitsorganisationen. Integration und Wirksamkeit

    Energy Technology Data Exchange (ETDEWEB)

    Mayer, Michael

    2015-03-09

    The scope of this thesis is the development of a method for improving efficient integrated management systems for high reliability organizations (HRO). A comprehensive analysis of severe accident prevention is performed. Severe accident management, mitigation measures and business continuity management are not included. High reliability organizations are complex and potentially dynamic forms of organization that can be inherently dangerous, such as nuclear power plants, offshore platforms, chemical facilities, large ships or large aircraft. A recursive generic management system model (RGM) was developed based on the following factors: systemic and cybernetic aspects; integration of different management fields; high decision quality; integration of efficient methods of safety and risk analysis; integration of human reliability aspects; and effectiveness evaluation and improvement.

  3. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Laurids Boring

    2010-11-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  4. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    International Nuclear Information System (INIS)

    Boring, Ronald Laurids

    2010-01-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  5. Review of methods for the integration of reliability and design engineering

    International Nuclear Information System (INIS)

    Reilly, J.T.

    1978-03-01

    A review of methods for the integration of reliability and design engineering was carried out to establish a reliability program philosophy, an initial set of methods, and procedures to be used by both the designer and reliability analyst. The report outlines a set of procedures which implements a philosophy that requires increased involvement by the designer in reliability analysis. Discussions of each method reviewed include examples of its application

  6. Reliability of an Automated High-Resolution Manometry Analysis Program across Expert Users, Novice Users, and Speech-Language Pathologists

    Science.gov (United States)

    Jones, Corinne A.; Hoffman, Matthew R.; Geng, Zhixian; Abdelhalim, Suzan M.; Jiang, Jack J.; McCulloch, Timothy M.

    2014-01-01

    Purpose: The purpose of this study was to investigate inter- and intrarater reliability among expert users, novice users, and speech-language pathologists with a semiautomated high-resolution manometry analysis program. We hypothesized that all users would have high intrarater reliability and high interrater reliability. Method: Three expert…

  7. Study on application of a high-speed trigger-type SFCL (TSFCL) for interconnection of power systems with different reliabilities

    International Nuclear Information System (INIS)

    Kim, Hye Ji; Yoon, Yong Tae

    2016-01-01

    Highlights: • Application of TSFCL to interconnect systems with different reliabilities is proposed. • TSFCL protects a grid by preventing detrimental effects from being delivered through the interconnection line. • A high-speed TSFCL with high impedance for transmission systems needs to be developed. - Abstract: Interconnection of power systems is an effective way to improve power supply reliability. However, differences in the reliability of each power system create an obstacle to stable interconnection, because after interconnection a high-reliability system is affected by frequent faults on the low-reliability side. Several power system interconnection methods, such as the back-to-back method and the installation of either transformers or series reactors, have been investigated to counteract the damage caused by faults in neighboring systems. However, these methods are uneconomical and require complex operational management plans. In this work, a high-speed trigger-type superconducting fault current limiter (TSFCL) with large impedance is proposed as a solution to maintain reliability and power quality when a high-reliability power system is interconnected with a low-reliability one. Through analysis of the reliability index for numerical examples obtained from a PSCAD/EMTDC simulator, a high-speed TSFCL with large impedance is confirmed to be effective for the interconnection of power systems with different reliabilities.
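The limiting principle can be illustrated with a rough series-circuit sketch: inserting a large impedance in the tie line sharply reduces the fault current delivered to the healthy system. All numbers below are assumed for illustration and have nothing to do with the paper's PSCAD/EMTDC model.

```python
# Hypothetical single-phase series-circuit model of a tie-line fault.
def fault_current(v_source, z_system, z_fcl=0j):
    """Fault-current magnitude for a simple series-circuit model."""
    return abs(v_source) / abs(z_system + z_fcl)

v = 154e3 / 3 ** 0.5          # assumed 154 kV system, phase voltage (V)
z_sys = 0.5 + 5.0j            # assumed source/system impedance (ohm)
z_fcl = 20.0j                 # assumed triggered SFCL impedance (ohm)

i_unlimited = fault_current(v, z_sys)        # fault current without limiter
i_limited = fault_current(v, z_sys, z_fcl)   # current with the TSFCL inserted
```

With these assumed values the triggered limiter cuts the fault current to roughly a fifth of its unlimited magnitude, which is the effect the tie-line protection relies on.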

  8. Reliability of Lyapunov characteristic exponents computed by the two-particle method

    Science.gov (United States)

    Mei, Lijie; Huang, Li

    2018-03-01

    For highly complex problems, such as the post-Newtonian formulation of compact binaries, the two-particle method may be a better, or even the only, choice to compute the Lyapunov characteristic exponent (LCE). This method avoids the complex calculation of variational equations required by the variational method. However, the two-particle method sometimes provides spurious estimates of LCEs. In this paper, we first analyze the equivalence of the definition of the LCE between the variational and two-particle methods for Hamiltonian systems. Then, we develop a criterion to determine the reliability of LCEs computed by the two-particle method by considering the magnitude of the initial tangent (or separation) vector ξ0 (or δ0), the renormalization time interval τ, the machine precision ε, and the global truncation error ɛT. The reliable Lyapunov characteristic indicators estimated by the two-particle method form a V-shaped region, which is restricted by δ0, ε, and ɛT. Finally, numerical experiments with the Hénon-Heiles system, spinning compact binaries, and the post-Newtonian circular restricted three-body problem strongly support the theoretical results.
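The renormalization idea behind the two-particle method can be sketched on a toy system. The paper treats Hamiltonian flows; for brevity this assumed example applies the same recipe to the logistic map at r = 4, whose largest LCE is known analytically to be ln 2.

```python
import math

# Two-particle (Benettin-style) LCE estimate: evolve two nearby orbits,
# log the one-step separation growth, and renormalize back to d0 each step.
def lyapunov_two_particle(x0, d0=1e-8, n_steps=100_000):
    f = lambda x: 4.0 * x * (1.0 - x)   # logistic map, r = 4
    x, y = x0, x0 + d0
    total = 0.0
    for _ in range(n_steps):
        x, y = f(x), f(y)
        d = abs(y - x)
        total += math.log(d / d0)
        y = x + d0 * (y - x) / d        # renormalize the separation to d0
    return total / n_steps

lce = lyapunov_two_particle(0.3)        # analytic value for this map: ln 2
```

The criterion discussed in the abstract is visible even here: d0 must be small enough for the linear regime yet large enough that the separation stays well above machine precision.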

  9. Reliability of Estimation Pile Load Capacity Methods

    Directory of Open Access Journals (Sweden)

    Yudhi Lastiasih

    2014-04-01

    Full Text Available. For none of the numerous previous methods for predicting pile capacity is it known how accurate the predictions are when compared with the actual ultimate capacity of piles tested to failure. The authors of the present paper have conducted such an analysis, based on 130 data sets of field loading tests. Out of these 130 data sets, only 44 could be analysed, of which 15 were conducted until the piles actually reached failure. The pile prediction methods used were: Brinch Hansen's method (1963), Chin's method (1970), Decourt's extrapolation method (1999), Mazurkiewicz's method (1972), Van der Veen's method (1953), and the quadratic hyperbolic method proposed by Lastiasih et al. (2012). All the above methods were found to be sufficiently reliable when applied to data from pile loading tests loaded to failure. However, when applied to data from pile loading tests that did not reach failure, the methods that yield lower values of the correction factor N are more recommended. Finally, the empirical method of Reese and O'Neill (1988) was found to be reliable enough to estimate the Qult of a pile foundation based on soil data only.
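As an illustration of one of the listed techniques, Chin's (1970) method assumes a hyperbolic load-settlement curve, so that s/Q plotted against settlement s is a straight line whose inverse slope estimates the ultimate capacity. A minimal sketch with synthetic data (the numbers are invented, not taken from the 130 field tests):

```python
# Chin's hyperbolic extrapolation: fit s/Q = m*s + c; Q_ult = 1/m.
def chin_ultimate_capacity(settlements, loads):
    """Least-squares fit of s/Q against s; ultimate capacity is 1/slope."""
    ys = [s / q for s, q in zip(settlements, loads)]
    n = len(settlements)
    mx = sum(settlements) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(settlements, ys))
             / sum((x - mx) ** 2 for x in settlements))
    return 1.0 / slope

# Synthetic hyperbolic test curve with Q_ult = 2000 kN: Q = s / (s/2000 + 0.004)
s_vals = [2.0, 5.0, 10.0, 15.0, 20.0, 25.0]
q_vals = [s / (s / 2000.0 + 0.004) for s in s_vals]
q_ult = chin_ultimate_capacity(s_vals, q_vals)   # recovers ~2000 kN
```

Because the synthetic data are exactly hyperbolic, the fit recovers the assumed ultimate capacity; on real test data the scatter of the s/Q line is what drives the correction factor N discussed above.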

  10. Risk-based methods for reliability investments in electric power distribution systems

    Energy Technology Data Exchange (ETDEWEB)

    Alvehag, Karin

    2011-07-01

    Society relies more and more on a continuous supply of electricity. However, while underinvestment in reliability leads to an unacceptable number of power interruptions, overinvestment results in too high costs for society. To give incentives for a socioeconomically optimal level of reliability, quality regulations have been adopted in many European countries. These quality regulations imply new financial risks for the distribution system operator (DSO), since poor reliability can reduce the allowed revenue for the DSO and compensation may have to be paid to affected customers. This thesis develops a method for evaluating the incentives for reliability investments implied by different quality regulation designs. The method can be used to investigate whether socioeconomically beneficial projects are also beneficial for a profit-maximizing DSO subject to a particular quality regulation design. To investigate which reinvestment projects are preferable for society and a DSO, risk-based methods are developed. With these methods, the probability of power interruptions and their consequences can be simulated. The consequences of interruptions for the DSO depend to a large extent on the quality regulation. The consequences for the customers, and hence also for society, depend on factors such as the interruption duration and time of occurrence. The proposed risk-based methods consider extreme outage events in the risk assessments by incorporating the impact of severe weather, estimating the full probability distribution of the total reliability cost, and formulating a risk-averse strategy. Results from case studies show that quality regulation design has a significant impact on reinvestment project profitability for a DSO. In order to adequately capture the financial risk that the DSO is exposed to, detailed risk-based methods, such as the ones developed in this thesis, are needed. Furthermore, when making investment decisions, a risk
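The kind of risk-based simulation described above can be caricatured in a few lines: simulate yearly interruption histories and accumulate the DSO's reliability cost as customer compensation plus a quality-regulation penalty. Every parameter value below is invented for illustration and is not from the thesis.

```python
import random

def yearly_cost(rng, fail_rate, mean_dur_h, cost_per_hour,
                penalty_threshold_h, penalty):
    """One simulated year: Poisson failure arrivals, exponential durations."""
    t = 0.0
    outage_h = 0.0
    while True:
        t += rng.expovariate(fail_rate)      # inter-failure times (years)
        if t > 1.0:
            break
        outage_h += rng.expovariate(1.0 / mean_dur_h)
    cost = outage_h * cost_per_hour          # customer compensation
    if outage_h > penalty_threshold_h:       # quality-regulation penalty
        cost += penalty
    return cost

def expected_reliability_cost(years=20_000, seed=1):
    rng = random.Random(seed)
    return sum(yearly_cost(rng, 0.8, 2.0, 400.0, 3.0, 1000.0)
               for _ in range(years)) / years

cost = expected_reliability_cost()
```

Repeating such runs for each candidate reinvestment (and regulation design) yields the full distribution of the total reliability cost rather than just its mean, which is what a risk-averse strategy needs.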

  11. A SOFTWARE RELIABILITY ESTIMATION METHOD TO NUCLEAR SAFETY SOFTWARE

    Directory of Open Access Journals (Sweden)

    GEE-YONG PARK

    2014-02-01

    Full Text Available. A method for estimating software reliability for nuclear safety software is proposed in this paper. The method is based on the software reliability growth model (SRGM), in which the behavior of software failures is assumed to follow a non-homogeneous Poisson process. Two types of modeling schemes based on a particular underlying method are proposed in order to more precisely estimate and predict the number of software defects from very rare software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating software test cases as a covariate into the model. It was found that these models are capable of reasonably estimating the remaining number of software defects, which directly affects the reactor trip functions. The software reliability can be estimated from these modeling equations, and one approach to obtaining a software reliability value is proposed in this paper.
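A minimal sketch of the NHPP machinery underlying an SRGM, using the common Goel-Okumoto mean value function m(t) = a(1 - e^(-bt)) as an assumed example. The paper estimates parameters by Bayesian inference; the values of a and b here are made up for illustration.

```python
import math

def go_mean_value(t, a, b):
    """Goel-Okumoto mean value function: expected failures observed by time t."""
    return a * (1.0 - math.exp(-b * t))

def conditional_reliability(t, x, a, b):
    """Probability of no failure in (t, t+x] for an NHPP."""
    return math.exp(-(go_mean_value(t + x, a, b) - go_mean_value(t, a, b)))

a, b = 30.0, 0.05   # assumed total defect count and detection rate
expected_remaining = a - go_mean_value(100.0, a, b)   # defects left after t = 100
rel = conditional_reliability(100.0, 10.0, a, b)      # no failure over next 10 units
```

The conditional-reliability expression is the standard route from a fitted mean value function to a software reliability number for a given mission time.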

  12. Design of piezoelectric transducer layer with electromagnetic shielding and high connection reliability

    Science.gov (United States)

    Qiu, Lei; Yuan, Shenfang; Shi, Xiaoling; Huang, Tianxiang

    2012-07-01

    Piezoelectric transducer (PZT) and Lamb wave based structural health monitoring (SHM) methods have been widely studied for on-line SHM of high-performance structures. To monitor large-scale structures, a dense PZT array is required. In order to improve placement efficiency and reduce the wiring burden of the PZT array, the concept of the piezoelectric transducers layer (PSL) was proposed. The PSL consists of PZTs, a flexible interlayer with printed wires, and a signal input/output interface. For on-line SHM on real aircraft structures, there are two main issues: electromagnetic interference and the connection reliability of the PSL. To address these issues, an electromagnetic shielding design method of the PSL to reduce spatial electromagnetic noise and crosstalk is proposed, and a combined welding-cementation process based connection reliability design method is proposed to enhance the connection reliability between the PZTs and the flexible interlayer. Two experiments on electromagnetic interference suppression are performed to validate the shielding design of the PSL. The experimental results show that the amplitudes of the spatial electromagnetic noise and crosstalk output from the shielded PSL developed in this paper are −15 dB and −25 dB lower than those of the ordinary PSL, respectively. Two further experiments, on temperature durability (−55 °C to 80 °C) and strength durability (160-1600 με, one million load cycles), are applied to the PSL to validate the connection reliability. The low repeatability errors (less than 3% and less than 5%, respectively) indicate that the developed PSL is of high connection reliability and long fatigue life.

  13. The role of high cycle fatigue (HCF) onset in Francis runner reliability

    International Nuclear Information System (INIS)

    Gagnon, M; Tahan, S A; Bocher, P; Thibault, D

    2012-01-01

    High Cycle Fatigue (HCF) plays an important role in Francis runner reliability. This paper presents a model in which reliability is defined as the probability of not exceeding a threshold above which HCF contributes to crack propagation. In the context of combined Low Cycle Fatigue (LCF) and HCF loading, the Kitagawa diagram is used as the limit state threshold for reliability. The reliability problem is solved using First-Order Reliability Methods (FORM). A case study is presented using in situ measured strains and operational data. All the parameters of the reliability problem are based either on observed data or on typical design specifications. From the results obtained, we observed that the uncertainty around the defect size and the HCF stress range plays an important role in reliability. At the same time, we observed that the expected values for the LCF stress range and the number of LCF cycles have a significant influence on life assessment, but that the uncertainty around these values can be neglected in the reliability assessment.
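In the simplest case of a linear limit state g = R - S with independent normal variables, the FORM calculation collapses to a closed-form reliability index. The sketch below shows only that textbook case, not the paper's limit state on the Kitagawa diagram, and the numbers are assumed.

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def form_linear(mu_r, sig_r, mu_s, sig_s):
    """Reliability index and failure probability for g = R - S, R and S normal."""
    beta = (mu_r - mu_s) / math.hypot(sig_r, sig_s)   # Hasofer-Lind index
    return beta, phi(-beta)                           # Pf = Phi(-beta)

# Assumed capacity R (e.g. a threshold stress range) vs. demand S (MPa):
beta, pf = form_linear(mu_r=200.0, sig_r=20.0, mu_s=120.0, sig_s=15.0)
```

For nonlinear limit states such as a Kitagawa threshold, FORM finds the most probable failure point iteratively (e.g. by the HL-RF algorithm) and then applies the same Pf = Φ(-β) approximation.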

  14. Toward reliable and repeatable automated STEM-EDS metrology with high throughput

    Science.gov (United States)

    Zhong, Zhenxin; Donald, Jason; Dutrow, Gavin; Roller, Justin; Ugurlu, Ozan; Verheijen, Martin; Bidiuk, Oleksii

    2018-03-01

    New materials and designs in complex 3D architectures in logic and memory devices have raised the complexity of S/TEM metrology. In this paper, we report on a newly developed, automated, scanning transmission electron microscopy (STEM) based, energy dispersive X-ray spectroscopy (STEM-EDS) metrology method that addresses these challenges. Different methodologies toward repeatable and efficient automated STEM-EDS metrology with high throughput are presented: we introduce the best known auto-EDS acquisition and quantification methods for robust and reliable metrology and show how electron exposure dose impacts EDS metrology reproducibility, either through poor signal-to-noise ratio (SNR) at low dose or through sample modification at high dose. Finally, we discuss the limitations of the STEM-EDS metrology technique and propose strategies to optimize the process both in terms of throughput and metrology reliability.
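The dose-reproducibility trade-off mentioned above follows from Poisson counting statistics: the relative precision of a count-based estimate scales as 1/√counts, so cutting the dose by a factor of four doubles the expected scatter. A back-of-envelope sketch with assumed count numbers, not data from the paper:

```python
import math

def relative_precision(counts):
    """Poisson-limited relative precision: sigma/N = sqrt(N)/N = 1/sqrt(N)."""
    return 1.0 / math.sqrt(counts)

high_dose = relative_precision(40_000)   # assumed counts at high dose
low_dose = relative_precision(10_000)    # assumed counts at a quarter dose
```

This is why throughput and reproducibility pull in opposite directions until beam-induced sample modification sets an upper bound on usable dose.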

  15. Reliability and Failure in NASA Missions: Blunders, Normal Accidents, High Reliability, Bad Luck

    Science.gov (United States)

    Jones, Harry W.

    2015-01-01

    NASA emphasizes crew safety and system reliability, but several unfortunate failures have occurred. The Apollo 1 fire was not anticipated. After that tragedy, the Apollo program gave much more attention to safety. The Challenger accident revealed that NASA had neglected safety and that management underestimated the high risk of the shuttle. Probabilistic Risk Assessment was adopted to provide more accurate failure probabilities for the shuttle and other missions. NASA's "faster, better, cheaper" initiative and government procurement reform led to deliberately dismantling traditional reliability engineering. The Columbia tragedy and Mars mission failures followed. Failures can be attributed to blunders, normal accidents, or bad luck. Achieving high reliability is difficult but possible.

  16. Network reliability analysis of complex systems using a non-simulation-based method

    International Nuclear Information System (INIS)

    Kim, Youngsuk; Kang, Won-Hee

    2013-01-01

    Civil infrastructures such as transportation, water supply, sewer, telecommunication, and electrical and gas networks often form highly complex networks, due to their multiple source and distribution nodes, complex topology, and functional interdependence between network components. To understand the reliability of such complex network systems under catastrophic events such as earthquakes, and to support proper emergency management actions in such situations, efficient and accurate reliability analysis methods are necessary. In this paper, a non-simulation-based network reliability analysis method is developed based on the Recursive Decomposition Algorithm (RDA) for risk assessment of generic networks whose operation is defined by the connections of multiple initial and terminal node pairs. The proposed method has two separate decomposition processes for the two logical functions, intersection and union, and combinations of these processes are used for the decomposition of any general system event with multiple node pairs. The proposed method is illustrated through numerical network examples with a variety of system definitions and is applied to a benchmark gas transmission pipe network in Memphis, TN, to estimate the seismic performance and functional degradation of the network under a set of earthquake scenarios.
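The RDA itself is beyond a short sketch, but on small networks its results can be checked against brute-force enumeration of component states. The following example implements that exhaustive baseline for two-terminal connectivity; it is a check, not the paper's algorithm, and the graph and probabilities are assumed.

```python
from itertools import product

def two_terminal_reliability(edges, probs, s, t):
    """Exact s-t reliability by enumerating all 2^|E| edge up/down states."""
    def connected(up_edges):
        reach, frontier = {s}, [s]
        while frontier:
            u = frontier.pop()
            for a, b in up_edges:
                for x, y in ((a, b), (b, a)):   # undirected edges
                    if x == u and y not in reach:
                        reach.add(y)
                        frontier.append(y)
        return t in reach

    rel = 0.0
    for states in product((True, False), repeat=len(edges)):
        p = 1.0
        for up, pr in zip(states, probs):
            p *= pr if up else 1.0 - pr
        if connected([e for up, e in zip(states, edges) if up]):
            rel += p
    return rel

# Two parallel paths from node 0 to node 2, each of two series links, p = 0.9:
edges = [(0, 1), (1, 2), (0, 3), (3, 2)]
rel = two_terminal_reliability(edges, [0.9] * 4, 0, 2)
```

For this series-parallel case the analytic answer is 1 - (1 - 0.9²)² = 0.9639; the point of RDA-style decomposition is to get such numbers without visiting all 2^|E| states.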

  17. Advances in methods and applications of reliability and safety analysis

    International Nuclear Information System (INIS)

    Fieandt, J.; Hossi, H.; Laakso, K.; Lyytikaeinen, A.; Niemelae, I.; Pulkkinen, U.; Pulli, T.

    1986-01-01

    VTT's know-how in reliability and safety design and analysis techniques has been established over several years of analyzing reliability in the Finnish nuclear power plants Loviisa and Olkiluoto. This experience has later been applied and developed for use in the process industry, conventional power industry, automation, and electronics. VTT develops and transfers methods and tools for reliability and safety analysis to the private and public sectors. The technology transfer takes place in joint development projects with potential users. Several computer-aided methods, such as RELVEC for reliability modelling and analysis, have been developed. The tools developed are today used by major Finnish companies in the fields of automation, nuclear power, shipbuilding, and electronics. Development of computer-aided and other methods needed in the analysis of operating experience, reliability, or safety continues in a number of research and development projects.

  18. Structural Reliability Using Probability Density Estimation Methods Within NESSUS

    Science.gov (United States)

    Chamis, Chrisos C. (Technical Monitor); Godines, Cody Ric

    2003-01-01

    proposed by the Society of Automotive Engineers (SAE). The test cases compare different probabilistic methods within NESSUS because it is important that a user can have confidence that estimates of the stochastic parameters of a response will be within an acceptable error limit. For each response, the mean, standard deviation, and 0.99 percentile are repeatedly estimated, which allows confidence statements to be made for each parameter estimated and for each method. Thus, the ability of several stochastic methods to efficiently and accurately estimate density parameters is compared using four valid test cases. While all of the reliability methods used performed quite well, the new LHS module within NESSUS was found to have a lower estimation error than MC when used to estimate the mean, standard deviation, and 0.99 percentile of the four different stochastic responses. Also, LHS required fewer calculations than MC to obtain low-error answers with a high degree of confidence. It can therefore be stated that NESSUS is an important reliability tool that offers a variety of sound probabilistic methods, and the new LHS module is a valuable enhancement of the program.
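The stratification idea behind LHS can be shown in one dimension: split the unit interval into n equal strata and draw exactly one point per stratum, which is what reduces estimator variance relative to plain Monte Carlo. This standalone sketch is illustrative and is not the NESSUS implementation.

```python
import random

def latin_hypercube_1d(n, rng):
    """One LHS sample of size n on [0, 1): one uniform draw per stratum."""
    samples = [(k + rng.random()) / n for k in range(n)]
    rng.shuffle(samples)   # random order, as in multi-dimensional LHS pairing
    return samples

rng = random.Random(42)
pts = latin_hypercube_1d(10, rng)
```

In d dimensions, one such stratified column is generated per variable and the columns are paired by independent random permutations, guaranteeing full marginal coverage at any sample size.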

  19. Reliability methods in nuclear power plant ageing management

    International Nuclear Information System (INIS)

    Simola, K.

    1999-01-01

    The aim of nuclear power plant ageing management is to maintain an adequate safety level throughout the lifetime of the plant. In ageing studies, the reliability of components, systems and structures is evaluated taking into account the possible time-dependent degradation. The phases of ageing analyses are generally the identification of critical components, identification and evaluation of ageing effects, and development of mitigation methods. This thesis focuses on the use of reliability methods and analyses of plant-specific operating experience in nuclear power plant ageing studies. The presented applications and method development have been related to nuclear power plants, but many of the approaches can also be applied outside the nuclear industry. The thesis consists of a summary and seven publications. The summary provides an overview of ageing management and discusses the role of reliability methods in ageing analyses. In the publications, practical applications and method development are described in more detail. The application areas at component and system level are motor-operated valves and protection automation systems, for which experience-based ageing analyses have been demonstrated. Furthermore, Bayesian ageing models for repairable components have been developed, and the management of ageing by improving maintenance practices is discussed. Recommendations for improvement of plant information management in order to facilitate ageing analyses are also given. The evaluation and mitigation of ageing effects on structural components is addressed by promoting the use of probabilistic modelling of crack growth, and developing models for evaluation of the reliability of inspection results. (orig.)

  1. Achieving High Reliability with People, Processes, and Technology.

    Science.gov (United States)

    Saunders, Candice L; Brennan, John A

    2017-01-01

    High reliability as a corporate value in healthcare can be achieved by meeting the "Quadruple Aim" of improving population health, reducing per capita costs, enhancing the patient experience, and improving provider wellness. This drive starts with the board of trustees, CEO, and other senior leaders who ingrain high reliability throughout the organization. At WellStar Health System, the board developed an ambitious goal to become a top-decile health system in safety and quality metrics. To achieve this goal, WellStar has embarked on a journey toward high reliability and has committed to Lean management practices consistent with the Institute for Healthcare Improvement's definition of a high-reliability organization (HRO): one that is committed to the prevention of failure, early identification and mitigation of failure, and redesign of processes based on identifiable failures. In the end, a successful HRO can provide safe, effective, patient- and family-centered, timely, efficient, and equitable care through a convergence of people, processes, and technology.

  3. Structural hybrid reliability index and its convergent solving method based on random–fuzzy–interval reliability model

    Directory of Open Access Journals (Sweden)

    Hai An

    2016-08-01

    Full Text Available. Aiming to resolve the problems posed by the variety of uncertainty variables that coexist in engineering structural reliability analysis, a new hybrid reliability index to evaluate structural hybrid reliability, based on the random–fuzzy–interval model, is proposed in this article, together with a convergent solving method. First, the truncated probability reliability model, the fuzzy random reliability model, and the non-probabilistic interval reliability model are introduced. Then, the new hybrid reliability index definition is presented based on the random–fuzzy–interval model. Furthermore, the calculation flowchart of the hybrid reliability index is presented, and the index is solved using the modified limit-step length iterative algorithm, which ensures convergence. The validity of the convergent algorithm for the hybrid reliability model is verified through calculation examples from the literature. In the end, a numerical example demonstrates that the hybrid reliability index is applicable for the wear reliability assessment of mechanisms, where truncated random variables, fuzzy random variables, and interval variables coexist. The demonstration also shows the good convergence of the iterative algorithm proposed in this article.

  4. Comparison of Model Reliabilities from Single-Step and Bivariate Blending Methods

    DEFF Research Database (Denmark)

    Taskinen, Matti; Mäntysaari, Esa; Lidauer, Martin

    2013-01-01

    Model based reliabilities in genetic evaluation are compared between three methods: animal model BLUP, single-step BLUP, and bivariate blending after genomic BLUP. The original bivariate blending is revised in this work to better account animal models. The study data is extracted from...... be calculated. Model reliabilities by the single-step and the bivariate blending methods were higher than by animal model due to genomic information. Compared to the single-step method, the bivariate blending method reliability estimates were, in general, lower. Computationally bivariate blending method was......, on the other hand, lighter than the single-step method....

  5. Reliability and Validity of the Footprint Assessment Method Using Photoshop CS5 Software.

    Science.gov (United States)

    Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam

    2015-05-01

    Several sophisticated methods of footprint analysis currently exist. However, it is sometimes useful to apply standard measurement methods of recognized evidence that are easy and quick to apply. We sought to assess the reliability and validity of a new method of footprint assessment in a healthy population using Photoshop CS5 software (Adobe Systems Inc, San Jose, California). Forty-two footprints, corresponding to 21 healthy individuals (11 men with a mean ± SD age of 20.45 ± 2.16 years and 10 women with a mean ± SD age of 20.00 ± 1.70 years), were analyzed. Footprints were recorded in a static bipedal standing position using optical podography and digital photography. Three trials were performed for each participant. The Hernández-Corvo, Chippaux-Smirak, and Staheli indices and the Clarke angle were calculated manually and by a computerized method using Photoshop CS5 software. Test-retest was used to determine reliability. Validity was assessed using the intraclass correlation coefficient (ICC). The reliability test for all of the indices showed high values (ICC, 0.98-0.99). Moreover, the validity test clearly showed no difference between techniques (ICC, 0.99-1). The reliability and validity of a method to measure, assess, and record podometric indices using Photoshop CS5 software have been demonstrated. This provides a quick and accurate tool for the digital recording of morphostatic foot study parameters and their control.
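Two of the indices named above reduce to simple width ratios once the footprint widths have been measured (in Photoshop or otherwise): the Chippaux-Smirak index is 100 × (minimum midfoot width / maximum forefoot width), and the Staheli index divides instead by the maximum heel width. An illustrative computation with made-up widths, not values from the study:

```python
def chippaux_smirak(midfoot_min_mm, forefoot_max_mm):
    """CSI = 100 * (minimum midfoot width / maximum forefoot width)."""
    return 100.0 * midfoot_min_mm / forefoot_max_mm

def staheli(midfoot_min_mm, heel_max_mm):
    """Staheli arch index = minimum midfoot width / maximum heel width."""
    return midfoot_min_mm / heel_max_mm

csi = chippaux_smirak(30.0, 95.0)   # hypothetical widths in mm
si = staheli(30.0, 60.0)
```

The software-based workflow in the paper amounts to reading these widths off the digitized footprint with pixel-to-mm calibration before applying the same ratios.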

  6. Reliability of the imaging software in the preoperative planning of the open-wedge high tibial osteotomy.

    Science.gov (United States)

    Lee, Yong Seuk; Kim, Min Kyu; Byun, Hae Won; Kim, Sang Bum; Kim, Jin Goo

    2015-03-01

    The purpose of this study was to verify a recently developed picture-archiving and communications system (PACS)-Photoshop method by comparing reliabilities between the real-size paper template and PACS-Photoshop methods in preoperative planning of open-wedge high tibial osteotomy. A prospective case series was conducted, including patients with medial osteoarthritis undergoing open-wedge high tibial osteotomy. In the preoperative planning, the PACS-Photoshop method and the real-size paper template method were used simultaneously in all patients. The preoperative hip-knee-ankle angle and the height and angle of the osteotomy were evaluated. The reliability of the newly devised method was evaluated, and the consistency between the two methods was assessed using the intraclass correlation coefficient. Using the PACS-Photoshop method, the mean correction angle and height of the osteotomy gap for rater 1 were 11.7° ± 3.6° and 10.7 ± 3.6 mm, respectively; for rater 2 they were 12.0° ± 2.6° and 10.8 ± 3.6 mm, respectively. The inter- and intra-rater reliabilities of the correction angle were 0.956-0.979 and 0.980-0.992, respectively, and those of the height of the osteotomy gap were 0.968-0.985 and 0.971-0.994, respectively. Using the real-size paper template method, the mean correction angle and height of the osteotomy gap were 11.9° ± 3.6° and 10.8 ± 3.6 mm, respectively. Consistency between the two methods, comparing the means of the correction angle and the height of the osteotomy gap, was 0.985 for both. The PACS-Photoshop method enables direct measurement of the height of the osteotomy gap with high reliability.

  7. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  8. High level issues in reliability quantification of safety-critical software

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2012-01-01

    For the purpose of developing a consensus method for the reliability assessment of safety-critical digital instrumentation and control systems in nuclear power plants, several high level issues in reliability assessment of the safety-critical software based on Bayesian belief network modeling and statistical testing are discussed. Related to the Bayesian belief network modeling, the relation between the assessment approach and the sources of evidence, the relation between qualitative evidence and quantitative evidence, how to consider qualitative evidence, and the cause-consequence relation are discussed. Related to the statistical testing, the need of the consideration of context-specific software failure probabilities and the inability to perform a huge number of tests in the real world are discussed. The discussions in this paper are expected to provide a common basis for future discussions on the reliability assessment of safety-critical software. (author)

  9. Results of a Demonstration Assessment of Passive System Reliability Utilizing the Reliability Method for Passive Systems (RMPS)

    Energy Technology Data Exchange (ETDEWEB)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia; Grelle, Austin

    2015-04-26

    Advanced small modular reactor designs include many advantageous design features such as passively driven safety systems that are arguably more reliable and cost effective relative to conventional active systems. Despite their attractiveness, a reliability assessment of passive systems can be difficult using conventional reliability methods due to the nature of passive systems. Simple deviations in boundary conditions can induce functional failures in a passive system, and intermediate or unexpected operating modes can also occur. As part of an ongoing project, Argonne National Laboratory is investigating various methodologies to address passive system reliability. The Reliability Method for Passive Systems (RMPS), a systematic approach for examining reliability, is one technique chosen for this analysis. This methodology is combined with the Risk-Informed Safety Margin Characterization (RISMC) approach to assess the reliability of a passive system and the impact of its associated uncertainties. For this demonstration problem, an integrated plant model of an advanced small modular pool-type sodium fast reactor with a passive reactor cavity cooling system is subjected to a station blackout using RELAP5-3D. This paper discusses important aspects of the reliability assessment, including deployment of the methodology, the uncertainty identification and quantification process, and identification of key risk metrics.

  10. High power klystrons for efficient reliable high power amplifiers

    Science.gov (United States)

    Levin, M.

    1980-11-01

    This report covers the design of reliable high efficiency, high power klystrons which may be used in both existing and proposed troposcatter radio systems. High Power (10 kW) klystron designs were generated in C-band (4.4 GHz to 5.0 GHz), S-band (2.5 GHz to 2.7 GHz), and L-band or UHF frequencies (755 MHz to 985 MHz). The tubes were designed for power supply compatibility and use with a vapor/liquid phase heat exchanger. Four (4) S-band tubes were developed in the course of this program along with two (2) matching focusing solenoids and two (2) heat exchangers. These tubes use five (5) tuners with counters which are attached to the focusing solenoids. A reliability mathematical model of the tube and heat exchanger system was also generated.

  11. Assessment of modern methods of human factor reliability analysis in PSA studies

    International Nuclear Information System (INIS)

    Holy, J.

    2001-12-01

    The report is structured as follows: Classical terms and objects (Probabilistic safety assessment as a framework for human reliability assessment; Human failure within the PSA model; Basic types of operator failure modelled in a PSA study and analyzed by HRA methods; Qualitative analysis of human reliability; Quantitative analysis of human reliability used; Process of analysis of nuclear reactor operator reliability in a PSA study); New terms and objects (Analysis of dependences; Errors of omission; Errors of commission; Error forcing context); and Overview and brief assessment of human reliability analysis (Basic characteristics of the methods; Assets and drawbacks of the use of each of HRA method; History and prospects of the use of the methods). (P.A.)

  12. Method for assessing reliability of a network considering probabilistic safety assessment

    International Nuclear Information System (INIS)

    Cepin, M.

    2005-01-01

    A method for assessing the reliability of a network is developed which uses the features of fault tree analysis. The method is constructed so that growth of the network under consideration does not require a significant enlargement of the model. The method is applied to small example networks consisting of a small number of nodes and a small number of connections. The results give the network reliability. They identify equipment that must be carefully maintained so that network reliability is not reduced, and equipment that is a candidate for redundancy, as adding redundancy there would improve network reliability significantly. (author)
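    The abstract does not give Cepin's model, but the fault-tree flavour of such a network assessment can be sketched with minimal cut sets and inclusion-exclusion; the component names and unavailabilities below are invented:

```python
from itertools import combinations

def system_unavailability(cut_sets, q):
    """Exact unavailability from minimal cut sets by inclusion-exclusion,
    assuming independent component failures with unavailabilities q."""
    total = 0.0
    for r in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, r):
            union = set().union(*combo)
            term = 1.0
            for comp in union:
                term *= q[comp]
            total += (-1) ** (r + 1) * term
    return total

# a connection that fails if both parallel lines fail or a shared breaker fails
q_sys = system_unavailability([{"L1", "L2"}, {"B"}],
                              {"L1": 0.1, "L2": 0.1, "B": 0.01})
```

    Here the single-component cut set {B} dominates the result, which is exactly the kind of maintenance-priority insight the abstract describes.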

  13. Assessment of the reliability of ultrasonic inspection methods

    International Nuclear Information System (INIS)

    Haines, N.F.; Langston, D.B.; Green, A.J.; Wilson, R.

    1982-01-01

    The reliability of NDT techniques has remained an open question for many years. A reliable technique may be defined as one that, when rigorously applied by a number of inspection teams, consistently finds and then correctly sizes all defects of concern. In this paper we report an assessment of the reliability of defect detection by manual ultrasonic methods applied to the inspection of thick section pressure vessel weldments. Initially we consider the available data relating to the inherent physical capabilities of ultrasonic techniques to detect cracks in weldments and then, independently, we assess the likely variability in team-to-team performance when several teams are asked to follow the same specified test procedure. The two aspects of 'capability' and 'variability' are brought together to provide quantitative estimates of the overall reliability of ultrasonic inspection of thick section pressure vessel weldments based on currently existing data. The final section of the paper considers current research programmes on reliability and presents a view on how these will help to further improve NDT reliability. (author)

  14. RELIABILITY ASSESSMENT OF ENTROPY METHOD FOR SYSTEM CONSISTED OF IDENTICAL EXPONENTIAL UNITS

    Institute of Scientific and Technical Information of China (English)

    Sun Youchao; Shi Jun

    2004-01-01

    The reliability assessment of unit-system near two levels is the most important content in the reliability multi-level synthesis of complex systems. Introducing information theory into system reliability assessment, and using the additive property of information quantity together with the principle of information quantity equivalence, an entropy method of data information conversion is presented for a system consisting of identical exponential units. The basic conversion formulae of the entropy method for unit test data are derived from the principle of information quantity equivalence. General models for entropy-method synthesis assessment of approximate lower limits of system reliability are established according to the fundamental principle of unit reliability assessment. Applications of the entropy method are discussed by way of practical examples. Compared with traditional methods, the entropy method is found to be valid and practicable, and the assessment results are very satisfactory.

  15. A Simple and Reliable Method of Design for Standalone Photovoltaic Systems

    Science.gov (United States)

    Srinivasarao, Mantri; Sudha, K. Rama; Bhanu, C. V. K.

    2017-06-01

    Standalone photovoltaic (SAPV) systems are seen as a promising means of electrifying areas of the developing world that lack power grid infrastructure. Proliferation of these systems requires a design procedure that is simple, reliable and exhibits good performance over the system's lifetime. The proposed methodology uses simple empirical formulae and easily available parameters to design SAPV systems, that is, array size with energy storage. After arriving at the different array sizes (areas), performance curves are obtained for optimal design of the SAPV system with a high degree of reliability in terms of autonomy at a specified value of loss of load probability (LOLP). Based on the array-to-load ratio (ALR) and levelized energy cost (LEC) through life cycle cost (LCC) analysis, it is shown that the proposed methodology gives better performance, requires simple data and is more reliable when compared with a conventional design using monthly average daily load and insolation.
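    The LEC/LCC comparison mentioned above follows a standard levelized-cost calculation, sketched below with placeholder cost figures rather than values from the paper:

```python
def levelized_energy_cost(capital, annual_om, annual_energy_kwh,
                          discount_rate, years):
    """LEC via life-cycle cost: discount O&M to present value, add capital,
    annualize with the capital recovery factor, divide by annual energy."""
    pv_om = sum(annual_om / (1 + discount_rate) ** t
                for t in range(1, years + 1))
    lcc = capital + pv_om
    crf = (discount_rate * (1 + discount_rate) ** years
           / ((1 + discount_rate) ** years - 1))
    return lcc * crf / annual_energy_kwh

# placeholder figures: $5000 array + storage, $100/yr O&M, 1500 kWh/yr load,
# 6 % discount rate, 20-year life
lec = levelized_energy_cost(5000.0, 100.0, 1500.0, 0.06, 20)
```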

  16. Reliability Estimation of the Pultrusion Process Using the First-Order Reliability Method (FORM)

    DEFF Research Database (Denmark)

    Baran, Ismet; Tutum, Cem Celal; Hattel, Jesper Henri

    2013-01-01

    In the present study the reliability estimation of the pultrusion process of a flat plate is analyzed by using the first order reliability method (FORM). The implementation of the numerical process model is validated by comparing the deterministic temperature and cure degree profiles...... with corresponding analyses in the literature. The centerline degree of cure at the exit (CDOCE) being less than a critical value and the maximum composite temperature (Tmax) during the process being greater than a critical temperature are selected as the limit state functions (LSFs) for the FORM. The cumulative...
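    FORM itself is standard, even though the paper's pultrusion limit state functions are not reproduced here; a minimal sketch of the Hasofer-Lind/Rackwitz-Fiessler iteration on a toy linear limit state:

```python
import math

def hl_rf(g, grad_g, n, tol=1e-8, max_iter=100):
    """Hasofer-Lind / Rackwitz-Fiessler iteration in standard normal space:
    repeatedly project onto the linearized limit state g(u) = 0."""
    u = [0.0] * n
    for _ in range(max_iter):
        gu = g(u)
        grad = grad_g(u)
        norm2 = sum(gi * gi for gi in grad)
        factor = (sum(gi * ui for gi, ui in zip(grad, u)) - gu) / norm2
        u_new = [factor * gi for gi in grad]
        if math.dist(u_new, u) < tol:
            u = u_new
            break
        u = u_new
    beta = math.sqrt(sum(ui * ui for ui in u))  # reliability index
    pf = 0.5 * math.erfc(beta / math.sqrt(2))   # Phi(-beta)
    return beta, pf

# toy linear limit state g(u) = 3 - u1 - u2, for which beta = 3 / sqrt(2)
beta, pf = hl_rf(lambda u: 3 - u[0] - u[1],
                 lambda u: [-1.0, -1.0], n=2)
```

    For a linear limit state the iteration converges in one step; in the paper's setting g would come from the thermo-chemical process model and random inputs would first be transformed to standard normal space.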

  17. A method to assign failure rates for piping reliability assessments

    International Nuclear Information System (INIS)

    Gamble, R.M.; Tagart, S.W. Jr.

    1991-01-01

    This paper reports on a simplified method that has been developed to assign failure rates that can be used in reliability and risk studies of piping. The method can be applied on a line-by-line basis by identifying line and location specific attributes that can lead to piping unreliability from in-service degradation mechanisms and random events. A survey of service experience for nuclear piping reliability also was performed. The data from this survey provides a basis for identifying in-service failure attributes and assigning failure rates for risk and reliability studies

  18. Reliability analysis of neutron transport simulation using Monte Carlo method

    International Nuclear Information System (INIS)

    Souza, Bismarck A. de; Borges, Jose C.

    1995-01-01

    This work presents a statistical and reliability analysis covering data obtained by computer simulation of neutron transport process, using the Monte Carlo method. A general description of the method and its applications is presented. Several simulations, corresponding to slowing down and shielding problems have been accomplished. The influence of the physical dimensions of the materials and of the sample size on the reliability level of results was investigated. The objective was to optimize the sample size, in order to obtain reliable results, optimizing computation time. (author). 5 refs, 8 figs
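    The sample-size question the abstract raises can be illustrated with a crude Monte Carlo unreliability estimate, whose standard error shrinks as 1/sqrt(N); the failure rates and mission time below are invented:

```python
import math
import random

def mc_unreliability(lams, t, n, seed=0):
    """Crude Monte Carlo estimate of series-system unreliability over
    mission time t, with the binomial standard error of the estimate."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        # the series system fails if any component lifetime falls below t
        if any(rng.expovariate(lam) < t for lam in lams):
            fails += 1
    p = fails / n
    se = math.sqrt(p * (1 - p) / n)  # standard error ~ 1/sqrt(n)
    return p, se

exact = 1 - math.exp(-(0.01 + 0.02) * 10)  # analytic answer for comparison
p, se = mc_unreliability([0.01, 0.02], t=10, n=200_000)
```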

  19. Statistical estimation Monte Carlo for unreliability evaluation of highly reliable system

    International Nuclear Information System (INIS)

    Xiao Gang; Su Guanghui; Jia Dounan; Li Tianduo

    2000-01-01

    Based on analog Monte Carlo simulation, statistical Monte Carlo methods for the unreliability evaluation of highly reliable systems are constructed, including a direct statistical estimation Monte Carlo method and a weighted statistical estimation Monte Carlo method. The basic element is given, and the statistical estimation Monte Carlo estimators are derived. The direct Monte Carlo simulation method, the bounding-sampling method, the forced transitions Monte Carlo method, direct statistical estimation Monte Carlo and weighted statistical estimation Monte Carlo are used to evaluate the unreliability of the same system. By comparison, the weighted statistical estimation Monte Carlo estimator has the smallest variance and the highest computational efficiency.
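    The variance comparison reported above can be reproduced in miniature with importance sampling standing in for the weighted estimator (a generic stand-in, not the paper's exact weighting scheme): for a rare event the analog estimator almost never scores a hit, while the weighted one converges:

```python
import math
import random

def analog_mc(c, n, rng):
    """Analog (direct) Monte Carlo estimate of P(X > c), X ~ Exp(1)."""
    return sum(rng.expovariate(1.0) > c for _ in range(n)) / n

def importance_sampling(c, n, rng, theta=0.05):
    """Weighted estimate: sample from Exp(theta) instead, and reweight by
    the likelihood ratio f(x)/q(x) = exp(-(1 - theta) * x) / theta."""
    total = 0.0
    for _ in range(n):
        x = rng.expovariate(theta)
        if x > c:
            total += math.exp(-(1.0 - theta) * x) / theta
    return total / n

rng = random.Random(1)
exact = math.exp(-20)                  # P(Exp(1) > 20), about 2.1e-9
p_analog = analog_mc(20, 50_000, rng)  # almost surely 0: event never sampled
p_is = importance_sampling(20, 50_000, rng)
```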

  20. Human reliability analysis methods for probabilistic safety assessment

    International Nuclear Information System (INIS)

    Pyy, P.

    2000-11-01

    Human reliability analysis (HRA) of a probabilistic safety assessment (PSA) includes identifying human actions from safety point of view, modelling the most important of them in PSA models, and assessing their probabilities. As manifested by many incidents and studies, human actions may have both positive and negative effect on safety and economy. Human reliability analysis is one of the areas of probabilistic safety assessment (PSA) that has direct applications outside the nuclear industry. The thesis focuses upon developments in human reliability analysis methods and data. The aim is to support PSA by extending the applicability of HRA. The thesis consists of six publications and a summary. The summary includes general considerations and a discussion about human actions in the nuclear power plant (NPP) environment. A condensed discussion about the results of the attached publications is then given, including new development in methods and data. At the end of the summary part, the contribution of the publications to good practice in HRA is presented. In the publications, studies based on the collection of data on maintenance-related failures, simulator runs and expert judgement are presented in order to extend the human reliability analysis database. Furthermore, methodological frameworks are presented to perform a comprehensive HRA, including shutdown conditions, to study reliability of decision making, and to study the effects of wrong human actions. In the last publication, an interdisciplinary approach to analysing human decision making is presented. The publications also include practical applications of the presented methodological frameworks. (orig.)

  1. A reliability evaluation method for NPP safety DCS application software

    International Nuclear Information System (INIS)

    Li Yunjian; Zhang Lei; Liu Yuan

    2014-01-01

    In the field of nuclear power plant (NPP) digital I&C applications, reliability evaluation of safety DCS application software is a key obstacle to be removed. In order to quantitatively evaluate the reliability of NPP safety DCS application software, this paper proposes an evaluation method based on the V&V defect density characteristics of each stage of the software development life cycle, by which the operating reliability level of the software can be predicted before delivery, helping to improve the reliability of NPP safety-important software. (authors)

  2. Characteristics and application study of AP1000 NPPs equipment reliability classification method

    International Nuclear Information System (INIS)

    Guan Gao

    2013-01-01

    The AP1000 nuclear power plant applies an integrated approach to establishing equipment reliability classification, which includes probabilistic risk assessment techniques, Maintenance Rule administrative requirements, power production reliability classification and a functional equipment group bounding method, eventually classifying equipment reliability into four levels. This classification process and its results are very different from classical RCM and streamlined RCM. The paper studies the characteristics of the AP1000 equipment reliability classification approach, argues that equipment reliability classification should effectively support maintenance strategy development and work process control, and recommends using a combined RCM method to establish the future equipment reliability program of AP1000 nuclear power plants. (authors)

  3. Reliability and risk analysis methods research plan

    International Nuclear Information System (INIS)

    1984-10-01

    This document presents a plan for reliability and risk analysis methods research to be performed mainly by the Reactor Risk Branch (RRB), Division of Risk Analysis and Operations (DRAO), Office of Nuclear Regulatory Research. It includes those activities of other DRAO branches which are very closely related to those of the RRB. Related or interfacing programs of other divisions, offices and organizations are merely indicated. The primary use of this document is envisioned as an NRC working document, covering about a 3-year period, to foster better coordination in reliability and risk analysis methods development between the offices of Nuclear Regulatory Research and Nuclear Reactor Regulation. It will also serve as an information source for contractors and others to more clearly understand the objectives, needs, programmatic activities and interfaces together with the overall logical structure of the program

  4. Matrix-based system reliability method and applications to bridge networks

    International Nuclear Information System (INIS)

    Kang, W.-H.; Song Junho; Gardoni, Paolo

    2008-01-01

    Using a matrix-based system reliability (MSR) method, one can estimate the probabilities of complex system events by simple matrix calculations. Unlike existing system reliability methods whose complexity depends highly on that of the system event, the MSR method describes any general system event in a simple matrix form and therefore provides a more convenient way of handling the system event and estimating its probability. Even in the case where one has incomplete information on the component probabilities and/or the statistical dependence thereof, the matrix-based framework enables us to estimate the narrowest bounds on the system failure probability by linear programming. This paper presents the MSR method and applies it to a transportation network consisting of bridge structures. The seismic failure probabilities of bridges are estimated by use of the predictive fragility curves developed by a Bayesian methodology based on experimental data and existing deterministic models of the seismic capacity and demand. Using the MSR method, the probability of disconnection between each city/county and a critical facility is estimated. The probability mass function of the number of failed bridges is computed as well. In order to quantify the relative importance of bridges, the MSR method is used to compute the conditional probabilities of bridge failures given that there is at least one city disconnected from the critical facility. The bounds on the probability of disconnection are also obtained for cases with incomplete information
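    A brute-force sketch of the MSR bookkeeping for the classic five-component bridge network: enumerate the joint component states (the event vector c), weigh each state by its joint probability (the vector p), and sum, i.e. P = c^T p. The component reliabilities are made up:

```python
from itertools import product

def msr_probability(system_event, comp_probs):
    """Matrix-based system reliability in brute force: P(system) = c^T p,
    summing joint-state probabilities over states in the system event."""
    n = len(comp_probs)
    prob = 0.0
    for state in product([1, 0], repeat=n):
        # c entry: 1 if this joint state belongs to the system event
        if system_event(state):
            # p entry: joint probability, assuming independent components
            p = 1.0
            for s, r in zip(state, comp_probs):
                p *= r if s else (1 - r)
            prob += p
    return prob

# five-component "bridge" network: source and terminal are connected
# through paths {1,4}, {2,5}, {1,3,5} or {2,3,4}
def bridge_ok(x):
    return bool((x[0] and x[3]) or (x[1] and x[4]) or
                (x[0] and x[2] and x[4]) or (x[1] and x[2] and x[3]))

rel = msr_probability(bridge_ok, [0.9] * 5)
```

    The MSR method avoids this exponential enumeration by organizing the same c and p vectors with matrix operations and multi-scale decomposition, but the probability being computed is the same.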

  5. A criterion of the performance of thermometric systems of high metrological reliability

    International Nuclear Information System (INIS)

    Sal'nikov, N.L.; Filimonov, E.V.

    1995-01-01

    Monitoring temperature regimes is an important part of ensuring the operational safety of a nuclear power plant. Therefore, high standards are imposed upon the reliability of the primary information on the heat field of the object obtained from different sensors, and it is urgent to develop methods of evaluating the metrological reliability of these sensors. The main sources of thermometric information at nuclear power plants are contact temperature sensors, the most widely used of these being thermoelectric converters (TEC) and thermal resistance converters (TRC)

  6. Reliability research to nuclear power plant operators based on several methods

    International Nuclear Information System (INIS)

    Fang Xiang; Li Fu; Zhao Bingquan

    2009-01-01

    The paper draws on many international reliability research methods and reviews the reliability research on Chinese nuclear power plant operators over the past ten-plus years, based on nuclear power plant simulator platforms. It shows the necessity and feasibility of research on nuclear power plant operators from many angles, including human cognitive reliability, fuzzy mathematical models and psychological research models, etc. Applying these varied research methods to the reliability of nuclear power plant operators will benefit the safe operation of nuclear power plants. (authors)

  7. Investigation of MLE in nonparametric estimation methods of reliability function

    International Nuclear Information System (INIS)

    Ahn, Kwang Won; Kim, Yoon Ik; Chung, Chang Hyun; Kim, Kil Yoo

    2001-01-01

    There have been many attempts to estimate a reliability function. At the 20th ESReDA seminar, a new nonparametric method was proposed, whose major point is how to use censored data efficiently. Generally there are three kinds of approaches for estimating a reliability function nonparametrically, i.e., the Reduced Sample Method, the Actuarial Method and the Product-Limit (PL) Method. These three methods have limitations, so we suggest an advanced method that reflects censoring information more efficiently. In many instances there is a unique maximum likelihood estimator (MLE) of an unknown parameter, and often it may be obtained by differentiation. It is well known that the three methods generally used to estimate a reliability function nonparametrically have uniquely existing maximum likelihood estimators. So the MLE of the new method is derived in this study. The procedure to calculate the MLE is similar to that of the PL estimator; the difference is that in the new method the mass (or weight) of each observation influences the others, whereas in the PL estimator it does not
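    The Product-Limit (Kaplan-Meier) estimator discussed above can be sketched directly; the lifetimes below are invented, and the paper's modified weighting scheme is not reproduced:

```python
def product_limit(times, events):
    """Kaplan-Meier (Product-Limit) estimate of the reliability function.
    times: observed times; events: 1 = failure, 0 = censored."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = at_t = 0
        # group tied observations at time t
        while i < len(data) and data[i][0] == t:
            at_t += 1
            deaths += data[i][1]
            i += 1
        if deaths:
            # multiply in the conditional survival factor for this time
            surv *= 1 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= at_t
    return curve

# small example with one censored observation (event flag 0)
curve = product_limit([2, 3, 3, 5, 8], [1, 1, 0, 1, 1])
```

    The censored observation at t = 3 leaves the risk set without contributing a failure, which is how the PL method "uses" censored data.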

  8. Comprehensive reliability allocation method for CNC lathes based on cubic transformed functions of failure mode and effects analysis

    Science.gov (United States)

    Yang, Zhou; Zhu, Yunpeng; Ren, Hongrui; Zhang, Yimin

    2015-03-01

    Reliability allocation of computer numerical control (CNC) lathes is very important in industry. Traditional allocation methods focus only on high-failure-rate components rather than moderate-failure-rate components, which is not applicable in some conditions. To solve the problem of allocating the reliability of CNC lathes, a comprehensive reliability allocation method based on cubic transformed functions of failure modes and effects analysis (FMEA) is presented. First, conventional reliability allocation methods are introduced. Then the limitations of directly combining the comprehensive allocation method with the exponentially transformed FMEA method are investigated. Subsequently, a cubic transformed function is established to overcome these limitations. Properties of the new transformed functions are discussed by considering failure severity and failure occurrence, and designers can choose appropriate transform amplitudes according to their requirements. Finally, a CNC lathe and a spindle system are used as an example to verify the new allocation method. Seven criteria are considered to compare the results of the new method with traditional methods. The allocation results indicate that the new method is more flexible than traditional methods. By employing the new cubic transformed function, the method covers a wider range of problems in CNC reliability allocation without losing the advantages of traditional methods.
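    The abstract does not specify the cubic transformed functions themselves, so the sketch below shows only the conventional weighting-factor allocation that such methods build on, with invented FMEA-style weights; a component with a larger weight receives a larger share of the failure budget (a lower allocated reliability):

```python
def allocate_reliability(system_target, weights):
    """Weighting-factor allocation for a series system: component i gets
    R_i = R_s ** (w_i / sum(w)), so the product of all R_i equals R_s."""
    total = sum(weights)
    return [system_target ** (w / total) for w in weights]

# invented weights, e.g. derived from FMEA severity x occurrence scores
allocs = allocate_reliability(0.95, [4.0, 2.0, 1.0, 1.0])
```

    The cubic transform in the paper reshapes how FMEA scores map into these weights; the allocation mechanics stay the same.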

  9. Fast Reliability Assessing Method for Distribution Network with Distributed Renewable Energy Generation

    Science.gov (United States)

    Chen, Fan; Huang, Shaoxiong; Ding, Jinjin; Ding, Jinjin; Gao, Bo; Xie, Yuguang; Wang, Xiaoming

    2018-01-01

    This paper proposes a fast reliability assessment method for distribution grids with distributed renewable energy generation. First, the Weibull distribution and the Beta distribution are used to describe the probability distribution characteristics of wind speed and solar irradiance, respectively, and models of the wind farm, solar park and local load are built for reliability assessment. Then, based on power system production cost simulation, probability discretization and linearized power flow, an optimal power flow problem whose objective is the minimum cost of conventional power generation is solved. Thus the reliability assessment of a distribution grid is implemented quickly and accurately. The Loss Of Load Probability (LOLP) and Expected Energy Not Supplied (EENS) are selected as the reliability indices; a simulation of the IEEE RBTS BUS6 system in MATLAB indicates that the fast method calculates the reliability indices much faster than the Monte Carlo method while maintaining accuracy.
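    As a point of comparison, the Monte Carlo baseline for LOLP/EENS can be sketched sequentially (the paper's fast method is analytic, via probability discretization): wind speed is drawn from a Weibull distribution and mapped through a turbine power curve. All capacities and distribution parameters below are invented:

```python
import random

def lolp_eens(n_hours, rng, load=80.0, conv_cap=60.0,
              v_in=3.0, v_rated=12.0, v_out=25.0, wind_rated=40.0):
    """Hourly Monte Carlo sketch: sample Weibull wind speed, convert it to
    wind power with a piecewise turbine curve, and accumulate LOLP and EENS."""
    shortfall_hours, eens = 0, 0.0
    for _ in range(n_hours):
        v = rng.weibullvariate(8.0, 2.0)  # scale 8 m/s, shape 2
        if v < v_in or v >= v_out:
            wind = 0.0                    # below cut-in or above cut-out
        elif v < v_rated:
            wind = wind_rated * (v - v_in) / (v_rated - v_in)
        else:
            wind = wind_rated
        avail = conv_cap + wind
        if avail < load:
            shortfall_hours += 1
            eens += load - avail          # unserved energy this hour
    return shortfall_hours / n_hours, eens / n_hours

rng = random.Random(42)
lolp, eens_rate = lolp_eens(100_000, rng)
```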

  10. Modifying nodal pricing method considering market participants optimality and reliability

    Directory of Open Access Journals (Sweden)

    A. R. Soofiabadi

    2015-06-01

    Full Text Available This paper develops a method for nodal pricing and a market clearing mechanism that considers the reliability of the system. The effects of component reliability on electricity price, market participants’ profits and system social welfare are considered. The paper treats reliability both in evaluating market participants’ optimality and in fair pricing and market clearing. To achieve fair pricing, the nodal price is obtained through a two-stage optimization problem, and to achieve a fair market clearing mechanism, comprehensive criteria are introduced for evaluating the optimality of market participants. The social welfare and efficiency of the system increase under the proposed modified nodal pricing method.

  11. Column Grid Array Rework for High Reliability

    Science.gov (United States)

    Mehta, Atul C.; Bodie, Charles C.

    2008-01-01

    Due to requirements for reduced size and weight, the use of grid array packages in space applications has become commonplace. To meet the requirements of high reliability and a high number of I/Os, ceramic column grid array (CCGA) packages were selected for major electronic components used in the next Mars Rover mission (specifically, high density Field Programmable Gate Arrays). The probability of removal and replacement of these devices on the actual flight printed wiring board assemblies is deemed very high, because last-minute discoveries in final test will dictate changes in the firmware. The questions and challenges presented to the manufacturing organizations engaged in the production of high reliability electronic assemblies are: Is the reliability of the PWBA adversely affected by rework (removal and replacement) of the CGA package? And how many times can the same board be reworked without destroying a pad or degrading the lifetime of the assembly? To answer these questions, the most complex printed wiring board assembly used by the project was chosen as the test vehicle; the PWB was modified to provide a daisy chain pattern, and a number of bare PWBs were acquired to this modified design. Non-functional 624-pin CGA packages with internal daisy chains matching the pattern on the PWB were procured. The combination of the modified PWB and the daisy-chained packages enables continuity measurements of every soldered contact during subsequent testing and thermal cycling. Several test vehicle boards were assembled, reworked and then thermally cycled to assess the reliability of the solder joints and board material, including pads and traces near the CGA. The details of the rework process and the results of thermal cycling are presented in this paper.

  12. Assessment of Advanced Life Support competence when combining different test methods--reliability and validity

    DEFF Research Database (Denmark)

    Ringsted, C; Lippert, F; Hesselfeldt, R

    2007-01-01

    Cardiac Arrest Simulation Test (CASTest) scenarios for the assessments according to guidelines 2005. AIMS: To analyse the reliability and validity of the individual sub-tests provided by ERC and to find a combination of MCQ and CASTest that provides a reliable and valid single effect measure of ALS...... that possessed high reliability, equality of test sets, and ability to discriminate between the two groups of supposedly different ALS competence. CONCLUSIONS: ERC sub-tests of ALS competence possess sufficient reliability and validity. A combined ALS score with equal weighting of one MCQ and one CASTest can...... competence. METHODS: Two groups of participants were included in this randomised, controlled experimental study: a group of newly graduated doctors, who had not taken the ALS course (N=17) and a group of students, who had passed the ALS course 9 months before the study (N=16). Reliability in terms of inter...

  13. GNSS Single Frequency, Single Epoch Reliable Attitude Determination Method with Baseline Vector Constraint

    Directory of Open Access Journals (Sweden)

    Ang Gong

    2015-12-01

    Full Text Available For Global Navigation Satellite System (GNSS) single frequency, single epoch attitude determination, this paper proposes a new reliable method with a baseline vector constraint. First, prior knowledge of baseline length, heading, and pitch obtained from other navigation equipment or sensors is used to reconstruct the objective function rigorously. Then, the search strategy is improved: a gradually enlarged ellipsoidal search space is substituted for the non-ellipsoidal search space to ensure the correct ambiguity candidates are within it, which lets the search be carried out directly by the least-squares ambiguity decorrelation adjustment (LAMBDA) method. For all vector candidates, some are further eliminated by a derived approximate inequality, which accelerates the search. Experimental results show that, compared to the traditional method with only a baseline length constraint, the new method can exploit a priori three-dimensional baseline knowledge to fix ambiguities reliably and achieve a high success rate. Experimental tests also verify that it is not very sensitive to baseline vector error and performs robustly when the angular error is not great.
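    As a heavily simplified illustration of a constrained ambiguity search (not LAMBDA's decorrelation step), the toy below enumerates integer candidates around the rounded float solution and uses a baseline-length prior to reject wrong candidates before comparing quadratic-form costs; the wavelength-style ambiguity-to-baseline mapping and the covariance are invented:

```python
import math
from itertools import product

def search_with_length_constraint(a_float, q_inv, amb_to_baseline,
                                  prior_len, tol, radius=2):
    """Toy integer-ambiguity search: enumerate candidates around the rounded
    float solution, keep those whose implied baseline length matches the
    prior, and return the one minimizing the quadratic form."""
    n = len(a_float)
    base = [round(x) for x in a_float]
    best, best_cost = None, float("inf")
    for offset in product(range(-radius, radius + 1), repeat=n):
        cand = [b + o for b, o in zip(base, offset)]
        # the baseline-length constraint rejects wrong candidates early
        bx = amb_to_baseline(cand)
        if abs(math.hypot(*bx) - prior_len) > tol:
            continue
        d = [c - f for c, f in zip(cand, a_float)]
        cost = sum(d[i] * q_inv[i][j] * d[j]
                   for i in range(n) for j in range(n))
        if cost < best_cost:
            best, best_cost = cand, cost
    return best

# hypothetical 2-ambiguity case: baseline components = 0.19 * ambiguity
amb = search_with_length_constraint(
    [4.7, -2.6],
    [[2.0, 0.0], [0.0, 2.0]],
    lambda a: [0.19 * a[0], 0.19 * a[1]],
    prior_len=1.108, tol=0.05)
```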

  14. REMOTE SENSING APPLICATIONS WITH HIGH RELIABILITY IN CHANGJIANG WATER RESOURCE MANAGEMENT

    Directory of Open Access Journals (Sweden)

    L. Ma

    2018-04-01

    Full Text Available Remote sensing technology has been widely used in many fields, but most applications cannot obtain information with high reliability and high accuracy at large scale, especially applications using automatic interpretation methods. We have designed an application-oriented technology system (PIR) composed of a series of accurate interpretation techniques, which can achieve over 85 % correctness in Water Resource Management from the viewpoints of photogrammetry and expert knowledge. The techniques consist of spatial positioning techniques from the viewpoint of photogrammetry, feature interpretation techniques from the viewpoint of expert knowledge, and rationality analysis techniques from the viewpoint of data mining. Each interpreted polygon is accurate enough to be applied to accuracy-sensitive projects, such as the Three Gorges Project and the South-to-North Water Diversion Project. In this paper, we present several remote sensing applications with high reliability in Changjiang Water Resource Management, including water pollution investigation, illegal construction inspection, and water conservation monitoring, etc.

  15. Remote Sensing Applications with High Reliability in Changjiang Water Resource Management

    Science.gov (United States)

    Ma, L.; Gao, S.; Yang, A.

    2018-04-01

    Remote sensing technology has been widely used in many fields, but most applications cannot obtain information with high reliability and high accuracy at large scale, especially applications using automatic interpretation methods. We have designed an application-oriented technology system (PIR) composed of a series of accurate interpretation techniques, which can achieve over 85 % correctness in Water Resource Management from the viewpoints of photogrammetry and expert knowledge. The techniques consist of spatial positioning techniques from the viewpoint of photogrammetry, feature interpretation techniques from the viewpoint of expert knowledge, and rationality analysis techniques from the viewpoint of data mining. Each interpreted polygon is accurate enough to be applied to accuracy-sensitive projects, such as the Three Gorges Project and the South-to-North Water Diversion Project. In this paper, we present several remote sensing applications with high reliability in Changjiang Water Resource Management, including water pollution investigation, illegal construction inspection, and water conservation monitoring, etc.

  16. Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine

    DEFF Research Database (Denmark)

    Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon

    2013-01-01

    as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment and in the conduct of clinical studies. Several reliability studies are conducted in western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine...

  17. Level III Reliability methods feasible for complex structures

    NARCIS (Netherlands)

    Waarts, P.H.; Boer, A. de

    2001-01-01

    The paper describes the comparison between three types of reliability methods: code-type level I as used by a designer, full level I, and a level III method. Two cases that are typical for civil engineering practice, a cable-stayed bridge subjected to traffic load and the installation of a soil retaining sheet

  18. Recent advances in computational structural reliability analysis methods

    Science.gov (United States)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-10-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
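The probability-of-failure framing above can be illustrated with a minimal Monte Carlo sketch for a single limit state g = R - S, where R is resistance and S is load effect. The normal distributions and their parameters below are assumptions chosen for illustration, not values from the paper.

```python
import random

def monte_carlo_pf(n_samples=200_000, seed=0):
    """Estimate P(failure) = P(R - S <= 0) by direct sampling.
    R ~ N(30, 3) and S ~ N(20, 4) are hypothetical resistance and
    load-effect distributions in consistent units."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        r = rng.gauss(30.0, 3.0)   # resistance
        s = rng.gauss(20.0, 4.0)   # load effect
        if r - s <= 0.0:
            failures += 1
    return failures / n_samples
```

For these parameters the margin R - S is N(10, 5), so the exact answer is Φ(-2) ≈ 0.023; the simulation should land close to it, which is the kind of quantitative statement the safety-factor approach cannot make.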

  19. Assessment and Improving Methods of Reliability Indices in Bakhtar Regional Electricity Company

    Directory of Open Access Journals (Sweden)

    Saeed Shahrezaei

    2013-04-01

    Full Text Available Reliability of a system is its ability to perform its expected duties in the future, i.e., the probability of desirable operation while carrying out predetermined duties. Failure data for power system elements are the main input for reliability assessment of the network. The goal of reliability assessment is to determine characteristic reliability parameters from system history data; these parameters help to recognize weak points of the system. In other words, the goal of reliability assessment is to improve operation and to decrease failures and power outages. This paper assesses the reliability indices of Bakhtar Regional Electricity Company up to 1393 (Iranian calendar) and the improving methods and their effects on the reliability indices in this network. DIgSILENT PowerFactory software is employed for simulation. Simulation results show the positive effect of the improving methods on the reliability indices of Bakhtar Regional Electricity Company.
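As a sketch of how such distribution reliability indices are computed, the snippet below evaluates SAIFI and SAIDI (the standard IEEE 1366 definitions) from a list of outage events. The outage data and customer count are invented, not taken from the Bakhtar network.

```python
def reliability_indices(outages, total_customers):
    """Compute SAIFI and SAIDI from a list of outage events.
    Each outage is (customers_affected, duration_hours).
    SAIFI = total customer interruptions / customers served;
    SAIDI = total customer interruption hours / customers served."""
    total_interruptions = sum(c for c, _ in outages)
    customer_hours = sum(c * d for c, d in outages)
    saifi = total_interruptions / total_customers   # interruptions/customer/yr
    saidi = customer_hours / total_customers        # hours/customer/yr
    return saifi, saidi

# Hypothetical example: three outages in a 10,000-customer network.
saifi, saidi = reliability_indices([(500, 2.0), (1200, 0.5), (300, 4.0)], 10_000)
```

Improvement measures of the kind the paper studies would show up directly as reductions in these two numbers.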

  20. Delivering high performance BWR fuel reliably

    International Nuclear Information System (INIS)

    Schardt, J.F.

    1998-01-01

    Utilities are under intense pressure to reduce their production costs in order to compete in the increasingly deregulated marketplace. They need fuel that can deliver high performance to meet demanding operating strategies. GE's latest BWR fuel design, GE14, provides that high performance capability. GE's product introduction process assures that this performance will be delivered reliably, with little risk to the utility. (author)

  1. Review of Quantitative Software Reliability Methods

    Energy Technology Data Exchange (ETDEWEB)

    Chu, T.L.; Yue, M.; Martinez-Guridi, M.; Lehner, J.

    2010-09-17

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process for digital systems rests on deterministic engineering criteria. In its 1995 probabilistic risk assessment (PRA) policy statement, the Commission encouraged the use of PRA technology in all regulatory matters to the extent supported by the state-of-the-art in PRA methods and data. Although many activities have been completed in the area of risk-informed regulation, the risk-informed analysis process for digital systems has not yet been satisfactorily developed. Since digital instrumentation and control (I&C) systems are expected to play an increasingly important role in nuclear power plant (NPP) safety, the NRC established a digital system research plan that defines a coherent set of research programs to support its regulatory needs. One of the research programs included in the NRC's digital system research plan addresses risk assessment methods and data for digital systems. Digital I&C systems have some unique characteristics, such as using software, and may have different failure causes and/or modes than analog I&C systems; hence, their incorporation into NPP PRAs entails special challenges. The objective of the NRC's digital system risk research is to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems into NPP PRAs, and (2) using information on the risks of digital systems to support the NRC's risk-informed licensing and oversight activities. For several years, Brookhaven National Laboratory (BNL) has worked on NRC projects to investigate methods and tools for the probabilistic modeling of digital systems, as documented mainly in NUREG/CR-6962 and NUREG/CR-6997. However, the scope of this research principally focused on hardware failures, with limited reviews of software failure experience and software reliability methods. NRC also sponsored research at the Ohio State University investigating the modeling of

  2. An Intelligent Method for Structural Reliability Analysis Based on Response Surface

    Institute of Scientific and Technical Information of China (English)

    桂劲松; 刘红; 康海贵

    2004-01-01

    As water depth increases, the structural safety and reliability of a system become more and more important and challenging. Therefore, structural reliability methods must be applied in ocean engineering design, such as offshore platform design. If the performance function is known in a structural reliability analysis, the first-order second-moment method is often used. If the performance function cannot be expressed explicitly, the response surface method is used instead, because its train of thought is clear and its programming is simple. However, the traditional response surface method fits a response surface of quadratic polynomials, whose accuracy is limited because the true limit state surface can be fitted well only in the area near the checking point. In this paper, an intelligent computing method based on the whole response surface is proposed for situations where the performance function cannot be expressed explicitly in structural reliability analysis. In this method, a fuzzy-neural-network response surface for the whole area is constructed first, and then the structural reliability is calculated by a genetic algorithm. Because all the sample points for training the network come from the whole area, the true limit state surface over the whole area can be fitted. Calculational examples and comparative analysis show that the proposed method is much better than the traditional response surface method of quadratic polynomials: the amount of finite element analysis is greatly reduced, the accuracy of calculation is improved, and the true limit state surface can be fitted very well over the whole area. The method proposed in this paper is therefore suitable for engineering application.
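The response-surface idea can be sketched in one dimension: evaluate the (notionally expensive) performance function at a few support points, fit a polynomial surrogate, and run Monte Carlo on the surrogate instead of the true model. The toy performance function, support points, and input distribution below are assumptions for illustration; the paper's actual method replaces the polynomial with a fuzzy neural network trained over the whole area and uses a genetic algorithm.

```python
import random

def true_g(x):
    """Stand-in for an expensive response (e.g., a finite element run)."""
    return 5.0 - x - 0.2 * x * x

def fit_quadratic(xs, ys):
    """Fit g(x) ~ a + b*x + c*x^2 exactly through three support points
    (Newton divided differences, expanded to monomial coefficients)."""
    (x0, x1, x2), (y0, y1, y2) = xs, ys
    c1 = (y1 - y0) / (x1 - x0)
    c2 = ((y2 - y1) / (x2 - x1) - c1) / (x2 - x0)
    a = y0 - c1 * x0 + c2 * x0 * x1
    b = c1 - c2 * (x0 + x1)
    return a, b, c2

# Build the surrogate from only three calls to the expensive model.
xs = (-3.0, 0.0, 3.0)
a, b, c = fit_quadratic(xs, tuple(true_g(x) for x in xs))

# Monte Carlo on the cheap surrogate: P(g <= 0) with x ~ N(0, 1.5).
rng = random.Random(1)
n = 100_000
fails = 0
for _ in range(n):
    x = rng.gauss(0.0, 1.5)
    if a + b * x + c * x * x <= 0.0:
        fails += 1
pf = fails / n
```

Here the toy model happens to be quadratic, so the surrogate is exact; the paper's point is that for a general limit state a quadratic fitted near one checking point is not, which motivates a whole-area surrogate.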

  3. Reliability Evaluation of Bridges Based on Nonprobabilistic Response Surface Limit Method

    OpenAIRE

    Chen, Xuyong; Chen, Qian; Bian, Xiaoya; Fan, Jianping

    2017-01-01

    Due to many uncertainties in nonprobabilistic reliability assessment of bridges, the limit state function is generally unknown. The traditional nonprobabilistic response surface method is a lengthy and oscillating iteration process and makes the nonprobabilistic reliability index difficult to solve. This article proposes a nonprobabilistic response surface limit method based on the interval model. The intention of this method is to solve the upper and lower limits of the nonprobabilistic ...

  4. Survey of methods used to asses human reliability in the human factors reliability benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1988-01-01

    The Joint Research Centre of the European Commission has organised a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim to assess the state-of-the-art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participate in the HF-RBE, which is organised around two study cases: (1) analysis of routine functional test and maintenance procedures, with the aim to assess the probability of test-induced failures, the probability of failures to remain unrevealed, and the potential to initiate transients because of errors performed in the test; and (2) analysis of human actions during an operational transient, with the aim to assess the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. The paper briefly reports how the HF-RBE was structured and gives an overview of the methods that have been used for predicting human reliability in both study cases. The experience in applying these methods is discussed and the results obtained are compared. (author)

  5. Limitations in simulator time-based human reliability analysis methods

    International Nuclear Information System (INIS)

    Wreathall, J.

    1989-01-01

    Developments in human reliability analysis (HRA) methods have evolved slowly. Current methods are little changed from those of almost a decade ago, particularly in the use of time-reliability relationships. While these methods were suitable as an interim step, the time (and the need) has come to specify the next evolution of HRA methods. As with any performance-oriented data source, power plant simulator data have no direct connection to HRA models. Errors reported in data are normal deficiencies observed in human performance; failures are events modeled in probabilistic risk assessments (PRAs). Not all errors cause failures; not all failures are caused by errors. Second, the times at which actions are taken provide no measure of the likelihood of failure to act correctly within an accident scenario. Inferences can be made about human reliability, but they must be made with great care. Specific limitations are discussed. Simulator performance data are useful in providing qualitative evidence of the variety of error types and their potential influences on operating systems. More work is required to combine recent developments in the psychology of error with the qualitative data collected at simulators. Until data become openly available, however, such an advance will not be practical.
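A time-reliability relationship of the kind criticized here is typically a single curve giving the probability that the crew has not yet responded by time t; a common functional form is a lognormal response-time distribution. The sketch below uses illustrative parameters only and is not tied to any plant data.

```python
import math

def p_nonresponse(t, median, sigma=0.8):
    """Probability the operators have NOT acted by time t (minutes),
    assuming lognormally distributed response time T: returns P(T > t).
    `median` and `sigma` are illustrative assumptions."""
    if t <= 0:
        return 1.0
    z = (math.log(t) - math.log(median)) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))   # standard normal tail
```

By construction the curve passes through 0.5 at the median time and decays toward zero as t grows, which illustrates the limitation in the abstract: such a curve encodes timing only and says nothing about which errors occur or why.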

  6. Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine: An overview

    Science.gov (United States)

    Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon; Prasad, Ramjee

    2013-01-01

    Recently, a need to develop supportive new scientific evidence for contemporary Ayurveda has emerged. One of the research objectives is an assessment of the reliability of diagnoses and treatment. Reliability is a quantitative measure of consistency. It is a crucial issue in classification (such as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment and in the conduct of clinical studies. Several reliability studies are conducted in western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine diagnoses is in the formative stage. However, reliability studies in Ayurveda are in the preliminary stage. In this paper, examples are provided to illustrate relevant concepts of reliability studies of diagnostic methods and their implication in practice, education, and training. An introduction to reliability estimates and different study designs and statistical analysis is given for future studies in Ayurveda. PMID:23930037

  7. Reliably detectable flaw size for NDE methods that use calibration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-04-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh18232 POD software give the most common methods of POD analysis. In this paper, POD analysis is applied to an NDE method, such as eddy current testing, where calibration is used. NDE calibration standards have known-size artificial flaws such as electro-discharge machined (EDM) notches and flat bottom hole (FBH) reflectors, which are used to set instrument sensitivity for detection of real flaws. Real flaws such as cracks and crack-like flaws are desired to be detected using these NDE methods. A reliably detectable crack size is required for safe-life analysis of fracture critical parts. It is therefore important to correlate signal responses from real flaws with signal responses from artificial flaws used in the calibration process to determine the reliably detectable flaw size.
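As a sketch of the quantities involved, the snippet below evaluates a log-logistic POD curve and inverts it for a90, the flaw size detected with 90% probability (the basis of the usual a90/95 reliably-detectable-size statement). The location and scale parameters are illustrative assumptions, not fitted values from any MIL-HDBK-1823 data set.

```python
import math

def pod(a, mu=1.0, sigma=0.3):
    """Log-logistic POD model: POD(a) = 1 / (1 + exp(-(ln a - mu)/sigma)).
    mu and sigma are hypothetical location/scale parameters."""
    z = (math.log(a) - mu) / sigma
    return 1.0 / (1.0 + math.exp(-z))

def a90(mu=1.0, sigma=0.3):
    """Invert the model for the flaw size with POD = 0.90."""
    z90 = math.log(0.9 / 0.1)          # logit of 0.90
    return math.exp(mu + sigma * z90)
```

In practice the parameters come from regressing hit/miss or signal-response data against flaw size, and a confidence bound on the curve turns a90 into a90/95.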

  8. Delivering high performance BWR fuel reliably

    Energy Technology Data Exchange (ETDEWEB)

    Schardt, J.F. [GE Nuclear Energy, Wilmington, NC (United States)]

    1998-07-01

    Utilities are under intense pressure to reduce their production costs in order to compete in the increasingly deregulated marketplace. They need fuel that can deliver high performance to meet demanding operating strategies. GE's latest BWR fuel design, GE14, provides that high performance capability. GE's product introduction process assures that this performance will be delivered reliably, with little risk to the utility. (author)

  9. Statistical Bayesian method for reliability evaluation based on ADT data

    Science.gov (United States)

    Lu, Dawei; Wang, Lizhi; Sun, Yusheng; Wang, Xiaohong

    2018-05-01

    Accelerated degradation testing (ADT) is frequently conducted in the laboratory to predict a product's reliability under normal operating conditions. Two kinds of methods, degradation path models and stochastic process models, are used to analyze degradation data, and the latter is the more popular. However, some limitations remain, such as an imprecise solution process and estimation of the degradation ratio, which may affect the accuracy of the acceleration model and the extrapolated values. Moreover, the usual solution to this problem, the Bayesian method, loses key information when unifying the degradation data. In this paper, a new data processing and parameter inference method based on the Bayesian method is proposed to handle degradation data and solve the problems above. First, a Wiener process and an acceleration model are chosen. Second, the initial values of the degradation model and the parameters of the prior and posterior distributions under each stress level are calculated, with updating and iteration of the estimated values. Third, the lifetime and reliability values are estimated on the basis of the estimated parameters. Finally, a case study is provided to demonstrate the validity of the proposed method. The results illustrate that the proposed method is effective and accurate in estimating the lifetime and reliability of a product.
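The Wiener-process degradation model mentioned here can be sketched by simulating sample paths X(t) = ν·t + σ·B(t) and recording the first passage of a failure threshold; the theoretical mean life is threshold/ν. The drift, diffusion, and threshold values below are assumptions for illustration, not estimates from any ADT data.

```python
import math
import random

def first_passage_time(drift, sigma, threshold, dt=0.01, rng=None):
    """Simulate one Wiener degradation path X(t) = drift*t + sigma*B(t)
    (Euler steps of size dt) and return the first time it crosses the
    failure threshold."""
    rng = rng or random.Random()
    x, t = 0.0, 0.0
    while x < threshold:
        x += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        t += dt
    return t

rng = random.Random(42)
lives = [first_passage_time(2.0, 0.5, 10.0, rng=rng) for _ in range(200)]
mean_life = sum(lives) / len(lives)   # theory: threshold/drift = 5.0
```

The first-passage time of such a process is inverse-Gaussian distributed; an inference method like the paper's estimates the drift and diffusion (and their acceleration with stress) from measured paths rather than simulating them.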

  10. Inter- and intra- observer reliability of risk assessment of repetitive work without an explicit method.

    Science.gov (United States)

    Eliasson, Kristina; Palm, Peter; Nyman, Teresia; Forsman, Mikael

    2017-07-01

    A common way to conduct practical risk assessments is to observe a job and report the observed long term risks for musculoskeletal disorders. The aim of this study was to evaluate the inter- and intra-observer reliability of ergonomists' risk assessments without the support of an explicit risk assessment method. Twenty-one experienced ergonomists assessed the risk level (low, moderate, high risk) of eight upper body regions, as well as the global risk, of 10 video recorded work tasks. Intra-observer reliability was assessed by having nine of the ergonomists repeat the procedure at least three weeks after the first assessment. Each ergonomist made the risk assessment based on his/her experience and knowledge. The statistical parameters of reliability included agreement in %, kappa, linearly weighted kappa, intraclass correlation and Kendall's coefficient of concordance. The average inter-observer agreement of the global risk was 53% and the corresponding weighted kappa (Kw) was 0.32, indicating fair reliability. The intra-observer agreement was 61% and 0.41 (Kw). This study indicates that risk assessments of the upper body, without the use of an explicit observational method, have non-acceptable reliability. It is therefore recommended to use systematic risk assessment methods to a higher degree. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
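The linearly weighted kappa (Kw) reported above can be computed as follows; the two raters' ordinal codes (0 = low, 1 = moderate, 2 = high) are invented for illustration.

```python
def weighted_kappa(r1, r2, n_cat=3):
    """Linearly weighted Cohen's kappa for two raters' ordinal codes.
    Disagreements are penalized in proportion to their distance on the
    ordinal scale: weight w[i][j] = 1 - |i - j| / (n_cat - 1)."""
    n = len(r1)
    w = [[1 - abs(i - j) / (n_cat - 1) for j in range(n_cat)]
         for i in range(n_cat)]
    # observed weighted agreement
    obs = sum(w[a][b] for a, b in zip(r1, r2)) / n
    # chance-expected weighted agreement from the marginal distributions
    p1 = [r1.count(k) / n for k in range(n_cat)]
    p2 = [r2.count(k) / n for k in range(n_cat)]
    exp = sum(w[i][j] * p1[i] * p2[j]
              for i in range(n_cat) for j in range(n_cat))
    return (obs - exp) / (1 - exp)

# Hypothetical risk-level codes from two ergonomists for 10 tasks.
rater_a = [0, 1, 2, 1, 0, 2, 1, 1, 2, 0]
rater_b = [0, 1, 1, 1, 0, 2, 2, 1, 2, 1]
kw = weighted_kappa(rater_a, rater_b)
```

Values of Kw around 0.3-0.4, as in the study, sit in the "fair" band of the usual interpretation scales, which is why the authors judge the unaided assessments unacceptable.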

  11. Mathematical Methods in Survival Analysis, Reliability and Quality of Life

    CERN Document Server

    Huber, Catherine; Mesbah, Mounir

    2008-01-01

    Reliability and survival analysis are important applications of stochastic mathematics (probability, statistics and stochastic processes) that are usually covered separately in spite of the similarity of the involved mathematical theory. This title aims to redress this situation: it includes 21 chapters divided into four parts: Survival analysis, Reliability, Quality of life, and Related topics. Many of these chapters were presented at the European Seminar on Mathematical Methods for Survival Analysis, Reliability and Quality of Life in 2006.

  12. DEPEND-HRA-A method for consideration of dependency in human reliability analysis

    International Nuclear Information System (INIS)

    Cepin, Marko

    2008-01-01

    A consideration of dependencies between human actions is an important issue within human reliability analysis. A method was developed which integrates the features of existing methods and the experience from a full scope plant simulator. The method is used in a real plant-specific human reliability analysis as a part of the probabilistic safety assessment of a nuclear power plant. The method distinguishes dependency for pre-initiator events from dependency for initiator and post-initiator events. The method identifies dependencies based on scenarios, where consecutive human actions are modeled, and based on a list of minimal cut sets, which is obtained by running the minimal cut set analysis with high values of human error probabilities in the evaluation. A large example study, which consisted of a large number of human failure events, demonstrated the applicability of the method. Comparative analyses show that both the selection of the dependency method and the selection of dependency levels within the method largely impact the results of the probabilistic safety assessment. Even if the core damage frequency is not much affected, the rankings of important basic events in terms of risk increase and risk decrease factors may change considerably. More effort is needed on this subject to prepare the background for more detailed guidelines that remove as much subjectivity as possible from the evaluations.
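The abstract does not give the dependence equations themselves, but methods of this kind commonly build on the THERP dependence levels (NUREG/CR-1278), which adjust a nominal human error probability p for an action that follows a failed one. A minimal sketch of those standard formulas:

```python
def conditional_hep(p, level):
    """Conditional human error probability of action N given failure of
    action N-1, using the THERP dependence equations (NUREG/CR-1278).
    `p` is the nominal (independent) HEP of action N."""
    formulas = {
        "ZD": p,                    # zero dependence
        "LD": (1 + 19 * p) / 20,    # low dependence
        "MD": (1 + 6 * p) / 7,      # moderate dependence
        "HD": (1 + p) / 2,          # high dependence
        "CD": 1.0,                  # complete dependence
    }
    return formulas[level]
```

The sensitivity the abstract reports is visible directly here: a nominal HEP of 0.01 becomes about 0.06 under low dependence and 0.505 under high dependence, so the chosen level can dominate the cut-set probabilities.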

  13. A method of predicting the reliability of CDM coil insulation

    International Nuclear Information System (INIS)

    Kytasty, A.; Ogle, C.; Arrendale, H.

    1992-01-01

    This paper presents a method of predicting the reliability of the Collider Dipole Magnet (CDM) coil insulation design. The method proposes a probabilistic treatment of electrical test data, stress analysis, material property variability and loading uncertainties to give the reliability estimate. The approach taken to predict the reliability of design-related failure modes of the CDM is to form analytical models of the various possible failure modes and their related mechanisms or causes, and then statistically assess the contributions of the various variables. The probability of the failure mode occurring is interpreted as the number of times one would expect certain extreme situations to combine and randomly occur. One of the more complex failure modes of the CDM is used to illustrate this methodology.
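One concrete way to "statistically assess the contributing variables", under the simplifying assumption of independent normal strength and stress, is the classical first-order reliability index: β = (μ_R − μ_S) / √(σ_R² + σ_S²), with P_f = Φ(−β). The insulation-strength numbers below are purely illustrative, not CDM data.

```python
import math

def failure_probability(mu_r, sd_r, mu_s, sd_s):
    """First-order failure-probability estimate for independent normal
    strength R and stress S. Returns (beta, P_f) where P_f = Phi(-beta),
    evaluated via the complementary error function."""
    beta = (mu_r - mu_s) / math.sqrt(sd_r ** 2 + sd_s ** 2)
    pf = 0.5 * math.erfc(beta / math.sqrt(2.0))
    return beta, pf

# Hypothetical dielectric strength vs. applied stress (kV/mm).
beta, pf = failure_probability(mu_r=50.0, sd_r=4.0, mu_s=35.0, sd_s=3.0)
```

The small P_f obtained this way matches the abstract's reading of failure probability as the rate at which extreme values of several variables randomly coincide.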

  14. Developing a reliable signal wire attachment method for rail.

    Science.gov (United States)

    2014-11-01

    The goal of this project was to develop a better attachment method for rail signal wires to improve the reliability of signaling : systems. EWI conducted basic research into the failure mode of current attachment methods and developed and tested a ne...

  15. Advancing methods for reliably assessing motivational interviewing fidelity using the motivational interviewing skills code.

    Science.gov (United States)

    Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W; Imel, Zac E; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C

    2015-02-01

    The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance-level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. Development of improved processing and evaluation methods for high reliability structural ceramics for advanced heat engine applications Phase II. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Pujari, V.J.; Tracey, D.M.; Foley, M.R. [and others]

    1996-02-01

    The research program had as goals the development and demonstration of significant improvements in processing methods, process controls, and nondestructive evaluation (NDE) which can be commercially implemented to produce high reliability silicon nitride components for advanced heat engine applications at temperatures to 1370 °C. In Phase I of the program a process was developed that resulted in a silicon nitride / 4 wt% yttria HIP'ed material (NCX-5102) that displayed unprecedented strength and reliability. An average tensile strength of 1 GPa and a strength distribution following a 3-parameter Weibull distribution were demonstrated by testing several hundred buttonhead tensile specimens. The Phase II program focused on the development of methodology for colloidal consolidation producing green microstructure which minimizes downstream process problems such as drying, shrinkage, cracking, and part distortion during densification. Furthermore, the program focused on the extension of the process to gas pressure sinterable (GPS) compositions. Excellent results were obtained for the HIP composition processed for minimal density gradients, both with respect to room-temperature strength and high-temperature creep resistance. Complex component fabricability of this material was demonstrated by producing engine-vane prototypes. Strength data for the GPS material (NCX-5400) suggest that it ranks very high relative to other silicon nitride materials in terms of tensile/flexure strength ratio, a measure of volume quality. This high quality was derived from the closed-loop colloidal process employed in the program.
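The 3-parameter Weibull strength distribution mentioned above has failure probability F(s) = 1 − exp(−((s − s0)/η)^m) for s > s0. The threshold s0, scale η, and modulus m below are illustrative assumptions, not the fitted NCX-5102 parameters.

```python
import math

def weibull3_cdf(s, s0=600.0, eta=200.0, m=10.0):
    """Probability that a specimen fails at or below stress s (MPa),
    under a 3-parameter Weibull model with threshold s0, scale eta,
    and Weibull modulus m. Parameter values are hypothetical."""
    if s <= s0:
        return 0.0
    return 1.0 - math.exp(-(((s - s0) / eta) ** m))
```

The threshold parameter is what distinguishes this model from the common 2-parameter form: it implies a nonzero minimum strength, consistent with the high reliability the program reports.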

  17. Reliability analysis for thermal cutting method based non-explosive separation device

    International Nuclear Information System (INIS)

    Choi, Jun Woo; Hwang, Kuk Ha; Kim, Byung Kyu

    2016-01-01

    In order to increase the reliability of a separation device for a small satellite, a new non-explosive separation device is invented. This device is activated using a thermal cutting method with a Ni-Cr wire. A reliability analysis is carried out for the proposed non-explosive separation device by applying the fault tree analysis (FTA) method. In the FTA results for the separation device, only ten single-point failure modes are found. The reliability modeling and analysis for the device are performed considering power supply failure, failure of the Ni-Cr wire to burn through and unwind, holder separation failure, ball separation failure, and pin release failure. Ultimately, the reliability of the proposed device is calculated as 0.999989 with five Ni-Cr wire coils.
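The fault-tree arithmetic behind a figure like 0.999989 can be sketched as an OR gate over independent single-point failure modes, with redundancy (such as multiple Ni-Cr wire coils) entering as an AND of identical failures. The basic-event probabilities below are invented for illustration and are not the paper's values.

```python
def or_gate(probs):
    """Top-event probability for an OR gate over independent basic events:
    1 - product(1 - p_i)."""
    q = 1.0
    for p in probs:
        q *= 1.0 - p
    return 1.0 - q

def all_fail(p, n):
    """AND gate: probability that all n redundant items fail
    (e.g., n redundant wire coils)."""
    return p ** n

# Hypothetical tree: nine single-point modes plus five redundant coils.
singles = [1e-6] * 9
p_top = or_gate(singles + [all_fail(0.05, 5)])
reliability = 1.0 - p_top
```

The example shows why adding coils is so effective: even a fairly unreliable single coil contributes only p^5 to the top event once five are fitted in parallel.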

  18. Reliability analysis for thermal cutting method based non-explosive separation device

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jun Woo; Hwang, Kuk Ha; Kim, Byung Kyu [Korea Aerospace University, Goyang (Korea, Republic of)

    2016-12-15

    In order to increase the reliability of a separation device for a small satellite, a new non-explosive separation device is invented. This device is activated using a thermal cutting method with a Ni-Cr wire. A reliability analysis is carried out for the proposed non-explosive separation device by applying the fault tree analysis (FTA) method. In the FTA results for the separation device, only ten single-point failure modes are found. The reliability modeling and analysis for the device are performed considering power supply failure, failure of the Ni-Cr wire to burn through and unwind, holder separation failure, ball separation failure, and pin release failure. Ultimately, the reliability of the proposed device is calculated as 0.999989 with five Ni-Cr wire coils.

  19. Custom high-reliability radiation-hard CMOS-LSI circuit design

    International Nuclear Information System (INIS)

    Barnard, W.J.

    1981-01-01

    Sandia has developed a custom CMOS-LSI design capability to provide high-reliability, radiation-hardened circuits. This capability relies on (1) proven design practices to enhance reliability, (2) use of well-characterized cells and logic modules, (3) computer-aided design tools to reduce design time and errors and to standardize design definition, and (4) close working relationships with the system designer and technology fabrication personnel. Trade-offs between circuit complexity/performance and technology/producibility are made during the design so that high-reliability, radiation-hardened designs result. Sandia has developed and is maintaining a radiation-hardened bulk CMOS fabrication line for production of prototype and small-production-volume parts.

  20. Safety and reliability of automatization software

    Energy Technology Data Exchange (ETDEWEB)

    Kapp, K; Daum, R [Karlsruhe Univ. (TH) (Germany, F.R.). Lehrstuhl fuer Angewandte Informatik, Transport- und Verkehrssysteme]

    1979-02-01

    Automated technical systems have to meet very high requirements concerning safety, security and reliability. Today, modern computers, especially microcomputers, are used as integral parts of those systems. In consequence, computer programs must work in a safe and reliable manner. Methods are discussed which allow safe and reliable software to be constructed for automatic systems such as reactor protection systems, and which prove that the safety requirements are met. As a result it is shown that only the method of total software diversification can satisfy all safety requirements at tolerable cost. In order to achieve a high degree of reliability, structured and modular programming in conjunction with high-level programming languages is recommended.

  1. Reliability test and failure analysis of high power LED packages

    International Nuclear Information System (INIS)

    Chen Zhaohui; Zhang Qin; Wang Kai; Luo Xiaobing; Liu Sheng

    2011-01-01

    A new type of application-specific light emitting diode (LED) package (ASLP) with a freeform polycarbonate lens for street lighting is developed, whose manufacturing processes are compatible with a typical LED packaging process. The reliability test methods and failure criteria from different vendors are reviewed and compared. It is found that test methods and failure criteria are quite different; rapid reliability assessment standards are urgently needed for the LED industry. An 85 °C/85% RH test at 700 mA is used to test our LED modules alongside those of three other vendors for 1000 h, showing no visible degradation in optical performance for our modules, while the modules of two other vendors show significant degradation. Failure analysis methods such as C-SAM, nano X-ray CT and optical microscopy are applied to the LED packages. Failure mechanisms such as delaminations and cracks are detected in the LED packages after the accelerated reliability testing. The finite element simulation method is helpful for the failure analysis and for designing for reliability of the LED packaging. One example is used to show that a module currently used in industry is vulnerable and may not easily pass harsh thermal cycle testing. (semiconductor devices)
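Accelerated tests like the 85 °C/85% RH run above are usually extrapolated to use conditions with an acceleration model; a common temperature-only choice is the Arrhenius factor below. The 0.7 eV activation energy and the use temperature are assumptions for illustration, not values from the paper.

```python
import math

def arrhenius_af(t_use_c, t_stress_c, ea_ev=0.7):
    """Arrhenius acceleration factor between a stress temperature
    (e.g., the 85 C test) and a use temperature, both in Celsius.
    ea_ev is the assumed activation energy in electron-volts."""
    k = 8.617e-5  # Boltzmann constant, eV/K
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / k) * (1.0 / t_use - 1.0 / t_stress))

af = arrhenius_af(55.0, 85.0)   # hypothetical 55 C use condition
```

Under these assumptions 1000 h at 85 °C stands in for several thousand hours at the use temperature; humidity acceleration would need an additional model term (e.g., Peck's), which is outside this sketch.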

  2. High Reliability Oscillators for Terahertz Systems, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — To develop reliable THz sources with high power and high DC-RF efficiency, Virginia Diodes, Inc. will develop a thorough understanding of the complex interactions...

  3. An Investment Level Decision Method to Secure Long-term Reliability

    Science.gov (United States)

    Bamba, Satoshi; Yabe, Kuniaki; Seki, Tomomichi; Shibaya, Tetsuji

    The slowdown in power demand growth and in facility replacement causes aging and lower reliability in power facilities, and the aging will be followed by a rapid increase in repair and replacement when many facilities reach the end of their lifetime. This paper describes a method to estimate future repair and replacement costs by applying a life-cycle cost model and renewal theory to historical data. The paper also describes a method to decide the optimum investment plan, which replaces facilities in order of cost-effectiveness by setting a replacement priority formula, and the minimum investment level needed to keep reliability. Estimation examples applied to substation facilities show that a reasonable and leveled future cash-out can keep the reliability by lowering the percentage of replacements caused by fatal failures.
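The renewal-theory estimate described above can be sketched with a Monte Carlo renewal count: each unit is replaced on failure with a fresh one, and replacements are binned by year to reveal the future replacement wave. Normal lifetimes and all the numbers below are simplifying assumptions for illustration, not substation data.

```python
import random

def expected_replacements(n_units, mean_life, sd_life, horizon, seed=0):
    """Monte Carlo renewal count: simulate each unit's chain of lifetimes
    (replaced immediately on failure) and return the number of
    replacements falling in each year of the planning horizon.
    Normal lifetimes, truncated below at 1 year, are an assumption."""
    rng = random.Random(seed)
    counts = [0] * horizon
    for _ in range(n_units):
        t = 0.0
        while True:
            t += max(1.0, rng.gauss(mean_life, sd_life))
            if t >= horizon:
                break
            counts[int(t)] += 1   # a replacement occurs in year int(t)
    return counts

# Hypothetical fleet: 1000 units, 40-year mean life, 60-year horizon.
counts = expected_replacements(1000, 40.0, 8.0, 60)
```

The yearly profile shows the cost peak the paper warns about: replacements cluster around the mean lifetime, which is exactly what a leveled investment plan tries to smooth out.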

  4. Assessment of microelectronics packaging for high temperature, high reliability applications

    Energy Technology Data Exchange (ETDEWEB)

    Uribe, F.

    1997-04-01

    This report details characterization and development activities in electronic packaging for high temperature applications. This project was conducted through a Department of Energy sponsored Cooperative Research and Development Agreement between Sandia National Laboratories and General Motors. Even though the target application of this collaborative effort is an automotive electronic throttle control system which would be located in the engine compartment, results of this work are directly applicable to Sandia`s national security mission. The component count associated with the throttle control dictates the use of high density packaging not offered by conventional surface mount. An enabling packaging technology was selected and thermal models defined which characterized the thermal and mechanical response of the throttle control module. These models were used to optimize thick film multichip module design, characterize the thermal signatures of the electronic components inside the module, and to determine the temperature field and resulting thermal stresses under conditions that may be encountered during the operational life of the throttle control module. Because the need to use unpackaged devices limits the level of testing that can be performed either at the wafer level or as individual dice, an approach to assure a high level of reliability of the unpackaged components was formulated. Component assembly and interconnect technologies were also evaluated and characterized for high temperature applications. Electrical, mechanical and chemical characterizations of enabling die and component attach technologies were performed. Additionally, studies were conducted to assess the performance and reliability of gold and aluminum wire bonding to thick film conductor inks. Kinetic models were developed and validated to estimate wire bond reliability.

  5. Analysis of fatigue reliability for high temperature and high pressure multi-stage decompression control valve

    Science.gov (United States)

    Yu, Long; Xu, Juanjuan; Zhang, Lifang; Xu, Xiaogang

    2018-03-01

    A reliability mathematical model for a high temperature and high pressure multi-stage decompression control valve (HMDCV) is established based on stress-strength interference theory, and a temperature correction coefficient is introduced to revise the material fatigue limit at high temperature. The reliability of the key high-risk components and the fatigue sensitivity curve of each component are calculated and analyzed by combining the fatigue-life analysis of the control valve with reliability theory. The proportional contribution of each component to fatigue failure of the control valve system is obtained. The results show that the temperature correction factor makes the theoretical reliability calculations more accurate, that the predicted life of the main pressure-bearing parts meets the technical requirements, and that the valve body and the sleeve have an obvious influence on system reliability; stress concentration in key parts of the control valve can be reduced in the design process by improving the structure.
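For independent normal stress and strength, the stress-strength interference model used here reduces to a closed form, R = Φ((μ_strength − μ_stress)/√(σ_strength² + σ_stress²)). A minimal sketch, with illustrative numbers (not the paper's valve data) and a hypothetical temperature correction factor:

```python
from math import erf, sqrt

def normal_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def interference_reliability(mu_strength, sd_strength, mu_stress, sd_stress):
    """R = P(strength > stress) for independent normal strength and
    stress: the classic stress-strength interference result."""
    beta = (mu_strength - mu_stress) / sqrt(sd_strength**2 + sd_stress**2)
    return normal_cdf(beta)

# Illustrative numbers (MPa); k_t is a hypothetical temperature
# correction factor derating the room-temperature fatigue limit
k_t = 0.95
R = interference_reliability(mu_strength=k_t * 400.0, sd_strength=25.0,
                             mu_stress=300.0, sd_stress=20.0)
```

Lowering k_t (a hotter valve) shrinks the safety margin and the reliability, which is the effect the paper's temperature correction captures.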

  6. A generic method for assignment of reliability scores applied to solvent accessibility predictions

    Directory of Open Access Journals (Sweden)

    Nielsen Morten

    2009-07-01

    Full Text Available Abstract Background Estimation of the reliability of specific real value predictions is nontrivial and the efficacy of this is often questionable. It is important to know if you can trust a given prediction and therefore the best methods associate a prediction with a reliability score or index. For discrete qualitative predictions, the reliability is conventionally estimated as the difference between output scores of selected classes. Such an approach is not feasible for methods that predict a biological feature as a single real value rather than a classification. As a solution to this challenge, we have implemented a method that predicts the relative surface accessibility of an amino acid and simultaneously predicts the reliability for each prediction, in the form of a Z-score. Results An ensemble of artificial neural networks has been trained on a set of experimentally solved protein structures to predict the relative exposure of the amino acids. The method assigns a reliability score to each surface accessibility prediction as an inherent part of the training process. This is in contrast to the most commonly used procedures where reliabilities are obtained by post-processing the output. Conclusion The performance of the neural networks was evaluated on a commonly used set of sequences known as the CB513 set. An overall Pearson's correlation coefficient of 0.72 was obtained, which is comparable to the performance of the currently best publicly available method, Real-SPINE. Both methods associate a reliability score with the individual predictions. However, our implementation of reliability scores in the form of a Z-score is shown to be the more informative measure for discriminating good predictions from bad ones in the entire range from completely buried to fully exposed amino acids. This is evident when comparing the Pearson's correlation coefficient for the upper 20% of predictions sorted according to reliability.
For this subset, values of 0

  7. The Reliability, Impact, and Cost-Effectiveness of Value-Added Teacher Assessment Methods

    Science.gov (United States)

    Yeh, Stuart S.

    2012-01-01

    This article reviews evidence regarding the intertemporal reliability of teacher rankings based on value-added methods. Value-added methods exhibit low reliability, yet are broadly supported by prominent educational researchers and are increasingly being used to evaluate and fire teachers. The article then presents a cost-effectiveness analysis…

  8. Reliability of a semi-quantitative method for dermal exposure assessment (DREAM)

    NARCIS (Netherlands)

    Wendel de Joode, B. van; Hemmen, J.J. van; Meijster, T.; Major, V.; London, L.; Kromhout, H.

    2005-01-01

    Valid and reliable semi-quantitative dermal exposure assessment methods for epidemiological research and for occupational hygiene practice, applicable for different chemical agents, are practically nonexistent. The aim of this study was to assess the reliability of a recently developed

  9. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  10. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    Science.gov (United States)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

    In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability and sensitivity analysis of complex components with arbitrary distribution parameters is investigated using the perturbation method, the response surface method, the Edgeworth series and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. Comparison with Monte Carlo simulation demonstrates that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability analysis in finite-element-based engineering practice.
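The moment-based reliability idea can be sketched with the simplest member of this family, a mean-value first-order second-moment (MVFOSM) estimate; this is a stand-in for the paper's perturbation/Edgeworth machinery, and the limit state below is illustrative:

```python
from math import erf, sqrt

def normal_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def mvfosm_reliability(g, means, stds, h=1e-6):
    """Mean-value first-order second-moment estimate of P(g(X) > 0):
    the first two moments of g are propagated from the input moments
    via finite-difference gradients, then beta = mu_g / sigma_g."""
    mu_g = g(means)
    var = 0.0
    for i, s in enumerate(stds):
        x = list(means)
        x[i] += h
        grad_i = (g(x) - mu_g) / h  # dg/dx_i at the mean point
        var += (grad_i * s) ** 2
    return normal_cdf(mu_g / sqrt(var))

# Illustrative limit state: strength minus load
prob = mvfosm_reliability(lambda x: x[0] - x[1],
                          means=[500.0, 350.0], stds=[40.0, 30.0])
```

Higher-moment corrections (e.g. an Edgeworth expansion, as in the paper) refine this normal approximation when the limit state is nonlinear or the inputs are non-normal.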

  11. Applicability of simplified human reliability analysis methods for severe accidents

    Energy Technology Data Exchange (ETDEWEB)

    Boring, R.; St Germain, S. [Idaho National Lab., Idaho Falls, Idaho (United States); Banaseanu, G.; Chatri, H.; Akl, Y. [Canadian Nuclear Safety Commission, Ottawa, Ontario (Canada)

    2016-03-15

    Most contemporary human reliability analysis (HRA) methods were created to analyse design-basis accidents at nuclear power plants. As part of a comprehensive expansion of risk assessments at many plants internationally, HRAs will begin considering severe accident scenarios. Severe accidents, while extremely rare, constitute high consequence events that significantly challenge successful operations and recovery. Challenges during severe accidents include degraded and hazardous operating conditions at the plant, the shift in control from the main control room to the technical support center, the unavailability of plant instrumentation, and the need to use different types of operating procedures. Such shifts in operations may also test key assumptions in existing HRA methods. This paper discusses key differences between design basis and severe accidents, reviews efforts to date to create customized HRA methods suitable for severe accidents, and recommends practices for adapting existing HRA methods that are already being used for HRAs at the plants. (author)

  12. System reliability with correlated components: Accuracy of the Equivalent Planes method

    NARCIS (Netherlands)

    Roscoe, K.; Diermanse, F.; Vrouwenvelder, A.C.W.M.

    2015-01-01

    Computing system reliability when system components are correlated presents a challenge because it usually requires solving multi-fold integrals numerically, which is generally infeasible due to the computational cost. In Dutch flood defense reliability modeling, an efficient method for computing

  13. System reliability with correlated components : Accuracy of the Equivalent Planes method

    NARCIS (Netherlands)

    Roscoe, K.; Diermanse, F.; Vrouwenvelder, T.

    2015-01-01

    Computing system reliability when system components are correlated presents a challenge because it usually requires solving multi-fold integrals numerically, which is generally infeasible due to the computational cost. In Dutch flood defense reliability modeling, an efficient method for computing

  14. Reliability and validity of the AutoCAD software method in lumbar lordosis measurement.

    Science.gov (United States)

    Letafatkar, Amir; Amirsasan, Ramin; Abdolvahabi, Zahra; Hadadnezhad, Malihe

    2011-12-01

    The aim of this study was to determine the reliability and validity of the AutoCAD software method in lumbar lordosis measurement. Fifty healthy volunteers with a mean age of 23 ± 1.80 years were enrolled. A lateral lumbar radiograph was taken of all participants, and the lordosis was measured according to the Cobb method. Afterward, the lumbar lordosis angle was measured via the AutoCAD software and flexible ruler methods. The study was conducted in two parts: intratester and intertester evaluations of reliability, and assessment of the validity of the flexible ruler and software methods. Based on the intraclass correlation coefficient, AutoCAD's reliability and validity in measuring lumbar lordosis were 0.984 and 0.962, respectively. AutoCAD was shown to be a reliable and valid method to measure lordosis. It is suggested that this method may replace those that are costly or involve health risks, such as radiography, in evaluating lumbar lordosis.
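Measuring a Cobb-type lordosis angle in CAD software amounts to computing the angle between two digitized endplate lines; a minimal sketch of that geometric operation (the coordinates are made up, and this is not the paper's actual AutoCAD workflow):

```python
from math import atan2, degrees

def cobb_angle(line1, line2):
    """Acute angle between two endplate lines, each given by two
    (x, y) points digitized on a lateral radiograph."""
    (p1, q1), (p2, q2) = line1, line2
    a1 = atan2(q1[1] - p1[1], q1[0] - p1[0])
    a2 = atan2(q2[1] - p2[1], q2[0] - p2[0])
    ang = abs(degrees(a1 - a2)) % 180.0
    return min(ang, 180.0 - ang)  # report the acute angle

# Hypothetical digitized endplates (superior L1 vs. superior S1)
theta = cobb_angle(((0, 0), (10, 2)), ((0, 5), (10, -3)))
```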

  15. Selected Methods For Increases Reliability The Of Electronic Systems Security

    Directory of Open Access Journals (Sweden)

    Paś Jacek

    2015-11-01

    Full Text Available The article presents issues related to different methods of increasing the reliability of electronic security systems (ESS), for example a fire alarm system (SSP). Reliability of the SSP, in the descriptive sense, is its capacity to perform a preset function (e.g. fire protection of an airport, a port, a logistics base, etc.) for a certain time and under certain conditions, e.g. environmental, despite possible failures of a subset of the system's elements. An analysis of the available literature on ESS/SSP shows no studies on methods of increasing reliability (several works on similar topics exist, but they concern burglary and robbery intrusion systems). The analysis is based on the set of all paths in the system that determine the suitability of the SSP for the fire-event scenario mentioned (devices critical for security).

  16. Reliability analysis for radiographic measures of lumbar lordosis in adult scoliosis: a case–control study comparing 6 methods

    Science.gov (United States)

    Hong, Jae Young; Modi, Hitesh N.; Hur, Chang Yong; Song, Hae Ryong; Park, Jong Hoon

    2010-01-01

    Several methods are used to measure lumbar lordosis. In adult scoliosis patients, the measurement is difficult due to degenerative changes in the vertebral endplates as well as the coronal and sagittal deformity. We conducted an observational study with three examiners to determine the reliability of six methods for measuring global lumbar lordosis in adult scoliosis patients. Ninety lateral lumbar radiographs were collected for the study. The radiographs were divided into normal (Cobb lordosis measurement decreased with increasing severity of scoliosis. In the Cobb L1–S1, centroid and posterior tangent L1–S1 methods, the ICCs were relatively lower in the high-grade scoliosis group (≥0.60), and the mean absolute difference (MAD) in these methods was high in the high-grade scoliosis group (≤7.17°). However, in the Cobb L1–L5 and posterior tangent L1–L5 methods, the ICCs were ≥0.86 in all groups, and in the TRALL method the ICCs were ≥0.76 in all groups. In addition, the MAD was ≤3.63° for the Cobb L1–L5 and posterior tangent L1–L5 methods, and ≤3.84° for the TRALL method in all groups. We concluded that the Cobb L1–L5 and posterior tangent L1–L5 methods are reliable for measuring global lumbar lordosis in adult scoliosis, and that the TRALL method is more reliable than the other methods that include the L5–S1 joint in the lordosis measurement. PMID:20437183

  17. High reliability low jitter 80 kV pulse generator

    International Nuclear Information System (INIS)

    Savage, Mark Edward; Stoltzfus, Brian Scott

    2009-01-01

    Switching can be considered to be the essence of pulsed power. Time accurate switch/trigger systems with low inductance are useful in many applications. This article describes a unique switch geometry coupled with a low-inductance capacitive energy store. The system provides a fast-rising high voltage pulse into a low impedance load. It can be challenging to generate high voltage (more than 50 kilovolts) into impedances less than 10 Ω, from a low voltage control signal with a fast rise time and high temporal accuracy. The required power amplification is large, and is usually accomplished with multiple stages. The multiple stages can adversely affect the temporal accuracy and the reliability of the system. In the present application, a highly reliable and low jitter trigger generator was required for the Z pulsed-power facility [M. E. Savage, L. F. Bennett, D. E. Bliss, W. T. Clark, R. S. Coats, J. M. Elizondo, K. R. LeChien, H. C. Harjes, J. M. Lehr, J. E. Maenchen, D. H. McDaniel, M. F. Pasik, T. D. Pointon, A. C. Owen, D. B. Seidel, D. L. Smith, B. S. Stoltzfus, K. W. Struve, W. A. Stygar, L. K. Warne, and J. R. Woodworth, 2007 IEEE Pulsed Power Conference, Albuquerque, NM (IEEE, Piscataway, NJ, 2007), p. 979]. The large investment in each Z experiment demands low prefire probability and low jitter simultaneously. The system described here is based on a 100 kV DC-charged high-pressure spark gap, triggered with an ultraviolet laser. The system uses a single optical path for simultaneously triggering two parallel switches, allowing lower inductance and electrode erosion with a simple optical system. Performance of the system includes 6 ns output rise time into 5.6 Ω, 550 ps one-sigma jitter measured from the 5 V trigger to the high voltage output, and misfire probability less than 10⁻⁴. The design of the system and some key measurements will be shown in the paper. We will discuss the design goals related to high reliability and low jitter. While

  18. Memorial Hermann: high reliability from board to bedside.

    Science.gov (United States)

    Shabot, M Michael; Monroe, Douglas; Inurria, Juan; Garbade, Debbi; France, Anne-Claire

    2013-06-01

    In 2006 the Memorial Hermann Health System (MHHS), which includes 12 hospitals, began applying principles embraced by high reliability organizations (HROs). Three factors support its HRO journey: (1) aligned organizational structure with transparent management systems and compressed reporting processes; (2) Robust Process Improvement (RPI) with high-reliability interventions; and (3) cultural establishment, sustainment, and evolution. The Quality and Safety strategic plan contains three domains, each with a specific set of measures that provide goals for performance: (1) "Clinical Excellence;" (2) "Do No Harm;" and (3) "Saving Lives," as measured by the Serious Safety Event rate. MHHS uses a uniform approach to performance improvement--RPI, which includes Six Sigma, Lean, and change management, to solve difficult safety and quality problems. The 9 acute care hospitals provide multiple opportunities to integrate high-reliability interventions and best practices across MHHS. For example, MHHS partnered with the Joint Commission Center for Transforming Healthcare in its inaugural project to establish reliable hand hygiene behaviors, which improved MHHS's average hand hygiene compliance rate from 44% to 92% currently. Soon after compliance exceeded 85% at all 12 hospitals, the average rate of central line-associated bloodstream and ventilator-associated pneumonias decreased to essentially zero. MHHS's size and diversity require a disciplined approach to performance improvement and systemwide achievement of measurable success. The most significant cultural change at MHHS has been the expectation for 100% compliance with evidence-based quality measures and 0% incidence of patient harm.

  19. Dating of zircon from high-grade rocks: Which is the most reliable method?

    Directory of Open Access Journals (Sweden)

    Alfred Kröner

    2014-07-01

    Full Text Available Magmatic zircon in high-grade metamorphic rocks is often characterized by complex textures as revealed by cathodoluminescence (CL) that result from multiple episodes of recrystallization, overgrowth, Pb-loss and modifications through fluid-induced disturbances of the crystal structure and the original U-Th-Pb isotopic systematics. Many of these features can be recognized in 2-dimensional CL images, and isotopic analysis of such domains using a high resolution ion-microprobe with only shallow penetration of the zircon surface may be able to reconstruct much of the magmatic and complex post-magmatic history of such grains. In particular it is generally possible to find original magmatic domains yielding concordant ages. In contrast, destructive techniques such as LA-ICP-MS consume a large volume, leave a deep crater in the target grain, and often sample heterogeneous domains that are not visible and thus often yield discordant results which are difficult to interpret. We provide examples of complex magmatic zircon from a southern Indian granulite terrane where SHRIMP II and LA-ICP-MS analyses are compared. The SHRIMP data are shown to be more precise and reliable, and we caution against the use of LA-ICP-MS in deciphering the chronology of complex zircons from high-grade terranes.

  20. A new kind high-reliability digital reactivity meter

    International Nuclear Information System (INIS)

    Shen Feng; Jiang Zongbing

    2001-01-01

    The paper introduces a new kind of high-reliability digital reactivity meter developed by the DRM development group in the design department of the Nuclear Power Institute of China. The meter has two independent measurement channels, which can be configured either in a master-slave structure or to work independently. This structure ensures that the meter can continue its online measurement task even under a single failure. It resolves the conflict between a nuclear station's stringent demands on DRM reliability and the instability of commercial computer software platforms. The instrument achieves both advanced capability and reliability by incorporating many complex data-processing and display functions.

  1. Issues in benchmarking human reliability analysis methods: A literature review

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Hendrickson, Stacey M.L.; Forester, John A.; Tran, Tuan Q.; Lois, Erasmia

    2010-01-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  2. Issues in benchmarking human reliability analysis methods : a literature review.

    Energy Technology Data Exchange (ETDEWEB)

    Lois, Erasmia (US Nuclear Regulatory Commission); Forester, John Alan; Tran, Tuan Q. (Idaho National Laboratory, Idaho Falls, ID); Hendrickson, Stacey M. Langfitt; Boring, Ronald L. (Idaho National Laboratory, Idaho Falls, ID)

    2008-04-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  3. Design reliability engineering

    International Nuclear Information System (INIS)

    Buden, D.; Hunt, R.N.M.

    1989-01-01

    Improved design techniques are needed to achieve high reliability at minimum cost. This is especially true of space systems, where lifetimes of many years without maintenance are needed and severe mass limitations exist. Reliability must be designed into these systems from the start. Techniques are now being explored to structure a formal design process that will be more complete and less expensive. The intent is to integrate the best features of design, reliability analysis, and expert systems to design highly reliable systems that meet demanding needs, taking into account the large uncertainties that exist in materials, design models, and fabrication techniques. Expert systems are a convenient method to integrate into the design process a complete definition of all elements that should be considered, and an opportunity to integrate the design process with reliability, safety, test engineering, maintenance and operator training.

  4. Evaluation of Information Requirements of Reliability Methods in Engineering Design

    DEFF Research Database (Denmark)

    Marini, Vinicius Kaster; Restrepo-Giraldo, John Dairo; Ahmed-Kristensen, Saeema

    2010-01-01

    This paper aims to characterize the information needed to perform methods for robustness and reliability, and to verify their applicability to early design stages. Several methods were evaluated for the support they provide to synthesis in engineering design. Of those methods, FMEA, FTA and HAZOP were selected...

  5. [Knowledge of university students in Szeged, Hungary about reliable contraceptive methods and sexually transmitted diseases].

    Science.gov (United States)

    Devosa, Iván; Kozinszky, Zoltán; Vanya, Melinda; Szili, Károly; Fáyné Dombi, Alice; Barabás, Katalin

    2016-04-03

    Promiscuity and lack of use of reliable contraceptive methods increase the probability of sexually transmitted diseases and the risk of unwanted pregnancies, which are quite common among university students. The aim of the study was to assess the knowledge of university students about reliable contraceptive methods and sexually transmitted diseases, and to assess the effectiveness of sexual health education in secondary schools, with specific focus on education held by peers. An anonymous, self-administered questionnaire survey was carried out in a randomized sample of students at the University of Szeged (n = 472, 298 women and 174 men, average age 21 years) between 2009 and 2011. 62.1% of the respondents declared that reproductive health education lessons in high schools held by peers were a reliable and authentic source of information, 12.3% considered them a less reliable source, and 25.6% defined school health education as an irrelevant source. Among those who considered the health education held by peers a reliable source, there were significantly more females (69.3% vs. 46.6%, p = 0.001), significantly fewer lived in cities (83.6% vs. 94.8%, p = 0.025), and significantly more respondents knew that Candida infection can be transmitted through sexual intercourse (79.5% versus 63.9%, p = 0.02), as compared to those who did not consider health education held by peers a reliable source. The majority of respondents obtained knowledge about sexual issues from the mass media. Young people who considered health education programs reliable were significantly better informed about Candida infection.

  6. A generic method for estimating system reliability using Bayesian networks

    International Nuclear Information System (INIS)

    Doguc, Ozge; Ramirez-Marquez, Jose Emmanuel

    2009-01-01

    This study presents a holistic method for constructing a Bayesian network (BN) model for estimating system reliability. BN is a probabilistic approach that is used to model and predict the behavior of a system based on observed stochastic events. The BN model is a directed acyclic graph (DAG) where the nodes represent system components and arcs represent relationships among them. Although recent studies on using BN for estimating system reliability have been proposed, they are based on the assumption that a pre-built BN has been designed to represent the system. In these studies, the task of building the BN is typically left to a group of specialists who are BN and domain experts. The BN experts should learn about the domain before building the BN, which is generally very time consuming and may lead to incorrect deductions. As there are no existing studies to eliminate the need for a human expert in the process of system reliability estimation, this paper introduces a method that uses historical data about the system to be modeled as a BN and provides efficient techniques for automated construction of the BN model, and hence estimation of the system reliability. In this respect K2, a data mining algorithm, is used for finding associations between system components, and thus building the BN model. This algorithm uses a heuristic to provide efficient and accurate results while searching for associations. Moreover, no human intervention is necessary during the process of BN construction and reliability estimation. The paper provides a step-by-step illustration of the method and evaluation of the approach with literature case examples

  7. A generic method for estimating system reliability using Bayesian networks

    Energy Technology Data Exchange (ETDEWEB)

    Doguc, Ozge [Stevens Institute of Technology, Hoboken, NJ 07030 (United States); Ramirez-Marquez, Jose Emmanuel [Stevens Institute of Technology, Hoboken, NJ 07030 (United States)], E-mail: jmarquez@stevens.edu

    2009-02-15

    This study presents a holistic method for constructing a Bayesian network (BN) model for estimating system reliability. BN is a probabilistic approach that is used to model and predict the behavior of a system based on observed stochastic events. The BN model is a directed acyclic graph (DAG) where the nodes represent system components and arcs represent relationships among them. Although recent studies on using BN for estimating system reliability have been proposed, they are based on the assumption that a pre-built BN has been designed to represent the system. In these studies, the task of building the BN is typically left to a group of specialists who are BN and domain experts. The BN experts should learn about the domain before building the BN, which is generally very time consuming and may lead to incorrect deductions. As there are no existing studies to eliminate the need for a human expert in the process of system reliability estimation, this paper introduces a method that uses historical data about the system to be modeled as a BN and provides efficient techniques for automated construction of the BN model, and hence estimation of the system reliability. In this respect K2, a data mining algorithm, is used for finding associations between system components, and thus building the BN model. This algorithm uses a heuristic to provide efficient and accurate results while searching for associations. Moreover, no human intervention is necessary during the process of BN construction and reliability estimation. The paper provides a step-by-step illustration of the method and evaluation of the approach with literature case examples.
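Once a model of the system structure is in hand (learned by K2 in the paper), the reliability computation itself can be sketched by exact enumeration over component states; the series-parallel structure function below is an illustrative stand-in for a learned Bayesian-network model:

```python
from itertools import product

def system_reliability(p_work, structure):
    """Exact system reliability by enumerating all component states;
    `structure` maps a state tuple (1 = working) to the system state."""
    rel = 0.0
    for states in product((0, 1), repeat=len(p_work)):
        prob = 1.0
        for s, p in zip(states, p_work):
            prob *= p if s else 1.0 - p  # independent components assumed
        if structure(states):
            rel += prob
    return rel

# Illustrative structure: components 0 and 1 in parallel,
# in series with component 2
R = system_reliability([0.9, 0.8, 0.95],
                       lambda s: (s[0] or s[1]) and s[2])
```

Enumeration is exponential in the number of components; the point of the BN formulation is to exploit the learned dependency structure so that inference stays tractable on larger systems.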

  8. Research on Control Method Based on Real-Time Operational Reliability Evaluation for Space Manipulator

    Directory of Open Access Journals (Sweden)

    Yifan Wang

    2014-05-01

    Full Text Available A control method based on real-time operational reliability evaluation for a space manipulator is presented, aimed at improving the success rate of the manipulator during task execution. In this paper, a method for quantitative analysis of operational reliability while the manipulator executes a specified task is given; a control model that regulates this quantitative operational reliability is then built. First, the control process is described by a state-space equation. Second, process parameters are estimated in real time using a Bayesian method. Third, an expression for the system's real-time operational reliability is derived from the state-space equation and the estimated process parameters. Finally, a control-variable regulation strategy that accounts for the cost of control is given, based on the theory of Statistical Process Control. Simulations show that this method effectively improves the operational reliability of the space manipulator control system.
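The real-time Bayesian estimation step can be illustrated with the simplest conjugate case, a Beta-Bernoulli update of task success probability. This is a simplified stand-in for the paper's state-space formulation, with made-up counts:

```python
def beta_update(alpha, beta, successes, failures):
    """Conjugate Beta-Bernoulli update: returns the posterior
    parameters and the posterior-mean reliability estimate after
    observing task outcomes."""
    a = alpha + successes
    b = beta + failures
    return a, b, a / (a + b)

# Uniform Beta(1, 1) prior, then fold in observed task outcomes online
a, b, r = beta_update(1, 1, successes=48, failures=2)
```

Because the posterior is again a Beta distribution, the update can be repeated after every task cycle at negligible cost, which is what makes such estimators attractive for real-time control loops.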

  9. Study of Fuze Structure and Reliability Design Based on the Direct Search Method

    Science.gov (United States)

    Lin, Zhang; Ning, Wang

    2017-03-01

    Redundant design is one of the important methods for improving the reliability of a system, but the design often involves the mutual coupling of multiple factors. In this study, the Direct Search Method is introduced into optimum redundancy configuration for design optimization, in which reliability, cost, structural weight, and other factors can be taken into account simultaneously, and the redundancy allocation and reliability design of an aircraft critical system are computed. The results show that this method is convenient and workable, and that upon appropriate modifications it is applicable to the redundancy configuration and optimization of various designs. The method thus has good practical value.
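
    The record does not detail the Direct Search Method's implementation, so the following is only a schematic stand-in: an exhaustive scan over redundancy levels for a series system of parallel groups, maximizing reliability under cost and weight limits. All component reliabilities, costs, and limits are invented for the example.

```python
from itertools import product

def system_reliability(n, r):
    """Series system of parallel groups: group i has n[i] identical parts of reliability r[i]."""
    prob = 1.0
    for ni, ri in zip(n, r):
        prob *= 1.0 - (1.0 - ri) ** ni
    return prob

def direct_search_allocation(r, cost, weight, max_cost, max_weight, n_max=4):
    """Search redundancy levels (1..n_max per group) for the most reliable feasible design."""
    best = None
    for n in product(range(1, n_max + 1), repeat=len(r)):
        c = sum(ni * ci for ni, ci in zip(n, cost))
        w = sum(ni * wi for ni, wi in zip(n, weight))
        if c <= max_cost and w <= max_weight:
            rel = system_reliability(n, r)
            if best is None or rel > best[1]:
                best = (n, rel)
    return best  # (allocation, reliability) or None if infeasible
```

    For larger problems a true pattern/direct search explores the design space without full enumeration; the exhaustive loop here just keeps the sketch short.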

  10. A rapid reliability estimation method for directed acyclic lifeline networks with statistically dependent components

    International Nuclear Information System (INIS)

    Kang, Won-Hee; Kliese, Alyce

    2014-01-01

    Lifeline networks, such as transportation, water supply, sewer, telecommunication, and electrical and gas networks, are essential elements of the economic and societal functions of urban areas, but their components are highly susceptible to natural and man-made hazards. In this context, it is essential to provide effective pre-disaster hazard mitigation strategies and prompt post-disaster risk management efforts based on rapid system reliability assessment. This paper proposes a rapid reliability estimation method for node-pair connectivity analysis of lifeline networks, especially when the network components are statistically correlated. Recursive procedures are proposed to compound all network nodes until they become a single super node representing the connectivity between the origin and destination nodes. The proposed method is applied to numerical network examples and to benchmark interconnected power and water networks in Memphis, Shelby County. The connectivity analysis results show the proposed method's reasonable accuracy and remarkable efficiency compared to Monte Carlo simulations.
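
    For context, the Monte Carlo baseline that the proposed method is benchmarked against can be sketched as below for origin-destination connectivity, assuming independent component failures; handling statistical correlation, the paper's actual contribution, is not attempted here. The network and probabilities are illustrative.

```python
import random

def connected(origin, dest, edges, up):
    """Reachability search over the surviving directed edges."""
    frontier, seen = [origin], {origin}
    while frontier:
        node = frontier.pop()
        if node == dest:
            return True
        for (u, v), alive in zip(edges, up):
            if alive and u == node and v not in seen:
                seen.add(v)
                frontier.append(v)
    return False

def mc_connectivity(origin, dest, edges, p_up, trials=20000, seed=1):
    """Estimate P(origin connects to dest) with independent edge survival probabilities."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        up = [rng.random() < p for p in p_up]  # sample each component state
        hits += connected(origin, dest, edges, up)
    return hits / trials
```

    For two parallel A-to-B links of survival probability 0.9 the exact answer is 1 - 0.1^2 = 0.99, which the estimator approaches as trials grow; the cost of many such samples on a large network is what motivates the paper's recursive compounding.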

  11. Accuracy and Reliability of the Klales et al. (2012) Morphoscopic Pelvic Sexing Method.

    Science.gov (United States)

    Lesciotto, Kate M; Doershuk, Lily J

    2018-01-01

    Klales et al. (2012) devised an ordinal scoring system for the morphoscopic pelvic traits described by Phenice (1969) and used for sex estimation of skeletal remains. The aim of this study was to test the accuracy and reliability of the Klales method using a large sample from the Hamann-Todd collection (n = 279). Two observers were blinded to sex, ancestry, and age and used the Klales et al. method to estimate the sex of each individual. Sex was correctly estimated for females with over 95% accuracy; however, the male allocation accuracy was approximately 50%. Weighted Cohen's kappa and intraclass correlation coefficient analysis for evaluating intra- and interobserver error showed moderate to substantial agreement for all traits. Although each trait can be reliably scored using the Klales method, low accuracy rates and high sex bias indicate better trait descriptions and visual guides are necessary to more accurately reflect the range of morphological variation. © 2017 American Academy of Forensic Sciences.
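
    The weighted Cohen's kappa used above for observer agreement can be computed as in this sketch, assuming linear weights over ordinal trait-score categories; the example scores are made up and do not come from the study.

```python
def weighted_kappa(r1, r2, categories):
    """Cohen's weighted kappa with linear weights for two raters' ordinal scores.

    Assumes at least some expected disagreement (pe < 1).
    """
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(r1)
    # Observed joint distribution of the two raters' scores.
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(r1, r2):
        obs[idx[a]][idx[b]] += 1.0 / n
    p1 = [sum(row) for row in obs]                              # rater 1 marginals
    p2 = [sum(obs[i][j] for i in range(k)) for j in range(k)]   # rater 2 marginals
    # Linear agreement weights: 1 on the diagonal, 0 at maximal disagreement.
    w = [[1 - abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]
    po = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    pe = sum(w[i][j] * p1[i] * p2[j] for i in range(k) for j in range(k))
    return (po - pe) / (1 - pe)
```

    Perfect agreement yields kappa = 1, and agreement no better than chance yields 0, which is the scale behind the "moderate to substantial" interpretation quoted above.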

  12. An automated method for estimating reliability of grid systems using Bayesian networks

    International Nuclear Information System (INIS)

    Doguc, Ozge; Emmanuel Ramirez-Marquez, Jose

    2012-01-01

    Grid computing has become relevant due to its applications in large-scale resource sharing, wide-area information transfer, and multi-institutional collaboration. In general, in grid computing a service requests the use of a set of resources, available in a grid, to complete certain tasks. Although analysis tools and techniques for these types of systems have been studied, grid reliability analysis is generally computation-intensive due to the complexity of the system. Moreover, conventional reliability models rest on common assumptions that cannot be applied to grid systems. Therefore, new analytical methods are needed for effective and accurate assessment of grid reliability. This study presents a new method for estimating grid service reliability which, unlike previous studies, does not require prior knowledge about the grid system structure. Moreover, the proposed method does not rely on any assumptions about link and node failure rates. The approach uses a data-mining algorithm, K2, to discover the grid system structure from raw historical system data, which allows minimum resource spanning trees (MRST) to be found within the grid; it then uses Bayesian networks (BN) to model the MRST and estimate grid service reliability.

  13. Method for critical software event execution reliability in high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Kidd, M.E. [Sandia National Labs., Albuquerque, NM (United States)

    1997-11-01

    This report contains viewgraphs on a method called SEER, which provides a high level of confidence that critical software-driven event execution sequences faithfully execute in the face of transient computer architecture failures, in both normal and abnormal operating environments.

  14. Extended block diagram method for a multi-state system reliability assessment

    International Nuclear Information System (INIS)

    Lisnianski, Anatoly

    2007-01-01

    The presented method extends the classical reliability block diagram method to a repairable multi-state system. It is very suitable for engineering applications since the procedure is well formalized and based on the natural decomposition of the entire multi-state system (the system is represented as a collection of its elements). Until now, the classical block diagram method did not provide reliability assessment for repairable multi-state systems. Straightforward stochastic process methods are very difficult to apply in engineering practice in such cases because of the 'dimension damnation', i.e., the huge number of system states. The suggested method is based on combined random processes and the universal generating function technique, and it drastically reduces the number of states in the multi-state model.
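
    The universal generating function technique mentioned above composes per-element performance distributions through structure operators (sum for parallel capacity, min for series flow). A minimal sketch, with invented performance levels and probabilities:

```python
from collections import defaultdict

def compose(u1, u2, op):
    """Combine two UGFs, each a dict {performance level: probability}, via a structure operator."""
    out = defaultdict(float)
    for g1, p1 in u1.items():
        for g2, p2 in u2.items():
            out[op(g1, g2)] += p1 * p2  # elements assumed statistically independent
    return dict(out)

def availability(ugf, demand):
    """P(system performance >= demand)."""
    return sum(p for g, p in ugf.items() if g >= demand)

# Two parallel 50 MW generators feeding one transmission line (levels in MW).
generator = {0: 0.1, 50: 0.9}
station = compose(generator, generator, lambda a, b: a + b)  # parallel: capacities add
system = compose(station, {0: 0.05, 100: 0.95}, min)         # series: bottleneck wins
```

    The composed dictionary stays small because states with equal performance are merged at each step, which is the state-reduction effect the abstract refers to.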

  15. Reliability of lower limb alignment measures using an established landmark-based method with a customized computer software program

    Science.gov (United States)

    Sled, Elizabeth A.; Sheehy, Lisa M.; Felson, David T.; Costigan, Patrick A.; Lam, Miu; Cooke, T. Derek V.

    2010-01-01

    The objective of the study was to evaluate the reliability of frontal plane lower limb alignment measures using a landmark-based method by (1) comparing inter- and intra-reader reliability between measurements of alignment obtained manually with those using a computer program, and (2) determining inter- and intra-reader reliability of computer-assisted alignment measures from full-limb radiographs. An established method for measuring alignment was used, involving selection of 10 femoral and tibial bone landmarks. 1) To compare manual and computer methods, we used digital images and matching paper copies of five alignment patterns simulating healthy and malaligned limbs drawn using AutoCAD. Seven readers were trained in each system. Paper copies were measured manually and repeat measurements were performed daily for 3 days, followed by a similar routine with the digital images using the computer. 2) To examine the reliability of computer-assisted measures from full-limb radiographs, 100 images (200 limbs) were selected as a random sample from 1,500 full-limb digital radiographs which were part of the Multicenter Osteoarthritis (MOST) Study. Three trained readers used the software program to measure alignment twice from the batch of 100 images, with two or more weeks between batch handling. Manual and computer measures of alignment showed excellent agreement (intraclass correlations [ICCs] 0.977 – 0.999 for computer analysis; 0.820 – 0.995 for manual measures). The computer program applied to full-limb radiographs produced alignment measurements with high inter- and intra-reader reliability (ICCs 0.839 – 0.998). In conclusion, alignment measures using a bone landmark-based approach and a computer program were highly reliable between multiple readers. PMID:19882339

  16. Design and reliability, availability, maintainability, and safety analysis of a high availability quadruple vital computer system

    Institute of Scientific and Technical Information of China (English)

    Ping TAN; Wei-ting HE; Jia LIN; Hong-ming ZHAO; Jian CHU

    2011-01-01

    With the development of high-speed railways in China, more than 2000 high-speed trains will be put into use. The safety and efficiency of railway transportation is increasingly important. We have designed a high availability quadruple vital computer (HAQVC) system based on an analysis of the architectures of the traditional double 2-out-of-2 system and the 2-out-of-3 system. The HAQVC system is a system with high availability and safety, with prominent characteristics such as a brand-new internal architecture, high efficiency, a reliable data interaction mechanism, and an operation state change mechanism. The hardware of the vital CPU is based on ARM7 with a real-time embedded safe operating system (ES-OS). The Markov modeling method is used to evaluate the reliability, availability, maintainability, and safety (RAMS) of the system. In this paper, we demonstrate that the HAQVC system is more reliable than the all voting triple modular redundancy (AVTMR) system and the double 2-out-of-2 system. Thus, the design can be used for specific application systems, such as airplane or high-speed railway systems.
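
    The voting architectures compared above can be put side by side with the textbook k-out-of-n reliability formula, assuming independent channels of equal reliability; the channel reliability below is illustrative, not a figure from the paper.

```python
from math import comb

def k_out_of_n(k, n, p):
    """Reliability of a k-out-of-n:G voting system of independent channels, each of reliability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
```

    For p = 0.99, 2-out-of-3 voting (about 0.9997) masks single-channel failures that a 2-out-of-2 scheme (0.9801) would treat as a system trip, which is the availability motivation behind the triple and quadruple arrangements discussed above.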

  17. Reliability and validity of a brief method to assess nociceptive flexion reflex (NFR) threshold.

    Science.gov (United States)

    Rhudy, Jamie L; France, Christopher R

    2011-07-01

    The nociceptive flexion reflex (NFR) is a physiological tool to study spinal nociception. However, NFR assessment can take several minutes and expose participants to repeated suprathreshold stimulations. The 4 studies reported here assessed the reliability and validity of a brief method to assess NFR threshold that uses a single ascending series of stimulations (Peak 1 NFR), by comparing it to a well-validated method that uses 3 ascending/descending staircases of stimulations (Staircase NFR). Correlations between the NFR definitions were high, were on par with test-retest correlations of Staircase NFR, and were not affected by participant sex or chronic pain status. Results also indicated the test-retest reliabilities for the 2 definitions were similar. Using larger stimulus increments (4 mAs) to assess Peak 1 NFR tended to result in higher NFR threshold estimates than using the Staircase NFR definition, whereas smaller stimulus increments (2 mAs) tended to result in lower NFR threshold estimates than the Staircase NFR definition. Neither NFR definition was correlated with anxiety, pain catastrophizing, or anxiety sensitivity. In sum, a single ascending series of electrical stimulations results in a reliable and valid estimate of NFR threshold. However, caution may be warranted when comparing NFR thresholds across studies that differ in the ascending stimulus increments. This brief method to assess NFR threshold is reliable and valid; therefore, it should be useful to clinical pain researchers interested in quickly assessing inter- and intra-individual differences in spinal nociceptive processes. Copyright © 2011 American Pain Society. Published by Elsevier Inc. All rights reserved.

  18. Development of high-reliability control system for nuclear power plants

    International Nuclear Information System (INIS)

    Asami, K.; Yanai, K.; Hirose, H.; Ito, T.

    1983-01-01

    In Japan, many nuclear power generating plants are in operation and under construction. There is a general awareness of the problems in connection with nuclear power generation and strong emphasis is put on achieving highly reliable operation of nuclear power plants. Hitachi has developed a new high-reliability control system. NURECS-3000 (NUclear Power Plant High-REliability Control System), which is applied to the main control systems, such as the reactor feedwater control system, the reactor recirculation control system and the main turbine control system. The NURECS-3000 system was designed taking into account the fact that there will be failures, but the aim is for the system to continue to function correctly; it is therefore a fault-tolerant system. It has redundant components which can be completely isolated from each other in order to prevent fault propagation. The system has a hierarchical configuration, with a main controller, consisting of a triplex microcomputer system, and sub-loop controllers. Special care was taken to ensure the independence of these subsystems. Since most of the redundant system failures are caused by common-mode failures and the reliability of redundant systems depends on the reliability of the common-mode parts, the aim was to minimize these parts. (author)

  19. High reliability megawatt transformer/rectifier

    Science.gov (United States)

    Zwass, Samuel; Ashe, Harry; Peters, John W.

    1991-01-01

    The goal of the two-phase program is to develop the technology and to design and fabricate ultralightweight, high-reliability DC-to-DC converters for space power applications. The converters will operate from a 5000 V dc source and deliver 1 MW of power at 100 kV dc. The power weight density goal is 0.1 kg/kW. The cycle-to-cycle voltage stability goal was ±1 percent RMS. The converter is to operate at an ambient temperature of -40 °C with 16-minute power pulses and one-hour off times. The uniqueness of the Phase 1 design resides in the dc switching array, which operates the converter at 20 kHz using Hollotron plasma switches, along with a specially designed low-loss, low-leakage-inductance, lightweight high-voltage transformer. This approach considerably reduced the number of components in the converter, thereby increasing system reliability. To achieve an optimum transformer for this application, the design uses four 25 kV secondary windings to produce the 100 kV dc output, thus reducing the transformer leakage inductance and the ac voltage stresses. A specially designed insulation system improves the high-voltage dielectric withstand capability and reduces the insulation path thickness, thereby reducing component weight. Tradeoff studies and tests conducted on scaled-down model circuits, using representative coil insulation paths, have verified the calculated transformer waveshape parameters and the insulation system safety. In Phase 1 of the program, a converter design approach was developed and a preliminary transformer design was completed. A fault control circuit was designed and a thermal profile of the converter was also developed.

  20. Achieving High Reliability Operations Through Multi-Program Integration

    Energy Technology Data Exchange (ETDEWEB)

    Holly M. Ashley; Ronald K. Farris; Robert E. Richards

    2009-04-01

    Over the last 20 years the Idaho National Laboratory (INL) has adopted a number of operations and safety-related programs, each of which has periodically taken its turn in the limelight. As new programs have come along there has been natural competition for resources, focus, and commitment. In the last few years, the INL has made real progress in integrating all these programs and is starting to realize important synergies. Contributing to this integration are both collaborative individuals and an emerging shared vision and goal of the INL fully maturing in its high reliability operations. This goal is powerful because the concept of high reliability operations (and the resulting organizations) is a masterful amalgam and orchestrator of the best of all the participating programs (i.e., conduct of operations, behavior-based safety, human performance, voluntary protection, quality assurance, and integrated safety management). This paper is a brief recounting of the lessons learned, thus far, at the INL in bringing previously competing programs into harmony under the goal (umbrella) of seeking to perform regularly as a high reliability organization. In addition to a brief diagram-illustrated historical review, the authors share the INL’s primary successes (things already effectively stopped or started) and the gaps yet to be bridged.

  1. Method for assessing software reliability of the document management system using the RFID technology

    Directory of Open Access Journals (Sweden)

    Kiedrowicz Maciej

    2016-01-01

    Full Text Available The deliberations presented in this study refer to a method for assessing the software reliability of a document management system using RFID technology. A method for determining the reliability structure of the discussed software, understood as the index vector for assessing the reliability of its components, is proposed. The model of the analyzed software is the control transfer graph, in which the probability of activating individual components during the system's operation results from the so-called operational profile, which characterizes the actual working environment. The reliability structure is established as the solution of a specific mathematical software task. Knowledge of the reliability structure of the software makes it possible to properly plan the time and financial expenses necessary to build software that meets the reliability requirements. The application of the presented method is illustrated by a numerical example corresponding to the software reality of the RFID document management system.

  2. Structural system reliability calculation using a probabilistic fault tree analysis method

    Science.gov (United States)

    Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.

    1992-01-01

    The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes: developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computationally intensive calculations. A computer program has been developed to implement the PFTA.
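
    The record's method uses approximation functions and adaptive importance sampling; as a much simpler stand-in, plain Monte Carlo evaluation of a fault tree top event looks like the sketch below. The example tree (A AND B) OR C and the basic-event probabilities are invented.

```python
import random

def top_event(basic):
    """Example fault tree logic: top event occurs if (A AND B) OR C."""
    return (basic['A'] and basic['B']) or basic['C']

def mc_top_probability(p, trials=200000, seed=7):
    """Estimate the top-event probability by sampling independent basic events."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        sample = {name: rng.random() < prob for name, prob in p.items()}
        hits += top_event(sample)
    return hits / trials
```

    For p_A = p_B = 0.1 and p_C = 0.01 the exact answer is 0.01 + 0.01 - 0.0001 = 0.0199; for rare top events this plain sampler needs many trials, which is precisely why the paper resorts to importance sampling.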

  3. High Reliability Cryogenic Piezoelectric Valve Actuator, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Cryogenic fluid valves are subject to harsh exposure and actuators to drive these valves require robust performance and high reliability. DSM's piezoelectric...

  4. A human reliability based usability evaluation method for safety-critical software

    International Nuclear Information System (INIS)

    Boring, R. L.; Tran, T. Q.; Gertman, D. I.; Ragsdale, A.

    2006-01-01

    Boring and Gertman (2005) introduced a novel method that augments heuristic usability evaluation methods with that of the human reliability analysis method of SPAR-H. By assigning probabilistic modifiers to individual heuristics, it is possible to arrive at the usability error probability (UEP). Although this UEP is not a literal probability of error, it nonetheless provides a quantitative basis to heuristic evaluation. This method allows one to seamlessly prioritize and identify usability issues (i.e., a higher UEP requires more immediate fixes). However, the original version of this method required the usability evaluator to assign priority weights to the final UEP, thus allowing the priority of a usability issue to differ among usability evaluators. The purpose of this paper is to explore an alternative approach to standardize the priority weighting of the UEP in an effort to improve the method's reliability. (authors)
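
    The UEP computation described above, a nominal error probability scaled by heuristic-specific modifiers, can be sketched as follows; the heuristic names and multiplier values are placeholders, not SPAR-H's actual tables.

```python
def usability_error_probability(nominal, multipliers):
    """Scale a nominal error probability by per-heuristic modifiers, capped at 1.0."""
    uep = nominal
    for factor in multipliers.values():
        uep *= factor
    return min(uep, 1.0)
```

    Usability issues can then be ranked by UEP, so the violation with the largest product is fixed first, which is the prioritization role the abstract describes.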

  5. Method of administration of PROMIS scales did not significantly impact score level, reliability, or validity

    DEFF Research Database (Denmark)

    Bjorner, Jakob B; Rose, Matthias; Gandek, Barbara

    2014-01-01

    OBJECTIVES: To test the impact of the method of administration (MOA) on score level, reliability, and validity of scales developed in the Patient Reported Outcomes Measurement Information System (PROMIS). STUDY DESIGN AND SETTING: Two nonoverlapping parallel forms each containing eight items from ... One form was administered by paper questionnaire (PQ), personal digital assistant (PDA), or personal computer (PC) and a second form by PC, in the same administration. Method equivalence was evaluated through analyses of difference scores, intraclass correlations (ICCs), and convergent/discriminant validity. RESULTS: In difference score analyses, no significant mode differences were found and all confidence intervals were within the prespecified minimal important difference of 0.2 standard deviation. Parallel-forms reliabilities were very high (ICC = 0.85-0.93). Only one across-mode ICC was significantly lower than the same-mode ICC. Tests of validity ...

  6. An exact method for solving logical loops in reliability analysis

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2009-01-01

    This paper presents an exact method for solving logical loops in reliability analysis. Systems that include logical loops are usually described by simultaneous Boolean equations. First, a basic rule for solving simultaneous Boolean equations is presented. Next, analysis procedures are shown for a three-component system with external supports. Third, a more detailed discussion is given of the establishment of logical loop relations. Finally, two typical structures that include more than one logical loop are taken up; their analysis results and corresponding GO-FLOW charts are given. The proposed analytical method is applicable to loop structures that can be described by simultaneous Boolean equations, and it is very useful in evaluating the reliability of complex engineering systems.

  7. Structural reliability methods: Code development status

    Science.gov (United States)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  8. High-Reliable PLC RTOS Development and RPS Structure Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, H. S.; Song, D. Y.; Sohn, D. S.; Kim, J. H. [Enersys Co., Daejeon (Korea, Republic of)

    2008-04-15

    One of the KNICS objectives is to develop a platform for Nuclear Power Plant (NPP) I and C (Instrumentation and Control) systems, especially the plant protection system. The developed platform is POSAFE-Q, and this work supports the development of POSAFE-Q with the development of a high-reliability real-time operating system (RTOS) and programmable logic device (PLD) software. Another KNICS objective is to develop safety I and C systems, such as the Reactor Protection System (RPS) and the Engineered Safety Feature-Component Control System (ESF-CCS). This work plays an important role in the structure analysis for the RPS. Validation and verification (V and V) of safety-critical software is essential work to make a digital plant protection system highly reliable and safe. Generally, the reliability and safety of a software-based system can be improved by a strict quality assurance framework including the software development itself. In other words, through V and V the reliability and safety of a system can be improved, and development activities like software requirement specifications, software design specifications, component tests, integration tests, and system tests shall be appropriately documented for V and V.

  9. High-Reliable PLC RTOS Development and RPS Structure Analysis

    International Nuclear Information System (INIS)

    Sohn, H. S.; Song, D. Y.; Sohn, D. S.; Kim, J. H.

    2008-04-01

    One of the KNICS objectives is to develop a platform for Nuclear Power Plant (NPP) I and C (Instrumentation and Control) systems, especially the plant protection system. The developed platform is POSAFE-Q, and this work supports the development of POSAFE-Q with the development of a high-reliability real-time operating system (RTOS) and programmable logic device (PLD) software. Another KNICS objective is to develop safety I and C systems, such as the Reactor Protection System (RPS) and the Engineered Safety Feature-Component Control System (ESF-CCS). This work plays an important role in the structure analysis for the RPS. Validation and verification (V and V) of safety-critical software is essential work to make a digital plant protection system highly reliable and safe. Generally, the reliability and safety of a software-based system can be improved by a strict quality assurance framework including the software development itself. In other words, through V and V the reliability and safety of a system can be improved, and development activities like software requirement specifications, software design specifications, component tests, integration tests, and system tests shall be appropriately documented for V and V.

  10. Integrated Markov-neural reliability computation method: A case for multiple automated guided vehicle system

    International Nuclear Information System (INIS)

    Fazlollahtabar, Hamed; Saidi-Mehrabad, Mohammad; Balakrishnan, Jaydeep

    2015-01-01

    This paper proposes an integrated Markovian and back-propagation neural network approach to compute the reliability of a system. Since states of failure occurrence are significant elements for accurate reliability computation, a Markovian-based reliability assessment method is designed. Due to the drawbacks of the Markovian model for steady-state reliability computations and of the neural network for the initial training pattern, an integration called Markov-neural is developed and evaluated. To show the efficiency of the proposed approach, comparative analyses are performed. Also, for managerial implications, an application case for multiple automated guided vehicles (AGVs) in manufacturing networks is conducted. - Highlights: • Integrated Markovian and back-propagation neural network approach to compute reliability. • Markovian-based reliability assessment method. • Managerial implications are shown in an application case for multiple automated guided vehicles (AGVs) in manufacturing networks
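
    On the Markovian side of the integration, the classic two-state repairable model has closed-form steady-state and transient availability; a sketch with illustrative failure and repair rates (per hour), not values from the paper:

```python
import math

def steady_state_availability(failure_rate, repair_rate):
    """Limiting availability of a two-state repairable Markov model: mu / (lambda + mu)."""
    return repair_rate / (failure_rate + repair_rate)

def availability_at(t, failure_rate, repair_rate):
    """Transient (point) availability A(t), starting from the working state at t = 0."""
    s = failure_rate + repair_rate
    return repair_rate / s + (failure_rate / s) * math.exp(-s * t)
```

    With lambda = 0.01/h and mu = 0.1/h, A(t) decays from 1.0 toward the steady-state value 0.1/0.11 ≈ 0.909; systems of many components need the full state-space (or, as in the paper, a neural surrogate) because the joint chain grows exponentially.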

  11. Reliability Analysis Of Fire System On The Industry Facility By Use Fameca Method

    International Nuclear Information System (INIS)

    Sony T, D.T.; Situmorang, Johnny; Ismu W, Puradwi; Demon H; Mulyanto, Dwijo; Kusmono, Slamet; Santa, Sigit Asmara

    2000-01-01

    FAMECA is one of the analysis methods for determining system reliability in an industrial facility. The analysis follows a procedure of identifying component functions and determining failure modes, severity levels, and the effects of failure. The reliability value is determined from a combination of three factors: severity level, component failure value, and component criticality. A reliability analysis of an industrial fire system has been carried out using the FAMECA method. The critical components identified include the pump, air release valve, check valve, manual test valve, isolation valve, control system, etc.
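
    A common way to combine factors like those named above in FMECA-style analyses is the risk priority number (severity × occurrence × detection); whether FAMECA uses exactly this product is not stated in the record, so treat the sketch and its component table as a generic illustration.

```python
def risk_priority_number(severity, occurrence, detection):
    """Classical FMECA risk priority number: the product of the three ordinal ratings."""
    return severity * occurrence * detection

def rank_components(table):
    """Sort (name, severity, occurrence, detection) rows by RPN, highest risk first."""
    return sorted(table, key=lambda row: risk_priority_number(*row[1:]), reverse=True)

# Hypothetical ratings on 1-10 scales for fire-system components.
fire_system = [
    ('pump', 9, 4, 3),
    ('check valve', 5, 2, 2),
    ('isolation valve', 7, 3, 2),
]
```

    Ranking by RPN puts the pump first (RPN 108), matching the intuition that it is the most critical component in the list.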

  12. Models of Information Security Highly Reliable Computing Systems

    Directory of Open Access Journals (Sweden)

    Vsevolod Ozirisovich Chukanov

    2016-03-01

    Full Text Available Methods of combined redundancy are considered. Reliability models of systems are described that take into account the restoration and preventive-maintenance parameters of system blocks. Expressions for the average number of preventive-maintenance actions and for the availability quotient of system blocks are given.

  13. COMPOSITE METHOD OF RELIABILITY RESEARCH FOR HIERARCHICAL MULTILAYER ROUTING SYSTEMS

    Directory of Open Access Journals (Sweden)

    R. B. Tregubov

    2016-09-01

    Full Text Available The paper deals with a research method for hierarchical multilayer routing systems. The method is a composition of methods from graph theory, reliability theory, probability theory, etc. These methods are applied to the solution of different specific analysis and optimization tasks and are systemically connected and coordinated with each other through a uniform set-theoretic representation of the object of research. Hierarchical multilayer routing systems are considered as infrastructure facilities (gas and oil pipelines, automobile and railway networks, power supply and communication systems) that distribute material resources, energy, or information using hierarchically nested routing functions. For illustration, the theoretical constructions are applied to determining the probability of the up state of a specific infocommunication system. The author shows the possibility of constructively combining a graph representation of the structure of the object of research with a logical-probabilistic analysis of its reliability indices through a uniform set-theoretic representation of its elements and the processes proceeding in them.

  14. A study in the reliability analysis method for nuclear power plant structures (I)

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Byung Hwan; Choi, Seong Cheol; Shin, Ho Sang; Yang, In Hwan; Kim, Yi Sung; Yu, Young; Kim, Se Hun [Seoul, Nationl Univ., Seoul (Korea, Republic of)

    1999-03-15

    Nuclear power plant structures may be exposed to aggressive environmental effects that cause their strength and stiffness to decrease over their service life. Although the physics of these damage mechanisms are reasonably well understood and quantitative evaluation of their effects on time-dependent structural behavior is possible in some instances, such evaluations are generally very difficult and remain novel. The assessment of existing steel containments in nuclear power plants for continued service must provide quantitative evidence that they are able to withstand future extreme loads during a service period with an acceptable level of reliability. Rational methodologies for this reliability assessment can be developed from mechanistic models of structural deterioration, using time-dependent structural reliability analysis to take loading and strength uncertainties into account. The final goal of this study is to develop an analysis method for the reliability of containment structures. The cause and mechanism of corrosion are first clarified and a reliability assessment method is established. By introducing the equivalent normal distribution, a reliability analysis procedure that can determine failure probabilities has been established. The influence of design variables on reliability, and the relation between reliability and service life, will be studied in the second year of this research.
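
    The time-dependent reliability analysis described above can be illustrated with a crude Monte Carlo sketch in which structural resistance degrades linearly with service time; the normal distributions and the 0.5 %/year corrosion loss rate below are invented for illustration, not values from the study.

```python
import random

def failure_probability(t, trials=100000, seed=3):
    """Estimate P(load > degraded resistance) at service time t (years).

    Assumes normally distributed resistance and load, and a linear
    corrosion-driven loss of 0.5 % of resistance per year.
    """
    rng = random.Random(seed)
    fails = 0
    for _ in range(trials):
        resistance = rng.gauss(100.0, 8.0) * (1.0 - 0.005 * t)  # degraded strength
        load = rng.gauss(60.0, 10.0)                            # extreme-load demand
        fails += load > resistance
    return fails / trials
```

    The estimated failure probability grows with service time, which is the qualitative relation between reliability and service life that the study's second year is set to quantify.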

  15. Performance and Reliability of Bonded Interfaces for High-temperature Packaging: Annual Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    DeVoto, Douglas J. [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-10-19

    As maximum device temperatures approach 200 °C in continuous operation, sintered silver materials promise to maintain bonds at these high temperatures without excessive degradation rates. A detailed characterization of the thermal performance and reliability of sintered silver materials and processes has been initiated for the next year. Future steps in crack modeling include efforts to simulate crack propagation directly using the extended finite element method (X-FEM), a numerical technique that uses the partition of unity method for modeling discontinuities such as cracks in a system.

  16. A critical evaluation of deterministic methods in size optimisation of reliable and cost effective standalone hybrid renewable energy systems

    International Nuclear Information System (INIS)

    Maheri, Alireza

    2014-01-01

    Reliability of a hybrid renewable energy system (HRES) strongly depends on various uncertainties affecting the amount of power produced by the system. In the design of systems subject to uncertainties, both deterministic and nondeterministic design approaches can be adopted. In a deterministic design approach, the designer considers the presence of uncertainties and incorporates them indirectly into the design by applying safety factors. It is assumed that, by employing suitable safety factors and considering worst-case scenarios, reliable systems can be designed. In effect, the multi-objective optimisation problem with the two objectives of reliability and cost is reduced to a single-objective optimisation problem with the objective of cost only. In this paper the competence of deterministic design methods in size optimisation of reliable standalone wind–PV–battery, wind–PV–diesel and wind–PV–battery–diesel configurations is examined. For each configuration, first, using different values of safety factors, the optimal size of the system components which minimises the system cost is found deterministically. Then, for each case, using a Monte Carlo simulation, the effect of the safety factors on the reliability and the cost is investigated. In performing the reliability analysis, several reliability measures, namely unmet load, blackout durations (total, maximum and average) and mean time between failures, are considered. It is shown that the traditional methods of considering the effect of uncertainties in deterministic designs, such as designing for an autonomy period and employing safety factors, have either little or unpredictable impact on the actual reliability of the designed wind–PV–battery configuration. In the case of wind–PV–diesel and wind–PV–battery–diesel configurations it is shown that, while using a high-enough margin of safety in sizing the diesel generator leads to reliable systems, the optimum value for this margin of safety leading to a
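The Monte Carlo check of a deterministically sized system can be sketched as follows. The PV, load and battery figures are invented placeholders, not values from the paper; the point is only how a safety factor applied to a component size maps to a simulated unmet-load fraction:

```python
import random

def unmet_load_fraction(battery_kwh, safety_factor=1.0, days=365,
                        n_runs=200, seed=0):
    """Monte Carlo estimate of the unmet-load fraction of a toy
    standalone PV-battery system.  All numbers are illustrative
    assumptions, not taken from the paper."""
    rng = random.Random(seed)
    capacity = battery_kwh * safety_factor   # deterministic sizing rule
    total_unmet = 0.0
    total_load = 0.0
    for _ in range(n_runs):
        soc = capacity                       # battery state of charge, kWh
        for _ in range(days):
            pv = max(0.0, rng.gauss(12.0, 4.0))    # daily PV yield, kWh
            load = max(0.0, rng.gauss(10.0, 2.0))  # daily demand, kWh
            soc = min(capacity, soc + pv)
            served = min(load, soc)
            soc -= served
            total_unmet += load - served
            total_load += load
    return total_unmet / total_load
```

Sweeping `safety_factor` and plotting the resulting unmet-load fraction against cost reproduces, in miniature, the kind of sensitivity study the paper performs for each configuration.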

  17. High pressure, high current, low inductance, high reliability sealed terminals

    Science.gov (United States)

    Hsu, John S. [Oak Ridge, TN]; McKeever, John W. [Oak Ridge, TN]

    2010-03-23

    The invention is a terminal assembly having a casing with at least one delivery tapered-cone conductor and at least one return tapered-cone conductor routed there-through. The delivery and return tapered-cone conductors are electrically isolated from each other and positioned in the annuluses of ordered concentric cones at an off-normal angle. The tapered-cone conductors can serve as AC phase conductors and DC link conductors. The center core has at least one service conduit of gate signal leads, diagnostic signal wires, and refrigerant tubing routed there-through. A seal material is in direct contact with the casing inner surface, the tapered-cone conductors, and the service conduits, thereby hermetically filling the interstitial space in the casing interior core and center core. The assembly provides simultaneous high-current, high-pressure, low-inductance, and high-reliability service.

  18. Assessing high reliability via Bayesian approach and accelerated tests

    International Nuclear Information System (INIS)

    Erto, Pasquale; Giorgio, Massimiliano

    2002-01-01

    Sometimes the assessment of very high reliability levels is difficult for the following main reasons: - the high reliability level of each item makes it impossible to obtain, in a reasonably short time, a sufficient number of failures; - the high cost of the high reliability items to submit to life tests makes it unfeasible to collect enough data for 'classical' statistical analyses. In this context, this paper presents a Bayesian solution to the problem of estimating the parameters of the Weibull-inverse power law model on the basis of a limited number (say six) of life tests, carried out at different stress levels, all higher than the normal one. The over-stressed (i.e. accelerated) tests allow the use of experimental data obtained in a reasonably short time. The Bayesian approach enables one to reduce the required number of failures by adding to the failure information the available a priori engineering knowledge. This engineers' involvement conforms to the most advanced management policy that aims at involving everyone's commitment in order to obtain total quality. A Monte Carlo study of the non-asymptotic properties of the proposed estimators and a comparison with the properties of maximum likelihood estimators concludes the work
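A minimal version of the grid-based Bayesian idea can be sketched as follows, simplifying the Weibull-inverse power law to exponential lifetimes (Weibull shape fixed at 1) with an inverse-power-law mean life and a flat prior over the exponent. The stress levels, sample sizes and constants are synthetic, not the paper's data:

```python
import math
import random

def posterior_over_n(data, C, n_grid):
    """Grid posterior for the inverse-power-law exponent n, assuming
    exponential lifetimes with mean C / V**n at stress V and a flat
    prior over the grid.  Illustrative sketch, not the paper's full
    Weibull-inverse-power-law model."""
    log_post = []
    for n in n_grid:
        ll = 0.0
        for stress, t in data:
            theta = C / stress ** n          # mean life at this stress
            ll += -math.log(theta) - t / theta
        log_post.append(ll)
    m = max(log_post)                        # stabilize the exponentials
    w = [math.exp(lp - m) for lp in log_post]
    s = sum(w)
    return [x / s for x in w]

# synthetic accelerated-test data: true n = 2.5, C = 1e6
rng = random.Random(42)
true_n, C = 2.5, 1e6
data = [(V, rng.expovariate(V ** true_n / C))
        for V in (50, 80, 120) for _ in range(30)]
n_grid = [1.0 + 0.05 * i for i in range(61)]   # grid from 1.0 to 4.0
post = posterior_over_n(data, C, n_grid)
n_mean = sum(n * p for n, p in zip(n_grid, post))
```

An informative prior (the engineers' a priori knowledge the abstract mentions) would simply replace the flat weights with prior probabilities on the grid before normalizing.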

  19. The quadrant method measuring four points is as reliable and accurate as the quadrant method in the evaluation after anatomical double-bundle ACL reconstruction.

    Science.gov (United States)

    Mochizuki, Yuta; Kaneko, Takao; Kawahara, Keisuke; Toyoda, Shinya; Kono, Norihiko; Hada, Masaru; Ikegami, Hiroyasu; Musha, Yoshiro

    2017-11-20

    The quadrant method was described by Bernard et al. and has been widely used for postoperative evaluation of anterior cruciate ligament (ACL) reconstruction. The purpose of this research is to further develop the quadrant method by measuring four points, which we named the four-point quadrant method, and to compare it with the quadrant method. Three-dimensional computed tomography (3D-CT) analyses were performed in 25 patients who underwent double-bundle ACL reconstruction using the outside-in technique. The four points in this study's quadrant method were defined as point 1 (highest), point 2 (deepest), point 3 (lowest), and point 4 (shallowest) in the femoral tunnel position. The value of depth and height at each point was measured. In this four-point quadrant method, the antero-medial (AM) tunnel is (depth1, height2) and the postero-lateral (PL) tunnel is (depth3, height4). The 3D-CT images were evaluated independently by 2 orthopaedic surgeons. A second measurement was performed by both observers after a 4-week interval. Intra- and inter-observer reliability was calculated by means of the intra-class correlation coefficient (ICC). Also, the accuracy of the method was evaluated against the quadrant method. Intra-observer reliability was almost perfect for both the AM and PL tunnels (ICC > 0.81). Inter-observer reliability of the AM tunnel was substantial (ICC > 0.61) and that of the PL tunnel was almost perfect (ICC > 0.81). The AM tunnel position was 0.13% deeper, 0.58% higher and the PL tunnel position was 0.01% shallower, 0.13% lower compared to the quadrant method. The four-point quadrant method was found to have high intra- and inter-observer reliability and accuracy. This method can evaluate the tunnel position regardless of the shape and morphology of the bone tunnel aperture, can be used for comparison, and can provide measurements that can be compared with various reconstruction methods. The four-point quadrant method of this study is considered to have clinical relevance in that it is a detailed and accurate tool for
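For reference, the intra-class correlation coefficient used above can be computed from a subjects-by-raters table. The sketch below implements the simple one-way random-effects form ICC(1,1); the paper's exact ICC model is not stated here, so treat this as illustrative:

```python
def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for a table of ratings
    (rows = subjects, columns = raters or occasions).  A simplified
    sketch; studies often use two-way ICC forms instead."""
    n = len(ratings)           # number of subjects
    k = len(ratings[0])        # number of raters/occasions
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    # between-subject and within-subject mean squares
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - m) ** 2 for row, m in zip(ratings, row_means)
              for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```

Perfect agreement between raters gives an ICC of 1, and values above roughly 0.81 correspond to the "almost perfect" band quoted in the abstract.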

  20. Research on Horizontal Accuracy Method of High Spatial Resolution Remotely Sensed Orthophoto Image

    Science.gov (United States)

    Xu, Y. M.; Zhang, J. X.; Yu, F.; Dong, S.

    2018-04-01

    At present, in the inspection and acceptance of high spatial resolution remotely sensed orthophoto images, horizontal accuracy detection tests and evaluates the accuracy of the images, mostly based on a set of testing points with the same accuracy and reliability. However, it is difficult to obtain such a set of testing points in areas where field measurement is difficult and high-accuracy reference data are scarce, so it is difficult to test and evaluate the horizontal accuracy of the orthophoto image there. The uncertainty of the horizontal accuracy has become a bottleneck for the application of satellite-borne high-resolution remote sensing imagery and the expansion of its scope of service. Therefore, this paper proposes a new method to test the horizontal accuracy of orthophoto images. This method uses testing points with different accuracy and reliability, derived from high-accuracy reference data and field measurements. The new method solves the horizontal accuracy detection of the orthophoto image in difficult areas and provides the basis for delivering reliable orthophoto images to users.

  1. Reliability studies of high operating temperature MCT photoconductor detectors

    Science.gov (United States)

    Wang, Wei; Xu, Jintong; Zhang, Yan; Li, Xiangyang

    2010-10-01

    This paper concerns HgCdTe (MCT) infrared photoconductor detectors with high operating temperature. Near-room-temperature operation of detectors has the advantages of light weight, lower cost and convenient usage. Their performances are modest and they suffer from reliability problems: these detectors face stability issues in the package, the chip bonding area and the passivation layers. It is important to evaluate and improve the reliability of such detectors. Defective detectors were studied with SEM (scanning electron microscopy) and optical microscopy. Statistically significant differences were observed between the influence of operating temperature and the influence of humidity. It was also found that humidity has a statistically significant influence upon the stability of the chip bonding and passivation layers, and that the amount of humidity is not strongly correlated with the damage on the surface. Considering the commonly found failure modes in detectors, special test structures were designed to improve the reliability of detectors. An accelerated life test was also implemented to estimate the lifetime of the high operating temperature MCT photoconductor detectors.

  2. A comparative study on the HW reliability assessment methods for digital I and C equipment

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Hoan Sung; Sung, T. Y.; Eom, H. S.; Park, J. K.; Kang, H. G.; Lee, G. Y. [Korea Atomic Energy Research Institute, Taejeon (Korea); Kim, M. C. [Korea Advanced Institute of Science and Technology, Taejeon (Korea); Jun, S. T. [KHNP, Taejeon (Korea)

    2002-03-01

    It is necessary to predict or evaluate the reliability of electronic equipment for the probabilistic safety analysis of digital instrumentation and control (I and C) equipment. But most databases for reliability prediction have no data for up-to-date equipment, and the failure modes are not classified. The prediction results for a specific component show different values according to the methods and databases used; for boards and systems, each method likewise shows different values. This study predicts the reliability of the PDC system for Wolsong NPP 1 as digital I and C equipment. Various reliability prediction methods and failure databases are used in calculating the reliability, to compare the sensitivity and accuracy of each model and database. Many considerations for the reliability assessment of digital systems are derived from the results of this study. 14 refs., 19 figs., 15 tabs. (Author)
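The handbook-style prediction such studies compare can be sketched as a parts-count calculation: the board failure rate is the sum of component failure rates taken from the chosen database, and a constant-failure-rate (exponential) model gives mission reliability. The rates below are placeholders, not values from any database:

```python
import math

def parts_count_mtbf(parts):
    """Parts-count reliability prediction in the spirit of handbook
    methods (e.g. MIL-HDBK-217-style): board failure rate is the sum
    of component failure rates.  Rates would come from the chosen
    database; here they are placeholders."""
    lam = sum(qty * rate for qty, rate in parts)   # failures per hour
    return lam, 1.0 / lam                          # (rate, MTBF in hours)

def reliability(lam, hours):
    """Exponential (constant-failure-rate) reliability at t = hours."""
    return math.exp(-lam * hours)

# hypothetical board: (quantity, failure rate in failures/hour)
board = [(12, 2e-8), (4, 1.5e-7), (1, 5e-7)]
lam, mtbf = parts_count_mtbf(board)
```

Because the predicted rate is just a sum of database entries, swapping databases (or methods with different base rates) shifts every downstream number, which is exactly the sensitivity the study examines.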

  3. Model correction factor method for reliability problems involving integrals of non-Gaussian random fields

    DEFF Research Database (Denmark)

    Franchin, P.; Ditlevsen, Ove Dalager; Kiureghian, Armen Der

    2002-01-01

    The model correction factor method (MCFM) is used in conjunction with the first-order reliability method (FORM) to solve structural reliability problems involving integrals of non-Gaussian random fields. The approach replaces the limit-state function with an idealized one, in which the integrals ...
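The FORM step that MCFM builds on can be sketched with the standard Hasofer-Lind / Rackwitz-Fiessler iteration in standard normal space. This is a generic sketch of FORM only, not the model correction factor method itself:

```python
import math

def form_beta(g, n, tol=1e-8, max_iter=100):
    """Hasofer-Lind / Rackwitz-Fiessler iteration for the FORM
    reliability index of the limit state g(u) <= 0, with u already
    transformed to standard normal space."""
    u = [0.0] * n
    for _ in range(max_iter):
        # numerical gradient of g at u
        h = 1e-6
        g0 = g(u)
        grad = []
        for i in range(n):
            up = list(u)
            up[i] += h
            grad.append((g(up) - g0) / h)
        norm2 = sum(c * c for c in grad)
        # HL-RF update: project onto the linearized limit state
        factor = (sum(c * ui for c, ui in zip(grad, u)) - g0) / norm2
        u_new = [factor * c for c in grad]
        if all(abs(a - b) < tol for a, b in zip(u, u_new)):
            u = u_new
            break
        u = u_new
    return math.sqrt(sum(ui * ui for ui in u))

# linear limit state g(u) = 3 - u1 - u2: exact beta = 3 / sqrt(2)
beta = form_beta(lambda u: 3.0 - u[0] - u[1], 2)
```

MCFM would wrap this iteration around an idealized limit-state function and correct it toward the true one; the FORM kernel itself is unchanged.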

  4. A Reliability Assessment Method for the VHTR Safety Systems

    International Nuclear Information System (INIS)

    Lee, Hyung Sok; Jae, Moo Sung; Kim, Yong Wan

    2011-01-01

    The passive safety system of the very high temperature reactor, which has attracted worldwide attention in recent decades, is a safety system introduced to improve the safety of next-generation nuclear power plant designs. Passive system functionality does not rely on an external source of energy, but on an intelligent use of natural phenomena, such as gravity, conduction and radiation, which are always present. Because of these features, it is difficult to evaluate passive safety with the existing risk analysis methodology, which considers active system failures. Therefore a new reliability methodology has to be considered. In this study, a preliminary evaluation and conceptualization are attempted by applying the concept of load and capacity from the reliability physics model, designing a new passive system analysis methodology, and carrying out a trial application to a paper plant.

  5. A dynamic discretization method for reliability inference in Dynamic Bayesian Networks

    International Nuclear Information System (INIS)

    Zhu, Jiandao; Collette, Matthew

    2015-01-01

    The material and modeling parameters that drive structural reliability analysis for marine structures are subject to a significant uncertainty. This is especially true when time-dependent degradation mechanisms such as structural fatigue cracking are considered. Through inspection and monitoring, information such as crack location and size can be obtained to improve these parameters and the corresponding reliability estimates. Dynamic Bayesian Networks (DBNs) are a powerful and flexible tool to model dynamic system behavior and update reliability and uncertainty analysis with life cycle data for problems such as fatigue cracking. However, a central challenge in using DBNs is the need to discretize certain types of continuous random variables to perform network inference while still accurately tracking low-probability failure events. Most existing discretization methods focus on getting the overall shape of the distribution correct, with less emphasis on the tail region. Therefore, a novel scheme is presented specifically to estimate the likelihood of low-probability failure events. The scheme is an iterative algorithm which dynamically partitions the discretization intervals at each iteration. Through applications to two stochastic crack-growth example problems, the algorithm is shown to be robust and accurate. Comparisons are presented between the proposed approach and existing methods for the discretization problem. - Highlights: • A dynamic discretization method is developed for low-probability events in DBNs. • The method is compared to existing approaches on two crack growth problems. • The method is shown to improve on existing methods for low-probability events

  6. Performance and reliability of TPE-2 device with pulsed high power source

    International Nuclear Information System (INIS)

    Sato, Y.; Takeda, S.; Kiyama, S.

    1987-01-01

    The performance and the reliability of the TPE-2 device with pulsed high power sources are described. To obtain a stable high-beta plasma, the reproducibility and the reliability of the pulsed power sources must be maintained. A new power crowbar system with high efficiency and switches with low jitter time are adopted in the bank system. A monitor system which continuously watches the operational states of the switches has also been developed and applied to the fast-rising capacitor banks of the TPE-2 device. Reliable operation of the bank has been realized, based on the data of the switch monitor system

  7. High reliability low jitter 80 kV pulse generator

    Directory of Open Access Journals (Sweden)

    M. E. Savage

    2009-08-01

    Full Text Available Switching can be considered to be the essence of pulsed power. Time-accurate switch/trigger systems with low inductance are useful in many applications. This article describes a unique switch geometry coupled with a low-inductance capacitive energy store. The system provides a fast-rising high voltage pulse into a low impedance load. It can be challenging to generate high voltage (more than 50 kilovolts) into impedances less than 10 Ω from a low voltage control signal with a fast rise time and high temporal accuracy. The required power amplification is large, and is usually accomplished with multiple stages. The multiple stages can adversely affect the temporal accuracy and the reliability of the system. In the present application, a highly reliable and low jitter trigger generator was required for the Z pulsed-power facility [M. E. Savage, L. F. Bennett, D. E. Bliss, W. T. Clark, R. S. Coats, J. M. Elizondo, K. R. LeChien, H. C. Harjes, J. M. Lehr, J. E. Maenchen, D. H. McDaniel, M. F. Pasik, T. D. Pointon, A. C. Owen, D. B. Seidel, D. L. Smith, B. S. Stoltzfus, K. W. Struve, W. A. Stygar, L. K. Warne, and J. R. Woodworth, 2007 IEEE Pulsed Power Conference, Albuquerque, NM (IEEE, Piscataway, NJ, 2007), p. 979]. The large investment in each Z experiment demands low prefire probability and low jitter simultaneously. The system described here is based on a 100 kV DC-charged high-pressure spark gap, triggered with an ultraviolet laser. The system uses a single optical path for simultaneously triggering two parallel switches, allowing lower inductance and electrode erosion with a simple optical system. Performance of the system includes 6 ns output rise time into 5.6 Ω, 550 ps one-sigma jitter measured from the 5 V trigger to the high voltage output, and misfire probability less than 10^{-4}. The design of the system and some key measurements will be shown in the paper. We will discuss the

  8. Reliability and discriminatory power of methods for dental plaque quantification

    Directory of Open Access Journals (Sweden)

    Daniela Prócida Raggio

    2010-04-01

    Full Text Available OBJECTIVE: This in situ study evaluated the discriminatory power and reliability of methods of dental plaque quantification and the relationship between visual indices (VI and fluorescence camera (FC to detect plaque. MATERIAL AND METHODS: Six volunteers used palatal appliances with six bovine enamel blocks presenting different stages of plaque accumulation. The presence of plaque with and without disclosing was assessed using VI. Images were obtained with FC and digital camera in both conditions. The area covered by plaque was assessed. Examinations were done by two independent examiners. Data were analyzed by Kruskal-Wallis and Kappa tests to compare different conditions of samples and to assess the inter-examiner reproducibility. RESULTS: Some methods presented adequate reproducibility. The Turesky index and the assessment of area covered by disclosed plaque in the FC images presented the highest discriminatory powers. CONCLUSION: The Turesky index and images with FC with disclosing present good reliability and discriminatory power in quantifying dental plaque.

  9. Application of reliability analysis methods to the comparison of two safety circuits

    International Nuclear Information System (INIS)

    Signoret, J.-P.

    1975-01-01

    Two circuits of different design, intended to perform the ''Low Pressure Safety Injection'' function in PWR reactors, are analyzed using reliability methods. The reliability analysis of these circuits allows the fault trees to be established and the failure probabilities derived. The dependence of these results on testing and maintenance is emphasized, as well as the critical paths. The large number of results obtained may allow a well-informed choice, taking account of the reliability required for this type of circuit [fr

  10. Highly reliable TOFD UT Technique

    International Nuclear Information System (INIS)

    Acharya, G.D.; Trivedi, S.A.R.; Pai, K.B.

    2003-01-01

    The high performance of the time of flight diffraction (TOFD) technique with regard to the detection of weld defects such as cracks, slag and lack of fusion has led to rapidly increasing acceptance of the technique as a pre-service inspection tool. Since the early 1990s TOFD has been applied to several projects, where it replaced the commonly used radiographic testing. The use of TOFD led to major time savings during new-build and replacement projects. At the same time the TOFD technique was used as a baseline inspection, which enables future monitoring of critical welds and also provides documented evidence for lifetime assessment. The TOFD technique has the ability to detect and simultaneously size flaws of nearly any orientation within the weld and heat-affected zone. TOFD is recognized as a reliable, proven technique for detection and sizing of defects and has proven to be a time saver, resulting in shorter shutdown periods and construction project times. Thus even in cases where the inspection price of TOFD per weld is higher, in the end it will result in significantly lower overall costs and improved quality. This paper deals with reliability, economy, acceptance criteria and field experience. It also covers a comparative study between the radiographic technique and TOFD. (Author)
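The sizing capability mentioned above rests on a simple time-of-flight geometry: the depth of a diffracting flaw tip follows from the transit time and the probe-centre separation (PCS). The sketch below omits probe-delay, couplant and lateral-offset corrections and assumes a typical longitudinal wave velocity for steel:

```python
import math

def tofd_depth(transit_time_us, pcs_mm, velocity_mm_per_us=5.9):
    """Depth of a diffracting flaw tip from its TOFD transit time,
    assuming the standard geometry (flaw midway between the probes).
    5.9 mm/us is a typical longitudinal velocity in steel; probe-delay
    and couplant terms are omitted for simplicity."""
    s = pcs_mm / 2.0                        # half probe-centre separation
    path = velocity_mm_per_us * transit_time_us / 2.0
    return math.sqrt(path ** 2 - s ** 2)    # depth below the surface

def tofd_time(depth_mm, pcs_mm, velocity_mm_per_us=5.9):
    """Inverse relation: transit time for a tip at a given depth."""
    s = pcs_mm / 2.0
    return 2.0 * math.sqrt(s ** 2 + depth_mm ** 2) / velocity_mm_per_us
```

Timing both the upper and lower tip signals of a flaw and converting each to a depth this way gives the through-wall extent, which is the simultaneous detect-and-size capability the abstract describes.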

  11. The Berg Balance Scale has high intra- and inter-rater reliability but absolute reliability varies across the scale: a systematic review.

    Science.gov (United States)

    Downs, Stephen; Marquez, Jodie; Chiarelli, Pauline

    2013-06-01

    What is the intra-rater and inter-rater relative reliability of the Berg Balance Scale? What is the absolute reliability of the Berg Balance Scale? Does the absolute reliability of the Berg Balance Scale vary across the scale? Systematic review with meta-analysis of reliability studies. Any clinical population that has undergone assessment with the Berg Balance Scale. Relative intra-rater reliability, relative inter-rater reliability, and absolute reliability. Eleven studies involving 668 participants were included in the review. The relative intra-rater reliability of the Berg Balance Scale was high, with a pooled estimate of 0.98 (95% CI 0.97 to 0.99). Relative inter-rater reliability was also high, with a pooled estimate of 0.97 (95% CI 0.96 to 0.98). A ceiling effect of the Berg Balance Scale was evident for some participants. In the analysis of absolute reliability, all of the relevant studies had an average score of 20 or above on the 0 to 56 point Berg Balance Scale. The absolute reliability across this part of the scale, as measured by the minimal detectable change with 95% confidence, varied between 2.8 points and 6.6 points. The Berg Balance Scale has a higher absolute reliability when close to 56 points due to the ceiling effect. We identified no data that estimated the absolute reliability of the Berg Balance Scale among participants with a mean score below 20 out of 56. The Berg Balance Scale has acceptable reliability, although it might not detect modest, clinically important changes in balance in individual subjects. The review was only able to comment on the absolute reliability of the Berg Balance Scale among people with moderately poor to normal balance. Copyright © 2013 Australian Physiotherapy Association. All rights reserved.
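The minimal detectable change figures quoted above come from the standard error of measurement, which can be computed directly from a score SD and a reliability coefficient. In the sketch below, 0.98 is the review's pooled intra-rater ICC, while the SD of 5 points is an assumed value for illustration only:

```python
import math

def sem(sd, icc):
    """Standard error of measurement from the score SD and a
    reliability coefficient (e.g. an ICC)."""
    return sd * math.sqrt(1.0 - icc)

def mdc95(sd, icc):
    """Minimal detectable change with 95% confidence: the smallest
    change on retest that exceeds measurement noise."""
    return 1.96 * math.sqrt(2.0) * sem(sd, icc)

# illustrative: assumed SD = 5 points, pooled intra-rater ICC = 0.98
change = mdc95(5.0, 0.98)
```

Because SEM depends on the sample's SD as well as the ICC, different study populations yield different MDC values even at the same reliability, which is one reason the review reports a range (2.8 to 6.6 points) rather than a single number.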

  12. Dependent systems reliability estimation by structural reliability approach

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2014-01-01

    Estimation of system reliability by classical system reliability methods generally assumes that the components are statistically independent, thus limiting its applicability in many practical situations. A method is proposed for estimation of the system reliability with dependent components, where...... the leading failure mechanism(s) is described by physics of failure model(s). The proposed method is based on structural reliability techniques and accounts for both statistical and failure effect correlations. It is assumed that failure of any component is due to increasing damage (fatigue phenomena...... identification. Application of the proposed method can be found in many real world systems....

  13. Seeking high reliability in primary care: Leadership, tools, and organization.

    Science.gov (United States)

    Weaver, Robert R

    2015-01-01

    Leaders in health care increasingly recognize that improving health care quality and safety requires developing an organizational culture that fosters high reliability and continuous process improvement. For various reasons, a reliability-seeking culture is lacking in most health care settings. Developing a reliability-seeking culture requires leaders' sustained commitment to reliability principles using key mechanisms to embed those principles widely in the organization. The aim of this study was to examine how key mechanisms used by a primary care practice (PCP) might foster a reliability-seeking, system-oriented organizational culture. A case study approach was used to investigate the PCP's reliability culture. The study examined four cultural artifacts used to embed reliability-seeking principles across the organization: leadership statements, decision support tools, and two organizational processes. To decipher their effects on reliability, the study relied on observations of work patterns and the tools' use, interactions during morning huddles and process improvement meetings, interviews with clinical and office staff, and a "collective mindfulness" questionnaire. The five reliability principles framed the data analysis. Leadership statements articulated principles that oriented the PCP toward a reliability-seeking culture of care. Reliability principles became embedded in the everyday discourse and actions through the use of "problem knowledge coupler" decision support tools and daily "huddles." Practitioners and staff were encouraged to report unexpected events or close calls that arose and which often initiated a formal "process change" used to adjust routines and prevent adverse events from recurring. Activities that foster reliable patient care became part of the taken-for-granted routine at the PCP. The analysis illustrates the role leadership, tools, and organizational processes play in developing and embedding a reliability-seeking culture across an

  14. A fast approximation method for reliability analysis of cold-standby systems

    International Nuclear Information System (INIS)

    Wang, Chaonan; Xing, Liudong; Amari, Suprasad V.

    2012-01-01

    Analyzing reliability of large cold-standby systems has been a complicated and time-consuming task, especially for systems with components having non-exponential time-to-failure distributions. In this paper, an approximation model, which is based on the central limit theorem, is presented for the reliability analysis of binary cold-standby systems. The proposed model can estimate the reliability of large cold-standby systems with binary-state components having arbitrary time-to-failure distributions in an efficient and easy way. The accuracy and efficiency of the proposed method are illustrated using several different types of distributions for both 1-out-of-n and k-out-of-n cold-standby systems.
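The central-limit-theorem idea can be sketched for a 1-out-of-n cold-standby system with perfect switching: the system life is the sum of the n component lives, so only each component's mean and standard deviation are needed, whatever the underlying distribution. The parameters and the normal component model in the Monte Carlo check below are illustrative assumptions:

```python
import math
import random

def clt_reliability(t, n, mean, sd):
    """CLT approximation of 1-out-of-n cold-standby reliability:
    system life is the sum of n component lives (perfect switching),
    so R(t) ~= 1 - Phi((t - n*mean) / (sd*sqrt(n)))."""
    z = (t - n * mean) / (sd * math.sqrt(n))
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def mc_reliability(t, n, mean, sd, runs=50_000, seed=7):
    """Monte Carlo check using (truncated) normal component lives;
    the CLT formula itself needs only the mean and sd."""
    rng = random.Random(runs and seed)
    ok = sum(sum(max(0.0, rng.gauss(mean, sd)) for _ in range(n)) > t
             for _ in range(runs))
    return ok / runs
```

For moderately large n the closed-form estimate tracks simulation closely at a tiny fraction of the cost, which is the efficiency argument made in the abstract; k-out-of-n variants follow by treating the k parallel sequences separately.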

  15. Reliability analysis of reactor systems by applying probability method; Analiza pouzdanosti reaktorskih sistema primenom metoda verovatnoce

    Energy Technology Data Exchange (ETDEWEB)

    Milivojevic, S [Institute of Nuclear Sciences Boris Kidric, Vinca, Beograd (Serbia and Montenegro)

    1974-12-15

    The probability method chosen for analysing reactor system reliability is considered realistic since it is based on verified experimental data; in fact, it is a statistical method. The probability method developed takes into account the probability distribution of permitted levels of relevant parameters and their particular influence on the reliability of the system as a whole. The proposed method is rather general, and was used for the problem of thermal safety analysis of a reactor system. This analysis makes it possible to study the basic properties of the system under different operating conditions; expressed in the form of probabilities, the results show the reliability of the system as a whole as well as the reliability of each component.

  16. Calculation of the reliability of large complex systems by the relevant path method

    International Nuclear Information System (INIS)

    Richter, G.

    1975-03-01

    In this paper, analytical methods are presented and tested with which the probabilistic reliability data of technical systems can be determined for given fault trees and block diagrams and known reliability data of the components. (orig./AK) [de

  17. Improving patient safety: patient-focused, high-reliability team training.

    Science.gov (United States)

    McKeon, Leslie M; Cunningham, Patricia D; Oswaks, Jill S Detty

    2009-01-01

    Healthcare systems are recognizing "human factor" flaws that result in adverse outcomes. Nurses work around system failures, although increasing healthcare complexity makes this harder to do without risk of error. Aviation and military organizations achieve ultrasafe outcomes through high-reliability practice. We describe how reliability principles were used to teach nurses to improve patient safety at the front line of care. Outcomes include safety-oriented, teamwork communication competency; reflections on safety culture and clinical leadership are discussed.

  18. Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method

    Science.gov (United States)

    Zhang, Xiangnan

    2018-03-01

    Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions, boundary conditions, etc. Reliability methods measure the structural safety condition and determine the optimal combination of design parameters based on probabilistic theory. Reliability-based design optimization (RBDO), which combines reliability theory and optimization, is the most commonly used approach to minimize the structural cost or other performance measures under uncertain variables. However, it cannot handle various kinds of incomplete information. The Bayesian approach is utilized to incorporate such incomplete information into its uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.
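A toy RBDO problem makes the structure concrete: minimize a cost proportional to a bar's cross-section subject to a reliability constraint on yielding. With a normal load the constraint has the closed form A* = (mu_F + z*sd_F)/sigma_y, which makes the bisection easy to check; all numbers are invented for illustration:

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def rbdo_area(mu_f, sd_f, yield_stress, pf_target, lo=1e-6, hi=1.0):
    """Toy RBDO: smallest bar cross-section A (cost ~ A) such that
    P(F / A > yield_stress) <= pf_target, with the load F normal.
    Bisection on the monotone reliability constraint."""
    def pf(a):
        return 1.0 - phi((a * yield_stress - mu_f) / sd_f)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if pf(mid) > pf_target:
            lo = mid        # constraint violated: need more area
        else:
            hi = mid        # feasible: try a smaller area
    return hi

# F ~ N(10 kN, 1.5 kN), yield 250 MPa, target failure probability 1e-3
area = rbdo_area(10e3, 1.5e3, 250e6, 1e-3)
```

Real RBDO replaces the one-dimensional bisection with a nonlinear optimizer and the closed-form constraint with a FORM or sampling evaluation, but the cost-versus-reliability trade-off is the same.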

  19. A High Reliability Frequency Stabilized Semiconductor Laser Source, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Ultrastable, narrow linewidth, high reliability MOPA sources are needed for high performance LIDARs in NASA for wind speed measurement, surface topography and earth...

  20. A reliable method for the stability analysis of structures ...

    African Journals Online (AJOL)

    The detection of structural configurations with singular tangent stiffness matrix is essential because they can be unstable. The secondary paths, especially in unstable buckling, can play the most important role in the loss of stability and collapse of the structure. A new method for reliable detection and accurate computation of ...

  1. On the design of high-rise buildings with a specified level of reliability

    Science.gov (United States)

    Dolganov, Andrey; Kagan, Pavel

    2018-03-01

    High-rise buildings have a specificity that significantly distinguishes them from traditional multi-storey buildings. Steel structures are advisable in high-rise buildings in earthquake-prone regions, since steel, due to its plasticity, damps the kinetic energy of seismic impacts. These aspects should be taken into account when choosing the structural scheme of a high-rise building and designing its load-bearing structures. Currently, regulatory documents do not quantify the reliability of structures, although the problem of assigning an optimal level of reliability has existed for a long time. The article shows the possibility of designing metal structures of high-rise buildings with specified reliability. It is proposed to set the reliability value at 0.99865 (3σ) for structures of buildings of a normal level of responsibility in calculations for the first group of limit states. For increased (high-rise construction) and reduced levels of responsibility, load-bearing capacity reliabilities of 0.99997 (4σ) and 0.97725 (2σ), respectively, are proposed. The utilization coefficients of the cross-section of a metal beam for the different reliability levels are given.
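    The σ-based reliability targets quoted above are simply values of the standard normal cumulative distribution function; a minimal check in Python confirming that 2σ, 3σ and 4σ correspond to 0.97725, 0.99865 and 0.99997:

```python
from math import erf, sqrt

def normal_reliability(k: float) -> float:
    """Reliability (probability of non-failure) for a k-sigma design level,
    i.e. the standard normal CDF evaluated at k."""
    return 0.5 * (1.0 + erf(k / sqrt(2.0)))

for level, k in [("reduced", 2), ("normal", 3), ("increased (high-rise)", 4)]:
    print(f"{level}: {normal_reliability(k):.5f}")
```
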

  2. Reliability analysis based on a novel density estimation method for structures with correlations

    Directory of Open Access Journals (Sweden)

    Baoyu LI

    2017-06-01

    Full Text Available Estimating the Probability Density Function (PDF) of the performance function is a direct way to perform structural reliability analysis, as the failure probability can then be easily obtained by integration over the failure domain. However, efficiently estimating the PDF is still an urgent problem to be solved. The existing fractional-moment-based maximum entropy method provides a very advanced approach to PDF estimation, but its main shortcoming is that it limits the application of the reliability analysis method to structures with independent inputs. In fact, structures with correlated inputs are common in engineering. This paper therefore improves the maximum entropy method and applies the Unscented Transformation (UT) technique, a very efficient moment estimation method for models with any inputs, to compute the fractional moments of the performance function for structures with correlations. The proposed method can precisely estimate the probability distributions of performance functions for structures with correlations. Besides, the number of function evaluations of the proposed method in reliability analysis, which is determined by the UT, is small. Several examples are employed to illustrate the accuracy and advantages of the proposed method.
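    The unscented transformation referred to above propagates a small set of deterministically chosen sigma points through the performance function to estimate its moments; a minimal sketch assuming correlated Gaussian inputs and a generic performance function (the paper's exact UT variant, weighting and moment orders are not specified here):

```python
import numpy as np
from typing import Callable, Sequence

def unscented_moments(g: Callable, mu: np.ndarray, cov: np.ndarray,
                      alphas: Sequence[float], kappa: float = 0.0) -> dict:
    """Estimate (fractional) moments E[g(X)**alpha] for correlated Gaussian
    inputs X ~ N(mu, cov), using the 2n+1 sigma points of the unscented
    transform. Illustrative sketch only; g is assumed positive-valued so
    fractional powers are well defined."""
    n = len(mu)
    L = np.linalg.cholesky((n + kappa) * cov)   # scaled square root of cov
    pts = [mu] + [mu + L[:, i] for i in range(n)] + [mu - L[:, i] for i in range(n)]
    weights = np.array([kappa / (n + kappa)] + [1.0 / (2.0 * (n + kappa))] * (2 * n))
    vals = np.array([g(p) for p in pts])
    return {a: float(np.sum(weights * vals ** a)) for a in alphas}

# Demo: linear response of two correlated Gaussian inputs (UT is exact here)
m = unscented_moments(lambda x: x[0] + x[1] + 5.0,
                      np.array([0.0, 0.0]),
                      np.array([[1.0, 0.5], [0.5, 1.0]]),
                      alphas=[0.5, 1.0, 2.0])
```

    With only 5 function evaluations the first two moments match the analytical values (mean 5, second moment 28), which is the efficiency argument the record makes.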

  3. An overview of reliability methods in mechanical and structural design

    Science.gov (United States)

    Wirsching, P. H.; Ortiz, K.; Lee, S. J.

    1987-01-01

    An evaluation is made of modern methods of fast probability integration and Monte Carlo treatment for the assessment of structural systems' and components' reliability. Fast probability integration methods are noted to be more efficient than Monte Carlo ones. This is judged to be an important consideration when several point probability estimates must be made in order to construct a distribution function. An example illustrating the relative efficiency of the various methods is included.
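    The cost of the Monte Carlo treatment contrasted above with fast probability integration is easy to see in a toy case; a minimal sketch of a crude Monte Carlo failure probability estimate for a hypothetical linear limit state g = R − S (not an example from the paper):

```python
import random

def mc_failure_probability(n: int = 100_000, seed: int = 42) -> float:
    """Crude Monte Carlo estimate of P[g <= 0] for the illustrative limit
    state g = R - S with resistance R ~ N(5, 1) and load S ~ N(2, 1).
    Exact answer: Phi(-3/sqrt(2)) ~= 0.0169."""
    rng = random.Random(seed)
    failures = sum(
        1 for _ in range(n)
        if rng.gauss(5, 1) - rng.gauss(2, 1) <= 0
    )
    return failures / n
```

    Resolving a small failure probability this way needs many samples (the standard error scales as 1/√n), which is why the record judges fast probability integration more efficient when several point probabilities are needed.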

  4. Reliability Verification of DBE Environment Simulation Test Facility by using Statistics Method

    International Nuclear Information System (INIS)

    Jang, Kyung Nam; Kim, Jong Soeg; Jeong, Sun Chul; Kyung Heum

    2011-01-01

    In nuclear power plants, all safety-related equipment, including cables, operating under harsh environments should undergo equipment qualification (EQ) according to IEEE Std 323. There are three qualification methods: type testing, operating experience and analysis. In order to environmentally qualify safety-related equipment using the type-testing method, rather than the analysis or operating experience methods, a representative sample of the equipment, including interfaces, should be subjected to a series of tests. Among these tests, the Design Basis Event (DBE) environment simulation test is the most important. The DBE simulation test is performed in a DBE simulation test chamber under the postulated DBE conditions, including a specified high-energy line break (HELB), loss of coolant accident (LOCA), main steam line break (MSLB), etc., after thermal and radiation aging. Because most DBE conditions involve 100% humidity, high-temperature steam should be used to trace the temperature and pressure of the DBE condition. During the DBE simulation test, if high-temperature steam under high pressure is injected into the DBE test chamber, the temperature and pressure in the chamber rapidly increase above the target temperature. The temperature and pressure in the test chamber therefore keep fluctuating around the target values during the DBE simulation test. We should ensure the fairness and accuracy of test results by confirming the performance of the DBE environment simulation test facility. In this paper, a statistical method is used to verify the reliability of the DBE environment simulation test facility

  5. Studies on the reliability of high-field intra-operative MRI in brain glioma resection

    Directory of Open Access Journals (Sweden)

    Zhi-jun SONG

    2011-07-01

    Full Text Available Objective To evaluate the reliability of high-field intra-operative magnetic resonance imaging (iMRI) in detecting residual tumors during glioma resection. Methods One hundred and thirty-one cases of brain glioma (69 males and 62 females, aged from 7 to 79 years with a mean of 39.6 years) hospitalized from Nov. 2009 to Aug. 2010 were involved in the present study. All patients were evaluated using magnetic resonance imaging (MRI) before the operation. The tumors were resected under a conventional navigation microscope, and high-field iMRI was applied to all patients once the operators considered the tumor satisfactorily resected; residual tumor that was difficult to detect under the microscope was resected after being revealed by high-field iMRI. Histopathological examination was performed. Patients without residual tumors received a high-field MRI scan on day 4 or 5 after the operation to evaluate the accuracy of intra-operative high-field iMRI. Results High-quality intra-operative images were obtained using high-field iMRI. Twenty-eight cases were excluded because their residual tumors were not resected, being located too close to functional areas. Combined with the results of intra-operative histopathological examination and post-operative MRI at the early recovery stage, the sensitivity of high-field iMRI in residual tumor diagnosis was 98.0% (49/50), the specificity was 94.3% (50/53), and the accuracy was 96.1% (99/103). Conclusion High-quality intra-operative imaging can be acquired with high-field iMRI, which may be used as a safe and reliable method for detecting residual tumors during glioma resection.

  6. A generic method for assignment of reliability scores applied to solvent accessibility predictions

    DEFF Research Database (Denmark)

    Petersen, Bent; Petersen, Thomas Nordahl; Andersen, Pernille

    2009-01-01

    The performance of the neural networks was evaluated on a commonly used set of sequences known as the CB513 set. An overall Pearson's correlation coefficient of 0.72 was obtained, which is comparable to the performance of the currently best publicly available method, Real-SPINE. Both methods associate a reliability...... comparing the Pearson's correlation coefficient for the upper 20% of predictions sorted according to reliability. For this subset, values of 0.79 and 0.74 are obtained using our and the compared method, respectively. This tendency is true for any selected subset....

  7. Study on reliability analysis based on multilevel flow models and fault tree method

    International Nuclear Information System (INIS)

    Chen Qiang; Yang Ming

    2014-01-01

    Multilevel flow models (MFM) and fault tree method describe the system knowledge in different forms, so the two methods express an equivalent logic of the system reliability under the same boundary conditions and assumptions. Based on this and combined with the characteristics of MFM, a method mapping MFM to fault tree was put forward, thus providing a way to establish fault tree rapidly and realizing qualitative reliability analysis based on MFM. Taking the safety injection system of pressurized water reactor nuclear power plant as an example, its MFM was established and its reliability was analyzed qualitatively. The analysis result shows that the logic of mapping MFM to fault tree is correct. The MFM is easily understood, created and modified. Compared with the traditional fault tree analysis, the workload is greatly reduced and the modeling time is saved. (authors)
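    Qualitative fault tree analysis of the kind described above reduces to deriving minimal cut sets from the AND/OR gate structure, however the tree was obtained; a minimal sketch on a hypothetical two-gate tree (not the safety injection system model from the paper):

```python
from itertools import product

def minimal_cut_sets(node, tree):
    """Minimal cut sets of a fault tree given as {gate: (kind, children)};
    names absent from `tree` are basic events. Qualitative analysis in the
    spirit of a fault tree mapped from an MFM model (this tree is made up)."""
    if node not in tree:                       # basic event: its own cut set
        return [frozenset([node])]
    kind, children = tree[node]
    child_sets = [minimal_cut_sets(c, tree) for c in children]
    if kind == "OR":                           # OR: union of children's cut sets
        sets = [s for cs in child_sets for s in cs]
    else:                                      # AND: cross-product combination
        sets = [frozenset().union(*combo) for combo in product(*child_sets)]
    return [s for s in sets if not any(t < s for t in sets)]  # drop supersets

# Hypothetical tree: top event occurs if both pumps fail OR the valve sticks.
tree = {
    "TOP": ("OR", ["PumpsFail", "ValveStuck"]),
    "PumpsFail": ("AND", ["PumpA", "PumpB"]),
}
print(minimal_cut_sets("TOP", tree))
```
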

  8. Highly-reliable laser diodes and modules for spaceborne applications

    Science.gov (United States)

    Deichsel, E.

    2017-11-01

    Laser applications become more and more interesting for contemporary missions such as earth observation or optical communication in space. One of these applications is light detection and ranging (LIDAR), which holds huge scientific potential for future missions. The Nd:YAG solid-state laser of such a LIDAR system is optically pumped using 808 nm emitting pump sources based on semiconductor laser diodes in quasi-continuous-wave (qcw) operation. Therefore, reliable and efficient laser diodes with increased output powers are an important requirement for a spaceborne LIDAR system. In the past, many tests were performed regarding the performance and lifetime of such laser diodes. There were also studies for spaceborne applications, but a test with long operation times at high powers and statistical relevance is pending. Other applications, such as science packages (e.g. Raman spectroscopy) on planetary rovers, also require reliable high-power light sources. Typically, fiber-coupled laser diode modules are used for such applications. Besides high reliability and lifetime, designs compatible with the harsh environmental conditions must be taken into account. Mechanical loads, such as shock or strong vibration, are expected due to take-off or landing procedures. Many temperature cycles with high change rates and amplitudes must be taken into account due to sun-shadow effects in planetary orbits. Cosmic radiation has a strong impact on optical components and must also be taken into account. Lastly, hermetic sealing must be considered, since vacuum can have disadvantageous effects on optoelectronic components.

  9. Utilizing leadership to achieve high reliability in the delivery of perinatal care

    Directory of Open Access Journals (Sweden)

    Parrotta C

    2012-11-01

    Full Text Available Carmen Parrotta,1 William Riley,1 Les Meredith2 1School of Public Health, University of Minnesota, Minneapolis, MN, 2Premier Insurance Management Services Inc, Charlotte, NC, USA. Abstract: Highly reliable care requires standardization of clinical practices and is a prerequisite for patient safety. However, standardization in complex hospital settings is extremely difficult to attain and health care leaders are challenged to create care delivery processes that ensure patient safety. Moreover, once high reliability is achieved in a hospital unit, it must be maintained to avoid process deterioration. This case study examines an intervention to implement care bundles (a collection of evidence-based practices) in four hospitals to achieve standardized care in perinatal units. The results show different patterns in the rate and magnitude of change within the hospitals to achieve high reliability. The study is part of a larger nationwide study of 16 hospitals to improve perinatal safety. Based on the findings, we discuss the role of leadership in implementing and sustaining high reliability to ensure freedom from unintended injury. Keywords: care bundles, evidence-based practice, standardized care, process improvement

  10. To the problem of reliability of high-voltage accelerators for industrial purposes

    International Nuclear Information System (INIS)

    Al'bertinskij, B.I.; Svin'in, M.P.; Tsepakin, S.G.

    1979-01-01

    Statistical data characterizing the reliability of ELECTRON and AVRORA-2 type accelerators are presented. The mean time to failure of the main accelerator units was used as the reliability index. The analysis of accelerator failures allowed a number of conclusions to be drawn. The high failure rate is connected with inadequate training of the servicing personnel and a natural period of equipment adjustment. The mathematical analysis of the failure rate showed that the main responsibility for the insufficiently high reliability rests with the selenium diodes employed in the high-voltage power supply. Substitution of selenium diodes by silicon ones increases the time between failures. It is shown that accumulation and processing of operational statistical data will permit more accurate prediction of the reliability of produced high-voltage accelerators, make it possible to cope with the problems of planning optimal, in-time preventive inspections and repair, and to select optimal safety factors and test procedures.

  11. Reliability of high power electron accelerators for radiation processing

    Energy Technology Data Exchange (ETDEWEB)

    Zimek, Z. [Department of Radiation Chemistry and Technology, Institute of Nuclear Chemistry and Technology, Warsaw (Poland)

    2011-07-01

    Accelerators applied for radiation processing are installed in industrial facilities, where the accelerator availability coefficient should be at the level of 95% to fulfill requirements according to industry standards. Usually the exploitation of an electron accelerator reveals a number of short and a few long-lasting failures. Some technical shortages can be overcome by practically implementing the experience gained in accelerator technology development by different accelerator manufacturers. The reliability/availability of high-power accelerators for application in the flue gas treatment process must be dramatically improved to meet industrial standards. Support of accelerator technology dedicated to environment protection should be provided by governmental and international institutions to overcome the accelerator reliability/availability problem and the high risk and low direct profit in this particular application. (author)

  12. Reliability of high power electron accelerators for radiation processing

    International Nuclear Information System (INIS)

    Zimek, Z.

    2011-01-01

    Accelerators applied for radiation processing are installed in industrial facilities, where the accelerator availability coefficient should be at the level of 95% to fulfill requirements according to industry standards. Usually the exploitation of an electron accelerator reveals a number of short and a few long-lasting failures. Some technical shortages can be overcome by practically implementing the experience gained in accelerator technology development by different accelerator manufacturers. The reliability/availability of high-power accelerators for application in the flue gas treatment process must be dramatically improved to meet industrial standards. Support of accelerator technology dedicated to environment protection should be provided by governmental and international institutions to overcome the accelerator reliability/availability problem and the high risk and low direct profit in this particular application. (author)

  13. Highly reliable field electron emitters produced from reproducible damage-free carbon nanotube composite pastes with optimal inorganic fillers

    Science.gov (United States)

    Kim, Jae-Woo; Jeong, Jin-Woo; Kang, Jun-Tae; Choi, Sungyoul; Ahn, Seungjoon; Song, Yoon-Ho

    2014-02-01

    Highly reliable field electron emitters were developed using a formulation for reproducible damage-free carbon nanotube (CNT) composite pastes with optimal inorganic fillers and a ball-milling method. We carefully controlled the ball-milling sequence and time to avoid any damage to the CNTs, which incorporated fillers that were fully dispersed as paste constituents. The field electron emitters fabricated by printing the CNT pastes were found to exhibit almost perfect adhesion of the CNT emitters to the cathode, along with good uniformity and reproducibility. A high field enhancement factor of around 10 000 was achieved from the CNT field emitters developed. By selecting nano-sized metal alloys and oxides and using the same formulation sequence, we also developed reliable field emitters that could survive high-temperature post processing. These field emitters had high durability to post vacuum annealing at 950 °C, guaranteeing survival of the brazing process used in the sealing of field emission x-ray tubes. We evaluated the field emitters in a triode configuration in the harsh environment of a tiny vacuum-sealed vessel and observed very reliable operation for 30 h at a high current density of 350 mA cm-2. The CNT pastes and related field emitters that were developed could be usefully applied in reliable field emission devices.

  14. High-reliability 4π-scan leakage-X-ray dosimeter

    Energy Technology Data Exchange (ETDEWEB)

    Kaneko, T; Iida, H; Yoshida, T; Sugimoto, H [Tokyo Shibaura Electric Co. Ltd., Kawasaki, Kanagawa (Japan). Tamagawa Works

    1978-04-01

    A world-wide movement is growing for the protection of living bodies against leakage radiation. In Japan, detailed regulations have been established for the enforcement of the law with regard to this problem. The measurement items provided in the regulations are extremely diversified, greatly affecting the reliability and the economic efficiency of the equipment. Now a new 4π-scan X-ray dosimeter with high reliability has been developed and has proved to bring qualitative improvement of measurement as well as elevation of productivity.

  15. Modeling High-Power Accelerator Reliability: SNS LINAC (SNS-ORNL); MAX LINAC (MYRRHA)

    International Nuclear Information System (INIS)

    Pitigoi, A. E.; Fernandez Ramos, P.

    2013-01-01

    Improving reliability has recently become a very important objective in the field of particle accelerators. The particle accelerators in operation are constantly undergoing modifications, and improvements are implemented using new technologies, more reliable components or redundant schemes (to obtain more reliability, strength, more power, etc.). A reliability model of the SNS (Spallation Neutron Source) LINAC has been developed within the MAX project, and an analysis of the accelerator systems' reliability has been performed using the Risk Spectrum reliability analysis software. The analysis results have been evaluated by comparison with the SNS operational data. Results and conclusions are presented in this paper, oriented towards identifying design weaknesses and providing recommendations for improving the reliability of the MYRRHA linear accelerator. The SNS reliability model developed for the MAX preliminary design phase indicates possible avenues for further investigation that could be needed to improve the reliability of high-power accelerators, in view of the future reliability targets of ADS accelerators.

  16. Validity and Reliability of the Academic Resilience Scale in Turkish High School

    Science.gov (United States)

    Kapikiran, Sahin

    2012-01-01

    The present study aims to determine the validity and reliability of the academic resilience scale in Turkish high schools. The participants of the study include 378 high school students in total (192 female and 186 male). A set of analyses was conducted in order to determine the validity and reliability of the scale. Firstly, both exploratory…

  17. X-real-time executive (X-RTE) an ultra-high reliable real-time executive for safety critical systems

    International Nuclear Information System (INIS)

    Suresh Babu, R.M.

    1995-01-01

    With the growing number of applications of computers in safety-critical systems of nuclear plants, there has been a need to assure high quality and reliability of the software used in these systems. One way to assure software quality is to use qualified software components. Since the safety systems and control systems are real-time systems, there is a need for real-time supervisory software to guarantee the temporal response of the system. This report describes one such software package, called X-Real-Time Executive (or X-RTE), which was developed in the Reactor Control Division, BARC. The report describes all the capabilities and unique features of X-RTE and compares it with a commercially available operating system. The features of X-RTE include pre-emptive scheduling, process synchronization, inter-process communication, multi-processor support, temporal support, a debug facility, high portability, high reliability, high quality, and extensive documentation. Examples have been used very liberally to illustrate the underlying concepts. Besides, the report provides a brief description of the methods used during software development to assure the high quality and reliability of X-RTE. (author). refs., 11 figs., tabs

  18. A dynamic particle filter-support vector regression method for reliability prediction

    International Nuclear Information System (INIS)

    Wei, Zhao; Tao, Tao; ZhuoShu, Ding; Zio, Enrico

    2013-01-01

    Support vector regression (SVR) has been applied to time series prediction and some works have demonstrated the feasibility of its use to forecast system reliability. For accuracy of reliability forecasting, the selection of SVR's parameters is important. The existing research works on SVR's parameters selection divide the example dataset into training and test subsets, and tune the parameters on the training data. However, these fixed parameters can lead to poor prediction capabilities if the data of the test subset differ significantly from those of training. Differently, the novel method proposed in this paper uses particle filtering to estimate the SVR model parameters according to the whole measurement sequence up to the last observation instance. By treating the SVR training model as the observation equation of a particle filter, our method allows updating the SVR model parameters dynamically when a new observation comes. Because of the adaptability of the parameters to dynamic data pattern, the new PF–SVR method has superior prediction performance over that of standard SVR. Four application results show that PF–SVR is more robust than SVR to the decrease of the number of training data and the change of initial SVR parameter values. Also, even if there are trends in the test data different from those in the training data, the method can capture the changes, correct the SVR parameters and obtain good predictions. -- Highlights: •A dynamic PF–SVR method is proposed to predict the system reliability. •The method can adjust the SVR parameters according to the change of data. •The method is robust to the size of training data and initial parameter values. •Some cases based on both artificial and real data are studied. •PF–SVR shows superior prediction performance over standard SVR

  19. Methods to compute reliabilities for genomic predictions of feed intake

    Science.gov (United States)

    For new traits without historical reference data, cross-validation is often the preferred method to validate reliability (REL). Time truncation is less useful because few animals gain substantial REL after the truncation point. Accurate cross-validation requires separating genomic gain from pedigree...

  20. Verification of practicability of quantitative reliability evaluation method (De-BDA) in nuclear power plants

    International Nuclear Information System (INIS)

    Takahashi, Kinshiro; Yukimachi, Takeo.

    1988-01-01

    A variety of methods have been applied to the study of reliability analysis in which human factors are included, in order to enhance the safety and availability of nuclear power plants. De-BDA (Detailed Block Diagram Analysis) is one such method, developed with the objective of creating a more comprehensive and understandable tool for quantitative analysis of reliability associated with plant operations. The practicability of this method has been verified by applying it to reliability analysis of various phases of plant operation, as well as to evaluation of an enhanced man-machine interface in the central control room. (author)

  1. The effect of DLC-coating deposition method on the reliability and mechanical properties of abutment's screws.

    Science.gov (United States)

    Bordin, Dimorvan; Coelho, Paulo G; Bergamo, Edmara T P; Bonfante, Estevam A; Witek, Lukasz; Del Bel Cury, Altair A

    2018-04-10

    To characterize the mechanical properties of different methods of coating DLC (diamond-like carbon) onto dental implant abutment screws, and their effect on the probability of survival (reliability). Seventy-five abutment screws were allocated into three groups according to the coating method: control (no coating); UMS, DLC applied through unbalanced magnetron sputtering; and RFPA, DLC applied through radio-frequency plasma activation (n=25/group). Twelve screws (n=4) were used to determine the hardness and Young's modulus (YM). A 3D finite element model composed of a titanium substrate, a DLC layer and a counterpart was constructed. The deformation (μm) and shear stress (MPa) were calculated. The remaining screws of each group were torqued into external hexagon abutments and subjected to step-stress accelerated life-testing (SSALT) (n=21/group). The probability Weibull curves and reliability (probability of survival) were calculated considering missions of 100, 150 and 200 N at 50,000 and 100,000 cycles. DLC-coated experimental groups evidenced higher hardness than control (p < 0.05), with β > 1 indicating that fatigue contributed to failure. High reliability was depicted at a mission of 100 N. At 200 N a significant decrease in reliability was detected for all groups (ranging from 39% to 66%). No significant difference was observed among groups regardless of mission. Screw fracture was the chief failure mode. DLC coatings have been used to improve titanium's mechanical properties and increase the reliability of dental implant-supported restorations. Copyright © 2018 The Academy of Dental Materials. Published by Elsevier Inc. All rights reserved.
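    Reliability at a given mission under a Weibull model, as used above, follows the standard two-parameter survival function R(t) = exp(−(t/η)^β); a minimal sketch with hypothetical shape and scale parameters (not the paper's fitted values):

```python
from math import exp

def weibull_reliability(cycles: float, beta: float, eta: float) -> float:
    """Probability of survival to `cycles` under a two-parameter Weibull
    model, R(t) = exp(-(t/eta)**beta). beta and eta below are hypothetical
    illustration values, not fitted SSALT parameters from the paper."""
    return exp(-((cycles / eta) ** beta))

# beta > 1 corresponds to wear-out (fatigue-driven) failure, as in the study
print(weibull_reliability(50_000, beta=1.5, eta=400_000))
```

    Reliability decreases monotonically with the mission length, which is why the record reports lower survival probabilities at the harsher 200 N missions.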

  2. Application of system reliability analytical method, GO-FLOW

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Fukuto, Junji; Mitomo, Nobuo; Miyazaki, Keiko; Matsukura, Hiroshi; Kobayashi, Michiyuki

    1999-01-01

    The Ship Research Institute is proceeding with a developmental study on the GO-FLOW method, adding various advanced functionalities to this system reliability analysis method, which occupies a main part of PSA (Probabilistic Safety Assessment). The work aimed to upgrade the functionality of the GO-FLOW method, to develop an analytical function integrating dynamic behavior analysis, physical behavior and probable subject transfer, and to prepare a function for picking out the main accident sequences. In fiscal year 1997, an analytical function was developed for the dynamic event-tree analytical system by adding dependency between headings. With the simulation analysis function for accident sequences, it became possible to cover completely the main accident sequences of the MRX improved ship propulsion reactor. Also, a function was prepared so that input data for analysis can easily be set by the analysis operator. (G.K.)

  3. Improving Power Converter Reliability

    DEFF Research Database (Denmark)

    Ghimire, Pramod; de Vega, Angel Ruiz; Beczkowski, Szymon

    2014-01-01

    of a high-power IGBT module during converter operation, which may play a vital role in improving the reliability of the power converters. The measured voltage is used to estimate the module average junction temperature of the high and low-voltage side of a half-bridge IGBT separately in every fundamental......The real-time junction temperature monitoring of a high-power insulated-gate bipolar transistor (IGBT) module is important to increase the overall reliability of power converters for industrial applications. This article proposes a new method to measure the on-state collector–emitter voltage...... is measured in a wind power converter at a low fundamental frequency. To illustrate more, the test method as well as the performance of the measurement circuit are also presented. This measurement is also useful to indicate failure mechanisms such as bond wire lift-off and solder layer degradation...

  4. A method of bias correction for maximal reliability with dichotomous measures.

    Science.gov (United States)

    Penev, Spiridon; Raykov, Tenko

    2010-02-01

    This paper is concerned with the reliability of weighted combinations of a given set of dichotomous measures. Maximal reliability for such measures has been discussed in the past, but the pertinent estimator exhibits a considerable bias and mean squared error for moderate sample sizes. We examine this bias, propose a procedure for bias correction, and develop a more accurate asymptotic confidence interval for the resulting estimator. In most empirically relevant cases, the bias correction and mean squared error correction can be performed simultaneously. We propose an approximate (asymptotic) confidence interval for the maximal reliability coefficient, discuss the implementation of this estimator, and investigate the mean squared error of the associated asymptotic approximation. We illustrate the proposed methods using a numerical example.

  5. Reliability Estimation with Uncertainties Consideration for High Power IGBTs in 2.3 MW Wind Turbine Converter System

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Ma, Ke

    2012-01-01

    This paper investigates the lifetime of high power IGBTs (insulated gate bipolar transistors) used in large wind turbine applications. Since the IGBTs are critical components in a wind turbine power converter, it is of great importance to assess their reliability in the design phase of the turbine....... Minimum, maximum and average junction temperature profiles for the grid side IGBTs are estimated at each wind speed input value. The selected failure mechanism is crack propagation in the solder joint under the silicon die. Based on junction temperature profiles and a physics of failure model......, the probabilistic and deterministic damage models are presented with estimated fatigue lives. Reliability levels were assessed by means of the First Order Reliability Method, taking into account uncertainties....
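    The First Order Reliability Method mentioned above reduces, for a linear limit state with independent normal variables (where FORM is exact), to computing a reliability index β and a failure probability P_f = Φ(−β); a minimal sketch with hypothetical numbers, not the paper's damage model:

```python
from math import erf, sqrt

def form_linear(a0, a, mu, sigma):
    """FORM for a linear limit state g = a0 + sum(a_i * X_i) with independent
    normal X_i ~ N(mu_i, sigma_i): reliability index beta = mu_g / sigma_g and
    P_f = Phi(-beta). For linear g with normal inputs this result is exact."""
    mu_g = a0 + sum(ai * mi for ai, mi in zip(a, mu))
    sigma_g = sqrt(sum((ai * si) ** 2 for ai, si in zip(a, sigma)))
    beta = mu_g / sigma_g
    pf = 0.5 * (1.0 - erf(beta / sqrt(2.0)))
    return beta, pf

# Hypothetical damage margin g = capacity - accumulated damage
beta, pf = form_linear(0.0, [1.0, -1.0], [5.0, 2.0], [1.0, 1.0])
```

    For nonlinear limit states, FORM instead searches for the most probable failure point in standard normal space, but the β-to-P_f mapping is the same.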

  6. Interaction Entropy: A New Paradigm for Highly Efficient and Reliable Computation of Protein-Ligand Binding Free Energy.

    Science.gov (United States)

    Duan, Lili; Liu, Xiao; Zhang, John Z H

    2016-05-04

    Efficient and reliable calculation of protein-ligand binding free energy is a grand challenge in computational biology and is of critical importance in drug design and many other molecular recognition problems. The main challenge lies in the calculation of entropic contribution to protein-ligand binding or interaction systems. In this report, we present a new interaction entropy method which is theoretically rigorous, computationally efficient, and numerically reliable for calculating entropic contribution to free energy in protein-ligand binding and other interaction processes. Drastically different from the widely employed but extremely expensive normal mode method for calculating entropy change in protein-ligand binding, the new method calculates the entropic component (interaction entropy or -TΔS) of the binding free energy directly from molecular dynamics simulation without any extra computational cost. Extensive study of over a dozen randomly selected protein-ligand binding systems demonstrated that this interaction entropy method is both computationally efficient and numerically reliable and is vastly superior to the standard normal mode approach. This interaction entropy paradigm introduces a novel and intuitive conceptual understanding of the entropic effect in protein-ligand binding and other general interaction systems as well as a practical method for highly efficient calculation of this effect.
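    The interaction entropy described above is computed directly from fluctuations of the sampled interaction energy, −TΔS = kT ln⟨e^{(E−⟨E⟩)/kT}⟩, averaged over MD frames; a minimal sketch on a synthetic energy series (the values and trajectory length are illustrative, not from the paper):

```python
from math import exp, log

def interaction_entropy(energies, kT: float = 0.5932) -> float:
    """Entropic contribution -T*dS = kT * ln< exp((E - <E>)/kT) > from a
    series of protein-ligand interaction energies (kcal/mol) sampled along
    an MD trajectory; kT ~= 0.5932 kcal/mol at 298 K. By Jensen's inequality
    the result is non-negative: larger fluctuations -> larger entropy penalty."""
    mean_e = sum(energies) / len(energies)
    boltz = sum(exp((e - mean_e) / kT) for e in energies) / len(energies)
    return kT * log(boltz)

# Synthetic fluctuating interaction energies (kcal/mol)
sample = [-40.0, -41.2, -39.5, -40.8, -40.3]
print(interaction_entropy(sample))
```

    Because the average is taken over energies already produced by the simulation, the estimate adds no computational cost beyond the MD run itself, which is the efficiency claim of the record.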

  7. Engineering high reliability, low-jitter Marx generators

    International Nuclear Information System (INIS)

    Schneider, L.X.; Lockwood, G.J.

    1985-01-01

    Multimodule pulsed power accelerators typically require high module reliability and nanosecond regime simultaneity between modules. Energy storage using bipolar Marx generators can meet these requirements. Experience gained from computer simulations and the development of the DEMON II Marx generator has led to a fundamental understanding of the operation of these multistage devices. As a result of this research, significant improvements in erection time jitter and reliability have been realized in multistage, bipolar Marx generators. Erection time jitter has been measured as low as 2.5 nanoseconds for the 3.2MV, 16-stage PBFA I Marx and 3.5 nanoseconds for the 6.0MV, 30-stage PBFA II (DEMON II) Marx, while maintaining exceptionally low prefire rates. Performance data are presented from the DEMON II Marx research program, as well as discussions on the use of computer simulations in designing low-jitter Marx generators

  8. Uncertainty analysis methods for estimation of reliability of passive system of VHTR

    International Nuclear Information System (INIS)

    Han, S.J.

    2012-01-01

    An estimation of reliability of passive system for the probabilistic safety assessment (PSA) of a very high temperature reactor (VHTR) is under development in Korea. The essential approach of this estimation is to measure the uncertainty of the system performance under a specific accident condition. The uncertainty propagation approach according to the simulation of phenomenological models (computer codes) is adopted as a typical method to estimate the uncertainty for this purpose. This presentation introduced the uncertainty propagation and discussed the related issues focusing on the propagation object and its surrogates. To achieve a sufficient level of depth of uncertainty results, the applicability of the propagation should be carefully reviewed. For an example study, Latin-hypercube sampling (LHS) method as a direct propagation was tested for a specific accident sequence of VHTR. The reactor cavity cooling system (RCCS) developed by KAERI was considered for this example study. This is an air-cooled type passive system that has no active components for its operation. The accident sequence is a low pressure conduction cooling (LPCC) accident that is considered as a design basis accident for the safety design of VHTR. This sequence is due to a large failure of the pressure boundary of the reactor system such as a guillotine break of coolant pipe lines. The presentation discussed the obtained insights (benefit and weakness) to apply an estimation of reliability of passive system

  9. Digital Processor Module Reliability Analysis of Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lee, Sang Yong; Jung, Jae Hyun; Kim, Jae Ho; Kim, Sung Hun

    2005-01-01

    Systems used in plants, military equipment, satellites, etc. consist of many electronic parts in their control modules, which require higher reliability than other commercial electronic products. In particular, nuclear power plants, being tied to radiation safety, require high safety and reliability, so most parts are qualified to Military-Standard level. Reliability prediction methods provide a rational basis for system designs and also establish the safety significance of system operations. Various reliability prediction tools have thus been developed in recent decades; among them, the MIL-HDBK-217 method has been widely used as a powerful prediction tool. In this work, reliability analysis for the Digital Processor Module (DPM, the control module of SMART) is performed by the Parts Stress Method based on MIL-HDBK-217F NOTICE 2. Relex 7.6 from Relex Software Corporation is used, because the reliability analysis process requires enormous parts libraries and data for failure-rate calculation

  10. Instrument reliability for high-level nuclear-waste-repository applications

    International Nuclear Information System (INIS)

    Rogue, F.; Binnall, E.P.; Armantrout, G.A.

    1983-01-01

    Reliable instrumentation will be needed to evaluate the characteristics of proposed high-level nuclear-waste-repository sites and to monitor the performance of selected sites during the operational period and into repository closure. A study has been done to assess the reliability of instruments used in Department of Energy (DOE) waste-repository-related experiments and in other similar geological applications. The study included experiences with geotechnical, hydrological, geochemical, environmental, and radiological instrumentation and associated data acquisition equipment. Though this paper includes some findings on the reliability of instruments in each of these categories, the emphasis is on experiences with geotechnical instrumentation in hostile repository-type environments. We review the failure modes, rates, and mechanisms, along with manufacturers' modifications and design changes to enhance and improve instrument performance, and include recommendations on areas where further improvements are needed

  11. Product reliability and the reliability of its emanating operational processes.

    NARCIS (Netherlands)

    Sonnemans, P.J.M.; Geudens, W.H.J.M.

    1999-01-01

    This paper addresses the problem of proper reliability management in business operations today, facing increasing demands on essential business drivers such as time to market, quality and financial profit. In this paper a general method is described of how to achieve product quality in a highly

  12. A New Method of Reliability Evaluation Based on Wavelet Information Entropy for Equipment Condition Identification

    International Nuclear Information System (INIS)

    He, Z J; Zhang, X L; Chen, X F

    2012-01-01

    Aiming at the reliability evaluation of condition identification for mechanical equipment, it is necessary to analyze condition monitoring information. A new method of reliability evaluation based on wavelet information entropy extracted from vibration signals of mechanical equipment is proposed. The method is quite different from traditional reliability evaluation models, which depend on probability-statistics analysis of large amounts of sample data. The vibration signals of mechanical equipment were analyzed by means of the second generation wavelet package (SGWP). The relative energy in each frequency band of the decomposed signal, expressed as a fraction of the whole signal energy, is taken as a probability. A normalized information entropy (IE) is obtained from these relative energies to describe the uncertainty of a system instead of a probability. The reliability degree is then obtained by transforming the normalized wavelet information entropy. A successful application has been achieved in evaluating the assembled-quality reliability for a kind of dismountable disk-drum aero-engine. The reliability degree indicates the assembled quality satisfactorily.
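    The entropy construction in this record is straightforward to sketch: relative band energies serve as probabilities, and the resulting information entropy is normalized by its maximum possible value. The function below is an illustrative reading of that step only; the record does not give the exact transformation from entropy to reliability degree, so it is omitted here.

    ```python
    import numpy as np

    def band_information_entropy(band_energies):
        """Normalized information entropy of sub-band energies from a
        wavelet-packet decomposition (0 = all energy in one band, 1 = uniform)."""
        e = np.asarray(band_energies, dtype=float)
        p = e / e.sum()         # relative band energy used as a "probability"
        p = p[p > 0]            # convention: 0 * log(0) contributes nothing
        h = -np.sum(p * np.log(p))
        return h / np.log(len(e))   # normalize by the maximum entropy log(n)
    ```

    A healthy, well-assembled machine concentrates vibration energy in a few bands (low entropy), while degradation tends to spread energy across bands (entropy approaching 1).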

  13. Reliable and Accurate Release of Micro-Sized Objects with a Gripper that Uses the Capillary-Force Method

    Directory of Open Access Journals (Sweden)

    Suzana Uran

    2017-06-01

    Full Text Available There have been recent developments in grippers that are based on capillary force and condensed water droplets. These are used for manipulating micro-sized objects. Recently, one-finger grippers have been produced that are able to grip reliably using the capillary force. To release objects, either the van der Waals, gravitational or inertial-force method is used. This article presents methods for reliably gripping and releasing micro-objects using the capillary force. The moisture from the surrounding air is condensed into a thin layer of water on the contact surfaces of the objects. From this thin layer of water, a water meniscus between the micro-sized object, the gripper and the releasing surface is created. Consequently, the water meniscus between the object and the releasing surface produces a capillary force high enough to release the micro-sized object from the tip of the one-finger gripper. In this case, either polystyrene or glass beads with diameters between 5 and 60 µm, or irregularly shaped dust particles of similar sizes, were used. 3D structures made up of micro-sized objects could be constructed using this method. The method is reliable for releasing during assembly and also for gripping, when objects are removed from the top of a 3D structure (the so-called "disassembling gripping" process). The release error was less than 0.5 µm.

  14. Planning of operation & maintenance using risk and reliability based methods

    DEFF Research Database (Denmark)

    Florian, Mihai; Sørensen, John Dalsgaard

    2015-01-01

    Operation and maintenance (OM) of offshore wind turbines contributes with a substantial part of the total levelized cost of energy (LCOE). The objective of this paper is to present an application of risk- and reliability-based methods for planning of OM. The theoretical basis is presented...

  15. Approximation of the Monte Carlo Sampling Method for Reliability Analysis of Structures

    Directory of Open Access Journals (Sweden)

    Mahdi Shadab Far

    2016-01-01

    Full Text Available Structural load types, on the one hand, and structural capacity to withstand these loads, on the other hand, are of a probabilistic nature as they cannot be calculated and presented in a fully deterministic way. As such, the past few decades have witnessed the development of numerous probabilistic approaches towards the analysis and design of structures. Among the conventional methods used to assess structural reliability, the Monte Carlo sampling method has proved to be very convenient and efficient. However, it does suffer from certain disadvantages, the biggest one being the requirement of a very large number of samples to handle small probabilities, leading to a high computational cost. In this paper, a simple algorithm was proposed to estimate low failure probabilities using a small number of samples in conjunction with the Monte Carlo method. This revised approach was then presented in a step-by-step flowchart, for the purpose of easy programming and implementation.
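    The crude Monte Carlo estimator this record builds on can be sketched directly; the standard error of the estimate shows exactly why small failure probabilities demand very large sample counts. The limit state below (a normal strength-minus-load margin R - S) is an invented toy example, not one of the paper's cases.

    ```python
    import numpy as np

    def mc_failure_probability(limit_state, sampler, n=100_000, seed=0):
        """Crude Monte Carlo estimate of the failure probability P[g(X) <= 0]."""
        rng = np.random.default_rng(seed)
        g = limit_state(sampler(rng, n))
        pf = np.mean(g <= 0)
        # standard error of the estimator, for judging sample adequacy
        se = np.sqrt(pf * (1.0 - pf) / n)
        return pf, se

    # Toy example: margin g = R - S with R ~ N(6, 1), S ~ N(3, 1),
    # so the exact answer is Phi(-3 / sqrt(2)) ~ 0.017
    def sampler(rng, n):
        return rng.normal(6.0, 1.0, n) - rng.normal(3.0, 1.0, n)

    pf, se = mc_failure_probability(lambda margin: margin, sampler)
    ```

    For a target probability of 1e-6, the same standard-error formula implies on the order of 10^8 samples for a usable estimate, which is the computational cost the paper's revised algorithm aims to avoid.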

  16. A Comparison of Three Methods for the Analysis of Skin Flap Viability: Reliability and Validity.

    Science.gov (United States)

    Tim, Carla Roberta; Martignago, Cintia Cristina Santi; da Silva, Viviane Ribeiro; Dos Santos, Estefany Camila Bonfim; Vieira, Fabiana Nascimento; Parizotto, Nivaldo Antonio; Liebano, Richard Eloin

    2018-05-01

    Objective: Technological advances have provided new alternatives to the analysis of skin flap viability in animal models; however, the interrater validity and reliability of these techniques have yet to be analyzed. The present study aimed to evaluate the interrater validity and reliability of three different methods: weight of paper template (WPT), paper template area (PTA), and photographic analysis. Approach: Sixteen male Wistar rats had their cranially based dorsal skin flap elevated. On the seventh postoperative day, the viable tissue area and the necrotic area of the skin flap were recorded using the paper template method and photo image. The evaluation of the percentage of viable tissue was performed using the three methods, simultaneously and independently, by two raters. The analysis of interrater reliability and validity was performed using the intraclass correlation coefficient, and Bland-Altman plot analysis was used to visualize the presence or absence of systematic bias in the evaluations of data validity. Results: The results showed that interrater reliability for WPT, measurement of PTA, and photographic analysis was 0.995, 0.990, and 0.982, respectively. For data validity, a correlation >0.90 was observed for all comparisons made between the three methods. In addition, Bland-Altman plot analysis showed agreement between the comparisons of the methods, and the presence of systematic bias was not observed. Innovation: Digital methods are an excellent choice for assessing skin flap viability; moreover, they make data use and storage easier. Conclusion: Independently of the method used, the interrater reliability and validity proved to be excellent for the analysis of skin flap viability.

  17. Evaluating the reliability of multi-body mechanisms: A method considering the uncertainties of dynamic performance

    International Nuclear Information System (INIS)

    Wu, Jianing; Yan, Shaoze; Zuo, Ming J.

    2016-01-01

    Mechanism reliability is defined as the ability of a certain mechanism to maintain output accuracy under specified conditions. Mechanism reliability is generally assessed by the classical direct probability method (DPM) derived from the first order second moment (FOSM) method. The DPM relies strongly on the analytical form of the dynamic solution so it is not applicable to multi-body mechanisms that have only numerical solutions. In this paper, an indirect probability model (IPM) is proposed for mechanism reliability evaluation of multi-body mechanisms. IPM combines the dynamic equation, degradation function and Kaplan–Meier estimator to evaluate mechanism reliability comprehensively. Furthermore, to reduce the amount of computation in practical applications, the IPM is simplified into the indirect probability step model (IPSM). A case study of a crank–slider mechanism with clearance is investigated. Results show that relative errors between the theoretical and experimental results of mechanism reliability are less than 5%, demonstrating the effectiveness of the proposed method. - Highlights: • An indirect probability model (IPM) is proposed for mechanism reliability evaluation. • The dynamic equation, degradation function and Kaplan–Meier estimator are used. • Then the simplified form of indirect probability model is proposed. • The experimental results agree well with the predicted results.
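    The Kaplan-Meier estimator named as a component of the IPM can be sketched generically: at each observed failure time, the survival (reliability) curve is stepped down by the fraction of units that fail among those still at risk. Nothing below reproduces the authors' dynamic equation or degradation function; this is only the standard survival-curve step, with our own function name.

    ```python
    import numpy as np

    def kaplan_meier(times, failed):
        """Kaplan-Meier reliability curve. times: observed times;
        failed: 1 where a failure was observed, 0 where censored."""
        times = np.asarray(times, dtype=float)
        failed = np.asarray(failed, dtype=int)
        curve, s = [], 1.0
        for t in np.unique(times):
            d = np.sum((times == t) & (failed == 1))  # failures at time t
            n = np.sum(times >= t)                    # units still at risk
            if d > 0:
                s *= 1.0 - d / n                      # step the curve down
                curve.append((float(t), s))
        return curve
    ```

    Censored observations (failed = 0) still count toward the at-risk set until their censoring time, which is what distinguishes this estimator from a naive empirical failure fraction.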

  18. Precision profiles and analytic reliability of radioimmunologic methods

    International Nuclear Information System (INIS)

    Yaneva, Z.; Popova, Yu.

    1991-01-01

    The aim of the present study is to investigate and compare some methods for creating 'precision profiles' (PP) and to clarify their possibilities for determining the analytical reliability of RIA. Only methods without complicated mathematical calculations have been used. The reproducibility in serums with a concentration of the determinable hormone over the whole range of the calibration curve has been studied. The radioimmunoassay has been performed with a TSH-RIA set (ex East Germany), and comparative evaluations with commercial sets from HOECHST (Germany) and AMERSHAM (GB). Three methods for obtaining the relationship between concentration (IU/l) and reproducibility (C.V., %) are used, and a comparison is made of their corresponding profiles: a preliminary rough profile, the Rodbard PP and the Ekins PP. It is concluded that the creation of a precision profile is obligatory, and the method of its construction does not influence the relationship's course. The PP allows one to determine the concentration range giving stable results, which improves the efficiency of the analytical work. 16 refs., 4 figs

  19. Novel Methods to Enhance Precision and Reliability in Muscle Synergy Identification during Walking

    Science.gov (United States)

    Kim, Yushin; Bulea, Thomas C.; Damiano, Diane L.

    2016-01-01

    Muscle synergies are hypothesized to reflect modular control of muscle groups via descending commands sent through multiple neural pathways. Recently, the number of synergies has been reported as a functionally relevant indicator of motor control complexity in individuals with neurological movement disorders. Yet the number of synergies extracted during a given activity, e.g., gait, varies within and across studies, even for unimpaired individuals. With no standardized methods for precise determination, this variability remains unexplained, making comparisons across studies and cohorts difficult. Here, we utilize k-means clustering and intra-class and between-level correlation coefficients to precisely discriminate reliable from unreliable synergies. Electromyography (EMG) was recorded bilaterally from eight leg muscles during treadmill walking at self-selected speed. Muscle synergies were extracted from 20 consecutive gait cycles using non-negative matrix factorization. We demonstrate that the number of synergies is highly dependent on the threshold used for the variance accounted for by reconstructed EMG. Beyond the use of a threshold, our method utilized a quantitative metric to reliably identify four or five synergies underpinning walking in unimpaired adults, and revealed synergies having poor reproducibility that should not be considered true synergies. We show that robust and unreliable synergies emerge similarly, emphasizing the need for careful analysis in those with pathology. PMID:27695403
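    Synergy extraction by non-negative matrix factorization, as used in this record, can be sketched with the classic Lee-Seung multiplicative updates; the returned VAF (variance accounted for) is the thresholded quantity whose sensitivity the authors criticize. This is a generic sketch, not the authors' pipeline: their k-means clustering and correlation-based reliability steps are omitted, and all names are ours.

    ```python
    import numpy as np

    def extract_synergies(emg, n_syn, n_iter=500, seed=0):
        """Factor EMG ~ W @ H with non-negative W (muscles x synergies)
        and H (synergies x time) via Lee-Seung multiplicative updates."""
        rng = np.random.default_rng(seed)
        V = np.asarray(emg, dtype=float)
        m, t = V.shape
        W = rng.random((m, n_syn)) + 1e-6   # small offset keeps entries positive
        H = rng.random((n_syn, t)) + 1e-6
        for _ in range(n_iter):
            H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
            W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
        # variance accounted for by the reconstruction (uncentered convention)
        vaf = 1.0 - np.sum((V - W @ H) ** 2) / np.sum(V ** 2)
        return W, H, vaf
    ```

    Running this for increasing n_syn and stopping when VAF exceeds a fixed threshold (e.g. 0.90) is the common practice the record argues is unreliable on its own.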

  20. Reliability Study Regarding the Use of Histogram Similarity Methods for Damage Detection

    Directory of Open Access Journals (Sweden)

    Nicoleta Gillich

    2013-01-01

    Full Text Available The paper analyses the reliability of three dissimilarity estimators for comparing histograms, as support for a frequency-based damage detection method able to identify structural changes in beam-like structures. First, a brief presentation of the authors' own damage detection method is made, with a focus on damage localization. The method consists of comparing a histogram derived from measurement results with a large series of histograms, namely the damage location indexes for all locations along the beam, obtained by calculation. We tested dissimilarity estimators such as the Minkowski-form distances, the Kullback-Leibler divergence and the histogram intersection, and found the Minkowski distance to be the method providing the best results. It was tested for numerous locations, using real measurement results as well as results artificially debased by noise, proving its reliability.

  1. Efficiency criteria for high reliability measured system structures

    International Nuclear Information System (INIS)

    Sal'nikov, N.L.

    2012-01-01

    Procedures of structural redundancy are usually used to develop high-reliability measured systems. To estimate the efficiency of such structures, criteria for comparing different systems have been developed. It thus becomes possible to develop a more exact system by inspecting the stochastic characteristics of the redundant system's data units in accordance with the developed criteria [ru]

  2. Designing reliability into high-effectiveness industrial gas turbine regenerators

    International Nuclear Information System (INIS)

    Valentino, S.J.

    1979-01-01

    The paper addresses the measures necessary to achieve a reliable regenerator design that can withstand higher temperatures (1000-1200 F) and many start and stop cycles, the conditions encountered in high-efficiency operation in pipeline applications. The discussion is limited to three major areas: (1) structural analysis of the heat exchanger core, the part of the regenerator that must withstand the higher temperatures and cyclic duty; (2) materials data and material selection; and (3) a comprehensive test program to demonstrate the reliability of the regenerator. This program includes life-cycle tests, pressure containment in fin panels, a core-to-core joint structural test, a bellows pressure containment test, a sliding pad test, a core gas-side passage flow distribution test, and a production test. Today's regenerators must have high cyclic life capability, stainless steel construction, and a long fault-free service life of 120,000 hr

  3. Reliability testing of tendon disease using two different scanning methods in patients with rheumatoid arthritis

    DEFF Research Database (Denmark)

    Bruyn, George A W; Möller, Ingrid; Garrido, Jesus

    2012-01-01

    To assess the intra- and interobserver reliability of musculoskeletal ultrasonography (US) in detecting inflammatory and destructive tendon abnormalities in patients with RA using two different scanning methods....

  4. A Systematic Review of Statistical Methods Used to Test for Reliability of Medical Instruments Measuring Continuous Variables

    Directory of Open Access Journals (Sweden)

    Rafdzah Zaki

    2013-06-01

    Full Text Available Objective(s): Reliability measures precision, or the extent to which test results can be replicated. This is the first systematic review to identify statistical methods used to measure the reliability of equipment measuring continuous variables. This study also aims to highlight inappropriate statistical methods used in reliability analysis and their implications for medical practice. Materials and Methods: In 2010, five electronic databases were searched between 2007 and 2009 to look for reliability studies. A total of 5,795 titles were initially identified. Only 282 titles were potentially related, and finally 42 fitted the inclusion criteria. Results: The intra-class correlation coefficient (ICC) was the most popular method, used in 25 (60%) studies, followed by the comparison of means (8, or 19%). Of the 25 studies using the ICC, only 7 (28%) reported the confidence intervals and the type of ICC used. Most studies (71%) also tested the agreement of instruments. Conclusion: This study finds that the intra-class correlation coefficient is the most popular method used to assess the reliability of medical instruments measuring continuous outcomes. There are also inappropriate applications and interpretations of statistical methods in some studies. It is important for medical researchers to be aware of this issue and to correctly perform analysis in reliability studies.
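    The ICC this review centers on comes in several types, which is exactly why the review faults studies that omit the type used. As an illustration, here is a sketch of the simplest variant, the one-way random-effects single-rater ICC(1,1), computed from its ANOVA mean squares; the review itself does not prescribe this particular type.

    ```python
    import numpy as np

    def icc_1_1(ratings):
        """One-way random-effects, single-rater ICC(1,1).
        ratings: n_subjects x k_raters array of scores."""
        x = np.asarray(ratings, dtype=float)
        n, k = x.shape
        grand = x.mean()
        # between-subjects and within-subjects mean squares
        ms_between = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)
        ms_within = np.sum((x - x.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
        return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
    ```

    Other types (two-way random or mixed, single or average measures) use different mean-square combinations and can give noticeably different values on the same data, which is why reporting the type and its confidence interval matters.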

  5. A Sequential Kriging reliability analysis method with characteristics of adaptive sampling regions and parallelizability

    International Nuclear Information System (INIS)

    Wen, Zhixun; Pei, Haiqing; Liu, Hai; Yue, Zhufeng

    2016-01-01

    The sequential Kriging reliability analysis (SKRA) method has been developed in recent years for nonlinear implicit response functions which are expensive to evaluate. This type of method includes EGRA: the efficient reliability analysis method, and AK-MCS: the active learning reliability method combining Kriging model and Monte Carlo simulation. The purpose of this paper is to improve SKRA by adaptive sampling regions and parallelizability. The adaptive sampling regions strategy is proposed to avoid selecting samples in regions where the probability density is so low that the accuracy of these regions has negligible effects on the results. The size of the sampling regions is adapted according to the failure probability calculated by last iteration. Two parallel strategies are introduced and compared, aimed at selecting multiple sample points at a time. The improvement is verified through several troublesome examples. - Highlights: • The ISKRA method improves the efficiency of SKRA. • Adaptive sampling regions strategy reduces the number of needed samples. • The two parallel strategies reduce the number of needed iterations. • The accuracy of the optimal value impacts the number of samples significantly.

  6. Screening, sensitivity, and uncertainty for the CREAM method of Human Reliability Analysis

    International Nuclear Information System (INIS)

    Bedford, Tim; Bayley, Clare; Revie, Matthew

    2013-01-01

    This paper reports a sensitivity analysis of the Cognitive Reliability and Error Analysis Method (CREAM) for Human Reliability Analysis. We consider three different aspects: the difference between the outputs of the Basic and Extended methods on the same HRA scenario; the variability in outputs through the choices made for common performance conditions (CPCs); and the variability in outputs through the assignment of choices for cognitive function failures (CFFs). We discuss the problem of interpreting categories when applying the method, compare its quantitative structure to that of first-generation methods, and also discuss how dependence is modelled within the approach. We show that the control mode intervals used in the Basic method are too narrow to be consistent with the Extended method. This motivates a new screening method that gives improved accuracy with respect to the Basic method, in the sense that it (on average) halves the uncertainty associated with the Basic method. We make some observations on the design of a screening method that are generally applicable in Risk Analysis. Finally, we propose a new method of combining CPC weights with nominal probabilities so that the calculated probabilities are always in range (i.e. between 0 and 1), while satisfying sensible properties that are consistent with the overall CREAM method

  7. Methods for Calculating Frequency of Maintenance of Complex Information Security System Based on Dynamics of Its Reliability

    Science.gov (United States)

    Varlataya, S. K.; Evdokimov, V. E.; Urzov, A. Y.

    2017-11-01

    This article describes the process of calculating the reliability of a certain complex information security system (CISS), using the example of a technospheric security management model, as well as the ability to determine the frequency of its maintenance using the system reliability parameter, which allows one to assess man-made risks and to forecast natural and man-made emergencies. The relevance of this article is explained by the fact that CISS reliability is closely related to information security (IS) risks. Since reliability (or resiliency) is a probabilistic characteristic of the system showing the possibility of its failure (and, as a consequence, the emergence of threats to the protected information assets), it is seen as a component of the overall IS risk in the system. As is known, there is a certain acceptable level of IS risk assigned by experts for a particular information system; when reliability is a risk-forming factor, an acceptable risk level should be maintained by routine analysis of the condition of the CISS and its elements and by their timely service. The article presents a reliability parameter calculation for a CISS with a mixed type of element connection, and a formula for the dynamics of such a system's reliability is written. The chart of CISS reliability change is an S-shaped curve which can be divided into 3 periods: an almost invariable high level of reliability, a uniform reliability reduction, and an almost invariable low level of reliability. Setting the minimum acceptable level of reliability, the graph (or formula) can be used to determine the period of time during which the system will meet requirements. Ideally, this period should not be longer than the first period of the graph. Thus, the proposed method of calculating the CISS maintenance frequency helps to solve a voluminous and critical task of information asset risk management.
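    The maintenance-interval idea in this record (intersect the declining reliability curve with the minimum acceptable level) can be sketched with an assumed logistic S-curve. The curve form and parameter names below are our illustrative assumptions, since the record does not give its formula.

    ```python
    import math

    def maintenance_deadline(r_min, t0, tau):
        """Time at which an assumed logistic reliability curve
        R(t) = 1 / (1 + exp((t - t0) / tau)) falls to the acceptable
        minimum r_min; maintenance should occur no later than this."""
        return t0 + tau * math.log(1.0 / r_min - 1.0)
    ```

    With t0 the midpoint of the decline and tau its spread, an r_min set high (e.g. 0.9) lands the deadline inside the first, high-reliability period of the S-curve, matching the record's recommendation.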

  8. Assessment of reliability of Greulich and Pyle (gp) method for ...

    African Journals Online (AJOL)

    Background: Greulich and Pyle standards are the most widely used age estimation standards all over the world. The applicability of the Greulich and Pyle standards to populations which differ from their reference population is often questioned. This study aimed to assess the reliability of Greulich and Pyle (GP) method for ...

  9. Proceeding of 35th domestic symposium on applications of structural reliability and risk assessment methods to nuclear power plants

    International Nuclear Information System (INIS)

    2005-06-01

    As the 35th domestic symposium of the Atomic Energy Research Committee of the Japan Welding Engineering Society, a symposium titled 'Applications of structural reliability/risk assessment methods to nuclear energy' was held. Six speakers gave lectures titled 'Structural reliability and risk assessment methods', 'Risk-informed regulation of US nuclear energy and the role of probabilistic risk assessment', 'Reliability and risk assessment methods in chemical plants', 'Practical structural design methods based on reliability in architectural and civil areas', 'Maintenance activities based on reliability in thermal power plants' and 'LWR maintenance strategies based on Probabilistic Fracture Mechanics'. (T. Tanaka)

  10. Identification of a practical and reliable method for the evaluation of litter moisture in turkey production.

    Science.gov (United States)

    Vinco, L J; Giacomelli, S; Campana, L; Chiari, M; Vitale, N; Lombardi, G; Veldkamp, T; Hocking, P M

    2018-02-01

    1. An experiment was conducted to compare 5 different methods for the evaluation of litter moisture. 2. For litter collection and assessment, 55 farms were selected; one shed from each farm was inspected and 9 points were identified within each shed. 3. For each device used for the evaluation of litter moisture, the mean and standard deviation of the wetness measures per collection point were assessed. 4. The reliability and overall consistency between the 5 instruments used to measure wetness were high (α = 0.72). 5. Measurements at three of the 9 collection points were sufficient to provide a reliable assessment of litter moisture throughout the shed. 6. Based on the direct correlation between litter moisture and footpad lesions, litter moisture measurement can be used as a resource-based on-farm animal welfare indicator. 7. Among the 5 methods analysed, visual scoring is the most simple and practical, and therefore the best candidate for on-farm animal welfare assessment.
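    The overall-consistency coefficient quoted in point 4 (α = 0.72) is presumably Cronbach's alpha; assuming the standard formula, it can be sketched as below. The function name and the assumption about which coefficient was used are ours, not the record's.

    ```python
    import numpy as np

    def cronbach_alpha(scores):
        """Cronbach's alpha for internal consistency.
        scores: n_observations x k_instruments array of paired measures."""
        x = np.asarray(scores, dtype=float)
        k = x.shape[1]
        item_vars = x.var(axis=0, ddof=1).sum()   # sum of per-instrument variances
        total_var = x.sum(axis=1).var(ddof=1)     # variance of the summed scores
        return k / (k - 1) * (1.0 - item_vars / total_var)
    ```

    Values near 1 mean the instruments rise and fall together across collection points; 0.72 is conventionally read as acceptable consistency.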

  11. Reliability analysis of idealized tunnel support system using probability-based methods with case studies

    Science.gov (United States)

    Gharouni-Nik, Morteza; Naeimi, Meysam; Ahadi, Sodayf; Alimoradi, Zahra

    2014-06-01

    In order to determine the overall safety of a tunnel support lining, a reliability-based approach is presented in this paper. Support elements in jointed rock tunnels are provided to control the ground movement caused by stress redistribution during the tunnel drive. The main support elements contributing to the stability of the tunnel structure are identified in order to address various aspects of reliability and sustainability in the system. The selection of efficient support methods for rock tunneling is a key factor in reducing the number of problems during construction and keeping the project cost and time within the limited budget and planned schedule. This paper introduces a smart approach by which decision-makers are able to find the overall reliability of a tunnel support system before selecting the final scheme of the lining system. Given this research focus, engineering reliability, a branch of statistics and probability, is applied to the field, and much effort has been made to use it in tunneling while investigating the reliability of the lining support system for the tunnel structure. Therefore, reliability analysis for evaluating tunnel support performance is the main idea used in this research. Decomposition approaches are used to produce the system block diagram and determine the failure probability of the whole system. The effectiveness of the proposed reliability model of the tunnel lining, together with the recommended approaches, is examined using several case studies, and the final value of reliability is obtained for different design scenarios. Considering the idea of a linear correlation between safety factors and reliability parameters, the values of isolated reliabilities are determined for the different structural components of the tunnel support system. In order to determine individual safety factors, finite element modeling is employed for the different structural subsystems and the results of numerical analyses are obtained in
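    The series/parallel decomposition used to build a system block diagram reduces to two one-line rules: series components must all survive, while parallel (redundant) components fail only if all fail. The tunnel-lining composition at the bottom is a purely hypothetical example with invented numbers, not one of the record's case studies.

    ```python
    def series(*r):
        """Series block: the system survives only if every component survives."""
        p = 1.0
        for ri in r:
            p *= ri
        return p

    def parallel(*r):
        """Parallel block: the system fails only if every component fails."""
        q = 1.0
        for ri in r:
            q *= 1.0 - ri
        return 1.0 - q

    # Hypothetical lining: a redundant pair (rock bolts 0.95, steel ribs 0.90)
    # in series with shotcrete (0.99)
    r_system = series(parallel(0.95, 0.90), 0.99)
    ```

    Nesting these two rules reproduces any series-parallel block diagram, which is the decomposition step the record describes for obtaining the overall failure probability.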

  12. Systems reliability in high risk situations

    International Nuclear Information System (INIS)

    Hunns, D.M.

    1974-12-01

    A summary is given of five papers and the discussion of a seminar promoted by the newly-formed National Centre of Systems Reliability. The topics covered include hazard analysis, reliability assessment, and risk assessment in both nuclear and non-nuclear industries. (U.K.)

  13. The human factor in operation and maintenance of complex high-reliability systems

    International Nuclear Information System (INIS)

    Ryan, T.G.

    1989-01-01

    Human factors issues in probabilistic risk assessments (PRAs) of complex high-reliability systems are addressed. These PRAs influence system operation and technical support programs such as maintainability, test, and surveillance. Using the U.S. commercial nuclear power industry as the setting, the paper addresses the manner in which PRAs currently treat human performance, the state of quantification methods and source data for analyzing human performance, and the role of human factors specialists in the analysis. The paper concludes with a presentation of TALENT, an emerging concept for fully integrating broad-based human factors expertise into the PRA process. 47 refs

  14. A simple reliability block diagram method for safety integrity verification

    International Nuclear Information System (INIS)

    Guo Haitao; Yang Xianhui

    2007-01-01

    IEC 61508 requires safety integrity verification for safety-related systems as a necessary procedure in the safety life cycle. PFDavg must be calculated to verify the safety integrity level (SIL). Since IEC 61508-6 does not give detailed explanations of the definitions and PFDavg calculations for its examples, it is difficult for common reliability or safety engineers to understand when they use the standard as guidance in practice. A method using reliability block diagrams is investigated in this study in order to provide a clear and feasible way of calculating PFDavg and to help those who take IEC 61508-6 as their guidance. The method finds the mean down times (MDTs) of both the channel and the voted group first, and then PFDavg. The calculated results for various voted groups are compared with those in IEC 61508-6 and Ref. [Zhang T, Long W, Sato Y. Availability of systems with self-diagnostic components: applying Markov model to IEC 61508-6. Reliab Eng Syst Saf 2003;80(2):133-41]. An interesting outcome emerges from the comparison. Furthermore, although the MDTs of the voted groups differ between IEC 61508-6 and this paper, the PFDavg values of the voted groups are comparatively close. With its detailed description, the RBD method presented can be applied to quantitative SIL verification, in a manner similar to the method in IEC 61508-6
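
    For readers new to PFDavg arithmetic, the flavor of the voted-group formulas can be sketched as follows. This is a simplified textbook approximation (no common-cause factor, diagnostics, or repair times), not the paper's RBD/MDT derivation; the numeric rates are invented:

    ```python
    def pfd_avg_1oo1(lambda_du, proof_test_interval_h):
        # Single channel: dangerous undetected failures accumulate between
        # proof tests, so the average unavailability is lambda_DU * T1 / 2.
        return lambda_du * proof_test_interval_h / 2.0

    def pfd_avg_1oo2(lambda_du, proof_test_interval_h):
        # 1oo2 voted group: a demand fails only if both channels have
        # failed undetected within the same proof test interval.
        return (lambda_du * proof_test_interval_h) ** 2 / 3.0

    lam = 1e-6      # dangerous undetected failure rate per hour (hypothetical)
    t1 = 8760.0     # one-year proof test interval
    print(pfd_avg_1oo1(lam, t1))  # ~4.4e-3, i.e. the SIL 2 band
    print(pfd_avg_1oo2(lam, t1))  # ~2.6e-5, i.e. the SIL 4 band
    ```

    The jump of nearly two SIL bands from a single redundant channel is why voted groups dominate safety-instrumented designs; the full standard adds correction terms for common cause and repair that this sketch omits.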

  15. Development of improved processing and evaluation methods for high reliability structural ceramics for advanced heat engine applications, Phase 1. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Pujari, V.K.; Tracey, D.M.; Foley, M.R.; Paille, N.I.; Pelletier, P.J.; Sales, L.C.; Wilkens, C.A.; Yeckley, R.L. [Norton Co., Northboro, MA (United States)]

    1993-08-01

    The program goals were to develop and demonstrate significant improvements in processing methods, process controls and non-destructive evaluation (NDE) which can be commercially implemented to produce high reliability silicon nitride components for advanced heat engine applications at temperatures to 1,370°C. The program focused on a Si3N4 - 4% Y2O3 high temperature ceramic composition and hot isostatic pressing as the method of densification. Stage I had as major objectives: (1) comparing injection molding and colloidal consolidation process routes, and selecting one route for subsequent optimization, (2) comparing the performance of water-milled and alcohol-milled powder and selecting one on the basis of performance data, and (3) adapting several NDE methods to the needs of ceramic processing. The NDE methods considered were microfocus X-ray radiography, computed tomography, ultrasonics, NMR imaging, NMR spectroscopy, fluorescent liquid dye penetrant and X-ray diffraction residual stress analysis. The colloidal consolidation process route was selected and approved as the forming technique for the remainder of the program. The material produced by the final Stage II optimized process has been given the designation NCX-5102 silicon nitride. According to plan, a large number of specimens were produced and tested during Stage III to establish a statistically robust room temperature tensile strength database for this material. Highlights of the Stage III process demonstration and the resultant database are included in the main text of the report, along with a synopsis of the NCX-5102 aqueous-based colloidal process. The R and D accomplishments for Stage I are discussed in Appendices 1-4, while the tensile strength-fractography database for the Stage III NCX-5102 process demonstration is provided in Appendix 5. 4 refs., 108 figs., 23 tabs.

  16. Selection and reporting of statistical methods to assess reliability of a diagnostic test: Conformity to recommended methods in a peer-reviewed journal

    International Nuclear Information System (INIS)

    Park, Ji Eun; Sung, Yu Sub; Han, Kyung Hwa

    2017-01-01

    To evaluate the frequency and adequacy of statistical analyses in a general radiology journal when reporting a reliability analysis for a diagnostic test. Sixty-three studies of diagnostic test accuracy (DTA) and 36 studies reporting reliability analyses published in the Korean Journal of Radiology between 2012 and 2016 were analyzed. Studies were judged using the methodological guidelines of the Radiological Society of North America-Quantitative Imaging Biomarkers Alliance (RSNA-QIBA), and COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) initiative. DTA studies were evaluated by nine editorial board members of the journal. Reliability studies were evaluated by study reviewers experienced with reliability analysis. Thirty-one (49.2%) of the 63 DTA studies did not include a reliability analysis when deemed necessary. Among the 36 reliability studies, proper statistical methods were used in all (5/5) studies dealing with dichotomous/nominal data, 46.7% (7/15) of studies dealing with ordinal data, and 95.2% (20/21) of studies dealing with continuous data. Statistical methods were described in sufficient detail regarding weighted kappa in 28.6% (2/7) of studies and regarding the model and assumptions of intraclass correlation coefficient in 35.3% (6/17) and 29.4% (5/17) of studies, respectively. Reliability parameters were used as if they were agreement parameters in 23.1% (3/13) of studies. Reproducibility and repeatability were used incorrectly in 20% (3/15) of studies. Greater attention to the importance of reporting reliability, thorough description of the related statistical methods, efforts not to neglect agreement parameters, and better use of relevant terminology are necessary.

  17. Selection and reporting of statistical methods to assess reliability of a diagnostic test: Conformity to recommended methods in a peer-reviewed journal

    Energy Technology Data Exchange (ETDEWEB)

    Park, Ji Eun; Sung, Yu Sub [Dept. of Radiology and Research Institute of Radiology, University of Ulsan College of Medicine, Asan Medical Center, Seoul (Korea, Republic of); Han, Kyung Hwa [Dept. of Radiology, Research Institute of Radiological Science, Yonsei University College of Medicine, Seoul (Korea, Republic of); and others

    2017-11-15

    To evaluate the frequency and adequacy of statistical analyses in a general radiology journal when reporting a reliability analysis for a diagnostic test. Sixty-three studies of diagnostic test accuracy (DTA) and 36 studies reporting reliability analyses published in the Korean Journal of Radiology between 2012 and 2016 were analyzed. Studies were judged using the methodological guidelines of the Radiological Society of North America-Quantitative Imaging Biomarkers Alliance (RSNA-QIBA), and COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) initiative. DTA studies were evaluated by nine editorial board members of the journal. Reliability studies were evaluated by study reviewers experienced with reliability analysis. Thirty-one (49.2%) of the 63 DTA studies did not include a reliability analysis when deemed necessary. Among the 36 reliability studies, proper statistical methods were used in all (5/5) studies dealing with dichotomous/nominal data, 46.7% (7/15) of studies dealing with ordinal data, and 95.2% (20/21) of studies dealing with continuous data. Statistical methods were described in sufficient detail regarding weighted kappa in 28.6% (2/7) of studies and regarding the model and assumptions of intraclass correlation coefficient in 35.3% (6/17) and 29.4% (5/17) of studies, respectively. Reliability parameters were used as if they were agreement parameters in 23.1% (3/13) of studies. Reproducibility and repeatability were used incorrectly in 20% (3/15) of studies. Greater attention to the importance of reporting reliability, thorough description of the related statistical methods, efforts not to neglect agreement parameters, and better use of relevant terminology are necessary.
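
    As an illustration of one reliability parameter discussed above, Cohen's kappa for two raters on dichotomous/nominal data can be computed from a contingency table as follows; the counts are invented for the example:

    ```python
    def cohens_kappa(table):
        # table[i][j]: number of cases rater 1 scored category i
        # and rater 2 scored category j.
        n = float(sum(sum(row) for row in table))
        k = len(table)
        p_observed = sum(table[i][i] for i in range(k)) / n
        row_marg = [sum(table[i]) / n for i in range(k)]
        col_marg = [sum(table[i][j] for i in range(k)) / n for j in range(k)]
        # Chance-expected agreement from the marginal distributions.
        p_expected = sum(r * c for r, c in zip(row_marg, col_marg))
        return (p_observed - p_expected) / (1.0 - p_expected)

    # Hypothetical 2x2 agreement table for a dichotomous rating
    print(cohens_kappa([[20, 5], [10, 15]]))  # ~0.4, moderate agreement
    ```

    Weighted kappa and the intraclass correlation coefficient follow the same chance-correction idea but need the weighting scheme or model stated explicitly, which is exactly the reporting gap the study documents.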

  18. Highly Efficient and Reliable Transparent Electromagnetic Interference Shielding Film.

    Science.gov (United States)

    Jia, Li-Chuan; Yan, Ding-Xiang; Liu, Xiaofeng; Ma, Rujun; Wu, Hong-Yuan; Li, Zhong-Ming

    2018-04-11

    Electromagnetic protection in optoelectronic instruments such as optical windows and electronic displays is challenging because of the essential requirements of a high optical transmittance and an electromagnetic interference (EMI) shielding effectiveness (SE). Herein, we demonstrate the creation of an efficient transparent EMI shielding film that is composed of calcium alginate (CA), silver nanowires (AgNWs), and polyurethane (PU), via a facile and low-cost Mayer-rod coating method. The CA/AgNW/PU film with a high optical transmittance of 92% achieves an EMI SE of 20.7 dB, which meets the requirements for commercial shielding applications. A superior EMI SE of 31.3 dB could be achieved, whereas the transparent film still maintains a transmittance of 81%. The integrated efficient EMI SE and high transmittance are superior to those of most previously reported transparent EMI shielding materials. Moreover, our transparent films exhibit a highly reliable shielding ability in a complex service environment, with 98 and 96% EMI SE retentions even after 30 min of ultrasound treatment and 5000 bending cycles (1.5 mm radius), respectively. The comprehensive performance that is associated with the facile fabrication strategy imparts the CA/AgNW/PU film with great potential as an optimized EMI shielding material in emerging optoelectronic devices, such as flexible solar cells, displays, and touch panels.
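
    As context for the dB figures quoted above, shielding effectiveness converts to the fraction of incident power blocked via its standard definition; a small sketch (the function name is ours, not from the paper):

    ```python
    def blocked_fraction(se_db):
        # EMI SE (dB) = -10 * log10(P_transmitted / P_incident),
        # so the transmitted fraction is 10**(-SE/10).
        return 1.0 - 10.0 ** (-se_db / 10.0)

    print(blocked_fraction(20.7))  # ~0.9915: over 99% of incident power blocked
    print(blocked_fraction(31.3))  # ~0.9993
    ```

    Every additional 10 dB of SE blocks another factor of ten of the transmitted power, which is why the 20 dB level is the usual commercial threshold.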

  19. A high-throughput multiplex method adapted for GMO detection.

    Science.gov (United States)

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences"): endogenous reference genes of the taxa, GMO screening targets, construct-specific and event-specific targets, and finally sequences from donor organisms. This assay avoids certain shortcomings of the multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  20. Impact of High-Reliability Education on Adverse Event Reporting by Registered Nurses.

    Science.gov (United States)

    McFarland, Diane M; Doucette, Jeffrey N

    Adverse event reporting is one strategy to identify risks and improve patient safety, but, historically, adverse events are underreported by registered nurses (RNs) because of fear of retribution and blame. A program was provided on high reliability to examine whether education would impact RNs' willingness to report adverse events. Although the findings were not statistically significant, they demonstrated a positive impact on adverse event reporting and support the need to create a culture of high reliability.

  1. A fast and reliable readout method for quantitative analysis of surface-enhanced Raman scattering nanoprobes on chip surface

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Hyejin; Jeong, Sinyoung; Ko, Eunbyeol; Jeong, Dae Hong, E-mail: yslee@snu.ac.kr, E-mail: debobkr@gmail.com, E-mail: jeongdh@snu.ac.kr [Department of Chemistry Education, Seoul National University, Seoul 151-742 (Korea, Republic of); Kang, Homan [Interdisciplinary Program in Nano-Science and Technology, Seoul National University, Seoul 151-742 (Korea, Republic of); Lee, Yoon-Sik, E-mail: yslee@snu.ac.kr, E-mail: debobkr@gmail.com, E-mail: jeongdh@snu.ac.kr [Interdisciplinary Program in Nano-Science and Technology, Seoul National University, Seoul 151-742 (Korea, Republic of); School of Chemical and Biological Engineering, Seoul National University, Seoul 151-742 (Korea, Republic of); Lee, Ho-Young, E-mail: yslee@snu.ac.kr, E-mail: debobkr@gmail.com, E-mail: jeongdh@snu.ac.kr [Department of Nuclear Medicine, Seoul National University Bundang Hospital, Seongnam 463-707 (Korea, Republic of)

    2015-05-15

    Surface-enhanced Raman scattering techniques have been widely used for bioanalysis due to their high sensitivity and multiplex capacity. However, the point-scanning method using a micro-Raman system, which is the most common method in the literature, has the disadvantage of extremely long measurement times for on-chip immunoassays, which use a large chip area of approximately 1 mm scale and a confocal beam spot of ca. 1 μm size. Alternative methods, such as a sampled spot scan with high confocality and a large-area scan with enlarged field of view and low confocality, have been used to minimize the measurement time in practice. In this study, we analyzed the two methods with respect to signal-to-noise ratio and sampling-induced signal fluctuations to obtain insights into a fast and reliable readout strategy. On this basis, we propose a methodology for fast and reliable quantitative measurement of the whole chip area. The proposed method adopts a raster scan covering the full 100 μm × 100 μm region as a proof-of-concept experiment, accumulating signals in the CCD detector into a single spectrum per frame. A single 10 s scan over the 100 μm × 100 μm area yielded much higher sensitivity than sampled spot scanning measurements, with none of the signal fluctuations attributable to sampled spot scanning. This readout method can serve as one of the key technologies that will bring quantitative multiplexed detection and analysis into practice.

  2. Gearbox Reliability Collaborative High-Speed Shaft Calibration

    Energy Technology Data Exchange (ETDEWEB)

    Keller, J.; McNiff, B.

    2014-09-01

    Instrumentation has been added to the high-speed shaft, pinion, and tapered roller bearing pair of the Gearbox Reliability Collaborative gearbox to measure loads and temperatures. The new shaft bending moment and torque instrumentation was calibrated and the purpose of this document is to describe this calibration process and results, such that the raw shaft bending and torque signals can be converted to the proper engineering units and coordinate system reference for comparison to design loads and simulation model predictions.
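
    Converting raw shaft-bending and torque signals to engineering units, as described above, amounts to applying a calibration slope and offset fitted from known applied loads. A minimal least-squares sketch with invented numbers (not the actual GRC calibration data):

    ```python
    def fit_calibration(raw_counts, applied_load):
        # Ordinary least-squares fit of: load = slope * counts + offset
        n = len(raw_counts)
        mean_x = sum(raw_counts) / n
        mean_y = sum(applied_load) / n
        sxx = sum((x - mean_x) ** 2 for x in raw_counts)
        sxy = sum((x - mean_x) * (y - mean_y)
                  for x, y in zip(raw_counts, applied_load))
        slope = sxy / sxx
        return slope, mean_y - slope * mean_x

    # Hypothetical calibration points: ADC counts vs applied bending moment (kNm)
    slope, offset = fit_calibration([0.0, 1000.0, 2000.0], [0.5, 10.5, 20.5])
    engineering_value = slope * 1500.0 + offset  # convert a raw reading
    print(slope, offset, engineering_value)
    ```

    In a real calibration the fit residuals and the sensor's coordinate-system orientation would also be checked before the conversion is trusted against design loads.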

  3. Learning Organizations in High Reliability Industries

    International Nuclear Information System (INIS)

    Schwalbe, D.; Wächter, C.

    2016-01-01

    Full text: Humans make mistakes. Sometimes we learn from them. In a high reliability organization we have to learn before an error leads to an incident (or even an accident). Therefore the “human factor” is most important, as most of the time the human is the last line of defense. The “human factor” is more than communication or leadership skills. In the end, it is the personal attitude. This attitude has to be safety minded, and it has to be self-reflected continuously. Moreover, feedback from others is urgently needed to improve one’s personal skills daily and to learn from our own experience as well as from that of others. (author)

  4. A Hierarchical Reliability Control Method for a Space Manipulator Based on the Strategy of Autonomous Decision-Making

    Directory of Open Access Journals (Sweden)

    Xin Gao

    2016-01-01

    In order to maintain and enhance the operational reliability of a robotic manipulator deployed in space, an operational reliability system control method is presented in this paper. First, a method to divide factors affecting the operational reliability is proposed, which divides the operational reliability factors into task-related factors and cost-related factors. Then the models describing the relationships between the two kinds of factors and control variables are established. Based on this, a multivariable and multiconstraint optimization model is constructed. Second, a hierarchical system control model which incorporates the operational reliability factors is constructed. The control process of the space manipulator is divided into three layers: task planning, path planning, and motion control. Operational reliability related performance parameters are measured and used as the system’s feedback. Taking the factors affecting the operational reliability into consideration, the system can autonomously decide which control layer of the system should be optimized and how to optimize it using a control level adjustment decision module. The operational reliability factors affect these three control levels in the form of control variable constraints. Simulation results demonstrate that the proposed method can achieve a greater probability of meeting the task accuracy requirements, while extending the expected lifetime of the space manipulator.
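
    The autonomous control-level adjustment decision described above might be caricatured as a priority check over the three layers; this is purely illustrative (the parameter names and limits are invented, and the paper's actual decision module solves a multivariable constrained optimization):

    ```python
    def choose_layer_to_optimize(feedback, limits):
        # feedback / limits: measured operational-reliability parameters.
        # Return the highest control layer whose constraint is violated,
        # or None when all constraints are satisfied.
        if feedback["task_error"] > limits["task_error"]:
            return "task planning"
        if feedback["path_margin"] < limits["path_margin"]:
            return "path planning"
        if feedback["joint_temperature"] > limits["joint_temperature"]:
            return "motion control"
        return None

    limits = {"task_error": 0.01, "path_margin": 0.05, "joint_temperature": 70.0}
    print(choose_layer_to_optimize(
        {"task_error": 0.002, "path_margin": 0.03, "joint_temperature": 55.0},
        limits))  # path planning
    ```

    The point of the hierarchy is that the cheapest sufficient layer is adjusted first; only when a higher-level constraint is violated does replanning propagate upward.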

  5. A high Reliability Module with Thermoelectric Device by Molding Technology for M2M Wireless Sensor Network

    International Nuclear Information System (INIS)

    Nakagawa, K; Tanaka, T; Suzuki, T

    2014-01-01

    This paper presents the fabrication of a new energy harvesting module that uses a thermoelectric device (TED) packaged by molding technology. The output voltage per unit heater temperature of the TED module at 20 °C ambient temperature is 8 mV/K, similar to the result with an aluminium heat sink of almost the same fin size as the TED module. Accelerated environmental tests were performed, namely a damp heat test (an aging test under high temperature and high humidity), a cold test, and a highly accelerated temperature and humidity stress test (HAST), for the purpose of evaluating the electrical reliability in harsh environments. Every test indicates that the TED and circuit board can be properly protected from harsh temperature and humidity by the molding technology, because the output voltage of the tested modules is reduced by less than 5%. This study presents a novel fabrication method for a high-reliability TED-installed module appropriate for Machine-to-Machine wireless sensor networks

  6. Reliability of different methods used for forming of working samples in the laboratory for seed testing

    Directory of Open Access Journals (Sweden)

    Opra Branislava

    2000-01-01

    The testing of seed quality starts from the moment a sample is formed in a warehouse during processing or packaging of the seed. Seed sampling, the process of obtaining the working sample, encompasses each step undertaken during its testing in the laboratory. For appropriate forming of a seed sample in the laboratory, the use of a seed divider is prescribed for large-seeded species, i.e. seed the size of wheat or larger (ISTA Rules, 1999). The aim of this paper was to compare different methods used for obtaining working samples of maize and wheat seed using conical, soil and centrifugal dividers. The number of seeds of added admixtures confirmed the reliability of working sample formation. To each maize sample (1000 g), 10 seeds of each of the following admixtures were added: Zea mays L. (red pericarp), Hordeum vulgare L., Triticum aestivum L., and Glycine max (L.) Merr. Two methods were used for the formation of maize seed working samples. To the wheat samples (1000 g), 10 seeds of each of the following species were added: Avena sativa (hulled seeds), Hordeum vulgare L., Galium tricorne Stokes, and Polygonum lapathifolium L. For the formation of wheat seed working samples, four methods were used. An optimum of 9, but not fewer than 7, seeds of admixture were expected in the maize seed working sample, while for wheat at least one seed of admixture was expected to be found in the working sample. The obtained results confirmed that the formation of maize seed working samples was most reliable when the centrifugal divider with the first method was used (average of admixture: 9.37). Of the observed admixtures, the seed of Triticum aestivum L. was the most uniformly distributed, again with the first method (6.93). The second method also gives high average values satisfying the given criterion, but it should be used with previous homogenization of the sample being tested. The forming of wheat seed working samples is the most reliable if the

  7. Semiconductor laser engineering, reliability and diagnostics a practical approach to high power and single mode devices

    CERN Document Server

    Epperlein, Peter W

    2013-01-01

    This reference book provides a fully integrated novel approach to the development of high-power, single-transverse mode, edge-emitting diode lasers by addressing the complementary topics of device engineering, reliability engineering and device diagnostics in the same book, and thus closes the gap in the current book literature. Diode laser fundamentals are discussed, followed by an elaborate discussion of problem-oriented design guidelines and techniques, and by a systematic treatment of the origins of laser degradation and a thorough exploration of the engineering means to enhance the optical strength of the laser. Stability criteria of critical laser characteristics and key laser robustness factors are discussed along with clear design considerations in the context of reliability engineering approaches and models, and typical programs for reliability tests and laser product qualifications. Novel, advanced diagnostic methods are reviewed to discuss, for the first time in detail in book literature, performa...

  8. Collection of methods for reliability and safety engineering

    International Nuclear Information System (INIS)

    Fussell, J.B.; Rasmuson, D.M.; Wilson, J.R.; Burdick, G.R.; Zipperer, J.C.

    1976-04-01

    The document presented contains five reports each describing a method of reliability and safety engineering. Report I provides a conceptual framework for the study of component malfunctions during system evaluations. Report II provides methods for locating groups of critical component failures such that all the component failures in a given group can be caused to occur by the occurrence of a single separate event. These groups of component failures are called common cause candidates. Report III provides a method for acquiring and storing system-independent component failure logic information. The information stored is influenced by the concepts presented in Report I and also includes information useful in locating common cause candidates. Report IV puts forth methods for analyzing situations that involve systems which change character in a predetermined time sequence. These phased missions techniques are applicable to the hypothetical "accident chains" frequently analyzed for nuclear power plants. Report V presents a unified approach to cause-consequence analysis, a method of analysis useful during risk assessments. This approach, as developed by the Danish Atomic Energy Commission, is modified to reflect the format and symbology conventionally used for other types of analysis of nuclear reactor systems

  9. Method of reliability allocation based on fault tree analysis and fuzzy math in nuclear power plants

    International Nuclear Information System (INIS)

    Chen Zhaobing; Deng Jian; Cao Xuewu

    2005-01-01

    Reliability allocation is a difficult multi-objective optimization problem. It can be applied not only to determine the reliability characteristics of reactor systems, subsystems and main components but also to improve the design, operation and maintenance of nuclear plants. Fuzzy math, one of the powerful tools for fuzzy optimization, and fault tree analysis, one of the effective methods of reliability analysis, are applied to the reliability allocation model in this paper to address, respectively, the fuzzy character of some factors and the choice of subsystems. We thus develop a failure rate allocation model on the basis of fault tree analysis and fuzzy math. For the reliability constraint factors, we choose the six most important ones according to practical needs for conducting the reliability allocation. The subsystems are selected by top-level fault tree analysis to avoid allocating reliability to all equipment and components, including unnecessary parts. During the allocation process, some factors can be calculated or measured quantitatively while others can only be assessed qualitatively by the expert rating method. We therefore adopt fuzzy decision-making and dualistic contrast to realize the reliability allocation with the help of fault tree analysis. Finally, the example of reliability allocation for the emergency diesel generator is used to illustrate the model and show that it is simple and applicable. (authors)
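
    The final allocation step can be reduced to a simple proportional rule once weights for the constraint factors have been aggregated: subsystems judged more tolerant of failure receive a larger share of the system failure-rate target. A sketch with invented weights (not the paper's fuzzy model):

    ```python
    def allocate_failure_rate(system_lambda, weights):
        # Apportion the allowed system failure rate among subsystems in
        # proportion to their (expert-derived) allocation weights.
        total = sum(weights.values())
        return {name: system_lambda * w / total for name, w in weights.items()}

    # Hypothetical weights aggregated from ratings of six constraint factors
    weights = {"fuel system": 2.0, "cooling system": 3.0, "control circuit": 5.0}
    allocation = allocate_failure_rate(1e-4, weights)
    print(allocation)  # subsystem shares sum back to the 1e-4 system target
    ```

    The fuzzy-math layer in the paper changes how the weights are obtained (membership functions and dualistic contrast instead of crisp ratings), not this final proportional apportionment.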

  10. Educational Management Organizations as High Reliability Organizations: A Study of Victory's Philadelphia High School Reform Work

    Science.gov (United States)

    Thomas, David E.

    2013-01-01

    This executive position paper proposes recommendations for designing reform models between public and private sectors dedicated to improving school reform work in low performing urban high schools. It reviews scholarly research about for-profit educational management organizations, high reliability organizations, American high school reform, and…

  11. Methods for estimating the reliability of the RBMK fuel assemblies and elements

    International Nuclear Information System (INIS)

    Klemin, A.I.; Sitkarev, A.G.

    1985-01-01

    Applied non-parametric methods for the calculation of point and interval estimates for the basic set of reliability factors of the RBMK fuel assemblies and elements are described. The fuel assembly and element reliability factors considered are the average lifetime at a preset operating time up to unloading due to fuel burnup, as well as the average lifetime during reactor transient operation and during the steady-state fuel reloading mode of reactor operation. The formulae obtained are included in the special standardized engineering documentation
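
    As a generic flavor of such non-parametric point and interval estimation (not the paper's formulae): with s of n fuel assemblies surviving to a preset operating time, a point estimate and a normal-approximation confidence interval for the survival probability are:

    ```python
    import math

    def survival_estimate(survived, total, z=1.96):
        # Point estimate and ~95% normal-approximation confidence interval
        # for the probability of surviving to the preset operating time.
        p = survived / total
        half_width = z * math.sqrt(p * (1.0 - p) / total)
        return p, max(0.0, p - half_width), min(1.0, p + half_width)

    # Hypothetical campaign: 470 of 500 assemblies reach the target burnup
    point, lower, upper = survival_estimate(470, 500)
    print(point, lower, upper)
    ```

    For small failure counts an exact (Clopper-Pearson) interval is preferred over this normal approximation; the non-parametric character lies in making no assumption about the lifetime distribution itself.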

  12. Assessment of Electronic Circuits Reliability Using Boolean Truth Table Modeling Method

    International Nuclear Information System (INIS)

    El-Shanshoury, A.I.

    2011-01-01

    This paper explores the use of the Boolean Truth Table modeling Method (BTTM) in the analysis of qualitative data. It is widely used in certain fields, especially electrical and electronic engineering. Our work focuses on the evaluation of power supply circuit reliability using the BTTM, which involves systematic attempts to falsify and identify hypotheses on the basis of truth tables constructed from qualitative data. Reliability parameters, such as the system failure rates for the power supply case study, are estimated. All possible state combinations (operating and failed states) of the major components in the circuit were listed and their effects on the overall system were studied
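
    The core of a truth-table reliability evaluation is exhaustive enumeration of component state combinations; a generic sketch (the example circuit is invented, not the paper's power supply):

    ```python
    from itertools import product

    def system_reliability(component_r, system_up):
        # Enumerate every operating/failed state combination (the truth
        # table) and sum the probabilities of the rows where the system
        # is up, assuming independent component failures.
        total = 0.0
        for state in product((True, False), repeat=len(component_r)):
            p = 1.0
            for up, r in zip(state, component_r):
                p *= r if up else 1.0 - r
            if system_up(state):
                total += p
        return total

    # Hypothetical power-supply stage: rectifier in series with a
    # redundant pair of regulators.
    r = [0.99, 0.95, 0.95]  # rectifier, regulator A, regulator B
    up = lambda s: s[0] and (s[1] or s[2])
    print(system_reliability(r, up))  # ~0.9875
    ```

    Enumeration scales as 2^n, so it suits circuits with a modest number of major components, which is exactly the setting of a truth-table analysis.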

  13. Sub-nanosecond jitter, repetitive impulse generators for high reliability applications

    International Nuclear Information System (INIS)

    Krausse, G.J.; Sarjeant, W.J.

    1981-01-01

    Low jitter, high reliability impulse generator development has recently become of ever increasing importance for developing nuclear physics and weapons applications. The research and development of very low jitter (< 30 ps), multikilovolt generators for high reliability, minimum maintenance trigger applications utilizing a new class of high-pressure tetrode thyratrons now commercially available are described. The overall system design philosophy is described followed by a detailed analysis of the subsystem component elements. A multi-variable experimental analysis of this new tetrode thyratron was undertaken, in a low-inductance configuration, as a function of externally available parameters. For specific thyratron trigger conditions, rise times of 18 ns into 6.0-Ω loads were achieved at jitters as low as 24 ps. Using this database, an integrated trigger generator system with solid-state front-end is described in some detail. The generator was developed to serve as the Master Trigger Generator for a large neutrino detector installation at the Los Alamos Meson Physics Facility

  14. Reliable on-line storage in the ALICE High-Level Trigger

    Energy Technology Data Exchange (ETDEWEB)

    Kalcher, Sebastian; Lindenstruth, Volker [Kirchhoff Institute of Physics, University of Heidelberg (Germany)

    2009-07-01

    The on-line disk capacity within large computing clusters such as those used in the ALICE High-Level Trigger (HLT) is often unused due to the inherent unreliability of the disks involved. With currently available hard drive capacities, the total on-line capacity can be significant compared to the storage requirements of present high energy physics experiments. In this talk we report on ClusterRAID, a reliable, distributed mass storage system which harnesses the (often unused) disk capacities of large cluster installations. The key paradigm of this system is to transform the local hard drive into a reliable device. It provides adjustable fault tolerance by utilizing sophisticated error-correcting codes. To reduce the cost of coding and decoding operations, the use of modern graphics processing units as co-processors has been investigated, and the utilization of low-overhead, high-performance communication networks has been examined. A prototype setup of the system exists within the HLT with 90 TB gross capacity.
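
    The "reliable device" paradigm rests on erasure coding across drives. The degenerate single-failure case, XOR parity as in RAID-4, can be sketched as follows; ClusterRAID itself uses more sophisticated error-correcting codes that tolerate several simultaneous failures:

    ```python
    def xor_blocks(blocks):
        # Bytewise XOR of equally sized blocks; the result serves as the
        # parity block written to a separate drive or node.
        out = bytes(len(blocks[0]))
        for block in blocks:
            out = bytes(a ^ b for a, b in zip(out, block))
        return out

    # Hypothetical data blocks held on three cluster nodes
    data = [b"node0 da", b"node1 db", b"node2 dc"]
    parity = xor_blocks(data)

    # If one block is lost, XOR of the survivors and the parity recovers it.
    recovered = xor_blocks([data[0], data[2], parity])
    print(recovered)  # b'node1 db'
    ```

    Adjustable fault tolerance means replacing this single parity with codes such as Reed-Solomon, where k parity blocks allow any k losses to be recovered at higher coding cost, hence the interest in GPU co-processing.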

  15. An attempt to use FMEA method for an approximate reliability assessment of machinery

    Directory of Open Access Journals (Sweden)

    Przystupa Krzysztof

    2017-01-01

    The paper presents a modified FMEA (Failure Mode and Effect Analysis) method to assess the reliability of the components that make up a Type 2145 MAX Impactol™ Driver impact wrench (Ingersoll Rand Company). The case concerns reliability analysis under conditions where full service data is not known. The aim of the study is to determine the weakest element in the design of the tool.
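
    The arithmetic at the heart of an FMEA worksheet is the risk priority number, the product of severity, occurrence, and detection ratings, used to rank failure modes. A sketch with invented failure modes and ratings (not the wrench study's data):

    ```python
    def risk_priority_number(severity, occurrence, detection):
        # Each factor is rated 1-10; a higher RPN means higher priority
        # for corrective action.
        return severity * occurrence * detection

    # Hypothetical failure modes: (name, severity, occurrence, detection)
    failure_modes = [
        ("anvil fracture",   9, 2, 4),
        ("hammer cage wear", 6, 5, 3),
        ("O-ring extrusion", 3, 7, 2),
    ]
    ranked = sorted(failure_modes,
                    key=lambda m: risk_priority_number(*m[1:]), reverse=True)
    for name, s, o, d in ranked:
        print(name, risk_priority_number(s, o, d))
    ```

    When occurrence ratings cannot be backed by service data, as in the study above, they are estimated from expert judgment, which is precisely what a modified FMEA has to make defensible.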

  16. Identification of reliable gridded reference data for statistical downscaling methods in Alberta

    Science.gov (United States)

    Eum, H. I.; Gupta, A.

    2017-12-01

    Climate models provide essential information for assessing climate change impacts at regional and global scales, but statistical downscaling methods are needed to prepare climate model data for applications such as hydrologic and ecologic modelling at the watershed scale. As the reliability and (spatial and temporal) resolution of statistically downscaled climate data depend mainly on the reference data, identifying the most reliable reference data is crucial for statistical downscaling. A growing number of gridded climate products are available for key climate variables, which are the main input data for regional modelling systems. However, inconsistencies among these products, for example different combinations of climate variables, varying data domains and data lengths, and accuracy varying with the physiographic characteristics of the landscape, pose significant challenges in selecting the most suitable reference climate data for environmental studies and modelling. Employing several observation-based daily gridded climate products available in the public domain, i.e. thin plate spline regression products (ANUSPLIN and TPS), an inverse distance method (Alberta Townships), a numerical climate model (North American Regional Reanalysis), and an optimum interpolation technique (Canadian Precipitation Analysis), this study evaluates the accuracy of the climate products at each grid point by comparison with the Adjusted and Homogenized Canadian Climate Data (AHCCD) observations for precipitation and minimum and maximum temperature over the province of Alberta. Based on the performance of the climate products at AHCCD stations, we ranked the reliability of these publicly available products with station elevations discretized into several classes. According to the rank of climate products for each elevation class, we identified the most reliable climate products based on the elevation of target points. A web-based system
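
The evaluation loop described above, scoring every product against station observations and then ranking within elevation classes, can be sketched as follows. The tuple layout, the RMSE score, and the 500 m class width are illustrative assumptions, not the study's actual configuration:

```python
import math
from collections import defaultdict

def rmse(pairs):
    """Root-mean-square error over (predicted, observed) pairs."""
    return math.sqrt(sum((p - o) ** 2 for p, o in pairs) / len(pairs))

def rank_products(records, band=500):
    """records: (product, station_elev_m, predicted, observed) tuples.
    Returns {elevation_class: [product names, best RMSE first]}."""
    by_class = defaultdict(lambda: defaultdict(list))
    for product, elev, pred, obs in records:
        by_class[int(elev // band)][product].append((pred, obs))
    return {cls: sorted(prods, key=lambda p: rmse(prods[p]))
            for cls, prods in by_class.items()}
```

A caller would then pick the first-ranked product for whichever elevation class a target point falls into.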

  17. Transferring Aviation Practices into Clinical Medicine for the Promotion of High Reliability.

    Science.gov (United States)

    Powell-Dunford, Nicole; McPherson, Mark K; Pina, Joseph S; Gaydos, Steven J

    2017-05-01

    Aviation is a classic example of a high reliability organization (HRO), an organization in which catastrophic events would be expected to occur in the absence of control measures. As health care systems transition toward high reliability, aviation practices are increasingly transferred for clinical implementation. A PubMed search using the terms aviation, crew resource management, and patient safety was undertaken. Manuscripts authored by physician pilots and accident investigation regulations were analyzed, and subject matter experts involved in the adoption of aviation practices into the medical field were interviewed. The PubMed search yielded 621 results, with 22 relevant for inclusion. Improved clinical outcomes were noted in five research trials in which aviation practices were adopted, particularly with regard to checklist usage and crew resource management training. The effectiveness of interventions was influenced by the intensity of application, leadership involvement, and provision of staff training. The usefulness of incorporating mishap investigation techniques has not been established: whereas aviation accident investigation is highly standardized, the investigation of medical error is characterized by variation. The adoption of aviation practices into clinical medicine facilitates an evolution toward high reliability. Evidence for the efficacy of the checklist and crew resource management training is robust, while the transference of aviation accident investigation practices remains preliminary. A standardized, independent investigation process could facilitate the development of a safety culture commensurate with that achieved in the aviation industry. Powell-Dunford N, McPherson MK, Pina JS, Gaydos SJ. Transferring aviation practices into clinical medicine for the promotion of high reliability. Aerosp Med Hum Perform. 2017; 88(5):487-491.

  18. [A reliability growth assessment method and its application in the development of equipment in space cabin].

    Science.gov (United States)

    Chen, J D; Sun, H L

    1999-04-01

    Objective. To assess and predict the reliability of equipment dynamically by making full use of the various test information generated during product development. Method. A new reliability growth assessment method based on the Army Materiel Systems Analysis Activity (AMSAA) model was developed, combining the AMSAA model with test data conversion technology. Result. The assessment and prediction results for a space-borne equipment conformed to expectations. Conclusion. It is suggested that this method be further researched and popularized.

  19. Reliability of force-velocity relationships during deadlift high pull.

    Science.gov (United States)

    Lu, Wei; Boyas, Sébastien; Jubeau, Marc; Rahmani, Abderrahmane

    2017-11-13

    This study aimed to evaluate the within- and between-session reliability of force, velocity and power performances and to assess the force-velocity relationship during the deadlift high pull (DHP). Nine participants performed two identical sessions of DHP with loads ranging from 30 to 70% of body mass. The force was measured by a force plate under the participants' feet, the velocity of the 'body + lifted mass' system was calculated by integrating the acceleration, and the power was calculated as the product of force and velocity. The force-velocity relationships were obtained from linear regression of both mean and peak values of force and velocity. The within- and between-session reliability was evaluated by using coefficients of variation (CV) and intraclass correlation coefficients (ICC). Results showed that the DHP force-velocity relationships were significantly linear (R² > 0.90) and highly reliable (ICC > 0.94), and mean and peak velocities showed good agreement (low CV). DHP force-velocity relationships are therefore reliable and can be utilised as a tool to characterise individuals' muscular profiles.
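
The force-velocity relationship itself comes from an ordinary least-squares line through the per-load force and velocity values; its intercepts give the theoretical maxima F0 (force at zero velocity) and V0 (velocity at zero force). A minimal sketch with invented numbers, not the study's data:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def fv_profile(velocities, forces):
    """Fit F = F0 - a*v and return (F0, V0): the force-axis intercept
    and the velocity at which the fitted force reaches zero."""
    slope, intercept = linear_fit(velocities, forces)
    return intercept, -intercept / slope
```

With perfectly linear trial data the fit recovers the underlying line; with real repeated sessions, the CV and ICC of the fitted parameters quantify the reliability the abstract reports.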

  20. Direct unavailability computation of a maintained highly reliable system

    Czech Academy of Sciences Publication Activity Database

    Briš, R.; Byczanski, Petr

    2010-01-01

    Roč. 224, č. 3 (2010), s. 159-170 ISSN 1748-0078 Grant - others:GA Mšk(CZ) MSM6198910007 Institutional research plan: CEZ:AV0Z30860518 Keywords: high reliability * availability * directed acyclic graph Subject RIV: BA - General Mathematics http://journals.pepublishing.com/content/rtp3178l17923m46/

  1. Implementing eco friendly highly reliable upload feature using multi 3G service

    Science.gov (United States)

    Tanutama, Lukas; Wijaya, Rico

    2017-12-01

    Eco-friendly Internet access is the current trend; in this research, eco-friendly is understood as minimal power consumption. The devices selected have low operational power consumption and consume essentially no power while hibernating in the idle state. For reliability, a router with an internal load-balancing feature improves on previous research on multi-3G services for broadband lines. Previous studies emphasized accessing and downloading information files from Web servers residing in the public cloud. The demand is not only for speed but for high reliability of access as well, since high reliability mitigates both the direct and indirect costs of repeated attempts at uploading and downloading large files. Nomadic and mobile computer users need a viable solution. A solution for downloading information had previously been proposed and tested, with promising results. That result is now extended to provide a reliable access line, by means of redundancy and automatic reconfiguration, for uploading and downloading large information files to a Web server in the cloud. The technique takes advantage of the internal load-balancing feature to provision a redundant line that acts as a backup. A router was chosen that can balance load across several WAN lines, which are constructed from multiple 3G lines. Accessing the Internet over more than one 3G line increases the reliability and availability of the access, as the second line immediately takes over if the first line is disturbed.
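
The behaviour described, balancing traffic across several 3G lines and letting a healthy line take over when one is disturbed, can be modelled as a round-robin pool that skips down lines. This is an illustrative sketch, not the router's firmware logic:

```python
import itertools

class WanPool:
    """Round-robin load balancing over healthy 3G lines with
    automatic failover to the remaining lines."""

    def __init__(self, names):
        self.names = list(names)
        self.up = {n: True for n in self.names}
        self._rr = itertools.cycle(self.names)

    def set_state(self, name, is_up):
        """Mark a line as up or down (e.g. from a health probe)."""
        self.up[name] = is_up

    def next_line(self):
        """Pick the next healthy line, or None if all are down."""
        for _ in range(len(self.names)):
            name = next(self._rr)
            if self.up[name]:
                return name
        return None
```

Each outgoing transfer asks the pool for a line, so a disturbed primary is skipped transparently and traffic resumes on it as soon as it is marked healthy again.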

  2. Reliability-Based Topology Optimization Using Stochastic Response Surface Method with Sparse Grid Design

    Directory of Open Access Journals (Sweden)

    Qinghai Zhao

    2015-01-01

    Full Text Available A mathematical framework is developed which integrates the reliability concept into topology optimization to solve reliability-based topology optimization (RBTO) problems under uncertainty. Two typical methodologies, the performance measure approach (PMA) and the sequential optimization and reliability assessment (SORA), have been presented and implemented. To enhance the computational efficiency of reliability analysis, the stochastic response surface method (SRSM) is applied to approximate the true limit state function with respect to the normalized random variables, combined with a reasonable design of experiments generated by sparse grid design, which has proven to be an effective discretization technique. Uncertainties such as material property and external loads are considered in three numerical examples: a cantilever beam, a loaded knee structure, and a heat conduction problem. Monte Carlo simulations are also performed to verify the accuracy of the failure probabilities computed by the proposed approach. Based on the results, it is demonstrated that application of SRSM with sparse grid design produces an efficient reliability analysis in RBTO and enables a more reliable design than that obtained by deterministic topology optimization (DTO). It is also found that, at identical accuracy, SORA is superior to PMA in terms of computational efficiency.
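
The surrogate-plus-simulation workflow can be illustrated in one dimension: evaluate the limit state function at a few support points, fit a polynomial response surface, then run Monte Carlo on the cheap surrogate. A full SRSM uses Hermite polynomial expansions over sparse grid points in several normalized variables; the interpolating quadratic and the linear limit state g(x) = 3 - x below are only a toy stand-in:

```python
import random

def fit_quadratic(xs, ys):
    """Interpolating quadratic c0 + c1*x + c2*x**2 through 3 points,
    a tiny stand-in for a least-squares response surface."""
    (x0, x1, x2), (y0, y1, y2) = xs, ys

    def basis(xi, xj, xk):
        # Lagrange basis polynomial expanded into coefficients.
        den = (xi - xj) * (xi - xk)
        return (xj * xk / den, -(xj + xk) / den, 1.0 / den)

    c = [0.0, 0.0, 0.0]
    for (xi, xj, xk), y in (((x0, x1, x2), y0),
                            ((x1, x0, x2), y1),
                            ((x2, x0, x1), y2)):
        b = basis(xi, xj, xk)
        for k in range(3):
            c[k] += y * b[k]
    return c

def failure_probability(coeffs, n=200_000, seed=1):
    """Monte Carlo estimate of P(g(X) < 0) on the surrogate, X ~ N(0, 1)."""
    rng = random.Random(seed)
    c0, c1, c2 = coeffs
    fails = 0
    for _ in range(n):
        x = rng.gauss(0.0, 1.0)
        if c0 + c1 * x + c2 * x * x < 0.0:
            fails += 1
    return fails / n
```

For the linear limit state the fitted surface reproduces g exactly, and the Monte Carlo estimate approaches the exact normal tail probability P(X > 3).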

  3. Reliability and Sensitivity Analysis for Laminated Composite Plate Using Response Surface Method

    International Nuclear Information System (INIS)

    Lee, Seokje; Kim, Ingul; Jang, Moonho; Kim, Jaeki; Moon, Jungwon

    2013-01-01

    Advanced fiber-reinforced laminated composites are widely used in various fields of engineering to reduce weight. The material property of each ply is well characterized; in particular, a ply is less reliable than metallic materials and very sensitive to the loading direction. It is therefore important to consider this uncertainty in the design of laminated composites. In this study, reliability analysis is conducted using Callosum and Meatball interactions for a laminated composite plate for the case in which the tip deflection is the design requirement and the material property is a random variable. Furthermore, the efficiency and accuracy of the approximation method are examined, and a probabilistic sensitivity analysis is conducted. As a result, the applicability of the advanced design method to the stabilizer of an underwater vehicle is demonstrated

  4. Reliability and Sensitivity Analysis for Laminated Composite Plate Using Response Surface Method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seokje; Kim, Ingul [Chungnam National Univ., Daejeon (Korea, Republic of); Jang, Moonho; Kim, Jaeki; Moon, Jungwon [LIG Nex1, Yongin (Korea, Republic of)

    2013-04-15

    Advanced fiber-reinforced laminated composites are widely used in various fields of engineering to reduce weight. The material property of each ply is well characterized; in particular, a ply is less reliable than metallic materials and very sensitive to the loading direction. It is therefore important to consider this uncertainty in the design of laminated composites. In this study, reliability analysis is conducted using Callosum and Meatball interactions for a laminated composite plate for the case in which the tip deflection is the design requirement and the material property is a random variable. Furthermore, the efficiency and accuracy of the approximation method are examined, and a probabilistic sensitivity analysis is conducted. As a result, the applicability of the advanced design method to the stabilizer of an underwater vehicle is demonstrated.

  5. Prepared for the thirtieth annual conference on bioassay analytical and environmental chemistry. Reliable analysis of high resolution gamma spectra

    International Nuclear Information System (INIS)

    Spitz, H.B.; Buschbom, R.; Rieksts, G.A.; Palmer, H.E.

    1985-01-01

    A new method has been developed to reliably analyze pulse-height energy spectra obtained from measurements employing high resolution germanium detectors. The method employs a simple data transformation and smoothing function to calculate the background, identify photopeaks, and support isotopic analysis. The technique is elegant in its simplicity because it avoids dependence upon complex spectrum deconvolution, stripping, or other least-squares fitting techniques that complicate the assessment of measurement reliability. A moving median was chosen for data smoothing because, unlike moving averages, medians are not dominated by extreme data points. Peaks are identified whenever the difference between the background spectrum and the transformed spectrum exceeds a pre-determined number of standard deviations
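
The smoothing-and-thresholding scheme lends itself to a short sketch: a moving median estimates the background, and channels exceeding it by a preset number of Poisson standard deviations are flagged as photopeaks. The window size and threshold here are illustrative choices, not the paper's values:

```python
import statistics

def moving_median(spectrum, window=5):
    """Smooth a pulse-height spectrum with a moving median; unlike a
    moving average, the median is not dragged up by a narrow peak."""
    half = window // 2
    padded = spectrum[:1] * half + list(spectrum) + spectrum[-1:] * half
    return [statistics.median(padded[i:i + window])
            for i in range(len(spectrum))]

def find_peaks(spectrum, background, n_sigma=3.0):
    """Flag channels whose counts exceed the background estimate by
    n_sigma Poisson standard deviations (sigma ~ sqrt(background))."""
    peaks = []
    for i, (counts, bg) in enumerate(zip(spectrum, background)):
        if counts - bg > n_sigma * max(bg, 1.0) ** 0.5:
            peaks.append(i)
    return peaks
```

Because the median ignores a few elevated channels, a narrow photopeak barely perturbs the background estimate, which is exactly the property the abstract highlights.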

  6. Analyzing the reliability of shuffle-exchange networks using reliability block diagrams

    International Nuclear Information System (INIS)

    Bistouni, Fathollah; Jahanshahi, Mohsen

    2014-01-01

    Supercomputers and multi-processor systems are comprised of thousands of processors that need to communicate in an efficient way. One reasonable solution is the utilization of multistage interconnection networks (MINs), where the challenge is to analyze the reliability of such networks. One method to increase the reliability and fault tolerance of MINs is the use of additional switching stages. Recently, therefore, the reliability of one of the most common MINs, the shuffle-exchange network (SEN), has been evaluated by investigating the impact of increasing the number of switching stages, concluding that the reliability of SEN with one additional stage (SEN+) is better than that of SEN or SEN with two additional stages (SEN+2), and that the reliability of SEN is better than that of SEN+2. Here we re-evaluate the reliability of these networks; the results of the terminal, broadcast, and network reliability analyses demonstrate that SEN+ and SEN+2 consistently outperform SEN and are very alike in terms of reliability. - Highlights: • The impact of increasing the number of stages on the reliability of MINs is investigated. • The RBD method is used as an accurate method for the reliability analysis of MINs. • Complex series-parallel RBDs are used to determine the reliability of the MINs. • All measures of reliability (terminal, broadcast, and network reliability) are analyzed. • All reliability equations are calculated for different sizes N×N
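
The series-parallel RBD arithmetic underlying such an analysis reduces to two rules: blocks in series multiply their reliabilities, and blocks in parallel multiply their unreliabilities. A minimal sketch (the component values are invented, not the paper's SEN figures):

```python
from functools import reduce

def series(*rs):
    """All blocks must work: R = r1 * r2 * ... * rn."""
    return reduce(lambda acc, r: acc * r, rs, 1.0)

def parallel(*rs):
    """At least one block must work: R = 1 - (1-r1)(1-r2)...(1-rn)."""
    return 1.0 - reduce(lambda acc, r: acc * (1.0 - r), rs, 1.0)
```

Nesting the two rules gives the terminal reliability of a redundant multistage network, and shows why an added redundant stage raises reliability: parallel(0.95, 0.95) ≈ 0.9975, well above a single 0.95 switch.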

  7. Reliable methods for computer simulation error control and a posteriori estimates

    CERN Document Server

    Neittaanmäki, P

    2004-01-01

    Recent decades have seen very rapid progress in developing numerical methods based on explicit control over approximation errors. It may be said that a new direction is now forming in numerical analysis, the main goal of which is to develop methods of reliable computation. In general, a reliable numerical method must solve two basic problems: (a) generate a sequence of approximations that converges to a solution and (b) verify the accuracy of these approximations. A computer code for such a method must consist of two respective blocks: a solver and a checker. In this book, we are chie

  8. Training and Maintaining System-Wide Reliability in Outcome Management.

    Science.gov (United States)

    Barwick, Melanie A; Urajnik, Diana J; Moore, Julia E

    2014-01-01

    The Child and Adolescent Functional Assessment Scale (CAFAS) is widely used for outcome management, for providing real-time client- and program-level data, and for the monitoring of evidence-based practices. Methods of reliability training and the assessment of rater drift are critical for service decision-making within organizations and systems of care. We assessed two approaches to CAFAS training: external technical assistance and internal technical assistance. To this end, we sampled 315 practitioners trained by the external technical assistance approach from 2,344 Ontario practitioners who had achieved reliability on the CAFAS. To assess the internal technical assistance approach as a reliable alternative training method, 140 practitioners trained internally were selected from the same pool of certified raters. Reliabilities were high for practitioners trained by both the external and internal technical assistance approaches (.909-.995 and .915-.997, respectively). One- and three-year estimates showed some drift on several scales. High and consistent reliabilities over time and across training methods have implications for the CAFAS training of behavioral health care practitioners and for the maintenance of the CAFAS as a global outcome management tool in systems of care.

  9. Structural reliability analysis based on the cokriging technique

    International Nuclear Information System (INIS)

    Zhao Wei; Wang Wei; Dai Hongzhe; Xue Guofeng

    2010-01-01

    Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, a semi-parametric interpolation technique that can be used for deterministic optimization and structural reliability, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points must be generated to fill the design space, which can be very expensive and even impractical in practical engineering analysis. Therefore, in this paper, a new method, the cokriging method, an extension of kriging, is proposed to calculate structural reliability. The cokriging approximation incorporates secondary information such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method on several numerical examples. The results indicate that the cokriging procedure described in this work can generate approximation models with improved accuracy and efficiency for structural reliability problems and is a viable alternative to kriging.
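
The core kriging mechanics can be sketched briefly: build a covariance matrix over the sample points, solve a linear system for the weights, and predict as a weighted sum of observations; with no nugget term the predictor interpolates the samples exactly. Cokriging extends the same system with gradient observations, which the sketch below omits. The Gaussian covariance model and the data are illustrative, with a small hand-rolled solver to stay self-contained:

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small system."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def kriging_predict(xs, ys, x_new, length=1.0):
    """Simple kriging with a Gaussian covariance model: the predictor
    is a weighted sum of observations, with weights from K w = k*."""
    def cov(a, b):
        return math.exp(-((a - b) / length) ** 2)
    K = [[cov(a, b) for b in xs] for a in xs]
    k_star = [cov(a, x_new) for a in xs]
    w = solve(K, k_star)
    return sum(wi * yi for wi, yi in zip(w, ys))
```

At a sampled point the weights collapse onto that observation, so the surrogate reproduces the training data exactly; far from all samples the simple-kriging prediction decays toward the (zero) prior mean.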

  10. Reliability of a Computerized Neurocognitive Test in Baseline Concussion Testing of High School Athletes.

    Science.gov (United States)

    MacDonald, James; Duerson, Drew

    2015-07-01

    Baseline assessments using computerized neurocognitive tests are frequently used in the management of sport-related concussions. Such testing is often done on an annual basis in a community setting. Reliability is a fundamental test characteristic that should be established for such tests. Our study examined the test-retest reliability of a computerized neurocognitive test in high school athletes over 1 year. Repeated measures design. Two American high schools. High school athletes (N = 117) participating in American football or soccer during the 2011-2012 and 2012-2013 academic years. All study participants completed 2 baseline computerized neurocognitive tests taken 1 year apart at their respective schools. The test measures performance on 4 cognitive tasks: identification speed (Attention), detection speed (Processing Speed), one card learning accuracy (Learning), and one back speed (Working Memory). Reliability was assessed by measuring the intraclass correlation coefficient (ICC) between the repeated measures of the 4 cognitive tasks. Pearson and Spearman correlation coefficients were calculated as a secondary outcome measure. The measure for identification speed performed best (ICC = 0.672; 95% confidence interval, 0.559-0.760) and the measure for one card learning accuracy performed worst (ICC = 0.401; 95% confidence interval, 0.237-0.542). All tests had marginal or low reliability. In a population of high school athletes, computerized neurocognitive testing performed in a community setting demonstrated low to marginal test-retest reliability on baseline assessments 1 year apart. Further investigation should focus on (1) improving the reliability of individual tasks tested, (2) controlling for external factors that might affect test performance, and (3) identifying the ideal time interval to repeat baseline testing in high school athletes. 
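
The test-retest statistic reported here, the intraclass correlation coefficient, can be computed from a two-way ANOVA decomposition of a subjects-by-sessions score table. Below is a sketch of the absolute-agreement ICC(2,1); whether that exact variant was used in the study is an assumption:

```python
def icc_2_1(data):
    """Two-way random-effects, absolute-agreement ICC(2,1) for a
    subjects x sessions table of scores."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(row[j] for row in data) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)      # subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)      # sessions
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

Perfect year-to-year agreement yields an ICC of 1.0; the 0.40-0.67 range reported above corresponds to substantial session-to-session noise relative to the spread between athletes.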

  11. How to Map Theory: Reliable Methods Are Fruitless Without Rigorous Theory.

    Science.gov (United States)

    Gray, Kurt

    2017-09-01

    Good science requires both reliable methods and rigorous theory. Theory allows us to build a unified structure of knowledge, to connect the dots of individual studies and reveal the bigger picture. Some have criticized the proliferation of pet "Theories," but generic "theory" is essential to healthy science, because questions of theory are ultimately those of validity. Although reliable methods and rigorous theory are synergistic, Action Identification suggests psychological tension between them: The more we focus on methodological details, the less we notice the broader connections. Therefore, psychology needs to supplement training in methods (how to design studies and analyze data) with training in theory (how to connect studies and synthesize ideas). This article provides a technique for visually outlining theory: theory mapping. Theory mapping contains five elements, which are illustrated with moral judgment and with cars. Also included are 15 additional theory maps provided by experts in emotion, culture, priming, power, stress, ideology, morality, marketing, decision-making, and more (see all at theorymaps.org).

  12. Reliability and construction control

    Directory of Open Access Journals (Sweden)

    Sherif S. AbdelSalam

    2016-06-01

    Full Text Available The goal of this study was to determine the most reliable and efficient combination of design and construction methods for vibro piles. For a wide range of static and dynamic formulas, reliability-based resistance factors were calculated using the EGYPT database, which houses load test results for 318 piles. The analysis was extended to introduce a construction control factor that quantifies the variation between pile nominal capacities calculated using static versus dynamic formulae. Among the major outcomes, the lowest coefficient of variation is associated with Davisson's criterion, and the resistance factors calculated for the AASHTO method are relatively high compared with other methods. Additionally, the CPT-Nottingham and Schmertmann method provided the most economical design. Recommendations related to a pile construction control factor are also presented; utilizing this factor can significantly reduce variations between calculated and actual capacities.
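
Calibrations of this kind rest on bias statistics, the ratios of measured to predicted pile capacity, whose mean and coefficient of variation drive the resistance factors; the construction control factor then compares the static and dynamic nominal capacities. A sketch with invented numbers, not the EGYPT database values:

```python
import math

def bias_stats(measured, predicted):
    """Mean bias and coefficient of variation of measured/predicted
    capacity ratios, the statistics behind resistance-factor calibration."""
    ratios = [m / p for m, p in zip(measured, predicted)]
    n = len(ratios)
    mean = sum(ratios) / n
    var = sum((r - mean) ** 2 for r in ratios) / (n - 1)  # sample variance
    return mean, math.sqrt(var) / mean

def control_factor(static_capacity, dynamic_capacity):
    """Construction control factor: ratio of nominal capacities from
    static versus dynamic formulae for the same pile."""
    return static_capacity / dynamic_capacity
```

A design method with a mean bias near 1.0 and a low COV (as reported for Davisson's criterion) earns a higher resistance factor, which is why the COV is the headline statistic in the abstract.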

  13. 3-D high-frequency endovaginal ultrasound of female urethral complex and assessment of inter-observer reliability

    International Nuclear Information System (INIS)

    Wieczorek, A.P.; Wozniak, M.M.; Stankiewicz, A.; Santoro, G.A.; Bogusiewicz, M.; Rechberger, T.

    2012-01-01

    Objectives: Assessment of the urethral complex and definition of its morphological characteristics with 3-dimensional endovaginal ultrasonography using a high-frequency rotational 360° transducer, and determination of the inter-observer reliability of the performed measurements. Materials and methods: Twenty-four asymptomatic, nulliparous females (aged 18–55, mean 32 years) underwent high-frequency (12 MHz) endovaginal ultrasound with rotational 360° and automated 3D data acquisition (type 2050, B-K Medical, Herlev, Denmark). Measurements of the urethral thickness, width and length, the bladder neck-symphysis distance, the intramural part of the urethra, and the rhabdosphincter thickness, width and length were taken by three investigators. Descriptive statistics for continuous data were performed, with results given as mean values with standard deviation. The relationships among different variables were assessed with ANOVA for repeated measures, as well as the t-test for dependent samples. The intraclass correlation coefficient (ICC) was calculated for each parameter, and intra- and inter-observer reliability was assessed. Results: The measurements showed very good reliability (ICC > 0.8) and good reliability for rhabdosphincter measurements (ICC > 0.6) between all three investigators. Conclusions: Advanced EVUS provides detailed information on the anatomy and morphology of the female urethral complex. Our results show that a 360° rotational transducer with automated 3D acquisition, currently routinely used for proctological scanning, is suitable for the reliable assessment of the urethral complex and can be applied in routine diagnostics of pelvic floor disturbances in females.

  14. Simple and rapid preparation of [11C]DASB with high quality and reliability for routine applications

    International Nuclear Information System (INIS)

    Haeusler, D.; Mien, L.-K.; Nics, L.; Ungersboeck, J.; Philippe, C.; Lanzenberger, R.R.; Kletter, K.; Dudczak, R.; Mitterhauser, M.; Wadsak, W.

    2009-01-01

    [11C]DASB combines all major prerequisites for a successful SERT ligand, providing excellent biological properties and in-vivo behaviour. We therefore aimed to establish a fully automated procedure for the synthesis and purification of [11C]DASB with a high degree of reliability, reducing the overall synthesis time while conserving high yield and purity. The optimized [11C]DASB synthesis was applied in more than 60 productions with a very low failure rate (3.2%). We obtained yields up to 8.9 GBq (average 5.3±1.6 GBq). Radiochemical yields based on [11C]CH3I (corrected for decay) were 66.3±6.9%, with a specific radioactivity (As) of 86.8±24.3 GBq/μmol (both at the end of synthesis, EOS). Time consumption was kept to a minimum: 43 min from the end of bombardment to release of the product after quality control. From our data, it is evident that the presented method can be implemented for routine preparations of [11C]DASB with high reliability.
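
The decay correction applied to such yields follows the usual half-life law (carbon-11's half-life is about 20.36 min; the function names here are illustrative):

```python
C11_HALF_LIFE_MIN = 20.36  # approximate half-life of carbon-11, in minutes

def decayed(activity, elapsed_min, half_life=C11_HALF_LIFE_MIN):
    """Activity remaining after elapsed_min: A = A0 * 2**(-t / T_half)."""
    return activity * 2.0 ** (-elapsed_min / half_life)

def decay_correct(activity, elapsed_min, half_life=C11_HALF_LIFE_MIN):
    """Back-correct a measured activity to time zero: A0 = A * 2**(t / T_half)."""
    return activity * 2.0 ** (elapsed_min / half_life)
```

Over the quoted 43 min from end of bombardment to product release, roughly two carbon-11 half-lives elapse, which is why decay-corrected and uncorrected yields differ so strongly.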

  15. Development of reliability centered maintenance methods and tools

    International Nuclear Information System (INIS)

    Jacquot, J.P.; Dubreuil-Chambardel, A.; Lannoy, A.; Monnier, B.

    1992-12-01

    This paper recalls the development of the RCM (Reliability Centered Maintenance) approach in the nuclear industry and describes the trial study implemented by EDF in the context of the OMF (RCM) Project. The approach developed is currently being applied to about thirty systems (Industrial Project). In parallel, R and D efforts are being maintained to improve the selectivity of the analysis methods. These methods use Probabilistic Safety Study models, thereby guaranteeing better selectivity in the identification of safety-critical elements and enhancing consistency between maintenance and safety studies. They also offer more detailed analysis of operating feedback, invoking for example Bayesian methods that combine expert judgement and feedback data. Finally, they propose a functional and material representation of the plant; this dual representation describes both the functions assured by maintenance provisions and the material elements required for their implementation. In the final chapter, the targets of the future OMF workstation are summarized and its insertion into the EDF information system is briefly described. (authors). 5 figs., 2 tabs., 7 refs

  16. Evaluation of the reliability of Levine method of wound swab for ...

    African Journals Online (AJOL)

    The aim of this paper is to evaluate the reliability of Levine swab in accurate identification of microorganisms present in a wound and identify the necessity for further studies in this regard. Methods: A semi structured questionnaire was administered and physical examination was performed on patients with chronic wounds ...

  17. Test-retest reliability of cognitive EEG

    Science.gov (United States)

    McEvoy, L. K.; Smith, M. E.; Gevins, A.

    2000-01-01

    OBJECTIVE: Task-related EEG is sensitive to changes in cognitive state produced by increased task difficulty and by transient impairment. If task-related EEG has high test-retest reliability, it could be used as part of a clinical test to assess changes in cognitive function. The aim of this study was to determine the reliability of the EEG recorded during the performance of a working memory (WM) task and a psychomotor vigilance task (PVT). METHODS: EEG was recorded while subjects rested quietly and while they performed the tasks. Within-session (test-retest interval of approximately 1 h) and between-session (test-retest interval of approximately 7 days) reliability was calculated for four EEG components: frontal midline theta at Fz, posterior theta at Pz, and slow and fast alpha at Pz. RESULTS: Task-related EEG was highly reliable within and between sessions (r > 0.9 for all components in the WM task, and r > 0.8 for all components in the PVT). Resting EEG also showed high reliability, although the magnitude of the correlation was somewhat smaller than that of the task-related EEG (r > 0.7 for all four components). CONCLUSIONS: These results suggest that under appropriate conditions, task-related EEG has sufficient retest reliability for use in assessing clinical changes in cognitive status.

  18. The Development of DNA Based Methods for the Reliable and Efficient Identification of Nicotiana tabacum in Tobacco and Its Derived Products

    Directory of Open Access Journals (Sweden)

    Sukumar Biswas

    2016-01-01

    Full Text Available Reliable methods for detecting the presence of tobacco components in tobacco products are needed to control smuggling and to classify tariff and excise in the tobacco industry. In this study, two sensitive and specific DNA-based methods, a quantitative real-time PCR (qPCR) assay and a loop-mediated isothermal amplification (LAMP) assay, were developed for the reliable and efficient detection of tobacco (Nicotiana tabacum) in various tobacco samples and commodities. Both assays targeted the same sequence of uridine 5′-monophosphate synthase (UMPS), and their specificities and sensitivities were determined with various plant materials. Both the qPCR and LAMP methods were reliable and accurate for the rapid detection of tobacco components in practical samples, including customs samples, reconstituted tobacco samples, and locally purchased cigarettes, showing high potential for application in tobacco identification, particularly in cases where the morphology or chemical composition of the tobacco has been disrupted. Combining both methods would therefore facilitate not only the control of tobacco smuggling but also tariff and excise classification.

  19. Reliability and availability of high power proton accelerators

    International Nuclear Information System (INIS)

    Cho, Y.

    1999-01-01

    It has become increasingly important to address the issues of operational reliability and availability of an accelerator complex early in its design and construction phases. In this context, reliability addresses the mean time between failures and the failure rate, and availability takes into account the failure rate as well as the length of time required to repair the failure. Methods to reduce failure rates include reduction of the number of components and over-design of certain key components. Reduction of the on-line repair time can be achieved by judiciously designed hardware, quick-service spare systems and redundancy. In addition, provisions for easy inspection and maintainability are important for both reduction of the failure rate as well as reduction of the time to repair. The radiation safety exposure principle of ALARA (as low as reasonably achievable) is easier to comply with when easy inspection capability and easy maintainability are incorporated into the design. Discussions of past experience in improving accelerator availability, some recent developments, and potential R and D items are presented. (author)

  20. Features of an advanced human reliability analysis method, AGAPE-ET

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Jung, Won Dea; Park, Jin Kyun [Korea Atomic Energy Research Institute, Taejeon (Korea, Republic of)

    2005-11-15

    This paper presents the main features of an advanced human reliability analysis (HRA) method, AGAPE-ET. It has the capabilities to deal with the diagnosis failures and the errors of commission (EOC), which have not been normally treated in the conventional HRAs. For the analysis of the potential for diagnosis failures, an analysis framework, which is called the misdiagnosis tree analysis (MDTA), and a taxonomy of the misdiagnosis causes with appropriate quantification schemes are provided. For the identification of the EOC events from the misdiagnosis, some procedural guidance is given. An example of the application of the method is also provided.

  1. Features of an advanced human reliability analysis method, AGAPE-ET

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Jung, Won Dea; Park, Jin Kyun

    2005-01-01

    This paper presents the main features of an advanced human reliability analysis (HRA) method, AGAPE-ET. It has the capabilities to deal with the diagnosis failures and the errors of commission (EOC), which have not been normally treated in the conventional HRAs. For the analysis of the potential for diagnosis failures, an analysis framework, which is called the misdiagnosis tree analysis (MDTA), and a taxonomy of the misdiagnosis causes with appropriate quantification schemes are provided. For the identification of the EOC events from the misdiagnosis, some procedural guidance is given. An example of the application of the method is also provided.

  2. Mission Reliability Estimation for Repairable Robot Teams

    Science.gov (United States)

    Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen

    2010-01-01

    A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares. This suggests that the
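The module-combination search described above can be sketched as exhaustive enumeration over per-subsystem alternatives under a simple series reliability model, keeping only combinations that meet a mission reliability target and sorting by cost. The subsystem names and reliability-cost pairs below are hypothetical, not taken from the study:

```python
from itertools import product

# Hypothetical module options: (reliability, cost) alternatives per subsystem
locomotion = [(0.90, 10), (0.98, 25)]
power      = [(0.95, 8),  (0.99, 20)]
compute    = [(0.97, 12), (0.995, 30)]

def team_options(target):
    """Enumerate module combinations; keep those meeting the mission
    reliability target (series model: product of module reliabilities)."""
    feasible = []
    for combo in product(locomotion, power, compute):
        rel = 1.0
        cost = 0
        for r, c in combo:
            rel *= r
            cost += c
        if rel >= target:
            feasible.append((cost, rel, combo))
    return sorted(feasible)  # cheapest first

best = team_options(0.90)[0]
print(f"cost={best[0]}, reliability={best[1]:.3f}")
```

The same enumerate-and-rank pattern extends to spares: a spare robot or spare component simply becomes another alternative with its own reliability-cost characteristic.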

  3. Leadership in organizations with high security and reliability requirements

    International Nuclear Information System (INIS)

    Gonzalez, F.

    2013-01-01

    Developing leadership skills in organizations is key to ensuring the sustainability of excellent results in industries with high safety and reliability requirements. In order to have a leadership development model specific to this type of organization, Tecnatom initiated an internal project in 2011 to find and adapt a competency model to these requirements.

  4. A Fast Optimization Method for Reliability and Performance of Cloud Services Composition Application

    Directory of Open Access Journals (Sweden)

    Zhao Wu

    2013-01-01

    At present, cloud computing is one of the newest trends in distributed computation and is propelling another important revolution in the software industry. Cloud services composition is one of the key techniques in software development. Optimizing the reliability and performance of a cloud services composition application, a typical stochastic optimization problem, faces severe challenges due to its randomness and long transactions, as well as characteristics of cloud computing resources such as openness and dynamism. Traditional reliability and performance optimization techniques, for example, Markov models and state space analysis, have defects such as being too time consuming, being prone to state space explosion, and relying on unsatisfied assumptions of component execution independence. To overcome these defects, we propose a fast optimization method for the reliability and performance of cloud services composition applications based on the universal generating function and a genetic algorithm. First, a reliability and performance model for cloud service composition applications based on multi-state system theory is presented. Then a reliability and performance definition based on the universal generating function is proposed. Based on this, a fast reliability and performance optimization algorithm is presented. In the end, illustrative examples are given.
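The universal generating function mentioned above represents each component as a probability mass over performance levels and composes components with structure operators (e.g. capacities add in parallel, the minimum governs in series). A minimal sketch with hypothetical two-state service nodes, not the paper's model:

```python
from collections import defaultdict

def compose(u1, u2, op):
    """Combine two u-functions (dicts: performance -> probability)
    with a structure operator, multiplying probabilities."""
    out = defaultdict(float)
    for g1, p1 in u1.items():
        for g2, p2 in u2.items():
            out[op(g1, g2)] += p1 * p2
    return dict(out)

def reliability(u, demand):
    """P(system performance meets the demand level)."""
    return sum(p for g, p in u.items() if g >= demand)

# Hypothetical two-state service nodes: throughput 100 when up, 0 when down
node_a = {100: 0.9, 0: 0.1}
node_b = {100: 0.8, 0: 0.2}

parallel = compose(node_a, node_b, lambda a, b: a + b)  # capacities add
series   = compose(node_a, node_b, min)                 # bottleneck governs

print(reliability(parallel, 100), reliability(series, 100))
```

In the paper's setting, a genetic algorithm would search over candidate service compositions while this u-function evaluation scores each candidate quickly.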

  5. Coupling finite elements and reliability methods - application to safety evaluation of pressurized water reactor vessels

    International Nuclear Information System (INIS)

    Pitner, P.; Venturini, V.

    1995-02-01

    When reliability studies are extended from deterministic calculations in mechanics, it is necessary to take into account the variability of input parameters, which is linked to the different sources of uncertainty. Integrals must then be calculated to evaluate the failure risk. This can be performed either by simulation methods or by approximation methods (FORM/SORM). Models in mechanics often require running calculation codes, which must then be coupled with the reliability calculations. These codes can involve large calculation times when they are invoked numerous times during simulation sequences or in complex iterative procedures. The response surface method gives an approximation of the real response from a reduced number of points at which the finite element code is run. Thus, when it is combined with FORM/SORM methods, a coupling can be carried out that gives results in a reasonable calculation time. An application of the response surface method to mechanics-reliability coupling for a mechanical model that calls a finite element code is presented. It corresponds to a probabilistic fracture mechanics study of a pressurized water reactor vessel. (authors). 5 refs., 3 figs
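The coupling described above, a cheap surrogate fitted from a few expensive code runs and then used inside a reliability calculation, can be sketched as follows. The "expensive" limit state is a hypothetical stand-in for a finite element evaluation, and Monte Carlo on the surrogate replaces FORM/SORM for brevity:

```python
import random

def expensive_g(x):
    """Stand-in for a costly finite-element limit-state evaluation
    (hypothetical): g > 0 is safe, g < 0 is failure."""
    return 3.0 - x - 0.05 * x * x

# Build a quadratic response surface from 3 support points (Lagrange form),
# so the costly model is called only 3 times.
pts = [-2.0, 0.0, 2.0]
vals = [expensive_g(p) for p in pts]

def surface(x):
    """Quadratic interpolant through the support points."""
    total = 0.0
    for i, xi in enumerate(pts):
        li = 1.0
        for j, xj in enumerate(pts):
            if i != j:
                li *= (x - xj) / (xi - xj)
        total += vals[i] * li
    return total

# Reliability evaluation runs on the cheap surrogate only (Monte Carlo
# here for brevity; FORM/SORM would use the same surrogate).
random.seed(42)
n = 200_000
failures = sum(1 for _ in range(n) if surface(random.gauss(0.0, 1.0)) < 0.0)
pf = failures / n
print(pf)
```

Because the stand-in limit state is itself quadratic, the 3-point surface reproduces it exactly; for a real code the fit is approximate and support points are refined iteratively.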

  6. Analysis of factors influencing the reliability of retrievable storage canisters for containment of solid high-level radioactive waste

    International Nuclear Information System (INIS)

    Mecham, W.J.; Seefeldt, W.B.; Steindler, M.J.

    1976-08-01

    The reliability of stainless steel type 304L canisters for the containment of solidified high-level radioactive wastes in the glass and calcine forms was studied. A reference system, drawn largely from information furnished by Battelle Northwest Laboratories and Atlantic Richfield Hanford Company, is described. Operations include filling the canister with the appropriate waste form, interim storage at a reprocessing plant, shipment in water to a Retrievable Surface Storage Facility (RSSF), interim storage at the RSSF, and shipment to a final disposal facility. The properties of stainless steel type 304L, fission product oxides, calcine, and glass were reviewed, and mechanisms of corrosion were identified and studied. The modes of corrosion important for reliability were stress-corrosion cracking, internal pressurization of the canister by residual impurities present, intergranular attack at the waste-canister interface, and potential local effects due to migration of fission products. The key role of temperature control throughout canister lifetime is considered together with interactive effects. Methods of ameliorating adverse effects and ensuring high reliability are identified and described. Conclusions and recommendations are presented.

  7. On the Reliability of Source Time Functions Estimated Using Empirical Green's Function Methods

    Science.gov (United States)

    Gallegos, A. C.; Xie, J.; Suarez Salas, L.

    2017-12-01

    The Empirical Green's Function (EGF) method (Hartzell, 1978) has been widely used to extract source time functions (STFs). In this method, seismograms generated by collocated events with different magnitudes are deconvolved. Under a fundamental assumption that the STF of the small event is a delta function, the deconvolved Relative Source Time Function (RSTF) yields the large event's STF. While this assumption can be empirically justified by examination of differences in event size and frequency content of the seismograms, there can be a lack of rigorous justification of the assumption. In practice, a small event might have a finite duration when the RSTF is retrieved and interpreted as the large event STF with a bias. In this study, we rigorously analyze this bias using synthetic waveforms generated by convolving a realistic Green's function waveform with pairs of finite-duration triangular or parabolic STFs. The RSTFs are found using a time-domain based matrix deconvolution. We find when the STFs of smaller events are finite, the RSTFs are a series of narrow non-physical spikes. Interpreting these RSTFs as a series of high-frequency source radiations would be very misleading. The only reliable and unambiguous information we can retrieve from these RSTFs is the difference in durations and the moment ratio of the two STFs. We can apply a Tikhonov smoothing to obtain a single-pulse RSTF, but its duration is dependent on the choice of weighting, which may be subjective. We then test the Multi-Channel Deconvolution (MCD) method (Plourde & Bostock, 2017) which assumes that both STFs have finite durations to be solved for. A concern about the MCD method is that the number of unknown parameters is larger, which would tend to make the problem rank-deficient. Because the kernel matrix is dependent on the STFs to be solved for under a positivity constraint, we can only estimate the rank-deficiency with a semi-empirical approach. 
Based on the results so far, we find that the
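The abstract's claim that only the duration difference and the moment ratio are reliably recoverable can be illustrated with a toy convolution model: whatever the Green's function, the ratio of seismogram areas equals the ratio of the two source time function moments. The Green's function samples and STF parameters below are hypothetical:

```python
def convolve(a, b):
    """Discrete linear convolution (pure Python)."""
    out = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def triangle(duration, moment):
    """Symmetric triangular STF with given sample duration and total area."""
    half = duration // 2
    shape = list(range(1, half + 1)) + list(range(half - 1, 0, -1))
    s = sum(shape)
    return [moment * v / s for v in shape]

green = [0.0, 0.3, 1.0, -0.6, 0.2, -0.1, 0.05]  # hypothetical Green's function
big = convolve(green, triangle(20, 5.0))    # "large event" seismogram
small = convolve(green, triangle(6, 1.0))   # "small (EGF) event" seismogram

# The moment ratio survives the convolution: ratio of seismogram areas,
# since sum(g * s) = sum(g) * sum(s) for discrete convolution.
ratio = sum(big) / sum(small)
print(round(ratio, 6))
```

The pointwise deconvolution of `big` by `small`, by contrast, is ill-posed whenever the small event's STF has finite duration, which is the bias the study analyzes.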

  8. Surviving the Lead Reliability Engineer Role in High Unit Value Projects

    Science.gov (United States)

    Perez, Reinaldo J.

    2011-01-01

    A project with a very high unit value within a company is defined as a project where a) the project constitutes a one-of-a-kind (or two-of-a-kind) national asset, b) its cost is very large, and c) a mission failure would be a very public event that would hurt the company's image. The Lead Reliability Engineer in a high visibility project is by default involved in all phases of the project, from conceptual design to manufacture and testing. This paper explores a series of lessons learned over a period of ten years of practical industrial experience by a Lead Reliability Engineer. We expand on the concepts outlined by these lessons learned via examples. The lessons learned are applicable to all industries.

  9. Structural Reliability Methods for Wind Power Converter System Component Reliability Assessment

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2012-01-01

    Wind power converter systems are essential subsystems in both off-shore and on-shore wind turbines. They are the main interface between the generator and the grid connection. This system is affected by numerous stresses, where the main contributors might be defined as vibration and temperature loadings. The temperature variations induce time-varying stresses and thereby fatigue loads. A probabilistic model is used to model fatigue failure for an electrical component in the power converter system. This model is based on linear damage accumulation and physics-of-failure approaches, where a failure criterion is defined by the threshold model. The attention is focused on crack propagation in solder joints of electrical components due to the temperature loadings. Structural reliability approaches are used to incorporate model, physical and statistical uncertainties. Reliability estimation is carried out by means of structural reliability methods.
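The linear damage accumulation mentioned above is Miner's rule: the fraction of life consumed in each load bin is summed, and failure is predicted when the damage sum reaches one. A minimal sketch with hypothetical thermal-cycle data for a solder joint:

```python
def miner_damage(cycle_counts, cycles_to_failure):
    """Linear (Palmgren-Miner) damage sum: failure predicted at D >= 1."""
    return sum(n / N for n, N in zip(cycle_counts, cycles_to_failure))

# Hypothetical thermal-cycle histogram for a solder joint: cycles seen per
# temperature-swing bin, with the corresponding cycles-to-failure taken
# from a Coffin-Manson style fatigue curve.
counts = [5.0e4, 2.0e4, 5.0e3]       # cycles seen at dT = 20, 40, 60 K
capacity = [2.0e6, 2.5e5, 5.0e4]     # cycles to failure at those swings

D = miner_damage(counts, capacity)
print(round(D, 3), "failed" if D >= 1.0 else "surviving")
```

In a structural reliability setting, the counts and capacities would be random variables, and the probability of D exceeding the (itself uncertain) threshold is what the reliability methods evaluate.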

  10. Reliability engineering for nuclear and other high technology systems

    International Nuclear Information System (INIS)

    Lakner, A.A.; Anderson, R.T.

    1985-01-01

    This book is written for the reliability instructor, program manager, system engineer, design engineer, reliability engineer, nuclear regulator, probability risk assessment (PRA) analyst, general manager and others who are involved in system hardware acquisition, design and operation and are concerned with plant safety and operational cost-effectiveness. It provides criteria, guidelines and comprehensive engineering data affecting reliability; it covers the key aspects of system reliability as it relates to conceptual planning, cost tradeoff decisions, specification, contractor selection, design, test and plant acceptance and operation. It treats reliability as an integrated methodology, explicitly describing life cycle management techniques as well as the basic elements of a total hardware development program, including: reliability parameters and design improvement attributes, reliability testing, reliability engineering and control. It describes how these elements can be defined during procurement, and implemented during design and development to yield reliable equipment. (author)

  11. Highly-reliable operation of 638-nm broad stripe laser diode with high wall-plug efficiency for display applications

    Science.gov (United States)

    Yagi, Tetsuya; Shimada, Naoyuki; Nishida, Takehiro; Mitsuyama, Hiroshi; Miyashita, Motoharu

    2013-03-01

    Laser-based displays, from pico projectors to cinema projectors, have gathered much attention because of their wide gamut, low power consumption, and so on. Laser light sources for displays are operated mainly in CW mode, and heat management is one of the big issues; therefore, highly efficient operation is necessary. The light sources for displays are also required to be highly reliable. A 638 nm broad stripe laser diode (LD) was newly developed for highly efficient and highly reliable operation. An AlGaInP/GaAs red LD suffers from low wall-plug efficiency (WPE) due to electron overflow from the active layer to the p-cladding layer. A large optical confinement factor (Γ) design with AlInP cladding layers is adopted to improve the WPE. The design has a disadvantage for reliable operation because the large Γ causes high optical density and can bring catastrophic optical degradation (COD) at the front facet. To overcome this disadvantage, a window-mirror structure is also adopted in the LD. The LD shows a WPE of 35% at 25°C, the highest record in the world, and highly stable operation at 35°C and 550 mW up to 8,000 hours without any catastrophic optical degradation.

  12. A Reliability-Oriented Design Method for Power Electronic Converters

    DEFF Research Database (Denmark)

    Wang, Huai; Zhou, Dao; Blaabjerg, Frede

    2013-01-01

    Reliability is a crucial performance indicator of power electronic systems in terms of availability, mission accomplishment and life cycle cost. A paradigm shift in the research on reliability of power electronics is going on, from simple handbook based calculations (e.g. models in the MIL-HDBK-217F handbook) ... and reliability prediction models are provided. A case study on a 2.3 MW wind power converter is discussed with emphasis on the reliability-critical component, IGBT modules.

  13. High reliability fuel in the US

    International Nuclear Information System (INIS)

    Neuhold, R.J.; Leggett, R.D.; Walters, L.C.; Matthews, R.B.

    1986-05-01

    The fuels development program of the United States is described for liquid metal reactors (LMRs). The experience base, status and future potential are discussed for the three systems - oxide, metal and carbide - that have proved to have high reliability. Information is presented showing the burnup capability of the oxide fuel system in a large core, e.g., FFTF, to be 150 MWd/kgM with today's technology, with the potential for a capability as high as 300 MWd/kgM. Data provided for the metal fuel system show 8 at. % being routinely achieved with the EBR-II driver fuel, with good potential for extending this to 15 at. % since special test pins have already exceeded this burnup level. The data included for the carbide fuel system are from pin and assembly irradiations in EBR-II and FFTF, respectively. Burnup to 12 at. % appears readily achievable, with burnups to 20 at. % being demonstrated in a few pins. Efforts continue on all three systems, with the bulk of the activity on metal and oxide.

  14. Sensitivity Weaknesses in Application of some Statistical Distribution in First Order Reliability Methods

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Enevoldsen, I.

    1993-01-01

    It has been observed and shown that in some examples a sensitivity analysis of the first order reliability index results in an increasing reliability index when the standard deviation of a stochastic variable is increased while the expected value is fixed. This unfortunate behaviour can occur when a stochastic variable is modelled by an asymmetrical density function. For lognormal, Gumbel and Weibull distributed stochastic variables it is shown for which combinations of the β-point, the expected value and the standard deviation the weakness can occur. In relation to practical application the behaviour is probably rather infrequent. A simple example is shown as illustration and to exemplify that for second order reliability methods and for exact calculations of the probability of failure this behaviour is much more infrequent.
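The weakness described above can be reproduced numerically for a lognormal variable: with the mean held fixed, increasing the standard deviation can reduce the probability of exceeding a threshold above the mean, i.e. the reliability index grows as the uncertainty grows. A small self-contained check with illustrative parameters only (not the paper's example):

```python
from math import erf, log, sqrt

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def pf_lognormal_exceed(mean, std, threshold):
    """P(X > threshold) for a lognormal X with the given mean and std."""
    zeta2 = log(1.0 + (std / mean) ** 2)
    zeta = sqrt(zeta2)
    lam = log(mean) - zeta2 / 2.0
    return 1.0 - phi((log(threshold) - lam) / zeta)

# Fixed mean and a threshold above it: a larger standard deviation gives
# a *smaller* exceedance probability for these parameter combinations.
p1 = pf_lognormal_exceed(1.0, 2.0, 2.0)
p2 = pf_lognormal_exceed(1.0, 2.5, 2.0)
print(p1 > p2)  # True for these parameters
```

The effect arises because, at fixed mean, increasing σ drags the lognormal's median down faster than it fattens the upper tail beyond this threshold.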

  15. Review on Laryngeal Palpation Methods in Muscle Tension Dysphonia: Validity and Reliability Issues.

    Science.gov (United States)

    Khoddami, Seyyedeh Maryam; Ansari, Noureddin Nakhostin; Jalaie, Shohreh

    2015-07-01

    Laryngeal palpation is a common clinical method for the assessment of neck and laryngeal muscles in muscle tension dysphonia (MTD). This review examines the laryngeal palpation methods used in patients with MTD for assessment, diagnosis, or documentation of treatment outcomes. A systematic review of the literature concerning palpatory methods in MTD was conducted using the databases MEDLINE (PubMed), ScienceDirect, Scopus, Web of Science, Web of Knowledge and the Cochrane Library between July and October 2013. Relevant studies were identified by one reviewer based on screened titles/abstracts and full texts. Manual searching was also used to track the source literature. There were five main as well as miscellaneous palpation methods that differed according to target anatomical structures, judgment or grading system, and tasks used. Only a few scales were available, and the majority of the palpatory methods were qualitative. Most of the palpatory methods evaluate tension during both static and dynamic tasks. There was little information about the validity and reliability of the available methods. The literature on the scientific evidence of muscle tension indicators perceived by laryngeal palpation in MTD is scarce. Future studies should be conducted to investigate the validity and reliability of palpation methods. Copyright © 2015 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  16. A reliability design method for a lithium-ion battery pack considering the thermal disequilibrium in electric vehicles

    Science.gov (United States)

    Xia, Quan; Wang, Zili; Ren, Yi; Sun, Bo; Yang, Dezhen; Feng, Qiang

    2018-05-01

    With the rapid development of lithium-ion battery technology in the electric vehicle (EV) industry, the lifetime of the battery cell has increased substantially; however, the reliability of the battery pack is still inadequate. Because of the complexity of the battery pack, a reliability design method for a lithium-ion battery pack considering thermal disequilibrium is proposed in this paper based on cell redundancy. Based on this method, a three-dimensional electric-thermal-flow-coupled model, a stochastic degradation model of cells under field dynamic conditions and a multi-state system reliability model of a battery pack are established. The relationships between the multi-physics coupling model, the degradation model and the system reliability model are first constructed to analyze the reliability of the battery pack, followed by analysis examples with different redundancy strategies. By comparing the reliability of battery packs with different redundant cell numbers and configurations, several conclusions for the redundancy strategy are obtained. Most notably, the reliability does not monotonically increase with the number of redundant cells, because of thermal disequilibrium effects. In this work, the 6 × 5 parallel-series configuration is the optimal system structure. In addition, the effects of the cell arrangement and cooling conditions are investigated.
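The redundancy comparison above can be sketched with a baseline independent-cell k-out-of-n model: a series string works only if every cell works, and the pack delivers rated power if enough parallel strings survive. Note this baseline ignores exactly the thermal coupling between cells that the paper accounts for, and all numbers below are hypothetical:

```python
from math import comb

def string_reliability(cell_rel, cells_in_series):
    """A series string works only if every cell works."""
    return cell_rel ** cells_in_series

def pack_reliability(cell_rel, n_series, n_parallel, k_required):
    """Pack delivers rated power if at least k of the parallel strings work
    (binomial over independent strings)."""
    ps = string_reliability(cell_rel, n_series)
    return sum(comb(n_parallel, m) * ps**m * (1 - ps) ** (n_parallel - m)
               for m in range(k_required, n_parallel + 1))

# Hypothetical comparison: 5 strings of 6 cells vs 6 strings of 5 cells,
# both needing 4 working strings for rated output.
r_5x6 = pack_reliability(0.99, 6, 5, 4)
r_6x5 = pack_reliability(0.99, 5, 6, 4)
print(round(r_5x6, 4), round(r_6x5, 4))
```

Under independence, extra redundant strings always help; the paper's point is that thermal disequilibrium breaks this independence, so reliability can stop improving, or worsen, as cells are added.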

  17. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  18. Reliability and validity in measurement of true humeral retroversion by a three-dimensional cylinder fitting method.

    Science.gov (United States)

    Saka, Masayuki; Yamauchi, Hiroki; Hoshi, Kenji; Yoshioka, Toru; Hamada, Hidetoshi; Gamada, Kazuyoshi

    2015-05-01

    Humeral retroversion is defined as the orientation of the humeral head relative to the distal humerus. Because none of the previous methods used to measure humeral retroversion strictly follow this definition, values obtained by these techniques vary and may be biased by morphologic variations of the humerus. The purpose of this study was twofold: to validate a method to define the axis of the distal humerus with a virtual cylinder and to establish the reliability of 3-dimensional (3D) measurement of humeral retroversion by this cylinder fitting method. Humeral retroversion in 14 baseball players (28 humeri) was measured by the 3D cylinder fitting method. The root mean square error was calculated to compare values obtained by a single tester and by 2 different testers using the embedded coordinate system. To establish the reliability, the intraclass correlation coefficient (ICC) and precision (standard error of measurement [SEM]) were calculated. The root mean square errors for the humeral coordinate system were small. Reliability and precision of the 3D measurement of retroversion yielded an intratester ICC of 0.99 (SEM, 1.0°) and an intertester ICC of 0.96 (SEM, 2.8°). The error in measurements obtained by a distal humerus cylinder fitting method was small enough not to affect retroversion measurement. The 3D measurement of retroversion by this method provides excellent intratester and intertester reliability. Copyright © 2015 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.

  19. Conceptual transitions in methods of skull-photo superimposition that impact the reliability of identification: a review.

    Science.gov (United States)

    Jayaprakash, Paul T

    2015-01-01

    Establishing identification during skull-photo superimposition relies on correlating the salient morphological features of an unidentified skull with those of a face-image of a suspected dead individual using image overlay processes. Technical progression in the process of overlay has included the incorporation of video cameras, image-mixing devices and software that enables real-time vision-mixing. Conceptual transitions occur in the superimposition methods that involve 'life-size' images, that achieve orientation of the skull to the posture of the face in the photograph and that assess the extent of match. A recent report on the reliability of identification using the superimposition method adopted the currently prevalent methods and suggested an increased rate of failures when skulls were compared with related and unrelated face images. The reported reduction in the reliability of the superimposition method prompted a review of the transition in the concepts that are involved in skull-photo superimposition. The prevalent popular methods for visualizing the superimposed images at less than 'life-size', overlaying skull-face images by relying on the cranial and facial landmarks in the frontal plane when orienting the skull for matching and evaluating the match on a morphological basis by relying on mix-mode alone are the major departures in the methodology that may have reduced the identification reliability. The need to reassess the reliability of the method that incorporates the concepts which have been considered appropriate by the practitioners is stressed. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  20. Rapid and Reliable HPLC Method for the Simultaneous Determination of Dihydroxyacetone, Methylglyoxal and 5-Hydroxymethylfurfural in Leptospermum Honeys.

    Directory of Open Access Journals (Sweden)

    Matthew Pappalardo

    A reliable determination of dihydroxyacetone, methylglyoxal and 5-hydroxymethylfurfural is essential to establishing the commercial value and antimicrobial potential of honeys derived from the Leptospermum species endemic to Australia and New Zealand. We report a robust method for quantitation of all three compounds in a single HPLC run. Honey samples (n = 6) derivatized with o-(2,3,4,5,6-pentafluorobenzyl)hydroxylamine were quantitated against a stable anisole internal standard. Linear regression analysis was performed using calibration standards for each compound (n = 6), and the results indicated a high degree of accuracy (R2 = 0.999) for this method. The reliability of some commercial methylglyoxal solutions was found to be questionable. Effective quantitation of methylglyoxal content in honey is critical for researchers and industry, and the use of some commercial standards may bias data. Two accurate methylglyoxal standards are proposed, including a commercial standard and a derivative that can be prepared within the laboratory.
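Quantitation against an internal standard via a linear calibration curve, as in the method above, can be sketched as follows. The concentrations and peak-area ratios below are made up for illustration, not the paper's data:

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical calibration: analyte concentration (mg/kg) vs peak-area
# ratio to the anisole internal standard.
conc  = [50, 100, 200, 400, 800]
ratio = [0.11, 0.21, 0.42, 0.83, 1.65]

slope, intercept = fit_line(conc, ratio)

# Quantify an unknown honey sample from its measured peak-area ratio
unknown_ratio = 0.55
mgo = (unknown_ratio - intercept) / slope
print(round(mgo, 1))
```

Using the ratio to an internal standard, rather than raw peak area, cancels run-to-run injection and detector variability, which is what makes the calibration transferable across samples.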

  1. A critique of reliability prediction techniques for avionics applications

    Directory of Open Access Journals (Sweden)

    Guru Prasad PANDIAN

    2018-01-01

    Avionics (aeronautics and aerospace) industries must rely on components and systems of demonstrated high reliability. For this, handbook-based methods have traditionally been used to design for reliability, develop test plans, and define maintenance requirements and sustainment logistics. However, these methods have been criticized as flawed and as leading to inaccurate and misleading results. In its recent report on enhancing defense system reliability, the U.S. National Academy of Sciences discredited these methods, judging the Military Handbook (MIL-HDBK-217) and its progeny to be invalid and inaccurate. This paper discusses the issues that arise with the use of handbook-based methods in commercial and military avionics applications. Alternative approaches to reliability design (and its demonstration) are also discussed, including similarity analysis, testing, physics-of-failure, and data analytics for prognostics and systems health management.

  2. High reliable and Real-time Data Communication Network Technology for Nuclear Power Plant

    International Nuclear Information System (INIS)

    Jeong, K. I.; Lee, J. K.; Choi, Y. R.; Lee, J. C.; Choi, Y. S.; Cho, J. W.; Hong, S. B.; Jung, J. E.; Koo, I. S.

    2008-03-01

    As advanced digital Instrumentation and Control (I and C) systems are being introduced to replace the analog systems of nuclear power plants (NPPs), the Data Communication Network (DCN) is becoming an important system for transmitting the data generated by I and C systems. In order to apply DCNs to NPP I and C design, DCNs should conform to the applicable acceptance criteria and meet the reliability and safety goals of the system. Because response time is affected by the selected protocol, network topology, network performance, and the network configuration of the I and C system, DCNs should transmit data within the time constraints and response times required by I and C systems. To meet these requirements, the DCNs of NPP I and C should be highly reliable, real-time systems. With respect to such systems, several reports and techniques that influence the reliability and real-time requirements of DCNs are surveyed and analyzed.

  3. Exploratory factor analysis and reliability analysis with missing data: A simple method for SPSS users

    Directory of Open Access Journals (Sweden)

    Bruce Weaver

    2014-09-01

    Missing data is a frequent problem for researchers conducting exploratory factor analysis (EFA) or reliability analysis. The SPSS FACTOR procedure allows users to select listwise deletion, pairwise deletion or mean substitution as a method for dealing with missing data. The shortcomings of these methods are well known. Graham (2009) argues that a much better way to deal with missing data in this context is to use a matrix of expectation maximization (EM) covariances (or correlations) as input for the analysis. SPSS users who have the Missing Values Analysis add-on module can obtain vectors of EM means and standard deviations plus EM correlation and covariance matrices via the MVA procedure. But unfortunately, MVA has no /MATRIX subcommand, and therefore cannot write the EM correlations directly to a matrix dataset of the type needed as input to the FACTOR and RELIABILITY procedures. We describe two macros that (in conjunction with an intervening MVA command) carry out the data management steps needed to create two matrix datasets, one containing EM correlations and the other EM covariances. Either of those matrix datasets can then be used as input to the FACTOR procedure, and the EM correlations can also be used as input to RELIABILITY. We provide an example that illustrates the use of the two macros to generate the matrix datasets and how to use those datasets as input to the FACTOR and RELIABILITY procedures. We hope that this simple method for handling missing data will prove useful to both students and researchers who are conducting EFA or reliability analysis.
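Once an item covariance matrix is available, whether computed from complete data or estimated via EM as described above, internal-consistency reliability reduces to a closed-form statistic such as Cronbach's alpha. A minimal sketch with a hypothetical 3-item covariance matrix (not SPSS output):

```python
def cronbach_alpha(cov):
    """Cronbach's alpha from a k x k item covariance matrix
    (the matrix may come from listwise data or from EM estimation)."""
    k = len(cov)
    item_var = sum(cov[i][i] for i in range(k))     # sum of item variances
    total_var = sum(sum(row) for row in cov)        # variance of the sum score
    return (k / (k - 1)) * (1.0 - item_var / total_var)

# Hypothetical 3-item covariance matrix (e.g. EM-estimated)
cov = [
    [1.00, 0.50, 0.40],
    [0.50, 1.20, 0.60],
    [0.40, 0.60, 0.90],
]
alpha = cronbach_alpha(cov)
print(round(alpha, 3))
```

This is why a covariance (or correlation) matrix is a sufficient input for the RELIABILITY and FACTOR procedures: neither needs the raw cases once the matrix is in hand.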

  4. A Method to Increase Drivers' Trust in Collision Warning Systems Based on Reliability Information of Sensor

    Science.gov (United States)

    Tsutsumi, Shigeyoshi; Wada, Takahiro; Akita, Tokihiko; Doi, Shun'ichi

    A driver's workload tends to increase when driving in complicated traffic environments, such as during a lane change. In such cases, rear collision warning is effective for reducing cognitive workload. On the other hand, false or missing alarms caused by sensor errors decrease the driver's trust in the warning system, which can result in low system efficiency. Suppose that reliability information about the sensor is available in real time. In this paper, we propose a new warning method that uses this sensor reliability information to increase the driver's trust in the system even when sensor reliability is low. The effectiveness of the warning methods is shown by driving simulator experiments.

  5. Safety and reliability analysis based on nonprobabilistic methods

    International Nuclear Information System (INIS)

    Kozin, I.O.; Petersen, K.E.

    1996-01-01

    Imprecise probabilities, developed during the last two decades, offer a considerably more general theory whose many advantages make it very promising for reliability and safety analysis. The objective of the paper is to argue that imprecise probabilities are a more appropriate tool for reliability and safety analysis, that they allow the behavior of nuclear industry objects to be modeled more comprehensively, and that they make it possible to solve some problems that cannot be solved within the conventional approach. Furthermore, some specific examples are given that illustrate the usefulness of the tool for solving reliability tasks.
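
    The interval flavour of this idea can be sketched in a few lines (a minimal illustration with invented component intervals, not the authors' formalism): when each component's reliability is known only as an interval, bounds on the reliability of an independent series system follow directly from interval arithmetic.

```python
# Sketch: interval (imprecise) reliabilities for an independent series system.
# Each component reliability is known only as an interval [lo, hi]; the system
# bounds are the products of the corresponding interval endpoints.
def series_reliability_bounds(intervals):
    lo, hi = 1.0, 1.0
    for a, b in intervals:
        lo *= a
        hi *= b
    return lo, hi

# Three components with interval-valued (hypothetical) reliabilities:
bounds = series_reliability_bounds([(0.95, 0.99), (0.90, 0.97), (0.99, 0.999)])
```

    The width of the resulting interval is itself informative: it shows how much of the uncertainty about system reliability stems from ignorance rather than randomness.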

  6. Validity and reliability of a method for assessment of cervical vertebral maturation.

    Science.gov (United States)

    Zhao, Xiao-Guang; Lin, Jiuxiang; Jiang, Jiu-Hui; Wang, Qingzhu; Ng, Sut Hong

    2012-03-01

    To evaluate the validity and reliability of the cervical vertebral maturation (CVM) method with a longitudinal sample. Eighty-six cephalograms from 18 subjects (5 males and 13 females) were selected from the longitudinal database. Total mandibular length was measured on each film; its rate of increase served as the gold standard in examining the validity of the CVM method. Eleven orthodontists, after receiving intensive training in the CVM method, evaluated all films twice. Kendall's W and the weighted kappa statistic were employed. Kendall's W values were higher than 0.8 at both times, indicating strong interobserver reproducibility, but interobserver agreement was below 50% on both occasions. A wide range of intraobserver agreement was noted (40.7%-79.1%), and substantial intraobserver reproducibility was shown by kappa values (0.53-0.86). With regard to validity, moderate agreement was found between the gold standard and observer staging at the initial time (kappa values 0.44-0.61). However, agreement seemed to be unacceptable for clinical use, especially in cervical stage 3 (26.8%). Even though the validity and reliability of the CVM method proved statistically acceptable, we suggest that other growth indicators also be taken into consideration when evaluating adolescent skeletal maturation.
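
    The weighted kappa statistic used above can be sketched as follows (toy ratings and quadratic weights assumed for illustration; the record does not state the study's weighting scheme):

```python
import numpy as np

# Weighted kappa for two raters assigning ordinal categories 0..n_cat-1.
# Quadratic weights penalise large disagreements more than adjacent-stage ones.
def weighted_kappa(rater1, rater2, n_cat):
    obs = np.zeros((n_cat, n_cat))
    for a, b in zip(rater1, rater2):
        obs[a, b] += 1
    obs /= obs.sum()                                   # observed proportions
    expected = np.outer(obs.sum(axis=1), obs.sum(axis=0))  # chance agreement
    i, j = np.indices((n_cat, n_cat))
    w = ((i - j) / (n_cat - 1)) ** 2                   # quadratic weights
    return 1.0 - (w * obs).sum() / (w * expected).sum()
```

    Perfect agreement gives kappa = 1, chance-level agreement gives 0, and systematic disagreement gives negative values.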

  7. Simple and rapid preparation of [{sup 11}C]DASB with high quality and reliability for routine applications

    Energy Technology Data Exchange (ETDEWEB)

    Haeusler, D.; Mien, L.-K. [Department of Nuclear Medicine, PET, Medical University of Vienna, Waehringer Guertel 18-20, A-1090 Vienna (Austria); Department of Pharmaceutical Technology and Biopharmaceutics, University of Vienna, A-1090 Vienna (Austria); Nics, L. [Department of Nuclear Medicine, PET, Medical University of Vienna, Waehringer Guertel 18-20, A-1090 Vienna (Austria); Department of Nutritional Sciences, University of Vienna, A-1090 Vienna (Austria); Ungersboeck, J. [Department of Nuclear Medicine, PET, Medical University of Vienna, Waehringer Guertel 18-20, A-1090 Vienna (Austria); Department of Inorganic Chemistry, University of Vienna, A-1090 Vienna (Austria); Philippe, C. [Department of Nuclear Medicine, PET, Medical University of Vienna, Waehringer Guertel 18-20, A-1090 Vienna (Austria); Department of Pharmaceutical Technology and Biopharmaceutics, University of Vienna, A-1090 Vienna (Austria); Lanzenberger, R.R. [Department of Psychiatry and Psychotherapy, Medical University of Vienna, A-1090 Vienna (Austria); Kletter, K.; Dudczak, R. [Department of Nuclear Medicine, PET, Medical University of Vienna, Waehringer Guertel 18-20, A-1090 Vienna (Austria); Mitterhauser, M. [Department of Nuclear Medicine, PET, Medical University of Vienna, Waehringer Guertel 18-20, A-1090 Vienna (Austria); Department of Pharmaceutical Technology and Biopharmaceutics, University of Vienna, A-1090 Vienna (Austria); Hospital Pharmacy of the General Hospital of Vienna, A-1090 Vienna (Austria); Wadsak, W. [Department of Nuclear Medicine, PET, Medical University of Vienna, Waehringer Guertel 18-20, A-1090 Vienna (Austria); Department of Inorganic Chemistry, University of Vienna, A-1090 Vienna (Austria)], E-mail: wolfgang.wadsak@meduniwien.ac.at

    2009-09-15

    [{sup 11}C]DASB combines all major prerequisites for a successful SERT ligand, providing excellent biological properties and in-vivo behaviour. Thus, we aimed to establish a fully automated procedure for the synthesis and purification of [{sup 11}C]DASB with a high degree of reliability, reducing the overall synthesis time while conserving high yields and purity. The optimized [{sup 11}C]DASB synthesis was applied in more than 60 preparations with a very low failure rate (3.2%). We obtained yields up to 8.9 GBq (average 5.3{+-}1.6 GBq). Radiochemical yields based on [{sup 11}C]CH{sub 3}I (corrected for decay) were 66.3{+-}6.9%, with a specific radioactivity (A{sub s}) of 86.8{+-}24.3 GBq/{mu}mol (both at the end of synthesis, EOS). Time consumption was kept to a minimum, resulting in 43 min from end of bombardment to release of the product after quality control. From our data, it is evident that the presented method can be implemented for routine preparations of [{sup 11}C]DASB with high reliability.

  8. Reliability studies of a high-power proton accelerator for accelerator-driven system applications for nuclear waste transmutation

    Energy Technology Data Exchange (ETDEWEB)

    Burgazzi, Luciano [ENEA-Centro Ricerche ' Ezio Clementel' , Advanced Physics Technology Division, Via Martiri di Monte Sole, 4, 40129 Bologna (Italy)]. E-mail: burgazzi@bologna.enea.it; Pierini, Paolo [INFN-Sezione di Milano, Laboratorio Acceleratori e Superconduttivita Applicata, Via Fratelli Cervi 201, I-20090 Segrate (MI) (Italy)

    2007-04-15

    The main effort of the present study is to analyze the availability and reliability of a high-performance linac (linear accelerator) conceived for Accelerator-Driven System (ADS) purposes and to suggest recommendations, in order both to meet the high operability goals and to satisfy the safety requirements dictated by the reactor system. A Reliability Block Diagram (RBD) approach has been adopted for system modelling, according to the present level of definition of the design: component failure modes are assessed in terms of Mean Time Between Failures (MTBF) and Mean Time To Repair (MTTR), and reliability and availability figures are derived by applying the current reliability algorithms. The lack of a well-established component database has been pointed out as the main issue in the accelerator reliability assessment. The results, affected by the conservative character of the study, show a large margin for improvement in the predicted accelerator reliability and availability figures. The paper outlines a viable path towards the enhancement of accelerator reliability and availability and delineates the most appropriate strategies. The improvement in the reliability characteristics along this path is shown as well.
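
    The MTBF/MTTR availability algebra underlying an RBD model can be sketched as follows (all figures invented for illustration, not taken from the study):

```python
# Steady-state availability of a repairable block is A = MTBF / (MTBF + MTTR).
# In an RBD, series blocks multiply their availabilities, while a redundant
# 1-out-of-2 pair is unavailable only when both halves are down.
def availability(mtbf_h, mttr_h):
    return mtbf_h / (mtbf_h + mttr_h)

def series(*avail):
    out = 1.0
    for a in avail:
        out *= a
    return out

def redundant_pair(a):
    return 1.0 - (1.0 - a) ** 2

# Hypothetical blocks of a linac RBD (numbers are placeholders):
rf_module = availability(10_000, 8)     # assumed RF module MTBF/MTTR in hours
magnet    = availability(50_000, 24)    # assumed magnet supply MTBF/MTTR
linac     = series(redundant_pair(rf_module), magnet)
```

    The same algebra shows where redundancy pays off: duplicating the weakest block raises system availability far more than improving an already strong one.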

  9. Reliability studies of a high-power proton accelerator for accelerator-driven system applications for nuclear waste transmutation

    International Nuclear Information System (INIS)

    Burgazzi, Luciano; Pierini, Paolo

    2007-01-01

    The main effort of the present study is to analyze the availability and reliability of a high-performance linac (linear accelerator) conceived for Accelerator-Driven System (ADS) purposes and to suggest recommendations, in order both to meet the high operability goals and to satisfy the safety requirements dictated by the reactor system. A Reliability Block Diagram (RBD) approach has been adopted for system modelling, according to the present level of definition of the design: component failure modes are assessed in terms of Mean Time Between Failures (MTBF) and Mean Time To Repair (MTTR), and reliability and availability figures are derived by applying the current reliability algorithms. The lack of a well-established component database has been pointed out as the main issue in the accelerator reliability assessment. The results, affected by the conservative character of the study, show a large margin for improvement in the predicted accelerator reliability and availability figures. The paper outlines a viable path towards the enhancement of accelerator reliability and availability and delineates the most appropriate strategies. The improvement in the reliability characteristics along this path is shown as well.

  10. A novel high reliability CMOS SRAM cell

    Energy Technology Data Exchange (ETDEWEB)

    Xie Chengmin; Wang Zhongfang; Wu Longsheng; Liu Youbao, E-mail: hglnew@sina.com [Computer Research and Design Department, Xi' an Microelectronic Technique Institutes, Xi' an 710054 (China)

    2011-07-15

    A novel 8T single-event-upset (SEU) hardened, high static noise margin (SNM) SRAM cell is proposed. By adding one transistor in parallel with each access transistor, the drive capability of the pull-up PMOS becomes greater than in the conventional cell and the read access transistors become weaker, so the hold SNM, read SNM and critical charge all increase greatly. The simulation results show that the critical charge is almost three times larger than that of the conventional 6T cell when the pull-up transistors are appropriately sized. The hold and read SNM of the new cell increase by 72% and 141.7%, respectively, compared to the 6T design, but at the cost of a 54% area overhead and a read performance penalty. Given these features, this novel cell suits high-reliability applications such as aerospace and military systems. (semiconductor integrated circuits)

  11. A novel high reliability CMOS SRAM cell

    International Nuclear Information System (INIS)

    Xie Chengmin; Wang Zhongfang; Wu Longsheng; Liu Youbao

    2011-01-01

    A novel 8T single-event-upset (SEU) hardened, high static noise margin (SNM) SRAM cell is proposed. By adding one transistor in parallel with each access transistor, the drive capability of the pull-up PMOS becomes greater than in the conventional cell and the read access transistors become weaker, so the hold SNM, read SNM and critical charge all increase greatly. The simulation results show that the critical charge is almost three times larger than that of the conventional 6T cell when the pull-up transistors are appropriately sized. The hold and read SNM of the new cell increase by 72% and 141.7%, respectively, compared to the 6T design, but at the cost of a 54% area overhead and a read performance penalty. Given these features, this novel cell suits high-reliability applications such as aerospace and military systems. (semiconductor integrated circuits)

  12. Design and reliability analysis of high-speed and continuous data recording system based on disk array

    Science.gov (United States)

    Jiang, Changlong; Ma, Cheng; He, Ning; Zhang, Xugang; Wang, Chongyang; Jia, Huibo

    2002-12-01

    In many real-time fields a sustained high-speed data recording system is required. This paper proposes a high-speed, sustained data recording system based on complex RAID 3+0. The system consists of an Array Controller Module (ACM), String Controller Modules (SCMs) and a Main Controller Module (MCM). The ACM, implemented in an FPGA chip, splits the high-speed incoming data stream into several lower-speed streams and synchronously generates one parity code stream; it can also inversely recover the original data stream while reading. The SCMs record the lower-speed streams from the ACM onto SCSI disk drives. In the SCM, dual-page buffering is adopted to implement the speed-matching function and satisfy the need for sustained recording. The MCM monitors the whole system and controls the ACM and SCMs to realize the data striping, reconstruction and recovery functions. A method for determining the system scale is presented. Finally, two new schemes, the Floating Parity Group (FPG) and the full 2D-Parity Group (full 2D-PG), are proposed to improve system reliability and are compared with the Traditional Parity Group (TPG). With its high reliability, this recording system can be used conveniently in many areas of data recording, storage, playback and remote backup.
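
    The parity-stream idea implemented by the ACM can be sketched at byte level (invented framing for illustration; the real module operates on hardware data streams):

```python
# RAID 3 style striping: split a block into n round-robin streams plus one XOR
# parity stream, so any single lost stream can be rebuilt from the survivors.
def stripe(data: bytes, n: int):
    assert len(data) % n == 0, "pad the block to a multiple of n first"
    streams = [bytearray(data[i::n]) for i in range(n)]
    parity = bytearray(len(streams[0]))
    for s in streams:
        for i, b in enumerate(s):
            parity[i] ^= b
    return streams, parity

def rebuild(streams, parity, lost):
    # XOR of the parity with every surviving stream recovers the lost one.
    rec = bytearray(parity)
    for k, s in enumerate(streams):
        if k != lost:
            for i, b in enumerate(s):
                rec[i] ^= b
    return rec
```

    Interleaving the streams back together (byte i of stream 0, then stream 1, ...) reproduces the original block, which is the read-path recovery the ACM performs.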

  13. Reliability Oriented Design of a Grid-Connected Photovoltaic Microinverter

    DEFF Research Database (Denmark)

    Shen, Yanfeng; Wang, Huai; Blaabjerg, Frede

    2017-01-01

    High reliability of microinverters in Photovoltaic (PV) systems is a merit for matching the lifetime of the PV panels and for reducing the required maintenance efforts and costs. This digest applies a reliability oriented design method for a grid-connected PV microinverter to achieve specific...

  14. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Jung, Won Dea; Kang, Dae Il; Kim, Jae Whan

    2005-12-01

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the quality of existing HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Category II level. The standard method was based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria so as to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability.

  15. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kang, Dae Il; Kim, Jae Whan

    2005-12-15

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the quality of existing HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Category II level. The standard method was based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria so as to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability.

  16. High-performance liquid chromatographic method for guanylhydrazone compounds.

    Science.gov (United States)

    Cerami, C; Zhang, X; Ulrich, P; Bianchi, M; Tracey, K J; Berger, B J

    1996-01-12

    A high-performance liquid chromatographic method has been developed for a series of aromatic guanylhydrazones that have demonstrated therapeutic potential as anti-inflammatory agents. The compounds were separated using octadecyl or diisopropyloctyl reversed-phase columns, with an acetonitrile gradient in water containing heptane sulfonate, tetramethylammonium chloride, and phosphoric acid. The method was used to reliably quantify levels of analyte as low as 785 ng/ml, and the detector response was linear to at least 50 micrograms/ml using a 100 microliters injection volume. The assay system was used to determine the basic pharmacokinetics of a lead compound, CNI-1493, from serum concentrations following a single intravenous injection in rats.
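
    The "basic pharmacokinetics" step after a single IV bolus is commonly a mono-exponential fit to the serum concentrations; a sketch with invented numbers (not CNI-1493 data):

```python
import math

# One-compartment IV bolus model C(t) = C0 * exp(-k t), fitted by linear
# least squares on log C; half-life is t_half = ln(2) / k.
def iv_bolus_fit(times_h, concs):
    ys = [math.log(c) for c in concs]
    n = len(times_h)
    tbar = sum(times_h) / n
    ybar = sum(ys) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(times_h, ys))
             / sum((t - tbar) ** 2 for t in times_h))
    k = -slope                          # elimination rate constant, 1/h
    c0 = math.exp(ybar - slope * tbar)  # back-extrapolated C(0)
    return c0, k, math.log(2) / k

# Hypothetical serum concentrations (ug/ml) at 0.5, 1, 2, 4 and 8 h post-dose:
c0, k, t_half = iv_bolus_fit([0.5, 1, 2, 4, 8],
                             [45.2, 40.9, 33.5, 22.4, 10.0])
```

    With real assay data the same fit would use only the terminal log-linear phase; the invented points here follow a single exponential by construction.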

  17. A reliability analysis tool for SpaceWire network

    Science.gov (United States)

    Zhou, Qiang; Zhu, Longjiang; Fei, Haidong; Wang, Xingyou

    2017-04-01

    SpaceWire is a standard for on-board satellite networks and the basis for future data-handling architectures. It is becoming more and more popular in space applications due to its technical advantages, including reliability, low power and fault protection. High reliability is a vital issue for spacecraft, so it is very important to analyze and improve the reliability performance of a SpaceWire network. This paper deals with the problem of reliability modeling and analysis of SpaceWire networks. Following the functional division of a distributed network, a task-based reliability analysis method is proposed: the reliability analysis of each task yields the system reliability matrix, and the reliability of the network system is deduced by integrating all the reliability indexes in that matrix. Using this method, we developed a VC-based reliability analysis tool for SpaceWire networks, in which the computation schemes for the reliability matrix and for multi-path task reliability are also implemented. With this tool, we analyzed several cases on typical architectures, and the analytic results indicate that redundant architectures have better reliability performance than basic ones. In practice, a dual-redundancy scheme has been adopted for some key units to improve the reliability index of the system or task. This reliability analysis tool will thus have a direct influence on both task division and topology selection in the design phase of SpaceWire network systems.

  18. Technology Improvement for the High Reliability LM-2F Launch Vehicle

    Institute of Scientific and Technical Information of China (English)

    QIN Tong; RONG Yi; ZHENG Liwei; ZHANG Zhi

    2017-01-01

    The Long March 2F (LM-2F) launch vehicle, the only launch vehicle designed for manned space flight in China, successfully launched the Tiangong 2 space laboratory and the Shenzhou 11 manned spaceship into orbit in 2016. This study introduces the technological improvements made to enhance the reliability of the LM-2F launch vehicle in the areas of general technology, control system, manufacturing and ground support system. The LM-2F launch vehicle will continue to contribute to the Chinese Space Station Project with its high reliability and 100% success rate.

  19. Method for controlling low-energy high current density electron beams

    International Nuclear Information System (INIS)

    Lee, J.N.; Oswald, R.B. Jr.

    1977-01-01

    A method and an apparatus for controlling the angle of incidence of low-energy, high current density electron beams are disclosed. The apparatus includes a current generating diode arrangement with a mesh anode for producing a drifting electron beam. An auxiliary grounded screen electrode is placed between the anode and a target for controlling the average angle of incidence of electrons in the drifting electron beam. According to the method of the present invention, movement of the auxiliary screen electrode relative to the target and the anode permits reliable and reproducible adjustment of the average angle of incidence of the electrons in low energy, high current density relativistic electron beams

  20. Reliability of Semiautomated Computational Methods for Estimating Tibiofemoral Contact Stress in the Multicenter Osteoarthritis Study

    Directory of Open Access Journals (Sweden)

    Donald D. Anderson

    2012-01-01

    Full Text Available Recent findings suggest that contact stress is a potent predictor of subsequent symptomatic osteoarthritis development in the knee. However, much larger numbers of knees (likely on the order of hundreds, if not thousands) need to be reliably analyzed to achieve the statistical power necessary to clarify this relationship. This study assessed the reliability of new semiautomated computational methods for estimating contact stress in knees from large population-based cohorts. Ten knees of subjects from the Multicenter Osteoarthritis Study were included. Bone surfaces were manually segmented from sequential 1.0 Tesla magnetic resonance imaging slices by three individuals on two nonconsecutive days. Four individuals then registered the resulting bone surfaces to corresponding bone edges on weight-bearing radiographs, using a semiautomated algorithm. Discrete element analysis methods were used to estimate contact stress distributions for each knee. Segmentation and registration reliabilities (day-to-day and interrater) for peak and mean medial and lateral tibiofemoral contact stress were assessed with Shrout-Fleiss intraclass correlation coefficients (ICCs). The segmentation and registration steps of the modeling approach were found to have excellent day-to-day (ICC 0.93–0.99) and good interrater reliability (ICC 0.84–0.97). This approach for estimating compartment-specific tibiofemoral contact stress appears to be sufficiently reliable for use in large population-based cohorts.
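
    The Shrout-Fleiss ICC computation referred to above can be sketched for the two-way random-effects, single-measurement case, ICC(2,1) (toy score matrix, not the study's contact-stress data):

```python
import numpy as np

# ICC(2,1): two-way random-effects ANOVA on an (n targets x k raters) matrix.
# ICC = (MSR - MSE) / (MSR + (k-1)*MSE + k*(MSC - MSE)/n), where MSR, MSC and
# MSE are the target, rater and residual mean squares.
def icc_2_1(scores):
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ssr = k * ((x.mean(axis=1) - grand) ** 2).sum()   # between targets
    ssc = n * ((x.mean(axis=0) - grand) ** 2).sum()   # between raters
    sse = ((x - grand) ** 2).sum() - ssr - ssc        # residual
    msr, msc = ssr / (n - 1), ssc / (k - 1)
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

    A systematic offset between raters lowers ICC(2,1) (unlike the consistency form ICC(3,1)), which is why it is the stricter choice for day-to-day and interrater comparisons.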

  1. Machine Maintenance Scheduling with Reliability Engineering Method and Maintenance Value Stream Mapping

    Science.gov (United States)

    Sembiring, N.; Nasution, A. H.

    2018-02-01

    Corrective maintenance, i.e. replacing or repairing a machine component after the machine breaks down, is common practice in manufacturing companies. It requires the production process to be stopped, and production time decreases while the maintenance team replaces or repairs the damaged component. This paper proposes a preventive maintenance schedule for a critical component of a critical machine in a crude palm oil and kernel company in order to increase maintenance efficiency. Reliability Engineering and Maintenance Value Stream Mapping are used as the method and tool to analyze the reliability of the component and to reduce waste in each process by segregating value-added and non-value-added activities.
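
    A classic reliability-engineering ingredient of such schedules is the age-replacement model; a sketch with assumed Weibull parameters and costs (not the company's data): replace preventively at age T, or correctively on failure, choosing T to minimize expected cost per unit time c(T) = (Cp*R(T) + Cf*(1 - R(T))) / integral_0^T R(u) du.

```python
import math

def reliability(t, beta=2.5, eta=10.0):
    # Weibull survival function; beta > 1 means wear-out (assumed values).
    return math.exp(-((t / eta) ** beta))

def cost_rate(T, cp=1.0, cf=10.0, steps=400):
    # Expected cost per unit time for preventive replacement at age T:
    # cp = preventive cost, cf = (much larger) corrective/failure cost.
    h = T / steps
    cycle = h * sum((reliability(i * h) + reliability((i + 1) * h)) / 2
                    for i in range(steps))     # trapezoidal mean cycle length
    r = reliability(T)
    return (cp * r + cf * (1 - r)) / cycle

# Grid search for the cost-minimizing preventive interval:
best_T = min((0.1 * i for i in range(5, 150)), key=cost_rate)
```

    With a wear-out component (beta > 1) and failures much costlier than planned replacements, the optimum lands well before the characteristic life, which is the quantitative argument for preventive over corrective maintenance.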

  2. Reliable method for fission source convergence of Monte Carlo criticality calculation with Wielandt's method

    International Nuclear Information System (INIS)

    Yamamoto, Toshihiro; Miyoshi, Yoshinori

    2004-01-01

    A new algorithm for implementing Wielandt's method, one of the acceleration techniques for deterministic source iteration methods, in Monte Carlo criticality calculations is developed, and the algorithm can be successfully implemented in the MCNP code. In this algorithm, some of the fission neutrons emitted during the random walk process are tracked within the current cycle, so that the fission source distribution used in the next cycle spreads more widely. Applying this method intensifies the neutron interaction effect even in a loosely coupled array, where conventional Monte Carlo criticality methods have difficulties, and a converged fission source distribution can be obtained with fewer cycles. The computing time spent on one cycle, however, increases because of the tracking of fission neutrons within the current cycle, which eventually increases the total computing time up to convergence. In addition, statistical fluctuations of the fission source distribution within a cycle are worsened by applying Wielandt's method to Monte Carlo criticality calculations. However, since fission source convergence is attained with fewer source iterations, a reliable determination of convergence can easily be made even in a system with slow convergence. This acceleration method is expected to contribute to the prevention of incorrect Monte Carlo criticality calculations. (author)
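
    The eigenvalue mechanics behind Wielandt's method can be illustrated deterministically (a small matrix stands in for the transport operator; no Monte Carlo): with a user-chosen shift ke slightly above the true k, iterating the shifted operator (I - M/ke)^(-1) M maps each eigenvalue k_i to mu_i = k_i*ke/(ke - k_i), which widens the gap between the fundamental mode and higher modes, so fewer source iterations are needed.

```python
import numpy as np

def power_iteration(A, iters):
    # Plain source iteration: repeatedly apply A and renormalize.
    s = np.ones(A.shape[0])
    lam = 0.0
    for _ in range(iters):
        s = A @ s
        lam = np.linalg.norm(s)
        s /= lam
    return lam, s

def wielandt_k(M, ke, iters=30):
    # Shifted operator B = (I - M/ke)^(-1) M shares eigenvectors with M;
    # its dominant eigenvalue mu maps back to k via k = mu*ke/(ke + mu).
    n = M.shape[0]
    B = np.linalg.solve(np.eye(n) - M / ke, M)
    mu, s = power_iteration(B, iters)
    return mu * ke / (ke + mu), s

M = np.array([[4.0, 1.0], [2.0, 3.0]])   # toy operator, eigenvalues 5 and 2
k, source = wielandt_k(M, ke=6.0)
```

    Here the plain dominance ratio 2/5 improves to 3/30 under the shift, mirroring the faster fission-source convergence described above; the price, as in the abstract, is the extra work of applying the shifted operator each cycle.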

  3. Procedures and methods that increase reliability and reproducibility of the transplanted kidney perfusion index

    International Nuclear Information System (INIS)

    Smokvina, A.

    1994-01-01

    At different times following surgery and during various complications, 119 studies were performed on 57 patients; in many patients the studies were repeated several times. Twenty-three studies were performed in as many patients in whom normal function of the transplanted kidney had been established by other diagnostic methods and retrospective analysis. The perfusion index results obtained by the method of Hilson et al. (1978) were compared with those obtained by my own modified method, which for calculating the index also takes into account: the time difference in the appearance of the initial portions of the artery and kidney curves; the positioning of the region of interest over the distal part of the aorta; bolus injection into an arteriovenous shunt of the forearm of small volumes of Tc-99m labelled agents with high specific activity; fast data collection in 0.5 second frames; and a standard for the normalization of numerical data. The reliability of the two methods, tested by a simulated time shift of the arterial curve peaks, shows that the percentage deviation from the mean index value is 2-5 times greater for the unmodified method than for the modified one. The normal value of the perfusion index with the modified method is 91-171. (author)

  4. Reliability of cervical lordosis measurement techniques on long-cassette radiographs.

    Science.gov (United States)

    Janusz, Piotr; Tyrakowski, Marcin; Yu, Hailong; Siemionow, Kris

    2016-11-01

    Lateral radiographs are commonly used to assess cervical sagittal alignment. Three assessment methods have been described and are commonly utilized in clinical practice. These methods are described for perfect lateral cervical radiographs; in everyday practice, however, radiograph quality varies. The aim of this study was to compare the reliability and reproducibility of three cervical lordosis (CL) measurement methods. Forty-four standing lateral radiographs were randomly chosen from a lateral long-cassette radiograph database. CL was measured with the C2-C7 Cobb method (CM), the C2-C7 posterior tangent method (PTM), and the sum of the posterior tangent method for each segment (SPTM). Three independent orthopaedic surgeons measured CL using the three methods on the 44 radiographs, and one researcher used the three methods to measure CL three times at 4-week intervals. Agreement between the methods as well as their intra- and interobserver reliability were tested and quantified by the intraclass correlation coefficient (ICC) and the median error for a single measurement (SEM); an ICC of 0.75 or more reflected excellent agreement/reliability. The results were compared with a repeated-measures ANOVA test (p  0.05). All three methods appeared to be highly reliable. Although high agreement between all measurement methods was shown, we do not recommend using the Cobb method interchangeably with PTM or SPTM within a single study, as this could lead to error, whereas such interchange between the two tangent methods can be considered.
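
    Geometrically, all three techniques reduce to angles between lines through vertebral landmarks; a sketch with invented coordinates (not radiograph data) also shows why the signed segmental tangent angles (SPTM) telescope to the overall posterior tangent angle (PTM):

```python
import math

def tangent_angle_deg(p, q):
    # Inclination of the line through points p and q, in degrees.
    return math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))

def angle_between(line_a, line_b):
    # Unsigned angle between two lines, folded into [0, 90].
    d = abs(tangent_angle_deg(*line_a) - tangent_angle_deg(*line_b)) % 180.0
    return min(d, 180.0 - d)

# Hypothetical posterior-wall tangent lines for C2, an intermediate vertebra
# (say C4) and C7:
c2  = ((0.0,   0.0), (2.0,   1.0))
mid = ((0.0,  -5.0), (2.0,  -5.0))
c7  = ((0.0, -10.0), (2.0, -11.0))

cl_ptm = angle_between(c2, c7)
# SPTM: signed segmental angles telescope to the overall C2-C7 angle.
cl_sptm = ((tangent_angle_deg(*c2) - tangent_angle_deg(*mid))
           + (tangent_angle_deg(*mid) - tangent_angle_deg(*c7)))
```

    The telescoping identity is why PTM and SPTM can agree closely while the Cobb method, which uses endplate rather than posterior-wall lines, measures a systematically different quantity.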

  5. Methods for slow axis beam quality improvement of high power broad area diode lasers

    Science.gov (United States)

    An, Haiyan; Xiong, Yihan; Jiang, Ching-Long J.; Schmidt, Berthold; Treusch, Georg

    2014-03-01

    For high brightness direct diode laser systems, it is of fundamental importance to improve the slow axis beam quality of the incorporated laser diodes regardless of which beam combining technology is applied. To further advance our products in terms of increased brightness at high power levels, we must optimize the slow axis beam quality despite the far field blooming at high current levels. The latter is caused predominantly by the built-in index step in combination with the thermal lens effect. Most of the methods for beam quality improvement reported in publications sacrifice device efficiency and reliable output power. In order to improve the beam quality while maintaining efficiency and reliable output power, we investigated methods of influencing local heat generation to reduce the thermal gradient across the slow axis direction, optimizing the built-in index step, and discriminating against high-order modes. Based on our findings, we have combined different methods in our new device design. Subsequently, the beam parameter product (BPP) of a 10% fill factor bar has improved by approximately 30% at 7 W/emitter without an efficiency penalty. This technology has enabled fiber-coupled high brightness multi-kilowatt direct diode laser systems. In this paper, we elaborate on the methods used and the results achieved.

  6. A fracture mechanics and reliability based method to assess non-destructive testings for pressure vessels

    International Nuclear Information System (INIS)

    Kitagawa, Hideo; Hisada, Toshiaki

    1979-01-01

    No quantitative evaluation has been made of the effects of carrying out preservice and in-service nondestructive tests to secure the soundness, safety and maintainability of pressure vessels, despite the large expense and labor they involve. In particular, the timing and interval of in-service inspections lack a reasonable, quantitative evaluation method. In this paper, these problems are treated with an analysis method developed on the basis of reliability technology and probability theory. The growth of surface cracks in pressure vessels was estimated using the results of previous studies. The effects of nondestructive inspection on defects in pressure vessels were evaluated, and the influence of factors such as plate thickness, stress and inspection accuracy on the effectiveness of inspection was investigated, together with a method for evaluating inspections at unequal intervals. The reliability analysis taking in-service inspection into consideration, the evaluation of in-service inspection and other relevant factors through typical analysis examples, and a review concerning the timing of inspection are described. The method of analyzing the reliability of pressure vessels, considering the growth of defects and preservice and in-service nondestructive tests, was systematized so as to be practically usable. (Kako, I.)

  7. Evaluation of nodal reliability risk in a deregulated power system with photovoltaic power penetration

    DEFF Research Database (Denmark)

    Zhao, Qian; Wang, Peng; Goel, Lalit

    2014-01-01

    Owing to the intermittent characteristic of solar radiation, power system reliability may be affected with high photovoltaic (PV) power penetration. To reduce large variation of PV power, additional system balancing reserve would be needed. In deregulated power systems, deployment of reserves...... and customer reliability requirements are correlated with energy and reserve prices. Therefore a new method should be developed to evaluate the impacts of PV power on customer reliability and system reserve deployment in the new environment. In this study, a method based on the pseudo-sequential Monte Carlo...... simulation technique has been proposed to evaluate the reserve deployment and customers' nodal reliability with high PV power penetration. The proposed method can effectively model the chronological aspects and stochastic characteristics of PV power and system operation with high computation efficiency...

  8. 76 FR 72203 - Voltage Coordination on High Voltage Grids; Notice of Reliability Workshop Agenda

    Science.gov (United States)

    2011-11-22

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. AD12-5-000] Voltage Coordination on High Voltage Grids; Notice of Reliability Workshop Agenda As announced in the Notice of Staff..., from 9 a.m. to 4:30 p.m. to explore the interaction between voltage control, reliability, and economic...

  9. High Reliability Prototype Quadrupole for the Next Linear Collider

    International Nuclear Information System (INIS)

    Spencer, Cherrill M

    2001-01-01

    The Next Linear Collider (NLC) will require over 5600 magnets, each of which must be highly reliable and/or quickly repairable in order that the NLC reach its 85% overall availability goal. A multidiscipline engineering team was assembled at SLAC to develop a more reliable electromagnet design than historically had been achieved at SLAC. This team carried out a Failure Mode and Effects Analysis (FMEA) on a standard SLAC quadrupole magnet system. They overcame a number of longstanding design prejudices, producing 10 major design changes. This paper describes how a prototype magnet was constructed and the extensive testing carried out on it to prove full functionality with an improvement in reliability. The magnet's fabrication cost will be compared to the cost of a magnet with the same requirements made in the historic SLAC way. The NLC will use over 1600 of these 12.7 mm bore quadrupoles with a range of integrated strengths from 0.6 to 132 Tesla, a maximum gradient of 135 Tesla per meter, an adjustment range of 0 to -20% and core lengths from 324 mm to 972 mm. The magnetic center must remain stable to within 1 micron during the 20% adjustment. A magnetic measurement set-up has been developed that can measure sub-micron shifts of a magnetic center. The prototype satisfied the center shift requirement over the full range of integrated strengths

  10. Lifetime validation of high-reliability (>30,000hr) rotary cryocoolers for specific customer profiles

    Science.gov (United States)

    Cauquil, Jean-Marc; Seguineau, Cédric; Vasse, Christophe; Raynal, Gaetan; Benschop, Tonny

    2018-05-01

    Cooler reliability is a major performance requirement for customers, especially for 24h/24h applications, which are a growing market. Thales has built a reliability policy based on accelerated ageing and tests to establish robust knowledge of acceleration factors. The current trend seems to prove that the RM2 mean time to failure is now higher than 30,000 hr. Even with accelerated ageing, reliability growth becomes hard to manage for such large figures. The paper focuses on these figures and comments on the robustness of such a method when projections over 30,000 hr of MTTF are needed.

  11. Application of fuzzy-MOORA method: Ranking of components for reliability estimation of component-based software systems

    Directory of Open Access Journals (Sweden)

    Zeeshan Ali Siddiqui

    2016-01-01

    Full Text Available Component-based software system (CBSS) development is an emerging discipline that promises to take software development into a new era. As hardware systems are presently constructed from kits of parts, software systems may also be assembled from components. It is more reliable to reuse software than to create it. It is the glue code and the reliability of the individual components that contribute to the reliability of the overall system. Every component contributes to overall system reliability according to the number of times it is used; some components are of critical usage, known as the usage frequency of a component. The usage frequency decides the weight of each component, and according to their weights the components contribute to the overall reliability of the system. Therefore, a ranking of components may be obtained by analyzing their reliability impacts on the overall application. In this paper, we propose the application of fuzzy multi-objective optimization on the basis of ratio analysis, Fuzzy-MOORA. The method helps find the most suitable alternative (software component) from a set of available feasible alternatives. It is an accurate and easy-to-understand tool for solving multi-criteria decision making problems that have imprecise and vague evaluation data. By the use of ratio analysis, the proposed method determines the most suitable alternative among all possible alternatives, and dimensionless measurement accomplishes the ranking of components for estimating CBSS reliability in a non-subjective way. Finally, three case studies are shown to illustrate the use of the proposed technique.
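The ratio-analysis core that MOORA builds on can be sketched in a few lines. This is the crisp (non-fuzzy) variant only, with an invented decision matrix: each column is vector-normalized, benefit criteria are added and cost criteria subtracted, and alternatives are ranked by the resulting score:

```python
import math

def moora_rank(matrix, benefit):
    """Rank alternatives by the (crisp) MOORA ratio system.

    matrix: rows = alternatives (components), columns = criteria.
    benefit: True for benefit criteria, False for cost criteria.
    """
    m = len(matrix[0])
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(m)]
    scores = [sum((row[j] / norms[j]) * (1.0 if benefit[j] else -1.0)
                  for j in range(m))
              for row in matrix]
    order = sorted(range(len(matrix)), key=lambda i: scores[i], reverse=True)
    return order, scores

# Three hypothetical components scored on usage frequency (benefit)
# and failure rate (cost); all numbers are made up.
order, scores = moora_rank([[30.0, 2.0], [10.0, 1.0], [20.0, 4.0]],
                           benefit=[True, False])
```

In the fuzzy version described in the paper, the crisp entries would be replaced by fuzzy numbers and defuzzified before this ranking step.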

  12. Reference satellite selection method for GNSS high-precision relative positioning

    Directory of Open Access Journals (Sweden)

    Xiao Gao

    2017-03-01

    Full Text Available Selecting the optimal reference satellite is an important component of high-precision relative positioning because the reference satellite directly influences the strength of the normal equation. Reference satellite selection methods based on elevation and on the positional dilution of precision (PDOP) value were compared. Results show that neither method can always select the optimal reference satellite. We therefore introduce the condition number of the design matrix into the reference satellite selection method to improve the structure of the normal equation, because the condition number can indicate the ill-conditioning of the normal equation. The experimental results show that the new method can improve positioning accuracy and reliability in precise relative positioning.
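The selection criterion rests on comparing condition numbers of candidate design matrices and preferring the smallest. As a minimal sketch (not the paper's GNSS geometry), the spectral condition number of a two-column design matrix can be computed from the closed-form eigenvalues of its 2x2 normal matrix; both example matrices below are invented:

```python
import math

def cond2_two_col(A):
    """Spectral condition number of an n x 2 design matrix.

    Uses the closed-form eigenvalues of the 2x2 normal matrix N = A^T A;
    cond2(A) = sqrt(lambda_max / lambda_min).
    """
    n11 = sum(r[0] * r[0] for r in A)
    n12 = sum(r[0] * r[1] for r in A)
    n22 = sum(r[1] * r[1] for r in A)
    tr, det = n11 + n22, n11 * n22 - n12 * n12
    disc = math.sqrt(max(tr * tr - 4.0 * det, 0.0))
    lam_max, lam_min = (tr + disc) / 2.0, (tr - disc) / 2.0
    return math.sqrt(lam_max / lam_min)

# Two candidate reference-satellite choices lead to different design
# matrices; the one with the smaller condition number gives a
# better-structured normal equation.
well_conditioned = cond2_two_col([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
ill_conditioned = cond2_two_col([[1.0, 1.0], [1.0, 1.01], [1.0, 0.99]])
```

A nearly collinear pair of columns, as in the second matrix, drives the condition number up by orders of magnitude, which is exactly the ill-conditioning the selection method tries to avoid.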

  13. The Evaluation Method of the Lightning Strike on Transmission Lines Aiming at Power Grid Reliability

    Science.gov (United States)

    Wen, Jianfeng; Wu, Jianwei; Huang, Liandong; Geng, Yinan; Yu, zhanqing

    2018-01-01

    Lightning protection of power systems focuses on reducing the flashover rate, distinguishing lines only by voltage level, without considering the functional differences between transmission lines or analyzing the effect on the reliability of the power grid. As a result, the lightning protection design of general transmission lines is excessive, while that of key lines is insufficient. In order to solve this problem, an analysis method of lightning strikes on transmission lines aimed at power grid reliability is given. Full wave process theory is used to analyze lightning back striking, and the leader propagation model is used to describe the shielding failure process of transmission lines. An index of power grid reliability is introduced, and the effect of transmission line faults on the reliability of the power system is discussed in detail.

  14. Separation and determination of high-carbon alcohols using method of column chromatographic and gas-chromatographic analysis

    International Nuclear Information System (INIS)

    Kang Zhongrong; Li Biping; Zeng Yongchang

    1988-01-01

    This paper describes the separation and determination of high-carbon alcohols from an amine extractant by using column chromatography on aluminium oxide and gas-chromatographic analysis. The total content of high-carbon alcohols is determined by the column chromatographic method, while the components of the high-carbon alcohols and their relative contents are determined by gas chromatography. A simple, reliable and practical method is thus provided for the analysis of high-carbon alcohols from the amine extractant.

  15. A Method for Improving Reliability of Radiation Detection using Deep Learning Framework

    International Nuclear Information System (INIS)

    Chang, Hojong; Kim, Tae-Ho; Han, Byunghun; Kim, Hyunduk; Kim, Ki-duk

    2017-01-01

    Radiation detection is an essential technology for the overall field of radiation and nuclear engineering. Previous radiation detection technology relied on preparing, in advance, a table mapping input spectra to output spectra, which requires simulating numerous predicted output spectra with parameters modeling the spectrum. In this paper, we propose a new technique to improve the performance of radiation detectors, whose software has stagnated for a while and carries the possible intrinsic errors of simulation. In the proposed method, the input source is predicted from the output spectrum measured by the radiation detector using a deep neural network. With a highly complex model, we expect that the complex pattern between the data and the label can be captured well. Furthermore, a radiation detector should be calibrated regularly and beforehand, and we propose a method to calibrate it using a GAN (generative adversarial network). We hope that the power of deep learning may also reach radiation detectors and bring great improvement to the field. With an improved radiation detector, the reliability of detection would be assured, and many tasks remain to be solved using deep learning in the nuclear engineering community.

  16. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  17. Efficient reliability analysis of structures with the rotational quasi-symmetric point- and the maximum entropy methods

    Science.gov (United States)

    Xu, Jun; Dang, Chao; Kong, Fan

    2017-10-01

    This paper presents a new method for efficient structural reliability analysis. In this method, a rotational quasi-symmetric point method (RQ-SPM) is proposed for evaluating the fractional moments of the performance function. Then, the derivation of the performance function's probability density function (PDF) is carried out based on the maximum entropy method in which constraints are specified in terms of fractional moments. In this regard, the probability of failure can be obtained by a simple integral over the performance function's PDF. Six examples, including a finite element-based reliability analysis and a dynamic system with strong nonlinearity, are used to illustrate the efficacy of the proposed method. All the computed results are compared with those by Monte Carlo simulation (MCS). It is found that the proposed method can provide very accurate results with low computational effort.
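The maximum entropy step above fits a density whose moments match estimated constraints and then integrates it for the failure probability. The paper uses fractional moments; as a much-simplified sketch, a single mean constraint on a bounded support already illustrates the idea, because the max-entropy density is then a truncated exponential whose Lagrange multiplier can be found robustly by bisection (support length and target mean below are arbitrary choices):

```python
import math

def maxent_lambda(target_mean, L=10.0):
    """Lagrange multiplier of the max-entropy density p(x) ~ exp(-lam*x)
    on [0, L] matching a single mean constraint, found by bisection on
    the monotonically decreasing mean-versus-lam relation."""
    def mean_of(lam):
        if abs(lam) < 1e-12:
            return L / 2.0          # lam -> 0 gives the uniform density
        return 1.0 / lam - L / (math.exp(lam * L) - 1.0)
    lo, hi = 1e-6, 50.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_of(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def tail_probability(lam, x, L=10.0):
    """P(X > x) under the fitted truncated-exponential density,
    i.e. the 'failure probability' if failure means exceeding x."""
    return (math.exp(-lam * x) - math.exp(-lam * L)) / (1.0 - math.exp(-lam * L))
```

With a target mean of 1 on [0, 10] the recovered multiplier is close to 1, and the fitted tail probability at x = 3 is close to exp(-3), as the exact exponential density would give. The method in the paper generalizes this to several fractional-moment constraints solved simultaneously.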

  18. INNOVATIVE METHODS TO EVALUATE THE RELIABILITY OF INFORMATION CONSOLIDATED FINANCIAL STATEMENTS

    Directory of Open Access Journals (Sweden)

    Irina P. Kurochkina

    2014-01-01

    Full Text Available The article explores the possibility of using foreign innovative methods to assess the reliability of information in the consolidated financial statements of Russian companies. Recommendations are made on their adaptation and application in commercial organizations. Beneish model indicators are implemented in one of the world's largest vertically integrated steel and mining companies. Audit firms are proposed to use these methods of assessing the reliability of information in the practical application of ISA.

  19. Reliability Analysis of a Composite Wind Turbine Blade Section Using the Model Correction Factor Method: Numerical Study and Validation

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov; Friis-Hansen, Peter; Berggreen, Christian

    2013-01-01

    by the composite failure criteria. Each failure mode has been considered in a separate component reliability analysis, followed by a system analysis which gives the total probability of failure of the structure. The Model Correction Factor method used in connection with FORM (First-Order Reliability Method) proved...

  20. Reliability of an experimental method to analyse the impact point on a golf ball during putting.

    Science.gov (United States)

    Richardson, Ashley K; Mitchell, Andrew C S; Hughes, Gerwyn

    2015-06-01

    This study aimed to examine the reliability of an experimental method identifying the location of the impact point on a golf ball during putting. Forty trials were completed using a mechanical putting robot set to reproduce a putt of 3.2 m, with four different putter-ball combinations. After locating the centre of the dimple pattern (centroid), the following variables were tested: distance of the impact point from the centroid, angle of the impact point from the centroid, and distance of the impact point from the centroid derived from the X, Y coordinates. Good to excellent reliability was demonstrated in all impact variables, as reflected in very strong relative (ICC = 0.98-1.00) and absolute reliability (SEM% = 0.9-4.3%). The highest SEM% observed was 7% for the angle of the impact point from the centroid. In conclusion, the experimental method was shown to be reliable at locating the centroid of a golf ball, therefore allowing for the identification of the point of impact with the putter head, and is suitable for use in subsequent studies.
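The two derived variables, distance and angle of the impact point from the centroid, are simple functions of the pixel coordinates once an image scale is known. A minimal sketch, with invented coordinates and an assumed mm-per-pixel scale (the study's own calibration procedure is not reproduced here):

```python
import math

def impact_metrics(centroid, impact, scale_mm_per_px):
    """Distance (mm) and angle (degrees) of the impact point from the
    ball's dimple-pattern centroid, both given in pixel coordinates."""
    dx = (impact[0] - centroid[0]) * scale_mm_per_px
    dy = (impact[1] - centroid[1]) * scale_mm_per_px
    return math.hypot(dx, dy), math.degrees(math.atan2(dy, dx))

# Invented example: impact 30 px right and 40 px above the centroid at
# 0.1 mm/px gives a 5 mm offset at an angle of about 53.13 degrees.
dist, angle = impact_metrics((100.0, 100.0), (130.0, 140.0), 0.1)
```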

  1. Cost benefit justification of nuclear plant reliability improvement

    International Nuclear Information System (INIS)

    El-Sayed, M.A.H.; Abdelmonem, N.M.

    1986-01-01

    Nuclear power costs are evaluated on the basis of general ground rules that (a) vary from time to time, (b) vary from one country to another, and (c) even vary from one reactor type to another. The main objective of an electric utility is to provide electric energy to the different consumers at the lowest possible cost with a reasonable reliability level. The rapid increase of construction costs and fuel prices in recent years has stimulated a great deal of interest in improving the reliability and productivity of new and existing power plants. One of the most important areas is the improvement of the secondary steam loop and the reactor cooling system. The method for evaluating the reliability of the steam loop and cooling system utilizes the cut-set technique. The developed method can easily be used to show to what extent the overall reliability of the nuclear plant is affected by possible failures in the steam and cooling subsystems. Cost-reliability trade-off analysis is used to evaluate alternative schemes in the design with a view towards meeting a high reliability goal. Based on historical or estimated failure and repair rates, the reliability of an alternate scheme can be calculated.
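Once minimal cut sets are known, a common first-order (rare-event) approximation sums the products of component unavailabilities over the cut sets. A minimal sketch with an invented system, not the paper's steam-loop model:

```python
def system_unavailability(cut_sets, q):
    """First-order (rare-event) approximation of system unavailability:
    sum over minimal cut sets of the product of component
    unavailabilities in each cut set."""
    total = 0.0
    for cs in cut_sets:
        prod = 1.0
        for comp in cs:
            prod *= q[comp]
        total += prod
    return total

# Hypothetical example: a feedwater valve V in series with two redundant
# pumps A and B gives the minimal cut sets {V} and {A, B}.
u = system_unavailability([["V"], ["A", "B"]],
                          {"V": 1e-3, "A": 5e-2, "B": 5e-2})
```

Here the single-component cut set dominates (1e-3 versus 2.5e-3 for the pump pair), which is the kind of insight a cost-reliability trade-off study uses to decide where added redundancy pays off.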

  2. The reliability assessment of the electromagnetic valve of high-speed electric multiple units braking system based on two-parameter exponential distribution

    Directory of Open Access Journals (Sweden)

    Jianwei Yang

    2016-06-01

    Full Text Available In order to solve the reliability assessment of a braking system component of high-speed electric multiple units, this article, based on the two-parameter exponential distribution, provides the maximum likelihood estimation and Bayes estimation under a type-I life test. First of all, we evaluate the failure probability value according to the classical estimation method and then obtain the maximum likelihood estimation of the parameters of the two-parameter exponential distribution by using the modified likelihood function. On the other hand, based on Bayesian theory, this article also selects the beta and gamma distributions as the prior distributions, combines them with the modified maximum likelihood function, and innovatively applies a Markov chain Monte Carlo algorithm to parameter assessment based on the Bayes estimation method for the two-parameter exponential distribution, so that two reliability mathematical models of the electromagnetic valve are obtained. Finally, through the type-I life test, the failure rates according to the maximum likelihood estimation and the Bayes estimation method based on the Markov chain Monte Carlo algorithm are, respectively, 2.650 × 10−5 and 3.037 × 10−5. Compared with the known failure rate of the electromagnetic valve, 3.005 × 10−5, this proves that the Bayes method can use a Markov chain Monte Carlo algorithm to estimate the reliability of the two-parameter exponential distribution, and the Bayes estimate is closer to the true value for the electromagnetic valve. Thus, by fully integrating multi-source information, the Bayes estimation method can better modify and precisely estimate the parameters, which can provide a theoretical basis for the safe operation of high-speed electric multiple units.
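The maximum likelihood half of the approach can be sketched compactly. For the two-parameter exponential under type-I censoring at time tau, the standard MLE takes the smallest observed failure time as the location estimate and the total time on test divided by the number of failures as the scale estimate. This is a textbook form, not the paper's modified likelihood, and the synthetic parameters below are invented (the Bayes/MCMC half is omitted):

```python
import random

def two_param_exp_mle(failures, n_total, tau):
    """MLE for the two-parameter exponential under type-I censoring at tau.

    failures: observed failure times (<= tau) out of n_total units on test.
    Returns (location gamma_hat, scale theta_hat); failure rate = 1/theta_hat.
    """
    r = len(failures)
    gamma_hat = min(failures)
    total_time = (sum(t - gamma_hat for t in failures)
                  + (n_total - r) * (tau - gamma_hat))
    return gamma_hat, total_time / r

# Synthetic check with invented parameters: location 2.0, scale 10.0,
# 200 units on test, censored at tau = 15.0.
random.seed(42)
times = [2.0 + random.expovariate(1.0 / 10.0) for _ in range(200)]
obs = [t for t in times if t <= 15.0]
gamma_hat, theta_hat = two_param_exp_mle(obs, 200, 15.0)
```

With a few hundred units the estimates recover the generating parameters closely; the article's Bayes estimator then refines such point estimates by combining the likelihood with beta/gamma priors via MCMC.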

  3. Efficient Estimation of Extreme Non-linear Roll Motions using the First-order Reliability Method (FORM)

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher

    2007-01-01

    In on-board decision support systems efficient procedures are needed for real-time estimation of the maximum ship responses to be expected within the next few hours, given on-line information on the sea state and user defined ranges of possible headings and speeds. For linear responses standard...... frequency domain methods can be applied. To non-linear responses like the roll motion, standard methods like direct time domain simulations are not feasible due to the required computational time. However, the statistical distribution of non-linear ship responses can be estimated very accurately using...... the first-order reliability method (FORM), well-known from structural reliability problems. To illustrate the proposed procedure, the roll motion is modelled by a simplified non-linear procedure taking into account non-linear hydrodynamic damping, time-varying restoring and wave excitation moments...
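The FORM machinery referred to above can be shown on a toy problem. The sketch below implements the classic Hasofer-Lind/Rackwitz-Fiessler iteration in standard normal space and applies it to an invented linear limit state (nothing here models the roll-motion problem itself):

```python
import math

def form_hlrf(g, grad_g, n_vars, iters=100, tol=1e-10):
    """Hasofer-Lind/Rackwitz-Fiessler iteration in standard normal space.

    Returns the reliability index beta = |u*| (distance from the origin
    to the design point) and the FORM estimate Pf ~ Phi(-beta).
    """
    u = [0.0] * n_vars
    for _ in range(iters):
        gv, gr = g(u), grad_g(u)
        norm2 = sum(c * c for c in gr)
        dot = sum(c * ui for c, ui in zip(gr, u))
        u_new = [((dot - gv) / norm2) * c for c in gr]
        if max(abs(a - b) for a, b in zip(u_new, u)) < tol:
            u = u_new
            break
        u = u_new
    beta = math.sqrt(sum(ui * ui for ui in u))
    pf = 0.5 * (1.0 - math.erf(beta / math.sqrt(2.0)))   # Phi(-beta)
    return beta, pf

# Toy linear limit state g(u) = 3 - u1 - u2 in standard normal space;
# the exact answer is beta = 3 / sqrt(2).
beta, pf = form_hlrf(lambda u: 3.0 - u[0] - u[1],
                     lambda u: [-1.0, -1.0], 2)
```

For a linear limit state the iteration converges in one step; for the non-linear roll-motion response the same loop runs until the design point stabilizes, which is what makes FORM so much cheaper than time-domain simulation.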

  4. A Reliable Method to Measure Lip Height Using Photogrammetry in Unilateral Cleft Lip Patients.

    Science.gov (United States)

    van der Zeeuw, Frederique; Murabit, Amera; Volcano, Johnny; Torensma, Bart; Patel, Brijesh; Hay, Norman; Thorburn, Guy; Morris, Paul; Sommerlad, Brian; Gnarra, Maria; van der Horst, Chantal; Kangesu, Loshan

    2015-09-01

    There is still no reliable tool to determine the outcome of the repaired unilateral cleft lip (UCL). The aim of this study was therefore to develop an accurate, reliable tool to measure vertical lip height from photographs. The authors measured the vertical height of the cutaneous and vermilion parts of the lip in 72 anterior-posterior view photographs of 17 patients with repairs to a UCL. Points on the lip's white roll and vermillion were marked on both the cleft and the noncleft sides in each image. Two new concepts were tested. First, photographs were standardized using the horizontal (medial to lateral) eye fissure width (EFW) for calibration. Second, the authors tested the interpupillary line (IPL) and the alar base line (ABL) for their reliability as horizontal lines of reference. Measurements were taken by 2 independent researchers, at 2 different time points each. Overall, 2304 data points were obtained and analyzed. Results showed that the method was very effective in comparing the height of the lip on the cleft side with that on the noncleft side. When using the IPL, inter- and intra-rater reliability was 0.99 to 1.0; with the ABL it varied from 0.91 to 0.99, with one exception at 0.84. The IPL was easier to define, because in some subjects the overhanging nasal tip obscured the alar base, and it gave more consistent measurements, possibly because the reconstructed alar base was sometimes indistinct. However, measurements from the IPL can only give the percentage difference between the left and right sides of the lip, whereas those from the ABL can also give exact measurements. Patient examples are given that show how the measurements correlate with clinical assessment. The authors propose this method of photogrammetry, with the innovative use of the IPL as a reliable horizontal plane and the EFW for calibration, as a useful and reliable tool to assess the outcome of UCL repair.

  5. Reliability high cycle fatigue design of gas turbine blading system using probabilistic goodman diagram

    Energy Technology Data Exchange (ETDEWEB)

    Herman Shen, M.-H. [Ohio State Univ., Columbus, OH (United States). Dept. of Aerospace Engineering and Aviation; Nicholas, T. [MLLN, Wright-Patterson AFB, OH (United States). Air Force Research Lab.

    2001-07-01

    A framework for the probabilistic analysis of high cycle fatigue is developed. The framework will be useful to the U.S. Air Force and aeroengine manufacturers in the design against high cycle fatigue of disk or compressor components fabricated from Ti-6Al-4V under the range of loading conditions that might be encountered during service. The main idea of the framework is to characterize vibratory stresses arising from random input variables due to uncertainties such as crack location, loading, material properties, and manufacturing variability. The characteristics of such vibratory stresses are portrayed graphically as histograms, or probability density functions (PDF). The probability that the random variables exceed the material capability is obtained from a failure function g(X), defined as the difference between the material capability given by the Goodman line or surface and the vibratory stress, such that the probability of HCF failure is P{sub f} = P(g(X) < 0). Design can then be based on a go/no-go criterion with an assumed risk. The framework can be used to facilitate the development of design tools for the prediction of inspection schedules and reliability in aeroengine components. Such tools could lead ultimately to improved life extension schemes in aging aircraft, and more reliable methods for the design and inspection of critical components. (orig.)
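The failure-function idea, Pf = P(g(X) < 0) with g the margin between the Goodman-line allowable and the vibratory stress, can be estimated by plain Monte Carlo. All distributions and numbers below are illustrative assumptions, not Ti-6Al-4V data from the paper:

```python
import random

def goodman_hcf_pf(trials=50000, seed=7):
    """Monte Carlo estimate of P(g(X) < 0) with the Goodman-line margin
    g = sigma_e * (1 - sigma_m / sigma_u) - sigma_a.

    All distributions and parameter values are invented for illustration.
    """
    random.seed(seed)
    sigma_u = 950.0                                   # ultimate strength, MPa
    fails = 0
    for _ in range(trials):
        sigma_e = random.gauss(500.0, 40.0)           # endurance limit, MPa
        sigma_m = random.gauss(300.0, 30.0)           # mean stress, MPa
        sigma_a = random.lognormvariate(5.4, 0.25)    # vibratory amplitude, MPa
        if sigma_e * (1.0 - sigma_m / sigma_u) - sigma_a < 0.0:
            fails += 1
    return fails / trials
```

A go/no-go design decision then compares the estimated Pf against the assumed acceptable risk.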

  6. A two-step method for fast and reliable EUV mask metrology

    Science.gov (United States)

    Helfenstein, Patrick; Mochi, Iacopo; Rajendran, Rajeev; Yoshitake, Shusuke; Ekinci, Yasin

    2017-03-01

    One of the major obstacles towards the implementation of extreme ultraviolet lithography for upcoming technology nodes in the semiconductor industry remains the realization of fast and reliable detection methods for patterned mask defects. We are developing a reflective EUV mask-scanning lensless imaging tool (RESCAN), installed at the Swiss Light Source synchrotron at the Paul Scherrer Institut. Our system is based on a two-step defect inspection method. In the first step, a low-resolution defect map is generated by die-to-die comparison of the diffraction patterns from areas with programmed defects to those from areas that are known to be defect-free on our test sample. At a later stage, a die-to-database comparison will be implemented, in which the measured diffraction patterns will be compared to those calculated directly from the mask layout. This Scattering Scanning Contrast Microscopy technique operates purely in the Fourier domain without the need to obtain the aerial image and, given a sufficient signal-to-noise ratio, defects are found in a fast and reliable way, albeit with a location accuracy limited by the spot size of the incident illumination. Having thus identified rough locations for the defects, a fine scan is carried out in the vicinity of these locations. Since our source delivers coherent illumination, we can use an iterative phase-retrieval method to reconstruct the aerial image of the scanned area with, in principle, diffraction-limited resolution without the need for an objective lens. Here, we focus on the aerial image reconstruction technique and give a few examples to illustrate the capability of the method.

  7. A high reliability module with thermoelectric device by molding technology for M2M wireless sensor network

    International Nuclear Information System (INIS)

    Nakagawa, K; Tanaka, T; Suzuki, T

    2015-01-01

    This paper presents the fabrication of a new energy harvesting module that uses a thermoelectric device (TED), employing molding technology. Through molding technology, the TED and circuit board can be properly protected and a heat-radiating fin structure can be constructed at the same time. The output voltage per unit heater temperature of the TED module at 20 °C ambient temperature is 8 mV/K, similar to the result with an aluminum heat sink of almost the same fin size as the TED module. Accelerated environmental tests were performed: a damp heat test, which is an ageing test under high temperature and high humidity; a highly accelerated temperature and humidity stress test (HAST) for the purpose of evaluating electrical reliability in harsh environments; a cold test; and a thermal cycle test to evaluate degradation characteristics when cycling between two temperatures. All test results indicate that the TED and circuit board can be properly protected from harsh temperature and humidity by using molding technology, because the output voltage of the tested modules is reduced by less than 5%. This study presents a novel fabrication method for a high reliability TED-installed module appropriate for Machine-to-Machine wireless sensor networks. (paper)

  8. Inverse Reliability Task: Artificial Neural Networks and Reliability-Based Optimization Approaches

    OpenAIRE

    Lehký, David; Slowik, Ondřej; Novák, Drahomír

    2014-01-01

    Part 7: Genetic Algorithms; International audience; The paper presents two alternative approaches to solve inverse reliability task – to determine the design parameters to achieve desired target reliabilities. The first approach is based on utilization of artificial neural networks and small-sample simulation Latin hypercube sampling. The second approach considers inverse reliability task as reliability-based optimization task using double-loop method and also small-sample simulation. Efficie...

  9. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy. Maintenance of software for long term operating behavior. (HP) [de

  10. Reliability design of a critical facility: An application of PRA methods

    International Nuclear Information System (INIS)

    Souza Vieira Neto, A.; Souza Borges, W. de

    1987-01-01

    Although general agreement concerning the enforcement of reliability-based (probabilistic) design criteria for nuclear utilities is yet to be achieved, PRA methodology can still be used successfully as a project design and review tool, aimed at improving a system's prospective performance or minimizing expected accident consequences. In this paper, the potential of such an application of PRA methods is examined in the special case of a critical facility design project currently being developed in Brazil. (orig.)

  11. Waste package reliability analysis

    International Nuclear Information System (INIS)

    Pescatore, C.; Sastre, C.

    1983-01-01

    Proof of future performance of a complex system such as a high-level nuclear waste package over a period of hundreds to thousands of years cannot be had in the ordinary sense of the word. The general method of probabilistic reliability analysis could provide an acceptable framework to identify, organize, and convey the information necessary to satisfy the criterion of reasonable assurance of waste package performance according to the regulatory requirements set forth in 10 CFR 60. General principles which may be used to evaluate the qualitative and quantitative reliability of a waste package design are indicated and illustrated with a sample calculation of a repository concept in basalt. 8 references, 1 table

  12. Improvement of human reliability analysis method for PRA

    International Nuclear Information System (INIS)

    Tanji, Junichi; Fujimoto, Haruo

    2013-09-01

    It is required to refine human reliability analysis (HRA) methods by, for example, incorporating consideration of the operator's cognitive process into the evaluation of diagnosis and decision-making errors, as part of the development and improvement of methods used in probabilistic risk assessments (PRAs). JNES has developed an HRA method based on ATHEANA which is suitable for handling the structured relationship among diagnosis errors, decision-making errors and the operator cognition process. This report summarizes outcomes obtained from the improvement of the HRA method, in which enhancements were made to evaluate how the degraded plant condition affects the operator cognitive process and to evaluate human error probabilities (HEPs) corresponding to the contents of operator tasks. In addition, this report describes the results of case studies on representative accident sequences to investigate the applicability of the HRA method developed. HEPs of the same accident sequences are also estimated using the THERP method, which is the most widely used HRA method, and comparisons of the results obtained using these two methods are made to depict the differences between the methods and the issues to be solved. Important conclusions obtained are as follows: (1) Improvement of the HRA method using an operator cognitive action model. Clarification of the factors to be considered in the evaluation of human errors, incorporation of degraded plant safety conditions into HRA, and investigation of HEPs affected by the contents of operator tasks were made to improve the HRA method, which can integrate an operator cognitive action model into the ATHEANA method. In addition, the detailed procedures of the improved method were delineated in the form of a flowchart. (2) Case studies and comparison with the results evaluated by the THERP method. Four operator actions modeled in the PRAs of representative BWR5 and 4-loop PWR plants were selected and evaluated as case studies.
    These cases were also evaluated using

  13. Direct Calculation of Permeability by High-Accurate Finite Difference and Numerical Integration Methods

    KAUST Repository

    Wang, Yi

    2016-07-21

    The velocity of fluid flow in underground porous media is 6-12 orders of magnitude lower than that in pipelines. If numerical errors are not carefully controlled in this kind of simulation, high distortion of the final results may occur [1-4]. To meet the high accuracy demands of fluid flow simulations in porous media, traditional finite difference methods and numerical integration methods are discussed and corresponding high-accuracy methods are developed. When applied to the direct calculation of the full-tensor permeability for underground flow, the high-accuracy finite difference method is confirmed to have a numerical error as low as 10−5% while the high-accuracy numerical integration method has a numerical error of around 0%. Thus, the approach combining the high-accuracy finite difference and numerical integration methods is a reliable way to efficiently determine the characteristics of the general full-tensor permeability, such as the maximum and minimum permeability components, principal direction and anisotropy ratio. Copyright © Global-Science Press 2016.
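The payoff of higher-order finite difference stencils is easy to demonstrate on a smooth test function. The sketch below compares a standard second-order central difference with a fourth-order one; this is a generic illustration of the accuracy gap, not the permeability scheme of the paper:

```python
import math

def d1_central2(f, x, h):
    """Second-order central difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

def d1_central4(f, x, h):
    """Fourth-order central difference approximation of f'(x)."""
    return (-f(x + 2.0 * h) + 8.0 * f(x + h)
            - 8.0 * f(x - h) + f(x - 2.0 * h)) / (12.0 * h)

# Error comparison on sin(x), whose derivative cos(x) is known exactly.
x, h = 0.7, 1e-2
exact = math.cos(x)
err2 = abs(d1_central2(math.sin, x, h) - exact)
err4 = abs(d1_central4(math.sin, x, h) - exact)
```

At h = 1e-2 the second-order error is of order h^2 while the fourth-order error is of order h^4, several orders of magnitude smaller, which is precisely the kind of error reduction needed when flow velocities themselves span many orders of magnitude.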

  14. Between-day reliability of a method for non-invasive estimation of muscle composition.

    Science.gov (United States)

    Simunič, Boštjan

    2012-08-01

    Tensiomyography is a method for valid and non-invasive estimation of skeletal muscle fibre type composition. The validity of selected temporal tensiomyographic measures has been well established recently; there is, however, no evidence regarding the method's between-day reliability. The aim of this paper is therefore to establish the between-day repeatability of tensiomyographic measures in three skeletal muscles. For three consecutive days, 10 healthy male volunteers (mean ± SD: age 24.6 ± 3.0 years; height 177.9 ± 3.9 cm; weight 72.4 ± 5.2 kg) were examined in a supine position. Four temporal measures (delay, contraction, sustain, and half-relaxation time) and the maximal amplitude were extracted from the displacement-time tensiomyogram. A reliability analysis was performed with calculations of bias, random error, coefficient of variation (CV), standard error of measurement, and intra-class correlation coefficient (ICC) with a 95% confidence interval. The analysis of ICC demonstrated excellent agreement (ICC was over 0.94 in 14 out of 15 tested parameters). However, poorer reliability was observed for half-relaxation time, presumably because of the specifics of the parameter definition itself. These data indicate that for the three muscles tested, tensiomyographic measurements were reproducible across consecutive test days. Furthermore, we identified the most likely origin of the lowest reliability, which was detected in half-relaxation time. Copyright © 2012 Elsevier Ltd. All rights reserved.
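    The between-day statistics named above (CV and ICC from a subjects × days design) can be computed from a two-way ANOVA decomposition. The sketch below uses made-up data (not the study's) and an ICC(3,1)-style consistency formula; the matrix shape and values are purely illustrative.

```python
import numpy as np

# Hypothetical subjects (rows) x days (columns) values of one
# tensiomyographic measure, e.g. contraction time in ms.
x = np.array([[24.1, 24.5, 23.9],
              [30.2, 29.8, 30.5],
              [27.4, 27.9, 27.1],
              [22.0, 22.6, 21.8]])
n, k = x.shape
grand = x.mean()

# Two-way ANOVA sums of squares (subjects x days, no replication)
ss_subj = k * ((x.mean(axis=1) - grand) ** 2).sum()
ss_day = n * ((x.mean(axis=0) - grand) ** 2).sum()
ss_tot = ((x - grand) ** 2).sum()
ss_err = ss_tot - ss_subj - ss_day

ms_subj = ss_subj / (n - 1)
ms_err = ss_err / ((n - 1) * (k - 1))

# ICC(3,1): consistency of single measurements across days
icc = (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)

# Between-day CV (%): per-subject SD over per-subject mean, averaged
cv = (x.std(axis=1, ddof=1) / x.mean(axis=1)).mean() * 100

print(round(icc, 3), round(cv, 2))
```

With well-separated subjects and small day-to-day variation, the ICC lands near 1 and the CV near 1–2%, the pattern the abstract reports for most parameters.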

  15. Using graph models for evaluating in-core monitoring system reliability by the method of imitating simulation

    International Nuclear Information System (INIS)

    Golovanov, M.N.; Zyuzin, N.N.; Levin, G.L.; Chesnokov, A.N.

    1987-01-01

    An approach for estimating the reliability factors of complex redundant systems at early stages of development using the method of imitating simulation is considered. Different types of models, with their merits and drawbacks, are given. Features of in-core monitoring systems are described, and the advisability of applying graph models and elements of graph theory to estimating the reliability of such systems is shown. The results of an investigation of the reliability factors of the reactor monitoring, control and core local protection subsystems are presented

  16. Validity and reliability of the session-RPE method for quantifying training in Australian football: a comparison of the CR10 and CR100 scales.

    Science.gov (United States)

    Scott, Tannath J; Black, Cameron R; Quinn, John; Coutts, Aaron J

    2013-01-01

    The purpose of this study was to examine and compare the criterion validity and test-retest reliability of the CR10 and CR100 rating of perceived exertion (RPE) scales for team sport athletes who undertake high-intensity, intermittent exercise. Twenty-one male Australian football (AF) players (age: 19.0 ± 1.8 years, body mass: 83.92 ± 7.88 kg) participated in the first part (part A) of this study, which examined the construct validity of the session-RPE (sRPE) method for quantifying training load in AF. Ten male athletes (age: 16.1 ± 0.5 years) participated in the second part (part B), which compared the test-retest reliability of the CR10 and CR100 RPE scales. In part A, the validity of the sRPE method was assessed by examining the relationships between sRPE and objective measures of internal (i.e., heart rate) and external training load (i.e., distance traveled) collected from AF training sessions. Part B assessed the reliability of sRPE by examining its test-retest reliability during 3 different intensities of controlled intermittent running (10, 11.5, and 13 km·h⁻¹). Results from part A demonstrated strong, significant correlations of CR10- and CR100-derived sRPE with measures of internal training load (Banister's TRIMP and Edwards' TRIMP; CR10: r = 0.83 and 0.83; CR100: r = 0.80 and 0.81). Correlations with measures of external training load (distance, higher-speed running, and player load) were also significant for both the CR10 (r = 0.81, 0.71, and 0.83) and the CR100 (r = 0.78, 0.69, and 0.80). Part B revealed poor test-retest reliability for both the CR10 (31.9% CV) and CR100 (38.6% CV) RPE scales after short bouts of intermittent running. Collectively, these results suggest that both CR10- and CR100-derived sRPE methods have good construct validity for assessing training load in AF. The poor reliability revealed under field testing indicates that the sRPE method may not be sensitive enough to detect small changes in exercise intensity during brief intermittent running bouts. Despite this limitation

  17. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kang, Dae Il; Jung, Won Dea; Kim, Jae Whan

    2005-12-01

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses; determining the requirements for developing a standard HRA method; and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at Category II level. The standard method was based on THERP and ASEP, which are widely used conventional HRA methods. However, the method focuses on standardizing and specifying the analysis process, quantification rules, and criteria so as to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method. Several case studies were undertaken interactively to verify the usability and applicability of the standard method

  18. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Jung, Won Dea; Kim, Jae Whan

    2005-12-15

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses; determining the requirements for developing a standard HRA method; and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at Category II level. The standard method was based on THERP and ASEP, which are widely used conventional HRA methods. However, the method focuses on standardizing and specifying the analysis process, quantification rules, and criteria so as to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method. Several case studies were undertaken interactively to verify the usability and applicability of the standard method.

  19. Monte Carlo methods for the reliability analysis of Markov systems

    International Nuclear Information System (INIS)

    Buslik, A.J.

    1985-01-01

    This paper presents Monte Carlo methods for the reliability analysis of Markov systems. Markov models are useful in treating dependencies between components. The present paper shows how the adjoint Monte Carlo method for the continuous time Markov process can be derived from the method for the discrete-time Markov process by a limiting process. The straightforward extensions to the treatment of mean unavailability (over a time interval) are given. System unavailabilities can also be estimated; this is done by making the system failed states absorbing, and not permitting repair from them. A forward Monte Carlo method is presented in which the weighting functions are related to the adjoint function. In particular, if the exact adjoint function is known then weighting factors can be constructed such that the exact answer can be obtained with a single Monte Carlo trial. Of course, if the exact adjoint function is known, there is no need to perform the Monte Carlo calculation. However, the formulation is useful since it gives insight into choices of the weight factors which will reduce the variance of the estimator
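    The forward scheme described above can be illustrated, in its plain unweighted form, on the smallest Markov reliability model: one repairable component with failure rate λ and repair rate μ. The rates, mission time, and single-component system here are hypothetical, and no adjoint-derived weighting is applied.

```python
import random, math

# Forward Monte Carlo for a two-state (up/down) continuous-time
# Markov component; estimate unavailability at t_end and compare
# with the exact Markov solution. lam = failure rate, mu = repair rate.
lam, mu, t_end = 0.2, 1.0, 5.0

def trial(rng):
    t, up = 0.0, True
    while True:
        rate = lam if up else mu
        t += rng.expovariate(rate)       # holding time in current state
        if t >= t_end:
            return 0.0 if up else 1.0    # 1 if the component is down at t_end
        up = not up

rng = random.Random(42)
n = 20_000
u_mc = sum(trial(rng) for _ in range(n)) / n

# Exact two-state Markov unavailability at t_end
u_exact = lam / (lam + mu) * (1.0 - math.exp(-(lam + mu) * t_end))
print(round(u_mc, 4), round(u_exact, 4))
```

Weighting the trials with factors derived from the adjoint function, as the paper describes, shrinks the variance of exactly this kind of estimator; with the exact adjoint, a single trial would suffice.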

  20. The Accelerator Reliability Forum

    CERN Document Server

    Lüdeke, Andreas; Giachino, R

    2014-01-01

    High reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with high reliability. In order to optimize the overall reliability of an accelerator, one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution describes the forum and advertises its use in the community.

  1. Development of advanced methods and related software for human reliability evaluation within probabilistic safety analyses

    International Nuclear Information System (INIS)

    Kosmowski, K.T.; Mertens, J.; Degen, G.; Reer, B.

    1994-06-01

    Human Reliability Analysis (HRA) is an important part of Probabilistic Safety Analysis (PSA). The first part of this report gives an overview of the types of human behaviour and human error, including the effect of significant performance shaping factors on human reliability. A large number of HRA methods have been developed, particularly for safety assessments of nuclear power plants. The most important of these methods are presented and discussed in the report, together with techniques for incorporating HRA into PSA and with models of operator cognitive behaviour. Based on existing HRA methods, the concept of a software system is described. For the development of this system the use of modern programming tools is proposed; the essential goal is the effective application of HRA methods. A possible integration of computer-aided HRA within PSA is discussed. The features of expert system technology and examples of applications (PSA, HRA) are presented in four appendices. (orig.) [de

  2. Study on Feasibility of Applying Function Approximation Moment Method to Achieve Reliability-Based Design Optimization

    International Nuclear Information System (INIS)

    Huh, Jae Sung; Kwak, Byung Man

    2011-01-01

    Robust optimization and reliability-based design optimization are among the methodologies employed to take the uncertainties of a system into account at the design stage. To apply such methodologies to industrial problems, accurate and efficient methods for estimating statistical moments and failure probability are required; furthermore, the results of the sensitivity analysis, which is needed to determine the search direction during the optimization process, should also be accurate. The aim of this study is to employ the function approximation moment method in the sensitivity analysis formulation, which is expressed in integral form, to verify the accuracy of the sensitivity results, and to solve a typical reliability-based design optimization problem. These results are compared with those of other moment methods, and the feasibility of the function approximation moment method is verified. The sensitivity analysis formula in integral form is efficient for evaluating sensitivity because no additional function evaluations are needed once the failure probability or statistical moments have been calculated
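    The core idea behind moment methods of this family can be sketched with Gauss-Hermite quadrature: for a normal input, statistical moments of a response g(X) follow from weighted evaluations of g at a handful of quadrature points. This is an illustration of the general point-estimate idea, not the paper's function approximation moment method; the test function is arbitrary.

```python
import numpy as np

# Gauss-Hermite point estimates of the mean and variance of g(X)
# for X ~ N(mu, sigma^2). For standard normal X and points/weights
# (x_i, w_i) of the physicists' Hermite rule:
#   E[g(X)] ≈ (1/sqrt(pi)) * sum_i w_i * g(sqrt(2) * x_i)
def moments(g, mu, sigma, m=7):
    x, w = np.polynomial.hermite.hermgauss(m)
    pts = mu + np.sqrt(2.0) * sigma * x
    mean = (w * g(pts)).sum() / np.sqrt(np.pi)
    second = (w * g(pts) ** 2).sum() / np.sqrt(np.pi)
    return mean, second - mean ** 2

# Sanity check on g(x) = x^2 with X ~ N(0, 1): E = 1, Var = 2
mean, var = moments(lambda x: x ** 2, 0.0, 1.0)
print(round(mean, 6), round(var, 6))   # ≈ 1.0, 2.0
```

A 7-point rule integrates polynomials up to degree 13 exactly, so for the quadratic test response both moments are recovered to machine precision; the paper's method adds cross terms to handle interacting variables.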

  3. A method and application study on holistic decision tree for human reliability analysis in nuclear power plant

    International Nuclear Information System (INIS)

    Sun Feng; Zhong Shan; Wu Zhiyu

    2008-01-01

    The paper introduces the Holistic Decision Tree (HDT) method, a human reliability analysis method mainly used in nuclear power plant safety assessment, and explains how to apply it. The focus is primarily on providing the basic framework, some background on the HDT method, and the steps needed to perform it. Influence factors and quality descriptors were formed from interviews with operators at the Qinshan Nuclear Power Plant, and an HDT analysis was performed for SGTR and SLOCA based on this information. The HDT model uses a graphic tree structure to express the error rate as a function of the influence factors. The HDT method is capable of dealing with the uncertainty in HRA, and it is reliable and practical. (authors)

  4. Transition from Partial Factors Method to Simulation-Based Reliability Assessment in Structural Design

    Czech Academy of Sciences Publication Activity Database

    Marek, Pavel; Guštar, M.; Permaul, K.

    1999-01-01

    Roč. 14, č. 1 (1999), s. 105-118 ISSN 0266-8920 R&D Projects: GA ČR GA103/94/0562; GA ČR GV103/96/K034 Keywords : reliability * safety * failure * durability * Monte Carlo method Subject RIV: JM - Building Engineering Impact factor: 0.522, year: 1999

  5. A new, rapid and reliable method for the determination of reduced sulphur (S2-) species in natural water discharges

    International Nuclear Information System (INIS)

    Montegrossi, Giordano; Tassi, Franco; Vaselli, Orlando; Bidini, Eva; Minissale, Angelo

    2006-01-01

    The determination of reduced S species in natural waters is particularly difficult due to their high instability and chemical and physical interferences in the current analytical methods. In this paper a new, rapid and reliable analytical procedure is presented, named the Cd-IC method, for their determination as ΣS²⁻ via oxidation to SO₄²⁻ after chemical trapping with an ammonia-cadmium solution that allows precipitation of all the reduced S species as CdS. The S²⁻-derived SO₄²⁻ is analysed by ion chromatography. The main advantages of this method are: low cost, high stability of the CdS precipitate, absence of interferences, low detection limit (0.01 mg/L as SO₄²⁻ for 10 mL of water) and low analytical error (about 5%). The proposed method has been applied to more than 100 water samples from different natural systems (water discharges and cold wells from volcanic and geothermal areas, crater lakes) in central-southern Italy

  6. Reliable discrimination of 10 ungulate species using high resolution melting analysis of faecal DNA.

    Directory of Open Access Journals (Sweden)

    Ana Ramón-Laca

    Full Text Available Identifying species occupying an area is essential for many ecological and conservation studies. Faecal DNA is a potentially powerful method for identifying cryptic mammalian species. In New Zealand, 10 species of ungulate (Order: Artiodactyla have established wild populations and are managed as pests because of their impacts on native ecosystems. However, identifying the ungulate species present within a management area based on pellet morphology is unreliable. We present a method that enables reliable identification of 10 ungulate species (red deer, sika deer, rusa deer, fallow deer, sambar deer, white-tailed deer, Himalayan tahr, Alpine chamois, feral sheep, and feral goat from swabs of faecal pellets. A high resolution melting (HRM assay, targeting a fragment of the 12S rRNA gene, was developed. Species-specific primers were designed and combined in a multiplex PCR resulting in fragments of different length and therefore different melting behaviour for each species. The method was developed using tissue from each of the 10 species, and was validated in blind trials. Our protocol enabled species to be determined for 94% of faecal pellet swabs collected during routine monitoring by the New Zealand Department of Conservation. Our HRM method enables high-throughput and cost-effective species identification from low DNA template samples, and could readily be adapted to discriminate other mammalian species from faecal DNA.

  7. Offshore compression system design for low cost and high reliability

    Energy Technology Data Exchange (ETDEWEB)

    Castro, Carlos J. Rocha de O.; Carrijo Neto, Antonio Dias; Cordeiro, Alexandre Franca [Chemtech Engineering Services and Software Ltd., Rio de Janeiro, RJ (Brazil). Special Projects Div.], Emails: antonio.carrijo@chemtech.com.br, carlos.rocha@chemtech.com.br, alexandre.cordeiro@chemtech.com.br

    2010-07-01

    In offshore oil fields, the oil streams coming from the wells usually contain significant amounts of gas. This gas is separated at low pressure and has to be compressed to the export pipeline pressure, which is usually high so as to reduce the required diameter of the pipelines. In the past these gases were flared, but nowadays there is increasing pressure to improve the energy efficiency of oil rigs and to make use of this gaseous fraction. The most expensive equipment in this kind of plant is the compression and power generation systems, the second being a strong function of the first, because the compressors are the most power-consuming equipment. For this reason, optimization of the compression system in terms of efficiency and cost is decisive for plant profit. Plant availability also has a strong influence on profit, especially in gas fields, where the products have a relatively low added value compared to oil. Because of this, the third design variable of the compression system becomes reliability: the higher the reliability, the larger the plant production. The main way to improve the reliability of a compression system is the use of multiple compression trains in parallel, in a 2x50% or 3x50% configuration with one train in stand-by. Such configurations are possible and have advantages and disadvantages, but the main side effect is increased cost. This is common offshore practice, but it does not always significantly improve plant availability, depending on the upstream process system. A series arrangement, together with a critical evaluation of the overall system, can in some cases provide a cheaper system with equal or better performance. This paper shows a case study of the procedure for evaluating a compression system design that improves reliability without an extreme cost increase, balancing the number of equipment items, the series or parallel arrangement, and the driver selection. 
Two case studies will be
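    The availability trade-off between train configurations can be quantified with a simple k-out-of-n model. The sketch below uses an assumed per-train availability (not a figure from the paper) and treats trains as identical and independent; with 3x50% trains, any 2 of the 3 must run for full capacity.

```python
from math import comb

# Probability that at least k of n identical, independent trains
# are available, given per-train availability a (binomial model).
def k_of_n(k, n, a):
    return sum(comb(n, i) * a**i * (1 - a)**(n - i) for i in range(k, n + 1))

a = 0.95                      # assumed availability of one train
single = a                    # 1x100%: one full-capacity train
two_half = k_of_n(2, 2, a)    # 2x50%: both half-capacity trains needed
three_half = k_of_n(2, 3, a)  # 3x50%: any 2 of 3 give full capacity
print(single, round(two_half, 4), round(three_half, 4))
```

The numbers make the paper's point concrete: splitting capacity across two 50% trains with no spare actually lowers full-capacity availability (0.95 → 0.9025), while adding a third stand-by train raises it (≈0.993), at the cost of an extra machine.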

  8. How to use an optimization-based method capable of balancing safety, reliability, and weight in an aircraft design process

    International Nuclear Information System (INIS)

    Johansson, Cristina; Derelov, Micael; Olvander, Johan

    2017-01-01

    In order to help decision-makers in the early design phase to improve and make more cost-efficient system safety and reliability baselines of aircraft design concepts, a method (Multi-objective Optimization for Safety and Reliability Trade-off) that is able to handle trade-offs such as system safety, system reliability, and other characteristics, for instance weight and cost, is used. Multi-objective Optimization for Safety and Reliability Trade-off has been developed and implemented at SAAB Aeronautics. The aim of this paper is to demonstrate how the implemented method might work to aid the selection of optimal design alternatives. The method is a three-step method: step 1 involves the modelling of each considered target, step 2 is optimization, and step 3 is the visualization and selection of results (results processing). The analysis is performed within Architecture Design and Preliminary Design steps, according to the company's Product Development Process. The lessons learned regarding the use of the implemented trade-off method in the three cases are presented. The results are a handful of solutions, a basis to aid in the selection of a design alternative. While the implementation of the trade-off method is performed for companies, there is nothing to prevent adapting this method, with minimal modifications, for use in other industrial applications

  9. How to use an optimization-based method capable of balancing safety, reliability, and weight in an aircraft design process

    Energy Technology Data Exchange (ETDEWEB)

    Johansson, Cristina [Mendeley, Broderna Ugglasgatan, Linkoping (Sweden); Derelov, Micael; Olvander, Johan [Linkoping University, IEI, Dept. of Machine Design, Linkoping (Sweden)

    2017-03-15

    In order to help decision-makers in the early design phase to improve and make more cost-efficient system safety and reliability baselines of aircraft design concepts, a method (Multi-objective Optimization for Safety and Reliability Trade-off) that is able to handle trade-offs such as system safety, system reliability, and other characteristics, for instance weight and cost, is used. Multi-objective Optimization for Safety and Reliability Trade-off has been developed and implemented at SAAB Aeronautics. The aim of this paper is to demonstrate how the implemented method might work to aid the selection of optimal design alternatives. The method is a three-step method: step 1 involves the modelling of each considered target, step 2 is optimization, and step 3 is the visualization and selection of results (results processing). The analysis is performed within Architecture Design and Preliminary Design steps, according to the company's Product Development Process. The lessons learned regarding the use of the implemented trade-off method in the three cases are presented. The results are a handful of solutions, a basis to aid in the selection of a design alternative. While the implementation of the trade-off method is performed for companies, there is nothing to prevent adapting this method, with minimal modifications, for use in other industrial applications.

  10. Accelerated life testing and reliability of high K multilayer ceramic capacitors

    Science.gov (United States)

    Minford, W. J.

    1981-01-01

    The reliability of one lot of high-K multilayer ceramic capacitors was evaluated using accelerated life testing. The degradation in insulation resistance was characterized as a function of voltage and temperature. The times to failure at a given voltage-temperature stress conformed to a lognormal distribution with a standard deviation of approximately 0.5.

  11. Standard semiconductor packaging for high-reliability low-cost MEMS applications

    Science.gov (United States)

    Harney, Kieran P.

    2005-01-01

    Microelectronic packaging technology has evolved over the years in response to the needs of IC technology. The fundamental purpose of the package is to provide protection for the silicon chip and to provide electrical connection to the circuit board. Major change has been witnessed in packaging and today wafer level packaging technology has further revolutionized the industry. MEMS (Micro Electro Mechanical Systems) technology has created new challenges for packaging that do not exist in standard ICs. However, the fundamental objective of MEMS packaging is the same as traditional ICs, the low cost and reliable presentation of the MEMS chip to the next level interconnect. Inertial MEMS is one of the best examples of the successful commercialization of MEMS technology. The adoption of MEMS accelerometers for automotive airbag applications has created a high volume market that demands the highest reliability at low cost. The suppliers to these markets have responded by exploiting standard semiconductor packaging infrastructures. However, there are special packaging needs for MEMS that cannot be ignored. New applications for inertial MEMS devices are emerging in the consumer space that adds the imperative of small size to the need for reliability and low cost. These trends are not unique to MEMS accelerometers. For any MEMS technology to be successful the packaging must provide the basic reliability and interconnection functions, adding the least possible cost to the product. This paper will discuss the evolution of MEMS packaging in the accelerometer industry and identify the main issues that needed to be addressed to enable the successful commercialization of the technology in the automotive and consumer markets.

  12. Reliability-Based Stability Analysis of Rock Slopes Using Numerical Analysis and Response Surface Method

    Science.gov (United States)

    Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.

    2017-08-01

    While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computation cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques to perform probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to directly use FORM in numerical slope stability evaluations as it requires definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on response surface method, where FORM is used to develop an explicit performance function from the results of numerical simulations. The implementation of the proposed methodology is performed by considering a large potential rock wedge in Sumela Monastery, Turkey. The accuracy of the developed performance function to truly represent the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while the accuracy is decreased with an error of 24%.
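    Once the response surface yields an explicit performance function g(u), FORM reduces to locating the design point. Below is a minimal HL-RF iteration in standard normal space, cross-checked against crude Monte Carlo; the quadratic g() is a made-up stand-in for a fitted response surface, not the Sumela wedge model.

```python
import math, random

# FORM via the HL-RF fixed-point iteration on an explicit
# performance function g(u1, u2); failure is g <= 0.
def g(u1, u2):
    return 4.0 - u1 - u2 + 0.05 * u1 * u1

def grad_g(u1, u2):
    return (-1.0 + 0.1 * u1, -1.0)

u = (0.0, 0.0)
for _ in range(50):                      # HL-RF update toward the design point
    g1, g2 = grad_g(*u)
    norm2 = g1 * g1 + g2 * g2
    c = (g1 * u[0] + g2 * u[1] - g(*u)) / norm2
    u = (c * g1, c * g2)

beta = math.hypot(*u)                    # reliability index
pf_form = 0.5 * (1.0 + math.erf(-beta / math.sqrt(2.0)))   # Phi(-beta)

# Crude Monte Carlo check of the failure probability
rng = random.Random(1)
n = 300_000
fails = sum(g(rng.gauss(0, 1), rng.gauss(0, 1)) <= 0.0 for _ in range(n))
pf_mc = fails / n
print(round(beta, 3), pf_form, pf_mc)
```

For this mildly curved limit state the FORM estimate and the sampling estimate agree closely, while FORM needs only a few dozen g-evaluations; that evaluation-count gap is the efficiency advantage the abstract quantifies against MCS.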

  13. Improved Yield, Performance and Reliability of High-Actuator-Count Deformable Mirrors, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The project team will conduct processing and design research aimed at improving yield, performance, and reliability of high-actuator-count micro-electro-mechanical...

  14. METHODS OF IMPROVING THE RELIABILITY OF THE CONTROL SYSTEM TRACTION POWER SUPPLY OF ELECTRIC TRANSPORT BASED ON AN EXPERT INFORMATION

    Directory of Open Access Journals (Sweden)

    O. O. Matusevych

    2009-03-01

    Full Text Available The author proposes numerous methods for solving the multi-criterion task of increasing the reliability of a control system on the basis of expert information. Information that allows a well-founded choice of the method for increasing the reliability of an electric transport control system is considered.

  15. Condition-based fault tree analysis (CBFTA): A new method for improved fault tree analysis (FTA), reliability and safety calculations

    International Nuclear Information System (INIS)

    Shalev, Dan M.; Tiran, Joseph

    2007-01-01

    Condition-based maintenance methods have changed the reliability of systems in general and of individual systems in particular. Yet this change is not reflected in system reliability analysis. System fault tree analysis (FTA) is performed during the design phase, using component failure rates derived from available sources such as handbooks. Condition-based fault tree analysis (CBFTA) starts with the known FTA. Condition monitoring (CM) methods applied to systems (e.g. vibration analysis, oil analysis, electric current analysis, bearing CM, electric motor CM, and so forth) are used to determine updated failure rate values of sensitive components. The CBFTA method accepts these updated failure rates and applies them to the FTA. CBFTA periodically recalculates the top event (TE) failure rate (λ_TE), thus determining the probability of system failure and the probability of successful system operation, i.e. the system's reliability. FTA is a tool for enhancing system reliability during the design stages, but it has disadvantages, chief among them that it does not relate to a specific system undergoing maintenance. CBFTA is a tool for updating the reliability values of a specific system and for calculating the residual life according to the system's monitored conditions. Using CBFTA, the original FTA is upgraded into a practical tool for use during the system's field life phase, not just during the design phase. This paper describes the CBFTA method, and its advantages are demonstrated by an example
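    The CBFTA update step can be sketched as follows: the fault tree structure stays fixed, and the top-event probability is recomputed whenever condition monitoring revises a component failure rate. The two-pump-plus-valve tree, the rates, and the mission time below are all hypothetical, and components are assumed independent with exponential lifetimes.

```python
import math

# Hypothetical fault tree: TOP = (pump_a AND pump_b) OR valve.
# Condition monitoring updates a component rate; P(TOP) is recomputed.
T = 1000.0  # mission time, hours

def p_fail(lam):
    # exponential component unreliability over the mission
    return 1.0 - math.exp(-lam * T)

def top_event(lams):
    p_a = p_fail(lams['pump_a'])
    p_b = p_fail(lams['pump_b'])
    p_v = p_fail(lams['valve'])
    p_pumps = p_a * p_b                          # AND gate: redundant pumps
    return 1.0 - (1.0 - p_pumps) * (1.0 - p_v)   # OR gate with the valve

design = {'pump_a': 1e-5, 'pump_b': 1e-5, 'valve': 2e-6}   # handbook rates
print(f"design-phase P(TOP) = {top_event(design):.3e}")

# Vibration analysis flags pump A as degrading: CM-updated rate
updated = dict(design, pump_a=8e-5)
print(f"CM-updated  P(TOP) = {top_event(updated):.3e}")
```

Rerunning this evaluation at each monitoring interval is what turns the static design-phase tree into the field-life tool the abstract describes.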

  16. A method to evaluate performance reliability of individual subjects in laboratory research applied to work settings.

    Science.gov (United States)

    1978-10-01

    This report presents a method that may be used to evaluate the reliability of performance of individual subjects, particularly in applied laboratory research. The method is based on analysis of variance of a tasks-by-subjects data matrix, with all sc...

  17. Study on Performance Shaping Factors (PSFs) Quantification Method in Human Reliability Analysis (HRA)

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Jang, Inseok Jang; Seong, Poong Hyun; Park, Jinkyun; Kim, Jong Hyun

    2015-01-01

    The purpose of HRA implementation is 1) to achieve the human factors engineering (HFE) design goal of providing operator interfaces that minimize personnel errors and 2) to conduct an integrated activity in support of probabilistic risk assessment (PRA). For these purposes, various HRA methods have been developed, such as the technique for human error rate prediction (THERP), simplified plant analysis risk human reliability assessment (SPAR-H), the cognitive reliability and error analysis method (CREAM), and so on. In performing HRA, the conditions that influence human performance are represented via context factors called performance shaping factors (PSFs). PSFs are aspects of the human's individual characteristics, environment, organization, or task that specifically decrement or improve human performance, thus respectively increasing or decreasing the likelihood of human error. Most HRA methods evaluate the weightings of PSFs by expert judgment, and explicit guidance for evaluating the weightings is not provided. It is widely known that the performance of the human operator is one of the critical factors determining the safe operation of NPPs. HRA methods have been developed to identify the possibility and mechanism of human errors. In performing HRA, the effect of PSFs, which may increase or decrease human error, should be investigated. However, the effect of PSFs has so far been estimated by expert judgment. Accordingly, in order to estimate the effect of PSFs objectively, a quantitative framework for estimating PSFs using PSF profiles is introduced in this paper
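    For context, one widely cited quantitative treatment of PSFs is the SPAR-H-style adjustment, shown here purely as an illustration (it is not the framework this paper proposes): a nominal HEP is multiplied by a composite PSF multiplier, with a correction that keeps the result below 1 when several negative PSFs combine. The multiplier values below are hypothetical.

```python
# SPAR-H-style PSF adjustment of a nominal human error probability.
def adjusted_hep(nhep, psf_multipliers):
    comp = 1.0
    for m in psf_multipliers:
        comp *= m                        # composite PSF multiplier
    # Correction term bounds the adjusted HEP below 1 when the
    # composite multiplier is large:
    #   HEP = NHEP * comp / (NHEP * (comp - 1) + 1)
    return nhep * comp / (nhep * (comp - 1.0) + 1.0)

nhep = 1e-3                       # nominal HEP for a diagnosis task (assumed)
psfs = [10.0, 2.0, 0.5]           # e.g. extreme stress, poor HMI, good training
hep = adjusted_hep(nhep, psfs)
print(hep)
```

With a composite multiplier of 10 the naive product would give exactly 10 × NHEP; the correction pulls it slightly below that, and it matters most when the product would otherwise approach or exceed 1. The paper's contribution is to replace the expert-judged multipliers with values derived objectively from PSF profiles.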

  18. pd Scattering Using a Rigorous Coulomb Treatment: Reliability of the Renormalization Method for Screened-Coulomb Potentials

    International Nuclear Information System (INIS)

    Hiratsuka, Y.; Oryu, S.; Gojuki, S.

    2011-01-01

    Reliability of the screened-Coulomb renormalization method, which was proposed in an elegant way by Alt-Sandhas-Zankel-Ziegelmann (ASZZ), is discussed on the basis of 'two-potential theory' for the three-body AGS equations with the Coulomb potential. In order to obtain ASZZ's formula, we define the on-shell Moller function and calculate it using the Haeringen criterion, i.e. 'the half-shell Coulomb amplitude is zero'. By these two steps, we finally obtain the ASZZ formula for a small Coulomb phase shift. Furthermore, the reliability of the Haeringen criterion is thoroughly checked by a numerically rigorous calculation of the Coulomb LS-type equation. We find that the Haeringen criterion is satisfied only in the higher energy region. We conclude that the ASZZ method is verified in the case that the on-shell approximation to the Moller function is reasonable and the Haeringen criterion is reliable. (author)

  19. Reliability demonstration methodology for products with Gamma Process by optimal accelerated degradation testing

    International Nuclear Information System (INIS)

    Zhang, Chunhua; Lu, Xiang; Tan, Yuanyuan; Wang, Yashun

    2015-01-01

    For products with high reliability and long lifetime, accelerated degradation testing (ADT) may be adopted during the product development phase to verify whether reliability satisfies the predetermined level within a feasible test duration. Actual degradation in engineering is usually a strictly monotonic process, such as fatigue crack growth, wear, and erosion. However, the method for reliability demonstration by ADT with a monotonic degradation process has not been investigated so far. This paper proposes a reliability demonstration methodology by ADT for this kind of product. We first apply a Gamma process to describe the monotonic degradation. Next, we present a reliability demonstration method by converting the required reliability level into an allowable cumulative degradation in ADT and comparing the actual cumulative degradation with the allowable level. Further, we suggest an analytical optimal ADT design method for more efficient reliability demonstration by minimizing the asymptotic variance of the decision variable under the constraints of sample size, test duration, test cost, and predetermined decision risks. The method is validated and illustrated with an example on reliability demonstration of an alloy product, and is applied to demonstrate the wear reliability within long service duration of a spherical plain bearing in the end. - Highlights: • We present a reliability demonstration method by ADT for products with a monotonic degradation process, which may be applied to verify reliability with long service life within a feasible test duration. • We suggest an analytical optimal ADT design method for more efficient reliability demonstration, which differs from existing optimal ADT designs (aimed at more accurate reliability estimation) in its objective function and constraints. • The methods are applied to demonstrate the wear reliability within long service duration of
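The demonstration logic described above can be sketched as follows, assuming a stationary Gamma degradation process sampled at unit time steps. The shape, scale, allowable-degradation, and sample-size values below are hypothetical, and this sketch omits the paper's acceleration modelling and optimal test design:

```python
import random

def gamma_path_increments(shape_rate, scale, n_steps, rng):
    """Degradation increments of a stationary Gamma process observed at
    unit time steps: each increment ~ Gamma(shape_rate * dt, scale), dt = 1.
    Increments are non-negative, so the cumulative path is monotonic."""
    return [rng.gammavariate(shape_rate, scale) for _ in range(n_steps)]

def demonstrate(allowable, shape_rate, scale, n_steps, n_units=20, seed=1):
    """Accept the reliability claim if every test unit's cumulative
    degradation at the end of the test stays below the allowable level
    (the level obtained by converting the required reliability)."""
    rng = random.Random(seed)
    finals = [sum(gamma_path_increments(shape_rate, scale, n_steps, rng))
              for _ in range(n_units)]
    return all(d < allowable for d in finals), max(finals)

# Hypothetical test: mean degradation per step = shape * scale = 0.1
ok, worst = demonstrate(allowable=8.0, shape_rate=0.2, scale=0.5, n_steps=20)
```

The conversion of a reliability requirement into the allowable level depends on the Gamma-process parameters and failure threshold; here the allowable level is simply given.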

  20. Numerical methods for reliability and safety assessment multiscale and multiphysics systems

    CERN Document Server

    Hami, Abdelkhalak

    2015-01-01

    This book offers unique insight on structural safety and reliability by combining computational methods that address multiphysics problems, involving multiple equations describing different physical phenomena, and multiscale problems, involving discrete sub-problems that together describe important aspects of a system at multiple scales. The book examines a range of engineering domains and problems using dynamic analysis, nonlinear methods, error estimation, finite element analysis, and other computational techniques. This book also: introduces novel numerical methods; illustrates new practical applications; examines recent engineering applications; presents up-to-date theoretical results; and offers perspective relevant to a wide audience, including teaching faculty, graduate students, researchers, and practicing engineers.

  1. The psychophysiological assessment method for pilot's professional reliability.

    Science.gov (United States)

    Zhang, L M; Yu, L S; Wang, K N; Jing, B S; Fang, C

    1997-05-01

    Previous research has shown that a pilot's professional reliability depends on two related factors: the pilot's functional state and the demands of the task workload. Psychophysiological Reserve Capacity (PRC) is defined as a pilot's ability to accomplish additive tasks without reducing the performance of the primary task (the flight task). We hypothesized that the PRC mirrors the pilot's functional state. The purpose of this study was to probe a psychophysiological method for evaluating a pilot's professional reliability on a simulator. The PRC Comprehensive Evaluating System (PRCCES) used in the experiment included four subsystems: a) a quantitative evaluation system for pilot performance on the simulator; b) a secondary-task display and quantitative estimating system; c) a multiphysiological data monitoring and statistical system; and d) a comprehensive evaluation system for pilot PRC. Two studies were performed. In study one, 63 healthy and 13 hospitalized pilots participated. Each pilot performed a double 180-degree circuit flight program with and without a secondary task (three-digit operation). Operator performance, secondary-task score, and the cost of physiological effort were measured and compared by PRCCES under the two conditions. Then, each pilot's flight skill in training was subjectively scored by instructor pilot ratings. In study two, 7 healthy pilots volunteered to take part in an experiment on the effects of sleep deprivation on pilots' PRC. Each participant had PRC tested pre- and post-8 h of sleep deprivation. The results show that the PRC values of healthy pilots were positively correlated with flexibility, ability to operate and correct deviations, attention distribution, and accuracy of instrument flight in the air (r = 0.27-0.40, p < 0.05), and negatively correlated with emotional anxiety in flight (r = -0.40, p < 0.05). The values of PRC in healthy pilots (0.61 +/- 0.17) were significantly higher than those of hospitalized pilots.

  2. Advanced Functionalities for Highly Reliable Optical Networks

    DEFF Research Database (Denmark)

    An, Yi

    This thesis covers two research topics concerning optical solutions for networks e.g. avionic systems. One is to identify the applications for silicon photonic devices for cost-effective solutions in short-range optical networks. The other one is to realise advanced functionalities in order...... to increase the availability of highly reliable optical networks. A cost-effective transmitter based on a directly modulated laser (DML) using a silicon micro-ring resonator (MRR) to enhance its modulation speed is proposed, analysed and experimentally demonstrated. A modulation speed enhancement from 10 Gbit...... interconnects and network-on-chips. A novel concept of all-optical protection switching scheme is proposed, where fault detection and protection trigger are all implemented in the optical domain. This scheme can provide ultra-fast establishment of the protection path resulting in a minimum loss of data...

  3. Mechanical reliability of bulk high Tc superconductors

    International Nuclear Information System (INIS)

    Freiman, S.W.

    1990-01-01

    Most prospective applications for high-Tc superconductors in bulk form, e.g. magnets and motors, will require appreciable mechanical strength. Work at NIST [National Institute of Standards and Technology] has begun to address issues related to mechanical reliability. For example, recent studies on Ba-Y-Cu-O have shown that the intrinsic crack growth resistance, K_IC, of crystals of this material is even smaller than was first reported, less than that of window glass, and is sensitive to moisture. Processing conditions, particularly sintering and annealing atmosphere, have been shown to have a major influence on microstructure and internal stresses in the material. Large internal stresses result from the tetragonal-to-orthorhombic phase transformation as well as the thermal expansion anisotropy in the grains of the ceramic. Because stress relief is absent, microcracks form which have a profound influence on strength

  4. Chip-Level Electromigration Reliability for Cu Interconnects

    International Nuclear Information System (INIS)

    Gall, M.; Oh, C.; Grinshpon, A.; Zolotov, V.; Panda, R.; Demircan, E.; Mueller, J.; Justison, P.; Ramakrishna, K.; Thrasher, S.; Hernandez, R.; Herrick, M.; Fox, R.; Boeck, B.; Kawasaki, H.; Haznedar, H.; Ku, P.

    2004-01-01

    Even after the successful introduction of Cu-based metallization, the electromigration (EM) failure risk has remained one of the most important reliability concerns for most advanced process technologies. Ever-increasing operating current densities and the introduction of low-k materials in the backend process scheme are some of the issues that threaten reliable, long-term operation at elevated temperatures. The traditional method of verifying EM reliability only through current density limit checks is proving to be inadequate in general, or quite expensive at best. A Statistical EM Budgeting (SEB) methodology has been proposed to assess more realistic chip-level EM reliability from the complex statistical distribution of currents in a chip. To be valuable, this approach requires accurate estimation of currents for all interconnect segments in a chip. However, no efficient technique to manage the complexity of such a task for very large chip designs is known. We present an efficient method to estimate currents exhaustively for all interconnects in a chip. The proposed method uses pre-characterization of cells and macros, and steps to identify and filter out symmetrically bi-directional interconnects. We illustrate the strength of the proposed approach using a high-performance microprocessor design for embedded applications as a case study

  5. Implosion lessons from national security, high reliability spacecraft, electronics, and the forces which changed them

    CERN Document Server

    Temple, L Parker

    2012-01-01

    Implosion is a focused study of the history and uses of high-reliability, solid-state electronics, military standards, and space systems that support our national security and defense. This book is unique in combining the interdependent evolution of and interrelationships among military standards, solid-state electronics, and very high-reliability space systems. Starting with a brief description of the physics that enabled the development of the first transistor, Implosion covers the need for standardizing military electronics, which began during World War II and continu

  6. Reliability and validity of academic motivation scale for sports high school students’

    Directory of Open Access Journals (Sweden)

    Haslofça Fehime

    2016-01-01

    Full Text Available This study was designed to test the validity and reliability of the Academic Motivation Scale (AMS) for sports high school students. The research was conducted with 357 volunteer girls (n=117) and boys (n=240). Confirmatory factor analysis showed that the Chi-square (χ2), degrees of freedom (df) and χ2/df ratio were 1102.90, 341 and 3.234, respectively. The Goodness of Fit Index, Comparative Fit Index, Non-normed Fit Index and Incremental Fit Index were between 0.92 and 0.95. Additionally, the Adjusted Goodness of Fit Index, Average Errors Square Root and Root Mean Square Error of Approximation were 0.88, 0.070 and 0.079, respectively. Subscale reliability coefficients were between 0.77 and 0.86. Test-retest correlations of the AMS were found to be between 0.79 and 0.91. Results showed that the scale is suitable for determining sports high school students’ academic motivation levels.

  7. Efficient surrogate models for reliability analysis of systems with multiple failure modes

    International Nuclear Information System (INIS)

    Bichon, Barron J.; McFarland, John M.; Mahadevan, Sankaran

    2011-01-01

    Despite many advances in the field of computational reliability analysis, the efficient estimation of the reliability of a system with multiple failure modes remains a persistent challenge. Various sampling and analytical methods are available, but they typically require accepting a tradeoff between accuracy and computational efficiency. In this work, a surrogate-based approach is presented that simultaneously addresses the issues of accuracy, efficiency, and unimportant failure modes. The method is based on the creation of Gaussian process surrogate models that are required to be locally accurate only in the regions of the component limit states that contribute to system failure. This approach to constructing surrogate models is demonstrated to be both an efficient and accurate method for system-level reliability analysis. - Highlights: → Extends efficient global reliability analysis to systems with multiple failure modes. → Constructs locally accurate Gaussian process models of each response. → Highly efficient and accurate method for assessing system reliability. → Effectiveness is demonstrated on several test problems from the literature.

  8. Patient safety in anesthesia: learning from the culture of high-reliability organizations.

    Science.gov (United States)

    Wright, Suzanne M

    2015-03-01

    There has been an increased awareness of and interest in patient safety and improved outcomes, as well as a growing body of evidence substantiating medical error as a leading cause of death and injury in the United States. According to The Joint Commission, US hospitals demonstrate improvements in health care quality and patient safety. Although this progress is encouraging, much room for improvement remains. High-reliability organizations, industries that deliver reliable performances in the face of complex working environments, can serve as models of safety for our health care system until plausible explanations for patient harm are better understood. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. 650-nm-band high-power and highly reliable laser diodes with a window-mirror structure

    Science.gov (United States)

    Shima, Akihiro; Hironaka, Misao; Ono, Ken-ichi; Takemi, Masayoshi; Sakamoto, Yoshifumi; Kunitsugu, Yasuhiro; Yamashita, Koji

    1998-05-01

    An active layer structure with 658 nm-emission at 25 degrees Celsius has been optimized in order to reduce the operating current of the laser diodes (LD) under high temperature condition. For improvement of the maximum output power and the reliability limited by mirror degradation, we have applied a zinc-diffused-type window-mirror structure which prevents the optical absorption at the mirror facet. As a result, the CW output power of 50 mW is obtained even at 80 degrees Celsius for a 650 micrometer-long window-mirror LD. In addition, the maximum light output power over 150 mW at 25 degrees Celsius has been realized without any optical mirror damage. In the aging tests, the LDs have been operating for over 2,500 - 5,000 hours under the CW condition of 30 - 50 mW at 60 degrees Celsius. The window-mirror structure also enables reliable 60 degree Celsius, 30 mW, CW operation of the LDs with 651 nm- emission at 25 degrees Celsius. Moreover, the maximum output power of around 100 mW even at 80 degrees Celsius and reliable 2,000-hour operation at 60 degrees Celsius, 70 mW have been realized for the first time by 659 nm LDs with a long cavity length of 900 micrometers.

  10. Evaluation of mobile ad hoc network reliability using propagation-based link reliability model

    International Nuclear Information System (INIS)

    Padmavathy, N.; Chaturvedi, Sanjay K.

    2013-01-01

    A wireless mobile ad hoc network (MANET) is a collection of solely independent nodes (which can move randomly around the area of deployment), making the topology highly dynamic; nodes communicate with each other by forming a single-hop/multi-hop network and maintain connectivity in a decentralized manner. A MANET is modelled using geometric random graphs rather than random graphs because link existence in a MANET is a function of the geometric distance between nodes and the transmission range of the nodes. Among the many factors that contribute to MANET reliability is the robustness of the links between the mobile nodes of the network. Recently, the reliability of such networks has been evaluated for imperfect nodes (transceivers) with a binary model of communication links based on the transmission range of the mobile nodes and the distance between them. However, in reality, the probability of successful communication decreases as the signal strength deteriorates due to noise, fading or interference effects, even within the nodes' transmission range. Hence, in this paper, a propagation-based link reliability model, rather than a binary model, with nodes following a known failure distribution is proposed for evaluating the network reliability (2TR_m, ATR_m and AoTR_m) of a MANET through Monte Carlo simulation. The method is illustrated with an application and some imperative results are also presented
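A minimal Monte Carlo sketch of two-terminal reliability with a distance-dependent (rather than binary) link model is shown below. The exponential link-success form, node counts, and ranges are illustrative assumptions, not the paper's exact propagation model:

```python
import math
import random

def link_up(d, r, rng, alpha=2.0):
    """Propagation-based link model: success probability decays with
    distance instead of the binary in-range/out-of-range rule.
    P(link) = exp(-alpha * (d / r)**2) for d <= r, else 0 (illustrative)."""
    if d > r:
        return False
    return rng.random() < math.exp(-alpha * (d / r) ** 2)

def two_terminal_reliability(n_nodes, area, r, trials=500, seed=42):
    """Monte Carlo estimate of 2-terminal reliability (node 0 to node n-1):
    drop nodes uniformly in a square, sample each link, test connectivity."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        pts = [(rng.uniform(0, area), rng.uniform(0, area))
               for _ in range(n_nodes)]
        adj = {i: [] for i in range(n_nodes)}
        for i in range(n_nodes):
            for j in range(i + 1, n_nodes):
                if link_up(math.dist(pts[i], pts[j]), r, rng):
                    adj[i].append(j)
                    adj[j].append(i)
        # depth-first search from node 0; count a hit if node n-1 is reached
        seen, stack = {0}, [0]
        while stack:
            for v in adj[stack.pop()]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        hits += (n_nodes - 1) in seen
    return hits / trials

rel = two_terminal_reliability(n_nodes=10, area=100.0, r=60.0)
```

Extending this to all-terminal reliability only changes the connectivity test (all nodes reachable from node 0 instead of just the sink).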

  11. [Employees in high-reliability organizations: systematic selection of personnel as a final criterion].

    Science.gov (United States)

    Oubaid, V; Anheuser, P

    2014-05-01

    Employees represent an important safety factor in high-reliability organizations. The combination of clear organizational structures, a nonpunitive safety culture, and psychological personnel selection guarantee a high level of safety. The cockpit personnel selection process of a major German airline is presented in order to demonstrate a possible transferability into medicine and urology.

  12. Application of Distribution-free Methods of Study for Identifying the Degree of Reliability of Ukrainian Banks

    Directory of Open Access Journals (Sweden)

    Burkina Natalia V.

    2014-03-01

    Full Text Available Bank ratings are integral elements of the information infrastructure that ensures sound development of the banking business. One of the key issues that clients of banking structures worry about is identification of the degree of reliability of, and trust in, a bank. As of now there are no common, generally accepted methods of bank rating, and the issue of bank reliability is rather problematic. The article considers the modern DEA method of economic and mathematical analysis, which is a popular instrument for assessing the quality of services of different subjects and which has become very popular in foreign econometric studies. The article demonstrates application of the data envelopment analysis (DEA) method for developing bank ratings and marks out input and output indicators for building a DEA model as applied to the Ukrainian banking system. The authors also discuss some methodological problems that might appear when applying component indicators for ranking the subjects and offer methods for their elimination.

  13. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory

    Directory of Open Access Journals (Sweden)

    Kaijuan Yuan

    2016-01-01

    Full Text Available Sensor data fusion plays an important role in fault diagnosis. Dempster–Shafer (D-S) evidence theory is widely used in fault diagnosis, since it is efficient at combining evidence from different sensors. However, when the evidence highly conflicts, it may yield a counterintuitive result. To address this issue, a new method is proposed in this paper. Both the static sensor reliability and the dynamic sensor reliability are taken into consideration. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted averaging method is adopted to modify the conflicting evidence by assigning different weights to evidence according to sensor reliability. The proposed method performs better in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is illustrated to show the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to the existing methods.
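The evidence-combination step underlying such sensor-fusion methods is Dempster's rule. The sketch below implements the plain rule only (without the paper's reliability-based weighted averaging), with illustrative sensor masses:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: fuse two basic probability assignments (BPAs).
    Focal elements are frozensets; mass falling on the empty intersection
    is the conflict K, and combined masses are renormalised by (1 - K)."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Two sensor reports on fault hypotheses F1 and F2 (illustrative masses)
F1, F2 = frozenset({"F1"}), frozenset({"F2"})
s1 = {F1: 0.8, F2: 0.2}
s2 = {F1: 0.6, F2: 0.4}
fused = dempster_combine(s1, s2)   # mass concentrates on F1
```

With these inputs the conflict is K = 0.8·0.4 + 0.2·0.6 = 0.44, so the fused mass on F1 is 0.48/0.56 ≈ 0.857; highly conflicting reports (K close to 1) are exactly the case where the paper's weighting of evidence by sensor reliability matters.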

  14. Structural Reliability Analysis of Wind Turbines: A Review

    Directory of Open Access Journals (Sweden)

    Zhiyu Jiang

    2017-12-01

    Full Text Available The paper presents a detailed review of the state-of-the-art research activities on structural reliability analysis of wind turbines between the 1990s and 2017. We describe the reliability methods including the first- and second-order reliability methods and the simulation reliability methods and show the procedure for and application areas of structural reliability analysis of wind turbines. Further, we critically review the various structural reliability studies on rotor blades, bottom-fixed support structures, floating systems and mechanical and electrical components. Finally, future applications of structural reliability methods to wind turbine designs are discussed.
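As a worked illustration of the first-order reliability method (FORM) covered in this review, the sketch below runs the Hasofer-Lind/Rackwitz-Fiessler iteration on a linear limit state in standard normal space, where the exact reliability index is known; the limit-state function is a made-up example, not one from the review:

```python
import math

def form_beta(g, grad, n_dim=2, tol=1e-8, max_iter=100):
    """Hasofer-Lind / Rackwitz-Fiessler iteration in standard normal space:
    locate the most probable failure point u* and return the reliability
    index beta = ||u*||.  `g` is the limit state (failure when g < 0) and
    `grad` its gradient, both functions of the standard normal vector u."""
    u = [0.0] * n_dim
    for _ in range(max_iter):
        gv, gr = g(u), grad(u)
        norm2 = sum(c * c for c in gr)
        scale = (sum(c * ui for c, ui in zip(gr, u)) - gv) / norm2
        u_new = [scale * c for c in gr]
        if max(abs(a - b) for a, b in zip(u_new, u)) < tol:
            u = u_new
            break
        u = u_new
    return math.sqrt(sum(ui * ui for ui in u))

# Linear limit state g(u) = 3 - u1 - 2*u2: exact beta = 3 / sqrt(5)
beta = form_beta(lambda u: 3 - u[0] - 2 * u[1],
                 lambda u: [-1.0, -2.0])
pf = 0.5 * math.erfc(beta / math.sqrt(2))  # first-order failure probability
```

For a linear limit state the iteration converges in one step and the first-order failure probability Phi(-beta) is exact; for the nonlinear limit states typical of wind turbine components it is an approximation, which is where second-order and simulation methods enter.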

  15. Evaluation of aileron actuator reliability with censored data

    Directory of Open Access Journals (Sweden)

    Li Huaiyuan

    2015-08-01

    Full Text Available For the purpose of enhancing the reliability of the aileron of the Airbus new-generation A350XWB, an evaluation of aileron reliability on the basis of maintenance data is presented in this paper. Practical maintenance data contain a large number of censored samples, whose information uncertainty makes it hard to evaluate the reliability of the aileron actuator. Considering that the true lifetime of a censored sample has the same distribution as complete samples, if a censored sample is transformed into a complete sample, the conversion frequency of the censored sample can be estimated from the frequency of complete samples. On the one hand, standard life-table estimation and the product-limit method are improved on the basis of such conversion frequencies, enabling accurate estimation for various censored samples. On the other hand, by taking such frequencies as weight factors and integrating the variances of order statistics under a standard distribution, a weighted least squares estimation is formed for accurately estimating various censored samples. Large numbers of experiments and simulations show that reliabilities from the improved life table and improved product-limit method are closer to the true value and more conservative; moreover, the weighted least squares estimate (WLSE), with conversion frequencies of censored samples and variances of order statistics as the weights, can still estimate accurately with a high proportion of censored data in the samples. The algorithm in this paper performs well and can accurately estimate the reliability of the aileron actuator even with small samples and a high censoring rate. This research has certain significance in theory and engineering practice.
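The baseline product-limit (Kaplan-Meier) estimator that the paper improves upon can be sketched as follows for right-censored lifetime data; the failure and censoring times are illustrative, and the paper's conversion-frequency refinements are not included:

```python
def product_limit(samples):
    """Product-limit (Kaplan-Meier) estimate of the survival function from
    right-censored data.  `samples` is a list of (time, observed) pairs:
    observed=True for an actual failure, False for a censored withdrawal."""
    data = sorted(samples)
    n_at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = at_t = 0
        # group all records sharing this time point
        while i < len(data) and data[i][0] == t:
            at_t += 1
            deaths += data[i][1]   # True counts as 1 failure
            i += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= at_t
    return curve

# Failures at 2, 4, 6 h; units censored (withdrawn) at 3 and 5 h
km = product_limit([(2, True), (3, False), (4, True), (5, False), (6, True)])
```

Censored units reduce the at-risk count without registering a failure, which is why simply discarding them (or treating them as failures) biases the reliability estimate.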

  16. Reliability analysis and operator modelling

    International Nuclear Information System (INIS)

    Hollnagel, Erik

    1996-01-01

    The paper considers the state of operator modelling in reliability analysis. Operator models are needed in reliability analysis because operators are needed in process control systems. HRA methods must therefore be able to account both for human performance variability and for the dynamics of the interaction. A selected set of first generation HRA approaches is briefly described in terms of the operator model they use, their classification principle, and the actual method they propose. In addition, two examples of second generation methods are also considered. It is concluded that first generation HRA methods generally have very simplistic operator models, either referring to the time-reliability relationship or to elementary information processing concepts. It is argued that second generation HRA methods must recognise that cognition is embedded in a context, and be able to account for that in the way human reliability is analysed and assessed

  17. A new, rapid and reliable method for the determination of reduced sulphur (S²⁻) species in natural water discharges

    Energy Technology Data Exchange (ETDEWEB)

    Montegrossi, Giordano [C.N.R. - Institute of Geosciences and Earth Resources, Via G. La Pira 4, 50121 Florence (Italy)]. E-mail: giordano@geo.unifi.it; Tassi, Franco [Department of Earth Sciences, University of Florence, Via G. La Pira 4, 50121 Florence (Italy); Vaselli, Orlando [C.N.R. - Institute of Geosciences and Earth Resources, Via G. La Pira 4, 50121 Florence (Italy); Department of Earth Sciences, University of Florence, Via G. La Pira 4, 50121 Florence (Italy); Bidini, Eva [Department of Earth Sciences, University of Florence, Via G. La Pira 4, 50121 Florence (Italy); Minissale, Angelo [C.N.R. - Institute of Geosciences and Earth Resources, Via G. La Pira 4, 50121 Florence (Italy)

    2006-05-15

    The determination of reduced S species in natural waters is particularly difficult due to their high instability and chemical and physical interferences in the current analytical methods. In this paper a new, rapid and reliable analytical procedure is presented, named the Cd-IC method, for their determination as ΣS²⁻ via oxidation to SO₄²⁻ after chemical trapping with an ammonia-cadmium solution that allows precipitation of all the reduced S species as CdS. The S²⁻-SO₄ is analysed by ion chromatography. The main advantages of this method are: low cost, high stability of the CdS precipitate, absence of interferences, low detection limit (0.01 mg/L as SO₄ for 10 mL of water) and low analytical error (about 5%). The proposed method has been applied to more than 100 water samples from different natural systems (water discharges and cold wells from volcanic and geothermal areas, crater lakes) in central-southern Italy.

  18. Optimization of a PCRAM Chip for high-speed read and highly reliable reset operations

    Science.gov (United States)

    Li, Xiaoyun; Chen, Houpeng; Li, Xi; Wang, Qian; Fan, Xi; Hu, Jiajun; Lei, Yu; Zhang, Qi; Tian, Zhen; Song, Zhitang

    2016-10-01

    The widely used traditional Flash memory suffers from performance limits such as serious crosstalk problems and the increasing complexity of floating-gate scaling. Phase change random access memory (PCRAM) has become one of the most promising nonvolatile memories among the new memory techniques. In this paper, a 1M-bit PCRAM chip is designed based on the SMIC 40nm CMOS technology. Focusing on read and write performance, two new circuits with high-speed read operation and highly reliable reset operation are proposed. The high-speed read circuit effectively reduces the read time from 74 ns to 40 ns. The double-mode reset circuit improves the chip yield. This 1M-bit PCRAM chip has been simulated in Cadence. After layout design is completed, the chip will be taped out for post-test.

  19. A fast method for calculating reliable event supports in tree reconciliations via Pareto optimality.

    Science.gov (United States)

    To, Thu-Hien; Jacox, Edwin; Ranwez, Vincent; Scornavacca, Celine

    2015-11-14

    Given a gene and a species tree, reconciliation methods attempt to retrieve the macro-evolutionary events that best explain the discrepancies between the two tree topologies. The DTL parsimonious approach searches for a most parsimonious reconciliation between a gene tree and a (dated) species tree, considering four possible macro-evolutionary events (speciation, duplication, transfer, and loss) with specific costs. Unfortunately, many events are erroneously predicted due to errors in the input trees, inappropriate input cost values or because of the existence of several equally parsimonious scenarios. It is thus crucial to provide a measure of the reliability for predicted events. It has been recently proposed that the reliability of an event can be estimated via its frequency in the set of most parsimonious reconciliations obtained using a variety of reasonable input cost vectors. To compute such a support, a straightforward but time-consuming approach is to generate the costs slightly departing from the original ones, independently compute the set of all most parsimonious reconciliations for each vector, and combine these sets a posteriori. Another proposed approach uses Pareto-optimality to partition cost values into regions which induce reconciliations with the same number of DTL events. The support of an event is then defined as its frequency in the set of regions. However, often, the number of regions is not large enough to provide reliable supports. We present here a method to compute efficiently event supports via a polynomial-sized graph, which can represent all reconciliations for several different costs. Moreover, two methods are proposed to take into account alternative input costs: either explicitly providing an input cost range or allowing a tolerance for the over cost of a reconciliation. Our methods are faster than the region based method, substantially faster than the sampling-costs approach, and have a higher event-prediction accuracy on

  20. Reliability Evaluation on Creep Life Prediction of Alloy 617 for a Very High Temperature Reactor

    International Nuclear Information System (INIS)

    Kim, Woo-Gon; Hong, Sung-Deok; Kim, Yong-Wan; Park, Jae-Young; Kim, Seon-Jin

    2012-01-01

    This paper evaluates the reliability of creep rupture life under service conditions of Alloy 617, which is considered one of the candidate materials for use in a very high temperature reactor (VHTR) system. A Z-parameter, which represents the deviation of creep rupture data from the master curve, was used for the reliability analysis of the creep rupture data of Alloy 617. A Service-condition Creep Rupture Interference (SCRI) model, which can consider both the scattering of the creep rupture data and the fluctuations of temperature and stress under any service conditions, was also used for evaluating the reliability of creep rupture life. The statistical analysis showed that the scattering of the creep rupture data based on the Z-parameter followed a normal distribution. The values of reliability decreased rapidly with increasing amplitudes of temperature and stress fluctuations. The results established that reliability decreases with increasing service time.
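A hedged sketch of how a Z-parameter style reliability estimate follows from the normality result above: if the deviation Z of log rupture life from the master curve is taken as normally distributed with standard deviation sigma_z, then reliability is the probability that rupture life exceeds the required service life. The numeric values (master-curve life, scatter, service life) are illustrative, and the SCRI model's treatment of temperature and stress fluctuations is not included:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def creep_life_reliability(log_t_master, sigma_z, log_t_service):
    """Z-parameter style reliability: with Z = log t_rupture - log t_master
    modelled as N(0, sigma_z), the probability that rupture life exceeds
    the required service life is
        R = 1 - Phi((log t_service - log t_master) / sigma_z)."""
    return 1.0 - norm_cdf((log_t_service - log_t_master) / sigma_z)

# Master-curve life 10^5 h, scatter 0.3 decades, required service 10^4.5 h
r = creep_life_reliability(log_t_master=5.0, sigma_z=0.3, log_t_service=4.5)
```

At a service life equal to the master-curve life the reliability is exactly 0.5, and it falls monotonically as the required service life grows, matching the trend reported in the abstract.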

  1. An overview of the reliability prediction related aspects of high power IGBTs in wind power applications

    DEFF Research Database (Denmark)

    Busca, Christian; Teodorescu, Remus; Blaabjerg, Frede

    2011-01-01

    Reliability is becoming more and more important as the size and number of installed Wind Turbines (WTs) increases. Very high reliability is especially important for offshore WTs because the maintenance and repair of such WTs in case of failures can be very expensive. WT manufacturers need...

  2. Reliability in perceptual analysis of voice quality.

    Science.gov (United States)

    Bele, Irene Velsvik

    2005-12-01

    This study focuses on speaking voice quality in male teachers (n = 35) and male actors (n = 36), who represent untrained and trained voice users, because we wanted to investigate normal and supranormal voices. Both substantive and methodological aspects were considered. The study includes a method for perceptual voice evaluation, and a basic issue was rater reliability. A listening group of 10 listeners (7 experienced speech-language therapists and 3 speech-language therapy students) evaluated the voices on 15 vocal characteristics using VA scales. Two sets of voice signals were investigated: text reading (2 loudness levels) and sustained vowel (3 levels). The results indicated high interrater reliability for most perceptual characteristics. Both types of voice signals were evaluated reliably, although reliability was somewhat higher for connected speech, especially at the normal loudness level, than for sustained vowels. Experienced listeners tended to be more consistent in their ratings than the student raters. Some vocal characteristics achieved acceptable reliability even with a smaller panel of listeners. The perceptual characteristics grouped into 4 factors reflecting perceptual dimensions.

  3. A review of the evolution of human reliability analysis methods at nuclear industry

    International Nuclear Information System (INIS)

    Oliveira, Lécio N. de; Santos, Isaac José A. Luquetti dos; Carvalho, Paulo V.R.

    2017-01-01

    This paper reviews the status of research on the application of human reliability analysis methods in the nuclear industry and their evolution over the years. Human reliability analysis (HRA) is one of the elements used in Probabilistic Safety Analysis (PSA) and is performed as part of PSAs to quantify the likelihood that people will fail to take action, such as errors of omission and errors of commission. Although HRA may be used in many areas, the focus of this paper is to review the applicability of HRA methods over the years in the nuclear industry, especially in Nuclear Power Plants (NPPs). An electronic search on the CAPES Portal of Journals (a bibliographic database) was performed. This literature review covers original papers published from the first generation of HRA methods until those published in March 2017. A total of 94 papers were retrieved by the initial search, and 13 were selected to be fully reviewed and used for data extraction after the application of inclusion and exclusion criteria and an evaluation of quality and suitability according to applicability in the nuclear industry. Results point out that first-generation methods are more used in practice than second-generation methods. This occurs because they are more concentrated on quantification, in terms of success or failure of human action, which makes them useful for quantitative risk assessment in PSA. Although the second generation considers context and errors of commission in human error prediction, these methods are not widely used in practice in the nuclear industry for PSA. (author)

  4. A review of the evolution of human reliability analysis methods at nuclear industry

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Lécio N. de; Santos, Isaac José A. Luquetti dos; Carvalho, Paulo V.R., E-mail: lecionoliveira@gmail.com, E-mail: luquetti@ien.gov.br, E-mail: paulov@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2017-11-01

    This paper reviews the status of research on the application of human reliability analysis methods in the nuclear industry and their evolution over the years. Human reliability analysis (HRA) is one of the elements used in Probabilistic Safety Analysis (PSA) and is performed as part of PSAs to quantify the likelihood that people will fail to take action, such as errors of omission and errors of commission. Although HRA may be used in many areas, the focus of this paper is to review the applicability of HRA methods over the years in the nuclear industry, especially in Nuclear Power Plants (NPPs). An electronic search on the CAPES Portal of Journals (a bibliographic database) was performed. This literature review covers original papers published from the first generation of HRA methods until those published in March 2017. A total of 94 papers were retrieved by the initial search, and 13 were selected to be fully reviewed and used for data extraction after the application of inclusion and exclusion criteria and an evaluation of quality and suitability according to applicability in the nuclear industry. Results point out that first-generation methods are more used in practice than second-generation methods. This occurs because they are more concentrated on quantification, in terms of success or failure of human action, which makes them useful for quantitative risk assessment in PSA. Although the second generation considers context and errors of commission in human error prediction, these methods are not widely used in practice in the nuclear industry for PSA. (author)

  5. Reliability and Characterization of High Voltage Power Capacitors

    Science.gov (United States)

    2014-03-01

    in a Faraday cage to minimize external noise. In addition, electron microscopy could also be performed to identify the change in trap concentration...military bases in the United States. Energy product reliability affects the sustainability and cost-effectiveness of these systems, which must be tested by outside entities to ensure

  6. Highly Conductive and Reliable Copper-Filled Isotropically Conductive Adhesives Using Organic Acids for Oxidation Prevention

    Science.gov (United States)

    Chen, Wenjun; Deng, Dunying; Cheng, Yuanrong; Xiao, Fei

    2015-07-01

    The easy oxidation of copper is one critical obstacle to high-performance copper-filled isotropically conductive adhesives (ICAs). In this paper, a facile method to prepare highly reliable, highly conductive, and low-cost ICAs is reported. The copper fillers were treated by organic acids for oxidation prevention. Compared with ICA filled with untreated copper flakes, the ICA filled with copper flakes treated by different organic acids exhibited much lower bulk resistivity. The lowest bulk resistivity achieved was 4.5 × 10⁻⁵ Ω cm, which is comparable to that of commercially available Ag-filled ICA. After 500 h of 85°C/85% relative humidity (RH) aging, the treated ICAs showed quite stable bulk resistivity and relatively stable contact resistance. Through analyzing the results of x-ray diffraction, x-ray photoelectron spectroscopy, and thermogravimetric analysis, we found that, with the assistance of organic acids, the treated copper flakes exhibited resistance to oxidation, thus guaranteeing good performance.

  7. Data collection on the unit control room simulator as a method of operator reliability analysis

    International Nuclear Information System (INIS)

    Holy, J.

    1998-01-01

    The report consists of the following chapters: (1) Probabilistic assessment of nuclear power plant operation safety and human factor reliability analysis; (2) Simulators and simulations as human reliability analysis tools; (3) DOE project for using the collection and analysis of data from the unit control room simulator in human factor reliability analysis at the Paks nuclear power plant; (4) General requirements for the organization of the simulator data collection project; (5) Full-scale simulator at the Nuclear Power Plants Research Institute in Trnava, Slovakia, used as a training means for operators of the Dukovany NPP; (6) Assessment of the feasibility of quantification of important human actions modelled within a PSA study by employing simulator data analysis; (7) Assessment of the feasibility of using the various exercise topics for the quantification of the PSA model; (8) Assessment of the feasibility of employing the simulator in the analysis of the individual factors affecting the operator's activity; and (9) Examples of application of statistical methods in the analysis of the human reliability factor. (P.A.)

  8. Quantitative and Qualitative Responses to Topical Cold in Healthy Caucasians Show Variance between Individuals but High Test-Retest Reliability.

    Directory of Open Access Journals (Sweden)

    Penny Moss

    Full Text Available Increased sensitivity to cold may be a predictor of persistent pain, but cold pain threshold is often viewed as unreliable. This study aimed to determine the within-subject reliability and between-subject variance of cold response, measured comprehensively as cold pain threshold plus pain intensity and sensation quality at threshold. A test-retest design was used over three sessions, one day apart. Response to cold was assessed at four sites (thenar eminence, volar forearm, tibialis anterior, plantar foot). Cold pain threshold was measured using a Medoc thermode and standard method of limits. Intensity of pain at threshold was rated using a 10-cm visual analogue scale. Quality of sensation at threshold was quantified with indices calculated from subjects' selection of descriptors from a standard McGill Pain Questionnaire. Within-subject reliability for each measure was calculated with intra-class correlation coefficients, and between-subject variance was evaluated as group coefficient of variation percentage (CV%). Gender and site comparisons were also made. Forty-five healthy adults participated: 20 male, 25 female; mean age 29 (range 18-56) years. All measures at all four test sites showed high within-subject reliability: cold pain thresholds r = 0.92-0.95; pain rating r = 0.93-0.97; McGill pain quality indices r = 0.85-0.87. In contrast, all measures showed wide between-subject variance (CV% between 51.4% and 92.5%). Upper limb sites were consistently more sensitive than lower limb sites, but equally reliable. Females showed elevated cold pain thresholds, although similar pain intensity and quality to males. Females were also more reliable and showed lower variance for all measures. Thus, although there was clear population variation, response to cold for healthy individuals was found to be highly reliable, whether measured as pain threshold, pain intensity or sensation quality. A comprehensive approach to cold response testing therefore may add

  9. Quantitative and Qualitative Responses to Topical Cold in Healthy Caucasians Show Variance between Individuals but High Test-Retest Reliability.

    Science.gov (United States)

    Moss, Penny; Whitnell, Jasmine; Wright, Anthony

    2016-01-01

    Increased sensitivity to cold may be a predictor of persistent pain, but cold pain threshold is often viewed as unreliable. This study aimed to determine the within-subject reliability and between-subject variance of cold response, measured comprehensively as cold pain threshold plus pain intensity and sensation quality at threshold. A test-retest design was used over three sessions, one day apart. Response to cold was assessed at four sites (thenar eminence, volar forearm, tibialis anterior, plantar foot). Cold pain threshold was measured using a Medoc thermode and standard method of limits. Intensity of pain at threshold was rated using a 10-cm visual analogue scale. Quality of sensation at threshold was quantified with indices calculated from subjects' selection of descriptors from a standard McGill Pain Questionnaire. Within-subject reliability for each measure was calculated with intra-class correlation coefficients and between-subject variance was evaluated as group coefficient of variation percentage (CV%). Gender and site comparisons were also made. Forty-five healthy adults participated: 20 male, 25 female; mean age 29 (range 18-56) years. All measures at all four test sites showed high within-subject reliability: cold pain thresholds r = 0.92-0.95; pain rating r = 0.93-0.97; McGill pain quality indices r = 0.85-0.87. In contrast, all measures showed wide between-subject variance (CV% between 51.4% and 92.5%). Upper limb sites were consistently more sensitive than lower limb sites, but equally reliable. Females showed elevated cold pain thresholds, although similar pain intensity and quality to males. Females were also more reliable and showed lower variance for all measures. Thus, although there was clear population variation, response to cold for healthy individuals was found to be highly reliable, whether measured as pain threshold, pain intensity or sensation quality. A comprehensive approach to cold response testing therefore may add validity and

  10. Reliability and Validity of a New Method for Isometric Back Extensor Strength Evaluation Using A Hand-Held Dynamometer.

    Science.gov (United States)

    Park, Hee-Won; Baek, Sora; Kim, Hong Young; Park, Jung-Gyoo; Kang, Eun Kyoung

    2017-10-01

    To investigate the reliability and validity of a new method for isometric back extensor strength measurement using a portable dynamometer. A chair equipped with a small portable dynamometer was designed (Power Track II Commander Muscle Tester). A total of 15 men (mean age, 34.8±7.5 years) and 15 women (mean age, 33.1±5.5 years) with no current back problems or previous history of back surgery were recruited. Subjects were asked to push the back of the chair while seated, and their isometric back extensor strength was measured by the portable dynamometer. Test-retest reliability was assessed with intraclass correlation coefficient (ICC). For the validity assessment, isometric back extensor strength of all subjects was measured by a widely used physical performance evaluation instrument, BTE PrimusRS system. The limit of agreement (LoA) from the Bland-Altman plot was evaluated between two methods. The test-retest reliability was excellent (ICC=0.82; 95% confidence interval, 0.65-0.91). The Bland-Altman plots demonstrated acceptable agreement between the two methods: the lower 95% LoA was -63.1 N and the upper 95% LoA was 61.1 N. This study shows that isometric back extensor strength measurement using a portable dynamometer has good reliability and validity.
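    The agreement analysis described above is straightforward to reproduce. A minimal sketch of the Bland-Altman limits-of-agreement calculation; the paired readings below are invented for illustration, not the study's data:

```python
import statistics

# Hypothetical paired isometric strength readings (N) from two devices:
# a portable dynamometer vs. a reference system (invented values).
portable = [310.0, 285.0, 402.0, 355.0, 298.0, 371.0]
reference = [322.0, 279.0, 410.0, 348.0, 305.0, 380.0]

# Bland-Altman limits of agreement: mean difference +/- 1.96 * SD of differences.
diffs = [p - r for p, r in zip(portable, reference)]
bias = statistics.mean(diffs)
sd = statistics.stdev(diffs)  # sample SD (n - 1 in the denominator)
loa_lower = bias - 1.96 * sd
loa_upper = bias + 1.96 * sd
print(f"bias = {bias:.2f} N, 95% LoA = [{loa_lower:.2f}, {loa_upper:.2f}] N")
```

    If roughly 95% of the paired differences fall inside the LoA band and the band is narrow enough for the clinical purpose, the two methods can be considered interchangeable.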

  11. Reliability of large superconducting magnets through design

    International Nuclear Information System (INIS)

    Henning, C.D.

    1980-01-01

    As superconducting magnet systems grow larger and become the central component of major systems involving fusion, magnetohydrodynamics, and high-energy physics, their reliability must be commensurate with the enormous capital investment in the projects. Although the magnet may represent only 15% of the cost of a large system such as the Mirror Fusion Test Facility, its failure would be catastrophic to the entire investment. Effective quality control during construction is one method of ensuring success. However, if the design is unforgiving, even an inordinate amount of effort expended on quality control may be inadequate. Creative design is the most effective way of ensuring magnet reliability and providing a reasonable limit on the amount of quality control needed. For example, by subjecting the last drawing operation in superconductor manufacture to a stress larger than the magnet design stress, a 100% proof test is achieved; cabled conductors offer mechanical redundancy, as do some methods of conductor joining; ground-plane insulation should be multilayered to prevent arcs, and interturn and interlayer insulation spaced to be compatible with the self-extinguishing of arcs during quench voltages; electrical leads should be thermally protected; and guard vacuum spaces can be incorporated to control helium leaks. Many reliable design options are known to magnet designers. These options need to be documented and organized to produce a design guide. Eventually, standard procedures, safety factors, and design codes can lead to reliability in magnets comparable to that obtained in pressure vessels and other structures. Without such reliability, large-scale applications in major systems employing magnetic fusion energy, magnetohydrodynamics, or high-energy physics would present unacceptable economic risks

  12. A combined Importance Sampling and Kriging reliability method for small failure probabilities with time-demanding numerical models

    International Nuclear Information System (INIS)

    Echard, B.; Gayton, N.; Lemaire, M.; Relun, N.

    2013-01-01

    Applying reliability methods to a complex structure is often delicate for two main reasons. First, such a structure is, fortunately, designed with codified rules leading to a large safety margin, which means that failure is a small-probability event. Such a probability level is difficult to assess efficiently. Second, the structure's mechanical behaviour is modelled numerically in an attempt to reproduce the real response, and the numerical model tends to be more and more time-demanding as its complexity is increased to improve accuracy and to consider particular mechanical behaviour. As a consequence, performing a large number of model computations cannot be considered in order to assess the failure probability. To overcome these issues, this paper proposes an original and easily implementable method called AK-IS, for Active learning and Kriging-based Importance Sampling. This new method is based on the AK-MCS algorithm previously published by Echard et al. [AK-MCS: an active learning reliability method combining Kriging and Monte Carlo simulation. Structural Safety 2011;33(2):145-54]. It associates the Kriging metamodel and its advantageous stochastic property with the Importance Sampling method to assess small failure probabilities. It enables the correction or validation of the FORM approximation with only a few mechanical model computations. The efficiency of the method is first proved on two academic applications. It is then applied to assess the reliability of a challenging aerospace case study subjected to fatigue.
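    The Importance Sampling step that AK-IS builds on can be sketched without the Kriging metamodel: sample around the FORM design point instead of the origin and reweight. The linear limit state and sample count below are illustrative assumptions, not the paper's case study:

```python
import math
import random

random.seed(1)
beta = 3.0  # target reliability index; exact pf = Phi(-beta) ~ 1.35e-3

def g(u1, u2):
    # Hypothetical linear limit state in standard-normal space; failure when g <= 0.
    # u2 is inactive here, standing in for additional random variables.
    return beta - u1

# Importance Sampling density: standard normal shifted to the design point (beta, 0).
n = 20_000
acc = 0.0
for _ in range(n):
    u1 = random.gauss(beta, 1.0)  # sample around the design point
    u2 = random.gauss(0.0, 1.0)
    if g(u1, u2) <= 0:
        # weight = f(u)/h(u); only the shifted coordinate contributes to the ratio
        acc += math.exp(-0.5 * u1**2) / math.exp(-0.5 * (u1 - beta) ** 2)
pf = acc / n
exact = 0.5 * math.erfc(beta / math.sqrt(2))  # Phi(-beta)
print(f"IS estimate pf = {pf:.2e}, exact = {exact:.2e}")
```

    About half the shifted samples land in the failure domain, so a few thousand evaluations suffice where crude Monte Carlo would need millions; AK-IS replaces the explicit g-calls with a Kriging surrogate refined by active learning.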

  13. Reliability assessment using Bayesian networks. Case study on quantitative reliability estimation of a software-based motor protection relay

    International Nuclear Information System (INIS)

    Helminen, A.; Pulkkinen, U.

    2003-06-01

    In this report a quantitative reliability assessment of the motor protection relay SPAM 150 C has been carried out. The assessment focuses on the methodological analysis of quantitative reliability assessment, using the software-based motor protection relay as a case study. The assessment method is based on Bayesian networks and takes full advantage of the previous work done in a project called Programmable Automation System Safety Integrity assessment (PASSI). From the results and experiences achieved during the work, it is justified to claim that the assessment method presented here enables a flexible use of qualitative and quantitative elements of reliability-related evidence in a single reliability assessment. At the same time, the assessment method is a concurrent way of reasoning about one's beliefs and references concerning the reliability of the system. Full advantage of the assessment method is taken when it is used as a way to cultivate the information related to the reliability of software-based systems. The method can also be used as a communicational instrument in the licensing process of software-based systems. (orig.)

  14. New Methods for Building-In and Improvement of Integrated Circuit Reliability

    NARCIS (Netherlands)

    van der Pol, J.A.; van der Pol, Jacob Antonius

    2000-01-01

    Over the past 30 years the reliability of semiconductor products has improved by a factor of 100, while at the same time the complexity of the circuits has increased by a factor of 10⁵. This 7-decade reliability improvement has been realised by implementing a sophisticated reliability assurance system

  15. Short-Term and Medium-Term Reliability Evaluation for Power Systems With High Penetration of Wind Power

    DEFF Research Database (Denmark)

    Ding, Yi; Singh, Chanan; Goel, Lalit

    2014-01-01

    The expanding share of the fluctuating and less predictable wind power generation can introduce complexities in power system reliability evaluation and management. This entails a need for the system operator to assess the system status more accurately for securing real-time balancing. The existing reliability evaluation techniques for power systems are well developed. These techniques are more focused on steady-state (time-independent) reliability evaluation and have been successfully applied in power system planning and expansion. In the operational phase, however, they may be too rough an approximation of the time-varying behavior of power systems with high penetration of wind power. This paper proposes a time-varying reliability assessment technique. Time-varying reliability models for wind farms, conventional generating units, and rapid start-up generating units are developed and represented...

  16. Power system reliability analysis using fault trees

    International Nuclear Information System (INIS)

    Volkanovski, A.; Cepin, M.; Mavko, B.

    2006-01-01

    The power system reliability analysis method is developed from the aspect of reliable delivery of electrical energy to customers. The method is based on fault tree analysis, which is widely applied in Probabilistic Safety Assessment (PSA), and is adapted for power system reliability analysis. It is developed in such a way that only the basic reliability parameters of the analysed power system are necessary as input for the calculation of the reliability indices of the system. The modeling and analysis were performed on an example power system consisting of eight substations. The results include the level of reliability of the current power system configuration, the combinations of component failures resulting in a failed power delivery to loads, and the importance factors for components and subsystems. (author)
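    The cut-set logic underlying such a fault tree analysis can be demonstrated in a few lines. The component unavailabilities and minimal cut sets below are invented for a toy load point, not taken from the paper:

```python
from itertools import product

# Hypothetical component unavailabilities for one load point (illustrative).
q = {"line1": 0.01, "line2": 0.02, "transformer": 0.005, "busbar": 0.001}

# Minimal cut sets: delivery fails if both lines fail, or the transformer
# fails, or the busbar fails (a made-up single-load-point example).
cut_sets = [{"line1", "line2"}, {"transformer"}, {"busbar"}]

def cutset_prob(cs):
    """Probability that every component in one cut set is failed."""
    p = 1.0
    for comp in cs:
        p *= q[comp]
    return p

# First-order (rare-event) upper bound: sum of minimal-cut-set probabilities.
p_upper = sum(cutset_prob(cs) for cs in cut_sets)

# Exact value by enumerating all component states (feasible for small systems).
comps = list(q)
p_exact = 0.0
for states in product([False, True], repeat=len(comps)):
    failed = {c for c, s in zip(comps, states) if s}
    if any(cs <= failed for cs in cut_sets):  # any cut set fully failed
        p_state = 1.0
        for c, s in zip(comps, states):
            p_state *= q[c] if s else 1.0 - q[c]
        p_exact += p_state

print(f"upper bound = {p_upper:.6f}, exact = {p_exact:.6f}")
```

    For realistic grids the state space is too large to enumerate, which is why PSA tools work directly from the minimal cut sets and bounds of this kind.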

  17. Design for High Performance, Low Power, and Reliable 3D Integrated Circuits

    CERN Document Server

    Lim, Sung Kyu

    2013-01-01

    This book describes the design of through-silicon-via (TSV) based three-dimensional integrated circuits.  It includes details of numerous “manufacturing-ready” GDSII-level layouts of TSV-based 3D ICs, developed with tools covered in the book. Readers will benefit from the sign-off level analysis of timing, power, signal integrity, and thermo-mechanical reliability for 3D IC designs.  Coverage also includes various design-for-manufacturability (DFM), design-for-reliability (DFR), and design-for-testability (DFT) techniques that are considered critical to the 3D IC design process. Describes design issues and solutions for high performance and low power 3D ICs, such as the pros/cons of regular and irregular placement of TSVs, Steiner routing, buffer insertion, low power 3D clock routing, power delivery network design and clock design for pre-bond testability. Discusses topics in design-for-electrical-reliability for 3D ICs, such as TSV-to-TSV coupling, current crowding at the wire-to-TSV junction and the e...

  18. Gearbox Reliability Collaborative Investigation of High-Speed-Shaft Bearing Loads

    Energy Technology Data Exchange (ETDEWEB)

    Keller, Jonathan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Guo, Yi [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-06-01

    The loads and contact stresses in the bearings of the high-speed shaft section of the Gearbox Reliability Collaborative gearbox are examined in this paper. The loads were measured through strain gauges installed on the bearing outer races during dynamometer testing of the gearbox. Loads and stresses were also predicted with a simple analytical model and higher-fidelity commercial models. The experimental data compared favorably to each model, and bearing stresses were below thresholds for contact fatigue and axial cracking.

  19. Reliability Analysis on NPP's Safety-Related Control Module with Field Data

    International Nuclear Information System (INIS)

    Lee, Sang Yong; Jung, Jae Hyun; Kim, Seong Hun

    2006-01-01

    The automatic control systems used in nuclear power plants (NPPs) consist of numerous control modules that can be considered a network of components connected in various complex ways. The control modules require higher reliability than typical industrial electronic products. Reliability prediction provides the rational basis for system designs and also indicates the safety significance of system operations. The aim of this paper is to minimize the deficiencies of the traditional reliability prediction method by using the available field return data, which makes a more realistic reliability assessment possible. SAMCHANG Enterprise Company (SEC) has established a database containing high-quality data at the module and component level from module maintenance in NPPs. On this basis, this paper compares the results of adding failure records (field data) to the Telcordia SR-332 reliability prediction model with MIL-HDBK-217F prediction results

  20. Influence of reliability of the relay protection to the whole reliability of electric power systems

    International Nuclear Information System (INIS)

    Stojanovski, Ljupcho I.

    2001-01-01

    In analyses of the reliability of electric power systems to date, the influence of the reliability of relay protection elements has very rarely been taken into consideration; in other words, these analyses assume that the reliability of the protection equals one. In this work, through modelling of the individual types of protection of high-voltage system elements, an attempt is made to calculate the influence of relay protection reliability on the total reliability of high-voltage systems. (Author)

  1. Investigation of Reliabilities of Bolt Distances for Bolted Structural Steel Connections by Monte Carlo Simulation Method

    Directory of Open Access Journals (Sweden)

    Ertekin Öztekin Öztekin

    2015-12-01

    Full Text Available The distances of bolts to each other and the distances of bolts to the edges of connection plates are designed based on minimum and maximum boundary values proposed by structural codes. In this study, the reliabilities of those distances were investigated. For this purpose, loading types, bolt types, and plate thicknesses were taken as variable parameters. The Monte Carlo Simulation (MCS) method was used in the reliability computations performed for all combinations of those parameters. At the end of the study, all reliability index values for those distances are presented in graphics and tables. The results obtained from this study were compared with the values proposed by some structural codes, and some evaluations were made about those comparisons. Finally, it was emphasized that it would be incorrect to use the same bolt distances in both traditional designs and designs with higher reliability levels.
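    The MCS procedure for estimating a reliability index can be sketched with the standard library alone. The resistance and load-effect distributions below are invented placeholders, not the study's bolt-distance model:

```python
import random
from statistics import NormalDist

random.seed(42)

# Hypothetical limit state g = R - S for a connection check: resistance R and
# load effect S both normal (values in kN are illustrative assumptions).
R = NormalDist(mu=50.0, sigma=5.0)
S = NormalDist(mu=30.0, sigma=4.0)

n = 200_000
failures = sum(
    1 for _ in range(n)
    if random.gauss(R.mean, R.stdev) - random.gauss(S.mean, S.stdev) <= 0
)
pf = failures / n
beta = -NormalDist().inv_cdf(pf) if pf > 0 else float("inf")
print(f"pf ~= {pf:.2e}, reliability index beta ~= {beta:.2f}")
# Closed form for this linear normal case: beta = (50-30)/sqrt(5**2+4**2) ~= 3.12
```

    The sample size must grow roughly as 100/pf for a usable estimate, which is why crude MCS becomes expensive for the very small failure probabilities typical of code-calibrated designs.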

  2. Quantitative developments in the cognitive reliability and error analysis method (CREAM) for the assessment of human performance

    International Nuclear Information System (INIS)

    Marseguerra, Marzio; Zio, Enrico; Librizzi, Massimo

    2006-01-01

    The current 'second generation' approaches in human reliability analysis focus their attention on the contextual conditions under which a given action is performed rather than on the notion of inherent human error probabilities, as was done in the earlier 'first generation' techniques. Among the 'second generation' methods, this paper considers the Cognitive Reliability and Error Analysis Method (CREAM) and proposes some developments with respect to a systematic procedure for computing probabilities of action failure. The starting point for the quantification is a previously introduced fuzzy version of the CREAM paradigm which is here further extended to include uncertainty on the qualification of the conditions under which the action is performed and to account for the fact that the effects of the common performance conditions (CPCs) on performance reliability may not all be equal. By the proposed approach, the probability of action failure is estimated by rating the performance conditions in terms of their effect on the action

  3. A Newly Developed Method for Computing Reliability Measures in a Water Supply Network

    Directory of Open Access Journals (Sweden)

    Jacek Malinowski

    2016-01-01

    Full Text Available A reliability model of a water supply network has been examined. Its main features are: a topology that can be decomposed by the so-called state factorization into a (relatively small) number of derivative networks, each having a series-parallel structure (1); binary-state components (either operative or failed) with given flow capacities (2); a multi-state character of the whole network and its sub-networks, where a network state is defined as the maximal flow between a source (or sources) and a sink (or sinks) (3); and integer values for all capacities (component, network, and sub-network) (4). As the network operates, its state changes due to component failures, repairs, and replacements. A newly developed method of computing the inter-state transition intensities is presented. It is based on the so-called state factorization and series-parallel aggregation. The analysis of these intensities shows that the failure-repair process of the considered system is an asymptotically homogeneous Markov process. It is also demonstrated how certain reliability parameters useful for network maintenance planning can be determined on the basis of the asymptotic intensities. For better understanding of the presented method, an illustrative example is given. (original abstract)
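    The series-parallel capacity aggregation the method relies on is easy to demonstrate on a toy network: parallel capacities add, series capacities take the minimum. The pipe capacities and availabilities below are assumptions for illustration, not the paper's example:

```python
from itertools import product

# Hypothetical series-parallel water network: two parallel pipes feeding one
# downstream main. Each component is binary (up/down): (capacity, availability).
pipes = {"p1": (10, 0.95), "p2": (15, 0.90)}
main = (20, 0.98)

# Enumerate all binary component states to get the network capacity distribution.
dist = {}  # network capacity -> probability
for s1, s2, sm in product([0, 1], repeat=3):
    cap_parallel = s1 * pipes["p1"][0] + s2 * pipes["p2"][0]  # parallel: sum
    cap = min(cap_parallel, sm * main[0])                     # series: min
    p = ((pipes["p1"][1] if s1 else 1 - pipes["p1"][1])
         * (pipes["p2"][1] if s2 else 1 - pipes["p2"][1])
         * (main[1] if sm else 1 - main[1]))
    dist[cap] = dist.get(cap, 0.0) + p

for cap in sorted(dist):
    print(f"capacity {cap:2d}: P = {dist[cap]:.4f}")
```

    The resulting multi-state capacity distribution is exactly the kind of object on which the inter-state transition intensities are then defined.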

  4. Evaluation criteria of structural steel reliability

    International Nuclear Information System (INIS)

    Zav'yalov, A.S.

    1980-01-01

    Different low-carbon and medium-carbon structural steels are investigated. It is stated that steel reliability evaluation criteria depend on the fracture mode, with steel suffering brittle fracture under the influence of the stresses (despite their great variety) arising in articles during production and operation. A fibrous steel fracture at the given temperature and article thickness indicates high ductility and toughness, and brittle fracture is then impossible; brittle fractures take place in the case of a crystalline or mixed fracture with a predominant crystalline component. Methods that evaluate the structural strength of steel on articles and samples differing greatly from real articles in thickness (diameter), or used at temperatures higher than the possible operating temperatures, cannot serve as reliability evaluation criteria, because at greater thickness (diameter) and lower operating temperatures the fracture and strain mode of the steel can change, resulting in a sharp degradation of reliability

  5. Reliable high-power diode lasers: thermo-mechanical fatigue aspects

    Science.gov (United States)

    Klumel, Genady; Gridish, Yaakov; Szafranek, Igor; Karni, Yoram

    2006-02-01

    High-power water-cooled diode lasers are finding increasing demand in biomedical, cosmetic and industrial applications, where repetitive cw (continuous wave) and pulsed cw operation modes are required. When operating in such modes, the lasers experience numerous complete thermal cycles between the "cold" heat sink temperature and the "hot" temperature typical of thermally equilibrated cw operation. It is clearly demonstrated that the main failure mechanism directly linked to repetitive cw operation is thermo-mechanical fatigue of the solder joints adjacent to the laser bars, especially when "soft" solders are used. Analyses of the bonding interfaces were carried out using scanning electron microscopy. It was observed that intermetallic compounds, formed already during the bonding process, lead to solder fatigue on both the p- and n-side of the laser bar. Fatigue failure of solder joints in repetitive cw operation reduces the useful lifetime of the stacks to hundreds of hours, in comparison with the lifetimes of more than 10,000 hours typically demonstrated in commonly adopted non-stop cw reliability testing programs. It is shown that proper selection of package materials and solders, careful design of fatigue-sensitive parts, and burn-in screening in the hard pulse operation mode allow a considerable increase in lifetime and reliability, without compromising the device efficiency, optical power density and compactness.

  6. Reliability analysis and initial requirements for FC systems and stacks

    Science.gov (United States)

    Åström, K.; Fontell, E.; Virtanen, S.

    In the year 2000, Wärtsilä Corporation started an R&D program to develop SOFC systems for CHP applications. The program aims to bring to the market highly efficient, clean and cost-competitive fuel cell systems with rated power output in the range of 50-250 kW for distributed generation and marine applications. In the program, Wärtsilä focuses on system integration and development. System reliability and availability are key issues determining the competitiveness of the SOFC technology. At Wärtsilä, methods have been implemented for analysing the system with respect to reliability and safety, as well as for defining reliability requirements for system components. A fault tree representation is used as the basis for reliability prediction analysis. A dynamic simulation technique has been developed to allow for non-static properties in the fault tree logic modelling. Special emphasis has been placed on reliability analysis of the fuel cell stacks in the system. A method for assessing reliability and critical-failure predictability requirements for fuel cell stacks in a system consisting of several stacks has been developed. The method is based on a qualitative model of the stack configuration where each stack can be in a functional, partially failed or critically failed state, each of the states having different failure rates and effects on the system behaviour. The main purpose of the method is to understand the effect of stack reliability, critical-failure predictability and operating strategy on the system reliability and availability. An example configuration, consisting of 5 × 5 stacks (a series of 5 sets of 5 parallel stacks), is analysed with respect to stack reliability requirements as a function of the predictability of critical failures and the Weibull shape factor of the failure rate distributions.
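    Under a crude binary simplification of such a 5 × 5 configuration (assume a set survives if at least one of its five parallel stacks works, and all five series sets must survive), system reliability follows directly; the three-state stack model in the paper is richer than this sketch:

```python
def system_reliability(r_stack: float, parallel: int = 5, series: int = 5) -> float:
    """Binary series-parallel approximation of a stack array.

    Assumes each parallel set survives if at least one stack in it works,
    and the series sets are independent (an illustrative simplification).
    """
    r_set = 1 - (1 - r_stack) ** parallel  # at least one stack per set survives
    return r_set ** series                 # all series sets must survive

for r in (0.90, 0.95, 0.99):
    print(f"stack reliability {r:.2f} -> system {system_reliability(r):.6f}")
```

    Even this toy model shows why per-stack reliability requirements relax sharply with redundancy; a real analysis would instead require a minimum surviving capacity per set and account for the partially failed state.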

  7. A G-function-based reliability-based design methodology applied to a cam roller system

    International Nuclear Information System (INIS)

    Wang, W.; Sui, P.; Wu, Y.T.

    1996-01-01

    Conventional reliability-based design optimization methods treat the reliability function as an ordinary function and apply existing mathematical programming techniques to solve the design problem. As a result, the conventional approach requires nested loops with respect to the g-function and is very time consuming. A new reliability-based design method is proposed in this paper that deals with the g-function directly instead of the reliability function. This approach has the potential to significantly reduce the number of calls for g-function calculations, since it requires only one full reliability analysis per design iteration. A cam roller system in a typical high-pressure fuel injection diesel engine is designed using both the proposed and the conventional approach. The proposed method is much more efficient for this application.

  8. Cervical vertebral maturation method and mandibular growth peak: a longitudinal study of diagnostic reliability.

    Science.gov (United States)

    Perinetti, Giuseppe; Primozic, Jasmina; Sharma, Bhavna; Cioffi, Iacopo; Contardo, Luca

    2018-03-28

    The capability of the cervical vertebral maturation (CVM) method to identify the mandibular growth peak on an individual basis remains undetermined. The diagnostic reliability of the six-stage CVM method in the identification of the mandibular growth peak was thus investigated. From the files of the Oregon and Burlington Growth Studies (data obtained between the early 1950s and mid-1970s), 50 subjects (26 females, 24 males) with at least seven annual lateral cephalograms taken from 9 to 16 years were identified. Cervical vertebral maturation was assessed according to the CVM code staging system, and mandibular growth was defined as annual increments in Co-Gn distance. A diagnostic reliability analysis was carried out to establish the capability of the circumpubertal CVM stages 2, 3, and 4 to identify the imminent mandibular growth peak. Variable durations of each of the CVM stages 2, 3, and 4 were seen. The overall diagnostic accuracy values for the CVM stages 2, 3, and 4 were 0.70, 0.76, and 0.77, respectively. These low values appeared to be due to false positive cases, possibly reflecting secular trends in conjunction with the use of a discrete staging system. In most of the Burlington Growth Study sample, the lateral head film at age 15 was missing. None of the CVM stages 2, 3, and 4 reached satisfactory diagnostic reliability in the identification of the imminent mandibular growth peak.

  9. Fuzzy method of recognition of high molecular substances in evidence-based biology

    Science.gov (United States)

    Olevskyi, V. I.; Smetanin, V. T.; Olevska, Yu. B.

    2017-10-01

    Nowadays, modern requirements for reliable results and high-quality research put mathematical methods of data analysis at the forefront. Because of this, evidence-based methods of processing experimental data have become increasingly popular in the biological sciences and medicine. Their basis is meta-analysis, a method of quantitative generalization of a large number of randomized trials addressing the same specific problem, which are often contradictory and performed by different authors. It allows identifying the most important trends and quantitative indicators of the data, verifying advanced hypotheses and discovering new effects in the population genotype. The existing methods of recognizing high molecular substances by gel electrophoresis of proteins under denaturing conditions are based on approximate methods for comparing the contrast of electrophoregrams with a standard solution of known substances. We propose a fuzzy method for modeling experimental data to increase the accuracy and validity of the findings in the detection of new proteins.

  10. DJ-1 is a reliable serum biomarker for discriminating high-risk endometrial cancer.

    Science.gov (United States)

    Di Cello, Annalisa; Di Sanzo, Maddalena; Perrone, Francesca Marta; Santamaria, Gianluca; Rania, Erika; Angotti, Elvira; Venturella, Roberta; Mancuso, Serafina; Zullo, Fulvio; Cuda, Giovanni; Costanzo, Francesco

    2017-06-01

    New reliable approaches to stratify patients with endometrial cancer into risk categories are highly needed. We have recently demonstrated that DJ-1 is overexpressed in endometrial cancer, showing significantly higher levels both in serum and tissue of patients with high-risk endometrial cancer compared with low-risk endometrial cancer. In this experimental study, we further extended our observation, evaluating the role of DJ-1 as an accurate serum biomarker for high-risk endometrial cancer. A total of 101 endometrial cancer patients and 44 healthy subjects were prospectively recruited. DJ-1 serum levels were evaluated comparing cases and controls and, among endometrial cancer patients, between high- and low-risk patients. The results demonstrate that DJ-1 levels are significantly higher in cases versus controls and in high- versus low-risk patients. The receiver operating characteristic curve analysis shows that DJ-1 has a very good diagnostic accuracy in discriminating endometrial cancer patients versus controls and an excellent accuracy in distinguishing, among endometrial cancer patients, low- from high-risk cases. DJ-1 sensitivity and specificity are the highest when high- and low-risk patients are compared, reaching the value of 95% and 99%, respectively. Moreover, DJ-1 serum levels seem to be correlated with worsening of the endometrial cancer grade and histotype, making it a reliable tool in the preoperative decision-making process.

  11. Nordic perspectives on safety management in high reliability organizations: Theory and applications

    International Nuclear Information System (INIS)

    Svenson, Ola; Salo, I.; Sjerve, A.B.; Reiman, T.; Oedewald, P.

    2006-04-01

    The chapters in this volume are written on a stand-alone basis, meaning that they can be read in any order. The first four chapters focus on theory and method in general, with some applied examples illustrating the methods and theories. Chapters 5 and 6 are about safety management in the aviation industry, with some additional information about incident reporting in the aviation industry and the health care sector. Chapters 7 through 9 cover safety management with applied examples from the nuclear power industry, with considerable validity for safety management in any industry. Chapters 10 through 12 cover generic safety issues with examples from the oil industry, and chapter 13 presents issues related to organizations with different internal organizational structures. Although many of the chapters use a specific industry to illustrate safety management, the messages in all the chapters are of importance for safety management in any high reliability industry or risky activity. The interested reader is also referred to, e.g., a document by an international NEA group (SEGHOF), which is about to publish a state of the art report on Systematic Approaches to Safety Management (cf., CSNI/NEA/SEGHOF, home page: www.nea.fr). (au)

  12. Nordic perspectives on safety management in high reliability organizations: Theory and applications

    Energy Technology Data Exchange (ETDEWEB)

    Svenson, Ola; Salo, I; Sjerve, A B; Reiman, T; Oedewald, P [Stockholm Univ. (Sweden)

    2006-04-15

    The chapters in this volume are written on a stand-alone basis, meaning that they can be read in any order. The first four chapters focus on theory and method in general, with some applied examples illustrating the methods and theories. Chapters 5 and 6 are about safety management in the aviation industry, with some additional information about incident reporting in the aviation industry and the health care sector. Chapters 7 through 9 cover safety management with applied examples from the nuclear power industry, with considerable validity for safety management in any industry. Chapters 10 through 12 cover generic safety issues with examples from the oil industry, and chapter 13 presents issues related to organizations with different internal organizational structures. Although many of the chapters use a specific industry to illustrate safety management, the messages in all the chapters are of importance for safety management in any high reliability industry or risky activity. The interested reader is also referred to, e.g., a document by an international NEA group (SEGHOF), which is about to publish a state of the art report on Systematic Approaches to Safety Management (cf., CSNI/NEA/SEGHOF, home page: www.nea.fr). (au)

  13. Reliability of sub-seabed disposal operations for high level waste

    International Nuclear Information System (INIS)

    Sarshar, M.M.

    1985-09-01

    This report describes a study carried out into the reliability of two methods of disposal of heat-generating radioactive waste: emplacement in holes drilled into the ocean sediments, and the use of penetrators to drive the waste below the ocean floor. The study has concentrated on the risk of events leading to the release of radioactivity to the environment, and also on the hazard to personnel involved in the operation. A Failure Mode, Effects and Criticality Analysis and a Fault Tree Analysis have been used in the assessment, and the relative importance of each contributory factor estimated. (author)

  14. A fast and reliable method for simultaneous waveform, amplitude and latency estimation of single-trial EEG/MEG data.

    Directory of Open Access Journals (Sweden)

    Wouter D Weeda

    Full Text Available The amplitude and latency of single-trial EEG/MEG signals may provide valuable information concerning human brain functioning. In this article we propose a new method to reliably estimate single-trial amplitude and latency of EEG/MEG signals. The advantages of the method are fourfold. First, no a-priori specified template function is required. Second, the method allows for multiple signals that may vary independently in amplitude and/or latency. Third, the method is less sensitive to noise as it models data with a parsimonious set of basis functions. Finally, the method is very fast since it is based on an iterative linear least squares algorithm. A simulation study shows that the method yields reliable estimates under different levels of latency variation and signal-to-noise ratios. Furthermore, it shows that the existence of multiple signals can be correctly determined. An application to empirical data from a choice reaction time study indicates that the method describes these data accurately.
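    The least-squares idea above can be illustrated in a reduced form. The sketch below fixes a single template and grid-searches the latency, solving the amplitude by linear least squares at each candidate shift; this is a simplification (the paper's method also estimates the waveform itself from a basis-function expansion), and all signal parameters here are synthetic.

```python
import numpy as np

def estimate_amplitude_latency(trial, template, max_shift):
    """Grid-search latency; solve amplitude by linear least squares per shift.

    `template` is a fixed waveform here -- an assumption for illustration only.
    """
    best = (None, None, np.inf)
    for shift in range(-max_shift, max_shift + 1):
        shifted = np.roll(template, shift)
        amp, *_ = np.linalg.lstsq(shifted[:, None], trial, rcond=None)
        resid = np.sum((trial - shifted * amp[0]) ** 2)
        if resid < best[2]:
            best = (amp[0], shift, resid)
    return best[0], best[1]

# Synthetic single trial with known amplitude 2.5 and latency shift 3
rng = np.random.default_rng(0)
t = np.linspace(-1, 1, 200)
template = np.exp(-(t ** 2) / 0.02)          # Gaussian "ERP component"
trial = 2.5 * np.roll(template, 3) + 0.05 * rng.standard_normal(200)
amp, lat = estimate_amplitude_latency(trial, template, max_shift=10)
```

Because the amplitude solve at each shift is a one-unknown linear least squares problem, the whole search stays fast, which is the property the abstract emphasizes.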

  15. Problem of nuclear power plant reliability

    International Nuclear Information System (INIS)

    Popyrin, L.S.; Nefedov, Yu.V.

    1989-01-01

    The problem of substantiating the rational level of NPP reliability, and the methods of ensuring it at the design stage, has been studied. It is shown that the optimal level of NPP reliability is determined by a coordinated solution of the problems of optimizing the reliability of the power industry, heat and power supply and nuclear power generation systems comprising NPPs, and the problems of optimizing the reliability of the NPP proper as a complex engineering system. The conclusion is made that the greatest attention should be paid to the development of mathematical models of reliability that take into account different methods of equipment redundancy as well as the dependence of failures on various factors, the improvement of NPP reliability indices, the development of a data base, and the working out of a complex of consistent reliability standards. 230 refs.; 2 figs.; 1 tab

  16. Establishing survey validity and reliability for American Indians through "think aloud" and test-retest methods.

    Science.gov (United States)

    Hauge, Cindy Horst; Jacobs-Knight, Jacque; Jensen, Jamie L; Burgess, Katherine M; Puumala, Susan E; Wilton, Georgiana; Hanson, Jessica D

    2015-06-01

    The purpose of this study was to use a mixed-methods approach to determine the validity and reliability of measurements used within an alcohol-exposed pregnancy prevention program for American Indian women. To develop validity, content experts provided input into the survey measures, and a "think aloud" methodology was conducted with 23 American Indian women. After revising the measurements based on this input, a test-retest was conducted with 79 American Indian women who were randomized to complete either the original measurements or the new, modified measurements. The test-retest revealed that some of the questions performed better for the modified version, whereas others appeared to be more reliable for the original version. The mixed-methods approach was a useful methodology for gathering feedback on survey measurements from American Indian participants and in indicating specific survey questions that needed to be modified for this population. © The Author(s) 2015.

  17. Reliable and Valid Assessment of Point-of-care Ultrasonography

    DEFF Research Database (Denmark)

    Todsen, Tobias; Tolsgaard, Martin Grønnebæk; Olsen, Beth Härstedt

    2015-01-01

    OBJECTIVE: To explore the reliability and validity of the Objective Structured Assessment of Ultrasound Skills (OSAUS) scale for point-of-care ultrasonography (POC US) performance. BACKGROUND: POC US is increasingly used by clinicians and is an essential part of the management of acute surgical conditions. However, the quality of performance is highly operator-dependent. Therefore, reliable and valid assessment of trainees' ultrasonography competence is needed to ensure patient safety. METHODS: Twenty-four physicians, representing novices, intermediates, and experts in POC US, scanned 4 different ... physicians' OSAUS scores with diagnostic accuracy. RESULTS: The generalizability coefficient was high (0.81) and a D-study demonstrated that 1 assessor and 5 cases would result in similar reliability. The construct validity of the OSAUS scale was supported by a significant difference in the mean scores ...

  18. Comparison of sample preparation methods for reliable plutonium and neptunium urinalysis using automatic extraction chromatography

    DEFF Research Database (Denmark)

    Qiao, Jixin; Xu, Yihong; Hou, Xiaolin

    2014-01-01

    This paper describes improvement and comparison of analytical methods for simultaneous determination of trace-level plutonium and neptunium in urine samples by inductively coupled plasma mass spectrometry (ICP-MS). Four sample pre-concentration techniques, including calcium phosphate, iron......), it endows urinalysis methods with better reliability and repeatability compared with co-precipitation techniques. In view of the applicability of different pre-concentration techniques proposed previously in the literature, the main challenge behind relevant method development is pointed to be the release...

  19. Reliability of supply of switchgear for auxiliary low voltage in substations extra high voltage to high voltage

    Directory of Open Access Journals (Sweden)

    Perić Dragoslav M.

    2015-01-01

    Full Text Available Switchgear for auxiliary low voltage in substations (SS) of extra high voltage (EHV) to high voltage (HV) - SS EHV/HV - is of special interest for the functioning of these important SS, as it supplies the protection system and other vital functions of the SS. The article addresses several characteristic examples involving MV lines with varying degrees of supply independence, and the possible application of direct transformation EHV/LV through special voltage transformers. Auxiliary sources such as inverters and diesel generators, which have limited power and expensive energy, are also used to supply the switchgear for auxiliary low voltage. Corresponding reliability indices, including the mean expected annual engagement of diesel generators, are calculated for all examples. The applicability of particular switchgear solutions for auxiliary low voltage in SS EHV/HV is analyzed, taking into account their reliability, feasibility and cost-effectiveness. In particular, the application of direct transformation EHV/LV to supply switchgear for auxiliary low voltage is analyzed for both new and existing SS EHV/HV.

  20. MEMS reliability: coming of age

    Science.gov (United States)

    Douglass, Michael R.

    2008-02-01

    In today's high-volume semiconductor world, one could easily take reliability for granted. As the MOEMS/MEMS industry continues to establish itself as a viable alternative to conventional manufacturing in the macro world, reliability can be of high concern. Currently, there are several emerging market opportunities in which MOEMS/MEMS is gaining a foothold. Markets such as mobile media, consumer electronics, biomedical devices, and homeland security are all showing great interest in microfabricated products. At the same time, these markets are among the most demanding when it comes to reliability assurance. To be successful, each company developing a MOEMS/MEMS device must consider reliability on an equal footing with cost, performance and manufacturability. What can this maturing industry learn from the successful development of DLP technology, air bag accelerometers and inkjet printheads? This paper discusses some basic reliability principles which any MOEMS/MEMS device development must use. Examples from the commercially successful and highly reliable Digital Micromirror Device complement the discussion.

  1. Performance and reliability of the Y-Balance Test™ in high school athletes.

    Science.gov (United States)

    Smith, Laura J; Creps, James R; Bean, Ryan; Rodda, Becky; Alsalaheen, Bara

    2017-11-07

    Lower extremity injuries account for 32.9% of the overall injuries in high school athletes. Previous research has suggested that an asymmetry greater than 4 cm in the anterior direction of the Y-Balance Test™ Lower Quarter (YBT-LQ) is predictive of non-contact injuries in adults and collegiate athletes. The prevalence of asymmetries or abnormal YBT-LQ performance is not well documented for adolescents. The primary purposes of this study are: 1) to characterize the prevalence of YBT-LQ asymmetries and performance in a cross-sectional sample of adolescents, 2) to examine possible differences in performance on the YBT-LQ between male and female adolescents, and 3) to describe the test-retest reliability of the YBT-LQ in a subsample of adolescents. Observational cross-sectional study. High school athletes completed the YBT-LQ as the main outcome measure. 51 male and 59 female high school athletes participated in this study. Asymmetries greater than 4 cm in the posteromedial (PM) reach direction were most prevalent for male (54.9%) and female (50.8%) participants. Females presented with slightly higher composite scores. Good reliability (ICC = 0.89) was found for the anterior (ANT) direction, and moderate reliability for the posterolateral (PL; ICC = 0.76) and PM (ICC = 0.63) directions. The MDC95 for the ANT direction was 6%, and 12% for both the PL and PM directions. YBT-LQ performance can be beneficial in assessing recovery of an injured extremity compared with the other limb. However, due to the large MDC95 noted in the PM and PL directions, differences between sequential tests cannot be attributed to true change in balance unless they exceed the MDC95. In this study, 79% of the athletes presented with at least one asymmetry in YBT-LQ reach distances. Moderate reliability in the PL and PM directions warrants reexamination of the definition of asymmetry in these directions.
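    Two quantities from this abstract are straightforward to reproduce: the asymmetry cut-off and the minimal detectable change. The sketch below uses the standard formulas SEM = SD·sqrt(1 − ICC) and MDC95 = 1.96·SEM·sqrt(2); the ICC and the 4 cm cut-off come from the abstract, while the SD value is hypothetical.

```python
import math

def mdc95(sd, icc):
    # Minimal detectable change at 95% confidence:
    # SEM = SD * sqrt(1 - ICC);  MDC95 = 1.96 * SEM * sqrt(2)
    sem = sd * math.sqrt(1.0 - icc)
    return 1.96 * sem * math.sqrt(2.0)

def is_asymmetric(left_cm, right_cm, threshold_cm=4.0):
    # Flag a reach asymmetry greater than the 4 cm cut-off
    return abs(left_cm - right_cm) > threshold_cm

print(round(mdc95(sd=5.0, icc=0.89), 2))  # → 4.6 (hypothetical SD of 5 cm)
print(is_asymmetric(68.2, 63.5))          # → True
```

This makes the abstract's caution concrete: with ICC = 0.89 and a between-subject SD of 5 cm, differences smaller than about 4.6 cm between sequential tests cannot be distinguished from measurement error.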

  2. Reliability and Validity of the Footprint Assessment Method Using Photoshop CS5 Software in Young People with Down Syndrome.

    Science.gov (United States)

    Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Rey-Abella, Ferran; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam

    2016-05-01

    People with Down syndrome present skeletal abnormalities in their feet that can be analyzed by commonly used gold standard indices (the Hernández-Corvo index, the Chippaux-Smirak index, the Staheli arch index, and the Clarke angle) based on footprint measurements. The use of Photoshop CS5 software (Adobe Systems Software Ireland Ltd, Dublin, Ireland) to measure footprints has been validated in the general population. The present study aimed to assess the reliability and validity of this footprint assessment technique in the population with Down syndrome. Using optical podography and photography, 44 footprints from 22 patients with Down syndrome (11 men [mean ± SD age, 23.82 ± 3.12 years] and 11 women [mean ± SD age, 24.82 ± 6.81 years]) were recorded in a static bipedal standing position. A blinded observer performed the measurements using a validated manual method three times during the 4-month study, with 2 months between measurements. Test-retest was used to check the reliability of the Photoshop CS5 software measurements. Validity and reliability were obtained by intraclass correlation coefficient (ICC). The reliability test for all of the indices showed very good values for the Photoshop CS5 method (ICC, 0.982-0.995). Validity testing also found no differences between the techniques (ICC, 0.988-0.999). The Photoshop CS5 software method is reliable and valid for the study of footprints in young people with Down syndrome.
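    The test-retest agreement reported above rests on the intraclass correlation coefficient. As an illustration, the sketch below computes one common single-measure form, ICC(3,1); the abstract does not state which ICC variant was used, and the footprint-index values are hypothetical.

```python
import numpy as np

def icc_3_1(ratings):
    """Two-way mixed, consistency, single-measure ICC(3,1).

    `ratings` is an (n subjects x k sessions) array. This is one common ICC
    form, chosen for illustration; the study's exact variant is not stated.
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)
    col_means = ratings.mean(axis=0)
    ss_rows = k * np.sum((row_means - grand) ** 2)   # between-subject
    ss_cols = n * np.sum((col_means - grand) ** 2)   # between-session
    ss_total = np.sum((ratings - grand) ** 2)
    ss_err = ss_total - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Three repeated footprint-index measurements per subject (hypothetical)
data = [[27.1, 27.3, 27.0],
        [31.4, 31.5, 31.6],
        [24.8, 24.7, 24.9],
        [29.0, 29.2, 29.1]]
icc = icc_3_1(data)
```

Values near 1, like the 0.982-0.995 range reported above, mean the between-subject variance dwarfs the measurement error across repeated sessions.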

  3. N- versus O-alkylation: utilizing NMR methods to establish reliable primary structure determinations for drug discovery.

    Science.gov (United States)

    LaPlante, Steven R; Bilodeau, François; Aubry, Norman; Gillard, James R; O'Meara, Jeff; Coulombe, René

    2013-08-15

    A classic synthetic issue that remains unresolved is the control of N- versus O-alkylation of ambident anions. This common chemical transformation is important for medicinal chemists, who require predictable and reliable protocols for the rapid synthesis of inhibitors. Uncertainty about whether the product(s) are N- and/or O-alkylated is common and can be costly if unresolved. Herein, we report an NMR-based strategy that focuses on distinguishing inhibitors and intermediates that are N- or O-alkylated. The NMR strategy involves three independent and complementary methods; any combination of two of the methods can be reliable if the third is compromised by resonance overlap or other issues. The timely nature of these methods (HSQC/HMQC, HMBC, ROESY, and (13)C shift predictions) allows for contemporaneous determination of regioselective alkylation as needed during the optimization of synthetic routes. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Reliability of CRBR primary piping: critique of stress-strength overlap method for cold-leg inlet downcomer

    International Nuclear Information System (INIS)

    Bari, R.A.; Buslik, A.J.; Papazoglou, I.A.

    1976-04-01

    A critique is presented of the strength-stress overlap method for the reliability of the CRBR primary heat transport system piping. The report addresses, in particular, the reliability assessment of WARD-D-0127 (Piping Integrity Status Report), which is part of the CRBR PSAR docket. It was found that the reliability assessment is extremely sensitive to the assumed shape of the probability density function for the strength (regarded as a random variable) of the cold-leg inlet downcomer section of the primary piping. Based on the rigorous Chebyshev inequality, it is shown that the piping failure probability is less than 10^-2. On the other hand, it is shown that the failure probability can be much larger than approximately 10^-13, the typical value put forth in WARD-D-0127.
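    The stress-strength overlap idea, and the distributional sensitivity the critique highlights, can be illustrated numerically. The sketch below computes the normal-normal interference failure probability and a distribution-free one-sided Chebyshev (Cantelli) bound on the same probability; the means and standard deviations are hypothetical, not CRBR values.

```python
import math

def normal_interference_pf(mu_s, sd_s, mu_l, sd_l):
    # P(load > strength) for independent normal strength S and load L:
    # beta = (mu_s - mu_l) / sqrt(sd_s^2 + sd_l^2);  Pf = Phi(-beta)
    beta = (mu_s - mu_l) / math.hypot(sd_s, sd_l)
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

def chebyshev_pf_bound(mu_s, sd_s, mu_l, sd_l):
    # One-sided Chebyshev (Cantelli) bound on P(M <= 0) for the
    # margin M = S - L, valid for ANY distribution with these moments
    mu_m = mu_s - mu_l
    var_m = sd_s ** 2 + sd_l ** 2
    return var_m / (var_m + mu_m ** 2)

pf = normal_interference_pf(mu_s=500.0, sd_s=40.0, mu_l=300.0, sd_l=30.0)
bound = chebyshev_pf_bound(500.0, 40.0, 300.0, 30.0)
```

The gap between the two numbers (about 3e-5 under the normal assumption versus about 6e-2 distribution-free) is exactly the point of the critique: the answer depends strongly on the assumed shape of the strength density, and only the Chebyshev-style bound holds without that assumption.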

  5. Reliability evaluation of non-reparable three-state systems using Markov model and its comparison with the UGF and the recursive methods

    International Nuclear Information System (INIS)

    Pourkarim Guilani, Pedram; Sharifi, Mani; Niaki, S.T.A.; Zaretalab, Arash

    2014-01-01

    In multi-state system (MSS) reliability problems, it is assumed that the components of each subsystem have different performance rates with certain probabilities. This leads to extensive computational effort when the commonly employed universal generating function (UGF) and the recursive algorithm are used to obtain the reliability of systems consisting of a large number of components. This research deals with evaluating the reliability of non-repairable three-state systems and proposes a novel method based on a Markov process, for which an appropriate state definition is provided. It is shown that solving the derived differential equations significantly reduces the computational time compared to the UGF and the recursive algorithm. - Highlights: • Reliability evaluation of non-repairable three-state systems is the aim. • A novel method based on a Markov process is proposed. • An appropriate state definition is provided. • Computational time is significantly less compared to the UGF and the recursive methods
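    The Markov differential equations for a single non-repairable three-state component admit a closed-form solution, which can be sketched as follows. The state definition here (working/degraded/failed with constant transition rates) is a generic one rather than the paper's exact formulation, and the rates are hypothetical.

```python
import math

def three_state_probs(t, lam_wd, lam_wf, lam_df):
    """State probabilities of a non-repairable three-state component.

    Transitions: Working -> Degraded (rate lam_wd), Working -> Failed
    (lam_wf), Degraded -> Failed (lam_df). Closed-form solution of
    dP_W/dt = -(lam_wd + lam_wf) P_W and dP_D/dt = lam_wd P_W - lam_df P_D.
    """
    a = lam_wd + lam_wf
    p_w = math.exp(-a * t)
    p_d = lam_wd * (math.exp(-lam_df * t) - math.exp(-a * t)) / (a - lam_df)
    p_f = 1.0 - p_w - p_d
    return p_w, p_d, p_f

# Hypothetical constant rates (per hour), evaluated at t = 1000 h
probs = three_state_probs(t=1000.0, lam_wd=1e-4, lam_wf=2e-5, lam_df=5e-4)
```

Having the solution in closed form (or from a cheap ODE solve for larger state spaces) is what lets the Markov approach avoid the combinatorial cost of the UGF and recursive methods as the component count grows.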

  6. Classifier Fusion With Contextual Reliability Evaluation.

    Science.gov (United States)

    Liu, Zhunga; Pan, Quan; Dezert, Jean; Han, Jun-Wei; He, You

    2018-05-01

    Classifier fusion is an efficient strategy to improve classification performance for complex pattern recognition problems. In practice, the multiple classifiers to be combined can have different reliabilities, and proper reliability evaluation plays an important role in the fusion process for getting the best classification performance. We propose a new method for classifier fusion with contextual reliability evaluation (CF-CRE) based on inner reliability and relative reliability concepts. The inner reliability, represented by a matrix, characterizes the probability of the object belonging to one class when it is classified to another class. The elements of this matrix are estimated from the k-nearest neighbors of the object. A cautious discounting rule is developed under the belief functions framework to revise the classification result according to the inner reliability. The relative reliability is evaluated based on a new incompatibility measure which allows the level of conflict between the classifiers to be reduced by applying the classical evidence discounting rule to each classifier before their combination. The inner reliability and relative reliability capture different aspects of the classification reliability. The discounted classification results are combined with Dempster-Shafer's rule for the final class decision making support. The performance of CF-CRE has been evaluated and compared with those of the main classical fusion methods using real data sets. The experimental results show that CF-CRE can produce substantially higher accuracy than other fusion methods in general. Moreover, CF-CRE is robust to changes in the number of nearest neighbors chosen for estimating the reliability matrix, which is appealing for applications.
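    The discount-then-combine step described above can be sketched on a two-class frame. The sketch below implements only classical Shafer discounting and Dempster's rule; the inner/relative reliability estimation from nearest neighbors is omitted, and the mass values and reliability factors are hypothetical.

```python
def discount(m, alpha):
    """Shafer's discounting of a mass function by reliability factor alpha.

    `m` maps frozenset hypotheses to masses over Theta = {'a', 'b'}:
    m'(A) = alpha * m(A) for A != Theta; the remaining mass goes to Theta.
    """
    theta = frozenset({'a', 'b'})
    out = {A: alpha * v for A, v in m.items() if A != theta}
    out[theta] = 1.0 - sum(out.values())
    return out

def dempster(m1, m2):
    # Dempster's rule: conjunctive combination with conflict normalization
    combined, conflict = {}, 0.0
    for A, v1 in m1.items():
        for B, v2 in m2.items():
            inter = A & B
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

A, B, TH = frozenset({'a'}), frozenset({'b'}), frozenset({'a', 'b'})
m1 = discount({A: 0.9, TH: 0.1}, alpha=0.8)  # classifier 1, reliability 0.8
m2 = discount({B: 0.6, TH: 0.4}, alpha=0.5)  # classifier 2, reliability 0.5
fused = dempster(m1, m2)
```

Discounting the less reliable classifier before combination shifts its mass toward ignorance (Theta), so its conflicting verdict pulls the fused decision less, which is the mechanism the relative-reliability step exploits.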

  7. Probabilistic risk assessment course documentation. Volume 3. System reliability and analysis techniques, Session A - reliability

    International Nuclear Information System (INIS)

    Lofgren, E.V.

    1985-08-01

    This course in System Reliability and Analysis Techniques focuses on the quantitative estimation of reliability at the systems level. Various methods are reviewed, but the structure provided by the fault tree method is used as the basis for system reliability estimates. The principles of fault tree analysis are briefly reviewed. Contributors to system unreliability and unavailability are reviewed, models are given for quantitative evaluation, and the requirements for both generic and plant-specific data are discussed. Also covered are issues of quantifying component faults that relate to the systems context in which the components are embedded. All reliability terms are carefully defined. 44 figs., 22 tabs
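    The fault-tree-based quantification described in the course can be illustrated with a minimal cut set calculation. The sketch below is a generic illustration, not the course's own material; the tree structure, cut sets and basic-event probabilities are hypothetical, and basic events are assumed independent.

```python
from itertools import combinations

def top_event_prob(cut_sets, p):
    """Exact top-event probability by inclusion-exclusion over minimal
    cut sets of independent basic events; `p` maps event name -> probability."""
    total = 0.0
    for r in range(1, len(cut_sets) + 1):
        sign = (-1) ** (r + 1)
        for combo in combinations(cut_sets, r):
            events = set().union(*combo)
            prob = 1.0
            for e in events:           # all events in the union must occur
                prob *= p[e]
            total += sign * prob
    return total

def rare_event_approx(cut_sets, p):
    # First-order (rare-event) approximation: sum of cut-set probabilities
    total = 0.0
    for cs in cut_sets:
        prob = 1.0
        for e in cs:
            prob *= p[e]
        total += prob
    return total

# Hypothetical tree: TOP = (A and B) or (A and C) or D
cuts = [{'A', 'B'}, {'A', 'C'}, {'D'}]
p = {'A': 1e-2, 'B': 5e-3, 'C': 2e-3, 'D': 1e-4}
exact = top_event_prob(cuts, p)
approx = rare_event_approx(cuts, p)
```

For low basic-event probabilities the rare-event sum barely overestimates the exact result, which is why it is the standard first cut in system unavailability estimates.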

  8. An evaluation of the multi-state node networks reliability using the traditional binary-state networks reliability algorithm

    International Nuclear Information System (INIS)

    Yeh, W.-C.

    2003-01-01

    A system where the components and the system itself are allowed to have a number of performance levels is called a multi-state system (MSS). A multi-state node network (MNN) is a generalization of the MSS that does not satisfy the flow conservation law. Evaluating MNN reliability arises at the design and exploitation stage of many types of technical systems. Up to now, the known existing methods can only evaluate the reliability of a special MNN, called the multi-state node acyclic network (MNAN), in which no cycles are allowed. However, no method exists for evaluating the general MNN reliability. The main purpose of this article is to show first that MNN reliability can be solved using any traditional binary-state network (TBSN) reliability algorithm with a special code for the state probability. A simple heuristic SDP algorithm based on minimal cuts (MC) for estimating the MNN reliability is presented as an example to show how a TBSN reliability algorithm is revised to solve the MNN reliability problem. To the author's knowledge, this study is the first to discuss the relationships between MNN and TBSN and also the first to present methods to solve the exact and approximate MNN reliability. An example is illustrated to show how the exact MNN reliability is obtained using the proposed algorithm.
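    The reduction above presupposes an exact TBSN solver to hand. For small networks, a brute-force one is easy to sketch: enumerate all 2^m edge states and sum the probabilities of the states in which the terminals are connected. The bridge network and edge reliabilities below are hypothetical, and the sketch handles only the binary-state case, not the multi-state encoding the article develops.

```python
from itertools import product

def connected(up_edges, source, sink):
    # Simple undirected reachability over the working edges
    reach = {source}
    changed = True
    while changed:
        changed = False
        for u, v in up_edges:
            if u in reach and v not in reach:
                reach.add(v); changed = True
            if v in reach and u not in reach:
                reach.add(u); changed = True
    return sink in reach

def two_terminal_reliability(edges, p, source, sink):
    """Exact two-terminal reliability of a binary-state network by
    enumerating all 2^m edge states (feasible only for small m)."""
    total = 0.0
    for states in product([0, 1], repeat=len(edges)):
        prob = 1.0
        up = []
        for edge, s in zip(edges, states):
            prob *= p[edge] if s else 1.0 - p[edge]
            if s:
                up.append(edge)
        if connected(up, source, sink):
            total += prob
    return total

# Classic bridge network, all edge reliabilities 0.9 (hypothetical)
edges = [('s', 'a'), ('s', 'b'), ('a', 'b'), ('a', 't'), ('b', 't')]
p = {e: 0.9 for e in edges}
r = two_terminal_reliability(edges, p, 's', 't')
```

Exhaustive enumeration is exponential in the edge count, which is precisely why SDP-style algorithms over minimal cuts, like the one the article revises, matter for realistic networks.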

  9. Reliability Estimation for Digital Instrument/Control System

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yaguang; Sydnor, Russell [U.S. Nuclear Regulatory Commission, Washington, D.C. (United States)

    2011-08-15

    Digital instrumentation and controls (DI and C) systems are widely adopted in various industries because of their flexibility and ability to implement various functions that can be used to automatically monitor, analyze, and control complicated systems. It is anticipated that DI and C will replace the traditional analog instrumentation and controls (AI and C) systems in all future nuclear reactor designs. There is an increasing interest in reliability and risk analyses for safety critical DI and C systems in regulatory organizations, such as the United States Nuclear Regulatory Commission. Developing reliability models and reliability estimation methods for digital reactor control and protection systems will involve every part of the DI and C system, such as sensors, signal conditioning and processing components, transmission lines and digital communication systems, D/A and A/D converters, the computer system, signal processing software, control and protection software, the power supply system, and actuators. Some of these components are hardware, such as sensors and actuators; their failure mechanisms are well understood, and the traditional reliability models and estimation methods can be directly applied. But many of these components are firmware, which has software embedded in the hardware, and software needs special consideration because its failure mechanism is unique and the reliability estimation method for a software system differs from the ones used for hardware systems. In this paper, we propose a reliability estimation method for the entire DI and C system that combines a recently developed software reliability estimation method with a traditional hardware reliability estimation method.

  10. Reliability Estimation for Digital Instrument/Control System

    International Nuclear Information System (INIS)

    Yang, Yaguang; Sydnor, Russell

    2011-01-01

    Digital instrumentation and controls (DI and C) systems are widely adopted in various industries because of their flexibility and ability to implement various functions that can be used to automatically monitor, analyze, and control complicated systems. It is anticipated that DI and C will replace the traditional analog instrumentation and controls (AI and C) systems in all future nuclear reactor designs. There is an increasing interest in reliability and risk analyses for safety critical DI and C systems in regulatory organizations, such as the United States Nuclear Regulatory Commission. Developing reliability models and reliability estimation methods for digital reactor control and protection systems will involve every part of the DI and C system, such as sensors, signal conditioning and processing components, transmission lines and digital communication systems, D/A and A/D converters, the computer system, signal processing software, control and protection software, the power supply system, and actuators. Some of these components are hardware, such as sensors and actuators; their failure mechanisms are well understood, and the traditional reliability models and estimation methods can be directly applied. But many of these components are firmware, which has software embedded in the hardware, and software needs special consideration because its failure mechanism is unique and the reliability estimation method for a software system differs from the ones used for hardware systems. In this paper, we propose a reliability estimation method for the entire DI and C system that combines a recently developed software reliability estimation method with a traditional hardware reliability estimation method.

  11. Standard high-reliability integrated circuit logic packaging. [for deep space tracking stations

    Science.gov (United States)

    Slaughter, D. W.

    1977-01-01

    A family of standard, high-reliability hardware used for packaging digital integrated circuits is described. The design transition from early prototypes to production hardware is covered, and future plans are discussed. Interconnection techniques are described, as well as connectors and related hardware available at both the microcircuit-packaging and main-frame levels. General application information is also provided.

  12. Reliability analysis of software based safety functions

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1993-05-01

    This report describes the methods applicable to the reliability analysis of software-based safety functions. Although the safety functions also include other components, the main emphasis is on the reliability analysis of software. Checklist-type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as software fault tree analysis. Safety analysis based on Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that recent software reliability models do not directly produce the estimates needed in PSA. Some recommendations and conclusions are drawn from the study: the need for formal methods in the analysis and development of software-based systems, the applicability of qualitative reliability engineering methods in connection with PSA, and the need to make the requirements for software-based systems and their analyses in the regulatory guides more precise. (orig.). (46 refs., 13 figs., 1 tab.)

  13. On reliable discovery of molecular signatures

    Directory of Open Access Journals (Sweden)

    Björkegren Johan

    2009-01-01

    Background: Molecular signatures are sets of genes, proteins, genetic variants or other variables that can be used as markers for a particular phenotype. Reliable signature discovery methods could yield valuable insight into cell biology and mechanisms of human disease. However, it is currently not clear how to control error rates such as the false discovery rate (FDR) in signature discovery. Moreover, signatures for cancer gene expression have been shown to be unstable, that is, difficult to replicate in independent studies, casting doubt on their reliability. Results: We demonstrate that with modern prediction methods, signatures that yield accurate predictions may still have a high FDR. Further, we show that even signatures with low FDR may fail to replicate in independent studies due to limited statistical power. Thus, neither stability nor predictive accuracy is relevant when FDR control is the primary goal. We therefore develop a general statistical hypothesis-testing framework that, for the first time, provides FDR control for signature discovery. Our method is demonstrated to be correct in simulation studies. When applied to five cancer data sets, the method was able to discover molecular signatures with 5% FDR in three cases, while two data sets yielded no significant findings. Conclusion: Our approach enables reliable discovery of molecular signatures from genome-wide data with current sample sizes. The statistical framework developed herein is potentially applicable to a wide range of prediction problems in bioinformatics.
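The paper's own testing framework is not detailed in the abstract; for background, the standard way to control FDR over a list of per-variable p-values is the Benjamini-Hochberg step-up procedure, sketched below (this illustrates FDR control generally, not the authors' specific method).

```python
def benjamini_hochberg(p_values, alpha=0.05):
    """Return sorted indices of hypotheses rejected at FDR level alpha."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    k = 0  # largest rank whose sorted p-value passes the BH threshold
    for rank, i in enumerate(order, start=1):
        if p_values[i] <= rank * alpha / m:
            k = rank
    # Reject all hypotheses up to and including rank k.
    return sorted(order[:k])

# Toy per-gene p-values: the first two survive FDR control at 5%.
print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.27, 0.6]))
```

Note the step-up character: every hypothesis ranked at or below the largest passing rank is rejected, even if its own p-value misses its threshold.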

  14. Designing high availability systems DFSS and classical reliability techniques with practical real life examples

    CERN Document Server

    Taylor, Zachary

    2014-01-01

    A practical, step-by-step guide to designing world-class, high-availability systems using both classical and DFSS reliability techniques. Whether designing telecom, aerospace, automotive, medical, financial, or public-safety systems, every engineer aims for the utmost reliability and availability in the systems he or she designs. But between the dream of world-class performance and reality falls the shadow of complexities that can bedevil even the most rigorous design process. While there is an array of robust predictive engineering tools, there has been no single-source guide to understand…

  15. Design of preventive maintenance system using the reliability engineering and maintenance value stream mapping methods in PT. XYZ

    Science.gov (United States)

    Sembiring, N.; Panjaitan, N.; Angelita, S.

    2018-02-01

    PT. XYZ is a company owned by a non-governmental organization and engaged in processing rubber into crumb rubber. Production is supported by a set of interacting machines and equipment intended to achieve optimal productivity. The machine types used in the production process are the Conveyor Breaker, Breaker, Rolling Pin, Hammer Mill, Mill Roll, Conveyor, Shredder Crumb, and Dryer. The maintenance system at PT. XYZ is corrective maintenance, i.e. repairing or replacing engine components only after a breakdown occurs. Replacing components correctively causes the machine to stop operating while production is in progress. The result is lost production time while the operator replaces the damaged components. This lost production time means that production targets are not reached and leads to high loss costs: the cost for all components is Rp. 4.088.514.505, a very high figure just for maintaining a Mill Roll machine. PT. XYZ therefore needs preventive maintenance, i.e. scheduling the replacement of engine components and improving maintenance efficiency. The methods used are Reliability Engineering and Maintenance Value Stream Mapping (MVSM). The data needed in this research are the time intervals between component failures, opportunity cost, labor cost, component cost, corrective repair time, preventive repair time, Mean Time To Opportunity (MTTO), Mean Time To Repair (MTTR), and Mean Time To Yield (MTTY). In this research, the critical components of the Mill Roll machine are the Spier, Bushing, Bearing, Coupling, and Roll. The damage distribution, reliability, MTTF, cost of failure, cost of prevention, current state map, and future state map are determined so that the replacement time for each critical component with the lowest maintenance cost and a Standard Operation Procedure (SOP) can be developed…

  16. Reliability of Volumetry and Perimetry to Assess Knee Volume.

    Science.gov (United States)

    Nunes, Guilherme S; Yamashitafuji, Igor; Wageck, Bruna; Teixeira, Guilherme Garcia; Karloh, Manuela; de Noronha, Marcos

    2016-08-24

    The treatment of edema after a knee injury is usually one of the main objectives during rehabilitation. To assess the success of treatment, two methods are commonly used in clinical practice: volumetry and perimetry. Objective: to investigate the intra- and interassessor reliability of volumetry and perimetry for assessing knee volume. Design: cross-sectional. Setting: laboratory. Participants: 45 healthy participants (26 women) with a mean age of 22.4 ± 2.8 y. Knee volume was assessed by 3 assessors (A, B, and C) with 3 methods (lower-limb volumetry [LLV], knee volumetry [KV], and knee perimetry [KP]). Assessor A was the most experienced assessor, and assessor C the least experienced. LLV and KV were performed with participants in the orthostatic position, while KP was performed with participants supine. For the interassessor analysis, the ICC2,1 was high (.82) for KV and very high for LLV (.99) and KP (.99). For the intra-assessor analysis, ICC2,1 ranged from moderate to high for KV (.69-.83) and was very high for LLV (.99) and KP (.97-.99). KV, LLV, and KP are reliable methods, both intra- and interassessor, for measuring knee volume.
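The ICC2,1 values reported above come from a two-way random-effects ANOVA; a minimal sketch of that computation on a subjects-by-raters score matrix (the numbers below are toy data, not the study's):

```python
def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    `scores` is a list of rows, one per subject, one column per rater."""
    n = len(scores)          # subjects
    k = len(scores[0])       # raters
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(scores[i][j] for i in range(n)) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between raters
    ss_total = sum((scores[i][j] - grand) ** 2
                   for i in range(n) for j in range(k))
    ss_err = ss_total - ss_rows - ss_cols                    # residual
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Three raters in near-perfect agreement on four knees (toy numbers).
print(round(icc_2_1([[10, 10, 11], [14, 15, 15], [20, 19, 20], [8, 8, 9]]), 3))
```

High between-subject variance with small rater disagreement is exactly what drives ICC toward 1, as in the study's LLV and KP results.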

  17. A novel method for monitoring high-risk breast cancer with tumor markers

    DEFF Research Database (Denmark)

    Sölétormos, G; Nielsen, D; Schiøler, V

    1993-01-01

    BACKGROUND: An early and reliable diagnosis of metastatic spread has increased interest in serum tumor markers. This study investigated the ability of CA 15.3, CEA, and TPA to identify, predict, and exclude metastases in bone/viscera during adjuvant treatment and follow-up of high-risk breast cancer. METHODS: Ninety females with high-risk breast cancer were included in the study. Response evaluation was based upon clinical examination, x-rays or histology and elaborated marker criteria. RESULTS: During the marker monitoring period, metastases in four patients were confined to skin or lymph…

  18. A penalty guided stochastic fractal search approach for system reliability optimization

    International Nuclear Information System (INIS)

    Mellal, Mohamed Arezki; Zio, Enrico

    2016-01-01

    Modern industry requires components and systems with high reliability levels. In this paper, we address the system reliability optimization problem. A penalty-guided stochastic fractal search approach is developed for solving reliability allocation, redundancy allocation, and reliability–redundancy allocation problems. Numerical results for ten case studies are presented as benchmark problems, highlighting the superiority of the proposed approach compared to others from the literature. - Highlights: • System reliability optimization is investigated. • A penalty guided stochastic fractal search approach is developed. • Results of ten case studies are compared with previously published methods. • Performance of the approach is demonstrated.
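The stochastic fractal search itself is not reproduced here, but the objective it optimizes can be sketched: the classic series-parallel reliability-redundancy allocation objective with a static penalty for constraint violations (the constraint form and all numbers are illustrative assumptions).

```python
def system_reliability(r, n):
    """Series system of subsystems, subsystem i having n[i] redundant
    parallel units, each of component reliability r[i]."""
    rs = 1.0
    for ri, ni in zip(r, n):
        rs *= 1.0 - (1.0 - ri) ** ni
    return rs

def penalized_objective(r, n, constraints, penalty=1e3):
    """Maximize reliability minus a penalty for each violated constraint.
    Each constraint is a callable g(r, n) that is <= 0 when satisfied."""
    violation = sum(max(0.0, g(r, n)) for g in constraints)
    return system_reliability(r, n) - penalty * violation

# Illustration: two subsystems under a crude (hypothetical) cost cap.
cost = lambda r, n: (10 * n[0] + 12 * n[1]) - 60  # <= 0 means within budget
print(penalized_objective([0.9, 0.85], [2, 3], [cost]))
```

A search algorithm (fractal or otherwise) then explores (r, n) pairs; the penalty steers it back inside the feasible region.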

  19. Mechanical Integrity Issues at MCM-Cs for High Reliability Applications

    International Nuclear Information System (INIS)

    Morgenstern, H.A.; Tarbutton, T.J.; Becka, G.A.; Uribe, F.; Monroe, S.; Burchett, S.

    1998-01-01

    During the qualification of a new high reliability low-temperature cofired ceramic (LTCC) multichip module (MCM), two issues relating to the electrical and mechanical integrity of the LTCC network were encountered while performing qualification testing. One was electrical opens after aging tests that were caused by cracks in the solder joints. The other was fracturing of the LTCC networks during mechanical testing. Through failure analysis, computer modeling, bend testing, and test samples, changes were identified. Upon implementation of all these changes, the modules passed testing, and the MCM was placed into production

  20. ASSESSING AND COMBINING RELIABILITY OF PROTEIN INTERACTION SOURCES

    Science.gov (United States)

    LEACH, SONIA; GABOW, AARON; HUNTER, LAWRENCE; GOLDBERG, DEBRA S.

    2008-01-01

    Integrating diverse sources of interaction information to create protein networks requires strategies sensitive to differences in accuracy and coverage of each source. Previous integration approaches calculate reliabilities of protein interaction information sources based on congruity to a designated ‘gold standard.’ In this paper, we provide a comparison of the two most popular existing approaches and propose a novel alternative for assessing reliabilities which does not require a gold standard. We identify a new method for combining the resultant reliabilities and compare it against an existing method. Further, we propose an extrinsic approach to evaluation of reliability estimates, considering their influence on the downstream tasks of inferring protein function and learning regulatory networks from expression data. Results using this evaluation method show 1) our method for reliability estimation is an attractive alternative to those requiring a gold standard and 2) the new method for combining reliabilities is less sensitive to noise in reliability assignments than the similar existing technique. PMID:17990508
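The abstract does not spell out its combination rule; one common way to combine independent evidence sources, each with an estimated reliability, is a noisy-OR, sketched here as a generic baseline rather than the paper's method.

```python
def noisy_or(reliabilities):
    """Confidence that an interaction is real, given independent sources
    that each report it; reliabilities[i] is source i's estimated precision."""
    p_all_wrong = 1.0
    for rel in reliabilities:
        p_all_wrong *= 1.0 - rel  # all sources would have to be wrong
    return 1.0 - p_all_wrong

# An interaction reported by three sources of differing quality.
print(round(noisy_or([0.5, 0.7, 0.9]), 4))
```

The noisy-OR's weakness, noise sensitivity in the individual reliability assignments, is precisely the property the paper evaluates when comparing combination methods.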

  1. OSS reliability measurement and assessment

    CERN Document Server

    Yamada, Shigeru

    2016-01-01

    This book analyses quantitative open source software (OSS) reliability assessment and its applications, focusing on three major topic areas: the Fundamentals of OSS Quality/Reliability Measurement and Assessment; the Practical Applications of OSS Reliability Modelling; and Recent Developments in OSS Reliability Modelling. Offering an ideal reference guide for graduate students and researchers in reliability for open source software (OSS) and modelling, the book introduces several methods of reliability assessment for OSS including component-oriented reliability analysis based on analytic hierarchy process (AHP), analytic network process (ANP), and non-homogeneous Poisson process (NHPP) models, the stochastic differential equation models and hazard rate models. These measurement and management technologies are essential to producing and maintaining quality/reliable systems using OSS.

  2. Aerospace reliability applied to biomedicine.

    Science.gov (United States)

    Lalli, V. R.; Vargo, D. J.

    1972-01-01

    An analysis is presented that indicates that the reliability and quality assurance methodology selected by NASA to minimize failures in aerospace equipment can be applied directly to biomedical devices to improve hospital equipment reliability. The Space Electric Rocket Test project is used as an example of NASA application of reliability and quality assurance (R&QA) methods. By analogy a comparison is made to show how these same methods can be used in the development of transducers, instrumentation, and complex systems for use in medicine.

  3. Prediction method of long-term reliability in improving residual stresses by means of surface finishing

    International Nuclear Information System (INIS)

    Sera, Takehiko; Hirano, Shinro; Chigusa, Naoki; Okano, Shigetaka; Saida, Kazuyoshi; Mochizuki, Masahito; Nishimoto, Kazutoshi

    2012-01-01

    Surface finishing methods, such as Water Jet Peening (WJP), have been applied to welds in some major components of nuclear power plants as a countermeasure to Primary Water Stress Corrosion Cracking (PWSCC). In addition, surface finishing by buffing is being standardized, and buffing has thus also become recognized as a well-established method of improving residual stress. On the other hand, the long-term stability of peening techniques has been confirmed by accelerated tests. However, the effectiveness of stress improvement by surface treatment is limited to thin layers, and the effect of the complicated residual stress distribution in the weld metal beneath the surface is not strictly taken into account for long-term stability. This paper therefore describes accelerated tests which confirmed that the long-term stability of a layer subjected to buffing is equal to that of a layer subjected to WJP. The long-term reliability of the very thin stress-improved layer was also confirmed through a trial evaluation by thermal elastic-plastic creep analysis, even when the effect of the complicated residual stress distribution in the weld metal was overestimated. Considering the above findings, an approach is proposed for constructing a prediction method for the long-term reliability of stress improvement by surface finishing. (author)

  4. Pneumothorax size measurements on digital chest radiographs: Intra- and inter- rater reliability.

    Science.gov (United States)

    Thelle, Andreas; Gjerdevik, Miriam; Grydeland, Thomas; Skorge, Trude D; Wentzel-Larsen, Tore; Bakke, Per S

    2015-10-01

    Detailed and reliable methods may be important for discussions on the importance of pneumothorax size in clinical decision-making. Rhea's method is widely used to estimate pneumothorax size, in percent, from three measurement points on chest X-rays (CXRs); Choi's addendum is used for anteroposterior projections. The aim of this study was to examine the intrarater and interrater reliability of the Rhea and Choi method using digital CXRs on ward-based PACS monitors. Three physicians examined a retrospective series of 80 digital CXRs showing pneumothorax using Rhea and Choi's method, and repeated the measurements in a random order two weeks later. We used the analysis-of-variance technique of Eliasziw et al. to assess intrarater and interrater reliability across altogether 480 estimations of pneumothorax size. Estimated pneumothorax sizes ranged between 5% and 100%. The intrarater reliability coefficient was 0.98 (95% one-sided lower-limit confidence interval 0.96), and the interrater reliability coefficient was 0.95 (95% one-sided lower-limit confidence interval 0.93). This study has shown that the Rhea and Choi method for calculating pneumothorax size has high intrarater and interrater reliability. These results are valid across gender, side of pneumothorax, and whether the patient is diagnosed with primary or secondary pneumothorax. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  5. On the reliability of finite element solutions

    International Nuclear Information System (INIS)

    Prasad, K.S.R.K.

    1975-01-01

    The extent of reliability of the finite element method for the analysis of nuclear reactor structures, and of reactor vessels in particular, and the need for the engineer to guard against the pitfalls that may arise from both physical and mathematical models are highlighted. A systematic way of checking the model to obtain reasonably accurate solutions is presented. Quite often, sophisticated elements are suggested for specific design and stress concentration problems. The desirability or otherwise of these elements, and their scope and utility vis-a-vis the use of a large stack of conventional elements, are discussed from the viewpoint of the stress analyst. Methods of checking the reliability of finite element solutions, either through modelling changes or through an extrapolation technique, are discussed. (author)

  6. Finite element reliability analysis of fatigue life

    International Nuclear Information System (INIS)

    Harkness, H.H.; Belytschko, T.; Liu, W.K.

    1992-01-01

    Fatigue reliability is addressed by the first-order reliability method combined with a finite element method. Two-dimensional finite element models of components with cracks in mode I are considered with crack growth treated by the Paris law. Probability density functions of the variables affecting fatigue are proposed to reflect a setting where nondestructive evaluation is used, and the Rosenblatt transformation is employed to treat non-Gaussian random variables. Comparisons of the first-order reliability results and Monte Carlo simulations suggest that the accuracy of the first-order reliability method is quite good in this setting. Results show that the upper portion of the initial crack length probability density function is crucial to reliability, which suggests that if nondestructive evaluation is used, the probability of detection curve plays a key role in reliability. (orig.)
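The Paris-law crack growth underlying the reliability model can be made concrete with a short numerical integration of da/dN = C·(ΔK)^m with ΔK = Y·Δσ·√(πa). The constants C, m, Y and the stress range below are illustrative values (roughly steel-like, in MPa·√m units), not the paper's data.

```python
import math

def cycles_to_failure(a0, ac, dS, C=1e-11, m=3.0, Y=1.12, steps=10000):
    """Cycles to grow a mode-I crack from a0 to ac (metres), midpoint rule.
    dS is the stress range in MPa; C is in m/cycle per (MPa*sqrt(m))^m."""
    n = 0.0
    da = (ac - a0) / steps
    a = a0
    for _ in range(steps):
        dK = Y * dS * math.sqrt(math.pi * (a + da / 2))  # stress intensity range
        n += da / (C * dK ** m)                          # dN = da / (da/dN)
        a += da
    return n

print(f"{cycles_to_failure(a0=0.001, ac=0.02, dS=100.0):.3e}")
```

Because the integrand is largest at small a, the initial crack length dominates the life estimate, which is why the abstract finds the upper tail of the initial-crack-length distribution (and hence the probability-of-detection curve) crucial to reliability.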

  7. Time-dependent reliability sensitivity analysis of motion mechanisms

    International Nuclear Information System (INIS)

    Wei, Pengfei; Song, Jingwen; Lu, Zhenzhou; Yue, Zhufeng

    2016-01-01

    Reliability sensitivity analysis aims at identifying the source of structure/mechanism failure, and quantifying the effects of each random source or their distribution parameters on failure probability or reliability. In this paper, time-dependent parametric reliability sensitivity (PRS) analysis as well as global reliability sensitivity (GRS) analysis is introduced for motion mechanisms. The PRS indices are defined as the partial derivatives of the time-dependent reliability w.r.t. the distribution parameters of each random input variable, and they quantify the effect of a small change in each distribution parameter on the time-dependent reliability. The GRS indices are defined for quantifying the individual, interaction and total contributions of the uncertainty in each random input variable to the time-dependent reliability. The envelope function method, combined with a first-order approximation of the motion error function, is introduced for efficiently estimating the time-dependent PRS and GRS indices. Both the time-dependent PRS and GRS analysis techniques can be especially useful for reliability-based design. The significance of the proposed methods, as well as the effectiveness of the envelope function method for estimating the time-dependent PRS and GRS indices, is demonstrated with a four-bar mechanism and a car rack-and-pinion steering linkage. - Highlights: • Time-dependent parametric reliability sensitivity analysis is presented. • Time-dependent global reliability sensitivity analysis is presented for mechanisms. • The proposed method is especially useful for enhancing the kinematic reliability. • An envelope method is introduced for efficiently implementing the proposed methods. • The proposed method is demonstrated by two real planar mechanisms.

  8. High inter-tester reliability of the new mobility score in patients with hip fracture

    DEFF Research Database (Denmark)

    Kristensen, M.T.; Bandholm, T.; Foss, N.B.

    2008-01-01

    OBJECTIVE: To assess the inter-tester reliability of the New Mobility Score in patients with acute hip fracture. DESIGN: An inter-tester reliability study. SUBJECTS: Forty-eight consecutive patients with acute hip fracture at a median age of 84 (interquartile range, 76-89) years; 40 admitted from their own home and 8 from nursing homes to an acute orthopaedic hip fracture unit at a university hospital. METHODS: The New Mobility Score, which evaluates the prefracture functional level with a score from 0 (not able to walk at all) to 9 (fully independent), was assessed by 2 independent physiotherapists… the prefracture functional level in patients with acute hip fracture.

  9. Identification of Black Spots Based on Reliability Approach

    Directory of Open Access Journals (Sweden)

    Ahmadreza Ghaffari

    2013-12-01

    Identifying crash "black spots", "hot spots" or "high-risk" locations is one of the most important and prevalent concerns in traffic safety, and various methods have been devised and presented for this purpose. In this paper, a new method based on reliability analysis is presented for identifying black spots. Reliability analysis provides an ordered framework for considering the probabilistic nature of engineering problems, so crashes, with their probabilistic nature, can be treated within it. In this study, the application of the new method was compared with the commonly implemented Frequency and Empirical Bayesian methods using simulated data. The results indicated that the traditional methods can lead to inconsistent predictions because they neglect the variance of the number of crashes at each site and depend only on the mean of the data.
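The reliability-based method itself is not detailed in the abstract, but the Empirical Bayes estimate it is compared against can be sketched. It shrinks the observed crash count toward the prediction of a safety performance function; the weight form and all numbers below are illustrative assumptions (negative-binomial dispersion phi).

```python
def eb_estimate(observed, mu, phi):
    """Empirical Bayes expected crash frequency for one site.
    mu:  safety performance function prediction for similar sites
    phi: negative-binomial dispersion parameter of that model
    The weight w -> 1 when the model is trusted (mu/phi small)."""
    w = 1.0 / (1.0 + mu / phi)
    return w * mu + (1.0 - w) * observed

# A site with 9 observed crashes where similar sites average 4 per period.
print(round(eb_estimate(observed=9, mu=4.0, phi=2.0), 3))
```

Ranking sites by this shrunken estimate, rather than by raw counts, is what protects the traditional method against regression-to-the-mean; the paper's criticism is that it still works only through the mean.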

  10. Quasi-Optical Network Analyzers and High-Reliability RF MEMS Switched Capacitors

    Science.gov (United States)

    Grichener, Alexander

    The thesis first presents a 2-port quasi-optical scalar network analyzer consisting of a transmitter and receiver, both built in planar technology. The network analyzer is based on a Schottky-diode mixer integrated inside a planar antenna and fed differentially by a CPW transmission line. The antenna is placed on an extended hemispherical high-resistivity silicon substrate lens. The LO signal is swept from 3-5 GHz, and high-order harmonic mixing in both up- and down-conversion modes is used to realize the 15-50 GHz RF bandwidth. The network analyzer achieved a dynamic range of greater than 40 dB and was successfully used to measure a frequency-selective surface with a second-order bandpass response. Furthermore, the system was built with circuits and components that scale easily to millimeter-wave frequencies, which is the primary motivation for this work. The application areas for a millimeter and submillimeter-wave network analyzer include material characterization and art diagnostics. The second project presents several RF MEMS switched capacitors designed for high-reliability operation and suitable for tunable filters and reconfigurable networks. The first switched capacitor resulted in a digital capacitance ratio of 5 and an analog capacitance ratio of 5-9. The analog tuning of the down-state capacitance is enhanced by a positive vertical stress gradient in the beam, making it ideal for applications that require precision tuning. A thick electroplated beam resulted in Q greater than 100 at C- to X-band frequencies, and power handling of 0.6-1.1 W. The design also minimized charging in the dielectric, resulting in excellent reliability even under hot-switched and high-power (1 W) conditions. The second switched capacitor was designed without any dielectric to minimize charging. The device was hot-switched at 1 W of RF power for greater than 11 billion cycles with virtually no change in the C-V curve. The final project presents a 7-channel…

  11. Measurements of gamma (γ)-emitting radionuclides with a high-purity germanium detector: the methods and reliability of our environmental assessments on the Fukushima 1 Nuclear Power Plant accident.

    Science.gov (United States)

    Mimura, Tetsuro; Mimura, Mari; Komiyama, Chiyo; Miyamoto, Masaaki; Kitamura, Akira

    2014-01-01

    The severe accident at the Fukushima 1 Nuclear Power Plant, caused by the Tohoku Region Pacific Coast Earthquake of 11 March 2011, led to wide contamination and pollution by radionuclides in Fukushima and surrounding prefectures. In the current JPR symposium, a group of plant scientists attempted to examine the impact of the radioactive contamination on wild and cultivated plants. Measurements of gamma (γ) radiation from radionuclides in the "Fukushima samples", as we called the samples collected from natural and agricultural areas in Fukushima prefecture, were mostly done with a high-purity Ge detector at the Graduate School of Maritime Sciences, Kobe University. In this technical note, we describe the methods of sample preparation and the measurement of the samples' radioactivity, and discuss the reliability of our data with regard to the International Atomic Energy Agency (IAEA) interlaboratory comparisons and proficiency test (IAEA proficiency test).

  12. A hybrid approach to quantify software reliability in nuclear safety systems

    International Nuclear Information System (INIS)

    Arun Babu, P.; Senthil Kumar, C.; Murali, N.

    2012-01-01

    Highlights: ► A novel method to quantify software reliability using software verification and mutation testing in nuclear safety systems. ► Contributing factors that influence the software reliability estimate. ► Approach to help regulators verify the reliability of safety-critical software systems during the software licensing process. -- Abstract: Technological advancements have led to the use of computer-based systems in safety-critical applications. As computer-based systems are being introduced in nuclear power plants, effective and efficient methods are needed to ensure dependability and compliance with the high reliability requirements of systems important to safety. Even after several years of research, quantification of software reliability remains a controversial and unresolved issue. Also, existing approaches have assumptions and limitations which are not acceptable for safety applications. This paper proposes a theoretical approach combining software verification and mutation testing to quantify software reliability in nuclear safety systems. The theoretical results obtained suggest that software reliability depends on three factors: the test adequacy, the amount of software verification carried out, and the reusability of verified code in the software. The proposed approach may help regulators in licensing computer-based safety systems in nuclear reactors.

  13. Designing Glass Panels for Economy and Reliability

    Science.gov (United States)

    Moore, D. M.

    1983-01-01

    Analytical method determines probability of failure of rectangular glass plates subjected to uniformly distributed loads such as those from wind, earthquake, snow, and deadweight. Developed as aid in design of protective glass covers for solar-cell arrays and solar collectors, method is also useful in estimating the reliability of large windows in buildings exposed to high winds and is adapted to nonlinear stress analysis of simply supported plates of any elastic material.
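A two-parameter Weibull (weakest-link) form is the standard way to express probability of failure for brittle glass under load; the sketch below uses that form with illustrative scale and shape values, not the report's design data.

```python
import math

def failure_probability(stress_mpa, scale_mpa=70.0, shape=7.0):
    """Pf = 1 - exp(-(sigma/sigma0)^m): two-parameter Weibull weakest-link
    probability of failure at a given peak tensile stress (illustrative
    sigma0 = 70 MPa, m = 7; real glass design uses measured parameters
    and integrates over the stressed area)."""
    return 1.0 - math.exp(-((stress_mpa / scale_mpa) ** shape))

# Failure probability rises steeply with stress for large shape parameter m.
for s in (30.0, 50.0, 70.0):
    print(s, round(failure_probability(s), 4))
```

At the scale stress sigma0 the failure probability is 1 - 1/e by definition; design loads are chosen so that Pf stays orders of magnitude below that.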

  14. A study on a reliability assessment methodology for the VHTR safety systems

    International Nuclear Information System (INIS)

    Lee, Hyung Sok

    2012-02-01

    The passive safety system of a 300 MWt VHTR (Very High Temperature Reactor), which has attracted worldwide attention recently, is actively being considered for improving the safety of next-generation nuclear power plant designs. Passive system functionality does not rely on an external electrical support system, but on intelligent use of natural phenomena such as convection, conduction, radiation, and gravity. It is not easy to evaluate quantitatively the reliability of passive safety for risk analysis alongside existing active system failures, since classical reliability assessment methods are not applicable. Therefore, a new reliability methodology needs to be developed, and in this study it is applied to evaluate the reliability of the conceptually designed VHTR. A preliminary evaluation and conceptualization are performed using the load-and-capacity concept of the reliability physics model. The response surface method (RSM) is also utilized to evaluate the maximum nuclear fuel temperature. The significant variables and their correlations are considered using the GAMMA+ code. The proposed method may contribute to the design of new passive systems for the VHTR.

  15. Applying the High Reliability Health Care Maturity Model to Assess Hospital Performance: A VA Case Study.

    Science.gov (United States)

    Sullivan, Jennifer L; Rivard, Peter E; Shin, Marlena H; Rosen, Amy K

    2016-09-01

    The lack of a tool for categorizing and differentiating hospitals according to their high reliability organization (HRO)-related characteristics has hindered progress toward implementing and sustaining evidence-based HRO practices. Hospitals would benefit both from an understanding of the organizational characteristics that support HRO practices and from knowledge about the steps necessary to achieve HRO status to reduce the risk of harm and improve outcomes. The High Reliability Health Care Maturity (HRHCM) model, a model for health care organizations' achievement of high reliability with zero patient harm, incorporates three major domains critical for promoting HROs-Leadership, Safety Culture, and Robust Process Improvement ®. A study was conducted to examine the content validity of the HRHCM model and evaluate whether it can differentiate hospitals' maturity levels for each of the model's components. Staff perceptions of patient safety at six US Department of Veterans Affairs (VA) hospitals were examined to determine whether all 14 HRHCM components were present and to characterize each hospital's level of organizational maturity. Twelve of the 14 components from the HRHCM model were detected; two additional characteristics emerged that are present in the HRO literature but not represented in the model-teamwork culture and system-focused tools for learning and improvement. Each hospital's level of organizational maturity could be characterized for 9 of the 14 components. The findings suggest the HRHCM model has good content validity and that there is differentiation between hospitals on model components. Additional research is needed to understand how these components can be used to build the infrastructure necessary for reaching high reliability.

  16. Self-Tuning Method for Increased Obstacle Detection Reliability Based on Internet of Things LiDAR Sensor Models.

    Science.gov (United States)

    Castaño, Fernando; Beruvides, Gerardo; Villalonga, Alberto; Haber, Rodolfo E

    2018-05-10

    On-chip LiDAR sensors for vehicle collision avoidance are a rapidly expanding area of research and development. The assessment of reliable obstacle detection using data collected by LiDAR sensors has become a key issue that the scientific community is actively exploring. The design of a self-tuning methodology and its implementation are presented in this paper, to maximize the reliability of a LiDAR sensor network for obstacle detection in Internet of Things (IoT) mobility scenarios. To that end, the Webots Automobile 3D simulation tool is selected for emulating sensor interaction in complex driving environments. Furthermore, a model-based framework is defined that employs a point-cloud clustering technique and an error-based prediction model library composed of a multilayer perceptron neural network, a k-nearest neighbors model, and a linear regression model. Finally, a reinforcement learning technique, specifically a Q-learning method, is implemented to determine the number of LiDAR sensors required to increase sensor reliability for obstacle localization tasks. In addition, an IoT driving-assistance user scenario connecting a network of five LiDAR sensors is designed and implemented to validate the accuracy of the computational intelligence-based framework. The results demonstrated that the self-tuning method is an appropriate strategy to increase the reliability of the sensor network while minimizing detection thresholds.
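    As a hedged illustration of the sensor-count selection step, the toy single-state Q-learning loop below picks how many LiDAR sensors to activate under an invented reliability-versus-cost reward. The reward model, hyperparameters, and action set are assumptions for this sketch, not the authors' Webots-based environment.

```python
import random

# Toy single-state Q-learning sketch: learn how many LiDAR sensors to
# activate. The reward model is invented for illustration only.
ACTIONS = [1, 2, 3, 4, 5]            # candidate sensor counts
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2

def reward(n):
    # Assumed trade-off: detection reliability saturates with sensor
    # count, while each extra sensor adds cost (analytic maximum at n=4).
    return (1.0 - 0.5 ** n) - 0.05 * n

random.seed(0)
q = {a: 0.0 for a in ACTIONS}        # one state, so a flat Q-table

for _ in range(5000):
    # epsilon-greedy exploration over the action set
    a = random.choice(ACTIONS) if random.random() < EPSILON else max(q, key=q.get)
    # single-state update: the successor state is the same state
    q[a] += ALPHA * (reward(a) + GAMMA * max(q.values()) - q[a])

best = max(q, key=q.get)
print("learned sensor count:", best)
```

    Because the invented reward peaks at four sensors, a converged run settles on that count; the paper's framework instead learns it from simulated detection errors in the IoT scenario.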

  17. Component reliability analysis for development of component reliability DB of Korean standard NPPs

    International Nuclear Information System (INIS)

    Choi, S. Y.; Han, S. H.; Kim, S. H.

    2002-01-01

    Reliability data for Korean NPPs that reflect plant-specific characteristics are necessary for PSA and risk-informed applications. We have carried out a project to develop the component reliability DB and to calculate component reliability measures such as failure rate and unavailability. We collected component operation data and failure/repair data for Korean standard NPPs, and analyzed the failure data with a data analysis method developed to suit the domestic data situation. We then compared the resulting reliability values with generic data for foreign NPPs.
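    At the point-estimate level, the failure-rate and unavailability figures such a component reliability DB stores are simple ratios. A minimal sketch with invented numbers, assuming a constant failure rate for a repairable component (not the project's actual estimation procedure):

```python
def failure_rate(n_failures, operating_hours):
    """Maximum-likelihood point estimate of a constant failure rate (per hour)."""
    return n_failures / operating_hours

def unavailability(rate_per_hour, mean_repair_hours):
    """Steady-state unavailability of a repairable component: MTTR / (MTTF + MTTR)."""
    mttf = 1.0 / rate_per_hour
    return mean_repair_hours / (mttf + mean_repair_hours)

# Illustrative numbers: 3 failures over 150,000 component-hours, 24 h mean repair
lam = failure_rate(3, 150_000)
u = unavailability(lam, 24.0)
print(f"failure rate {lam:.2e} /h, unavailability {u:.2e}")
```

    Real DB entries would add uncertainty bounds (e.g. Bayesian updating of the failure rate), which the point estimates above omit.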

  18. Trends of HVDC technology - highly reliable converting equipment

    International Nuclear Information System (INIS)

    Muraoka, Yasuo; Kato, Yasushi; Watanabe, Atsumi; Kano, Takashi; Kawai, Tadao

    1983-01-01

    At present, DC power transmission in Japan is used in practice for system interconnections of relatively small capacity, and operational results have shown the reliability of the AC-DC converting system to exceed the world level. However, for future application of this system to large-capacity trunk power transmission, it is desirable to raise the reliability of converting equipment further and to develop stabilized control techniques in harmony with the connected AC system. Hitachi Ltd. has developed diversified system-related technologies centering on DC power transmission, together with techniques for raising the reliability of converting equipment as capacities grow. This report describes those results and future prospects: the recent trend of DC power transmission; developments in DC power transmission technology such as simulation analysis, stable operation of a DC system connected to a weak AC system, and DC independent transmission from nuclear power stations; and the development of directly light-triggered thyristor valves and control and protection equipment. (Kako, I.)

  19. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    Science.gov (United States)

    Bavuso, S. J.

    1984-01-01

    A long-term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10-year period, is a combined analytic and simulative technique. The analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. The simulative component is the Gate Logic Software Simulator capability, or GLOSS. The numerous factors that can degrade system reliability are presented, along with the ways in which those factors peculiar to highly reliable fault-tolerant systems are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  20. Reliability-based design optimization using a generalized subset simulation method and posterior approximation

    Science.gov (United States)

    Ma, Yuan-Zhuo; Li, Hong-Shuang; Yao, Wei-Xing

    2018-05-01

    The evaluation of the probabilistic constraints in reliability-based design optimization (RBDO) problems has always been significant and challenging work, strongly affecting the performance of RBDO methods. This article deals with RBDO problems using a recently developed generalized subset simulation (GSS) method and a posterior approximation approach. The posterior approximation approach transforms all the probabilistic constraints into ordinary constraints, as in deterministic optimization. The multiple failure probabilities required by the posterior approximation approach are assessed by GSS in a single run at all supporting points, which are selected by an experimental design scheme combining Sobol' sequences and Bucher's design. Subsequently, the transformed deterministic design optimization problem can be solved by standard optimization algorithms, for example, the sequential quadratic programming method. Three optimization problems are used to demonstrate the efficiency and accuracy of the proposed method.
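    For context, a probabilistic constraint of the form P[g(X) <= 0] <= p_t can be checked by plain Monte Carlo, the baseline that subset-simulation methods such as GSS are designed to beat for rare events. The limit-state function and distributions below are invented for this sketch:

```python
import math
import random

# Plain Monte Carlo estimate of a failure probability P[g(X) <= 0].
# Hypothetical limit state: failure when X1 + X2 >= 3, with X1, X2 ~ N(0, 1).
random.seed(1)

def g(x1, x2):
    return 3.0 - (x1 + x2)          # failure region is g <= 0

N = 200_000
failures = sum(1 for _ in range(N)
               if g(random.gauss(0, 1), random.gauss(0, 1)) <= 0)
p_hat = failures / N

# Exact value for comparison: X1 + X2 ~ N(0, 2), so
# P[X1 + X2 >= 3] = 0.5 * erfc(3 / (sqrt(2) * sqrt(2)))
p_exact = 0.5 * math.erfc(3.0 / 2.0)
print(f"estimate {p_hat:.5f}, exact {p_exact:.5f}")
```

    Plain Monte Carlo needs on the order of 10/p samples to resolve a failure probability p, which is precisely what makes GSS-style methods attractive when p is small.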