WorldWideScience

Sample records for reliable experimental approach

  1. Development of a Conservative Model Validation Approach for Reliable Analysis

    Science.gov (United States)

    2015-01-01

CIE 2015 August 2-5, 2015, Boston, Massachusetts, USA [DRAFT] DETC2015-46982 DEVELOPMENT OF A CONSERVATIVE MODEL VALIDATION APPROACH FOR RELIABLE ... obtain a conservative simulation model for reliable design even with limited experimental data. Very little research has taken into account the ... 3, the proposed conservative model validation is briefly compared to the conventional model validation approach. Section 4 describes how to account

  2. Improving Wind Turbine Drivetrain Reliability Using a Combined Experimental, Computational, and Analytical Approach

    Energy Technology Data Exchange (ETDEWEB)

    Guo, Y.; van Dam, J.; Bergua, R.; Jove, J.; Campbell, J.

    2015-03-01

    Nontorque loads induced by the wind turbine rotor overhang weight and aerodynamic forces can greatly affect drivetrain loads and responses. If not addressed properly, these loads can result in a decrease in gearbox component life. This work uses analytical modeling, computational modeling, and experimental data to evaluate a unique drivetrain design that minimizes the effects of nontorque loads on gearbox reliability: the Pure Torque(R) drivetrain developed by Alstom. The drivetrain has a hub-support configuration that transmits nontorque loads directly into the tower rather than through the gearbox as in other design approaches. An analytical model of Alstom's Pure Torque drivetrain provides insight into the relationships among turbine component weights, aerodynamic forces, and the resulting drivetrain loads. Main shaft bending loads are orders of magnitude lower than the rated torque and are hardly affected by wind conditions and turbine operations.

  3. Inverse Reliability Task: Artificial Neural Networks and Reliability-Based Optimization Approaches

    OpenAIRE

Lehký, David; Slowik, Ondřej; Novák, Drahomír

    2014-01-01

Part 7: Genetic Algorithms; International audience; The paper presents two alternative approaches to solving the inverse reliability task – determining the design parameters that achieve desired target reliabilities. The first approach is based on artificial neural networks and small-sample simulation by Latin hypercube sampling. The second approach treats the inverse reliability task as a reliability-based optimization task using the double-loop method, again with small-sample simulation. Efficie...
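
The Latin hypercube step that the first approach leans on can be sketched in a few lines (a minimal, stdlib-only illustration; the function name and the 10-point, 2-dimensional design are hypothetical, not taken from the paper):

```python
import random

def latin_hypercube(n_samples, n_dims, rng=random.Random(0)):
    """Latin hypercube sample on the unit cube: each dimension is cut
    into n_samples equal strata, exactly one point lands in each
    stratum, and strata are randomly paired across dimensions."""
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        # One random point per stratum, then shuffle the strata order.
        strata = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(strata)
        for i in range(n_samples):
            samples[i][d] = strata[i]
    return samples

design = latin_hypercube(10, 2)   # 10 training points in 2 dimensions
```

Because every stratum of every dimension receives exactly one point, even a small design covers the input range evenly, which is what makes small-sample simulation viable.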

  4. Approach to reliability assessment

    International Nuclear Information System (INIS)

    Green, A.E.; Bourne, A.J.

    1975-01-01

Experience has shown that reliability assessments can play an important role in the early design and subsequent operation of technological systems where reliability is at a premium. The approaches to and techniques for such assessments, which have been outlined in the paper, have been successfully applied in a variety of applications ranging from individual equipment to large and complex systems. The general approach involves the logical and systematic establishment of the purpose, performance requirements and reliability criteria of systems. This is followed by an appraisal of likely system achievement based on the understanding of different types of variational behavior. A fundamental reliability model emerges from the correlation between the appropriate Q and H functions for performance requirement and achievement. This model may cover the complete spectrum of performance behavior in all the system dimensions.

  5. New Approaches to Reliability Assessment

    DEFF Research Database (Denmark)

    Ma, Ke; Wang, Huai; Blaabjerg, Frede

    2016-01-01

    of energy. New approaches for reliability assessment are being taken in the design phase of power electronics systems based on the physics-of-failure in components. In this approach, many new methods, such as multidisciplinary simulation tools, strength testing of components, translation of mission profiles......, and statistical analysis, are involved to enable better prediction and design of reliability for products. This article gives an overview of the new design flow in the reliability engineering of power electronics from the system-level point of view and discusses some of the emerging needs for the technology...

  6. A reliability program approach to operational safety

    International Nuclear Information System (INIS)

    Mueller, C.J.; Bezella, W.A.

    1985-01-01

A Reliability Program (RP) model based on proven reliability techniques is being formulated for potential application in the nuclear power industry. Methods employed under NASA and military direction and in commercial airline and related FAA programs were surveyed, and a review of current risk-dominant nuclear issues was conducted. The need is demonstrated for a reliability approach that addresses dependent system failures, operating and emergency procedures, and human performance, and that develops a plant-specific performance data base for safety decision making. Current research has concentrated on developing a Reliability Program approach for the operating phase of a nuclear plant's lifecycle. The approach incorporates performance monitoring and evaluation activities with dedicated tasks that integrate these activities with operation, surveillance, and maintenance of the plant. The detection, root-cause evaluation, and before-the-fact correction of incipient or actual system failures as a mechanism for maintaining plant safety is a major objective of the Reliability Program. (orig./HP)

  7. Reliability analysis - systematic approach based on limited data

    International Nuclear Information System (INIS)

    Bourne, A.J.

    1975-11-01

    The initial approaches required for reliability analysis are outlined. These approaches highlight the system boundaries, examine the conditions under which the system is required to operate, and define the overall performance requirements. The discussion is illustrated by a simple example of an automatic protective system for a nuclear reactor. It is then shown how the initial approach leads to a method of defining the system, establishing performance parameters of interest and determining the general form of reliability models to be used. The overall system model and the availability of reliability data at the system level are next examined. An iterative process is then described whereby the reliability model and data requirements are systematically refined at progressively lower hierarchic levels of the system. At each stage, the approach is illustrated with examples from the protective system previously described. The main advantages of the approach put forward are the systematic process of analysis, the concentration of assessment effort in the critical areas and the maximum use of limited reliability data. (author)

  8. Reliability of infarct volumetry: Its relevance and the improvement by a software-assisted approach.

    Science.gov (United States)

    Friedländer, Felix; Bohmann, Ferdinand; Brunkhorst, Max; Chae, Ju-Hee; Devraj, Kavi; Köhler, Yvette; Kraft, Peter; Kuhn, Hannah; Lucaciu, Alexandra; Luger, Sebastian; Pfeilschifter, Waltraud; Sadler, Rebecca; Liesz, Arthur; Scholtyschik, Karolina; Stolz, Leonie; Vutukuri, Rajkumar; Brunkhorst, Robert

    2017-08-01

    Despite the efficacy of neuroprotective approaches in animal models of stroke, their translation has so far failed from bench to bedside. One reason is presumed to be a low quality of preclinical study design, leading to bias and a low a priori power. In this study, we propose that the key read-out of experimental stroke studies, the volume of the ischemic damage as commonly measured by free-handed planimetry of TTC-stained brain sections, is subject to an unrecognized low inter-rater and test-retest reliability with strong implications for statistical power and bias. As an alternative approach, we suggest a simple, open-source, software-assisted method, taking advantage of automatic-thresholding techniques. The validity and the improvement of reliability by an automated method to tMCAO infarct volumetry are demonstrated. In addition, we show the probable consequences of increased reliability for precision, p-values, effect inflation, and power calculation, exemplified by a systematic analysis of experimental stroke studies published in the year 2015. Our study reveals an underappreciated quality problem in translational stroke research and suggests that software-assisted infarct volumetry might help to improve reproducibility and therefore the robustness of bench to bedside translation.
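
The automatic-thresholding idea behind software-assisted volumetry can be illustrated with Otsu's classic method (a generic sketch assuming a roughly bimodal intensity histogram; the toy pixel values are invented, and this is not the authors' tool):

```python
def otsu_threshold(values, bins=256):
    """Otsu's automatic threshold: pick the cut that maximizes the
    between-class variance of the two resulting intensity groups."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins or 1.0
    hist = [0] * bins
    for v in values:
        hist[min(int((v - lo) / width), bins - 1)] += 1
    total = len(values)
    centers = [lo + (i + 0.5) * width for i in range(bins)]
    sum_all = sum(c * h for c, h in zip(centers, hist))
    w0 = sum0 = 0.0
    best_var, best_t = -1.0, lo
    for i, h in enumerate(hist):
        w0 += h
        if w0 == 0 or w0 == total:
            continue
        sum0 += centers[i] * h
        m0 = sum0 / w0                      # mean below the cut
        m1 = (sum_all - sum0) / (total - w0)  # mean above the cut
        between = w0 * (total - w0) * (m0 - m1) ** 2
        if between > best_var:
            best_var, best_t = between, lo + (i + 1) * width
    return best_t

# Toy "section": dark background around 20, bright lesion around 200.
pixels = [20, 22, 19, 21, 18, 200, 205, 198, 202, 199]
t = otsu_threshold(pixels)
lesion_pixels = [p for p in pixels if p >= t]
```

Unlike free-handed planimetry, the threshold is a deterministic function of the data, so two raters running the same software on the same section get the same result.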

  9. Reliability-based optimal structural design by the decoupling approach

    International Nuclear Information System (INIS)

    Royset, J.O.; Der Kiureghian, A.; Polak, E.

    2001-01-01

    A decoupling approach for solving optimal structural design problems involving reliability terms in the objective function, the constraint set or both is discussed and extended. The approach employs a reformulation of each problem, in which reliability terms are replaced by deterministic functions. The reformulated problems can be solved by existing semi-infinite optimization algorithms and computational reliability methods. It is shown that the reformulated problems produce solutions that are identical to those of the original problems when the limit-state functions defining the reliability problem are affine. For nonaffine limit-state functions, approximate solutions are obtained by solving series of reformulated problems. An important advantage of the approach is that the required reliability and optimization calculations are completely decoupled, thus allowing flexibility in the choice of the optimization algorithm and the reliability computation method

  10. Different Reliability Assessment Approaches for Wave Energy Converters

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kramer, Morten Mejlhede; Sørensen, John Dalsgaard

    2015-01-01

Reliability assessments are important for wave energy converters (WECs) because accessibility might be limited in case of failure and maintenance. These failure rates can be adapted by reliability considerations. There are two different approaches to how reliability can...

  11. Efficient approach for reliability-based optimization based on weighted importance sampling approach

    International Nuclear Information System (INIS)

    Yuan, Xiukai; Lu, Zhenzhou

    2014-01-01

An efficient methodology is presented to perform reliability-based optimization (RBO). It is based on an efficient weighted approach for constructing an approximation of the failure probability as an explicit function of the design variables, referred to as the ‘failure probability function (FPF)’. The FPF is expressed as a weighted sum of sample values obtained in the simulation-based reliability analysis. The computational effort required for decoupling in each iteration is just a single reliability analysis. After the approximation of the FPF is established, the target RBO problem can be decoupled into a deterministic one. Meanwhile, the proposed weighted approach is combined with a decoupling approach and a sequential approximate optimization framework. Engineering examples are given to demonstrate the efficiency and accuracy of the presented methodology.
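
The importance-sampling ingredient of such a failure probability estimate can be sketched on a one-dimensional toy problem (an assumed standard-normal input and a linear limit state; this illustrates the sample-weighting idea only, not the paper's FPF construction):

```python
import math
import random

def normal_pdf(x, mu=0.0):
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

def failure_prob_is(limit_state, d, n=20000, rng=random.Random(1)):
    """Importance-sampling estimate of P[g(X) < 0] for X ~ N(0, 1).
    Samples are drawn from N(d, 1), centered near the failure region,
    and each failing sample is re-weighted by the density ratio."""
    total = 0.0
    for _ in range(n):
        x = rng.gauss(d, 1.0)
        if limit_state(x) < 0:
            total += normal_pdf(x) / normal_pdf(x, mu=d)
    return total / n

# Linear limit state g(x) = d - x with design threshold d = 3:
# the exact failure probability is 1 - Phi(3), about 1.35e-3.
pf = failure_prob_is(lambda x: 3.0 - x, d=3.0)
```

Shifting the sampling density toward the failure region is what makes rare failure probabilities estimable with a modest sample size; crude Monte Carlo would see only a handful of failures here.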

  12. Assessing high reliability via Bayesian approach and accelerated tests

    International Nuclear Information System (INIS)

    Erto, Pasquale; Giorgio, Massimiliano

    2002-01-01

Sometimes the assessment of very high reliability levels is difficult for the following main reasons: - the high reliability level of each item makes it impossible to obtain, in a reasonably short time, a sufficient number of failures; - the high cost of the high-reliability items to be submitted to life tests makes it infeasible to collect enough data for 'classical' statistical analyses. In this context, the paper presents a Bayesian solution to the problem of estimating the parameters of the Weibull-inverse power law model on the basis of a limited number (say six) of life tests, carried out at different stress levels, all higher than the normal one. The over-stressed (i.e. accelerated) tests allow the use of experimental data obtained in a reasonably short time. The Bayesian approach enables one to reduce the required number of failures by adding to the failure information the available a priori engineering knowledge. This involvement of engineers conforms to advanced management policy, which aims at securing everyone's commitment in order to obtain total quality. A Monte Carlo study of the non-asymptotic properties of the proposed estimators and a comparison with the properties of maximum likelihood estimators conclude the work.
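
A drastically simplified version of the idea can be written down by swapping the Weibull-inverse power law model for exponential lifetimes with a conjugate Gamma prior and a known acceleration exponent (all data, prior parameters, and the exponent below are hypothetical, chosen only to show the mechanics):

```python
def posterior_gamma(alpha0, beta0, times, stresses, s0, p):
    """Conjugate update for exponential lifetimes with an inverse power
    law acceleration factor (s / s0) ** p (exponent p assumed known).
    Prior: failure rate lambda0 ~ Gamma(alpha0, beta0) at use stress s0."""
    equiv_time = sum(t * (s / s0) ** p for t, s in zip(times, stresses))
    return alpha0 + len(times), beta0 + equiv_time

def reliability(alpha, beta, mission_time):
    """Posterior-predictive reliability E[exp(-lambda0 * t)]
    for lambda0 ~ Gamma(alpha, beta)."""
    return (beta / (beta + mission_time)) ** alpha

# Six hypothetical over-stressed tests: hours to failure, stress ratios.
times = [120.0, 95.0, 60.0, 45.0, 30.0, 22.0]
stresses = [1.5, 1.5, 2.0, 2.0, 2.5, 2.5]
alpha, beta = posterior_gamma(2.0, 2000.0, times, stresses, s0=1.0, p=3.0)
r_1000h = reliability(alpha, beta, mission_time=1000.0)
```

The prior (alpha0, beta0) plays the role of the engineers' a priori knowledge: it acts like alpha0 pseudo-failures observed over beta0 equivalent hours, so only six real failures are needed to sharpen the estimate.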

  13. Approach to developing reliable space reactor power systems

    International Nuclear Information System (INIS)

    Mondt, J.F.; Shinbrot, C.H.

    1991-01-01

    The Space Reactor Power System Project is in the engineering development phase of a three-phase program. During Phase II, the Engineering Development Phase, the SP-100 Project has defined and is pursuing a new approach to developing reliable power systems. The approach to developing such a system during the early technology phase is described in this paper along with some preliminary examples to help explain the approach. Developing reliable components to meet space reactor power system requirements is based on a top down systems approach which includes a point design based on a detailed technical specification of a 100 kW power system

  14. Soft computing approach for reliability optimization: State-of-the-art survey

    International Nuclear Information System (INIS)

    Gen, Mitsuo; Yun, Young Su

    2006-01-01

In the broadest sense, reliability is a measure of the performance of systems. As systems have grown more complex, the consequences of their unreliable behavior have become severe in terms of cost, effort, lives, etc., and the interest in assessing system reliability and the need for improving the reliability of products and systems have become very important. Most solution methods for reliability optimization assume that systems have redundancy components in series and/or parallel systems and that alternative designs are available. Reliability optimization problems concentrate on optimal allocation of redundancy components and optimal selection of alternative designs to meet system requirements. In the past two decades, numerous reliability optimization techniques have been proposed. Generally, these techniques can be classified as linear programming, dynamic programming, integer programming, geometric programming, heuristic methods, the Lagrangean multiplier method and so on. A Genetic Algorithm (GA), as a soft computing approach, is a powerful tool for solving various reliability optimization problems. In this paper, we briefly survey GA-based approaches to various reliability optimization problems, such as reliability optimization of redundant systems, reliability optimization with alternative design, reliability optimization with time-dependent reliability, reliability optimization with interval coefficients, bicriteria reliability optimization, and reliability optimization with fuzzy goals. We also introduce hybrid approaches that combine GA with fuzzy logic, neural networks and other conventional search techniques. Finally, we present experiments on various reliability optimization problems using the hybrid GA approach.
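
A toy GA for the redundancy allocation problem mentioned above might look as follows (a generic sketch, not any surveyed algorithm; the component reliabilities, costs, and budget are invented, and at least two subsystems are assumed):

```python
import random

def ga_redundancy(rels, costs, budget, pop=40, gens=60, rng=random.Random(7)):
    """Toy genetic algorithm for redundancy allocation: choose the number
    of parallel copies (1..5) of each subsystem's component to maximize
    series-system reliability prod(1 - (1 - r) ** n) under a cost budget."""
    k = len(rels)   # number of subsystems (assumed >= 2)

    def fitness(n):
        rel = 1.0
        for r, ni in zip(rels, n):
            rel *= 1.0 - (1.0 - r) ** ni
        cost = sum(c * ni for c, ni in zip(costs, n))
        # Multiplicative penalty when the budget is exceeded.
        return rel if cost <= budget else rel * budget / cost

    popn = [[rng.randint(1, 5) for _ in range(k)] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=fitness, reverse=True)
        elite = popn[: pop // 2]                 # elitist selection
        children = []
        while len(children) < pop - len(elite):
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, k)            # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:               # mutation
                child[rng.randrange(k)] = rng.randint(1, 5)
            children.append(child)
        popn = elite + children
    best = max(popn, key=fitness)
    return best, fitness(best)

best, fit = ga_redundancy(rels=[0.8, 0.9, 0.7], costs=[2.0, 3.0, 1.0],
                          budget=15.0)
```

The penalty term is the usual way to keep a GA working on a constrained problem: infeasible allocations are not discarded, merely demoted, so the search can cross infeasible regions toward better feasible ones.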

  15. A Latent Class Approach to Estimating Test-Score Reliability

    Science.gov (United States)

    van der Ark, L. Andries; van der Palm, Daniel W.; Sijtsma, Klaas

    2011-01-01

    This study presents a general framework for single-administration reliability methods, such as Cronbach's alpha, Guttman's lambda-2, and method MS. This general framework was used to derive a new approach to estimating test-score reliability by means of the unrestricted latent class model. This new approach is the latent class reliability…
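
For contrast with the latent class method, the best-known single-administration coefficient in that framework, Cronbach's alpha, is easy to compute directly (the 5-person, 3-item score matrix below is invented for illustration):

```python
def cronbach_alpha(scores):
    """Cronbach's alpha from a persons-by-items score matrix:
    alpha = k / (k - 1) * (1 - sum of item variances / total variance)."""
    k = len(scores[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [var([row[j] for row in scores]) for j in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical single-administration data: 5 persons, 3 items.
data = [[2, 3, 3], [4, 4, 5], [1, 2, 2], [3, 3, 4], [5, 4, 5]]
alpha = cronbach_alpha(data)
```

Alpha rises when item scores covary strongly relative to their individual variances; methods such as lambda-2 and the latent class approach of the paper aim at less biased estimates of the same test-score reliability.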

  16. A penalty guided stochastic fractal search approach for system reliability optimization

    International Nuclear Information System (INIS)

    Mellal, Mohamed Arezki; Zio, Enrico

    2016-01-01

    Modern industry requires components and systems with high reliability levels. In this paper, we address the system reliability optimization problem. A penalty guided stochastic fractal search approach is developed for solving reliability allocation, redundancy allocation, and reliability–redundancy allocation problems. Numerical results of ten case studies are presented as benchmark problems for highlighting the superiority of the proposed approach compared to others from literature. - Highlights: • System reliability optimization is investigated. • A penalty guided stochastic fractal search approach is developed. • Results of ten case studies are compared with previously published methods. • Performance of the approach is demonstrated.

  17. Relevance and reliability of experimental data in human health risk assessment of pesticides.

    Science.gov (United States)

    Kaltenhäuser, Johanna; Kneuer, Carsten; Marx-Stoelting, Philip; Niemann, Lars; Schubert, Jens; Stein, Bernd; Solecki, Roland

    2017-08-01

Evaluation of data relevance, reliability and contribution to uncertainty is crucial in regulatory health risk assessment if robust conclusions are to be drawn. Whether a specific study is used as a key study, as additional information or not accepted depends in part on the criteria according to which its relevance and reliability are judged. In addition to GLP-compliant regulatory studies following OECD Test Guidelines, data from peer-reviewed scientific literature have to be evaluated in regulatory risk assessment of pesticide active substances. Publications should be taken into account if they are of acceptable relevance and reliability. Their contribution to the overall weight of evidence is influenced by factors including test organism, study design and statistical methods, as well as test item identification, documentation and reporting of results. Various reports make recommendations for improving the quality of risk assessments, and different criteria catalogues have been published to support evaluation of data relevance and reliability. Their intention was to guide transparent decision making on the integration of the respective information into the regulatory process. This article describes an approach to assess the relevance and reliability of experimental data from guideline-compliant studies as well as from non-guideline studies published in the scientific literature in the specific context of uncertainty and risk assessment of pesticides.

  18. Analytical approach for confirming the achievement of LMFBR reliability goals

    International Nuclear Information System (INIS)

    Ingram, G.E.; Elerath, J.G.; Wood, A.P.

    1981-01-01

    The approach, recommended by GE-ARSD, for confirming the achievement of LMFBR reliability goals relies upon a comprehensive understanding of the physical and operational characteristics of the system and the environments to which the system will be subjected during its operational life. This kind of understanding is required for an approach based on system hardware testing or analyses, as recommended in this report. However, for a system as complex and expensive as the LMFBR, an approach which relies primarily on system hardware testing would be prohibitive both in cost and time to obtain the required system reliability test information. By using an analytical approach, results of tests (reliability and functional) at a low level within the specific system of interest, as well as results from other similar systems can be used to form the data base for confirming the achievement of the system reliability goals. This data, along with information relating to the design characteristics and operating environments of the specific system, will be used in the assessment of the system's reliability

  19. A hybrid approach to quantify software reliability in nuclear safety systems

    International Nuclear Information System (INIS)

    Arun Babu, P.; Senthil Kumar, C.; Murali, N.

    2012-01-01

Highlights: ► A novel method to quantify software reliability using software verification and mutation testing in nuclear safety systems. ► Contributing factors that influence the software reliability estimate. ► An approach to help regulators verify the reliability of safety-critical software during the licensing process. -- Abstract: Technological advancements have led to the use of computer-based systems in safety-critical applications. As computer-based systems are being introduced in nuclear power plants, effective and efficient methods are needed to ensure dependability and compliance with the high reliability requirements of systems important to safety. Even after several years of research, quantification of software reliability remains a controversial and unresolved issue. Also, existing approaches have assumptions and limitations that are not acceptable for safety applications. This paper proposes a theoretical approach combining software verification and mutation testing to quantify software reliability in nuclear safety systems. The theoretical results obtained suggest that software reliability depends on three factors: the test adequacy, the amount of software verification carried out and the reusability of verified code in the software. The proposed approach may help regulators in licensing computer-based safety systems in nuclear reactors.
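
The mutation-testing ingredient of the test adequacy factor can be sketched as follows (a toy illustration with hand-made mutants of a hypothetical unit under test; real mutation tools generate the mutants automatically):

```python
def kills_mutant(mutant, original, inputs):
    """A mutant is 'killed' when some test input exposes an output
    that differs from the original program's output."""
    return any(mutant(x) != original(x) for x in inputs)

# Hypothetical unit under test and hand-made mutants of it.
original = lambda x: x * 2 + 1
mutants = [
    lambda x: x * 2 - 1,   # operator mutation: + -> -
    lambda x: x * 3 + 1,   # constant mutation: 2 -> 3
    lambda x: x + 2 + 1,   # operator mutation: * -> +
]
test_inputs = [0, 2, -1]
killed = sum(kills_mutant(m, original, test_inputs) for m in mutants)
mutation_score = killed / len(mutants)   # proxy for test adequacy
```

A mutation score of 1.0 says the test suite distinguishes the program from every seeded fault, which is the kind of adequacy evidence the proposed quantification combines with verification coverage.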

  20. Development of a quality-assessment tool for experimental bruxism studies: reliability and validity.

    Science.gov (United States)

    Dawson, Andreas; Raphael, Karen G; Glaros, Alan; Axelsson, Susanna; Arima, Taro; Ernberg, Malin; Farella, Mauro; Lobbezoo, Frank; Manfredini, Daniele; Michelotti, Ambra; Svensson, Peter; List, Thomas

    2013-01-01

To combine empirical evidence and expert opinion in a formal consensus method in order to develop a quality-assessment tool for experimental bruxism studies in systematic reviews. Tool development comprised five steps: (1) preliminary decisions, (2) item generation, (3) face-validity assessment, (4) reliability and discriminative validity assessment, and (5) instrument refinement. The kappa value and phi-coefficient were calculated to assess inter-observer reliability and discriminative ability, respectively. Following preliminary decisions and a literature review, a list of 52 items to be considered for inclusion in the tool was compiled. Eleven experts were invited to join a Delphi panel and 10 accepted. Four Delphi rounds reduced the preliminary tool, the Quality-Assessment Tool for Experimental Bruxism Studies (Qu-ATEBS), to 8 items: study aim, study sample, control condition or group, study design, experimental bruxism task, statistics, interpretation of results, and conflict of interest statement. Consensus among the Delphi panelists yielded good face validity. Inter-observer reliability was acceptable (k = 0.77). Discriminative validity was excellent (phi coefficient 1.0). Qu-ATEBS, intended for systematic reviews of experimental bruxism studies, exhibits face validity, excellent discriminative validity, and acceptable inter-observer reliability. Development of quality-assessment tools for many other topics in the orofacial pain literature is needed and may follow the described procedure.
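
The inter-observer agreement statistic used in step (4), Cohen's kappa, can be computed directly (the two raters' judgments below are invented for illustration and do not reproduce the reported k = 0.77):

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa: (observed agreement - chance agreement)
    divided by (1 - chance agreement)."""
    n = len(rater1)
    p_obs = sum(a == b for a, b in zip(rater1, rater2)) / n
    cats = set(rater1) | set(rater2)
    p_chance = sum((rater1.count(c) / n) * (rater2.count(c) / n)
                   for c in cats)
    return (p_obs - p_chance) / (1 - p_chance)

# Hypothetical include/exclude judgments of two raters on 10 studies.
r1 = ["y", "y", "n", "y", "n", "y", "y", "n", "y", "n"]
r2 = ["y", "y", "n", "y", "y", "y", "y", "n", "n", "n"]
kappa = cohens_kappa(r1, r2)
```

Here the raters agree on 8 of 10 studies, but because chance agreement is 0.52 the kappa is only about 0.58, which is why kappa rather than raw agreement is the standard reliability measure.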

  1. Integrated approach to economical, reliable, safe nuclear power production

    International Nuclear Information System (INIS)

    1982-06-01

    An Integrated Approach to Economical, Reliable, Safe Nuclear Power Production is the latest evolution of a concept which originated with the Defense-in-Depth philosophy of the nuclear industry. As Defense-in-Depth provided a framework for viewing physical barriers and equipment redundancy, the Integrated Approach gives a framework for viewing nuclear power production in terms of functions and institutions. In the Integrated Approach, four plant Goals are defined (Normal Operation, Core and Plant Protection, Containment Integrity and Emergency Preparedness) with the attendant Functional and Institutional Classifications that support them. The Integrated Approach provides a systematic perspective that combines the economic objective of reliable power production with the safety objective of consistent, controlled plant operation

  2. A double-loop adaptive sampling approach for sensitivity-free dynamic reliability analysis

    International Nuclear Information System (INIS)

    Wang, Zequn; Wang, Pingfeng

    2015-01-01

Dynamic reliability measures the reliability of an engineered system considering time-variant operating conditions and component deterioration. Due to high computational costs, conducting dynamic reliability analysis at an early system design stage remains challenging. This paper presents a confidence-based meta-modeling approach, referred to as double-loop adaptive sampling (DLAS), for efficient sensitivity-free dynamic reliability analysis. The DLAS builds a Gaussian process (GP) model sequentially to approximate extreme system responses over time, so that Monte Carlo simulation (MCS) can be employed directly to estimate dynamic reliability. A generic confidence measure is developed to evaluate the accuracy of dynamic reliability estimation while using the MCS approach based on developed GP models. A double-loop adaptive sampling scheme is developed to efficiently update the GP model in a sequential manner, by considering system input variables and time concurrently in two sampling loops. The model updating process using the developed sampling scheme can be terminated once the user-defined confidence target is satisfied. The developed DLAS approach eliminates the computationally expensive sensitivity analysis process, thus substantially improving the efficiency of dynamic reliability analysis. Three case studies are used to demonstrate the efficacy of DLAS for dynamic reliability analysis. - Highlights: • Developed a novel adaptive sampling approach for dynamic reliability analysis. • Developed a new metric to quantify the accuracy of dynamic reliability estimation. • Developed a new sequential sampling scheme to efficiently update surrogate models. • Three case studies were used to demonstrate the efficacy of the new approach. • Case study results showed substantially enhanced efficiency with high accuracy.
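
Stripped of the GP surrogate and adaptive sampling, the underlying extreme-response view of dynamic reliability can be sketched with crude Monte Carlo (a toy model with a linearly decaying capacity and a random but time-constant demand; all numbers are invented):

```python
import random

def dynamic_reliability(limit_state, sample_x, t_grid, n=5000,
                        rng=random.Random(3)):
    """Crude Monte Carlo estimate of time-variant reliability: a sampled
    input survives only if the limit state stays positive at every
    instant on the time grid, i.e. its extreme response stays safe."""
    survived = 0
    for _ in range(n):
        x = sample_x(rng)
        if min(limit_state(x, t) for t in t_grid) > 0:
            survived += 1
    return survived / n

# Toy deteriorating component: capacity 10 decays by 0.1 per time unit,
# demand x ~ N(5, 1), limit state g(x, t) = (10 - 0.1 * t) - x.
t_grid = [float(i) for i in range(21)]          # 0 .. 20 time units
rel = dynamic_reliability(lambda x, t: (10 - 0.1 * t) - x,
                          lambda rng: rng.gauss(5.0, 1.0), t_grid)
```

The `min` over the time grid is exactly the extreme response the DLAS surrogate approximates; the surrogate's point is to avoid evaluating the limit state at every time instant for every sample.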

  3. Novel approach for evaluation of service reliability for electricity customers

    Institute of Scientific and Technical Information of China (English)

Jiang, John N.

    2009-01-01

Understanding the value of reliability for electricity customers is important to market-based reliability management. This paper proposes a novel approach to evaluating reliability for electricity customers by using an indifference curve between economic compensation for power interruption and the service reliability of electricity. The indifference curve is formed by calculating different planning schemes of network expansion for different reliability requirements of customers. It reveals the economic values of different reliability levels for electricity customers, so that reliability based on a market supply-demand mechanism can be established and economic signals can be provided for reliability management and enhancement.

  4. Network reliability assessment using a cellular automata approach

    International Nuclear Information System (INIS)

    Rocco S, Claudio M.; Moreno, Jose Ali

    2002-01-01

Two cellular automata (CA) models that evaluate the s-t connectedness and the shortest path in a network are presented. CA-based algorithms enhance the performance of classical algorithms, since they allow a more reliable and straightforward parallel implementation, resulting in a dynamic network evaluation where changes in the connectivity and/or link costs can readily be incorporated, avoiding recalculation from scratch. The paper also demonstrates how these algorithms can be applied to network reliability evaluation (based on a Monte Carlo approach) and to finding the s-t path with maximal reliability.
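
A minimal version of the s-t connectedness model and its use in Monte Carlo reliability evaluation might look like this (a synchronous-update sketch in the CA spirit, not the authors' implementation; the bridge network and edge availability are illustrative):

```python
import random

def st_connected(n_nodes, edges, s, t):
    """CA-style s-t connectedness: every node synchronously copies the
    'reached' state from its neighbors until the front stops growing."""
    reached = [False] * n_nodes
    reached[s] = True
    changed = True
    while changed:
        changed = False
        nxt = reached[:]
        for u, v in edges:
            if reached[u] and not nxt[v]:
                nxt[v] = True
                changed = True
            if reached[v] and not nxt[u]:
                nxt[u] = True
                changed = True
        reached = nxt
    return reached[t]

def network_reliability(n_nodes, edges, p_up, s, t, n=3000,
                        rng=random.Random(5)):
    """Monte Carlo s-t reliability: in each trial keep every edge alive
    with probability p_up and check connectedness."""
    ok = 0
    for _ in range(n):
        alive = [e for e in edges if rng.random() < p_up]
        ok += st_connected(n_nodes, alive, s, t)
    return ok / n

# Classic bridge network between s=0 and t=3 with crossing edge (1, 2).
bridge = [(0, 1), (0, 2), (1, 3), (2, 3), (1, 2)]
rel = network_reliability(4, bridge, p_up=0.9, s=0, t=3)
```

Each sweep of the `while` loop is one synchronous CA step; all edge rules could fire in parallel, which is the property the paper exploits. For this bridge with edge availability 0.9 the exact s-t reliability is about 0.978.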

  5. A computational Bayesian approach to dependency assessment in system reliability

    International Nuclear Information System (INIS)

    Yontay, Petek; Pan, Rong

    2016-01-01

    Due to the increasing complexity of engineered products, it is of great importance to develop a tool to assess reliability dependencies among components and systems under the uncertainty of system reliability structure. In this paper, a Bayesian network approach is proposed for evaluating the conditional probability of failure within a complex system, using a multilevel system configuration. Coupling with Bayesian inference, the posterior distributions of these conditional probabilities can be estimated by combining failure information and expert opinions at both system and component levels. Three data scenarios are considered in this study, and they demonstrate that, with the quantification of the stochastic relationship of reliability within a system, the dependency structure in system reliability can be gradually revealed by the data collected at different system levels. - Highlights: • A Bayesian network representation of system reliability is presented. • Bayesian inference methods for assessing dependencies in system reliability are developed. • Complete and incomplete data scenarios are discussed. • The proposed approach is able to integrate reliability information from multiple sources at multiple levels of the system.
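
At the component level, the Bayesian combination of expert opinion and failure data reduces to a conjugate Beta-Binomial update (a one-node sketch, not the paper's multilevel Bayesian network; the prior and field data are invented):

```python
def beta_update(alpha, beta, failures, demands):
    """Conjugate update of a component failure probability:
    Beta(alpha, beta) prior, e.g. encoding expert opinion, combined
    with observed failures out of a number of demands."""
    return alpha + failures, beta + (demands - failures)

# Expert opinion at the component level: roughly 1 failure in 50 demands.
a, b = 1.0, 49.0
# Observed field data: 3 failures in 200 demands.
a, b = beta_update(a, b, failures=3, demands=200)
p_fail = a / (a + b)   # posterior mean: 4 / 250 = 0.016
```

In the paper's setting such node-level posteriors are tied together by the network structure, so data collected at the system level can also update beliefs about component-level conditional failure probabilities.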

  6. Ensemble of different approaches for a reliable person re-identification system

    Directory of Open Access Journals (Sweden)

    Loris Nanni

    2016-07-01

An ensemble of approaches for reliable person re-identification is proposed in this paper. The proposed ensemble is built by combining widely used person re-identification systems using different color spaces and some variants of state-of-the-art approaches that are proposed in this paper. Different descriptors are tested, and both texture and color features are extracted from the images; the different descriptors are then compared using different distance measures (e.g., the Euclidean distance, angle, and the Jeffrey distance). To improve performance, a method based on skeleton detection, extracted from the depth map, is also applied when the depth map is available. The proposed ensemble is validated on three widely used datasets (CAVIAR4REID, IAS, and VIPeR), keeping the parameter set of each approach constant across all tests to avoid overfitting and to demonstrate that the proposed system can be considered a general-purpose person re-identification system. Our experimental results show that the proposed system offers significant improvements over baseline approaches. The source code used for the approaches tested in this paper will be available at https://www.dei.unipd.it/node/2357 and http://robotics.dei.unipd.it/reid/.
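
The three distance measures named above can be sketched directly for normalized histogram descriptors (a minimal illustration; the Jeffrey divergence is taken here in its common symmetrized-KL form, and the toy histograms are invented):

```python
import math

def euclidean(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def angle(p, q):
    """Angle between descriptor vectors (arccos of cosine similarity)."""
    dot = sum(a * b for a, b in zip(p, q))
    norm = math.sqrt(sum(a * a for a in p)) * math.sqrt(sum(b * b for b in q))
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def jeffrey(p, q, eps=1e-12):
    """Jeffrey divergence: KL divergence of each histogram against
    their bin-wise midpoint, summed (symmetric in p and q)."""
    d = 0.0
    for a, b in zip(p, q):
        m = (a + b) / 2 + eps
        if a > 0:
            d += a * math.log(a / m)
        if b > 0:
            d += b * math.log(b / m)
    return d

h1 = [0.2, 0.3, 0.5]   # toy color histograms of two detections
h2 = [0.1, 0.4, 0.5]
```

The ensemble idea is that these measures rank candidate matches differently, so fusing them is more robust than committing to any single one.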

  7. An approach for assessing human decision reliability

    International Nuclear Information System (INIS)

    Pyy, P.

    2000-01-01

This paper presents a method to study human reliability in decision situations related to nuclear power plant disturbances. Decisions often play a significant role in the handling of emergency situations. The method may be applied to probabilistic safety assessments (PSAs) in cases where decision making is an important dimension of an accident sequence; such situations are frequent, e.g., in accident management. In this paper, a modelling approach for decision reliability studies is first proposed. Then, a case study with two decision situations with relatively different characteristics is presented. Qualitative and quantitative findings of the study are discussed. In very simple decision cases with time pressure, time-reliability correlation proved to be a feasible reliability modelling method. In all other decision situations, more advanced probabilistic decision models have to be used. Finally, decision probability assessment using simulator run results and expert judgement is presented.

  8. Life cycle reliability assessment of new products—A Bayesian model updating approach

    International Nuclear Information System (INIS)

    Peng, Weiwen; Huang, Hong-Zhong; Li, Yanfeng; Zuo, Ming J.; Xie, Min

    2013-01-01

    The rapidly increasing pace and continuously evolving reliability requirements of new products have made life cycle reliability assessment of new products an imperative yet difficult task. While much work has been done to separately estimate the reliability of new products in specific stages, a gap exists in carrying out life cycle reliability assessment throughout all life cycle stages. We present a Bayesian model updating approach (BMUA) for life cycle reliability assessment of new products. Novel features of this approach are the development of Bayesian information toolkits that separately include a “reliability improvement factor” and an “information fusion factor”, which allow the integration of subjective information in a specific life cycle stage and the transition of integrated information between adjacent life cycle stages. They lead to the unique characteristic of the BMUA that information generated throughout the life cycle stages is integrated coherently. To illustrate the approach, an application to the life cycle reliability assessment of a newly developed Gantry Machining Center is shown.

  9. A new approach for reliability analysis with time-variant performance characteristics

    International Nuclear Information System (INIS)

    Wang, Zequn; Wang, Pingfeng

    2013-01-01

    Reliability represents the safety level in industry practice and may vary due to time-variant operating conditions and component deterioration throughout a product's life cycle. Thus, the capability to perform time-variant reliability analysis is of vital importance in practical engineering applications. This paper presents a new approach, referred to as the nested extreme response surface (NERS), that can efficiently tackle the time dependency issue in time-variant reliability analysis and solve such problems by integrating easily with advanced time-independent tools. The key of the NERS approach is to build a nested response surface of the time corresponding to the extreme value of the limit state function by employing a Kriging model. To obtain the data for the Kriging model, the efficient global optimization technique is integrated with NERS to extract the extreme time responses of the limit state function for any given system input. An adaptive response prediction and model maturation mechanism based on the mean square error (MSE) is developed to concurrently improve the accuracy and computational efficiency of the proposed approach. With the nested response surface of time, the time-variant reliability analysis can be converted into a time-independent reliability analysis, and existing advanced reliability analysis methods can be used. Three case studies are used to demonstrate the efficiency and accuracy of the NERS approach.
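
    The core NERS idea — take the extreme (worst) response of the limit state over the service life, then run an ordinary time-independent reliability analysis on that extreme — can be sketched with brute force standing in for the Kriging surrogate and global optimization. The limit state g(x, t) and all constants below are invented for illustration only:

    ```python
    import math
    import random

    def limit_state(x, t):
        # hypothetical time-variant limit state g(x, t); failure when g < 0
        return 2.5 + x - 0.3 * t + 0.2 * math.sin(2.0 * t)

    def extreme_response(x, t_grid):
        """Worst-case (minimum) limit-state value over the life span --
        the quantity NERS approximates with a Kriging surrogate."""
        return min(limit_state(x, t) for t in t_grid)

    def time_variant_failure_prob(n=5000, seed=1):
        """Time-independent Monte Carlo on the extreme response:
        the product fails if the worst g over [0, 10] is negative."""
        rng = random.Random(seed)
        t_grid = [i * 0.1 for i in range(101)]  # 0 .. 10 time units
        failures = sum(1 for _ in range(n)
                       if extreme_response(rng.gauss(0.0, 1.0), t_grid) < 0.0)
        return failures / n
    ```

    The grid search over `t_grid` is the expensive step that the Kriging model plus efficient global optimization replaces in the actual NERS method.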

  10. Assuring the reliability of structural components - experimental data and non-destructive examination requirements

    International Nuclear Information System (INIS)

    Lucia, A.C.

    1984-01-01

    The probability of failure of a structural component can be estimated either by statistical methods or by a probabilistic structural reliability approach (where failure is seen as a level crossing of a damage stochastic process which develops in space and in time). The probabilistic approach has the advantage that it makes available not only an absolute value of the failure probability but also much additional information; its disadvantage is its complexity. It is discussed for the following situations: reliability of a structural component, material properties, data for fatigue crack growth evaluation, a benchmark exercise on reactor pressure vessel failure probability computation, and non-destructive examination for assuring a given level of structural reliability. (U.K.)

  11. A Data-Driven Reliability Estimation Approach for Phased-Mission Systems

    Directory of Open Access Journals (Sweden)

    Hua-Feng He

    2014-01-01

    We address the issues associated with reliability estimation for phased-mission systems (PMS) and present a novel data-driven approach to reliability estimation for PMS using the condition monitoring information and degradation data of such systems under dynamic operating scenarios. In this sense, this paper differs from the existing methods, which consider only the static scenario without using real-time information and aim to estimate the reliability for a population rather than for an individual. In the presented approach, to establish a linkage between the historical data and real-time information of an individual PMS, we adopt a stochastic filtering model for the phase duration and obtain an updated estimate of the mission time by Bayesian law at each phase. Meanwhile, the lifetime of the PMS is estimated from degradation data, which are modeled by an adaptive Brownian motion. As such, the mission reliability can be obtained in real time through the estimated distribution of the mission time in conjunction with the estimated lifetime distribution. We demonstrate the usefulness of the developed approach via a numerical example.
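
    The degradation-based lifetime estimate in this record rests on a standard fact: for Brownian motion with positive drift mu, the first-passage time to a failure threshold D follows an inverse Gaussian distribution with mean (D - x)/mu, where x is the current degradation level. A minimal sketch (the simple drift estimator and all numbers are illustrative, not the paper's adaptive model):

    ```python
    def estimate_drift(observations, dt):
        """Estimate the drift of a Brownian degradation path
        X_t = mu*t + sigma*B_t from equally spaced observations
        (simple average-increment estimator, for illustration)."""
        increments = [b - a for a, b in zip(observations, observations[1:])]
        return sum(increments) / (len(increments) * dt)

    def mean_lifetime(threshold, current_level, drift):
        """Expected first-passage time of drifted Brownian motion to the
        failure threshold: the inverse-Gaussian mean (D - x)/mu."""
        if drift <= 0:
            raise ValueError("positive drift required for a finite mean lifetime")
        return (threshold - current_level) / drift
    ```

    In the paper's setting this lifetime distribution is combined with the filtered mission-time distribution to yield a real-time mission reliability.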

  12. Parts and Components Reliability Assessment: A Cost Effective Approach

    Science.gov (United States)

    Lee, Lydia

    2009-01-01

    System reliability assessment is a methodology that incorporates reliability analyses performed at the parts and components level, such as Reliability Prediction, Failure Modes and Effects Analysis (FMEA), and Fault Tree Analysis (FTA), to assess risks and perform design tradeoffs, and therefore to ensure effective productivity and/or mission success. The system reliability is used to optimize the product design to accommodate today's mandated budget, manpower, and schedule constraints. Standard-based reliability assessment is an effective approach consisting of reliability predictions together with other reliability analyses for electronic, electrical, and electro-mechanical (EEE) complex parts and components of large systems, based on failure rate estimates published by United States (U.S.) military or commercial standards and handbooks. Many of these standards are globally accepted and recognized. The reliability assessment is especially useful during the initial stages, when the system design is still in development and hard failure data are not yet available, or when manufacturers are not contractually obliged by their customers to publish reliability estimates/predictions for their parts and components. This paper presents a methodology to assess system reliability using parts and components reliability estimates to ensure effective productivity and/or mission success efficiently, at low cost, and on a tight schedule.
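
    In its simplest "parts count" form, handbook-style reliability prediction of the kind described here reduces to summing constant part failure rates and assuming exponential lifetimes. A minimal sketch under those two assumptions (the rates below are made-up numbers, not values from any handbook such as MIL-HDBK-217):

    ```python
    import math

    def parts_count_failure_rate(part_rates):
        """Parts-count prediction for a series system: the system
        failure rate is the sum of the constant part failure rates."""
        return sum(part_rates)

    def reliability(failure_rate, hours):
        """Exponential-life reliability R(t) = exp(-lambda * t)."""
        return math.exp(-failure_rate * hours)
    ```

    With per-hour part rates of, say, 1e-6, 2e-6 and 3e-6, the predicted system rate is 6e-6 failures per hour, and R(t) falls off exponentially from 1 at t = 0.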

  13. An integrated approach to human reliability analysis -- decision analytic dynamic reliability model

    International Nuclear Information System (INIS)

    Holmberg, J.; Hukki, K.; Norros, L.; Pulkkinen, U.; Pyy, P.

    1999-01-01

    The reliability of human operators in process control is sensitive to the context, which many contemporary human reliability analysis (HRA) methods do not sufficiently take into account. This article argues that probabilistic and psychological approaches to human reliability should be integrated. This is achieved, first, by adopting methods that adequately reflect the essential features of the process control activity, and secondly, by carrying out an interactive HRA process. Description of the activity context, probabilistic modeling, and psychological analysis form an iterative interdisciplinary sequence of analysis in which the results of one sub-task may be input to another. The analysis of the context is carried out first with the help of a common set of conceptual tools. The resulting descriptions of the context support the probabilistic modeling, through which new results regarding the probabilistic dynamics can be achieved. These can be incorporated in the context descriptions used as reference in the psychological analysis of actual performance. The results also provide new knowledge of the constraints on the activity, by providing information on the premises of the operator's actions. Finally, the stochastic marked point process model gives a tool by which psychological methodology may be interpreted and utilized for reliability analysis.

  14. Some approaches to system reliability improvement in engineering design

    International Nuclear Information System (INIS)

    Shen, Kecheng.

    1990-01-01

    In this thesis some approaches to system reliability improvement in engineering design are studied. In particular, the thesis aims at developing alternative methodologies for ranking component importance that are more closely related to design practice and more useful in system synthesis than the existing ones. It also aims at developing component reliability models by means of stress-strength interference, which enable both component reliability prediction and design for reliability. A new methodology for ranking component importance is first developed, based on the notion of the increase of the expected system yield. This methodology allows for incorporation of different improvement actions at the component level, such as parallel redundancy, standby redundancy, burn-in, minimal repair, and perfect replacement. For each of these improvement actions, the increase of system reliability is studied and used as the component importance measure. A possible connection between the commonly known models of component lifetimes and stress-strength interference models is suggested. Under some general conditions, the relationship between component failure rate and the stress and strength distribution characteristics is studied. A heuristic approach for obtaining bounds on failure probability through stress-strength interference is also presented. A case study and a worked example are presented which illustrate and verify the developed importance measures and their applications in the analytical as well as synthetic work of engineering design. (author)
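
    The stress-strength interference idea used in the thesis has a well-known closed form when stress and strength are independent normal variables: R = P(strength > stress) = Phi((mu_S - mu_s) / sqrt(sigma_S^2 + sigma_s^2)). A minimal sketch of that textbook special case (illustrative only; the thesis treats more general conditions):

    ```python
    import math

    def normal_cdf(z):
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    def interference_reliability(mu_strength, sd_strength, mu_stress, sd_stress):
        """Stress-strength interference for independent normal stress
        and strength: R = Phi((muS - mus) / sqrt(sdS^2 + sds^2))."""
        beta = (mu_strength - mu_stress) / math.hypot(sd_strength, sd_stress)
        return normal_cdf(beta)
    ```

    When the mean strength equals the mean stress, R is exactly 0.5; a large safety margin relative to the combined scatter drives R toward 1.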

  15. Failure mode and effect analysis experimental reliability determination for the CANDU reactor equipment

    International Nuclear Information System (INIS)

    Vieru, G.

    1996-01-01

    This paper describes the experimental tests performed in order to verify the reliability parameters of certain equipment manufactured at INR Pitesti for NPP Cernavoda. The tests were prescribed by Technical Specifications and test procedures. A comparison of the reliability parameters of the Canadian-manufactured equipment and of the equipment manufactured at INR is also given. The results of the tests and the conclusions are presented. (author)

  16. Experimental Test and Simulations on a Linear Generator-Based Prototype of a Wave Energy Conversion System Designed with a Reliability-Oriented Approach

    Directory of Open Access Journals (Sweden)

    Valeria Boscaino

    2017-01-01

    In this paper, we propose a reliability-oriented design of a linear generator-based prototype of a wave energy conversion (WEC) system, useful for the production of hydrogen in a sheltered water area like the Mediterranean Sea. The hydrogen production has been confirmed by extensive experimental testing and simulation. The system design aims to enhance robustness and reliability and is based on an analysis of the main WEC failures reported in the literature. The results of this analysis led to some improvements that are applied to a WEC system prototype for hydrogen production and storage. The proposed WEC system includes the electrical linear generator, the power conversion system, and a sea-water electrolyzer. A modular architecture is conceived to provide ease of extension of the power capability of the marine plant. The experimental results obtained on the permanent magnet linear electric generator have allowed identification of the stator winding typology and, consequently, sizing of the power electronics system. The produced hydrogen has supplied a low-power fuel cell stack directly connected to the hydrogen output of the electrolyzer. The small-scale prototype is designed to be installed, in the near future, in the Mediterranean Sea. As shown by experimental and simulation results, the small-scale prototype is suitable for hydrogen production and storage from sea water in this area.

  17. The reliability analysis of cutting tools in the HSM processes

    OpenAIRE

    W.S. Lin

    2008-01-01

    Purpose: This article mainly describes the reliability of cutting tools in high speed turning using a normal distribution model. Design/methodology/approach: A series of experimental tests has been done to evaluate the reliability variation of the cutting tools. From the experimental results, the tool wear distribution and the tool life are determined, and the tool life distribution and the reliability function of the cutting tools are derived. Further, the reliability of cutting tools at any time for h...

  18. An efficient approach to bioconversion kinetic model generation based on automated microscale experimentation integrated with model driven experimental design

    DEFF Research Database (Denmark)

    Chen, B. H.; Micheletti, M.; Baganz, F.

    2009-01-01

    Reliable models of enzyme kinetics are required for the effective design of bioconversion processes. Kinetic expressions of the enzyme-catalysed reaction rate, however, are frequently complex, and establishing accurate values of kinetic parameters normally requires a large number of experiments. These can be both time consuming and expensive when working with the types of non-natural chiral intermediates important in pharmaceutical syntheses. This paper presents an automated microscale approach to the rapid and cost-effective generation of reliable kinetic models useful for bioconversion process design, illustrated for -erythrulose. Experiments were performed using automated microwell studies at the 150 or 800 µL scale. The derived kinetic parameters were then verified in a second round of experiments, where model predictions showed excellent agreement with experimental data obtained under conditions not included in the original...

  19. Dependent systems reliability estimation by structural reliability approach

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2014-01-01

    Estimation of system reliability by classical system reliability methods generally assumes that the components are statistically independent, thus limiting its applicability in many practical situations. A method is proposed for estimation of the system reliability with dependent components, where the leading failure mechanism(s) is described by physics of failure model(s). The proposed method is based on structural reliability techniques and accounts for both statistical and failure effect correlations. It is assumed that failure of any component is due to increasing damage (fatigue phenomena)... identification. Application of the proposed method can be found in many real-world systems.

  20. A rule induction approach to improve Monte Carlo system reliability assessment

    International Nuclear Information System (INIS)

    Rocco S, Claudio M.

    2003-01-01

    A Decision Tree (DT) approach to building empirical models for use in Monte Carlo reliability evaluation is presented. The main idea is to develop an estimation algorithm by training a model on a restricted data set and replacing the Evaluation Function (EF) with a simpler calculation that provides reasonably accurate model outputs. The proposed approach is illustrated with two systems of different size, represented by their equivalent networks. The robustness of the DT approach as an approximate method to replace the EF is also analysed. Excellent system reliability results are obtained by training a DT with a small amount of information.
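
    The role of the Evaluation Function that the decision tree approximates can be made concrete with a crude Monte Carlo estimate of network reliability. The sketch below uses the exact structure function of the classic five-component bridge network as the EF; the network, component probability, and sample size are illustrative, and a trained DT would replace `bridge_system_works` with a cheaper approximation:

    ```python
    import random

    def bridge_system_works(up):
        """Structure function of a 5-component bridge network -- the
        exact Evaluation Function a trained decision tree would
        approximate.  `up` holds the Boolean states of components
        a, b (left arms), c (bridge), d, e (right arms)."""
        a, b, c, d, e = up
        return (a and d) or (b and e) or (a and c and e) or (b and c and d)

    def mc_reliability(p, n=50000, seed=7):
        """Crude Monte Carlo: sample component states i.i.d. with
        availability p and count how often the system works."""
        rng = random.Random(seed)
        ok = sum(1 for _ in range(n)
                 if bridge_system_works([rng.random() < p for _ in range(5)]))
        return ok / n
    ```

    For p = 0.9 the exact bridge reliability is 2p² + 2p³ − 5p⁴ + 2p⁵ ≈ 0.978, so the estimate should land close to that; each of the n samples costs one EF call, which is exactly the cost the DT surrogate is meant to reduce.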

  1. A novel ontology approach to support design for reliability considering environmental effects.

    Science.gov (United States)

    Sun, Bo; Li, Yu; Ye, Tianyuan; Ren, Yi

    2015-01-01

    Environmental effects are not considered sufficiently in product design, and reliability problems caused by environmental effects are very prominent. This paper proposes a method to apply an ontology approach in product design, so that environmental effects knowledge can be reused during product reliability design and analysis. First, the relationship between environmental effects and product reliability is analyzed. Then an environmental effects ontology describing the environmental effects domain knowledge is designed, and the related concepts of environmental effects are formally defined using the ontology approach. This model can be applied to organize environmental effects knowledge for different environments. Finally, rubber seals used in a subhumid acid rain environment are taken as an example to illustrate the application of the ontological model in reliability design and analysis.

  2. Reliability Approach of a Compressor System using Reliability Block ...

    African Journals Online (AJOL)

    2018-03-05

    Mar 5, 2018 ... This paper presents a reliability analysis of such a system using reliability block diagrams. Keywords: compressor system, reliability, reliability block diagram, RBD. ... The same structure has been kept, with the three subsystems: air flow, oil flow, and ...

  3. Predicting risk and human reliability: a new approach

    International Nuclear Information System (INIS)

    Duffey, R.; Ha, T.-S.

    2009-01-01

    Learning from experience describes human reliability and skill acquisition, and the resulting theory has been validated by comparison against millions of outcome data points from multiple industries and technologies worldwide. The resulting predictions were used to benchmark the classic first-generation human reliability methods adopted in probabilistic risk assessments. The learning rate, probabilities, and response times are also consistent with existing psychological models of human learning and error correction. The new approach also implies a finite lower-bound probability that is not predicted by empirical statistical distributions that ignore the known and fundamental learning effects. (author)
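
    The learning-from-experience model described here implies an error probability that decays with accumulated experience toward a finite lower bound rather than to zero. One common functional form with that property (the exponential form and every parameter value below are hypothetical, not the authors' fitted constants) is:

    ```python
    import math

    def error_probability(experience, p0=0.1, p_min=5e-5, k=0.5):
        """Hypothetical learning-curve model: the outcome (error)
        probability starts at p0 and decays exponentially with
        accumulated experience toward a finite floor p_min."""
        return p_min + (p0 - p_min) * math.exp(-k * experience)
    ```

    The key qualitative point from the record is the floor: no amount of experience drives the predicted probability below `p_min`, in contrast to empirical fits that ignore learning.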

  4. Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method

    Science.gov (United States)

    Zhang, Xiangnan

    2018-03-01

    Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions, and boundary conditions. Reliability methods measure the structural safety condition and determine the optimal combination of design parameters based on probabilistic theory. Reliability-based design optimization (RBDO), which combines reliability theory and optimization, is the most commonly used approach to minimize the structural cost or other performance measures under uncertain variables. However, it cannot handle various kinds of incomplete information. The Bayesian approach is utilized to incorporate this kind of incomplete information in its uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.

  5. Experimental approaches and applications

    CERN Document Server

    Crasemann, Bernd

    1975-01-01

    Atomic Inner-Shell Processes, Volume II: Experimental Approaches and Applications focuses on the physics of atomic inner shells, with emphasis on experimental aspects including the use of radioactive atoms for studies of atomic transition probabilities. Surveys of modern techniques of electron and photon spectrometry are also presented, and selected practical applications of inner-shell processes are outlined. Comprising six chapters, this volume begins with an overview of the general principles underlying the experimental techniques that make use of radioactive isotopes for inner-shell...

  6. Engineering systems reliability, safety, and maintenance an integrated approach

    CERN Document Server

    Dhillon, B S

    2017-01-01

    Today, engineering systems are an important element of the world economy, and each year billions of dollars are spent to develop, manufacture, operate, and maintain various types of engineering systems around the globe. Many of these systems are highly sophisticated and contain millions of parts. For example, a Boeing 747 jumbo jet is made up of approximately 4.5 million parts, including fasteners. Needless to say, the reliability, safety, and maintenance of systems such as this have become more important than ever before. Global competition and other factors are forcing manufacturers to produce highly reliable, safe, and maintainable engineering products. Therefore, there is a definite need for reliability, safety, and maintenance professionals to work closely together during design and other phases. Engineering Systems Reliability, Safety, and Maintenance: An Integrated Approach eliminates the need to consult many different and diverse sources in the hunt for the information required to design better engineering systems...

  7. An Integrated Approach to Establish Validity and Reliability of Reading Tests

    Science.gov (United States)

    Razi, Salim

    2012-01-01

    This study presents the process of developing and establishing the reliability and validity of a reading test by administering an integrative approach, as conventional reliability and validity measures only superficially reveal the difficulty of a reading test. In this respect, analysing the vocabulary frequency of the test is regarded as a more eligible way…

  8. Reliability-oriented multi-objective optimal decision-making approach for uncertainty-based watershed load reduction

    International Nuclear Information System (INIS)

    Dong, Feifei; Liu, Yong; Su, Han; Zou, Rui; Guo, Huaicheng

    2015-01-01

    Water quality management and load reduction are subject to inherent uncertainties in watershed systems and competing decision objectives. Optimal decision-making modeling in watershed load reduction therefore suffers from the following challenges: (a) it is difficult to obtain absolutely “optimal” solutions, and (b) decision schemes may be vulnerable to failure. The probability that solutions are feasible under uncertainties is defined as reliability. A reliability-oriented multi-objective (ROMO) decision-making approach was proposed in this study for optimal decision making with stochastic parameters and multiple decision reliability objectives. Lake Dianchi, one of the three most eutrophic lakes in China, was examined as a case study for optimal watershed nutrient load reduction to restore lake water quality. This study aimed to maximize reliability levels from considerations of cost and load reductions. The Pareto solutions of the ROMO optimization model were generated with a multi-objective evolutionary algorithm, demonstrating schemes representing different biases towards reliability. The Pareto fronts of six maximum allowable emission (MAE) scenarios were obtained, which indicated that decisions may be unreliable under impractical load reduction requirements. A decision scheme identification process was conducted using the back propagation neural network (BPNN) method to provide a shortcut for identifying schemes at specific reliability levels for decision makers. The model results indicated that the ROMO approach can offer decision makers great insights into reliability tradeoffs and can thus help them to avoid ineffective decisions. - Highlights: • Reliability-oriented multi-objective (ROMO) optimal decision approach was proposed. • The approach can avoid specifying reliability levels prior to optimization modeling. • Multiple reliability objectives can be systematically balanced using Pareto fronts. • Neural network model was used to identify schemes at specific reliability levels.

  9. Reliability-oriented multi-objective optimal decision-making approach for uncertainty-based watershed load reduction

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Feifei [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China); Liu, Yong, E-mail: yongliu@pku.edu.cn [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China); Institute of Water Sciences, Peking University, Beijing 100871 (China); Su, Han [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China); Zou, Rui [Tetra Tech, Inc., 10306 Eaton Place, Ste 340, Fairfax, VA 22030 (United States); Yunnan Key Laboratory of Pollution Process and Management of Plateau Lake-Watershed, Kunming 650034 (China); Guo, Huaicheng [College of Environmental Science and Engineering, Key Laboratory of Water and Sediment Sciences (MOE), Peking University, Beijing 100871 (China)

    2015-05-15

    Water quality management and load reduction are subject to inherent uncertainties in watershed systems and competing decision objectives. Optimal decision-making modeling in watershed load reduction therefore suffers from the following challenges: (a) it is difficult to obtain absolutely “optimal” solutions, and (b) decision schemes may be vulnerable to failure. The probability that solutions are feasible under uncertainties is defined as reliability. A reliability-oriented multi-objective (ROMO) decision-making approach was proposed in this study for optimal decision making with stochastic parameters and multiple decision reliability objectives. Lake Dianchi, one of the three most eutrophic lakes in China, was examined as a case study for optimal watershed nutrient load reduction to restore lake water quality. This study aimed to maximize reliability levels from considerations of cost and load reductions. The Pareto solutions of the ROMO optimization model were generated with a multi-objective evolutionary algorithm, demonstrating schemes representing different biases towards reliability. The Pareto fronts of six maximum allowable emission (MAE) scenarios were obtained, which indicated that decisions may be unreliable under impractical load reduction requirements. A decision scheme identification process was conducted using the back propagation neural network (BPNN) method to provide a shortcut for identifying schemes at specific reliability levels for decision makers. The model results indicated that the ROMO approach can offer decision makers great insights into reliability tradeoffs and can thus help them to avoid ineffective decisions. - Highlights: • Reliability-oriented multi-objective (ROMO) optimal decision approach was proposed. • The approach can avoid specifying reliability levels prior to optimization modeling. • Multiple reliability objectives can be systematically balanced using Pareto fronts. • Neural network model was used to identify schemes at specific reliability levels.
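
    Generating Pareto solutions, as this record describes, ultimately comes down to keeping the non-dominated schemes. A minimal non-dominated filter for a minimization problem (a naive O(n²) sketch, not the evolutionary algorithm used in the study) is:

    ```python
    def pareto_front(points):
        """Return the non-dominated subset of `points`, assuming every
        objective is to be minimized.  A point q dominates p if q is
        no worse in every objective and differs from p."""
        def dominates(q, p):
            return q != p and all(qi <= pi for qi, pi in zip(q, p))
        return [p for p in points if not any(dominates(q, p) for q in points)]
    ```

    For example, with cost/unreliability pairs [(1, 5), (2, 3), (3, 4), (4, 1)], the point (3, 4) is dominated by (2, 3) and drops out, leaving the trade-off front the decision maker chooses from.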

  10. Reliability-redundancy optimization by means of a chaotic differential evolution approach

    International Nuclear Information System (INIS)

    Coelho, Leandro dos Santos

    2009-01-01

    Reliability design is related to the performance analysis of many engineering systems. Reliability-redundancy optimization problems involve the selection of components with multiple choices and redundancy levels that produce maximum benefits, subject to cost, weight, and volume constraints. Classical mathematical methods have failed in handling the nonconvexities and nonsmoothness of such optimization problems. As an alternative to the classical optimization approaches, meta-heuristics have been given much attention by many researchers due to their ability to find an almost globally optimal solution to reliability-redundancy optimization problems. Evolutionary algorithms (EAs), paradigms of the evolutionary computation field, are stochastic and robust meta-heuristics useful for solving reliability-redundancy optimization problems. EAs such as genetic algorithms, evolutionary programming, evolution strategies, and differential evolution are being used to find global or near-global optimal solutions. A differential evolution approach based on chaotic sequences using Lozi's map for reliability-redundancy optimization problems is proposed in this paper. The proposed method has a fast convergence rate but also maintains the diversity of the population so as to escape from local optima. An application example in reliability-redundancy optimization based on the overspeed protection system of a gas turbine is given to show its usefulness and efficiency. Simulation results show that the application of deterministic chaotic sequences instead of random sequences is a possible strategy to improve the performance of differential evolution.
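
    The Lozi map that supplies the chaotic sequences is a two-dimensional piecewise-linear map, x_{k+1} = 1 - a*|x_k| + b*y_k, y_{k+1} = x_k. A minimal generator is sketched below; the parameters a = 1.7, b = 0.5 and the seed point are the conventional chaotic-regime choices, assumed here rather than taken from the paper:

    ```python
    def lozi_sequence(n, a=1.7, b=0.5, x0=0.1, y0=0.1):
        """Generate n iterates of the Lozi map
            x_{k+1} = 1 - a*|x_k| + b*y_k,   y_{k+1} = x_k,
        the deterministic chaotic sequence used in place of
        pseudo-random numbers inside differential evolution."""
        xs = []
        x, y = x0, y0
        for _ in range(n):
            x, y = 1.0 - a * abs(x) + b * y, x
            xs.append(x)
        return xs
    ```

    The sequence is fully deterministic and repeatable from the seed point, yet it wanders over the attractor irregularly enough to stand in for the random numbers that drive mutation in differential evolution.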

  11. Discrete event simulation versus conventional system reliability analysis approaches

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to the conventional human reliability analysis models and systems analysis methods such as fault and event trees and Bayesian networks. In its first part, the paper briefly describes the author's experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...

  12. Simulation Approach to Mission Risk and Reliability Analysis, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — It is proposed to develop and demonstrate an integrated total-system risk and reliability analysis approach that is based on dynamic, probabilistic simulation. This...

  13. Molecular approach of uranyl/mineral surfaces: experimental approach

    International Nuclear Information System (INIS)

    Drot, R.

    2009-01-01

    The author reports an experimental approach in which different spectroscopic techniques are coupled (laser spectroscopy, X-ray absorption spectroscopy, vibrational spectroscopy) to investigate the mechanisms controlling actinide sorption processes on different substrates, in order to assess radioactive waste storage site safety. Different substrates have been considered: monocrystalline or powdered TiO2, montmorillonite, and gibbsite.

  14. Monte Carlo simulation - a powerful tool to support experimental activities in structure reliability

    International Nuclear Information System (INIS)

    Yuritzinn, T.; Chapuliot, S.; Eid, M.; Masson, R.; Dahl, A.; Moinereau, D.

    2003-01-01

    Monte-Carlo Simulation (MCS) can have different uses in supporting structure reliability investigations and assessments. In this paper we focus our interest on the use of MCS as a numerical tool to support the fitting of the experimental data related to toughness experiments. (authors)

  15. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.

  16. Development of a Reliability Program approach to assuring operational nuclear safety

    International Nuclear Information System (INIS)

    Mueller, C.J.; Bezella, W.A.

    1985-01-01

    A Reliability Program (RP) model based on proven reliability techniques used in other high technology industries is being formulated for potential application in the nuclear power industry. Research findings are discussed. The reliability methods employed under NASA and military direction, commercial airline and related FAA programs were surveyed with several reliability concepts (e.g., quantitative reliability goals, reliability centered maintenance) appearing to be directly transferable. Other tasks in the RP development effort involved the benchmarking and evaluation of the existing nuclear regulations and practices relevant to safety/reliability integration. A review of current risk-dominant issues was also conducted using results from existing probabilistic risk assessment studies. The ongoing RP development tasks have concentrated on defining a RP for the operating phase of a nuclear plant's lifecycle. The RP approach incorporates safety systems risk/reliability analysis and performance monitoring activities with dedicated tasks that integrate these activities with operating, surveillance, and maintenance of the plant. The detection, root-cause evaluation and before-the-fact correction of incipient or actual systems failures as a mechanism for maintaining plant safety is a major objective of the RP

  17. The DYLAM approach for the dynamic reliability analysis of systems

    International Nuclear Information System (INIS)

    Cojazzi, Giacomo

    1996-01-01

    In many real systems, failures occurring to the components, control failures and human interventions often interact with the physical system evolution in such a way that a simple reliability analysis, de-coupled from process dynamics, is very difficult or even impossible. In the last ten years many dynamic reliability approaches have been proposed to properly assess the reliability of these systems characterized by dynamic interactions. The DYLAM methodology, now implemented in its latest version, DYLAM-3, offers a powerful tool for integrating deterministic and failure events. This paper describes the main features of the DYLAM-3 code with reference to the classic fault-tree and event-tree techniques. Some aspects connected to the practical problems underlying dynamic event-trees are also discussed. A simple system, already analyzed with other dynamic methods is used as a reference for the numerical applications. The same system is also studied with a time-dependent fault-tree approach in order to show some features of dynamic methods vs classical techniques. Examples including stochastic failures, without and with repair, failures on demand and time dependent failure rates give an extensive overview of DYLAM-3 capabilities
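The comparison with a time-dependent fault-tree approach rests on the standard relation R(t) = exp(-∫₀ᵗ λ(u) du). A hedged sketch of this relation for a Weibull-form failure rate (illustrative only, not part of DYLAM-3):

```python
import math

def reliability_weibull(t, scale, shape):
    """Survival probability under a time-dependent (Weibull) failure rate:
    R(t) = exp(-(t/scale)**shape); shape = 1 recovers a constant rate."""
    return math.exp(-(t / scale) ** shape)

# With shape = 1 and scale = 10 h, R(10) = exp(-1)
print(round(reliability_weibull(10.0, 10.0, 1.0), 6))  # → 0.367879
```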

  18. Approach to assurance of reliability of linear accelerator operation observations

    International Nuclear Information System (INIS)

    Bakov, S.M.; Borovikov, A.A.; Kavkun, S.L.

    1994-01-01

    The system approach to solving the task of assuring reliability of observations over the linear accelerator operation is proposed. The basic principles of this method consist in application of dependences between the facility parameters, decrease in the number of the system apparatus channels for data acquisition without replacement of failed channel by reserve one. The signal commutation unit, the introduction whereof into the data acquisition system essentially increases the reliability of the measurement system on the account of active reserve, is considered detail. 8 refs. 6 figs

  19. Reliability of an experimental method to analyse the impact point on a golf ball during putting.

    Science.gov (United States)

    Richardson, Ashley K; Mitchell, Andrew C S; Hughes, Gerwyn

    2015-06-01

This study aimed to examine the reliability of an experimental method for identifying the location of the impact point on a golf ball during putting. Forty trials were completed using a mechanical putting robot set to reproduce a putt of 3.2 m, with four different putter-ball combinations. After locating the centre of the dimple pattern (centroid), the following variables were tested: distance of the impact point from the centroid, angle of the impact point from the centroid, and distance of the impact point from the centroid derived from the X, Y coordinates. Good to excellent reliability was demonstrated in all impact variables, reflected in very strong relative (ICC = 0.98-1.00) and absolute reliability (SEM% = 0.9-4.3%). The highest SEM% observed was 7% for the angle of the impact point from the centroid. In conclusion, the experimental method was shown to be reliable at locating the centroid of a golf ball, therefore allowing identification of the point of impact with the putter head, and is suitable for use in subsequent studies.
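The SEM% figures above combine a reliability coefficient with trial-to-trial spread. A sketch of the usual computation (variable names and example values are illustrative; the ICC would come from a separate repeated-measures analysis):

```python
import numpy as np

def sem_percent(trials, icc):
    """Absolute reliability: SEM = SD * sqrt(1 - ICC), expressed as % of the mean."""
    trials = np.asarray(trials, dtype=float)
    sem = trials.std(ddof=1) * np.sqrt(1.0 - icc)
    return 100.0 * sem / trials.mean()

# Example: impact distances (mm) over repeated putts with ICC = 0.96
print(round(sem_percent([8, 9, 10, 11, 12], 0.96), 2))  # → 3.16
```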

  20. Correlating neutron yield and reliability for selecting experimental parameters for a plasma focus machine

    International Nuclear Information System (INIS)

    Pross, G.

The possibilities of optimizing focus machines of a given energy content for high neutron yield and high discharge reliability are investigated experimentally. For this purpose, a focus machine of the Mather type with an energy content of 12 kJ was constructed. The following experimental parameters were varied: the material of the insulator in the ignition zone, the structure of the outer electrode, the length of the inner electrode, the filling pressure, and the magnitude and polarity of the battery voltage. An important part of the diagnostic program consists of measurements of the azimuthal and axial current distribution in the accelerator, correlated with short-exposure photographs of the luminous front as a function of time. The results are given. A functional schematic has been drafted for the focus discharge as an aid to the systematic optimization of focus machines, combining findings from theory and experiments. The schematic takes into account the multiparameter character of the discharge and clarifies relationships between the experimental parameters and the target variables, neutron yield and reliability

  1. A heuristic-based approach for reliability importance assessment of energy producers

    International Nuclear Information System (INIS)

    Akhavein, A.; Fotuhi Firuzabad, M.

    2011-01-01

Reliability of energy supply is one of the most important issues of service quality. On one hand, customers usually have different expectations for service reliability and price. On the other hand, providing different levels of reliability at load points is a challenge for system operators. To make sound decisions and avoid difficulties in implementing reliability requirements, market players need to know the impacts of their assets on system and load-point reliabilities. One tool to specify the reliability impacts of assets is the criticality or reliability importance measure, by which system components can be ranked based on their effect on reliability. Conventional methods for determining reliability importance are essentially based on risk sensitivity analysis and hence impose a prohibitive computational burden in large power systems. An approach is proposed in this paper to determine the reliability importance of energy producers from the perspective of consumers or distribution companies in a composite generation and transmission system. In the presented method, while avoiding an immense computational burden, the energy producers are ranked based on their rating, unavailability and impact on power flows in the lines connecting to the considered load points. Study results on the IEEE reliability test system show successful application of the proposed method. - Research highlights: → Required reliability level at load points is a concern in modern power systems. → It is important to assess reliability importance of energy producers or generators. → Generators can be ranked based on their impacts on power flow to a selected area. → Ranking of generators is an efficient tool to assess their reliability importance.
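The conventional risk-sensitivity ranking the abstract refers to is often formalized as the Birnbaum importance measure, I_i = R(system | component i works) − R(system | component i fails). A generic sketch (the two-component series system here is a made-up example, not the IEEE test system):

```python
def birnbaum_importance(system_rel, p, i):
    """Birnbaum measure: sensitivity of system reliability to component i."""
    up = list(p); up[i] = 1.0      # component i perfectly reliable
    down = list(p); down[i] = 0.0  # component i failed
    return system_rel(up) - system_rel(down)

series2 = lambda p: p[0] * p[1]  # two components in series
print(birnbaum_importance(series2, [0.9, 0.8], 0))  # → 0.8
```

Ranking components by this measure identifies where a marginal reliability improvement buys the most system reliability.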

  2. Application of safety and reliability approaches in the power sector: Inside-sectoral overview

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

    This chapter summarizes the state-of-the-art and state-of-practice on the applications of safety and reliability approaches in the Power Sector. The nature and composition of this industrial sector including the characteristics of major hazards are summarized. The present situation with regard...... to a number of key technical aspects involved in the use of safety and reliability approaches in the power sector is discussed. Based on this review a Technology Maturity Matrix is synthesized. Barriers to the wider use of risk and reliability methods in the design and operation of power installations...... are identified and possible ways of overcoming these barriers are suggested. Key issues and priorities for research are identified....

  3. An approach for assessing ALWR passive safety system reliability

    International Nuclear Information System (INIS)

    Hake, T.M.

    1991-01-01

    Many of the advanced light water reactor (ALWR) concepts proposed for the next generation of nuclear power plants rely on passive rather than active systems to perform safety functions. Despite the reduced redundancy of the passive systems as compared to active systems in current plants, the assertion is that the overall safety of the plant is enhanced due to the much higher expected reliability of the passive systems. In order to investigate this assertion, a study is being conducted at Sandia National Laboratories to evaluate the reliability of ALWR passive safety features in the context of probabilistic risk assessment (PRA). The purpose of this paper is to provide a brief overview of the approach to this study. The quantification of passive system reliability is not as straightforward as for active systems, due to the lack of operating experience, and to the greater uncertainty in the governing physical phenomena. Thus, the adequacy of current methods for evaluating system reliability must be assessed, and alternatives proposed if necessary. For this study, the Westinghouse Advanced Passive 600 MWe reactor (AP600) was chosen as the advanced reactor for analysis, because of the availability of AP600 design information. This study compares the reliability of AP600 emergency cooling system with that of corresponding systems in a current generation reactor

  4. Using Bayesian belief networks for reliability management : construction and evaluation: a step by step approach

    NARCIS (Netherlands)

    Houben, M.J.H.A.

    2010-01-01

    In the capital goods industry, there is a growing need to manage reliability throughout the product development process. A number of trends can be identified that have a strong effect on the way in which reliability prediction and management is approached, i.e.: - The lifecycle costs approach that

  5. Experimental research of fuel element reliability

    International Nuclear Information System (INIS)

    Cech, B.; Novak, J.; Chamrad, B.

    1980-01-01

The rate and extent of damage to the can's integrity against fission-product release is the basic criterion of reliability. The extent of damage is measurable by the fission product leakage into the reactor coolant circuit. An analysis is made of the causes of fuel element can damage and a model is proposed for testing fuel element reliability. Special experiments should be carried out to assess partial processes, such as heat transfer and fuel element surface temperature, fission gas release and pressure changes inside the element, corrosion weakening of the can wall, and can deformation as a result of mechanical interactions. The irradiation probe for reliability testing of fuel elements is described. (M.S.)

  6. Reliability assessment using degradation models: bayesian and classical approaches

    Directory of Open Access Journals (Sweden)

    Marta Afonso Freitas

    2010-04-01

Traditionally, reliability assessment of devices has been based on (accelerated) life tests. However, for highly reliable products, little information about reliability is provided by life tests, in which few or no failures are typically observed. Since most failures arise from a degradation mechanism at work, with characteristics that degrade over time, one alternative is to monitor the device for a period of time and assess its reliability from the changes in performance (degradation) observed during that period. The goal of this article is to illustrate how degradation data can be modeled and analyzed by using "classical" and Bayesian approaches. Four methods of data analysis based on classical inference are presented. Next we show how Bayesian methods can also be used to provide a natural approach to analyzing degradation data. The approaches are applied to a real data set on train wheel degradation.
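One of the "classical" routes is the two-stage (approximate) method: fit a degradation path per unit, extrapolate each path to the failure threshold, and treat the crossing times as pseudo failure times. An illustrative sketch assuming linear paths (not the article's actual models or data):

```python
import numpy as np

def pseudo_failure_times(times, paths, threshold):
    """Stage 1: fit a linear degradation path per unit.
    Stage 2: extrapolate each path to the failure threshold."""
    out = []
    for y in paths:
        slope, intercept = np.polyfit(times, y, 1)
        out.append((threshold - intercept) / slope)
    return np.array(out)

# Two units wearing at different rates; failure when wear reaches 10 units
print(pseudo_failure_times([0, 1, 2], [[0, 1, 2], [0, 2, 4]], 10.0))
```

The pseudo failure times can then be handed to an ordinary lifetime analysis (e.g., a Weibull fit).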

  7. Experimental design research approaches, perspectives, applications

    CERN Document Server

    Stanković, Tino; Štorga, Mario

    2016-01-01

    This book presents a new, multidisciplinary perspective on and paradigm for integrative experimental design research. It addresses various perspectives on methods, analysis and overall research approach, and how they can be synthesized to advance understanding of design. It explores the foundations of experimental approaches and their utility in this domain, and brings together analytical approaches to promote an integrated understanding. The book also investigates where these approaches lead to and how they link design research more fully with other disciplines (e.g. psychology, cognition, sociology, computer science, management). Above all, the book emphasizes the integrative nature of design research in terms of the methods, theories, and units of study—from the individual to the organizational level. Although this approach offers many advantages, it has inherently led to a situation in current research practice where methods are diverging and integration between individual, team and organizational under...

  8. Experimental design a chemometric approach

    CERN Document Server

    Deming, SN

    1987-01-01

Now available in a paperback edition is a book which has been described as "...an exceptionally lucid, easy-to-read presentation... would be an excellent addition to the collection of every analytical chemist. I recommend it with great enthusiasm." (Analytical Chemistry). Unlike most current textbooks, it approaches experimental design from the point of view of the experimenter, rather than that of the statistician. As the reviewer in Analytical Chemistry went on to say: "Deming and Morgan should be given high praise for bringing the principles of experimental design to the level of the p

  9. The reliability paradox: Why robust cognitive tasks do not produce reliable individual differences.

    Science.gov (United States)

    Hedge, Craig; Powell, Georgina; Sumner, Petroc

    2017-07-19

    Individual differences in cognitive paradigms are increasingly employed to relate cognition to brain structure, chemistry, and function. However, such efforts are often unfruitful, even with the most well established tasks. Here we offer an explanation for failures in the application of robust cognitive paradigms to the study of individual differences. Experimental effects become well established - and thus those tasks become popular - when between-subject variability is low. However, low between-subject variability causes low reliability for individual differences, destroying replicable correlations with other factors and potentially undermining published conclusions drawn from correlational relationships. Though these statistical issues have a long history in psychology, they are widely overlooked in cognitive psychology and neuroscience today. In three studies, we assessed test-retest reliability of seven classic tasks: Eriksen Flanker, Stroop, stop-signal, go/no-go, Posner cueing, Navon, and Spatial-Numerical Association of Response Code (SNARC). Reliabilities ranged from 0 to .82, being surprisingly low for most tasks given their common use. As we predicted, this emerged from low variance between individuals rather than high measurement variance. In other words, the very reason such tasks produce robust and easily replicable experimental effects - low between-participant variability - makes their use as correlational tools problematic. We demonstrate that taking such reliability estimates into account has the potential to qualitatively change theoretical conclusions. The implications of our findings are that well-established approaches in experimental psychology and neuropsychology may not directly translate to the study of individual differences in brain structure, chemistry, and function, and alternative metrics may be required.
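The attenuation the authors describe follows Spearman's classic formula: the observed correlation equals the true correlation scaled by the square root of the product of the two measures' reliabilities. A minimal illustration:

```python
import math

def attenuated_r(true_r, rel_x, rel_y):
    """Spearman's attenuation: r_observed = r_true * sqrt(rel_x * rel_y)."""
    return true_r * math.sqrt(rel_x * rel_y)

# A true correlation of .5, measured with a task of reliability .36
# and a perfectly reliable criterion, shrinks to .3
print(round(attenuated_r(0.5, 0.36, 1.0), 3))  # → 0.3
```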

  10. Approaches to safety, environment and regulatory approval for the International Thermonuclear Experimental Reactor

    International Nuclear Information System (INIS)

    Saji, G.; Bartels, H.W.; Chuyanov, V.; Holland, D.; Kashirski, A.V.; Morozov, S.I.; Piet, S.J.; Poucet, A.; Raeder, J.; Rebut, P.H.; Topilski, L.N.

    1995-01-01

    International Thermonuclear Experimental Reactor (ITER) Engineering Design Activities (EDA) in safety and environment are approaching the point where conceptual safety design, topic studies and research will give way to project oriented engineering design activities. The Joint Central Team (JCT) is promoting safety design and analysis necessary for siting and regulatory approval. Scoping studies are underway at the general level, in terms of laying out the safety and environmental design framework for ITER. ITER must follow the nuclear regulations of the host country as the future construction site of ITER. That is, regulatory approval is required before construction of ITER. Thus, during the EDA, some preparations are necessary for the future application for regulatory approval. Notwithstanding the future host country's jurisdictional framework of nuclear regulations, the primary responsibility for safety and reliability of ITER rests with the legally responsible body which will operate ITER. Since scientific utilization of ITER and protection of the large investment depends on safe and reliable operation of ITER, we are highly motivated to achieve maximum levels of operability, maintainability, and safety. ITER will be the first fusion facility in which overall 'nuclear safety' provisions need to be integrated into the facility. For example, it will be the first fusion facility with significant decay heat and structural radiational damage. Since ITER is an experimental facility, it is also important that necessary experiments can be performed within some safety design limits without requiring extensive regulatory procedures. ITER will be designed with such a robust safety envelope compatible with the fusion power and the energy inventories. The basic approach to safety will be realized by 'defense-in-depth'. (orig.)

  11. A New Two-Step Approach for Hands-On Teaching of Gene Technology: Effects on Students' Activities During Experimentation in an Outreach Gene Technology Lab

    Science.gov (United States)

    Scharfenberg, Franz-Josef; Bogner, Franz X.

    2011-08-01

Emphasis on improving higher level biology education continues. A new two-step approach to the experimental phases within an outreach gene technology lab, derived from cognitive load theory, is presented. We compared our approach using a quasi-experimental design with the conventional one-step mode. The difference consisted of additional focused discussions combined with students writing down their ideas (step one) prior to starting any experimental procedure (step two). We monitored students' activities during the experimental phases by continuously videotaping 20 work groups within each approach (N = 131). Subsequent classification of students' activities yielded 10 categories (with well-fitting intra- and inter-observer scores with respect to reliability). Based on the students' individual time budgets, we evaluated students' roles during experimentation from their prevalent activities (by independently using two cluster analysis methods). Independently of the approach, two common clusters emerged, which we labeled as 'all-rounders' and as 'passive students', and two clusters specific to each approach: 'observers' as well as 'high-experimenters' were identified only within the one-step approach, whereas under the two-step conditions 'managers' and 'scribes' were identified. Potential changes in group-leadership style during experimentation are discussed, and conclusions for optimizing science teaching are drawn.

  12. Interactions among biotic and abiotic factors affect the reliability of tungsten microneedles puncturing in vitro and in vivo peripheral nerves: A hybrid computational approach

    Energy Technology Data Exchange (ETDEWEB)

    Sergi, Pier Nicola, E-mail: p.sergi@sssup.it [Translational Neural Engineering Laboratory, The Biorobotics Institute, Scuola Superiore Sant' Anna, Viale Rinaldo Piaggio 34, Pontedera, 56025 (Italy); Jensen, Winnie [Department of Health Science and Technology, Fredrik Bajers Vej 7, 9220 Aalborg (Denmark); Yoshida, Ken [Department of Biomedical Engineering, Indiana University - Purdue University Indianapolis, 723 W. Michigan St., SL220, Indianapolis, IN 46202 (United States)

    2016-02-01

Tungsten is an elective material to produce slender and stiff microneedles able to enter soft tissues and minimize puncture wounds. In particular, tungsten microneedles are used to puncture peripheral nerves and insert neural interfaces, bridging the gap between the nervous system and robotic devices (e.g., hand prostheses). Unfortunately, microneedles fail during the puncture process and this failure is not dependent on the stiffness or fracture toughness of the constituent material. In addition, the microneedles' performance decreases during in vivo trials with respect to the in vitro ones. This further effect is independent of internal biotic effects, while it seems to be related to external biotic causes. Since the exact synergy of phenomena decreasing the in vivo reliability is still not known, this work explored the connection between the in vitro and in vivo behavior of tungsten microneedles through the study of interactions between biotic and abiotic factors. A hybrid computational approach, simultaneously using theoretical relationships and in silico models of nerves, was implemented to model the change of reliability with varying microneedle diameter, and to predict in vivo performance by using in vitro reliability and local differences between the in vivo and in vitro mechanical response of nerves. - Highlights: • We provide phenomenological Finite Element (FE) models of peripheral nerves to study the interactions with W microneedles • We provide a general interaction-based approach to model the reliability of slender microneedles • We evaluate the reliability of W microneedles to puncture in vivo nerves • We provide a novel synergistic hybrid approach (theory + simulations) involving interactions among biotic and abiotic factors • We validate the hybrid approach by using experimental data from literature.

  13. Interactions among biotic and abiotic factors affect the reliability of tungsten microneedles puncturing in vitro and in vivo peripheral nerves: A hybrid computational approach

    International Nuclear Information System (INIS)

    Sergi, Pier Nicola; Jensen, Winnie; Yoshida, Ken

    2016-01-01

Tungsten is an elective material to produce slender and stiff microneedles able to enter soft tissues and minimize puncture wounds. In particular, tungsten microneedles are used to puncture peripheral nerves and insert neural interfaces, bridging the gap between the nervous system and robotic devices (e.g., hand prostheses). Unfortunately, microneedles fail during the puncture process and this failure is not dependent on the stiffness or fracture toughness of the constituent material. In addition, the microneedles' performance decreases during in vivo trials with respect to the in vitro ones. This further effect is independent of internal biotic effects, while it seems to be related to external biotic causes. Since the exact synergy of phenomena decreasing the in vivo reliability is still not known, this work explored the connection between the in vitro and in vivo behavior of tungsten microneedles through the study of interactions between biotic and abiotic factors. A hybrid computational approach, simultaneously using theoretical relationships and in silico models of nerves, was implemented to model the change of reliability with varying microneedle diameter, and to predict in vivo performance by using in vitro reliability and local differences between the in vivo and in vitro mechanical response of nerves. - Highlights: • We provide phenomenological Finite Element (FE) models of peripheral nerves to study the interactions with W microneedles • We provide a general interaction-based approach to model the reliability of slender microneedles • We evaluate the reliability of W microneedles to puncture in vivo nerves • We provide a novel synergistic hybrid approach (theory + simulations) involving interactions among biotic and abiotic factors • We validate the hybrid approach by using experimental data from literature

  14. Reliability analysis with linguistic data: An evidential network approach

    International Nuclear Information System (INIS)

    Zhang, Xiaoge; Mahadevan, Sankaran; Deng, Xinyang

    2017-01-01

    In practical applications of reliability assessment of a system in-service, information about the condition of a system and its components is often available in text form, e.g., inspection reports. Estimation of the system reliability from such text-based records becomes a challenging problem. In this paper, we propose a four-step framework to deal with this problem. In the first step, we construct an evidential network with the consideration of available knowledge and data. Secondly, we train a Naive Bayes text classification algorithm based on the past records. By using the trained Naive Bayes algorithm to classify the new records, we build interval basic probability assignments (BPA) for each new record available in text form. Thirdly, we combine the interval BPAs of multiple new records using an evidence combination approach based on evidence theory. Finally, we propagate the interval BPA through the evidential network constructed earlier to obtain the system reliability. Two numerical examples are used to demonstrate the efficiency of the proposed method. We illustrate the effectiveness of the proposed method by comparing with Monte Carlo Simulation (MCS) results. - Highlights: • We model reliability analysis with linguistic data using evidential network. • Two examples are used to demonstrate the efficiency of the proposed method. • We compare the results with Monte Carlo Simulation (MCS).
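The evidence-combination step in the third stage of the framework typically uses Dempster's rule. A toy sketch over a two-state frame (component 'W'orking / 'F'ailed); the focal sets and masses are invented for illustration:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: conjunctive combination of two BPAs,
    renormalized by the total conflicting mass."""
    combined, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

W, WF = frozenset("W"), frozenset("WF")
m = {W: 0.6, WF: 0.4}          # one report: 'working' with mass 0.6, rest unknown
print(dempster_combine(m, m))  # two concordant reports raise the mass on 'working'
```

Combining two concordant reports concentrates mass on the 'working' state (0.84 here), which is the behavior the evidential-network propagation relies on.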

  15. Combined computational and experimental approach to improve the assessment of mitral regurgitation by echocardiography.

    Science.gov (United States)

    Sonntag, Simon J; Li, Wei; Becker, Michael; Kaestner, Wiebke; Büsen, Martin R; Marx, Nikolaus; Merhof, Dorit; Steinseifer, Ulrich

    2014-05-01

    Mitral regurgitation (MR) is one of the most frequent valvular heart diseases. To assess MR severity, color Doppler imaging (CDI) is the clinical standard. However, inadequate reliability, poor reproducibility and heavy user-dependence are known limitations. A novel approach combining computational and experimental methods is currently under development aiming to improve the quantification. A flow chamber for a circulatory flow loop was developed. Three different orifices were used to mimic variations of MR. The flow field was recorded simultaneously by a 2D Doppler ultrasound transducer and Particle Image Velocimetry (PIV). Computational Fluid Dynamics (CFD) simulations were conducted using the same geometry and boundary conditions. The resulting computed velocity field was used to simulate synthetic Doppler signals. Comparison between PIV and CFD shows a high level of agreement. The simulated CDI exhibits the same characteristics as the recorded color Doppler images. The feasibility of the proposed combination of experimental and computational methods for the investigation of MR is shown and the numerical methods are successfully validated against the experiments. Furthermore, it is discussed how the approach can be used in the long run as a platform to improve the assessment of MR quantification.

  16. Design and experimentation of an empirical multistructure framework for accurate, sharp and reliable hydrological ensembles

    Science.gov (United States)

    Seiller, G.; Anctil, F.; Roy, R.

    2017-09-01

    This paper outlines the design and experimentation of an Empirical Multistructure Framework (EMF) for lumped conceptual hydrological modeling. This concept is inspired from modular frameworks, empirical model development, and multimodel applications, and encompasses the overproduce and select paradigm. The EMF concept aims to reduce subjectivity in conceptual hydrological modeling practice and includes model selection in the optimisation steps, reducing initial assumptions on the prior perception of the dominant rainfall-runoff transformation processes. EMF generates thousands of new modeling options from, for now, twelve parent models that share their functional components and parameters. Optimisation resorts to ensemble calibration, ranking and selection of individual child time series based on optimal bias and reliability trade-offs, as well as accuracy and sharpness improvement of the ensemble. Results on 37 snow-dominated Canadian catchments and 20 climatically-diversified American catchments reveal the excellent potential of the EMF in generating new individual model alternatives, with high respective performance values, that may be pooled efficiently into ensembles of seven to sixty constitutive members, with low bias and high accuracy, sharpness, and reliability. A group of 1446 new models is highlighted to offer good potential on other catchments or applications, based on their individual and collective interests. An analysis of the preferred functional components reveals the importance of the production and total flow elements. Overall, results from this research confirm the added value of ensemble and flexible approaches for hydrological applications, especially in uncertain contexts, and open up new modeling possibilities.

  17. A structural approach to constructing perspective efficient and reliable human-computer interfaces

    International Nuclear Information System (INIS)

    Balint, L.

    1989-01-01

    The principles of human-computer interface (HCI) realization are investigated with the aim of moving towards a general framework and thus a more solid basis for constructing efficient, reliable and cost-effective human-computer interfaces. On the basis of characterizing and classifying the different HCI solutions, the fundamental problems of interface construction are pointed out, especially with respect to possibilities of human error. The evolution of HCI realizations is illustrated by summarizing the main properties of past, present and foreseeable future interface generations. HCI modeling is identified as a crucial problem in both theoretical and practical investigations. Suggestions are presented concerning HCI structure (hierarchy and modularity), HCI functional dynamics (mapping from input to output information), minimization of system failures caused by human error (error tolerance, error recovery and error correction), and cost-effective HCI design and realization methodology (universal and application-oriented vs. application-specific solutions). The concept of RISC-based and SCAMP-type HCI components is introduced with the aim of having a reduced interaction scheme in communication and a well-defined architecture in the components' internal structure. HCI efficiency and reliability are dealt with by taking into account complexity and flexibility. The application of fast computerized prototyping is also briefly investigated as an experimental device for achieving simple, parametrized, invariant HCI models. Finally, a concise outline of an approach to constructing ideal HCIs is suggested, emphasizing the open questions and the need for future work related to the proposals. (author). 14 refs, 6 figs

  18. Force-based and displacement-based reliability assessment approaches for highway bridges under multiple hazard actions

    Directory of Open Access Journals (Sweden)

    Chao Huang

    2015-08-01

    The strength limit state of the American Association of State Highway and Transportation Officials (AASHTO) Load and Resistance Factor Design (LRFD) Bridge Design Specifications is developed based on the failure probabilities of the combination of non-extreme loads. The proposed design limit state equation (DLSE) has been fully calibrated for dead load and live load using the reliability-based approach. On the other hand, most DLSEs for other limit states, including extreme events I and II, have not been developed and calibrated, though certain probability-based concepts are taken into account. This paper presents an assessment procedure for highway bridge reliability under the limit state of extreme event I, i.e., the combination of dead load, live load and earthquake load. A force-based approach and a displacement-based approach are proposed and implemented on a set of nine simplified bridge models. Results show that the displacement-based approach yields more convergent and accurate reliability estimates for the selected models, and can be applied to other hazards.

  19. A comprehensive approach to identify reliable reference gene candidates to investigate the link between alcoholism and endocrinology in Sprague-Dawley rats.

    Directory of Open Access Journals (Sweden)

    Faten A Taki

    Gender and hormonal differences are often correlated with alcohol dependence and related complications such as addiction and breast cancer. Estrogen (E2) is an important sex hormone because it serves as a key protein involved in organism-level signaling pathways. Alcoholism has been reported to affect estrogen receptor signaling; however, identifying the players involved in such a multi-faceted syndrome is complex and requires an interdisciplinary approach. In many situations, preliminary investigations rely on straightforward yet informative biotechniques such as gene expression analysis using quantitative real-time PCR (qRT-PCR). The validity of qRT-PCR-based conclusions is affected by the choice of reliable internal controls. With this in mind, we compiled a list of 15 commonly used housekeeping genes (HKGs) as potential reference gene candidates in rat biological models. A comprehensive comparison among five statistical approaches (geNorm, dCt method, NormFinder, BestKeeper, and RefFinder) was performed to identify the minimal number, as well as the most stable, reference genes required for reliable normalization in experimental rat groups comprising sham-operated (SO) rats and ovariectomized rats in the absence (OVX) or presence of E2 (OVXE2). These rat groups were subdivided into subgroups that received alcohol in a liquid diet or an isocaloric control liquid diet for 12 weeks. Our results showed that U87, 5S rRNA, GAPDH, and U5a were the most reliable reference gene candidates in heart and brain tissue. However, a different gene stability ranking was specific to each tissue input combination. The present preliminary findings highlight the variability in reference gene rankings across different experimental conditions and analytic methods and constitute a fundamental step for gene expression assays.

  20. IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    G. W. Parry; J. A. Forester; V. N. Dang; S. M. L. Hendrickson; M. Presley; E. Lois; J. Xing

    2013-09-01

    This paper describes IDHEAS (Integrated Decision-Tree Human Event Analysis System), a method developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA), based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method: the performance of a detailed cognitive task analysis documented in a crew response tree (CRT); the development of the associated timeline to identify the critical tasks, i.e., those whose failure results in a human failure event (HFE); and an approach to quantification based on explanations of why the HFE might occur.

  1. QA support for TFTR reliability improvement program in preparation for DT operation

    International Nuclear Information System (INIS)

    Parsells, R.F.; Howard, H.P.

    1987-01-01

    As TFTR approaches experiments in the Q=1 regime, machine reliability becomes a major variable in achieving experimental objectives. This paper describes the methods used to quantify current reliability levels and the levels required for D-T operations, proposes methods for reliability growth and improvement, and describes tracking of reliability performance during that growth. Included in this scope are data collection techniques and their shortcomings, upper bounds on current reliability, and requirements for D-T operations. Problem characterization through Pareto diagrams provides insight into recurrent failure modes, and the use of Duane plots for charting both cumulative and instantaneous reliability changes is explained and demonstrated.
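
    The Duane plot mentioned above charts cumulative MTBF against cumulative test time on log-log axes; a straight-line fit gives the reliability-growth slope. A minimal sketch follows, using the textbook Duane power-law model with invented failure times, not TFTR data:

```python
import numpy as np

def duane_analysis(failure_times):
    """Duane reliability-growth analysis (power-law model).

    failure_times: cumulative test times at which successive failures
    occurred (sorted, increasing). Returns the growth slope alpha, the
    intercept b, and the cumulative and instantaneous MTBF series.
    """
    t = np.asarray(failure_times, dtype=float)
    n = np.arange(1, len(t) + 1)           # cumulative failure count
    mtbf_cum = t / n                       # cumulative MTBF at each failure
    # Duane postulate: log(MTBF_cum) = b + alpha * log(t)
    alpha, b = np.polyfit(np.log(t), np.log(mtbf_cum), 1)
    # Instantaneous MTBF implied by the power-law model
    mtbf_inst = mtbf_cum / (1.0 - alpha)
    return alpha, b, mtbf_cum, mtbf_inst

# Failure times (hours) from a hypothetical test programme:
times = [25, 70, 150, 300, 540, 900, 1500, 2400]
alpha, b, m_cum, m_inst = duane_analysis(times)
print(f"growth slope alpha = {alpha:.2f}")   # alpha > 0 indicates growth
```

A slope between 0 and 1 indicates reliability growth; the closer to 1, the faster the improvement.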

  2. Forecasting systems reliability based on support vector regression with genetic algorithms

    International Nuclear Information System (INIS)

    Chen, K.-Y.

    2007-01-01

    This study applies a novel neural-network technique, support vector regression (SVR), to forecast reliability in engine systems. The aim of this study is to examine the feasibility of SVR in systems reliability prediction by comparing it with existing neural-network approaches and the autoregressive integrated moving average (ARIMA) model. To build an effective SVR model, SVR's parameters must be set carefully. This study proposes a novel approach, known as GA-SVR, which searches for SVR's optimal parameters using real-valued genetic algorithms, and then adopts the optimal parameters to construct the SVR models. Real reliability data for 40 sets of turbochargers were employed as the data set. The experimental results demonstrate that SVR outperforms the existing neural-network approaches and the traditional ARIMA models based on the normalized root mean square error and mean absolute percentage error.
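
    The GA-SVR idea described above, a real-valued genetic search over SVR's hyperparameters, can be illustrated roughly as follows. This is a minimal sketch, not the paper's implementation: the synthetic data, population size, gene ranges and mutation scale are all invented, and scikit-learn's SVR stands in for the authors' model.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
# Synthetic stand-in for a reliability time series (invented data).
X = np.linspace(0, 4, 120).reshape(-1, 1)
y = np.sin(1.5 * X).ravel() + rng.normal(0, 0.1, 120)
Xtr, Xva, ytr, yva = train_test_split(X, y, test_size=0.3, random_state=0)

def fitness(ind):
    """Validation RMSE of an SVR with genes = (log10 C, log10 gamma)."""
    model = SVR(C=10 ** ind[0], gamma=10 ** ind[1]).fit(Xtr, ytr)
    return mean_squared_error(yva, model.predict(Xva)) ** 0.5

# Real-valued GA: truncation selection plus Gaussian mutation.
pop = rng.uniform([-1, -2], [3, 1], size=(12, 2))
for _ in range(15):                              # generations
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[:6]]        # keep the best half
    pop = np.clip(parents[rng.integers(0, 6, 12)]
                  + rng.normal(0, 0.2, (12, 2)), [-1, -2], [3, 1])
best = min(pop, key=fitness)
print("best C=%.3g gamma=%.3g rmse=%.3f"
      % (10 ** best[0], 10 ** best[1], fitness(best)))
```

In practice one would add elitism and a stopping criterion; the point here is only the encoding of (C, gamma) as real-valued genes scored by validation error.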

  3. Experimental approach to high power long duration neutral beams

    International Nuclear Information System (INIS)

    Horiike, Hiroshi

    1981-12-01

    Experimental studies of ion sources and beam dumps for the development of a high power, long duration neutral beam injector for JT-60 are presented. Long pulse operation of high power beams requires a high degree of reliability. To develop a reliable ion source with a large extraction area, a new duoPIGatron ion source with a coaxially shaped intermediate electrode is proposed and tested. The magnetic configuration is examined numerically to obtain a high current arc discharge and a source plasma with small density variation. Experimental results show that primary electrons were fed widely from the cathode plasma region to the source plasma region and that a dense, uniform source plasma could be obtained easily. Source plasma characteristics are studied and compared with those of other sources. To develop the extraction electrode of a high power ion source, experimental studies were made on the cooling of the electrode; long pulse beams were extracted safely under conditions of high heat loading on the electrode. Finally, a burnout study for the development of high power beam dumps is presented. Burnout data were obtained from subcooled forced-convective boiling of water in a copper finned tube irradiated by high power ion beams. The results yield simple burnout correlations which can be used to predict the burnout heat flux of the beam dump. (author)

  4. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. Here, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment; the nature of human error; classification of errors in man-machine systems; practical aspects; human reliability modelling in complex situations; quantification and examination of human reliability; judgement-based approaches; holistic techniques; and decision analytic approaches. (UK)

  5. Reliability optimization using multiobjective ant colony system approaches

    International Nuclear Information System (INIS)

    Zhao Jianhua; Liu Zhaoheng; Dao, M.-T.

    2007-01-01

    The multiobjective ant colony system (ACS) meta-heuristic has been developed to provide solutions for the reliability optimization problem of series-parallel systems. This type of problem involves selecting components with multiple choices and redundancy levels that produce maximum benefits, subject to cost and weight constraints at the system level. These are very common and realistic problems encountered in the conceptual design of many engineering systems. It is becoming increasingly important to develop efficient solutions to these problems because many mechanical and electrical systems are becoming more complex, even as development schedules get shorter and reliability requirements become very stringent. The multiobjective ACS algorithm offers distinct advantages for these problems compared with alternative optimization methods, and can be applied to a more diverse problem domain with respect to the type or size of the problems. Through the combination of probabilistic search, multiobjective formulation of local moves and the dynamic penalty method, the multiobjective ACS for the redundancy allocation problem (ACSRAP) allows us to obtain an optimal design solution very frequently and more quickly than with some other heuristic approaches. The proposed algorithm was successfully applied to an engineering design problem of a gearbox with multiple stages.

  6. Building and integrating reliability models in a Reliability-Centered-Maintenance approach

    International Nuclear Information System (INIS)

    Verite, B.; Villain, B.; Venturini, V.; Hugonnard, S.; Bryla, P.

    1998-03-01

    Electricite de France (EDF) has recently developed its OMF-Structures method, designed to optimize risk-based preventive maintenance of passive structures such as pipes and supports. In particular, the reliability performance of components needs to be determined; this is a two-step process, consisting of a qualitative sort followed by a quantitative evaluation, involving two types of models. Initially, degradation models are widely used to exclude some components from the field of preventive maintenance. The reliability of the remaining components is then evaluated by means of quantitative reliability models. The results are then included in a risk indicator that is used to directly optimize preventive maintenance tasks. (author)

  7. A shortened version of the THERP/Handbook approach to human reliability analysis for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Swain, A.D.

    1986-01-01

    The approach to human reliability analysis (HRA) known as THERP/Handbook has been applied to several probabilistic risk assessments (PRAs) of nuclear power plants (NPPs) and other complex systems. The approach is based on a thorough task analysis of the man-machine interfaces, including the interactions among the people involved in the operations being assessed. The idea is to fully assess the underlying performance shaping factors (PSFs) and dependence effects which result in either reliable or unreliable human performance.

  8. Reliable Rescue Routing Optimization for Urban Emergency Logistics under Travel Time Uncertainty

    Directory of Open Access Journals (Sweden)

    Qiuping Li

    2018-02-01

    The reliability of rescue routes is critical for urban emergency logistics during disasters. However, studies on reliable rescue routing under stochastic networks are still rare. This paper proposes a multiobjective rescue routing model for urban emergency logistics under travel time reliability. A hybrid metaheuristic integrating ant colony optimization (ACO) and tabu search (TS) was designed to solve the model. An experiment optimizing rescue routing plans under a real urban storm event was carried out to validate the proposed model. The experimental results showed how our approach can improve rescue efficiency with high travel time reliability.

  9. Creation of reliable relevance judgments in information retrieval systems evaluation experimentation through crowdsourcing: a review.

    Science.gov (United States)

    Samimi, Parnia; Ravana, Sri Devi

    2014-01-01

    Test collections are used to evaluate information retrieval systems in laboratory-based evaluation experimentation. In a classic setting, generating relevance judgments involves human assessors and is a costly and time-consuming task. Researchers and practitioners are still challenged to perform reliable and low-cost evaluation of retrieval systems. Crowdsourcing, as a novel method of data acquisition, is broadly used in many research fields. It has been shown that crowdsourcing is an inexpensive and quick solution, as well as a reliable alternative, for creating relevance judgments. One of the crowdsourcing applications in IR is judging the relevance of query-document pairs. In order to have a successful crowdsourcing experiment, the relevance judgment tasks should be designed precisely to emphasize quality control. This paper explores the different factors that influence the accuracy of relevance judgments accomplished by workers, and how to strengthen the reliability of judgments in a crowdsourcing experiment.

  10. A fuzzy-based reliability approach to evaluate basic events of fault tree analysis for nuclear power plant probabilistic safety assessment

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry

    2014-01-01

    Highlights: • We propose a fuzzy-based reliability approach to evaluate basic event reliabilities. • It implements the concepts of failure possibilities and fuzzy sets. • Experts evaluate basic event failure possibilities using qualitative words. • Triangular fuzzy numbers mathematically represent qualitative failure possibilities. • It is a very good alternative to the conventional reliability approach. - Abstract: Fault tree analysis has been widely utilized as a tool for nuclear power plant probabilistic safety assessment. This analysis can be completed only if all basic events of the system fault tree have quantitative failure rates or failure probabilities. However, it is difficult to obtain those failure data due to insufficient data, changing environments or new components. This study proposes a fuzzy-based reliability approach to evaluate basic events of system fault trees whose precise probability distributions of lifetime to failure are not available. It applies the concept of failure possibilities to qualitatively evaluate basic events and the concept of fuzzy sets to quantitatively represent the corresponding failure possibilities. To demonstrate the feasibility and effectiveness of the proposed approach, the actual basic event failure probabilities collected from the operational experience of the Davis-Besse design of the Babcock and Wilcox reactor protection system fault tree are used to benchmark the failure probabilities generated by the proposed approach. The results confirm that the proposed fuzzy-based reliability approach is a suitable alternative to the conventional probabilistic reliability approach when basic events do not have the corresponding quantitative historical failure data for determining their reliability characteristics. Hence, it overcomes the limitation of conventional fault tree analysis for nuclear power plant probabilistic safety assessment.
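
    The mechanics outlined in the highlights, qualitative words mapped to triangular fuzzy numbers and then converted into a failure probability, can be sketched roughly as follows. The qualitative scale and membership values are invented for illustration, and the possibility-to-probability conversion shown is one common Onisawa-style choice, not necessarily the paper's.

```python
# Hypothetical qualitative scale -> triangular fuzzy numbers (a, m, b)
# on a [0, 1] failure-possibility axis (values are assumptions).
SCALE = {
    "very low":  (0.0, 0.1, 0.2),
    "low":       (0.1, 0.25, 0.4),
    "medium":    (0.3, 0.5, 0.7),
    "high":      (0.6, 0.75, 0.9),
    "very high": (0.8, 0.9, 1.0),
}

def centroid(tfn):
    """Centroid defuzzification of a triangular fuzzy number."""
    a, m, b = tfn
    return (a + m + b) / 3.0

def possibility_to_probability(fps):
    """Onisawa-style conversion of a failure possibility score into a
    failure probability (one common choice, not the only one)."""
    if fps <= 0:
        return 0.0
    k = ((1.0 - fps) / fps) ** (1.0 / 3.0) * 2.301
    return 1.0 / (10.0 ** k)

for word in ("low", "medium", "high"):
    fps = centroid(SCALE[word])
    print(f"{word:>6}: possibility={fps:.3f} "
          f"probability={possibility_to_probability(fps):.2e}")
```

The conversion is monotone, so expert judgments expressed as words yield an ordered set of basic event probabilities that can feed a conventional fault tree quantification.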

  11. Reliability-based failure cause assessment of collapsed bridge during construction

    International Nuclear Information System (INIS)

    Choi, Hyun-Ho; Lee, Sang-Yoon; Choi, Il-Yoon; Cho, Hyo-Nam; Mahadevan, Sankaran

    2006-01-01

    Until now, failure cause assessments in forensic reports have usually been carried out by a deterministic approach. However, the forensic investigation may then lead to unreasonable results far from the real collapse scenario, because the deterministic approach does not systematically take into account information on the uncertainties involved in the failures of structures. A reliability-based failure cause assessment (reliability-based forensic engineering) methodology is developed which can incorporate the uncertainties involved in structural failures and structures, and is applied to a collapsed bridge in order to identify the most critical failure scenario and find the cause that triggered the collapse. Moreover, to save evaluation time and cost, an algorithm for automated event tree analysis (ETA) is proposed which automatically calculates the failure probabilities of the failure events and the occurrence probabilities of failure scenarios. For the reliability analysis, uncertainties are estimated more reasonably by using a Bayesian approach based on the experimental laboratory testing data in the forensic report. To demonstrate its applicability, the proposed approach is applied to the Hang-ju Grand Bridge, which collapsed during construction, and compared with the deterministic approach.

  12. Reliability of four experimental mechanical pain tests in children

    DEFF Research Database (Denmark)

    Søe, Ann-Britt Langager; Thomsen, Lise L; Tornoe, Birte

    2013-01-01

    In order to study pain in children, it is necessary to determine whether pain measurement tools used in adults are reliable measurements in children. The aim of this study was to explore the intrasession reliability of pressure pain thresholds (PPT) in healthy children. Furthermore, the aim was also to study the intersession reliability of the following four tests: (1) Total Tenderness Score; (2) PPT; (3) Visual Analog Scale score at suprapressure pain threshold; and (4) area under the curve (stimulus-response functions for pressure versus pain).

  13. Bayesian approach for the reliability assessment of corroded interdependent pipe networks

    International Nuclear Information System (INIS)

    Ait Mokhtar, El Hassene; Chateauneuf, Alaa; Laggoune, Radouane

    2016-01-01

    Pipelines under corrosion are subject to varying environmental conditions, and consequently it is difficult to build realistic corrosion models. In the present work, a Bayesian methodology is proposed to allow updating of the corrosion model parameters according to the evolution of environmental conditions. For reliability assessment of dependent structures, Bayesian networks are used to provide an interesting qualitative and quantitative description of the information in the system. The qualitative contribution lies in the modeling of a complex system, composed of dependent pipelines, as a Bayesian network. The quantitative one lies in the evaluation of the dependencies between pipelines through a new method for generating conditional probability tables. The effectiveness of Bayesian updating is illustrated through an application in which the new reliability of degraded (corroded) pipe networks is assessed. - Highlights: • A methodology for Bayesian network modeling of pipe networks is proposed. • A Bayesian approach based on the Metropolis-Hastings algorithm is conducted for corrosion model updating. • The reliability of a corroded pipe network is assessed by considering the interdependencies between the pipelines.
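
    The Bayesian updating step named in the highlights (Metropolis-Hastings over a corrosion model parameter) can be sketched in miniature. The prior, measurement model and inspection data below are all invented; a real application would use the paper's corrosion growth model and field data.

```python
import math
import random

random.seed(1)

# Invented wall-loss measurements (mm) from pipeline inspections.
obs = [0.42, 0.55, 0.47, 0.60, 0.52]
sigma_meas = 0.08                      # assumed known measurement noise

def log_post(r):
    """Unnormalised log-posterior for the corrosion depth parameter r:
    Normal(0.3, 0.15) prior times a Normal likelihood for each reading."""
    lp = -0.5 * ((r - 0.3) / 0.15) ** 2
    lp += sum(-0.5 * ((x - r) / sigma_meas) ** 2 for x in obs)
    return lp

# Random-walk Metropolis-Hastings, discarding a burn-in period.
r, samples = 0.3, []
for i in range(20000):
    prop = r + random.gauss(0, 0.05)
    if math.log(random.random()) < log_post(prop) - log_post(r):
        r = prop
    if i >= 5000:
        samples.append(r)

post_mean = sum(samples) / len(samples)
print(f"posterior mean corrosion depth ≈ {post_mean:.3f} mm")
```

The posterior mean is pulled from the prior guess (0.3 mm) towards the inspection data, which is exactly the updating behaviour the abstract describes.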

  14. USING A TOTAL QUALITY STRATEGY IN A NEW PRACTICAL APPROACH FOR IMPROVING THE PRODUCT RELIABILITY IN AUTOMOTIVE INDUSTRY

    Directory of Open Access Journals (Sweden)

    Cristiano Fragassa

    2014-09-01

    In this paper a Total Quality Management strategy is proposed, refined and used with the aim of improving the quality of large-mass industrial products far beyond the technical specifications demanded at the end-customer level. This approach combines standard and non-standard tools used for Reliability, Availability and Maintainability analysis. The procedure also establishes a closer correlation between theoretical evaluation methods and experimental evidence, as part of a modern integrated method for strengthening quality in design and process. A widely marketed commercial intake manifold is used as a test case for validating the methodology. As a general additional result, the research underlines the impact of Total Quality Management and its tools on the development of innovation.

  15. Development of SLIM-MAUD: a multi-attribute utility approach to human reliability evaluation

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1984-01-01

    This paper describes further work on the Success Likelihood Index Methodology (SLIM), a procedure for quantitatively evaluating human reliability in nuclear power plants and other systems. SLIM was originally developed by Human Reliability Associates during an earlier contract with Brookhaven National Laboratory (BNL). A further development of SLIM, SLIM-MAUD (Multi-Attribute Utility Decomposition) is also described. This is an extension of the original approach using an interactive, computer-based system. All of the work described in this report was supported by the Human Factors and Safeguards Branch of the US Nuclear Regulatory Commission

  16. Reliability and vulnerability analyses of critical infrastructures: Comparing two approaches in the context of power systems

    International Nuclear Information System (INIS)

    Johansson, Jonas; Hassel, Henrik; Zio, Enrico

    2013-01-01

    Society depends on services provided by critical infrastructures, and hence it is important that they are reliable and robust. Two main approaches for gaining the knowledge required to design and improve critical infrastructures are reliability analysis and vulnerability analysis. The former analyses the ability of the system to perform its intended function; the latter analyses its inability to withstand strains and the effects of the consequent failures. The two approaches have similarities but also some differences with respect to the type of information they generate about the system. In this view, the main purpose of this paper is to discuss and contrast these approaches. To strengthen the discussion and exemplify its findings, a Monte Carlo-based reliability analysis and a vulnerability analysis are applied to a relatively simple but representative system: the IEEE RTS96 electric power test system. The exemplification reveals that reliability analysis provides a good picture of the system's likely behaviour, but fails to capture a large portion of the high-consequence scenarios, which are instead captured in the vulnerability analysis. Although these scenarios might be estimated to have small probabilities of occurrence, they should be identified, considered and treated cautiously, as probabilistic analyses should not be the only input to decision-making for the design and protection of critical infrastructures. The general conclusion drawn from the example is that vulnerability analysis should be used to complement reliability studies, as well as other forms of probabilistic risk analysis. Measures should be sought for reducing both the vulnerability, i.e. improving the system's ability to withstand strains and stresses, and the reliability, i.e. improving its likely behaviour.
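
    The contrast drawn above can be illustrated on a toy network: a Monte Carlo reliability estimate of expected unserved demand versus a probability-blind enumeration of worst N-k scenarios. All capacities and outage probabilities are invented; this is not the IEEE RTS96 system.

```python
import itertools
import random

random.seed(7)

# Toy power network: a demand of 3 units served by three lines with
# (capacity, independent outage probability) -- all values invented.
lines = {"L1": (2, 0.02), "L2": (1, 0.05), "L3": (1, 0.10)}
DEMAND = 3

def served(up):
    """Demand served given the list of surviving lines."""
    return min(DEMAND, sum(lines[l][0] for l in up))

# Reliability view: Monte Carlo estimate of expected unserved demand.
N = 50_000
ens = 0.0
for _ in range(N):
    up = [l for l, (c, p) in lines.items() if random.random() > p]
    ens += DEMAND - served(up)
print(f"expected unserved demand ≈ {ens / N:.4f} units")

# Vulnerability view: consequence of every N-k removal, probability ignored.
for k in (1, 2):
    worst = max(itertools.combinations(lines, k),
                key=lambda c: DEMAND - served([l for l in lines if l not in c]))
    lost = DEMAND - served([l for l in lines if l not in worst])
    print(f"worst N-{k} scenario: remove {worst}, {lost} unit(s) unserved")
```

The Monte Carlo figure is small because high-consequence outages are rare, while the N-k enumeration surfaces those same scenarios regardless of their probability, which is the complementarity the paper argues for.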

  17. Reliability of four experimental mechanical pain tests in children

    Directory of Open Access Journals (Sweden)

    Soee AL

    2013-02-01

    Ann-Britt L Soee,1 Lise L Thomsen,2 Birte Tornoe,1,3 Liselotte Skov1 1Department of Pediatrics, Children’s Headache Clinic, Copenhagen University Hospital Herlev, Copenhagen, Denmark; 2Department of Neuropediatrics, Juliane Marie Centre, Copenhagen University Hospital Rigshospitalet, København Ø, Denmark; 3Department of Physiotherapy, Medical Department O, Copenhagen University Hospital Herlev, Herlev, Denmark. Purpose: In order to study pain in children, it is necessary to determine whether pain measurement tools used in adults are reliable measurements in children. The aim of this study was to explore the intrasession reliability of pressure pain thresholds (PPT) in healthy children. Furthermore, the aim was also to study the intersession reliability of the following four tests: (1) Total Tenderness Score; (2) PPT; (3) Visual Analog Scale score at suprapressure pain threshold; and (4) area under the curve (stimulus–response functions for pressure versus pain). Participants and methods: Twenty-five healthy school children, 8–14 years of age, participated. Test 2, PPT, was repeated three times at 2-minute intervals on the same day to estimate PPT intrasession reliability using Cronbach’s alpha. Tests 1–4 were repeated after a median of 21 (interquartile range 10.5–22) days, and Pearson’s correlation coefficient was used to describe the intersession reliability. Results: The PPT test was precise and reliable (Cronbach’s alpha ≥ 0.92). All tests showed a good to excellent correlation between days (intersession r = 0.66–0.81). No significant systematic differences were found in any of the four tests between days. Conclusion: All tests seemed to be reliable measurements for pain evaluation in healthy children aged 8–14 years. Given the small sample size, this conclusion needs to be confirmed in future studies. Keywords: repeatability, intraindividual reliability, pressure pain threshold, pain measurement, algometer
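
    The two statistics used in the study, Cronbach's alpha for the within-session repeats and Pearson's r between sessions, are straightforward to compute. A sketch with invented PPT readings (not the study's data):

```python
import statistics as st

# Hypothetical PPT readings (kPa) for 6 children, three repeats in one session.
trials = [
    [210, 215, 212], [180, 178, 185], [250, 248, 255],
    [195, 200, 198], [230, 228, 233], [170, 175, 172],
]

def cronbach_alpha(items_by_subject):
    """Cronbach's alpha: internal consistency of k repeated items."""
    k = len(items_by_subject[0])
    item_vars = [st.variance([row[j] for row in items_by_subject])
                 for j in range(k)]
    total_var = st.variance([sum(row) for row in items_by_subject])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    mx, my = st.mean(x), st.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x)
           * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

session1 = [st.mean(row) for row in trials]
session2 = [212, 184, 251, 202, 229, 174]  # same children weeks later (invented)
print(f"intrasession alpha = {cronbach_alpha(trials):.3f}")
print(f"intersession r     = {pearson_r(session1, session2):.3f}")
```

With consistent repeats, alpha approaches 1 (the study reported ≥ 0.92), and a high between-session r indicates good test-retest reliability.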

  18. Mechanical behaviour of the heel pad: experimental and numerical approach

    DEFF Research Database (Denmark)

    Matteoli, Sara; Fontanella, C. G.; Virga, A.

    The aim of the present work was to investigate the stress relaxation phenomena of the heel pad region under different loading conditions. A 31-year-old healthy female was enrolled in this study and her left foot underwent both MRI and experimental compression tests. Experimental results were compared with those obtained from finite element analysis performed on a numerical 3D subject-specific heel pad model built on the basis of MRI. The calcaneal fat pad tissue was described with a visco-hyperelastic model, while a fiber-reinforced hyperelastic model was formulated for the skin. The reliability…

  19. [Cognitive experimental approach to anxiety disorders].

    Science.gov (United States)

    Azaïs, F

    1995-01-01

    Cognitive psychology proposes a functional model to explain the mental organisation leading to emotional disorders. Among these disorders, the anxiety spectrum represents a domain in which this model seems valuable for an efficient and comprehensive approach to the pathology. A number of behavioral and cognitive psychotherapeutic methods relate to these cognitive references, but the theoretical concepts of cognitive "schemata" and cognitive "processes" evoked to describe mental functioning in anxiety need an experimental approach for a better rational understanding. Cognitive functions such as perception, attention and memory can be explored efficiently in this domain, allowing a more precise study of each stage of information processing. The cognitive model proposed in the psychopathology of anxiety suggests that anxious subjects are characterized by biases in the processing of emotionally valenced information. This hypothesis suggests functional interference in information processing in these subjects, leading to an anxious response to most stimuli. An experimental approach permits exploration of this hypothesis, using many tasks to test the different cognitive dysfunctions evoked in anxious cognitive organisation. Impairments revealed in anxiety disorders seem to result from specific biases in threat-related information processing, involving several stages of cognitive processing. Semantic interference, attentional bias, implicit memory bias and priming effects are the disorders most often observed in anxious pathology, such as simple phobia, generalised anxiety, panic disorder and post-traumatic stress disorder. These results suggest a top-down organisation of information processing in anxious subjects, who tend to detect, perceive and label many situations as threatening. The processes of reasoning and elaboration are consequently impaired in their adaptive function with respect to threat, leading to the anxious response observed in clinical practice.

  20. Alternative approach to automated management of load flow in engineering networks considering functional reliability

    Directory of Open Access Journals (Sweden)

    Ирина Александровна Гавриленко

    2016-02-01

    This article proposes an approach to automated management of load flow in engineering networks that takes functional reliability into account. Improvements to the concept of operational and strategic management of load flow in engineering networks are considered. The verbal statement of the problem for the thesis research is defined: the development of information technology for exact calculation of the functional reliability of the network, that is, the risk of short delivery of the purpose-oriented product to consumers.

  1. Reliability improvement in GaN HEMT power device using a field plate approach

    Science.gov (United States)

    Wu, Wen-Hao; Lin, Yueh-Chin; Chin, Ping-Chieh; Hsu, Chia-Chieh; Lee, Jin-Hwa; Liu, Shih-Chien; Maa, Jer-shen; Iwai, Hiroshi; Chang, Edward Yi; Hsu, Heng-Tung

    2017-07-01

    This study investigates the effect of implementing a field plate on a GaN high-electron-mobility transistor (HEMT) to improve power device reliability. The results indicate that the field plate structure reduces the peak electric field and the interface traps in the device, resulting in higher breakdown voltage, lower leakage current, smaller current collapse, and better threshold voltage control. Furthermore, after high-voltage stress, the device with the field plate showed steady dynamic on-resistance and reduced gate-capacitance degradation. This demonstrates that GaN device reliability can be improved by using the field plate approach.

  2. Beyond reliability, multi-state failure analysis of satellite subsystems: A statistical approach

    International Nuclear Information System (INIS)

    Castet, Jean-Francois; Saleh, Joseph H.

    2010-01-01

    Reliability is widely recognized as a critical design attribute for space systems. In recent articles, we conducted nonparametric analyses and Weibull fits of satellite and satellite-subsystem reliability for 1584 Earth-orbiting satellites launched between January 1990 and October 2008. In this paper, we extend our investigation of failures of satellites and satellite subsystems beyond the binary concept of reliability to an analysis of their anomalies and multi-state failures. In reliability analysis, the system or subsystem under study is considered to be either in an operational or a failed state; multi-state failure analysis introduces 'degraded states', or partial failures, and thus provides more insight, through finer resolution, into the degradation behavior of an item and its progression toward complete failure. The database used for the statistical analysis in the present work identifies five states for each satellite subsystem: three degraded states, one fully operational state, and one failed state (complete failure). Because our dataset is right-censored, we calculate the nonparametric probability of transitioning between states for each satellite subsystem with the Kaplan-Meier estimator, and we derive confidence intervals for each probability of transitioning between states. We then conduct parametric Weibull fits of these probabilities using the Maximum Likelihood Estimation (MLE) approach. After validating the results, we compare the reliability versus multi-state failure analyses of three satellite subsystems: the thruster/fuel; the telemetry, tracking, and control (TTC); and the gyro/sensor/reaction wheel subsystems. The results are particularly revealing of the insights that can be gleaned from multi-state failure analysis and the deficiencies, or blind spots, of traditional reliability analysis.
In addition to the specific results provided here, which should prove particularly useful to the space industry, this work highlights the importance
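
The nonparametric step described above, estimating survival (or state-transition) probabilities from right-censored lifetimes, can be sketched with a minimal Kaplan-Meier estimator. The sample times and censoring flags below are illustrative, not from the cited satellite database:

```python
def kaplan_meier(times, observed):
    """Kaplan-Meier survival estimate for right-censored data.

    times    -- event or censoring times
    observed -- True if the failure/transition was observed, False if censored
    """
    data = sorted(zip(times, observed))
    n_at_risk = len(data)
    survival = 1.0
    curve = []  # (time, S(t)) at each observed event time
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, obs in data if tt == t and obs)
        removed = sum(1 for tt, obs in data if tt == t)
        if deaths:
            survival *= 1.0 - deaths / n_at_risk
            curve.append((t, survival))
        n_at_risk -= removed
        i += removed
    return curve

# Illustrative right-censored sample: times with observed/censored flags.
times = [2, 3, 3, 5, 8, 8, 12, 14]
flags = [True, True, False, True, False, True, True, False]
for t, s in kaplan_meier(times, flags):
    print(f"t={t:>2}  S(t)={s:.3f}")
```

Censored items leave the risk set without triggering a survival-curve step, which is exactly how right-censoring enters the nonparametric estimate.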

  3. An approach for assessing ALWR passive safety system reliability

    International Nuclear Information System (INIS)

    Hake, T.M.

    1991-01-01

    Many advanced light water reactor designs incorporate passive rather than active safety features for front-line accident response. A method for evaluating the reliability of these passive systems in the context of probabilistic risk assessment has been developed at Sandia National Laboratories. This method addresses both the component (e.g. valve) failure aspect of passive system failure, and uncertainties in system success criteria arising from uncertainties in the system's underlying physical processes. These processes provide the system's driving force; examples are natural circulation and gravity-induced injection. This paper describes the method, and provides some preliminary results of application of the approach to the Westinghouse AP600 design

  4. Approaches to determining the reliability of a multimodal three-dimensional dynamic signature

    Directory of Open Access Journals (Sweden)

    Yury E. Kozlov

    2018-03-01

    Full Text Available The market for modern mobile applications places increasingly strict requirements on authentication system reliability. This article examines an authentication method using a multimodal three-dimensional dynamic signature (MTDS), which can be used both as a primary and as an additional method of user authentication in mobile applications. It is based on using a gesture performed in the air, captured by two independent mobile devices, as an identifier. The MTDS method has certain advantages over currently used biometric methods, including fingerprint authentication, face recognition and voice recognition. A multimodal three-dimensional dynamic signature allows quickly changing the authentication gesture, as well as concealing the authentication procedure by using gestures that do not attract attention. Despite all its advantages, the MTDS method has certain limitations, the main one being the functionally dynamic complex (FDC) of skills required to repeat an authentication gesture accurately. Creating a correct MTDS requires a system for assessing the reliability of gestures. Approaches to this task are grouped in this article according to their methods of implementation. Two of the approaches can be implemented only with a server acting as a centralized MTDS processing center, and one approach can be implemented using the smartphone's own computing resources. The final part of the article presents data from testing one of these methods on a template performing MTDS authentication.

  5. Reliability Engineering

    CERN Document Server

    Lazzaroni, Massimo

    2012-01-01

    This book gives a practical guide for designers and users in the Information and Communication Technology context. In the first section, definitions of the fundamental terms are given according to the international standards. Some theoretical concepts and reliability models are then presented in Chapters 2 and 3; the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing laboratory tests, highlights the reliability concept from the experimental point of view. In the ICT context, the failure rate for a given system can be

  6. A Squeezed Artificial Neural Network for the Symbolic Network Reliability Functions of Binary-State Networks.

    Science.gov (United States)

    Yeh, Wei-Chang

    Network reliability is an important index to the provision of useful information for decision support in the modern world. There is always a need to calculate symbolic network reliability functions (SNRFs) due to dynamic and rapid changes in network parameters. In this brief, the proposed squeezed artificial neural network (SqANN) approach uses Monte Carlo simulation to estimate the corresponding reliability of a given designed matrix from the Box-Behnken design, and then the Taguchi method is implemented to find the appropriate number of neurons and activation functions of the hidden layer and the output layer in the ANN to evaluate SNRFs. According to the experimental results on the benchmark networks, the comparison supports the superiority of the proposed SqANN method over the traditional ANN-based approach, with at least a 16.6% improvement in the median absolute deviation at the cost of an extra 2 s on average across all experiments.
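
The Monte Carlo estimation step that produces the SqANN training targets can be sketched for a binary-state network: sample each edge's up/down state independently, then test terminal connectivity. The bridge network below is a standard textbook example, not one of the paper's benchmarks:

```python
import random

def connected(n_nodes, up_edges, s, t):
    """Is t reachable from s using only the working (up) edges?"""
    adj = {i: [] for i in range(n_nodes)}
    for u, v in up_edges:
        adj[u].append(v)
        adj[v].append(u)
    seen, stack = {s}, [s]
    while stack:
        u = stack.pop()
        if u == t:
            return True
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return False

def mc_reliability(n_nodes, edges, probs, s, t, trials=20_000, seed=1):
    """Monte Carlo estimate of s-t reliability; edge e works w.p. probs[e]."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        up = [e for e, p in zip(edges, probs) if rng.random() < p]
        ok += connected(n_nodes, up, s, t)
    return ok / trials

# Classic 4-node bridge network; every edge 0.9 reliable.
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3)]
est = mc_reliability(4, edges, [0.9] * 5, s=0, t=3)
print(round(est, 3))  # exact value is 0.97848 for the bridge at p = 0.9
```

Repeating this for many parameter combinations (the Box-Behnken design points) yields the reliability samples on which a surrogate network can be trained.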

  7. Semiconductor laser engineering, reliability and diagnostics a practical approach to high power and single mode devices

    CERN Document Server

    Epperlein, Peter W

    2013-01-01

    This reference book provides a fully integrated novel approach to the development of high-power, single-transverse mode, edge-emitting diode lasers by addressing the complementary topics of device engineering, reliability engineering and device diagnostics in the same book, and thus closes the gap in the current book literature. Diode laser fundamentals are discussed, followed by an elaborate discussion of problem-oriented design guidelines and techniques, and by a systematic treatment of the origins of laser degradation and a thorough exploration of the engineering means to enhance the optical strength of the laser. Stability criteria of critical laser characteristics and key laser robustness factors are discussed along with clear design considerations in the context of reliability engineering approaches and models, and typical programs for reliability tests and laser product qualifications. Novel, advanced diagnostic methods are reviewed to discuss, for the first time in detail in book literature, performa...

  8. A reliability-based approach of fastest routes planning in dynamic traffic network under emergency management situation

    Directory of Open Access Journals (Sweden)

    Ye Sun

    2011-12-01

    Full Text Available In order to establish an effective emergency management system, it is important to conduct evacuations with reliable, real-time optimal route plans. This paper aims at creating a route-finding strategy that considers time-dependent factors as well as the uncertainties that may be encountered during emergency management. To combine dynamic features with a specified level of reliability in fastest-route planning, the speed distributions of typical intercity roads are studied in depth, and a strategy of modifying the real-time speed to a more reliable value based on the speed distribution is proposed. Two route-planning algorithms have been developed to find three optimal routes with the shortest travel time at a reliability level of 0.9. To validate the new strategy, an experimental implementation of the route-planning method was conducted using road speed information acquired in a field study. The results show that the proposed strategy can provide more reliable routes in dynamic traffic networks by conservatively treating roads with large speed dispersion or with extreme real-time speed values.
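
A minimal sketch of the idea, assuming the "more reliable value" is a conservative (low) quantile of each road's observed speed distribution, which then feeds an ordinary shortest-travel-time search; the road layout and speed samples are hypothetical:

```python
import heapq

def reliable_speed(samples, q=0.1):
    """Conservative speed estimate: the q-quantile of observed speeds,
    so the planned travel time is met with probability ~(1 - q)."""
    s = sorted(samples)
    return s[int(q * (len(s) - 1))]

def fastest_route(graph, source, target):
    """Plain Dijkstra on travel times; graph[u] = [(v, hours), ...]."""
    dist, prev = {source: 0.0}, {}
    pq = [(0.0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == target:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path = [target]
    while path[-1] != source:
        path.append(prev[path[-1]])
    return path[::-1], dist[target]

# Hypothetical 10 km roads A->C->B and A->D->B; km/h speed samples.
erratic = [30, 90, 95, 100, 100]   # fast on average, large dispersion
steady = [60, 62, 64, 65, 66]
graph = {
    "A": [("C", 10 / reliable_speed(erratic)),
          ("D", 10 / reliable_speed(steady))],
    "C": [("B", 0.05)],
    "D": [("B", 0.05)],
}
print(fastest_route(graph, "A", "B"))  # prefers the steady road via D
```

The erratic road has the higher mean speed, but its low quantile is so poor that the planner conservatively routes around it, which is the behavior the abstract describes.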

  9. Design for Six Sigma: Approach for reliability and low-cost manufacturing.

    Directory of Open Access Journals (Sweden)

    Jesus Gerardo Cruz Alvarez

    2012-07-01

    Full Text Available The aim of this study is to discuss new product development based on a traditional stage-gate process and to examine how new product development [NPD] tools, such as lean design for Six Sigma, can accelerate the achievement of the main goals of NPD: reliable product quality, cost-effective implementation, and desired time-to-market. These new tools must be incorporated into a new approach to NPD based on the Advanced Product and Quality Planning methodology.

  10. Extending Failure Modes and Effects Analysis Approach for Reliability Analysis at the Software Architecture Design Level

    NARCIS (Netherlands)

    Sözer, Hasan; Tekinerdogan, B.; Aksit, Mehmet; de Lemos, Rogerio; Gacek, Cristina

    2007-01-01

    Several reliability engineering approaches have been proposed to identify and recover from failures. A well-known and mature approach is the Failure Mode and Effect Analysis (FMEA) method that is usually utilized together with Fault Tree Analysis (FTA) to analyze and diagnose the causes of failures.

  11. Stability of nanofluids: Molecular dynamic approach and experimental study

    International Nuclear Information System (INIS)

    Farzaneh, H.; Behzadmehr, A.; Yaghoubi, M.; Samimi, A.; Sarvari, S.M.H.

    2016-01-01

    Highlights: • Nanofluid stability is investigated and discussed. • A molecular dynamic approach, considering the different forces on the nanoparticles, is adopted. • Stability diagrams are presented for different thermo-fluid conditions. • An experimental investigation is carried out to confirm the theoretical approach. - Abstract: Nanofluids have been proposed by various researchers as volumetric absorbers in solar energy conversion devices or as working fluids in different heat exchangers. However, the dispersion stability of nanofluids is an important issue that must be well addressed before any industrial application. Conditions such as a severe temperature gradient, a high heat transfer fluid temperature, the nanoparticle mean diameter, and the types of nanoparticle and base fluid are among the parameters with the strongest effect on nanofluid stability. A molecular dynamic approach, considering the kinetic energy of the nanoparticles and the DLVO potential energy between nanoparticles, is adopted to study the stability of different nanofluids under different working conditions. Different forces, such as Brownian, thermophoresis, drag, and DLVO, are considered to construct stability diagrams, which present the conditions under which a nanofluid can be stable. In addition, an experimental investigation is carried out to find a stable nanofluid and to show the validity of the theoretical approach. The good agreement between the experimental and theoretical results confirms the validity of the theoretical approach.

  12. Experimental Research of Reliability of Plant Stress State Detection by Laser-Induced Fluorescence Method

    Directory of Open Access Journals (Sweden)

    Yury Fedotov

    2016-01-01

    Full Text Available Experimental laboratory investigations of the laser-induced fluorescence spectra of watercress and lawn grass were conducted. The fluorescence spectra were excited by an Nd:YAG laser emitting at 532 nm. It was established that the influence of stress caused by mechanical damage, overwatering, and soil pollution is manifested in changes of the spectral shapes. The mean values and confidence intervals for the ratio of the two fluorescence maxima near 685 and 740 nm were estimated. The results indicate that this fluorescence ratio can be considered a reliable characteristic of plant stress state.
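
The reported statistic, the ratio of the fluorescence maxima near 685 and 740 nm with its mean and confidence interval, is straightforward to compute from paired peak readings; the intensity values below are made up for illustration:

```python
import math
import statistics

def fluorescence_ratio_stats(f685, f740, z=1.96):
    """Mean and ~95% confidence interval of the F685/F740 ratio,
    computed per measurement over paired peak intensities."""
    ratios = [a / b for a, b in zip(f685, f740)]
    m = statistics.mean(ratios)
    half = z * statistics.stdev(ratios) / math.sqrt(len(ratios))
    return m, (m - half, m + half)

# Illustrative paired peak intensities (arbitrary units)
f685 = [1.30, 1.25, 1.40, 1.35, 1.28]
f740 = [1.00, 0.98, 1.05, 1.02, 1.01]
mean_ratio, ci = fluorescence_ratio_stats(f685, f740)
print(f"F685/F740 = {mean_ratio:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
```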

  13. The possibilities of applying a risk-oriented approach to the NPP reliability and safety enhancement problem

    Science.gov (United States)

    Komarov, Yu. A.

    2014-10-01

    An analysis and some generalizations of approaches to risk assessment are presented. The interconnection between different interpretations of the notion of "risk" is shown, and the possibility of applying fuzzy set theory to risk assessment is demonstrated. A generalized formulation of the risk assessment notion is proposed for applying risk-oriented approaches to the problem of enhancing reliability and safety in nuclear power engineering. The solution of problems using the developed risk-oriented approaches, aimed at achieving more reliable and safe operation of NPPs, is described. The results of studies aimed at determining the need (advisability) to modernize or replace NPP elements and systems are presented, together with the results obtained from elaborating the methodical principles of introducing a repair concept based on equipment technical state. The possibility of reducing the scope of tests and altering the maintenance strategy for NPP systems is substantiated using the risk-oriented approach. A probabilistic model for estimating the validity of boric acid concentration measurements is developed.

  14. A non-invasive experimental approach for surface temperature measurements on semi-crystalline thermoplastics

    Science.gov (United States)

    Boztepe, Sinan; Gilblas, Remi; de Almeida, Olivier; Le Maoult, Yannick; Schmidt, Fabrice

    2017-10-01

    Most thermoforming processes for thermoplastic polymers and their composites combine heating and forming stages, in which a precursor is heated prior to forming in order to improve formability by softening the thermoplastic polymer. Because of the low thermal conductivity and semi-transparency of polymers, infrared (IR) heating is widely used for thermoforming such materials. Predictive radiative heat transfer models for temperature distributions are therefore critical for optimizing the thermoforming process. One key challenge is to build a predictive model that includes the physical background of the radiative heat transfer phenomenon in semi-crystalline thermoplastics, as their microcrystalline structure constitutes an optically heterogeneous medium. In addition, the accuracy of a predictive model must be validated experimentally; IR thermography is well suited to such validation, as it provides a non-invasive, full-field surface temperature measurement. Although IR cameras provide a non-invasive measurement, obtaining a reliable measurement depends on the optical characteristics of the heated material and the operating spectral band of the IR camera: the surface of the material should behave as opaque in some spectral band, and the IR camera should operate in that band. In this study, the optical characteristics of the PO-based polymer are discussed, and an experimental approach is proposed for measuring the surface temperature of the PO-based polymer via IR thermography. Preliminary analyses showed that IR thermographic measurements cannot simply be performed on PO-based polymers and require a correction method, as their semi-transparent medium makes reliable surface temperature measurements challenging.

  15. Power electronics reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
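
The fault-tree approach mentioned here, deriving system reliability from component reliabilities, reduces to AND/OR gate algebra when failures are independent. The converter structure and failure probabilities below are hypothetical:

```python
def gate_or(failure_probs):
    """OR gate: the top event occurs if ANY independent input event occurs."""
    p_none = 1.0
    for q in failure_probs:
        p_none *= 1.0 - q
    return 1.0 - p_none

def gate_and(failure_probs):
    """AND gate: the top event occurs only if ALL independent input events occur."""
    p_all = 1.0
    for q in failure_probs:
        p_all *= q
    return p_all

# Hypothetical converter: the system fails if the controller fails OR
# both redundant power stages fail (annual failure probabilities).
controller = 0.01
stage_a = stage_b = 0.05
system_failure = gate_or([controller, gate_and([stage_a, stage_b])])
print(f"system reliability = {1.0 - system_failure:.6f}")
```

Nesting these two gates is enough to express any series/parallel structure; field maintenance data would supply the leaf probabilities in the report's first approach.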

  16. An efficient particle swarm approach for mixed-integer programming in reliability-redundancy optimization applications

    International Nuclear Information System (INIS)

    Santos Coelho, Leandro dos

    2009-01-01

    Reliability-redundancy optimization problems can involve the selection of components with multiple choices and redundancy levels that produce maximum benefits, subject to cost, weight, and volume constraints. Many classical mathematical methods have failed to handle the nonconvexities and nonsmoothness of reliability-redundancy optimization problems. As an alternative to the classical optimization approaches, meta-heuristics have received much attention from researchers due to their ability to find almost globally optimal solutions. One of these meta-heuristics is particle swarm optimization (PSO), a population-based heuristic optimization technique inspired by the social behavior of bird flocking and fish schooling. This paper presents an efficient PSO algorithm based on a Gaussian distribution and a chaotic sequence (PSO-GC) to solve reliability-redundancy optimization problems. In this context, two reliability-redundancy design examples are evaluated. Simulation results demonstrate that the proposed PSO-GC is a promising optimization technique that performs well on the two mixed-integer programming examples in reliability-redundancy applications considered in this paper. The solutions obtained by PSO-GC are better than the best previously known solutions in the recent literature
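
A bare-bones PSO for a reliability-redundancy problem can be sketched as follows. This is a generic global-best PSO with rounding to integer redundancy levels and a penalty for the cost constraint, not the paper's Gaussian/chaotic PSO-GC variant; the component reliabilities, costs, and budget are made up:

```python
import random

def system_reliability(n, r=(0.7, 0.8, 0.9)):
    """Series system of three subsystems, each with n[i] parallel units."""
    R = 1.0
    for ni, ri in zip(n, r):
        R *= 1.0 - (1.0 - ri) ** ni
    return R

def fitness(x, cost=(2.0, 3.0, 4.0), budget=20.0):
    """Round the continuous particle to redundancy levels in 1..5 and
    penalize budget violations."""
    n = [max(1, min(5, round(xi))) for xi in x]
    if sum(ci * ni for ci, ni in zip(cost, n)) > budget:
        return -1.0  # infeasible
    return system_reliability(n)

def pso(dim=3, particles=30, iters=200, seed=3):
    rng = random.Random(seed)
    X = [[rng.uniform(1, 5) for _ in range(dim)] for _ in range(particles)]
    V = [[0.0] * dim for _ in range(particles)]
    P = [x[:] for x in X]                       # personal bests
    pf = [fitness(x) for x in X]
    g = max(range(particles), key=lambda i: pf[i])
    G, gf = P[g][:], pf[g]                      # global best
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                V[i][d] = (0.7 * V[i][d]
                           + 1.5 * rng.random() * (P[i][d] - X[i][d])
                           + 1.5 * rng.random() * (G[d] - X[i][d]))
                X[i][d] += V[i][d]
            f = fitness(X[i])
            if f > pf[i]:
                P[i], pf[i] = X[i][:], f
                if f > gf:
                    G, gf = X[i][:], f
    return [max(1, min(5, round(xi))) for xi in G], gf

n_opt, best_R = pso()
print(n_opt, round(best_R, 4))
```

Rounding inside the fitness function is one simple way to handle the mixed-integer aspect; the penalty keeps the swarm inside the cost budget.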

  17. An efficient particle swarm approach for mixed-integer programming in reliability-redundancy optimization applications

    Energy Technology Data Exchange (ETDEWEB)

    Santos Coelho, Leandro dos [Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Pontifical Catholic University of Parana, PUCPR, Imaculada Conceicao, 1155, 80215-901 Curitiba, Parana (Brazil)], E-mail: leandro.coelho@pucpr.br

    2009-04-15

    Reliability-redundancy optimization problems can involve the selection of components with multiple choices and redundancy levels that produce maximum benefits, subject to cost, weight, and volume constraints. Many classical mathematical methods have failed to handle the nonconvexities and nonsmoothness of reliability-redundancy optimization problems. As an alternative to the classical optimization approaches, meta-heuristics have received much attention from researchers due to their ability to find almost globally optimal solutions. One of these meta-heuristics is particle swarm optimization (PSO), a population-based heuristic optimization technique inspired by the social behavior of bird flocking and fish schooling. This paper presents an efficient PSO algorithm based on a Gaussian distribution and a chaotic sequence (PSO-GC) to solve reliability-redundancy optimization problems. In this context, two reliability-redundancy design examples are evaluated. Simulation results demonstrate that the proposed PSO-GC is a promising optimization technique that performs well on the two mixed-integer programming examples in reliability-redundancy applications considered in this paper. The solutions obtained by PSO-GC are better than the best previously known solutions in the recent literature.

  18. Experimental and Modeling Approaches for Understanding the Effect of Gene Expression Noise in Biological Development

    Directory of Open Access Journals (Sweden)

    David M. Holloway

    2018-04-01

    Full Text Available Biological development involves numerous chemical and physical processes which must act in concert to reliably produce a cell, a tissue, or a body. To be successful, the developing organism must be robust to variability at many levels, such as the environment (e.g., temperature, moisture), upstream information (such as long-range positional information gradients), or intrinsic noise due to the stochastic nature of low-concentration chemical kinetics. The latter is especially relevant to the regulation of gene expression in cell differentiation. The temporal stochasticity of gene expression has been studied in single-celled organisms for nearly two decades, but only recently have techniques become available to gather temporally resolved data across spatially distributed gene expression patterns in developing multicellular organisms. These demonstrate noisy temporal “bursting” in the number of gene transcripts per cell, raising the question of how the transcript number defining a particular cell type is produced, such that one cell type can reliably be distinguished from a neighboring cell of a different type along a tissue boundary. Stochastic spatio-temporal modeling of tissue-wide expression patterns can identify signatures of specific types of gene regulation, which can be used to extract regulatory-mechanism information from experimental time series. This Perspective focuses on using this type of approach to study gene expression noise during the anterior-posterior segmentation of the fruit fly embryo. Advances in experimental and theoretical techniques will lead to an increasing quantification of expression noise that can be used to understand how regulatory mechanisms contribute to embryonic robustness across a range of developmental processes.

  19. Wind turbine reliability : a database and analysis approach.

    Energy Technology Data Exchange (ETDEWEB)

    Linsday, James (ARES Corporation); Briand, Daniel; Hill, Roger Ray; Stinebaugh, Jennifer A.; Benjamin, Allan S. (ARES Corporation)

    2008-02-01

    The US wind industry has experienced remarkable growth since the turn of the century. At the same time, the physical size and electrical generation capabilities of wind turbines have also grown remarkably. As the market continues to expand, and as wind generation gains a significant share of the generation portfolio, the reliability of wind turbine technology becomes increasingly important. This report addresses how operations and maintenance costs are related to unreliability, that is, the failures experienced by systems and components. Reliability tools are demonstrated, the data needed to understand and catalog failure events are described, and practical wind turbine reliability models are illustrated, including preliminary results. This report also presents a continuing process for managing industry requirements, needs, and expectations related to Reliability, Availability, Maintainability, and Safety. A simply stated goal of this process is to better understand and improve the operable reliability of wind turbine installations.

  20. Managing Cybersecurity Research and Experimental Development: The REVO Approach

    OpenAIRE

    Dan Craigen; Drew Vandeth; D’Arcy Walsh

    2013-01-01

    We present a systematic approach for managing a research and experimental development cybersecurity program that must be responsive to continuously evolving cybersecurity and other operational concerns. The approach will be of interest to research-program managers, academe, corporate leads, government leads, chief information officers, chief technology officers, and social and technology policy analysts. The approach is compatible with international standards and procedures published by the...

  1. Reliability of neural encoding

    DEFF Research Database (Denmark)

    Alstrøm, Preben; Beierholm, Ulrik; Nielsen, Carsten Dahl

    2002-01-01

    The reliability with which a neuron is able to create the same firing pattern when presented with the same stimulus is of critical importance to the understanding of neuronal information processing. We show that reliability is closely related to the process of phaselocking. Experimental results f...

  2. Experimental approaches for studying non-equilibrium atmospheric plasma jets

    Energy Technology Data Exchange (ETDEWEB)

    Shashurin, A., E-mail: ashashur@purdue.edu [School of Aeronautics & Astronautics, Purdue University, West Lafayette, Indiana 47907 (United States); Keidar, M. [Department of Mechanical and Aerospace Engineering, The George Washington University, Washington, District of Columbia 20052 (United States)

    2015-12-15

    This work reviews recent research efforts in the area of non-equilibrium atmospheric plasma jets, with special focus on experimental approaches. The physics of small non-equilibrium atmospheric plasma jets operating in the kHz frequency range at powers of a few watts is analyzed, including the mechanism of breakdown, the process of ionization-front propagation, electrical coupling of the ionization front with the discharge electrodes, distributions of excited and ionized species, discharge current spreading, transient dynamics of various plasma parameters, etc. Experimental diagnostic approaches utilized in the field are considered, including Rayleigh microwave scattering, Thomson laser scattering, electrostatic streamer scatterers, optical emission spectroscopy, fast photography, etc.

  3. Toward a Cooperative Experimental System Development Approach

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Kyng, Morten; Mogensen, Preben Holst

    1997-01-01

    This chapter represents a step towards the establishment of a new system development approach, called Cooperative Experimental System Development (CESD). CESD seeks to overcome a number of limitations in existing approaches: specification-oriented methods usually assume that system design can be based solely on observation and detached reflection; prototyping methods often have a narrow focus on the technical construction of various kinds of prototypes; Participatory Design techniques—including the Scandinavian Cooperative Design (CD) approaches—seldom go beyond the early analysis/design activities of development projects. In contrast, the CESD approach is characterized by its focus on: active user involvement throughout the entire development process; prototyping experiments closely coupled to work situations and use scenarios; transforming results from early cooperative analysis

  4. Proposed reliability cost model

    Science.gov (United States)

    Delionback, L. M.

    1973-01-01

    The research investigations involved in the study include cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements demand understanding and communication between the technical disciplines on one hand and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided, as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to establishing a proposed reliability cost-model format. The model format/approach depends on the use of a series of subsystem-oriented CER's and, where possible, CTR's in devising a suitable cost-effective policy.

  5. New approaches for the reliable in vitro assessment of binding affinity based on high-resolution real-time data acquisition of radioligand-receptor binding kinetics.

    Science.gov (United States)

    Zeilinger, Markus; Pichler, Florian; Nics, Lukas; Wadsak, Wolfgang; Spreitzer, Helmut; Hacker, Marcus; Mitterhauser, Markus

    2017-12-01

    Resolving the kinetic mechanisms of biomolecular interactions has become increasingly important in early-phase drug development. Since traditional in vitro methods are dose-dependent assessments, binding kinetics is usually overlooked. The present study aimed to establish two novel experimental approaches for assessing the binding affinity of both radiolabelled and non-labelled compounds targeting the A3R, based on high-resolution real-time data acquisition of radioligand-receptor binding kinetics. A novel time-resolved competition assay was developed and applied to determine the Ki of eight different A3R antagonists, using CHO-K1 cells stably expressing the hA3R. In addition, a new kinetic real-time cell-binding approach was established to quantify the rate constants kon and koff, as well as the corresponding Kd, of the A3R agonist [125I]-AB-MECA. Furthermore, lipophilicity measurements were conducted to control for influences of the physicochemical properties of the compounds used. Two novel real-time cell-binding approaches were successfully developed and established. Both experimental procedures were found to visualize the kinetic binding characteristics with high spatial and temporal resolution, resulting in reliable affinity values in good agreement with values previously reported using traditional methods. Taking into account the lipophilicity of the A3R antagonists, no influence on experimental performance or the resulting affinities was observed. Both kinetic binding approaches comprise tracer administration and subsequent binding to living cells expressing the target protein. The experiments therefore better resemble true in vivo physiological conditions and provide important markers of cellular feedback and biological response.
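
The equilibrium constants discussed here follow directly from the kinetic rate constants, and a competitor's Ki can be obtained from a competition assay via the standard Cheng-Prusoff correction. The numerical values below are illustrative, not from the cited study:

```python
def kd_from_kinetics(k_on, k_off):
    """Equilibrium dissociation constant from rate constants: Kd = koff / kon."""
    return k_off / k_on

def ki_cheng_prusoff(ic50, radioligand_conc, kd_radioligand):
    """Competitor affinity from a competition assay (Cheng-Prusoff):
    Ki = IC50 / (1 + [L] / Kd)."""
    return ic50 / (1.0 + radioligand_conc / kd_radioligand)

# Illustrative numbers: kon in M^-1 s^-1, koff in s^-1, concentrations in M.
kd = kd_from_kinetics(k_on=1.0e6, k_off=2.0e-3)  # -> 2 nM
ki = ki_cheng_prusoff(ic50=50e-9, radioligand_conc=4e-9, kd_radioligand=kd)
print(f"Kd = {kd:.2e} M, Ki = {ki:.2e} M")
```

Time-resolved assays such as those in the abstract estimate kon and koff directly, from which Kd follows without needing an equilibrium titration.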

  6. Microelectronics Reliability

    Science.gov (United States)

    2017-01-17

…inverters connected in a chain… Figure 3: typical graph showing frequency versus square root of… developing an experimental reliability-estimating methodology that could both illuminate the lifetime reliability of advanced devices, circuits and… or FIT of the device. In other words, an accurate estimate of the device lifetime was found, and thus the reliability can be conveniently…

  7. Reliability assessment of serviceability performance of braced retaining walls using a neural network approach

    Science.gov (United States)

    Goh, A. T. C.; Kulhawy, F. H.

    2005-05-01

    In urban environments, one major concern with deep excavations in soft clay is the potentially large ground deformations in and around the excavation. Excessive movements can damage adjacent buildings and utilities. There are many uncertainties associated with the calculation of the ultimate or serviceability performance of a braced excavation system. These include the variabilities of the loadings, geotechnical soil properties, and engineering and geometrical properties of the wall. A risk-based approach to serviceability performance failure is necessary to incorporate systematically the uncertainties associated with the various design parameters. This paper demonstrates the use of an integrated neural network-reliability method to assess the risk of serviceability failure through the calculation of the reliability index. By first performing a series of parametric studies using the finite element method and then approximating the non-linear limit state surface (the boundary separating the safe and failure domains) through a neural network model, the reliability index can be determined with the aid of a spreadsheet. Two illustrative examples are presented to show how the serviceability performance for braced excavation problems can be assessed using the reliability index.
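The integrated scheme this record describes (finite-element runs, a neural-network approximation of the limit state surface, then a reliability index) can be sketched as follows. Everything here is hypothetical: a toy closed-form deflection model stands in for the finite element analyses, a tiny one-hidden-layer network plays the role of the paper's neural network, and plain Monte Carlo on the surrogate replaces the spreadsheet search.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

# Hypothetical serviceability limit state: g > 0 means the computed wall
# deflection stays under a 50 mm limit. x1 = soil stiffness ratio,
# x2 = load factor. A closed form stands in for the paper's FEM runs.
def g_true(x):
    return 50.0 - 30.0 * x[:, 1]**2 / x[:, 0]

# Stage 1: fit a one-hidden-layer network to "FEM" samples of g.
X = rng.uniform([0.5, 0.5], [2.0, 1.5], size=(200, 2))
y = g_true(X)
ymu, ysd = y.mean(), y.std()
yn = (y - ymu) / ysd                       # standardize training targets
W1 = rng.normal(0, 1, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.1, (16, 1)); b2 = np.zeros(1)
lr = 0.05
for _ in range(4000):                      # plain full-batch gradient descent
    H = np.tanh(X @ W1 + b1)
    err = (H @ W2 + b2).ravel() - yn
    dW2 = H.T @ err[:, None] / len(yn); db2 = err.mean(keepdims=True)
    dH = err[:, None] @ W2.T * (1 - H**2)
    dW1 = X.T @ dH / len(yn); db1 = dH.mean(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1; W2 -= lr * dW2; b2 -= lr * db2

def g_surrogate(x):
    return (np.tanh(x @ W1 + b1) @ W2 + b2).ravel() * ysd + ymu

# Stage 2: cheap Monte Carlo on the surrogate gives the probability of
# serviceability failure and the corresponding reliability index.
Xs = np.column_stack([rng.normal(1.2, 0.2, 200_000),
                      rng.normal(1.0, 0.15, 200_000)])
pf = float(np.mean(g_surrogate(Xs) < 0.0))

def Phi(z):                                # standard normal CDF
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

lo, hi = -8.0, 8.0                         # bisect for Phi^{-1}(pf)
for _ in range(80):
    mid = 0.5 * (lo + hi)
    if Phi(mid) < pf:
        lo = mid
    else:
        hi = mid
beta = -0.5 * (lo + hi)                    # reliability index
```

The reliability index relates to the failure probability through pf = Φ(−β), which is what the bisection inverts; once the network is trained, each additional reliability evaluation costs only a forward pass rather than a finite-element run.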

  8. Approach for an integral power transformer reliability model

    NARCIS (Netherlands)

    Schijndel, van A.; Wouters, P.A.A.F.; Steennis, E.F.; Wetzer, J.M.

    2012-01-01

    In electrical power transmission and distribution networks power transformers represent a crucial group of assets both in terms of reliability and investments. In order to safeguard the required quality at acceptable costs, decisions must be based on a reliable forecast of future behaviour. The aim

  9. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during operation, with the purpose of improving the safety or the reliability. Due to plant complexity and safety...... and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic...... approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very...

  10. Web server's reliability improvements using recurrent neural networks

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albu, Rǎzvan-Daniel; Felea, Ioan

    2012-01-01

    In this paper we describe an interesting approach to error prediction illustrated by experimental results. The application consists of monitoring the activity for the web servers in order to collect the specific data. Predicting an error with severe consequences for the performance of a server (t...... usage, network usage and memory usage. We collect different data sets from monitoring the web server's activity and for each one we predict the server's reliability with the proposed recurrent neural network. © 2012 Taylor & Francis Group...
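The shape of the computation behind this record (a recurrent network mapping a sequence of monitored server metrics to a predicted error probability) can be sketched with a minimal Elman-style cell. The weights below are random placeholders purely to show the structure; the authors train theirs on real monitoring data, and the metric names are the ones the abstract lists.

```python
import numpy as np

rng = np.random.default_rng(1)

# Minimal Elman-style recurrent cell. Inputs mimic the monitored metrics
# (CPU, network, memory usage per time step); weights are untrained
# placeholders, so the output illustrates the architecture, not a fit.
n_in, n_hidden = 3, 8
Wxh = rng.normal(0, 0.5, (n_in, n_hidden))   # input -> hidden
Whh = rng.normal(0, 0.5, (n_hidden, n_hidden))  # hidden -> hidden (recurrence)
Why = rng.normal(0, 0.5, (n_hidden, 1))      # hidden -> output

def predict_failure_prob(seq):
    """seq: (T, 3) array of [cpu, net, mem] usage in [0, 1]."""
    h = np.zeros(n_hidden)
    for x in seq:                 # the recurrent state carries the history
        h = np.tanh(x @ Wxh + h @ Whh)
    logit = float(h @ Why)
    return 1.0 / (1.0 + np.exp(-logit))  # P(severe error in next window)

window = rng.uniform(0, 1, (60, 3))      # e.g. one hour at 1-minute samples
p = predict_failure_prob(window)
```

The recurrence is what lets the predictor react to trends (e.g. steadily climbing memory usage) rather than to a single snapshot, which is the advantage the paper claims over static predictors.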

  11. The experimental and shell model approach to 100Sn

    International Nuclear Information System (INIS)

    Grawe, H.; Maier, K.H.; Fitzgerald, J.B.; Heese, J.; Spohr, K.; Schubart, R.; Gorska, M.; Rejmund, M.

    1995-01-01

The present status of the experimental approach to 100Sn and its shell-model structure is given. New developments in experimental techniques, such as low-background isomer spectroscopy and charged-particle detection in 4π, are surveyed. Based on recent experimental data, shell-model calculations are used to predict the structure of the single- and two-nucleon neighbours of 100Sn. The results are compared to the systematics of Coulomb energies and spin-orbit splitting and discussed with respect to future experiments. (author). 51 refs, 11 figs, 1 tab

  12. Unlocking water markets: an experimental approach

    Science.gov (United States)

    Cook, J.; Rabotyagov, S.

    2011-12-01

Water markets are frequently referred to as a promising approach to alleviate stress on water systems, especially as future hydrologic assessments suggest increasing demand and less reliable supply. Yet, despite decades of advocacy by water resource economists, water markets (leases and sales of water rights between willing buyers and sellers) have largely failed to develop in the western US. Although there are a number of explanations for this failure, we explore one potential reason that has received less attention: farmers, as sellers, may have preferences for different elements of a water market transaction that are not captured in the relative comparison of their profits from farming and their profits from agreeing to a deal. We test this explanation by recruiting irrigators with senior water rights in the upper Yakima River Basin in Washington state to participate in a series of experimental auctions. In concept, the Yakima Basin is well situated for water market transactions, as it has significant water shortages for junior water users in ~15% of years, and projections show these are likely to increase in the future. Participants were asked a series of questions about the operation of a hypothetical 100-acre timothy hay farm, including the type of buyer, how the water bank is managed, the lease type, and the offer price. Results from 7 sessions with irrigators (n=49) and a comparison group of undergraduates (n=38) show that irrigators are more likely to accept split-season than full-season leases (controlling for differences in farm profits), more likely to accept a lease from an irrigation district, and less likely to accept an offer from a developer. Most notably, we find farmers were far more likely than students to reject offers from buyers even though accepting would increase their winnings from the experiment. These results could be used in ongoing water supply policy debates in the Yakima Basin to simulate the amount of water that could be freed by water

  13. Safety, reliability, risk management and human factors: an integrated engineering approach applied to nuclear facilities

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Silva, Eliane Magalhaes Pereira da; Costa, Antonio Carlos Lopes da; Reis, Sergio Carneiro dos

    2009-01-01

Nuclear energy has an important engineering legacy to share with conventional industry. Much of the development of the tools related to safety, reliability, risk management, and human factors is associated with nuclear plant processes, mainly because of public concern about nuclear power generation. Despite the close association between these subjects, there are some important differences in approach. The reliability engineering approach uses several techniques to minimize the component failures that cause the failure of complex systems. These techniques include, for instance, redundancy, diversity, standby sparing, safety factors, and reliability-centered maintenance. System safety, on the other hand, is primarily concerned with hazard management, that is, the identification, evaluation and control of hazards. Rather than just looking at failure rates or engineering strengths, system safety examines the interactions among system components. The events that cause accidents may be complex combinations of component failures, faulty maintenance, design errors, human actions, or actuation of instrumentation and control. System safety thus deals with a broader spectrum of risk management, including ergonomics, legal requirements, quality control, public acceptance, political considerations, and many other non-technical influences. Addressing these subjects individually can compromise the completeness of the analysis and the measures associated with both risk reduction and the improvement of safety and reliability. By analyzing together the engineering systems and controls of a nuclear facility, its management systems and operational procedures, and the human factors engineering, many benefits can be realized. This paper proposes an integration of these issues based on the application of systems theory. (author)

  14. Safety, reliability, risk management and human factors: an integrated engineering approach applied to nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Silva, Eliane Magalhaes Pereira da; Costa, Antonio Carlos Lopes da; Reis, Sergio Carneiro dos [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)], e-mail: vasconv@cdtn.br, e-mail: silvaem@cdtn.br, e-mail: aclc@cdtn.br, e-mail: reissc@cdtn.br

    2009-07-01

Nuclear energy has an important engineering legacy to share with conventional industry. Much of the development of the tools related to safety, reliability, risk management, and human factors is associated with nuclear plant processes, mainly because of public concern about nuclear power generation. Despite the close association between these subjects, there are some important differences in approach. The reliability engineering approach uses several techniques to minimize the component failures that cause the failure of complex systems. These techniques include, for instance, redundancy, diversity, standby sparing, safety factors, and reliability-centered maintenance. System safety, on the other hand, is primarily concerned with hazard management, that is, the identification, evaluation and control of hazards. Rather than just looking at failure rates or engineering strengths, system safety examines the interactions among system components. The events that cause accidents may be complex combinations of component failures, faulty maintenance, design errors, human actions, or actuation of instrumentation and control. System safety thus deals with a broader spectrum of risk management, including ergonomics, legal requirements, quality control, public acceptance, political considerations, and many other non-technical influences. Addressing these subjects individually can compromise the completeness of the analysis and the measures associated with both risk reduction and the improvement of safety and reliability. By analyzing together the engineering systems and controls of a nuclear facility, its management systems and operational procedures, and the human factors engineering, many benefits can be realized. This paper proposes an integration of these issues based on the application of systems theory. (author)

  15. An integrated approach to estimate storage reliability with initial failures based on E-Bayesian estimates

    International Nuclear Information System (INIS)

    Zhang, Yongjin; Zhao, Ming; Zhang, Shitao; Wang, Jiamei; Zhang, Yanjun

    2017-01-01

• An integrated approach to estimate the storage reliability is proposed. • A non-parametric measure to estimate the number of failures and the reliability at each testing time is presented. • The E-Bayesian method to estimate the failure probability is introduced. • Possible initial failures in storage are taken into account. • The non-parametric estimates of failure numbers can be used in the parametric models.

  16. Pitting corrosion and structural reliability of corroding RC structures: Experimental data and probabilistic analysis

    International Nuclear Information System (INIS)

    Stewart, Mark G.; Al-Harthy, Ali

    2008-01-01

A stochastic analysis is developed to assess the effect of the temporal and spatial variability of pitting corrosion on the reliability of corroding reinforced concrete (RC) structures. The structure considered herein is a singly reinforced RC beam with Y16 or Y27 reinforcing bars. Experimental data obtained from corrosion tests are used to characterise the probability distribution of pit depth. The RC beam is discretised into a series of small elements, and maximum pit depths are generated for each reinforcing steel bar in each element. The loss of cross-sectional area, reduction in yield strength and reduction in flexural resistance are then inferred. The analysis considers various member spans, loading ratios, bar diameters, numbers of bars in a given cross-section, and moment diagrams. It was found that the maximum corrosion loss in a reinforcing bar conditional on beam collapse was no more than 16%. The probabilities of failure considering spatial variability of pitting corrosion were up to 200% higher than probabilities of failure obtained from a non-spatial analysis after 50 years of corrosion. This shows the importance of considering spatial variability in a structural reliability analysis for deteriorating structures, particularly for corroding RC beams in flexure.
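The spatial idea in this record can be illustrated with a small Monte Carlo sketch: a beam split into elements, a maximum pit depth drawn per bar and element, and failure governed by the worst cross-section. All numbers below (Gumbel parameters, pit geometry, demand ratio) are made up for illustration, not the paper's corrosion-test fits.

```python
import numpy as np

rng = np.random.default_rng(2)

# Beam discretised into elements; one extreme-value (Gumbel) max pit
# depth per bar per element. Parameters are illustrative only.
n_sim, n_elem, n_bars = 20_000, 20, 4
d0 = 16.0                                  # Y16 bar diameter, mm
area0 = np.pi * d0**2 / 4.0                # pristine bar area, mm^2

pits = rng.gumbel(4.0, 1.0, (n_sim, n_elem, n_bars))   # max pit depth, mm
loss = np.pi * pits**2 / 2.0               # hemispherical-pit area loss, mm^2
frac = loss.sum(axis=2) / (n_bars * area0) # section loss per element
capacity = 1.0 - frac                      # crude flexural capacity ratio
demand = 0.75                              # applied moment / pristine capacity

# Spatial analysis: the beam fails if ANY element is too weak.
pf_spatial = float(np.mean(capacity.min(axis=1) < demand))
# Non-spatial analysis: a single representative cross-section.
pf_single = float(np.mean(capacity[:, 0] < demand))
```

Because the spatial analysis takes the minimum capacity over many elements, `pf_spatial` always exceeds `pf_single`, which is the mechanism behind the record's finding that non-spatial analyses can understate failure probabilities substantially.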

  17. A novel approach to the experimental study on methane/steam reforming kinetics using the Orthogonal Least Squares method

    Science.gov (United States)

    Sciazko, Anna; Komatsu, Yosuke; Brus, Grzegorz; Kimijima, Shinji; Szmyd, Janusz S.

    2014-09-01

For a mathematical model based on the results of physical measurements, it becomes possible to determine the influence of those measurements on the final solution and its accuracy. In classical approaches, however, the influence of model simplifications on the reliability of the obtained results is usually not comprehensively discussed. This paper presents a novel approach to the study of methane/steam reforming kinetics based on an advanced methodology, the Orthogonal Least Squares method. The kinetic parameters for the reforming process published earlier diverge among themselves. To obtain the most probable values of the kinetic parameters and enable direct and objective model verification, an appropriate calculation procedure needs to be proposed. The applied Generalized Least Squares (GLS) method includes all the experimental results in the mathematical model, which becomes internally contradictory, as the number of equations is greater than the number of unknown variables. The GLS method is adopted to select the most probable values of the results and simultaneously determine the uncertainty coupled with all the variables in the system. In this paper, the reaction rate was evaluated after its pre-determination by preliminary calculations based on experimental results obtained over a Nickel/Yttria-stabilized Zirconia catalyst.
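The core of the least-squares treatment can be illustrated in much-reduced form: fitting Arrhenius parameters to rate measurements with point-specific uncertainties by weighted least squares on the log-linearized law. This is a plain weighted fit, not the paper's full GLS reconciliation, and the rate constants, temperatures and error levels below are invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(3)
R = 8.314  # J/(mol*K)

# Synthetic "measurements" of a reforming rate constant at several
# temperatures, each with its own relative error. k0 and E are
# illustrative, not the paper's values.
k0_true, E_true = 2.0e6, 95_000.0
T = np.array([873.0, 923.0, 973.0, 1023.0, 1073.0])   # K
sigma = np.array([0.10, 0.08, 0.06, 0.06, 0.05])      # rel. error on ln k
k_meas = k0_true * np.exp(-E_true / (R * T)) * np.exp(rng.normal(0, sigma))

# ln k = ln k0 - (E/R)*(1/T): linear in the regressors [1, 1/T].
A = np.column_stack([np.ones_like(T), 1.0 / T])
W = np.diag(1.0 / sigma**2)                 # weight each point by 1/variance
# Weighted normal equations: (A^T W A) theta = A^T W ln(k)
theta = np.linalg.solve(A.T @ W @ A, A.T @ W @ np.log(k_meas))
k0_fit = float(np.exp(theta[0]))
E_fit = float(-theta[1] * R)                # activation energy, J/mol
```

Weighting by inverse variance is the step that lets less certain measurements count for less, which is the same motivation the GLS method pursues across all variables of the full kinetic model.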

  18. Managing Cybersecurity Research and Experimental Development: The REVO Approach

    Directory of Open Access Journals (Sweden)

    Dan Craigen

    2013-07-01

We present a systematic approach for managing a research and experimental development cybersecurity program that must be responsive to continuously evolving cybersecurity and other operational concerns. The approach will be of interest to research-program managers, academe, corporate leads, government leads, chief information officers, chief technology officers, and social and technology policy analysts. The approach is compatible with international standards and procedures published by the Organisation for Economic Co-operation and Development (OECD) and the Treasury Board of Canada Secretariat (TBS). The key benefits of the approach are the following: i) the breadth of the overall (cyber)security space is described; ii) depth statements about specific (cyber)security challenges are articulated and mapped to the breadth of the problem; iii) specific (cyber)security initiatives that have been resourced through funding or personnel are tracked and linked to specific challenges; and iv) progress is assessed through key performance indicators. Although we present examples from cybersecurity, the method may be transferred to other domains. We have found the approach to be rigorous yet adaptive to change; it challenges an organization to be explicit about the nature of its research and experimental development in a manner that fosters alignment with evolving business priorities, knowledge transfer, and partner engagement.

  19. Human reliability analysis

    International Nuclear Information System (INIS)

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

The authors present a treatment of human reliability analysis, incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory, drawing upon reliability analysis, psychology, human factors engineering, and statistics, and integrating elements of these fields within a systems framework. The work provides a history of human reliability analysis and includes examples of the application of the systems approach.

  20. Developing a novel hierarchical approach for multiscale structural reliability predictions for ultra-high consequence applications

    Energy Technology Data Exchange (ETDEWEB)

    Emery, John M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Coffin, Peter [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Robbins, Brian A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Carroll, Jay [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Field, Richard V. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jeremy Yoo, Yung Suk [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kacher, Josh [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

Microstructural variabilities are among the predominant sources of uncertainty in structural performance and reliability. We seek to develop efficient algorithms for multiscale calculations for polycrystalline alloys such as aluminum alloy 6061-T6 in environments where ductile fracture is the dominant failure mode. Our approach employs concurrent multiscale methods, but does not focus on their development. They are a necessary but not sufficient ingredient to multiscale reliability predictions. We have focused on how to efficiently use concurrent models for forward propagation because practical applications cannot include fine-scale details throughout the problem domain due to exorbitant computational demand. Our approach begins with a low-fidelity prediction at the engineering scale that is subsequently refined with multiscale simulation. The results presented in this report focus on plasticity and damage at the meso-scale, efforts to expedite Monte Carlo simulation with microstructural considerations, modeling aspects regarding geometric representation of grains and second-phase particles, and contrasting algorithms for scale coupling.

  1. A standard for test reliability in group research.

    Science.gov (United States)

    Ellis, Jules L

    2013-03-01

Many authors adhere to the rule that test reliabilities should be at least .70 or .80 in group research. This article introduces a new standard according to which reliabilities can be evaluated. This standard is based on the costs or time of the experiment and of administering the test. For example, if test administration costs are 7% of the total experimental costs, the efficient value of the reliability is .93. If the actual reliability of a test is equal to this efficient reliability, the test size maximizes the statistical power of the experiment, given the costs. As a standard in experimental research, it is proposed that the reliability of the dependent variable be close to the efficient reliability. Adhering to this standard will enhance the statistical power and reduce the costs of experiments.
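The worked numbers in the abstract (administration at 7% of total cost giving an efficient reliability of .93) are consistent with the simple relation r_eff = 1 − cost fraction. The helper below assumes that relation holds in general; that is an inference from the abstract's single example, not a formula quoted from the article.

```python
# Assumed relation inferred from the abstract's example (7% -> .93):
# efficient reliability = 1 - (administration cost / total cost).
def efficient_reliability(admin_cost, total_cost):
    """Reliability at which the test maximizes statistical power per unit cost."""
    if not 0 < admin_cost < total_cost:
        raise ValueError("administration cost must be a positive share of total cost")
    return 1.0 - admin_cost / total_cost

print(efficient_reliability(7.0, 100.0))   # reproduces the article's .93 example
```

Under this reading, cheap-to-administer tests should be long and highly reliable, while expensive tests can tolerate lower reliability, which matches the cost-based framing of the standard.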

  2. Reliability modeling of a hard real-time system using the path-space approach

    International Nuclear Information System (INIS)

    Kim, Hagbae

    2000-01-01

A hard real-time system, such as a fly-by-wire system, fails catastrophically (e.g. losing stability) if its control inputs are not updated by its digital controller computer within a certain timing constraint called the hard deadline. To assess and validate such systems' reliabilities using a semi-Markov model that explicitly contains the deadline information, we propose a path-space approach deriving upper and lower bounds on the probability of system failure. These bounds are derived using only simple parameters, and they are especially suitable for highly reliable systems which should recover quickly. Analytical bounds are derived for both the exponential and Weibull failure distributions commonly encountered, and have proven effective through numerical examples, while considering three repair strategies: repair-as-good-as-new, repair-as-good-as-old, and repair-better-than-old.

  3. Probability of extreme interference levels computed from reliability approaches: application to transmission lines with uncertain parameters

    International Nuclear Information System (INIS)

    Larbi, M.; Besnier, P.; Pecqueux, B.

    2014-01-01

This paper deals with the risk analysis of an EMC fault using a statistical approach based on reliability methods from probabilistic engineering mechanics. The probability of failure (i.e. the probability of exceeding a threshold) of a current induced by crosstalk is computed, taking into account uncertainties in the input parameters that influence interference levels in transmission lines. The study allowed us to evaluate the probability of failure of the induced current using reliability methods with a relatively low computational cost compared to Monte Carlo simulation. (authors)
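The exceedance-probability computation can be illustrated on a deliberately simple stand-in for the crosstalk problem: if the induced current is modelled as a product of lognormal factors, the probability of exceeding a threshold has a closed form against which a Monte Carlo estimate can be checked. The multiplicative model and all parameter values are invented for the sketch; the paper's transmission-line model is far richer, which is exactly why it resorts to cheaper reliability methods.

```python
import numpy as np
from math import erf, log, sqrt

rng = np.random.default_rng(4)

# Toy model: induced current I = coupling factor (A/m) x exposed length (m),
# both lognormal, so ln I is normal and P(I > threshold) is exact.
mu_c, s_c = log(2e-3), 0.40     # coupling factor
mu_L, s_L = log(10.0), 0.25     # exposed line length
I_th = 0.05                     # interference threshold, A

# Brute-force Monte Carlo estimate of the exceedance probability.
c = rng.lognormal(mu_c, s_c, 500_000)
L = rng.lognormal(mu_L, s_L, 500_000)
pf_mc = float(np.mean(c * L > I_th))

# Closed form: ln I ~ Normal(mu_c + mu_L, sqrt(s_c^2 + s_L^2)).
m, s = mu_c + mu_L, sqrt(s_c**2 + s_L**2)
z = (log(I_th) - m) / s
pf_exact = 1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0)))
```

The half-million samples needed for a stable Monte Carlo estimate here are the cost that approximate reliability methods such as FORM aim to avoid on expensive transmission-line models.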

  4. An experimental study of assessment of weld quality on fatigue reliability analysis of a nuclear pressure vessel

    International Nuclear Information System (INIS)

    Dai Shuhe

    1993-01-01

The steam generator in the primary coolant system of the Qinshan Nuclear Power Plant PWR in China is a crucial unit belonging to the category of nuclear pressure vessels. The purpose of this research work is to examine the weld quality of the steam generator under fatigue loading and to assess its reliability, using the experimental results of fatigue tests of the nuclear pressure vessel material S-271 (Chinese Standard) and of qualification tests of the welded seams of a simulated prototype of the steam generator's bottom closure head. A guarantee of weld quality is proposed as a subsequent verification for the China National Nuclear Safety Supervision Bureau. The results of the reliability analysis reported in this work can be taken as supplementary material for the Probabilistic Safety Assessment (PSA) of the Qinshan Nuclear Power Plant. In accordance with the cyclic testing requirements of Provision II-1500, ASME Boiler and Pressure Vessel Code, Section III, Rules for Construction of Nuclear Power Plant Components, a simulated prototype of the bottom closure head of the steam generator was made for qualification tests. Two proposals are presented for quantifying the reliability assessment using the testing data.

  5. Test Reliability at the Individual Level

    Science.gov (United States)

    Hu, Yueqin; Nesselroade, John R.; Erbacher, Monica K.; Boker, Steven M.; Burt, S. Alexandra; Keel, Pamela K.; Neale, Michael C.; Sisk, Cheryl L.; Klump, Kelly

    2016-01-01

Reliability has a long history as one of the key psychometric properties of a test. However, a given test might not measure all people equally reliably: test scores from some individuals may have considerably greater error than those from others. This study proposed two approaches that use intraindividual variation to estimate test reliability for each person. A simulation study suggested that both the parallel tests approach and the structural equation modeling approach recovered the simulated reliability coefficients. Then, in an empirical study in which forty-five females completed the Positive and Negative Affect Schedule (PANAS) daily for 45 consecutive days, separate estimates of reliability were generated for each person. Results showed that reliability estimates of the PANAS varied substantially from person to person. The methods provided in this article apply to tests measuring changeable attributes and require repeated measures across time on each individual. This article also provides a set of parallel forms of the PANAS. PMID:28936107
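The parallel tests idea translates into a very small computation: if each person answers two parallel halves of a scale every day, the across-day correlation of the halves, stepped up by the Spearman-Brown formula, gives that person's reliability. The simulated data below (two hypothetical respondents, one with low and one with high measurement error) are only meant to show why the estimates can differ by person; they are not the study's data or its exact estimator.

```python
import numpy as np

rng = np.random.default_rng(5)

n_days = 45   # matching the study's 45 consecutive daily measurements

def person_reliability(half_a, half_b):
    """Per-person reliability from two parallel halves measured across days."""
    r = float(np.corrcoef(half_a, half_b)[0, 1])  # parallel-forms correlation
    return 2 * r / (1 + r)                        # Spearman-Brown to full length

reliabilities = []
for true_sd, err_sd in [(1.0, 0.5), (1.0, 1.5)]:  # reliable vs noisy person
    true_state = rng.normal(0, true_sd, n_days)   # daily "true" affect level
    half_a = true_state + rng.normal(0, err_sd, n_days)
    half_b = true_state + rng.normal(0, err_sd, n_days)
    reliabilities.append(person_reliability(half_a, half_b))
```

With identical true-score variance, the person with larger measurement error comes out markedly less reliable, which is the person-to-person variation the study reports for the PANAS.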

  6. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

The present proceedings, from a course on Bayesian methods in reliability, encompass Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and the nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert-judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics model.
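The basic mechanic underlying most of the topics listed here is the conjugate Bayesian update of a failure rate. A minimal sketch, with illustrative numbers rather than anything from the course: a Gamma prior on a constant failure rate, updated with failures observed over accumulated test time (the Gamma prior is conjugate for Poisson-distributed failure counts).

```python
# Gamma(a, b) prior on a constant failure rate lambda (failures per hour):
# prior mean a/b. Observing k failures in t hours of exposure updates it
# to Gamma(a + k, b + t). All numbers below are illustrative.
a0, b0 = 2.0, 2000.0            # prior: mean 1e-3 /h, fairly diffuse
failures, hours = 3, 10_000.0   # observed data

a_post, b_post = a0 + failures, b0 + hours
rate_mean = a_post / b_post     # posterior mean failure rate
rate_var = a_post / b_post**2   # posterior variance

print(rate_mean)                # pulled between prior mean and k/t
```

The posterior mean sits between the prior mean (1e-3 per hour) and the raw observed rate (3/10000), with the prior's weight shrinking as test hours accumulate; this is the same machinery behind the pipeline leak-rate estimate mentioned in the abstract.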

  7. Discussion about the use of the volume specific surface area (VSSA) as a criterion to identify nanomaterials according to the EU definition. Part two: experimental approach.

    Science.gov (United States)

    Lecloux, André J; Atluri, Rambabu; Kolen'ko, Yury V; Deepak, Francis Leonard

    2017-10-12

The first part of this study was dedicated to modelling the influence of particle shape, porosity and particle size distribution on volume specific surface area (VSSA) values, in order to check the applicability of this concept to the identification of nanomaterials according to the European Commission Recommendation. In this second part, experimental VSSA values are obtained for various samples from nitrogen adsorption isotherms, and these values are used as a screening tool to identify and classify nanomaterials. These identification results are compared to the identification based on the criterion of 50% of particles with a size below 100 nm, applied to the experimental particle size distributions obtained by analysis of electron microscopy images of the same materials. It is concluded that the experimental VSSA values are able to identify nanomaterials, without false negative identifications, if the particles have a mono-modal size distribution, if the adsorption data cover the relative pressure range from 0.001 to 0.65, and if a simple, qualitative image of the particles by transmission or scanning electron microscopy is available to define their shape. The experimental conditions for obtaining reliable adsorption data, as well as the way to analyze the adsorption isotherms, are described and discussed in some detail to help the reader use the experimental VSSA criterion. To obtain the experimental VSSA values, the BET surface area can be used for non-porous particles; for porous, nanostructured or coated nanoparticles, however, only the external surface of the particles, obtained by a modified t-plot approach, should be considered, since only the external surface area is related to the particle size, and this avoids false positive identification of nanomaterials. Finally, the availability of experimental VSSA values together with particle size distributions obtained by electron microscopy gave the opportunity to check the
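The screening arithmetic itself is simple: a specific surface area measured per unit mass (m²/g) is converted to a volume-specific value (m²/cm³) via the skeletal density, then compared with the 60 m²/cm³ screening threshold of the EC Recommendation. The example material and numbers below are illustrative; as the abstract stresses, for porous or coated particles the external (t-plot) surface area should be passed in, not the total BET area.

```python
# VSSA screening per the EC Recommendation: VSSA > 60 m^2/cm^3 flags a
# material as a likely nanomaterial (threshold corresponds to non-porous
# spheres of 100 nm diameter).
VSSA_THRESHOLD = 60.0  # m^2/cm^3

def vssa(ssa_m2_per_g, density_g_per_cm3):
    """Convert a mass-specific surface area to a volume-specific one."""
    return ssa_m2_per_g * density_g_per_cm3

def flag_as_nanomaterial(ssa_m2_per_g, density_g_per_cm3):
    return vssa(ssa_m2_per_g, density_g_per_cm3) > VSSA_THRESHOLD

# Illustrative example: a non-porous oxide with density ~4.2 g/cm^3 and a
# BET area of 50 m^2/g gives VSSA = 210 m^2/cm^3, well above the threshold.
print(flag_as_nanomaterial(50.0, 4.2))
```

The same 50 m²/g material at a density of, say, 1.0 g/cm³ would fall below the threshold, which is why the conversion through density, and the choice of the right surface area, matter for the screening outcome.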

  8. Bayesian approach in the power electric systems study of reliability ...

    African Journals Online (AJOL)

Keywords: Reliability - Power System - Bayes Theorem - Weibull Model - Probability. … ensure a series of estimated parameters (failure rate, mean time to failure, …) … only on a random variable (r.v.) describing the operating conditions … Multivariate performance reliability prediction in real-time, Reliability Engineering …

  9. Modeling and Forecasting (Un)Reliable Realized Covariances for More Reliable Financial Decisions

    DEFF Research Database (Denmark)

    Bollerslev, Tim; Patton, Andrew J.; Quaedvlieg, Rogier

We propose a new framework for modeling and forecasting common financial risks based on (un)reliable realized covariance measures constructed from high-frequency intraday data. Our new approach explicitly incorporates the effect of measurement errors and time-varying attenuation biases...

  10. Different Approaches for Ensuring Performance/Reliability of Plastic Encapsulated Microcircuits (PEMs) in Space Applications

    Science.gov (United States)

    Gerke, R. David; Sandor, Mike; Agarwal, Shri; Moor, Andrew F.; Cooper, Kim A.

    2000-01-01

Engineers within the commercial and aerospace industries are using trade-off and risk analysis to aid in reducing spacecraft system cost while increasing performance and maintaining high reliability. In many cases, Commercial Off-The-Shelf (COTS) components, which include Plastic Encapsulated Microcircuits (PEMs), are candidate packaging technologies for spacecraft due to their lower cost, lower weight and enhanced functionality. Establishing and implementing a parts program that effectively and reliably makes use of these potentially less reliable, but state-of-the-art, devices has become a significant portion of the job for the parts engineer. Assembling a reliable high-performance electronic system that includes COTS components requires that the end user assume a risk. To minimize the risk involved, companies have developed methodologies by which they use accelerated stress testing to assess the product and reduce the risk involved to the total system. Currently, there are no industry-standard procedures for accomplishing this risk mitigation. This paper presents the approaches for reducing the risk of using PEMs devices in space flight systems as developed by two independent laboratories. The JPL procedure primarily involves tailored screening with an accelerated-stress philosophy, while the APL procedure is primarily a lot-qualification procedure. Both laboratories have successfully reduced the risk of using the particular devices for their respective systems and mission requirements.

  11. Some remarks on software reliability

    International Nuclear Information System (INIS)

    Gonzalez Hernando, J.; Sanchez Izquierdo, J.

    1978-01-01

    The trend in modern NPPCI is toward broad use of programmable elements. Some aspects of the present status of programmable digital systems reliability are reported. Basic differences between the software and hardware concepts require a specific approach to all reliability topics concerning software systems. Software reliability theory was initially developed upon analogies with hardware models. At present this approach is changing and specific models are being developed. The growing use of programmable systems makes it necessary to emphasize the importance of more adequate regulatory requirements to include this technology in NPPCI. (author)

  12. A two-stage approach for multi-objective decision making with applications to system reliability optimization

    International Nuclear Information System (INIS)

    Li Zhaojun; Liao Haitao; Coit, David W.

    2009-01-01

    This paper proposes a two-stage approach for solving multi-objective system reliability optimization problems. In this approach, a Pareto optimal solution set is initially identified at the first stage by applying a multiple objective evolutionary algorithm (MOEA). Quite often there are a large number of Pareto optimal solutions, and it is difficult, if not impossible, to effectively choose the representative solutions for the overall problem. To overcome this challenge, an integrated multiple objective selection optimization (MOSO) method is utilized at the second stage. Specifically, a self-organizing map (SOM), with the capability of preserving the topology of the data, is applied first to classify those Pareto optimal solutions into several clusters with similar properties. Then, within each cluster, the data envelopment analysis (DEA) is performed, by comparing the relative efficiency of those solutions, to determine the final representative solutions for the overall problem. Through this sequential solution identification and pruning process, the final recommended solutions to the multi-objective system reliability optimization problem can be easily determined in a more systematic and meaningful way.
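
    The second-stage pruning described above can be sketched as follows. This is a minimal illustration only: plain k-means stands in for the SOM, and a simple reliability-per-unit-cost ratio stands in for the full DEA efficiency model; the Pareto front is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical Pareto front for a reliability-vs-cost problem (illustrative
# numbers only): column 0 = system reliability (maximize), column 1 = cost
# (minimize).
pareto = np.column_stack([
    np.linspace(0.90, 0.99, 30),
    np.linspace(1.0, 4.0, 30) ** 1.5,
])

def kmeans(points, k, iters=50):
    """Plain k-means, standing in for the SOM clustering stage."""
    centers = points[rng.choice(len(points), size=k, replace=False)].copy()
    for _ in range(iters):
        # squared distance of every point to every center
        d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels

labels = kmeans(pareto, k=3)

# Within each non-empty cluster, keep the solution with the best simple
# output/input ratio (reliability per unit cost); this replaces the DEA
# efficiency comparison of the paper.
representatives = np.array([
    pareto[labels == j][np.argmax(pareto[labels == j][:, 0] /
                                  pareto[labels == j][:, 1])]
    for j in np.unique(labels)
])
```

    The result is a handful of representative solutions, one per cluster, rather than the full Pareto set.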

  13. Reliability analysis of reactor protection systems

    International Nuclear Information System (INIS)

    Alsan, S.

    1976-07-01

    A theoretical mathematical study of reliability is presented and the concepts defined are then applied to the study of nuclear reactor safety systems. The theory is applied to investigations of the operational reliability of the Siloe reactor from the point of view of rod drop. A statistical study conducted between 1964 and 1971 demonstrated that most rod drop incidents arose from circumstances associated with experimental equipment (new set-ups). The reliability of the most suitable safety system for some recently developed experimental equipment is discussed. Calculations indicate that if all experimental equipment were equipped with these new systems, only 1.75 rod drop accidents would be expected to occur per year on average. It is suggested that all experimental equipment should be equipped with these new safety systems and tested every 21 days. The reliability of the new safety system currently being studied for the Siloe reactor was also investigated. The following results were obtained: definite failures must be detected immediately as a result of the disturbances produced; the repair time must not exceed a few hours; the equipment must be tested every week. Under such conditions, the rate of accidental rod drops is about 0.013 per year on average. The level of nondefinite failures is less than 10^-6 per hour and the level of nonprotection 1 hour per year. (author)

  14. Reliability model analysis and primary experimental evaluation of laser triggered pulse trigger

    International Nuclear Information System (INIS)

    Chen Debiao; Yang Xinglin; Li Yuan; Li Jin

    2012-01-01

    A high-performance pulse trigger can enhance the performance and stability of the PPS. It is necessary to evaluate the reliability of the LTGS pulse trigger, so we establish a reliability analysis model of this pulse trigger based on the CARMES software; the reliability evaluation agrees with the statistical results. (authors)

  15. Investment in new product reliability

    International Nuclear Information System (INIS)

    Murthy, D.N.P.; Rausand, M.; Virtanen, S.

    2009-01-01

    Product reliability is of great importance to both manufacturers and customers. Building reliability into a new product is costly, but the consequences of inadequate product reliability can be costlier. This implies that manufacturers need to decide on the optimal investment in new product reliability by achieving a suitable trade-off between the two costs. This paper develops a framework and proposes an approach to help manufacturers decide on the investment in new product reliability.
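
    The trade-off described above can be illustrated numerically. The cost functions below are hypothetical stand-ins, not the paper's model: development cost grows without bound as reliability approaches one, while expected failure (warranty) cost falls with reliability; the optimum balances the two.

```python
import numpy as np

# Illustrative cost model (all numbers hypothetical): building a product to
# reliability level R gets ever more expensive as R -> 1 ...
def development_cost(R, c_dev=10_000.0):
    return c_dev * R / (1.0 - R)

# ... while the expected cost of field failures falls linearly with R.
def failure_cost(R, n_units=5_000, c_fail=40.0):
    return n_units * (1.0 - R) * c_fail

# Grid search for the reliability level minimizing total cost.
R_grid = np.linspace(0.50, 0.999, 2000)
total = development_cost(R_grid) + failure_cost(R_grid)
R_opt = R_grid[np.argmin(total)]
```

    With these assumed costs the optimum lies strictly inside the interval: investing beyond R_opt costs more in development than it saves in failures.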

  16. Reliability calculations

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1986-03-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during operation, with the purpose of improving safety or reliability. Due to plant complexity and safety and availability requirements, sophisticated tools that are flexible and efficient are needed. Such tools have been developed over the last 20 years and have to be continuously refined to meet growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability, Monte Carlo simulation programs are used especially in the analysis of very complex systems. In order to increase the applicability of the programs, variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for the implementation of importance sampling are suggested. (author)
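
    The importance-sampling idea mentioned above can be sketched for a rare-event reliability estimate. The exponential lifetime model, the threshold, and the heavier-tailed proposal are illustrative assumptions, not taken from the report: drawing from the proposal and reweighting by the likelihood ratio concentrates samples in the failure region.

```python
import numpy as np

rng = np.random.default_rng(42)
t = 10.0                      # failure threshold
p_true = np.exp(-t)           # P(X > t) for X ~ Exp(1), about 4.5e-5

n = 100_000
# Crude Monte Carlo: almost no samples land in the rare failure region.
x = rng.exponential(1.0, n)
p_crude = np.mean(x > t)

# Importance sampling: draw from a heavier proposal Exp(mean=10) and
# reweight each sample by the likelihood ratio f(y)/g(y).
y = rng.exponential(10.0, n)
w = 10.0 * np.exp(-0.9 * y)   # e^{-y} / (0.1 e^{-0.1 y})
p_is = np.mean(w * (y > t))
```

    With the same sample size, the reweighted estimate sits within a few percent of the exact tail probability, while the crude estimate rests on only a handful of hits.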

  17. A Step by Step Approach for Evaluating the Reliability of the Main Engine Lube Oil System for a Ship's Propulsion System

    Directory of Open Access Journals (Sweden)

    Mohan Anantharaman

    2014-09-01

    Effective and efficient maintenance is essential to ensure reliability of a ship's main propulsion system, which in turn is interdependent on the reliability of a number of associated sub-systems. A primary step in evaluating the reliability of the ship's propulsion system will be to evaluate the reliability of each of the sub-systems. This paper discusses the methodology adopted to quantify the reliability of one of the vital sub-systems, viz. the lubricating oil system, and the development of a model thereof based on Markov analysis. Having developed the model, means to improve the reliability of the system should be considered. The cost of the incremental reliability should be measured to evaluate cost benefits. A maintenance plan can then be devised to achieve the higher level of reliability. A similar approach could be used to evaluate the reliability of all other sub-systems. This will finally lead to the development of a model to evaluate and improve the reliability of the main propulsion system.
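
    A Markov availability model of the kind described above can be sketched as follows. The three-state structure (healthy, degraded, failed) and all transition rates are illustrative assumptions, not the paper's actual lube-oil model; the steady-state probabilities come from solving the balance equations of the generator matrix.

```python
import numpy as np

# Hypothetical rates (per 1000 h) for a lube-oil subsystem:
# state 0 = healthy, 1 = degraded (e.g. dirty filter), 2 = failed.
lam01, lam12, lam02 = 0.5, 2.0, 0.1   # degradation / failure rates
mu10, mu20 = 10.0, 5.0                # repair rates back to healthy

# Generator matrix: rows sum to zero.
Q = np.array([
    [-(lam01 + lam02), lam01,           lam02],
    [mu10,             -(mu10 + lam12), lam12],
    [mu20,             0.0,             -mu20],
])

# Steady-state probabilities solve pi @ Q = 0 together with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

availability = pi[0] + pi[1]   # system runs in healthy or degraded state
```

    With these rates the long-run availability is about 0.97; raising a repair rate and re-solving shows directly how much incremental reliability a maintenance improvement buys.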

  18. Strength and Reliability of Wood for the Components of Low-cost Wind Turbines: Computational and Experimental Analysis and Applications

    DEFF Research Database (Denmark)

    Mishnaevsky, Leon; Freere, Peter; Sharma, Ranjan

    2009-01-01

    This paper reports the latest results of the comprehensive program of experimental and computational analysis of strength and reliability of wooden parts of low cost wind turbines. The possibilities of prediction of strength and reliability of different types of wood are studied in a series of experiments and computational investigations. Low cost testing machines have been designed and employed for the systematic analysis of different sorts of Nepali wood to be used for wind turbine construction. At the same time, computational micromechanical models of deformation and strength of wood are developed, which should provide the basis for microstructure-based correlation of observable and service properties of wood. Some correlations between microstructure, strength and service properties of wood have been established.

  19. The Bayes linear approach to inference and decision-making for a reliability programme

    International Nuclear Information System (INIS)

    Goldstein, Michael; Bedford, Tim

    2007-01-01

    In reliability modelling it is conventional to build sophisticated models of the probabilistic behaviour of the component lifetimes in a system in order to deduce information about the probabilistic behaviour of the system lifetime. Decision modelling of the reliability programme therefore requires, a priori, an even more sophisticated set of models in order to capture the evidence the decision maker believes may be obtained from different types of data acquisition. Bayes linear analysis is a methodology that uses expectation rather than probability as the fundamental expression of uncertainty. By working only with expected values, a simpler level of modelling is needed compared to full probability models. In this paper we consider the Bayes linear approach to the estimation of the mean time to failure (MTTF) of a component. The model built will take account of the variance in our estimate of the MTTF, based on a variety of sources of information.
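
    The Bayes linear adjustment works only with first- and second-order moments: the adjusted expectation is E_D(X) = E(X) + Cov(X, D) Var(D)^{-1} (D - E(D)), with a corresponding variance reduction. A minimal sketch with assumed prior moments (not the paper's numbers):

```python
import numpy as np

# Bayes linear adjustment of a component MTTF by two observed mean
# lifetimes.  All prior moments below are illustrative assumptions.
E_X = 1000.0                  # prior expectation of MTTF (hours)
V_X = 200.0 ** 2              # prior variance of MTTF

E_D = np.array([1000.0, 1000.0])          # prior expectation of the data
cov_XD = np.array([30000.0, 20000.0])     # Cov(X, D_i)
V_D = np.array([[50000.0, 10000.0],
                [10000.0, 80000.0]])      # Var(D)

D = np.array([1150.0, 900.0])             # observed sample means

# Adjusted expectation and variance: no full probability model is needed,
# only the specified moments.
gain = cov_XD @ np.linalg.inv(V_D)
E_adj = E_X + gain @ (D - E_D)
V_adj = V_X - gain @ cov_XD
```

    The adjusted variance is always no larger than the prior variance, quantifying how much each data source is expected to reduce uncertainty about the MTTF.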

  20. Reliability and validity of risk analysis

    International Nuclear Information System (INIS)

    Aven, Terje; Heide, Bjornar

    2009-01-01

    In this paper we investigate to what extent risk analysis meets the scientific quality requirements of reliability and validity. We distinguish between two types of approaches within risk analysis: relative frequency-based approaches and Bayesian approaches. The former category includes both traditional statistical inference methods and the so-called probability of frequency approach. Depending on the risk analysis approach, the aim of the analysis is different and the results are presented in different ways; consequently the meaning of the concepts reliability and validity is not the same.

  1. Mutations that Cause Human Disease: A Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Beernink, P; Barsky, D; Pesavento, B

    2006-01-11

    can be used to understand how an amino acid change affects the protein. The experimental methods that provide the most detailed structural information on proteins are X-ray crystallography and NMR spectroscopy. However, these methods are labor intensive and currently cannot be carried out on a genomic scale. Nonetheless, Structural Genomics projects are being pursued by more than a dozen groups and consortia worldwide and as a result the number of experimentally determined structures is rising exponentially. Based on the expectation that protein structures will continue to be determined at an ever-increasing rate, reliable structure prediction schemes will become increasingly valuable, leading to information on protein function and disease for many different proteins. Given known genetic variability and experimentally determined protein structures, can we accurately predict the effects of single amino acid substitutions? An objective assessment of this question would involve comparing predicted and experimentally determined structures, which thus far has not been rigorously performed. The completed research leveraged existing expertise at LLNL in computational and structural biology, as well as significant computing resources, to address this question.

  2. An Open Modelling Approach for Availability and Reliability of Systems - OpenMARS

    CERN Document Server

    Penttinen, Jussi-Pekka; Gutleber, Johannes

    2018-01-01

    This document introduces and gives the specification for OpenMARS, an open modelling approach for availability and reliability of systems. It supports the most common risk assessment and operation modelling techniques. Uniquely, OpenMARS allows combining and connecting models defined with different techniques. This ensures that a modeller has a high degree of freedom to accurately describe the modelled system without limitations imposed by an individual technique. Here the OpenMARS model definition is specified with a tool-independent tabular format, which supports managing models developed in a collaborative fashion. Our research originated in the Future Circular Collider (FCC) study, where we developed the unique features of our concept to model the availability and luminosity production of particle colliders. We were motivated to describe our approach in detail as we see potential further applications in performance and energy efficiency analyses of large scientific infrastructures or industrial processes.

  3. Experimentation on accuracy of non functional requirement prioritization approaches for different complexity projects

    OpenAIRE

    Raj Kumar Chopra; Varun Gupta; Durg Singh Chauhan

    2016-01-01

    Non functional requirements must be selected for implementation together with functional requirements to enhance the success of software projects. Three approaches exist for performing the prioritization of non functional requirements using a suitable prioritization technique. This paper performs experimentation on three different complexity versions of an industrial software project using the cost-value prioritization technique employing the three approaches. Experimentation is conducted to analyze the accuracy of the individual approaches and the variation of accuracy with the complexity of the software project.

  4. Optimization of Reliability Centered Maintenance Based on Maintenance Costs and Reliability with Consideration of Location of Components

    Directory of Open Access Journals (Sweden)

    Mahdi Karbasian

    2011-03-01

    The reliability of systems such as electrical and electronic circuits, power generation/distribution networks and mechanical systems, in which the failure of a single component may cause failure of the whole system, is critically important; so too is the reliability of cellular manufacturing systems whose machines are connected in series. So far, approaches for improving the reliability of these systems have been based mainly on enhancing the inherent reliability of each system component or on increasing system reliability through maintenance strategies. Some studies have considered only the influence of the location of a system's components on reliability; other combined approaches have rarely been applied. In this paper, a multi-criteria model is proposed to strike a balance among a system's reliability, its location costs, and its maintenance. Finally, a numerical example is presented and solved with the Lingo software.

  5. Reliability of objects in aerospace technologies and beyond: Holistic risk management approach

    Science.gov (United States)

    Shai, Yair; Ingman, D.; Suhir, E.

    A “high level”, deductive-reasoning-based (“holistic”) approach is aimed at the direct analysis of the behavior of a system as a whole, rather than at an attempt to understand the system's behavior by first conducting a “low level”, inductive-reasoning-based analysis of the behavior and contributions of the system's elements. The holistic view of treatment is widely accepted in medical practice, and the “holistic health” concept upholds that all aspects of people's needs (psychological, physical or social) should be seen as a whole, and that a disease is caused by the combined effect of physical, emotional, spiritual, social and environmental imbalances. Holistic reasoning is applied in our analysis to model the behavior of engineering products (“species”) subjected to various economic, marketing, and reliability “health” factors. Vehicular products (cars, aircraft, boats, etc.), e.g., might still be robust enough, but could be out-of-date, or functionally obsolete, or their further use might be viewed as unjustifiably expensive. High-level-performance functions (HLPF) are the essential feature of the approach. HLPFs are, in effect, “signatures” of the “species” of interest. The HLPFs describe, in a “holistic”, and certainly in a probabilistic, way, numerous complex multi-dependable relations among the representatives of the “species” under consideration. Numerous inter-related “stresses”, both actual (“physical”) and nonphysical, which affect the probabilistic predictions are inherently taken into account by the HLPFs. There is no need, and it might even be counter-productive, to conduct tedious, time- and labor-consuming experimentation and to invest significant amounts of time and resources to accumulate “representative statistics” to predict the governing probabilistic characteristics of the system behavior, such as, e.g., the life expectancy of a particular type of products.

  6. Bayesian Reliability Estimation for Deteriorating Systems with Limited Samples Using the Maximum Entropy Approach

    OpenAIRE

    Xiao, Ning-Cong; Li, Yan-Feng; Wang, Zhonglai; Peng, Weiwen; Huang, Hong-Zhong

    2013-01-01

    In this paper the combination of the maximum entropy method and Bayesian inference for reliability assessment of deteriorating systems is proposed. Due to various uncertainties, scarce data and incomplete information, system parameters usually cannot be determined precisely. These uncertain parameters can be modeled by fuzzy set theory and Bayesian inference, which have been proved useful for deteriorating systems under small sample sizes. The maximum entropy approach can be used to calculate ...
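
    With only a mean constraint on a bounded lifetime support, the maximum-entropy density is a truncated exponential p(x) ∝ exp(-λx); one solves for λ so that the implied mean matches the constraint. A minimal sketch (the support bound and the target mean are assumptions, not the paper's data):

```python
import numpy as np

# Maximum-entropy lifetime density on [0, T] subject to a known mean.
T = 10.0            # support bound (assumed)
mean_target = 3.0   # e.g. an expert's estimate of mean time to failure

def implied_mean(lam):
    """Mean of the truncated exponential exp(-lam*x) on [0, T]."""
    if abs(lam) < 1e-12:
        return T / 2.0          # lam -> 0 recovers the uniform density
    return 1.0 / lam - T / (np.exp(lam * T) - 1.0)

# implied_mean is strictly decreasing in lam, so bisection suffices.
lo, hi = -10.0, 10.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if implied_mean(mid) > mean_target:
        lo = mid                # mean too large -> need larger lam
    else:
        hi = mid
lam = 0.5 * (lo + hi)
```

    Sampling or moment calculations for the reliability assessment can then proceed from the fitted density; further moment constraints would add more Lagrange multipliers in the exponent.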

  7. Machine Learning Approach for Software Reliability Growth Modeling with Infinite Testing Effort Function

    Directory of Open Access Journals (Sweden)

    Subburaj Ramasamy

    2017-01-01

    Reliability is one of the quantifiable software quality attributes. Software Reliability Growth Models (SRGMs) are used to assess the reliability achieved at different times of testing. Traditional time-based SRGMs may not be accurate enough in all situations where test effort varies with time. To overcome this lacuna, test effort was used instead of time in SRGMs. In the past, finite test effort functions were proposed, which may not be realistic as, at infinite testing time, test effort will be infinite. Hence in this paper we propose an infinite test effort function in conjunction with a classical Nonhomogeneous Poisson Process (NHPP) model. We use an Artificial Neural Network (ANN) for training the proposed model with software failure data. Here it is possible to get a large set of weights for the same model that describe the past failure data equally well. We use a machine learning approach to select the appropriate set of weights for the model that will describe both the past and the future data well. We compare the performance of the proposed model with an existing model using practical software failure data sets. The proposed log-power TEF based SRGM describes all types of failure data equally well, improves the accuracy of parameter estimation over existing TEFs, and can be used for software release time determination as well.

  8. Reliability Estimation of Parameters of Helical Wind Turbine with Vertical Axis

    Directory of Open Access Journals (Sweden)

    Adela-Eliza Dumitrascu

    2015-01-01

    Due to the prolonged use of wind turbines they must be characterized by high reliability. This can be achieved through a rigorous design, appropriate simulation and testing, and proper construction. The reliability prediction and analysis of these systems will lead to identifying the critical components, increasing the operating time, minimizing failure rate, and minimizing maintenance costs. To estimate the produced energy by the wind turbine, an evaluation approach based on the Monte Carlo simulation model is developed which enables us to estimate the probability of minimum and maximum parameters. In our simulation process we used triangular distributions. The analysis of simulation results has been focused on the interpretation of the relative frequency histograms and cumulative distribution curve (ogive diagram), which indicates the probability of obtaining the daily or annual energy output depending on wind speed. The experimental researches consist in estimation of the reliability and unreliability functions and hazard rate of the helical vertical axis wind turbine designed and patented to climatic conditions for Romanian regions. Also, the variation of power produced for different wind speeds, the Weibull distribution of wind probability, and the power generated were determined. The analysis of experimental results indicates that this type of wind turbine is efficient at low wind speed.
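
    The Monte Carlo evaluation with triangular distributions described above can be sketched with NumPy's triangular sampler. The wind-speed bounds and the toy power curve below are illustrative assumptions, not the paper's turbine data; percentiles of the simulated output play the role of the minimum/maximum energy estimates.

```python
import numpy as np

rng = np.random.default_rng(7)

# Triangular wind-speed model (m/s): lower bound, mode, upper bound assumed.
v = rng.triangular(2.0, 7.0, 14.0, size=100_000)

# Toy power curve (kW): cubic between cut-in and rated speed, flat to cut-out.
cut_in, rated_v, cut_out, rated_p = 3.0, 11.0, 20.0, 5.0
power = np.where(
    v < cut_in, 0.0,
    np.where(v < rated_v,
             rated_p * ((v - cut_in) / (rated_v - cut_in)) ** 3,
             np.where(v < cut_out, rated_p, 0.0)))

daily_energy = power * 24.0                      # kWh/day at constant wind
p5, p95 = np.percentile(daily_energy, [5, 95])   # low / high output bands
```

    A histogram of daily_energy reproduces the relative-frequency view, and its cumulative sum the ogive diagram, from which the probability of reaching a given daily output can be read off.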

  9. Reliability Estimation of Parameters of Helical Wind Turbine with Vertical Axis.

    Science.gov (United States)

    Dumitrascu, Adela-Eliza; Lepadatescu, Badea; Dumitrascu, Dorin-Ion; Nedelcu, Anisor; Ciobanu, Doina Valentina

    2015-01-01

    Due to the prolonged use of wind turbines they must be characterized by high reliability. This can be achieved through a rigorous design, appropriate simulation and testing, and proper construction. The reliability prediction and analysis of these systems will lead to identifying the critical components, increasing the operating time, minimizing failure rate, and minimizing maintenance costs. To estimate the produced energy by the wind turbine, an evaluation approach based on the Monte Carlo simulation model is developed which enables us to estimate the probability of minimum and maximum parameters. In our simulation process we used triangular distributions. The analysis of simulation results has been focused on the interpretation of the relative frequency histograms and cumulative distribution curve (ogive diagram), which indicates the probability of obtaining the daily or annual energy output depending on wind speed. The experimental researches consist in estimation of the reliability and unreliability functions and hazard rate of the helical vertical axis wind turbine designed and patented to climatic conditions for Romanian regions. Also, the variation of power produced for different wind speeds, the Weibull distribution of wind probability, and the power generated were determined. The analysis of experimental results indicates that this type of wind turbine is efficient at low wind speed.

  10. The effect of Web-based Braden Scale training on the reliability of Braden subscale ratings.

    Science.gov (United States)

    Magnan, Morris A; Maklebust, JoAnn

    2009-01-01

    The primary purpose of this study was to evaluate the effect of Web-based Braden Scale training on the reliability of Braden Scale subscale ratings made by nurses working in acute care hospitals. A secondary purpose was to describe the distribution of reliable Braden subscale ratings before and after Web-based Braden Scale training. Secondary analysis of data from a recently completed quasi-experimental, pretest-posttest, interrater reliability study. A convenience sample of RNs working at 3 Michigan medical centers voluntarily participated in the study. RN participants included nurses who used the Braden Scale regularly at their place of employment ("regular users") as well as nurses who did not use the Braden Scale at their place of employment ("new users"). Using a pretest-posttest, quasi-experimental design, pretest interrater reliability data were collected to identify the percentage of nurses making reliable Braden subscale assessments. Nurses then completed a Web-based Braden Scale training module after which posttest interrater reliability data were collected. The reliability of nurses' Braden subscale ratings was determined by examining the level of agreement/disagreement between ratings made by an RN and an "expert" rating the same patient. In total, 381 RN-to-expert dyads were available for analysis. During both the pretest and posttest periods, the percentage of reliable subscale ratings was highest for the activity subscale, lowest for the moisture subscale, and second lowest for the nutrition subscale. With Web-based Braden Scale training, the percentage of reliable Braden subscale ratings made by new users increased for all 6 subscales with statistically significant improvements in the percentage of reliable assessments made on 3 subscales: sensory-perception, moisture, and mobility. Training had virtually no effect on the percentage of reliable subscale ratings made by regular users of the Braden Scale. With Web-based Braden Scale training the
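
    Interrater reliability of the kind measured here reduces, in its simplest form, to agreement between each rater and the expert. The ratings below are fabricated for illustration (Braden subscales score 1-4); both exact and within-one-point agreement are shown, since studies differ in which criterion defines a "reliable" rating.

```python
import numpy as np

# Paired Braden subscale ratings for ten patients: nurse vs expert.
nurse  = np.array([3, 2, 4, 1, 3, 2, 2, 4, 3, 1])
expert = np.array([3, 2, 4, 2, 3, 2, 1, 4, 3, 1])

exact_agreement = np.mean(nurse == expert)                 # identical rating
adjacent_agreement = np.mean(np.abs(nurse - expert) <= 1)  # within one point
```

    Comparing these proportions before and after training, per subscale, mirrors the pretest-posttest analysis of the study; chance-corrected statistics such as kappa would refine the raw proportions.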

  11. Friction force experimental approach in High School Physics classes

    Directory of Open Access Journals (Sweden)

    Marco Aurélio Alvarenga Monteiro

    2012-12-01

    http://dx.doi.org/10.5007/2175-7941.2012v29n3p1121 In this paper we propose and describe the implementation of an experimental activity to address the concept of friction in High School Physics practical classes. We use a low-cost, simple-to-construct device that enables the determination of the coefficient of static friction between two materials through three different procedures. The results were coherent, with small percentage deviations, which gives reliability to the activity and can stimulate discussions in class. The activity also allows greater contextualization of concepts that are usually discussed only theoretically, requiring a higher abstraction level from the students. This can stimulate discussions and greater interaction between teacher and students.

  12. Experimentation on accuracy of non functional requirement prioritization approaches for different complexity projects

    Directory of Open Access Journals (Sweden)

    Raj Kumar Chopra

    2016-09-01

    Non functional requirements must be selected for implementation together with functional requirements to enhance the success of software projects. Three approaches exist for performing the prioritization of non functional requirements using a suitable prioritization technique. This paper performs experimentation on three different complexity versions of an industrial software project using the cost-value prioritization technique employing the three approaches. Experimentation is conducted to analyze the accuracy of the individual approaches and the variation of accuracy with the complexity of the software project. The results indicate that selecting non functional requirements separately, but in accordance with functionality, has the highest accuracy among the three approaches. Further, like the other approaches, it shows a decrease in accuracy as software complexity increases, but the decrease is minimal.

  13. Personal Reflections on Observational and Experimental Research Approaches to Childhood Psychopathology

    Science.gov (United States)

    Rapoport, Judith L.

    2009-01-01

    The past 50 years have seen dramatic changes in childhood psychopathology research. The goal of this overview is to contrast observational and experimental research approaches; both have grown more complex such that the boundary between these approaches may be blurred. Both are essential. Landmark observational studies with long-term follow-up…

  14. The neural correlates of consciousness: new experimental approaches needed?

    Science.gov (United States)

    Hohwy, Jakob

    2009-06-01

    It appears that consciousness science is progressing soundly, in particular in its search for the neural correlates of consciousness. There are two main approaches to this search, one is content-based (focusing on the contrast between conscious perception of, e.g., faces vs. houses), the other is state-based (focusing on overall conscious states, e.g., the contrast between dreamless sleep vs. the awake state). Methodological and conceptual considerations of a number of concrete studies show that both approaches are problematic: the content-based approach seems to set aside crucial aspects of consciousness; and the state-based approach seems over-inclusive in a way that is hard to rectify without losing sight of the crucial conscious-unconscious contrast. Consequently, the search for the neural correlates of consciousness is in need of new experimental paradigms.

  15. STRUCTURAL FLUCTUATIONS, ELECTRICAL RESPONSE AND THE RELIABILITY OF NANOSTRUCTURES (FINAL REPORT)

    Energy Technology Data Exchange (ETDEWEB)

    Philip J. Rous; Ellen D. Williams; Michael S. Fuhrer

    2006-07-31

    The goal of the research supported by DOE-FG02-01ER45939 was to synthesize a number of experimental and theoretical approaches to understand the relationship between morphological fluctuations, the electrical response and the reliability (failure) of metallic nanostructures. The primary focus of our work was the study of metallic nanowires, which we regard as prototypical nanoscale interconnects. Our research plan has been to link together these materials properties and behaviors by understanding the phenomenon of, and the effects of, electromigration at nanometer length scales. The thrust of our research has been founded on the concept that, for nanostructures where the surface-to-volume ratio is necessarily high, surface diffusion is the dominant mass transport mechanism that governs the fluctuations, electrical properties and failure modes of nanostructures. Our approach has been to develop experimental methods that permit the direct imaging of the electromagnetic distributions within nanostructures, their structural fluctuations and their electrical response. This experimental research is complemented by a parallel theoretical and computational program that describes the temporal evolution of nanostructures in response to current flow.

  16. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
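
    The single-loop treatment of aleatory and epistemic uncertainty can be sketched as follows: the uncertain distribution parameter and the random variable it governs are sampled together, so one Monte Carlo pass averages over both uncertainty types. The limit state and all distributions below are illustrative assumptions, not the paper's examples.

```python
import numpy as np

rng = np.random.default_rng(3)

n = 200_000
# Epistemic uncertainty: the load's mean is not known precisely, so it is
# itself given a distribution (values illustrative).
mu_load = rng.normal(50.0, 2.0, n)     # uncertain distribution parameter
# Aleatory variability of the load, conditional on each sampled mean.
load = rng.normal(mu_load, 5.0)
capacity = 70.0                        # deterministic limit state threshold

# Single-loop estimate: no nested inner loop over the parameter is needed.
p_fail = np.mean(load > capacity)
```

    The marginal load here is normal with variance 5^2 + 2^2, so the estimate can be checked against the exact tail probability; with genuinely separate loops the same answer would cost a full inner simulation per parameter draw.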

  17. ASSESSING AND COMBINING RELIABILITY OF PROTEIN INTERACTION SOURCES

    Science.gov (United States)

    LEACH, SONIA; GABOW, AARON; HUNTER, LAWRENCE; GOLDBERG, DEBRA S.

    2008-01-01

    Integrating diverse sources of interaction information to create protein networks requires strategies sensitive to differences in accuracy and coverage of each source. Previous integration approaches calculate reliabilities of protein interaction information sources based on congruity to a designated ‘gold standard.’ In this paper, we provide a comparison of the two most popular existing approaches and propose a novel alternative for assessing reliabilities which does not require a gold standard. We identify a new method for combining the resultant reliabilities and compare it against an existing method. Further, we propose an extrinsic approach to evaluation of reliability estimates, considering their influence on the downstream tasks of inferring protein function and learning regulatory networks from expression data. Results using this evaluation method show 1) our method for reliability estimation is an attractive alternative to those requiring a gold standard and 2) the new method for combining reliabilities is less sensitive to noise in reliability assignments than the similar existing technique. PMID:17990508
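
    A simple way to combine per-source reliabilities is a noisy-OR rule, shown below as an illustrative assumption (the paper's own combination methods differ in detail):

```python
def combine_reliabilities(reliabilities):
    """Noisy-OR combination of independent evidence sources: an
    interaction reported by several sources is taken to be spurious
    only if every reporting source is wrong. Illustrative sketch,
    not the specific combination method proposed in the paper."""
    p_all_wrong = 1.0
    for r in reliabilities:
        p_all_wrong *= (1.0 - r)
    return 1.0 - p_all_wrong
```

    Under this rule, two sources of reliability 0.5 jointly support an interaction with confidence 0.75, which illustrates why sensitivity to noise in the individual reliability assignments matters.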

  18. Do intensity ratings and skin conductance responses reliably discriminate between different stimulus intensities in experimentally induced pain?

    Science.gov (United States)

    Breimhorst, Markus; Sandrock, Stephan; Fechir, Marcel; Hausenblas, Nadine; Geber, Christian; Birklein, Frank

    2011-01-01

    The present study addresses the question whether pain-intensity ratings and skin conductance responses (SCRs) are able to detect different intensities of phasic painful stimuli and to determine the reliability of this discrimination. For this purpose, 42 healthy participants of both genders were assigned to either electrical, mechanical, or laser heat-pain stimulation (each n = 14). A whole range of single brief painful stimuli were delivered on the right volar forearm of the dominant hand in a randomized order. Pain-intensity ratings and SCRs were analyzed. Using generalizability theory, individual and gender differences were the main contributors to the variability of both intensity ratings and SCRs. Most importantly, we showed that pain-intensity ratings are a reliable measure for the discrimination of different pain stimulus intensities in the applied modalities. The reliability of SCR was adequate when mechanical and heat stimuli were tested but failed for the discrimination of electrical stimuli. Further studies are needed to reveal the reason for this lack of accuracy for SCRs when applying electrical pain stimuli. Our study could help researchers to better understand the relationship between pain and activation of the sympathetic nervous system. Pain researchers are furthermore encouraged to consider individual and gender differences when measuring pain intensity and the concomitant SCRs in experimental settings. Copyright © 2011 American Pain Society. Published by Elsevier Inc. All rights reserved.

  19. Approaches of data combining for reliability assessments with taking into account the priority of data application

    International Nuclear Information System (INIS)

    Zelenyj, O.V.; Pecheritsa, A.V.

    2004-01-01

    Experience from risk assessments of operational events at Ukrainian NPPs, and from the State review of PSA studies for pilot units, shows that historical information on domestic NPP operation is not always available, or is not used properly, in these activities. This article briefly describes several approaches for combining available generic and specific information for the assessment of reliability parameters (taking into account the priority of data application), along with recommendations on how to apply them.

  20. Approaches to Demonstrating the Reliability and Validity of Core Diagnostic Criteria for Chronic Pain.

    Science.gov (United States)

    Bruehl, Stephen; Ohrbach, Richard; Sharma, Sonia; Widerstrom-Noga, Eva; Dworkin, Robert H; Fillingim, Roger B; Turk, Dennis C

    2016-09-01

    The Analgesic, Anesthetic, and Addiction Clinical Trial Translations, Innovations, Opportunities, and Networks-American Pain Society Pain Taxonomy (AAPT) is designed to be an evidence-based multidimensional chronic pain classification system that will facilitate more comprehensive and consistent chronic pain diagnoses, and thereby enhance research, clinical communication, and ultimately patient care. Core diagnostic criteria (dimension 1) for individual chronic pain conditions included in the initial version of AAPT will be the focus of subsequent empirical research to evaluate and provide evidence for their reliability and validity. Challenges to validating diagnostic criteria in the absence of clear and identifiable pathophysiological mechanisms are described. Based in part on previous experience regarding the development of evidence-based diagnostic criteria for psychiatric disorders, headache, and specific chronic pain conditions (fibromyalgia, complex regional pain syndrome, temporomandibular disorders, pain associated with spinal cord injuries), several potential approaches for documentation of the reliability and validity of the AAPT diagnostic criteria are summarized. The AAPT is designed to be an evidence-based multidimensional chronic pain classification system. Conceptual and methodological issues related to demonstrating the reliability and validity of the proposed AAPT chronic pain diagnostic criteria are discussed. Copyright © 2016 American Pain Society. Published by Elsevier Inc. All rights reserved.

  1. Experimental approaches to heavy ion fusion

    International Nuclear Information System (INIS)

    Obayashi, H.; Fujii-e, Y.; Yamaki, T.

    1986-01-01

    As a feasibility study of the heavy-ion-beam induced inertial fusion (HIF) approach, a conceptual plant design called HIBLIC-I has been worked out since 1982. The characteristic features of this design are summarized. To confirm them experimentally, and to prove them at least in principle, possible experimental programs that would give substantial information on the critical phenomena are considered. In HIBLIC-I, an accelerator complex is adopted as the driver system to provide 6 beams of ²⁰⁸Pb¹⁺ ions at 15 GeV, which are simultaneously focused on a single-shell, three-layered target. The target is designed to give an energy gain of 100, so that a total beam energy of 4 MJ at 160 TW power may release 400 MJ of fusion energy. The reactor chamber is cylindrical, with a double-walled structure made of HT-9. There are three layers of liquid Li flow inside the reactor. The innermost layer forms a Li curtain that is effective in recovering the residual cavity pressure. A thick upward flow serves as coolant and tritium breeder. Tritium is recovered by a yttrium gettering system. The driver system operates at a repetition rate of 10 Hz and supplies beams for 10 reactor chambers, so the plant yield of fusion power becomes 4000 MWt, corresponding to a net electric output of 1.5 GW. Experimental programs related to HIBLIC-I are described and discussed, including heavy-ion-beam experiments and proposals for testing the lithium curtain with an electron beam to clarify the key phenomena in the HIBLIC-I cavity. (Nogami, K.)

  2. Reliability analysis techniques in power plant design

    International Nuclear Information System (INIS)

    Chang, N.E.

    1981-01-01

    An overview of reliability analysis techniques is presented as applied to power plant design. The key terms, power plant performance, reliability, availability and maintainability are defined. Reliability modeling, methods of analysis and component reliability data are briefly reviewed. Application of reliability analysis techniques from a design engineering approach to improving power plant productivity is discussed. (author)
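
    The key terms can be illustrated with a minimal sketch using the standard textbook definitions (not formulas taken from the article):

```python
def availability(mttf, mttr):
    """Steady-state availability from mean time to failure and mean
    time to repair: the long-run fraction of time the plant is up."""
    return mttf / (mttf + mttr)

def series_reliability(component_reliabilities):
    """A series system works only if every component works."""
    product = 1.0
    for r in component_reliabilities:
        product *= r
    return product

def parallel_reliability(component_reliabilities):
    """A parallel (redundant) system fails only if every component fails."""
    q = 1.0
    for r in component_reliabilities:
        q *= (1.0 - r)
    return 1.0 - q
```

    For example, a component with an MTTF of 990 h and an MTTR of 10 h has an availability of 0.99, and duplicating a 0.9-reliable component in parallel raises system reliability to 0.99.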

  3. Simulation and Non-Simulation Based Human Reliability Analysis Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Shirley, Rachel Elizabeth [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-12-01

    Part of the U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk model. In this report, we review simulation-based and non-simulation-based human reliability assessment (HRA) methods. Chapter 2 surveys non-simulation-based HRA methods. Conventional HRA methods target static Probabilistic Risk Assessments for Level 1 events. These methods would require significant modification for use in dynamic simulation of Level 2 and Level 3 events. Chapter 3 is a review of human performance models. A variety of methods and models simulate dynamic human performance; however, most of these human performance models were developed outside the risk domain and have not been used for HRA. The exception is the ADS-IDAC model, which can be thought of as a virtual operator program. This model is resource-intensive but provides a detailed model of every operator action in a given scenario, along with models of numerous factors that can influence operator performance. Finally, Chapter 4 reviews the treatment of timing of operator actions in HRA methods. This chapter is an example of one of the critical gaps between existing HRA methods and the needs of dynamic HRA. This report summarizes the foundational information needed to develop a feasible approach to modeling human interactions in the RISMC simulations.

  4. Myth of the Master Detective: Reliability of Interpretations for Kaufman's "Intelligent Testing" Approach to the WISC-III.

    Science.gov (United States)

    Macmann, Gregg M.; Barnett, David W.

    1997-01-01

    Used computer simulation to examine the reliability of interpretations for Kaufman's "intelligent testing" approach to the Wechsler Intelligence Scale for Children (3rd ed.) (WISC-III). Findings indicate that factor index-score differences and other measures could not be interpreted with confidence. Argues that limitations of IQ testing…

  5. Performance and Reliability of Bonded Interfaces for High-Temperature Packaging (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Devoto, D.

    2014-11-01

    The thermal performance and reliability of sintered-silver is being evaluated for power electronics packaging applications. This will be experimentally accomplished by the synthesis of large-area bonded interfaces between metalized substrates that will be subsequently subjected to thermal cycles. A finite element model of crack initiation and propagation in these bonded interfaces will allow for the interpretation of degradation rates by a crack-velocity (V)-stress intensity factor (K) analysis. The experiment is outlined, and the modeling approach is discussed.

  6. Experimental evaluations of wearable ECG monitor.

    Science.gov (United States)

    Ha, Kiryong; Kim, Youngsung; Jung, Junyoung; Lee, Jeunwoo

    2008-01-01

    The healthcare industry is changing with the ubiquitous computing environment, and wearable ECG measurement is one of the most popular approaches in this industry. The reliability and performance of healthcare devices are fundamental issues for widespread adoption, and the interdisciplinary nature of wearable ECG monitors makes evaluation more difficult. In this paper, we propose evaluation criteria that consider characteristics of both ECG measurement and ubiquitous computing. With our wearable ECG monitors, various levels of experimental analysis are performed based on this evaluation strategy.

  7. Reliability analysis of component-level redundant topologies for solid-state fault current limiter

    Science.gov (United States)

    Farhadi, Masoud; Abapour, Mehdi; Mohammadi-Ivatloo, Behnam

    2018-04-01

    Experience shows that semiconductor switches are the most vulnerable components in power electronics systems. One of the most common ways to address this reliability challenge is component-level redundant design, for which four configurations are possible. This article presents a comparative reliability analysis of the different component-level redundant designs for a solid-state fault current limiter, with the aim of determining the most reliable configuration. The mean time to failure (MTTF) is used as the reliability parameter. Considering both fault types (open circuit and short circuit), the MTTFs of the different configurations are calculated. It is demonstrated that which configuration is more reliable depends on the steady-state junction temperature of the semiconductor switches, which is in turn a function of (i) the ambient temperature, (ii) the power loss of the semiconductor switch and (iii) the thermal resistance of the heat sink. The sensitivity of the results to each parameter is also investigated. The results show that under different conditions, different configurations have higher reliability. Experimental results are presented to clarify the theory and feasibility of the proposed approaches. Finally, levelised costs of the different configurations are analysed for a fair comparison.
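
    A minimal sketch of the MTTF comparison, assuming constant failure rates (the Arrhenius-style temperature scaling and all numbers below are hypothetical placeholders, not the article's model):

```python
import math

def failure_rate(lambda_ref, t_junction_c, t_ref_c=25.0, ea_over_k=5000.0):
    """Arrhenius-style scaling of a switch failure rate with junction
    temperature (Celsius in). The reference rate and the
    activation-energy constant ea_over_k are invented for illustration."""
    tj = t_junction_c + 273.15
    tr = t_ref_c + 273.15
    return lambda_ref * math.exp(ea_over_k * (1.0 / tr - 1.0 / tj))

def mttf_single(lam):
    """MTTF of a single switch with constant failure rate lam."""
    return 1.0 / lam

def mttf_parallel_pair(lam):
    """MTTF of two identical switches in hot-standby parallel:
    1/(2*lam) while both run, plus 1/lam for the lone survivor."""
    return 1.5 / lam
```

    The sketch shows the qualitative point of the article: the redundancy benefit is fixed (a factor of 1.5 here), while the failure rate itself grows with junction temperature, so which configuration wins depends on the thermal operating point.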

  8. Reliability Evaluation and Improvement Approach of Chemical Production Man - Machine - Environment System

    Science.gov (United States)

    Miao, Yongchun; Kang, Rongxue; Chen, Xuefeng

    2017-12-01

    In recent years, with the gradual extension of reliability research, the study of production system reliability has become a hot topic in various industries. A man-machine-environment system is a complex system composed of human factors, machinery equipment and environment. The reliability of each individual factor must be analyzed in order to progress to the study of three-factor reliability. Meanwhile, the dynamic relationships among man, machine and environment should be considered to establish an effective fuzzy evaluation mechanism that can truly and effectively analyze the reliability of such systems. In this paper, based on systems engineering, fuzzy theory, reliability theory, and theories of human error, environmental impact and machinery equipment failure, the reliabilities of the human factor, machinery equipment and environment of a chemical production system were studied by the method of fuzzy evaluation. Finally, the reliability of the man-machine-environment system was calculated to obtain a weighted result, which indicated that the reliability value of this chemical production system was 86.29. From the given evaluation domain it can be seen that the reliability of the integrated man-machine-environment system is in good status, and effective measures for further improvement were proposed according to the fuzzy calculation results.
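
    The final weighted-aggregation step can be sketched as follows (the sub-scores and weights are hypothetical; the paper's fuzzy evaluation additionally involves membership functions, which are omitted here):

```python
def weighted_reliability(sub_scores, weights):
    """Weighted aggregation of per-factor reliability scores
    (e.g., human, machine, environment) into one system value.
    Inputs below are invented for illustration."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(s * w for s, w in zip(sub_scores, weights))
```

    With invented sub-scores of 90, 85 and 82 and weights 0.4, 0.35 and 0.25, the aggregate is 86.25; the closeness to the 86.29 reported in the paper is coincidental, since the inputs here are made up.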

  9. An Indication of Reliability of the Two-Level Approach of the AWIN Welfare Assessment Protocol for Horses

    Directory of Open Access Journals (Sweden)

    Irena Czycholl

    2018-01-01

    To enhance feasibility, the Animal Welfare Indicators (AWIN) assessment protocol for horses consists of two levels: the first is a visual inspection of a sample of horses performed from a distance, the second a close-up inspection of all horses. The aim was to analyse whether information would be lost if only the first level were performed. In this study, 112 first- and 112 second-level assessments carried out on a subsequent day by one observer were compared by calculating Spearman's rank correlation coefficients (RS), intraclass correlation coefficients (ICC), smallest detectable changes (SDC) and limits of agreement (LoA). Most indicators demonstrated sufficient reliability between the two levels. Exceptions were the Horse Grimace Scale, the Avoidance Distance Test and the Voluntary Human Approach Test (e.g., Voluntary Human Approach Test: RS: 0.38, ICC: 0.38, SDC: 0.21, LoA: −0.25–0.17), which could, however, also be interpreted as a lack of test-retest reliability. Further disagreement was found for the indicator consistency of manure (RS: 0.31, ICC: 0.38, SDC: 0.36, LoA: −0.38–0.36). For these indicators, an adaptation of the first level would be beneficial. Overall, in this study, the division into two levels was reliable and may therefore have the potential to enhance feasibility in other welfare assessment schemes.

  10. Electronic structure of BN-aromatics: Choice of reliable computational tools

    Science.gov (United States)

    Mazière, Audrey; Chrostowska, Anna; Darrigan, Clovis; Dargelos, Alain; Graciaa, Alain; Chermette, Henry

    2017-10-01

    The importance of having reliable calculation tools to interpret and predict the electronic properties of BN-aromatics is directly linked to the growing interest for these very promising new systems in the field of materials science, biomedical research, or energy sustainability. Ionization energy (IE) is one of the most important parameters to approach the electronic structure of molecules. It can be theoretically estimated, but in order to evaluate their persistence and propose the most reliable tools for the evaluation of different electronic properties of existent or only imagined BN-containing compounds, we took as reference experimental values of ionization energies provided by ultra-violet photoelectron spectroscopy (UV-PES) in gas phase—the only technique giving access to the energy levels of filled molecular orbitals. Thus, a set of 21 aromatic molecules containing B-N bonds and B-N-B patterns has been merged for a comparison between experimental IEs obtained by UV-PES and various theoretical approaches for their estimation. Time-Dependent Density Functional Theory (TD-DFT) methods using B3LYP and long-range corrected CAM-B3LYP functionals are used, combined with the Δ SCF approach, and compared with electron propagator theory such as outer valence Green's function (OVGF, P3) and symmetry adapted cluster-configuration interaction ab initio methods. Direct Kohn-Sham estimation and "corrected" Kohn-Sham estimation are also given. The deviation between experimental and theoretical values is computed for each molecule, and a statistical study is performed over the average and the root mean square for the whole set and sub-sets of molecules. It is shown that (i) Δ SCF+TDDFT(CAM-B3LYP), OVGF, and P3 are the most efficient way for a good agreement with UV-PES values, (ii) a CAM-B3LYP range-separated hybrid functional is significantly better than B3LYP for the purpose, especially for extended conjugated systems, and (iii) the "corrected" Kohn-Sham result is a

  11. A fast and reliable way to establish fluidic connections to planar microchips

    DEFF Research Database (Denmark)

    Snakenborg, Detlef; Perozziello, Gerardo; Geschke, Oliver

    2007-01-01

    In this work, we present a non-permanent method to connect microfluidic devices. The approach uses short flexible tubes that are plugged into bottom-flat holes and ensure fast and reliable interconnections. The small available dimensions allow the tube to be directly attached to the side of planar microchips. A theoretical model to estimate the maximum applicable pressure was developed and verified with experimental data. Furthermore, the tube connections were compared to other non-permanent interconnection types.

  12. Análisis de confiabilidad y riesgo de una instalación experimental para el tratamiento de aguas residuales//Reliability and risk analysis of an experimental set-up for wastewater treatment

    Directory of Open Access Journals (Sweden)

    María Adelfa Abreu‐Zamora

    2014-01-01

    One of the modern requirements for using equipment in all areas of the economy, science and education is its safe operation. In this work, a reliability and risk analysis of an experimental set-up for wastewater treatment with ultraviolet radiation was carried out. The fault tree technique was used and two variants of calculation were analyzed. The first variant considered unreliable sources of electricity supply and the second considered the existence of reliable sources. As a result, 20 minimal cut sets were identified, 12 of first order and 8 of third order. In addition, the need for an alternative electrical power source was inferred, and it was found important to establish redundancy of component groups for industrial-scale facilities. The analysis demonstrated that the set-up is safe for use in laboratory-scale experimentation. Keywords: reliability, risk, experimental set-up, wastewater treatment, ultraviolet radiation, fault tree.

  13. Hydrodynamic cavitation: from theory towards a new experimental approach

    Science.gov (United States)

    Lucia, Umberto; Gervino, Gianpiero

    2009-09-01

    Hydrodynamic cavitation is analysed by a global thermodynamics principle following an approach based on the maximum irreversible entropy variation that has already given promising results for open systems and has been successfully applied in specific engineering problems. In this paper we present a new phenomenological method to evaluate the conditions inducing cavitation. We think this method could be useful in the design of turbo-machineries and related technologies: it represents both an original physical approach to cavitation and an economical saving in planning because the theoretical analysis could allow engineers to reduce the experimental tests and the costs of the design process.

  14. Fast and Reliable Mouse Picking Using Graphics Hardware

    Directory of Open Access Journals (Sweden)

    Hanli Zhao

    2009-01-01

    Mouse picking is the most commonly used intuitive operation to interact with 3D scenes in a variety of 3D graphics applications. High performance for such an operation is necessary in order to provide users with fast responses. This paper proposes a fast and reliable mouse picking algorithm using graphics hardware for 3D triangular scenes. Our approach uses a multi-layer rendering algorithm to perform the picking operation in linear time complexity. The object-space ray-triangle intersection test is implemented in a highly parallelized geometry shader. After applying the hardware-supported occlusion queries, only a small number of objects (or sub-objects) are rendered in subsequent layers, which accelerates the picking efficiency. Experimental results demonstrate the high performance of our novel approach. Due to its simplicity, our algorithm can be easily integrated into existing real-time rendering systems.
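
    The object-space ray-triangle test can be sketched on the CPU with the classic Möller–Trumbore algorithm (a standard formulation, shown here for illustration; the paper runs the test in a geometry shader):

```python
def ray_triangle_intersect(orig, dirn, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore ray-triangle intersection. Returns the distance
    t along the ray to the hit point, or None if there is no hit."""
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])
    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(dirn, e2)
    det = dot(e1, p)
    if abs(det) < eps:            # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    s = sub(orig, v0)
    u = dot(s, p) * inv           # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = cross(s, e1)
    v = dot(dirn, q) * inv        # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv          # distance along the ray
    return t if t > eps else None
```

    A picking pass runs this test per triangle (in the paper, in parallel on the GPU) and keeps the smallest positive t as the picked primitive.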

  15. Analysis of information security reliability: A tutorial

    International Nuclear Information System (INIS)

    Kondakci, Suleyman

    2015-01-01

    This article presents a concise reliability analysis of network security abstracted from stochastic modeling, reliability, and queuing theories. Network security analysis is composed of threats, their impacts, and recovery of the failed systems. A unique framework with a collection of the key reliability models is presented here to guide the determination of the system reliability based on the strength of malicious acts and performance of the recovery processes. A unique model, called Attack-obstacle model, is also proposed here for analyzing systems with immunity growth features. Most computer science curricula do not contain courses in reliability modeling applicable to different areas of computer engineering. Hence, the topic of reliability analysis is often too diffuse to most computer engineers and researchers dealing with network security. This work is thus aimed at shedding some light on this issue, which can be useful in identifying models, their assumptions and practical parameters for estimating the reliability of threatened systems and for assessing the performance of recovery facilities. It can also be useful for the classification of processes and states regarding the reliability of information systems. Systems with stochastic behaviors undergoing queue operations and random state transitions can also benefit from the approaches presented here. - Highlights: • A concise survey and tutorial in model-based reliability analysis applicable to information security. • A framework of key modeling approaches for assessing reliability of networked systems. • The framework facilitates quantitative risk assessment tasks guided by stochastic modeling and queuing theory. • Evaluation of approaches and models for modeling threats, failures, impacts, and recovery analysis of information systems
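
    The threat-and-recovery viewpoint reduces, in its simplest form, to a two-state (up/down) Markov model, sketched below with hypothetical rates (a textbook model, not a formula quoted from the article):

```python
def steady_state_availability(attack_rate, recovery_rate):
    """Two-state Markov model: the system is brought down at rate
    lambda (e.g., successful malicious acts) and restored at rate mu
    by the recovery process. The long-run fraction of time the system
    is up is mu / (lambda + mu)."""
    return recovery_rate / (attack_rate + recovery_rate)
```

    The ratio makes the trade-off explicit: availability can be maintained either by hardening the system (lowering the attack rate) or by speeding up the recovery facilities (raising the recovery rate).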

  16. A flexible latent class approach to estimating test-score reliability

    NARCIS (Netherlands)

    van der Palm, D.W.; van der Ark, L.A.; Sijtsma, K.

    2014-01-01

    The latent class reliability coefficient (LCRC) is improved by using the divisive latent class model instead of the unrestricted latent class model. This results in the divisive latent class reliability coefficient (DLCRC), which unlike LCRC avoids making subjective decisions about the best solution.

  17. Surface physics theoretical models and experimental methods

    CERN Document Server

    Mamonova, Marina V; Prudnikova, I A

    2016-01-01

    The demands of production, such as thin films in microelectronics, rely on consideration of factors influencing the interaction of dissimilar materials that make contact with their surfaces. Bond formation between surface layers of dissimilar condensed solids-termed adhesion-depends on the nature of the contacting bodies. Thus, it is necessary to determine the characteristics of adhesion interaction of different materials from both applied and fundamental perspectives of surface phenomena. Given the difficulty in obtaining reliable experimental values of the adhesion strength of coatings, the theoretical approach to determining adhesion characteristics becomes more important. Surface Physics: Theoretical Models and Experimental Methods presents straightforward and efficient approaches and methods developed by the authors that enable the calculation of surface and adhesion characteristics for a wide range of materials: metals, alloys, semiconductors, and complex compounds. The authors compare results from the ...

  18. Reliability of Beam Loss Monitors System for the Large Hadron Collider

    Science.gov (United States)

    Guaglio, G.; Dehning, B.; Santoni, C.

    2004-11-01

    The employment of superconducting magnets in high energy colliders opens challenging failure scenarios and brings new criticalities for the whole system protection. For the LHC beam loss protection system, the failure rate and the availability requirements have been evaluated using the Safety Integrity Level (SIL) approach. A downtime cost evaluation is used as input for the SIL approach. The most critical systems, which contribute to the final SIL value, are the dump system, the interlock system, the beam loss monitors system and the energy monitor system. The Beam Loss Monitors System (BLMS) is critical for short and intense particle losses, while at medium and higher loss time it is assisted by other systems, such as the quench protection system and the cryogenic system. For BLMS, hardware and software have been evaluated in detail. The reliability input figures have been collected using historical data from the SPS, using temperature and radiation damage experimental data as well as using standard databases. All the data have been processed by reliability software (Isograph). The analysis ranges from the components data to the system configuration.
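
    The SIL bands referenced by this approach can be sketched as a lookup over the probability of dangerous failure per hour (PFH, IEC 61508 high-demand mode); this is a simplified illustration, not the paper's evaluation procedure:

```python
def sil_from_pfh(pfh):
    """Map a probability of dangerous failure per hour (PFH) to a
    Safety Integrity Level band (IEC 61508, high-demand mode).
    Returns 0 if the PFH does not meet even SIL 1."""
    if pfh < 1e-8:
        return 4
    if pfh < 1e-7:
        return 3
    if pfh < 1e-6:
        return 2
    if pfh < 1e-5:
        return 1
    return 0
```

    In a SIL study like the one described, the combined failure rate of the contributing systems (dump, interlock, beam loss monitors, energy monitor) is compared against such bands to check whether the protection chain meets its target integrity level.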

  19. Reliability of Beam Loss Monitors System for the Large Hadron Collider

    International Nuclear Information System (INIS)

    Guaglio, G.; Dehning, B.; Santoni, C.

    2004-01-01

    The employment of superconducting magnets in high energy colliders opens challenging failure scenarios and brings new criticalities for the whole system protection. For the LHC beam loss protection system, the failure rate and the availability requirements have been evaluated using the Safety Integrity Level (SIL) approach. A downtime cost evaluation is used as input for the SIL approach. The most critical systems, which contribute to the final SIL value, are the dump system, the interlock system, the beam loss monitors system and the energy monitor system. The Beam Loss Monitors System (BLMS) is critical for short and intense particle losses, while at medium and higher loss time it is assisted by other systems, such as the quench protection system and the cryogenic system. For BLMS, hardware and software have been evaluated in detail. The reliability input figures have been collected using historical data from the SPS, using temperature and radiation damage experimental data as well as using standard databases. All the data have been processed by reliability software (Isograph). The analysis ranges from the components data to the system configuration

  20. Fracture in quasi-brittle materials: experimental and numerical approach for the determination of an incremental model with generalized variables

    International Nuclear Information System (INIS)

    Morice, Erwan

    2014-01-01

    Fracture in quasi-brittle materials, such as ceramics or concrete, can be represented schematically by series of events of nucleation and coalescence of micro-cracks. Modeling this process is an important challenge for the reliability and life prediction of concrete structures, in particular the prediction of the permeability of damaged structures. A multi-scale approach is proposed. The global behavior is modeled within the fracture mechanics framework and the local behavior is modeled by the discrete element method. An approach was developed to condense the non-linear behavior of the mortar. A model reduction technique is used to extract the relevant information from the discrete element method. To do so, the velocity field is partitioned into mode I, mode II, linear and non-linear components, each component being characterized by an intensity factor and a fixed spatial distribution. The response of the material is hence condensed in the evolution of the intensity factors, used as non-local variables. A model was also proposed to predict the behavior of the crack for proportional and non-proportional mixed mode I+II loadings. An experimental campaign was finally conducted to characterize the fatigue and fracture behavior of mortar. The results show that fatigue crack growth can be of significant importance. The experimental velocity fields determined, in the crack tip region, by DIC were analyzed using the same technique as that used for analyzing the fields obtained by the discrete element method, showing consistent results. (author)

  1. ERP Reliability Analysis (ERA) Toolbox: An open-source toolbox for analyzing the reliability of event-related brain potentials.

    Science.gov (United States)

    Clayson, Peter E; Miller, Gregory A

    2017-01-01

    Generalizability theory (G theory) provides a flexible, multifaceted approach to estimating score reliability. G theory's approach to estimating score reliability has important advantages over classical test theory that are relevant for research using event-related brain potentials (ERPs). For example, G theory does not require parallel forms (i.e., equal means, variances, and covariances), can handle unbalanced designs, and provides a single reliability estimate for designs with multiple sources of error. This monograph provides a detailed description of the conceptual framework of G theory using examples relevant to ERP researchers, presents the algorithms needed to estimate ERP score reliability, and provides a detailed walkthrough of newly-developed software, the ERP Reliability Analysis (ERA) Toolbox, that calculates score reliability using G theory. The ERA Toolbox is open-source, Matlab software that uses G theory to estimate the contribution of the number of trials retained for averaging, group, and/or event types on ERP score reliability. The toolbox facilitates the rigorous evaluation of psychometric properties of ERP scores recommended elsewhere in this special issue. Copyright © 2016 Elsevier B.V. All rights reserved.
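The G-theory reliability estimate described above can be illustrated for the simplest one-facet persons-by-trials design: variance components are estimated from the ANOVA mean squares of a fully crossed design, and the relative G coefficient for n_t trials is var_p / (var_p + var_res / n_t). The sketch below is plain NumPy, not the ERA Toolbox (which is Matlab), and the data are simulated assumptions:

```python
# Illustrative one-facet (persons x trials) G-study sketch in NumPy.
# Variance components are estimated from the crossed-design mean squares;
# the G (relative reliability) coefficient for n_t averaged trials is
#   G = var_p / (var_p + var_res / n_t).
import numpy as np

rng = np.random.default_rng(0)
n_p, n_t = 30, 20                       # persons, trials
person = rng.normal(0, 2.0, (n_p, 1))   # large person variance (assumed)
scores = person + rng.normal(0, 1.0, (n_p, n_t))

grand = scores.mean()
ms_p = n_t * np.sum((scores.mean(axis=1) - grand) ** 2) / (n_p - 1)
resid = scores - scores.mean(axis=1, keepdims=True) \
               - scores.mean(axis=0, keepdims=True) + grand
ms_res = np.sum(resid ** 2) / ((n_p - 1) * (n_t - 1))

var_p = max((ms_p - ms_res) / n_t, 0.0)  # person variance component
var_res = ms_res                         # interaction + error
g_coef = var_p / (var_p + var_res / n_t)
print(round(g_coef, 3))
```

With a strong person effect and 20 trials retained for averaging, the coefficient comes out close to 1; reducing n_t in the formula shows how fewer retained trials erode score reliability.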

  2. A Reliable, Non-Invasive Approach to Data Center Monitoring and Management

    Directory of Open Access Journals (Sweden)

    Moises Levy

    2017-08-01

    Recent standards, legislation, and best practices point to data center infrastructure management systems to control and monitor data center performance. This work presents an innovative approach to address some of the challenges that currently hinder data center management. It explains how monitoring and management systems should be envisioned and implemented. Key parameters associated with data center infrastructure and information technology equipment can be monitored in real-time across an entire facility using low-cost, low-power wireless sensors. Given the data centers’ mission critical nature, the system must be reliable and deployable through a non-invasive process. The need for the monitoring system is also presented through a feedback control systems perspective, which allows higher levels of automation. The data center monitoring and management system enables data gathering, analysis, and decision-making to improve performance, and to enhance asset utilization.

  3. Personalized translational epilepsy research - Novel approaches and future perspectives: Part II: Experimental and translational approaches.

    Science.gov (United States)

    Bauer, Sebastian; van Alphen, Natascha; Becker, Albert; Chiocchetti, Andreas; Deichmann, Ralf; Deller, Thomas; Freiman, Thomas; Freitag, Christine M; Gehrig, Johannes; Hermsen, Anke M; Jedlicka, Peter; Kell, Christian; Klein, Karl Martin; Knake, Susanne; Kullmann, Dimitri M; Liebner, Stefan; Norwood, Braxton A; Omigie, Diana; Plate, Karlheinz; Reif, Andreas; Reif, Philipp S; Reiss, Yvonne; Roeper, Jochen; Ronellenfitsch, Michael W; Schorge, Stephanie; Schratt, Gerhard; Schwarzacher, Stephan W; Steinbach, Joachim P; Strzelczyk, Adam; Triesch, Jochen; Wagner, Marlies; Walker, Matthew C; von Wegner, Frederic; Rosenow, Felix

    2017-11-01

    Despite the availability of more than 15 new "antiepileptic drugs", the proportion of patients with pharmacoresistant epilepsy has remained constant at about 20-30%. Furthermore, no disease-modifying treatments shown to prevent the development of epilepsy following an initial precipitating brain injury or to reverse established epilepsy have been identified to date. This is likely in part due to the polyetiologic nature of epilepsy, which in turn requires personalized medicine approaches. Recent advances in imaging, pathology, genetics, and epigenetics have led to new pathophysiological concepts and the identification of monogenic causes of epilepsy. In the context of these advances, the First International Symposium on Personalized Translational Epilepsy Research (1st ISymPTER) was held in Frankfurt on September 8, 2016, to discuss novel approaches and future perspectives for personalized translational research. These included new developments and ideas in a range of experimental and clinical areas such as deep phenotyping, quantitative brain imaging, EEG/MEG-based analysis of network dysfunction, tissue-based translational studies, innate immunity mechanisms, microRNA as treatment targets, functional characterization of genetic variants in human cell models and rodent organotypic slice cultures, personalized treatment approaches for monogenic epilepsies, blood-brain barrier dysfunction, therapeutic focal tissue modification, computational modeling for target and biomarker identification, and cost analysis in (monogenic) disease and its treatment. This report on the meeting proceedings is aimed at stimulating much needed investments of time and resources in personalized translational epilepsy research. This Part II includes the experimental and translational approaches and a discussion of the future perspectives, while the diagnostic methods, EEG network analysis, biomarkers, and personalized treatment approaches were addressed in Part I [1]. Copyright © 2017

  4. Time-variant reliability assessment through equivalent stochastic process transformation

    International Nuclear Information System (INIS)

    Wang, Zequn; Chen, Wei

    2016-01-01

    Time-variant reliability measures the probability that an engineering system successfully performs intended functions over a certain period of time under various sources of uncertainty. In practice, it is computationally prohibitive to propagate uncertainty in time-variant reliability assessment based on expensive or complex numerical models. This paper presents an equivalent stochastic process transformation approach for cost-effective prediction of reliability deterioration over the life cycle of an engineering system. To reduce the high dimensionality, a time-independent reliability model is developed by translating random processes and time parameters into random parameters in order to equivalently cover all potential failures that may occur during the time interval of interest. With the time-independent reliability model, an instantaneous failure surface is attained by using a Kriging-based surrogate model to identify all potential failure events. To enhance the efficacy of failure surface identification, a maximum confidence enhancement method is utilized to update the Kriging model sequentially. Then, the time-variant reliability is approximated using Monte Carlo simulations of the Kriging model where system failures over a time interval are predicted by the instantaneous failure surface. The results of two case studies demonstrate that the proposed approach is able to accurately predict the time evolution of system reliability while requiring much less computational efforts compared with the existing analytical approach. - Highlights: • Developed a new approach for time-variant reliability analysis. • Proposed a novel stochastic process transformation procedure to reduce the dimensionality. • Employed Kriging models with confidence-based adaptive sampling scheme to enhance computational efficiency. • The approach is effective for handling random process in time-variant reliability analysis. • Two case studies are used to demonstrate the efficacy
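The time-independent reformulation described above can be illustrated with plain Monte Carlo: failure over an interval [0, T] occurs if the limit state dips below zero at any time in the interval, so random processes and time reduce to sampled parameters plus a minimum over a time grid. The degradation model and all distributions below are assumptions standing in for the paper's expensive model and Kriging surrogate:

```python
# Minimal Monte Carlo sketch of time-variant reliability: failure over
# [0, T] occurs if the limit state g(X, t) dips below zero at any time.
# g is an illustrative degradation model (strength R0 decaying at random
# rate K under random load S), not the paper's Kriging surrogate.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
R0 = rng.normal(10.0, 1.0, n)    # initial resistance (assumed)
K = rng.uniform(0.1, 0.3, n)     # degradation rate (assumed)
S = rng.normal(5.0, 1.0, n)      # sustained load (assumed)

def pf_over(T, grid=50):
    t = np.linspace(0.0, T, grid)
    g = R0[:, None] - K[:, None] * t[None, :] - S[:, None]
    return np.mean(g.min(axis=1) < 0.0)   # any-time failure in [0, T]

pf5, pf15 = pf_over(5.0), pf_over(15.0)
print(pf5, pf15)   # failure probability grows with the interval length
```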

  5. System ergonomics as an approach to improve human reliability

    International Nuclear Information System (INIS)

    Bubb, H.

    1988-01-01

    The application of systems techniques to ergonomic problems is called system ergonomics. It enables improvements in human reliability through design measures. The precondition for this is knowledge of how information processing is performed by man and machine. By considering sensory processing, cognitive processing, and motor processing separately, it is possible to form a more exact idea of the system element 'man'. The system element 'machine' is well described by differential equations, which allow an ergonomic assessment of its maneuverability. This knowledge of the information processing of man and machine enables a task analysis, which reveals on the one hand the human limits arising from the different properties of the task, and on the other hand suitable ergonomic design solutions that improve the reliability of the total system. It is a disadvantage, however, that the change in human reliability achieved by such measures cannot yet be quantified numerically. (orig.)

  6. Results of a Demonstration Assessment of Passive System Reliability Utilizing the Reliability Method for Passive Systems (RMPS)

    Energy Technology Data Exchange (ETDEWEB)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia; Grelle, Austin

    2015-04-26

    Advanced small modular reactor designs include many advantageous design features such as passively driven safety systems that are arguably more reliable and cost effective relative to conventional active systems. Despite their attractiveness, a reliability assessment of passive systems can be difficult using conventional reliability methods due to the nature of passive systems. Simple deviations in boundary conditions can induce functional failures in a passive system, and intermediate or unexpected operating modes can also occur. As part of an ongoing project, Argonne National Laboratory is investigating various methodologies to address passive system reliability. The Reliability Method for Passive Systems (RMPS), a systematic approach for examining reliability, is one technique chosen for this analysis. This methodology is combined with the Risk-Informed Safety Margin Characterization (RISMC) approach to assess the reliability of a passive system and the impact of its associated uncertainties. For this demonstration problem, an integrated plant model of an advanced small modular pool-type sodium fast reactor with a passive reactor cavity cooling system is subjected to a station blackout using RELAP5-3D. This paper discusses important aspects of the reliability assessment, including deployment of the methodology, the uncertainty identification and quantification process, and identification of key risk metrics.

  7. Reliability issues in PACS

    Science.gov (United States)

    Taira, Ricky K.; Chan, Kelby K.; Stewart, Brent K.; Weinberg, Wolfram S.

    1991-07-01

    Reliability is an increasing concern when moving PACS from the experimental laboratory to the clinical environment. Any system downtime may seriously affect patient care. The authors report on the several classes of errors encountered during the pre-clinical release of the PACS during the past several months and present the solutions implemented to handle them. The reliability issues discussed include: (1) environmental precautions, (2) database backups, (3) monitor routines of critical resources and processes, (4) hardware redundancy (networks, archives), and (5) development of a PACS quality control program.

  8. Evaluation of reliability assurance approaches to operational nuclear safety

    International Nuclear Information System (INIS)

    Mueller, C.J.; Bezella, W.A.

    1984-01-01

    This report discusses the results of research to evaluate existing and/or recommended safety/reliability assurance activities among nuclear and other high technology industries for potential nuclear industry implementation. Since the Three Mile Island (TMI) accident, there has been increased interest in the use of reliability programs (RP) to assure the performance of nuclear safety systems throughout the plant's lifetime. Recently, several Nuclear Regulatory Commission (NRC) task forces or safety issue review groups have recommended RPs for assuring the continuing safety of nuclear reactor plants. 18 references

  9. Analogical reasoning for reliability analysis based on generic data

    Energy Technology Data Exchange (ETDEWEB)

    Kozin, Igor O

    1996-10-01

    The paper suggests using the systemic concept 'analogy' for the foundation of an approach to analyze system reliability on the basis of generic data, describing the method of structuring the set that defines analogical models, an approach of transition from the analogical model to a reliability model and a way of obtaining reliability intervals of analogous objects.

  10. Analogical reasoning for reliability analysis based on generic data

    International Nuclear Information System (INIS)

    Kozin, Igor O.

    1996-01-01

    The paper suggests using the systemic concept 'analogy' for the foundation of an approach to analyze system reliability on the basis of generic data, describing the method of structuring the set that defines analogical models, an approach of transition from the analogical model to a reliability model and a way of obtaining reliability intervals of analogous objects

  11. New approaches for the reliability-oriented structural optimization considering time-variant aspects; Neue Ansaetze fuer die zuverlaessigkeitsorientierte Strukturoptimierung unter Beachtung zeitvarianter Aspekte

    Energy Technology Data Exchange (ETDEWEB)

    Kuschel, N.

    2000-07-01

    The optimization of structures with respect to cost, weight or performance is a well-known application of nonlinear optimization. Reliability-based structural optimization, however, has been the subject of only very few studies, and the approaches suggested so far are unsatisfactory with regard to general applicability or ease of use. The objective of this thesis is the development of general approaches to solve both optimization problems: the minimization of cost subject to reliability constraints, and the maximization of reliability under cost constraints. The extended approach of a one-level method is introduced in detail for time-invariant problems, where the reliability of the structure is analysed in the framework of the First-Order Reliability Method (FORM). The use of time-variant reliability analysis is necessary for a realistic modelling of many practical problems. Therefore, several generalizations of the new approaches are derived for time-variant reliability-based structural optimization. Some important properties of the optimization problems are proved. In addition, some interesting extensions of the one-level method, for example the cost optimization of structural series systems and cost optimization in the framework of the Second-Order Reliability Method (SORM), are presented in the thesis. (orig.)

  12. Mathematical reliability an expository perspective

    CERN Document Server

    Mazzuchi, Thomas; Singpurwalla, Nozer

    2004-01-01

    In this volume consideration was given to more advanced theoretical approaches and novel applications of reliability to ensure that topics having a futuristic impact were specifically included. Topics like finance, forensics, information, and orthopedics, as well as the more traditional reliability topics were purposefully undertaken to make this collection different from the existing books in reliability. The entries have been categorized into seven parts, each emphasizing a theme that seems poised for the future development of reliability as an academic discipline with relevance. The seven parts are networks and systems; recurrent events; information and design; failure rate function and burn-in; software reliability and random environments; reliability in composites and orthopedics, and reliability in finance and forensics. Embedded within the above are some of the other currently active topics such as causality, cascading, exchangeability, expert testimony, hierarchical modeling, optimization and survival...

  13. Scaled CMOS Technology Reliability Users Guide

    Science.gov (United States)

    White, Mark

    2010-01-01

    The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect for the high-reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology for how to accomplish this and techniques for deriving the expected product-level reliability of commercial memory products are provided. Competing-mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope (beta)=1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm2 for several scaled SDRAM generations is
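The FIT normalization mentioned at the end of this record is a unit conversion: failures per 10^9 cumulative device-hours, optionally scaled per gigabit of capacity. A hedged sketch with made-up numbers:

```python
# Sketch of the FIT normalization used for memory retention soft errors:
# FIT = failures per 10^9 cumulative device-hours, optionally per Gb.
# The counts and durations below are made up for illustration.

def fit_rate(failures, devices, hours):
    """Failures in Time: failures per 10^9 cumulative device-hours."""
    return failures / (devices * hours) * 1e9

def fit_per_gb(failures, devices, hours, gbit_per_device):
    return fit_rate(failures, devices, hours) / gbit_per_device

# e.g. 3 soft-error failures across 1000 parts stressed for 2000 h:
overall = fit_rate(3, 1000, 2000)       # 1500 FIT
per_gb = fit_per_gb(3, 1000, 2000, 2)   # 750 FIT/Gb for 2 Gb parts
print(overall, per_gb)
```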

  14. Commutative and Non-commutative Parallelogram Geometry: an Experimental Approach

    OpenAIRE

    Bertram, Wolfgang

    2013-01-01

    By "parallelogram geometry" we mean the elementary, "commutative", geometry corresponding to vector addition, and by "trapezoid geometry" a certain "non-commutative deformation" of the former. This text presents an elementary approach via exercises using dynamical software (such as geogebra), hopefully accessible to a wide mathematical audience, from undergraduate students and high school teachers to researchers, proceeding in three steps: (1) experimental geometry, (2) algebra (linear algebr...

  15. A Hybrid Approach for Reliability Analysis Based on Analytic Hierarchy Process and Bayesian Network

    International Nuclear Information System (INIS)

    Zubair, Muhammad

    2014-01-01

    Using the analytic hierarchy process (AHP) and Bayesian networks (BN), the present research examines the technical and non-technical issues of nuclear accidents. The study exposed that technical faults were one major cause of these accidents. From another point of view, it becomes clear that human behaviors such as dishonesty, insufficient training, and selfishness also play a key role in causing these accidents. In this study, a hybrid approach for reliability analysis based on AHP and BN has been developed to increase nuclear power plant (NPP) safety. Using AHP, the best alternatives to improve safety, design, and operation, and to allocate budget for all technical and non-technical factors related to nuclear safety, have been investigated. We use a special structure of BN based on the AHP method. The graph of the BN and the probabilities associated with its nodes are designed to translate the experts' knowledge on the selection of the best alternative. The results show that improvement in the regulatory authorities will decrease failure probabilities and increase safety and reliability in the industrial area.
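The AHP step of such a hybrid approach reduces, in its textbook form, to extracting the principal eigenvector of a pairwise comparison matrix as the priority weights. A sketch with hypothetical judgments (not the paper's data):

```python
# Sketch of the AHP priority derivation: weights are the normalized
# principal eigenvector of a pairwise comparison matrix. The 3x3 matrix
# below encodes hypothetical judgments comparing three safety factors.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],   # factor 1 vs factors 1, 2, 3
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)          # principal eigenvalue lambda_max
w = np.abs(vecs[:, k].real)
w /= w.sum()                      # normalized priority weights

ci = (vals.real[k] - 3) / (3 - 1) # consistency index (CI < 0.1 is ok)
print(w, ci)
```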

  16. Structural Reliability Methods for Wind Power Converter System Component Reliability Assessment

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2012-01-01

    Wind power converter systems are essential subsystems in both off-shore and on-shore wind turbines. It is the main interface between generator and grid connection. This system is affected by numerous stresses where the main contributors might be defined as vibration and temperature loadings. The temperature variations induce time-varying stresses and thereby fatigue loads. A probabilistic model is used to model fatigue failure for an electrical component in the power converter system. This model is based on a linear damage accumulation and physics of failure approaches, where a failure criterion is defined by the threshold model. The attention is focused on crack propagation in solder joints of electrical components due to the temperature loadings. Structural Reliability approaches are used to incorporate model, physical and statistical uncertainties. Reliability estimation by means of structural...
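The linear damage accumulation with a threshold failure criterion mentioned above can be sketched with a Coffin-Manson-type life curve for temperature cycling; the curve constants and mission profile below are illustrative assumptions, not the paper's values:

```python
# Sketch of linear (Miner) damage accumulation with a threshold
# criterion, driven by a Coffin-Manson-type life curve for solder
# joints, N_f = A * dT**(-b). A and b are hypothetical constants.

A, b = 4.0e6, 2.0                        # illustrative curve parameters

def cycles_to_failure(dT):
    return A * dT ** (-b)

def accumulated_damage(cycle_counts):
    """cycle_counts: list of (n_cycles, dT) bins from a mission profile."""
    return sum(n / cycles_to_failure(dT) for n, dT in cycle_counts)

# Hypothetical temperature-cycle histogram over the converter's life:
profile = [(8_000, 10.0), (1_000, 40.0), (100, 80.0)]
D = accumulated_damage(profile)
failed = D >= 1.0                        # threshold failure criterion
print(round(D, 3), failed)
```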

  17. A New Approach for Reliability Life Prediction of Rail Vehicle Axle by Considering Vibration Measurement

    Directory of Open Access Journals (Sweden)

    Meral Bayraktar

    2014-01-01

    The effect of vibration on the axle has been considered. Vibration measurements at different speeds have been performed on the axle of a running rail vehicle to determine displacement, acceleration, and time and frequency responses. Based on the experimental work, an equivalent stress has been used to estimate the life of the axle for 90% and 10% reliability. The calculated life values of the rail vehicle axle have been compared with real life data, and it is found that the life of a vehicle axle estimated taking the vibration effects into account is in good agreement with the real life of the axle.
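Life at 90% and 10% reliability can be derived from a median life by assuming a scatter distribution around it; the sketch below assumes lognormal scatter of fatigue life about the median obtained from the equivalent stress, with all numbers illustrative rather than taken from this record:

```python
# Sketch: life at reliability R from a median S-N life with assumed
# lognormal scatter,  N_R = N50 * exp(z_(1-R) * sigma_lnN).
# N50 and sigma are illustrative placeholders, not the paper's values.
import math

N50 = 1.0e7          # median cycles from the equivalent stress (assumed)
sigma = 0.4          # scatter of ln(N) (assumed)
z90, z10 = -1.2816, 1.2816   # standard normal 10% / 90% quantiles

N_r90 = N50 * math.exp(z90 * sigma)   # 90% of axles survive past this
N_r10 = N50 * math.exp(z10 * sigma)   # only 10% survive past this
print(f"{N_r90:.3e} {N_r10:.3e}")
```

The two reliability levels bracket the median symmetrically on the log scale, which is why fatigue life scatter is usually quoted as a ratio rather than a difference.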

  18. What Shapes the Intention to Study Abroad? An Experimental Approach

    Science.gov (United States)

    Petzold, Knut; Moog, Petra

    2018-01-01

    In contrast to previous studies, this investigation aims to get deeper insights into the causes of the intention to study abroad by using an experimental approach. Although international experience is often considered important, many students at German universities do not even consider studying abroad. Referring to the Theory of Rational Choice (RCT)…

  19. Reliability and Availability of Cloud Computing

    CERN Document Server

    Bauer, Eric

    2012-01-01

    A holistic approach to service reliability and availability of cloud computing Reliability and Availability of Cloud Computing provides IS/IT system and solution architects, developers, and engineers with the knowledge needed to assess the impact of virtualization and cloud computing on service reliability and availability. It reveals how to select the most appropriate design for reliability diligence to assure that user expectations are met. Organized in three parts (basics, risk analysis, and recommendations), this resource is accessible to readers of diverse backgrounds and experience le

  20. A Markovian Approach Applied to Reliability Modeling of Bidirectional DC-DC Converters Used in PHEVs and Smart Grids

    Directory of Open Access Journals (Sweden)

    M. Khalilzadeh

    2016-12-01

    In this paper, a stochastic approach is proposed for reliability assessment of bidirectional DC-DC converters, including fault-tolerant ones. This type of converter can be used in a smart DC grid, feeding DC loads such as home appliances and plug-in hybrid electric vehicles (PHEVs). The reliability of bidirectional DC-DC converters is of such importance due to the key role of the expected, increasing utilization of DC grids in the modern smart grid. Markov processes are suggested for reliability modeling and consequently for calculating the expected effective lifetime of bidirectional converters. A three-leg bidirectional interleaved converter using data of the Toyota Prius 2012 hybrid electric vehicle is used as a case study. Besides, the influence of environment and ambient temperature on converter lifetime is studied. The impact of modeling the reliability of the converter and adding reliability constraints on the technical design procedure of the converter is also investigated. In order to investigate the effect of increasing the number of legs on the lifetime of the converter, single-leg to five-leg interleaved DC-DC converters are studied considering economic aspects, and the results are extrapolated to six- and seven-leg converters. The proposed method can be generalized so that the number of legs and of input and output capacitors may be arbitrary.
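The Markov lifetime calculation can be sketched for the simplest fault-tolerant case: a two-leg converter that survives one leg failure in a derated mode. The per-leg failure rate below is an assumed figure, not Toyota Prius data; the MTTF is obtained from the transient part of the generator matrix:

```python
# Sketch of the Markov lifetime calculation for a fault-tolerant
# two-leg interleaved converter: state 0 = both legs healthy,
# state 1 = one leg failed (derated), absorbing state = converter
# failed. MTTF solves (-Q_t) m = 1 over the transient states.
import numpy as np

lam = 2e-6                    # per-leg failure rate, 1/h (assumed)
# Generator restricted to the transient states 0 and 1:
Q_t = np.array([[-2 * lam, 2 * lam],
                [0.0,      -lam]])

mttf = np.linalg.solve(-Q_t, np.ones(2))  # expected hours to absorption
print(mttf[0])                # MTTF starting from the all-healthy state
```

A single-leg converter would have MTTF 1/lam; the redundant leg adds the 1/(2*lam) sojourn in the all-healthy state, here 750,000 h versus 500,000 h.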

  1. Training and Maintaining System-Wide Reliability in Outcome Management.

    Science.gov (United States)

    Barwick, Melanie A; Urajnik, Diana J; Moore, Julia E

    2014-01-01

    The Child and Adolescent Functional Assessment Scale (CAFAS) is widely used for outcome management, for providing real-time client- and program-level data, and for the monitoring of evidence-based practices. Methods of reliability training and the assessment of rater drift are critical for service decision-making within organizations and systems of care. We assessed two approaches to CAFAS training: external technical assistance and internal technical assistance. To this end, we sampled 315 practitioners trained by the external technical assistance approach from 2,344 Ontario practitioners who had achieved reliability on the CAFAS. To assess the internal technical assistance approach as a reliable alternative training method, 140 practitioners trained internally were selected from the same pool of certified raters. Reliabilities were high for practitioners trained by both the external and internal technical assistance approaches (.909-.995 and .915-.997, respectively). One- and three-year estimates showed some drift on several scales. High and consistent reliabilities over time and across training methods have implications for CAFAS training of behavioral health care practitioners and for the maintenance of the CAFAS as a global outcome management tool in systems of care.

  2. A quantitative approach to wind farm diversification and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Degeilh, Yannick; Singh, Chanan [Department of Electrical and Computer Engineering, Texas A and M University, College Station, TX 77843 (United States)

    2011-02-15

    This paper proposes a general planning method to minimize the variance of aggregated wind farm power output by optimally distributing a predetermined number of wind turbines over a preselected number of potential wind farming sites. The objective is to facilitate high wind power penetration through the search for steadier overall power output. Another optimization formulation that takes into account the correlations between wind power outputs and load is also presented. Three years of wind data from the recent NREL/3TIER study in the western US provides the statistics for evaluating each site upon their mean power output, variance and correlation with each other so that the best allocations can be determined. The reliability study reported in this paper investigates the impact of wind power output variance reduction on a power system composed of a virtual wind power plant and a load modeled from the 1996 IEEE RTS. Some traditional reliability indices such as the LOLP are calculated and it is eventually shown that configurations featuring minimal global power output variances generally prove the most reliable provided the sites are not significantly correlated with the modeled load. Consequently, the choice of uncorrelated/negatively correlated sites is favored. (author)

  3. A quantitative approach to wind farm diversification and reliability

    International Nuclear Information System (INIS)

    Degeilh, Yannick; Singh, Chanan

    2011-01-01

    This paper proposes a general planning method to minimize the variance of aggregated wind farm power output by optimally distributing a predetermined number of wind turbines over a preselected number of potential wind farming sites. The objective is to facilitate high wind power penetration through the search for steadier overall power output. Another optimization formulation that takes into account the correlations between wind power outputs and load is also presented. Three years of wind data from the recent NREL/3TIER study in the western US provides the statistics for evaluating each site upon their mean power output, variance and correlation with each other so that the best allocations can be determined. The reliability study reported in this paper investigates the impact of wind power output variance reduction on a power system composed of a virtual wind power plant and a load modeled from the 1996 IEEE RTS. Some traditional reliability indices such as the LOLP are calculated and it is eventually shown that configurations featuring minimal global power output variances generally prove the most reliable provided the sites are not significantly correlated with the modeled load. Consequently, the choice of uncorrelated/negatively correlated sites is favored. (author)
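The variance-minimizing allocation described in this record has a closed form in the continuous relaxation: weights proportional to the inverse covariance matrix applied to the ones vector minimize the aggregated output variance subject to the weights summing to one. The covariance matrix below is an illustrative stand-in for site statistics such as those derived from the NREL/3TIER data:

```python
# Sketch of the variance-minimizing turbine allocation: distribute a
# fixed budget across sites to minimize w' Sigma w, where Sigma is the
# covariance of per-site power output. Continuous, fully-invested
# closed form: w* proportional to Sigma^{-1} 1. Covariances are made up
# to illustrate three partially correlated sites.
import numpy as np

Sigma = np.array([[1.0, 0.2, -0.1],
                  [0.2, 0.8,  0.3],
                  [-0.1, 0.3, 1.2]])

ones = np.ones(3)
w = np.linalg.solve(Sigma, ones)
w /= w.sum()                        # fractions of the turbine budget

var_opt = w @ Sigma @ w
var_eq = (ones / 3) @ Sigma @ (ones / 3)
print(w, var_opt, var_eq)           # optimal split beats equal split
```

An actual plan would round these fractions to integer turbine counts per site, and, as the abstract notes, would also screen sites for correlation with the load.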

  4. Reliable multicast for the Grid: a case study in experimental computer science.

    Science.gov (United States)

    Nekovee, Maziar; Barcellos, Marinho P; Daw, Michael

    2005-08-15

    In its simplest form, multicast communication is the process of sending data packets from a source to multiple destinations in the same logical multicast group. IP multicast allows the efficient transport of data through wide-area networks, and its potentially great value for the Grid has been highlighted recently by a number of research groups. In this paper, we focus on the use of IP multicast in Grid applications, which require high-throughput reliable multicast. These include Grid-enabled computational steering and collaborative visualization applications, and wide-area distributed computing. We describe the results of our extensive evaluation studies of state-of-the-art reliable-multicast protocols, which were performed on the UK's high-speed academic networks. Based on these studies, we examine the ability of current reliable multicast technology to meet the Grid's requirements and discuss future directions.

  5. Reliability Analysis of Tubular Joints in Offshore Structures

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Sørensen, John Dalsgaard

    1987-01-01

    Reliability analysis of single tubular joints and offshore platforms with tubular joints is" presented. The failure modes considered are yielding, punching, buckling and fatigue failure. Element reliability as well as systems reliability approaches are used and illustrated by several examples....... Finally, optimal design of tubular.joints with reliability constraints is discussed and illustrated by an example....

  6. Identification of Black Spots Based on Reliability Approach

    Directory of Open Access Journals (Sweden)

    Ahmadreza Ghaffari

    2013-12-01

    Identifying crash “black spots”, “hot spots”, or “high-risk” locations is one of the most important and prevalent concerns in traffic safety, and various methods have been devised for this purpose. In this paper, a new method based on reliability analysis is presented to identify black spots. Reliability analysis provides an ordered framework for considering the probabilistic nature of engineering problems, so crashes, with their probabilistic nature, can be treated within it. In this study, the application of this new method was compared with the commonly implemented frequency and Empirical Bayesian methods using simulated data. The results indicated that the traditional methods can lead to inconsistent predictions because they do not consider the variance of the number of crashes at each site and because they depend on the mean of the data.
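The Empirical Bayesian comparison method referenced above is conventionally computed by shrinking a site's observed crash count toward the safety-performance-function prediction, weighted by the negative binomial overdispersion parameter. A sketch of that standard formulation (all numbers illustrative):

```python
# Sketch of the standard Empirical Bayes black-spot estimate: blend a
# site's observed crash count x with the safety-performance-function
# prediction mu, weighted by the negative binomial overdispersion
# parameter phi (all values illustrative):
#   w  = 1 / (1 + mu / phi)
#   EB = w * mu + (1 - w) * x

def eb_estimate(x, mu, phi):
    w = 1.0 / (1.0 + mu / phi)
    return w * mu + (1.0 - w) * x

# Site observed 9 crashes where similar sites average 4 per period:
eb = eb_estimate(x=9, mu=4.0, phi=2.0)
print(eb)   # shrinks the observation toward the group mean
```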

  7. Management of reliability and maintainability; a disciplined approach to fleet readiness

    Science.gov (United States)

    Willoughby, W. J., Jr.

    1981-01-01

    Material acquisition fundamentals were reviewed and include: mission profile definition, stress analysis, derating criteria, circuit reliability, failure modes, and worst case analysis. Military system reliability was examined with emphasis on the sparing of equipment. The Navy's organizational strategy for 1980 is presented.

  8. Reliability metrics extraction for power electronics converter stressed by thermal cycles

    DEFF Research Database (Denmark)

    Ma, Ke; Choi, Uimin; Blaabjerg, Frede

    2017-01-01

Due to the continuous demands for highly reliable and cost-effective power conversion, quantified reliability performance of power electronics converters is becoming an emerging need. The existing reliability modelling approaches for power electronics converters mainly focus on the pr...... performance of the power electronics system. The final predicted results showed good accuracy with much more reliability information compared to the existing approaches, and the quantified reliability correlation to the mission profiles of the converter is mathematically established....

  9. Experimental design techniques in statistical practice a practical software-based approach

    CERN Document Server

    Gardiner, W P

    1998-01-01

Provides an introduction to the diverse subject area of experimental design, with many practical and applicable exercises to help the reader understand, present and analyse the data. The pragmatic approach offers technical training in the use of designs and teaches statistical and non-statistical skills for the design and analysis of project studies throughout science and industry. Discusses one-factor designs and blocking designs, factorial experimental designs, Taguchi methods and response surface methods, among other topics.

  10. Field reliability of electronic systems

    International Nuclear Information System (INIS)

    Elm, T.

    1984-02-01

This report investigates, through several examples from the field, the reliability of electronic units in a broader sense. That is, it treats not just random parts failure, but also inadequate reliability design and (externally and internally) induced failures. The report is not meant to be merely an indication of the state of the art of the reliability prediction methods we know, but also a contribution to the investigation of man-machine interplay in the operation and repair of electronic equipment. The report firmly links electronics reliability to safety and risk analysis approaches, with a broader, system-oriented view of reliability prediction and with post-failure stress analysis. It is intended to reveal, in a qualitative manner, the existence of symptom and cause patterns. It provides a background for further investigations to identify the detailed mechanisms of the faults and the remedial actions and precautions for achieving cost-effective reliability. (author)

  11. Experimental semiotics: a new approach for studying communication as a form of joint action.

    Science.gov (United States)

    Galantucci, Bruno

    2009-04-01

    In the last few years, researchers have begun to investigate the emergence of novel forms of human communication in the laboratory. I survey this growing line of research, which may be called experimental semiotics, from three distinct angles. First, I situate the new approach in its theoretical and historical context. Second, I review a sample of studies that exemplify experimental semiotics. Third, I present an empirical study that illustrates how the new approach can help us understand the socio-cognitive underpinnings of human communication. The main conclusion of the paper will be that, by reproducing micro samples of historical processes in the laboratory, experimental semiotics offers new powerful tools for investigating human communication as a form of joint action. Copyright © 2009 Cognitive Science Society, Inc.

  12. Fuel reliability experience in Finland

    International Nuclear Information System (INIS)

    Kekkonen, L.

    2015-01-01

Four nuclear reactors have now operated in Finland for 35-38 years. The two VVER-440 units at the Loviisa Nuclear Power Plant are operated by Fortum, and the two BWRs in Olkiluoto are operated by Teollisuuden Voima Oyj (TVO). The fuel reliability experience of the four reactors currently operating in Finland has been very good and the fuel failure rates have been very low. Systematic inspection of spent fuel assemblies, and especially of all failed assemblies, is a good practice that is employed in Finland in order to improve fuel reliability and operational safety. Investigation of the root cause of fuel failures is important in developing ways to prevent similar failures in the future. The operational and fuel reliability experience at the Loviisa Nuclear Power Plant has also been reported earlier in the international seminars on WWER Fuel Performance, Modelling and Experimental Support. In this paper the information on fuel reliability experience at Loviisa NPP is updated, and a short summary of the fuel reliability experience at Olkiluoto NPP is also given. Keywords: VVER-440, fuel reliability, operational experience, poolside inspections, fuel failure identification. (author)

  13. Guarantee of reliability of devices complexes for plastic tube welding

    International Nuclear Information System (INIS)

    Voskresenskij, L.A.; Zajtsev, A.I.; Nelyubov, V.I.; Fedorov, M.A.

    1988-01-01

Results of calculations and experimental studies on ensuring the reliability of a complex of devices for plastic tube welding are presented. The choice of reliability indices and standards is justified. Reliability levels of components are determined. The most heavily loaded parts are analysed, and it is shown that they meet the requirements for strength and reliability. Service life tests confirmed the correct choice of springs. Recommendations for improving reliability are given, and directions for further development are indicated. 8 refs.; 2 figs.; 1 tab

  14. Orotracheal Intubation Using the Retromolar Space: A Reliable Alternative Intubation Approach to Prevent Dental Injury

    Directory of Open Access Journals (Sweden)

    Linh T. Nguyen

    2016-01-01

Despite recent advances in airway management, perianesthetic dental injury remains one of the most common anesthesia-related adverse events and a cause of malpractice litigation against anesthesia providers. Recommended precautions for the prevention of dental damage may not always be effective, because these techniques involve contact with, and pressure exerted on, vulnerable teeth. We describe a novel approach using the retromolar space to insert a flexible fiberscope for tracheal tube placement as a reliable method to achieve atraumatic tracheal intubation. Written consent for publication has been obtained from the patient.

  15. Empirical evidence of bias in the design of experimental stroke studies - A metaepidemiologic approach

    NARCIS (Netherlands)

    Crossley, Nicolas A.; Sena, Emily; Goehler, Jos; Horn, Jannekke; van der Worp, Bart; Bath, Philip M. W.; Macleod, Malcolm; Dirnagl, Ulrich

    2008-01-01

    Background and Purpose - At least part of the failure in the transition from experimental to clinical studies in stroke has been attributed to the imprecision introduced by problems in the design of experimental stroke studies. Using a metaepidemiologic approach, we addressed the effect of

  16. ON CONSTRUCTION OF A RELIABLE GROUND TRUTH FOR EVALUATION OF VISUAL SLAM ALGORITHMS

    Directory of Open Access Journals (Sweden)

    Jan Bayer

    2016-11-01

In this work we address the problem of evaluating the localization accuracy of visual Simultaneous Localization and Mapping (SLAM) techniques. Quantitative evaluation of SLAM algorithm performance is usually done using the established metrics of relative pose error and absolute trajectory error, which require a precise and reliable ground truth. Such a ground truth is usually hard to obtain, since it requires an expensive external localization system. In this work we propose to use the SLAM algorithm itself to construct a reliable ground truth by offline frame-by-frame processing. The generated ground truth is suitable for evaluating different SLAM systems, as well as for tuning the parametrization of the on-line SLAM. The presented practical experimental results indicate the feasibility of the proposed approach.

  17. Scale Reliability Evaluation with Heterogeneous Populations

    Science.gov (United States)

    Raykov, Tenko; Marcoulides, George A.

    2015-01-01

    A latent variable modeling approach for scale reliability evaluation in heterogeneous populations is discussed. The method can be used for point and interval estimation of reliability of multicomponent measuring instruments in populations representing mixtures of an unknown number of latent classes or subpopulations. The procedure is helpful also…

  18. Reliability of power electronic converter systems

    CERN Document Server

    Chung, Henry Shu-hung; Blaabjerg, Frede; Pecht, Michael

    2016-01-01

    This book outlines current research into the scientific modeling, experimentation, and remedial measures for advancing the reliability, availability, system robustness, and maintainability of Power Electronic Converter Systems (PECS) at different levels of complexity.

  19. Optimizing laboratory animal stress paradigms: The H-H* experimental design.

    Science.gov (United States)

    McCarty, Richard

    2017-01-01

    Major advances in behavioral neuroscience have been facilitated by the development of consistent and highly reproducible experimental paradigms that have been widely adopted. In contrast, many different experimental approaches have been employed to expose laboratory mice and rats to acute versus chronic intermittent stress. An argument is advanced in this review that more consistent approaches to the design of chronic intermittent stress experiments would provide greater reproducibility of results across laboratories and greater reliability relating to various neural, endocrine, immune, genetic, and behavioral adaptations. As an example, the H-H* experimental design incorporates control, homotypic (H), and heterotypic (H*) groups and allows for comparisons across groups, where each animal is exposed to the same stressor, but that stressor has vastly different biological and behavioral effects depending upon each animal's prior stress history. Implementation of the H-H* experimental paradigm makes possible a delineation of transcriptional changes and neural, endocrine, and immune pathways that are activated in precisely defined stressor contexts. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Reliability demonstration test planning: A three dimensional consideration

    International Nuclear Information System (INIS)

    Yadav, Om Prakash; Singh, Nanua; Goel, Parveen S.

    2006-01-01

Increasing customer demand for reliability, fierce market competition on time-to-market and cost, and the need for highly reliable products are making reliability testing a more challenging task. This paper presents a systematic approach for identifying critical elements (subsystems and components) of a system and deciding the types of test to be performed to demonstrate reliability. It decomposes the system along three dimensions (physical, functional and time) and identifies critical elements in the design by allocating system-level reliability to each candidate. The decomposition of system-level reliability is achieved by using a criticality index. The numerical value of the criticality index for each candidate is derived from the information available in the failure mode and effects analysis (FMEA) document or from warranty data on a prior system. The approach uses this information to develop a reliability demonstration test plan for the identified (critical) failure mechanisms and physical elements. It also highlights the benefits of using prior information to locate critical spots in the design and in the subsequent development of test plans. A case example is presented to demonstrate the proposed approach
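The allocation step can be sketched with an ARINC-style apportionment, in which each candidate's share of the system reliability target is weighted by its criticality index. This is an illustrative sketch, not the authors' exact formulation; the component names and numbers are hypothetical.

```python
import math

def allocate_reliability(system_target, criticality):
    """Apportion a series-system reliability target to components.

    Each component i receives the target R_i = R_sys ** w_i, where the
    weights w_i are the normalised criticality indices.  Because the
    weights sum to 1, the product of the component targets recovers
    the system target exactly; components with higher criticality get
    a larger share of the allowable unreliability (a lower target).
    """
    total = sum(criticality.values())
    return {name: system_target ** (c / total)
            for name, c in criticality.items()}

# Criticality indices as they might be derived from an FMEA document
# (hypothetical names and values).
crit = {"pump": 5.0, "valve": 2.0, "controller": 3.0}
targets = allocate_reliability(0.95, crit)
```

The product of the allocated targets equals the 0.95 system goal, and the most critical component (the pump) is assigned the lowest individual target, i.e. the largest unreliability budget.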

  1. Systems reliability analysis for the national ignition facility

    International Nuclear Information System (INIS)

    Majumdar, K.C.; Annese, C.E.; MacIntyre, A.T.; Sicherman, A.

    1996-01-01

A Reliability, Availability and Maintainability (RAM) analysis was initiated for the National Ignition Facility (NIF). The NIF is an inertial confinement fusion research facility designed to achieve a controlled thermonuclear reaction; the preferred site for the NIF is the Lawrence Livermore National Laboratory (LLNL). The NIF RAM analysis has three purposes: (1) to allocate top-level reliability and availability goals for the systems, (2) to develop an operability model for optimum maintainability, and (3) to determine the achievability of the allocated RAM-parameter goals for the NIF systems and for the facility operation as a whole. An allocation model assigns the reliability and availability goals to front-line and support systems by a top-down approach; the reliability analysis uses a bottom-up approach to determine system reliability and availability from the component level to the system level
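The bottom-up side of such an analysis composes component reliabilities into system-level figures through the series/parallel structure of the design. A minimal sketch follows; the structure and numbers are illustrative, not NIF data.

```python
def series(*rs):
    """Series composition: the chain works only if every element works."""
    r = 1.0
    for x in rs:
        r *= x
    return r

def parallel(*rs):
    """Active redundancy: the group fails only if every element fails."""
    q = 1.0
    for x in rs:
        q *= 1.0 - x
    return 1.0 - q

# A front-line system in series with a redundant pair of support units
# and one further support system (hypothetical reliabilities).
r_system = series(0.99, parallel(0.90, 0.90), 0.995)
```

The same two functions can be nested to roll up arbitrarily deep series/parallel block diagrams from the component level to the system level.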

  2. Experimental Validation of a Differential Variational Inequality-Based Approach for Handling Friction and Contact in Vehicle

    Science.gov (United States)

    2015-11-20

Fragment recovered from the report documentation page: the vehicle/terrain interaction uses terrain modeled with the discrete element method (DEM); frictional contact is modeled via differential variational inequalities for a three-dimensional (3D) system of bodies; and the experimental validation includes sinkage and single-wheel tests.

  3. Distribution system reliability evaluation using credibility theory | Xu ...

    African Journals Online (AJOL)

    In this paper, a hybrid algorithm based on fuzzy simulation and Failure Mode and Effect Analysis (FMEA) is applied to determine fuzzy reliability indices of distribution system. This approach can obtain fuzzy expected values and their variances of reliability indices, and the credibilities of reliability indices meeting specified ...

  4. A novel approach for reliable detection of cathepsin S activities in mouse antigen presenting cells.

    Science.gov (United States)

    Steimle, Alex; Kalbacher, Hubert; Maurer, Andreas; Beifuss, Brigitte; Bender, Annika; Schäfer, Andrea; Müller, Ricarda; Autenrieth, Ingo B; Frick, Julia-Stefanie

    2016-05-01

    Cathepsin S (CTSS) is a eukaryotic protease mostly expressed in professional antigen presenting cells (APCs). Since CTSS activity regulation plays a role in the pathogenesis of various autoimmune diseases like multiple sclerosis, atherosclerosis, Sjögren's syndrome and psoriasis as well as in cancer progression, there is an ongoing interest in the reliable detection of cathepsin S activity. Various applications have been invented for specific detection of this enzyme. However, most of them have only been shown to be suitable for human samples, do not deliver quantitative results or the experimental procedure requires technical equipment that is not commonly available in a standard laboratory. We have tested a fluorogen substrate, Mca-GRWPPMGLPWE-Lys(Dnp)-DArg-NH2, that has been described to specifically detect CTSS activities in human APCs for its potential use for mouse samples. We have modified the protocol and thereby offer a cheap, easy, reproducible and quick activity assay to detect CTSS activities in mouse APCs. Since most of basic research on CTSS is performed in mice, this method closes a gap and offers a possibility for reliable and quantitative CTSS activity detection that can be performed in almost every laboratory. Copyright © 2016. Published by Elsevier B.V.

  5. Assessment of structural reliability of precast concrete buildings

    Directory of Open Access Journals (Sweden)

    Koyankin Alexandr

    2018-01-01

Precast housing construction is currently undergoing rapid development; however, the reliability of building structures made from precast reinforced concrete cannot be assessed rationally due to insufficient research data on the subject. In this regard, experimental and numerical studies were conducted to assess the structural reliability of precast buildings, as described in this paper. Experimental studies of full-scale and model samples were conducted; numerical studies were based on finite element models using the “Lira” software. The objects under study included a fragment of the flooring of a building under construction, a full-size fragment of flooring, and full-scale models of precast cross-beam-to-column joints and of joints between hollow-core floor slabs and precast and cast-in-place cross-beams. The research enabled an objective assessment of the structural reliability of precast buildings.

  6. Uncertainty propagation and sensitivity analysis in system reliability assessment via unscented transformation

    International Nuclear Information System (INIS)

    Rocco Sanseverino, Claudio M.; Ramirez-Marquez, José Emmanuel

    2014-01-01

The reliability of a system, notwithstanding its intended function, can be significantly affected by the uncertainty in the reliability estimates of the components that define the system. This paper implements the Unscented Transformation to quantify the effects of component reliability uncertainty through two approaches. The first approach is based on the concept of uncertainty propagation: the assessment of the effect that the variability of the component reliabilities produces on the variance of the system reliability. This UT-based assessment has been considered previously in the literature, but only for systems represented through series/parallel configurations. In this paper the assessment is extended to systems whose reliability cannot be represented through analytical expressions and which require, for example, Monte Carlo simulation. The second approach consists of evaluating the importance of components, i.e., identifying the components that contribute most to the variance of the system reliability. An extension of the UT is proposed to evaluate the so-called “main effects” of each component, as well as to assess higher-order component interactions. Several examples with excellent results illustrate the proposed approach. - Highlights: • Simulation-based approach for computing reliability estimates. • Computation of reliability variance via 2n+1 points. • Immediate computation of component importance. • Application to network systems
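The 2n+1-point construction mentioned in the highlights can be sketched as follows, using Julier's symmetric sigma-point set to push the mean and covariance of the component reliabilities through an arbitrary (possibly simulation-based) system reliability function. This is a generic UT sketch under standard assumptions, not the authors' exact implementation.

```python
import numpy as np

def unscented_propagate(f, mean, cov, kappa=None):
    """Propagate an input mean/covariance through f using 2n+1 sigma points.

    f maps a vector of component reliabilities to a system reliability;
    the function returns the UT estimate of the mean and variance of
    f's output.
    """
    mean = np.asarray(mean, dtype=float)
    n = len(mean)
    if kappa is None:
        kappa = 3.0 - n                       # Julier's heuristic
    # Columns of the (scaled) matrix square root give the deviations.
    S = np.linalg.cholesky((n + kappa) * np.asarray(cov, dtype=float))
    pts = [mean] + [mean + S[:, i] for i in range(n)] \
                 + [mean - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    ys = np.array([f(p) for p in pts])
    m = float(w @ ys)
    v = float(w @ (ys - m) ** 2)
    return m, v

# Two-component series system with independent reliability estimates.
m, v = unscented_propagate(lambda r: r[0] * r[1],
                           mean=[0.90, 0.95],
                           cov=np.diag([0.001, 0.002]))
```

For this bilinear example the UT recovers the mean exactly and the first-order variance terms (0.95² · 0.001 + 0.90² · 0.002), at the cost of only five evaluations of `f`.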

  7. Identifying factors influencing reliability of professional systems

    NARCIS (Netherlands)

    Balasubramanian, A.; Kevrekidis, K.; Sonnemans, P.J.M.; Newby, M.J.

    2008-01-01

    Modern product development strategies call for a more proactive approach to fight intense global competition in terms of technological innovation, shorter time to market, quality and reliability and accommodative price. From a reliability engineering perspective, development managers would like to

  8. Experimental methods for the analysis of optimization algorithms

    CERN Document Server

    Bartz-Beielstein, Thomas; Paquete, Luis; Preuss, Mike

    2010-01-01

    In operations research and computer science it is common practice to evaluate the performance of optimization algorithms on the basis of computational results, and the experimental approach should follow accepted principles that guarantee the reliability and reproducibility of results. However, computational experiments differ from those in other sciences, and the last decade has seen considerable methodological research devoted to understanding the particular features of such experiments and assessing the related statistical methods. This book consists of methodological contributions on diffe

  9. An innovative approach for planning and execution of pre-experimental runs for Design of Experiments

    Directory of Open Access Journals (Sweden)

    Muhammad Arsalan Farooq

    2016-09-01

This paper addresses the pre-experimental planning phase of Design of Experiments (DoE) with the aim of improving final product quality. The pre-experimental planning phase includes a clear identification of the problem statement and the selection of control factors and their respective levels and ranges. To improve production quality based on DoE, a new approach for the pre-experimental planning phase, called the Non-Conformity Matrix (NCM), is presented. The article also addresses the key steps of the pre-experimental runs for a consumer goods manufacturing process. Results of an industrial case study show that this methodology can support a clear definition of the problem and a correct identification of factor ranges in particular situations. The proposed approach allows modeling the entire manufacturing system holistically and correctly defining factor ranges and respective levels for a more effective application of DoE. It can be a useful resource for both researchers and industrial practitioners dedicated to large DoE projects with unknown factor interactions, where operational levels and ranges are not completely defined.

  10. Teaching psychomotor skills to beginning nursing students using a web-enhanced approach: a quasi-experimental study.

    Science.gov (United States)

    Salyers, Vincent L

    2007-01-01

    To begin to address the problem of psychomotor skills deficiencies observed in many new graduate nurses, a skills laboratory course was developed using a web-enhanced approach. In this quasi-experimental study, the control group attended weekly lectures, observed skill demonstrations by faculty, practiced skills, and were evaluated on skill performance. The experimental group learned course content using a web-enhanced approach. This allowed students to learn course material outside of class at times convenient for them, thus they had more time during class to perfect psychomotor skills. The experimental group performed better on the final cognitive examination. Students in the traditional sections were more satisfied with the course, however. It was concluded that a web-enhanced approach for teaching psychomotor skills can provide a valid alternative to traditional skills laboratory formats.

  11. Electronic properties of Fe charge transfer complexes – A combined experimental and theoretical approach

    International Nuclear Information System (INIS)

    Ferreira, Hendrik; Eschwege, Karel G. von; Conradie, Jeanet

    2016-01-01

Highlights: • Experimental and computational study of Fe(II)-phen, -bpy and -tpy complexes. • Close correlations between experimental redox and spectral data and computational data. • Computational methods fast-track DSSC research. - Abstract: Dye-sensitized solar cell technology holds huge potential for renewable electricity generation in the future. Because of the urgency of demand, ways need to be explored to reduce research time and cost. Against this background, quantum computational chemistry is illustrated to be a reliable tool at the onset of studies in this field, simulating charge transfer, spectral (solar energy absorbed) and electrochemical (ease with which electrons may be liberated) tuning of related photo-responsive dyes. Comparative experimental and theoretical DFT studies were done under similar conditions, involving an extended series of electrochemically altered phenanthroline, bipyridyl and terpyridyl complexes of Fe(II). Fe(II/III) oxidation waves vary from 0.363 V for tris(3,6-dimethoxybipyridyl)Fe(II) to 0.894 V (versus Fc/Fc+) for the 5-nitrophenanthroline complex. Theoretical DFT-computed ionization potentials in the bipyridyl sub-series achieved an almost 100% linear correlation with experimental electrochemical oxidation potentials, while the phenanthroline sub-series gave R² = 0.95. Apart from the terpyridyl complex, which gave an almost perfect match, TDDFT oscillators were in general computed at slightly lower energies than observed experimentally, while molecular HOMO and LUMO renderings reveal the desired complexes with directional charge-transfer propensities.

  12. Enhancing product robustness in reliability-based design optimization

    International Nuclear Information System (INIS)

    Zhuang, Xiaotian; Pan, Rong; Du, Xiaoping

    2015-01-01

Different types of uncertainties need to be addressed in a product design optimization process. In this paper, the uncertainties in both product design variables and environmental noise variables are considered. Reliability-based design optimization (RBDO) is integrated with robust product design (RPD) to concurrently reduce the production cost and the long-term operation cost, including quality loss, in the process of product design. This problem leads to a multi-objective optimization with probabilistic constraints. In addition, the model uncertainty associated with a surrogate model derived from numerical computation methods, such as finite element analysis, is addressed. A hierarchical experimental design approach, augmented by a sequential sampling strategy, is proposed to construct the response surface of the product performance function for finding optimal design solutions. The proposed method is demonstrated through an engineering example. - Highlights: • A unifying framework for integrating RBDO and RPD is proposed. • Implicit product performance function is considered. • The design problem is solved by sequential optimization and reliability assessment. • A sequential sampling technique is developed for improving design optimization. • The comparison with traditional RBDO is provided

  13. Responsibility and Reliability | Williams | Philosophical Papers

    African Journals Online (AJOL)

    'Responsibilist\\' approaches to epistemology link knowledge and justification with epistemically responsible belief management, where responsible management is understood to involve an essential element of guidance by recognized epistemic norms. By contrast, reliabilist approaches stress the de facto reliability of ...

  14. Study for increasing micro-drill reliability by vibrating drilling

    International Nuclear Information System (INIS)

    Yang Zhaojun; Li Wei; Chen Yanhong; Wang Lijiang

    1998-01-01

    A study for increasing micro-drill reliability by vibrating drilling is described. Under the experimental conditions of this study it is observed, from reliability testing and the fitting of a life-distribution function, that the lives of micro-drills under ordinary drilling follow the log-normal distribution and the lives of micro-drills under vibrating drilling follow the Weibull distribution. Calculations for reliability analysis show that vibrating drilling can increase the lives of micro-drills and correspondingly reduce the scatter of drill lives. Therefore, vibrating drilling increases the reliability of micro-drills
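Fitting and comparing such life distributions is straightforward. As an illustration (not the paper's own estimation procedure), a two-parameter Weibull can be fitted to a sample of drill lives by median-rank regression, the classic probability-plot method used in reliability work.

```python
import math

def weibull_fit(lives):
    """Median-rank regression estimate of Weibull shape and scale.

    Linearises the Weibull CDF, ln(-ln(1 - F)) = beta*ln t - beta*ln eta,
    with F estimated at each ordered life by Bernard's median-rank
    approximation (i - 0.3) / (n + 0.4), then fits the line by least
    squares.
    """
    xs = sorted(lives)
    n = len(xs)
    pts = [(math.log(t),
            math.log(-math.log(1.0 - (i - 0.3) / (n + 0.4))))
           for i, t in enumerate(xs, start=1)]
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    beta = (sum((x - mx) * (y - my) for x, y in pts)
            / sum((x - mx) ** 2 for x, _ in pts))
    eta = math.exp(mx - my / beta)
    return beta, eta  # shape beta (scatter), scale eta (characteristic life)

# Synthetic check: lives placed at the exact median-rank quantiles of a
# Weibull(shape=2, scale=100) are recovered almost exactly.
lives = [100.0 * (-math.log(1.0 - (i - 0.3) / 20.4)) ** 0.5
         for i in range(1, 21)]
beta, eta = weibull_fit(lives)
```

A larger fitted shape parameter corresponds to less scatter in drill lives, which is the quantity the paper uses to compare vibrating drilling against ordinary drilling.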

  15. Role of network dynamics in shaping spike timing reliability

    International Nuclear Information System (INIS)

    Bazhenov, Maxim; Rulkov, Nikolai F.; Fellous, Jean-Marc; Timofeev, Igor

    2005-01-01

    We study the reliability of cortical neuron responses to periodically modulated synaptic stimuli. Simple map-based models of two different types of cortical neurons are constructed to replicate the intrinsic resonances of reliability found in experimental data and to explore the effects of those resonance properties on collective behavior in a cortical network model containing excitatory and inhibitory cells. We show that network interactions can enhance the frequency range of reliable responses and that the latter can be controlled by the strength of synaptic connections. The underlying dynamical mechanisms of reliability enhancement are discussed

  16. A Vision for Spaceflight Reliability: NASA's Objectives Based Strategy

    Science.gov (United States)

    Groen, Frank; Evans, John; Hall, Tony

    2015-01-01

In defining the direction for a new Reliability and Maintainability standard, OSMA has extracted the essential objectives that our programs need in order to undertake a reliable mission. These objectives have been structured to lead mission planning through the construction of an objective hierarchy, which defines the critical approaches for achieving high reliability and maintainability (R&M). Creating a hierarchy as a basis for assurance implementation is a proven approach; yet it also opens new directions as NASA moves forward in tackling the challenges of space exploration.

  17. Classifier Fusion With Contextual Reliability Evaluation.

    Science.gov (United States)

    Liu, Zhunga; Pan, Quan; Dezert, Jean; Han, Jun-Wei; He, You

    2018-05-01

Classifier fusion is an efficient strategy for improving classification performance in complex pattern recognition problems. In practice, the multiple classifiers to be combined can have different reliabilities, and proper reliability evaluation plays an important role in the fusion process for obtaining the best classification performance. We propose a new method for classifier fusion with contextual reliability evaluation (CF-CRE) based on the concepts of inner reliability and relative reliability. The inner reliability, represented by a matrix, characterizes the probability of an object belonging to one class when it is classified to another class. The elements of this matrix are estimated from the k-nearest neighbors of the object. A cautious discounting rule is developed under the belief-functions framework to revise the classification result according to the inner reliability. The relative reliability is evaluated based on a new incompatibility measure which makes it possible to reduce the level of conflict between the classifiers by applying the classical evidence discounting rule to each classifier before their combination. The inner reliability and relative reliability capture different aspects of classification reliability. The discounted classification results are combined with Dempster-Shafer's rule for the final class decision-making support. The performance of CF-CRE has been evaluated and compared with those of the main classical fusion methods using real data sets. The experimental results show that CF-CRE generally produces substantially higher accuracy than other fusion methods. Moreover, CF-CRE is robust to changes in the number of nearest neighbors chosen for estimating the reliability matrix, which is appealing for applications.
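The two ingredients this abstract builds on, classical evidence discounting and Dempster-Shafer combination, can be sketched on a two-class frame. This is a generic belief-functions sketch with made-up masses and reliability factor, not the CF-CRE algorithm itself.

```python
def discount(m, alpha, frame):
    """Shafer's classical discounting: keep a fraction alpha of each
    mass and transfer the remainder to total ignorance (the frame)."""
    out = {X: alpha * v for X, v in m.items()}
    out[frame] = out.get(frame, 0.0) + (1.0 - alpha)
    return out

def dempster(m1, m2):
    """Dempster's rule: conjunctive combination with renormalisation
    of the conflicting (empty-intersection) mass."""
    joint, conflict = {}, 0.0
    for X, v1 in m1.items():
        for Y, v2 in m2.items():
            Z = X & Y
            if Z:
                joint[Z] = joint.get(Z, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    return {Z: v / (1.0 - conflict) for Z, v in joint.items()}

# Two conflicting classifiers; the second is judged less reliable
# (alpha = 0.5), so its evidence is discounted before combination.
FRAME = frozenset({"A", "B"})
m1 = {frozenset({"A"}): 0.9, FRAME: 0.1}
m2 = {frozenset({"B"}): 0.9, FRAME: 0.1}
fused = dempster(m1, discount(m2, 0.5, FRAME))
```

Discounting the less reliable classifier shifts the fused belief toward class A, which is exactly the conflict-reduction effect the relative-reliability step is designed to achieve.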

  18. Characterizing reliability in a product/process design-assurance program

    Energy Technology Data Exchange (ETDEWEB)

    Kerscher, W.J. III [Delphi Energy and Engine Management Systems, Flint, MI (United States); Booker, J.M.; Bement, T.R.; Meyer, M.A. [Los Alamos National Lab., NM (United States)

    1997-10-01

Over the years many advanced techniques in the area of reliability engineering have surfaced in the military sphere of influence, and one of these techniques is Reliability Growth Testing (RGT). Private industry has reviewed RGT as part of the solution to its reliability concerns, but many practical considerations have slowed its implementation. Its objective is to demonstrate the reliability requirement of a new product with a specified confidence. This paper speaks directly to that objective but discusses a somewhat different approach to achieving it. Rather than conducting testing as a continuum and developing statistical confidence bands around the results, this Bayesian updating approach starts with a reliability estimate characterized by large uncertainty and then proceeds to reduce the uncertainty by folding in fresh information in a Bayesian framework.
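The updating mechanism can be sketched with the conjugate Beta-Binomial model: a diffuse Beta prior encodes the initial, highly uncertain reliability estimate, and each batch of pass/fail test results tightens it. This is an illustrative sketch with hypothetical test data; the paper's actual prior structure is not specified here.

```python
def beta_update(a, b, successes, failures):
    """Fold pass/fail demonstration-test results into a Beta(a, b)
    prior on reliability; conjugacy keeps the posterior Beta."""
    return a + successes, b + failures

def beta_mean_var(a, b):
    """Mean and variance of a Beta(a, b) distribution."""
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1.0))
    return mean, var

# Start from the diffuse Beta(1, 1) (uniform) prior, then fold in a
# test campaign of 48 successes and 2 failures (hypothetical data).
prior = (1.0, 1.0)
post = beta_update(*prior, successes=48, failures=2)
m0, v0 = beta_mean_var(*prior)
m1, v1 = beta_mean_var(*post)   # uncertainty shrinks: v1 << v0
```

Each further test campaign is folded in the same way, so the posterior variance shrinks monotonically as fresh information arrives, which is the sense in which the approach "reduces the uncertainty".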

  19. Dynamic reliability of digital-based transmitters

    Energy Technology Data Exchange (ETDEWEB)

    Brissaud, Florent, E-mail: florent.brissaud.2007@utt.f [Institut National de l' Environnement Industriel et des Risques (INERIS), Parc Technologique Alata, BP 2, 60550 Verneuil-en-Halatte (France) and Universite de Technologie de Troyes - UTT, Institut Charles Delaunay - ICD and UMR CNRS 6279 STMR, 12 rue Marie Curie, BP 2060, 10010 Troyes Cedex (France); Smidts, Carol [Ohio State University (OSU), Nuclear Engineering Program, Department of Mechanical Engineering, Scott Laboratory, 201 W 19th Ave, Columbus OH 43210 (United States); Barros, Anne; Berenguer, Christophe [Universite de Technologie de Troyes (UTT), Institut Charles Delaunay (ICD) and UMR CNRS 6279 STMR, 12 rue Marie Curie, BP 2060, 10010 Troyes Cedex (France)

    2011-07-15

    Dynamic reliability explicitly handles the interactions between the stochastic behaviour of system components and the deterministic behaviour of process variables. While dynamic reliability provides a more efficient and realistic way to perform probabilistic risk assessment than 'static' approaches, its industrial-level applications are still limited. Factors contributing to this situation are the inherent complexity of the theory and the lack of a generic platform. More recently, the increased use of digital-based systems has also introduced additional modelling challenges related to specific interactions between system components. Typical examples are the 'intelligent transmitters' which are able to exchange information and to perform internal data processing and advanced functionalities. To make a contribution to solving these challenges, the mathematical framework of dynamic reliability is extended to handle the data and information which are processed and exchanged between system components. Stochastic deviations that may affect system properties are also introduced to enhance the modelling of failures. A formalized Petri net approach is then presented to perform the corresponding reliability analyses using numerical methods. Following this formalism, a versatile model for the dynamic reliability modelling of digital-based transmitters is proposed. Finally, the framework's flexibility and effectiveness are demonstrated on a substantial case study involving a simplified model of a nuclear fast reactor.

  20. Exploitation examination of reliability of coal dust systems

    International Nuclear Information System (INIS)

    Dojchinovski, Ilija; Trajkovski, Kole

    1997-01-01

    Designers and operators alike want long, failure-free operation of every system at its design parameters. The start-up time of a system is always known, but how long the system will operate successfully is not. This article therefore presents a step-by-step method for determining the reliability of a system. Reliability parameters are obtained from experimental and operational data. Once the reliability parameters have been determined, it is easy to compare the reliability of similar systems, for example excavators, or of different systems, such as truck and conveyor-belt transport systems. The theory of reliability finds practical use in procurement, where manufacturers must determine and present reliability parameters so that the buyer can decide which system best satisfies the quality-price-reliability trade-off. Reliability can also be used in system operation, where: 1) system reliability is maintained through proper start-up, use and shutdown of the system; 2) system reliability is maintained through a good maintenance organization; 3) system reliability is maintained through innovations and improvements aimed at removing the imperfections revealed during operation. Reliability is a very important parameter in power generation plants. (Author)

  1. Decision Making under Ecological Regime Shift: An Experimental Economic Approach

    OpenAIRE

    Kawata, Yukichika

    2011-01-01

    Environmental economics postulates the assumption of homo economicus and presumes that externality occurs as a result of the rational economic activities of economic agents. This paper examines this assumption using an experimental economic approach in the context of regime shift, which has been receiving increasing attention. We observe that when externality does not exist, economic agents (subjects of experiment) act economically rationally, but when externality exists, economic agents avoi...

  2. Dynamic reliability networks with self-healing units

    International Nuclear Information System (INIS)

    Jenab, K.; Seyed Hosseini, S.M.; Dhillon, B.S.

    2008-01-01

    This paper presents an analytical approach for dynamic reliability networks used for the failure limit strategy in maintenance optimization. The proposed approach utilizes the moment generating function (MGF) and the flow-graph concept to depict the functional and reliability diagrams of a system comprising series, parallel or mixed configurations of self-healing units. The self-healing unit is characterized by embedded failure detection and recovery mechanisms, represented by a self-loop in the flow-graph network. The newly developed analytical approach provides the probability of system failure and time-to-failure data, i.e., the mean and standard deviation of time-to-failure, used for maintenance optimization.

  3. Study of Photovoltaic Energy Storage by Supercapacitors through Both Experimental and Modelling Approaches

    Directory of Open Access Journals (Sweden)

    Pierre-Olivier Logerais

    2013-01-01

    Full Text Available The storage of photovoltaic energy by supercapacitors is studied using two approaches. An overview of the integration of supercapacitors in solar energy conversion systems is first provided. In the first approach, an experimental charge/discharge setup in which supercapacitors are fed by a photovoltaic array was built and operated with fine-grained data acquisition. The second approach consists in simulating photovoltaic energy storage by supercapacitors with a faithful and accessible model composed of solar irradiance evaluation, an equivalent electrical circuit for photovoltaic conversion, and a multibranch circuit for the supercapacitor. The experimental and calculated results are compared, and an error of 1% on the stored energy is found with a correction largely within ±10% of the transmission line capacitance according to temperature.

  4. Reliability assessment based on subjective inferences

    International Nuclear Information System (INIS)

    Ma Zhibo; Zhu Jianshi; Xu Naixin

    2003-01-01

    The reliability information which comes from subjective analysis is often an incomplete prior. This information can generally be assumed to exist in the form of either a stated prior mean of R (reliability) or a stated prior credibility interval on R. An efficient approach is developed to determine a complete beta prior distribution from the subjective information according to the principle of maximum entropy, and the reliability of a survival/failure product is then assessed via Bayes' theorem. Numerical examples are presented to illustrate the methods.
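For one of the two forms of prior information mentioned here — a stated prior mean together with a credibility statement — a Beta prior consistent with both can be found numerically. The sketch below is not the paper's maximum-entropy construction; it simply bisects on the Beta shape parameter, with invented stated values (prior mean 0.9 and 90% credibility that R exceeds 0.8).

```python
# Fit a Beta(a, b) prior to a stated prior mean and a stated credibility
# P(R >= lower) = cred. Stated values are hypothetical illustrations.
import math

def beta_logpdf(x, a, b):
    return ((a - 1) * math.log(x) + (b - 1) * math.log(1 - x)
            + math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b))

def prob_above(t, a, b, n=4000):
    """P(R >= t) for R ~ Beta(a, b), midpoint-rule integration on [t, 1]."""
    h = (1.0 - t) / n
    return h * sum(math.exp(beta_logpdf(t + (i + 0.5) * h, a, b))
                   for i in range(n))

def solve_beta_prior(mean, lower, cred, lo=9.0, hi=200.0):
    """Bisect on the shape parameter a (b then follows from the stated mean)
    until P(R >= lower) matches the stated credibility. Larger a means a
    prior more concentrated around the mean. lo is chosen so that b >= 1."""
    for _ in range(60):
        a = 0.5 * (lo + hi)
        b = a * (1.0 - mean) / mean
        if prob_above(lower, a, b) < cred:
            lo = a  # not yet concentrated enough
        else:
            hi = a
    return a, b

a, b = solve_beta_prior(mean=0.9, lower=0.8, cred=0.9)
print(f"Beta({a:.2f}, {b:.2f}) matches the stated prior information")
```

The resulting Beta prior can then be updated with survival/failure data via the usual conjugate Bayes step.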

  5. Distinguishing between forensic science and forensic pseudoscience: testing of validity and reliability, and approaches to forensic voice comparison.

    Science.gov (United States)

    Morrison, Geoffrey Stewart

    2014-05-01

    In this paper it is argued that one should not attempt to directly assess whether a forensic analysis technique is scientifically acceptable. Rather one should first specify what one considers to be appropriate principles governing acceptable practice, then consider any particular approach in light of those principles. This paper focuses on one principle: the validity and reliability of an approach should be empirically tested under conditions reflecting those of the case under investigation using test data drawn from the relevant population. Versions of this principle have been key elements in several reports on forensic science, including forensic voice comparison, published over the last four-and-a-half decades. The aural-spectrographic approach to forensic voice comparison (also known as "voiceprint" or "voicegram" examination) and the currently widely practiced auditory-acoustic-phonetic approach are considered in light of this principle (these two approaches do not appear to be mutually exclusive). Approaches based on data, quantitative measurements, and statistical models are also considered in light of this principle.

  6. Reliability-based design optimization using a generalized subset simulation method and posterior approximation

    Science.gov (United States)

    Ma, Yuan-Zhuo; Li, Hong-Shuang; Yao, Wei-Xing

    2018-05-01

    The evaluation of the probabilistic constraints in reliability-based design optimization (RBDO) problems has always been significant and challenging work, which strongly affects the performance of RBDO methods. This article deals with RBDO problems using a recently developed generalized subset simulation (GSS) method and a posterior approximation approach. The posterior approximation approach is used to transform all the probabilistic constraints into ordinary constraints as in deterministic optimization. The assessment of multiple failure probabilities required by the posterior approximation approach is achieved by GSS in a single run at all supporting points, which are selected by a proper experimental design scheme combining Sobol' sequences and Bucher's design. Sequentially, the transformed deterministic design optimization problem can be solved by optimization algorithms, for example, the sequential quadratic programming method. Three optimization problems are used to demonstrate the efficiency and accuracy of the proposed method.
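A probabilistic constraint of the kind transformed here is a failure probability P[g(X) < 0] for a limit-state function g. As a baseline illustration — plain Monte Carlo rather than the paper's generalized subset simulation, on a hypothetical limit state with invented random-variable parameters:

```python
# Crude Monte Carlo estimate of a probabilistic constraint P[g(X) < 0].
# The limit-state function and distributions are hypothetical.
import random

def g(x1, x2):
    """Hypothetical limit-state function: failure when g < 0."""
    return x1 * x1 * x2 / 20.0 - 1.0

def failure_probability(n=200_000, seed=1):
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n)
                if g(rng.gauss(3.5, 0.3), rng.gauss(2.0, 0.3)) < 0.0)
    return fails / n

pf = failure_probability()
print(f"estimated failure probability: {pf:.4f}")
```

Methods like subset simulation exist precisely because this crude estimator needs enormous sample sizes once the failure probability becomes small.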

  7. A generic Approach for Reliability Predictions considering non-uniformly Deterioration Behaviour

    International Nuclear Information System (INIS)

    Krause, Jakob; Kabitzsch, Klaus

    2012-01-01

    Predictive maintenance offers the possibility of prognosticating the remaining time until a maintenance action on a machine has to be scheduled. Unfortunately, current predictive maintenance solutions are only suitable for very specific use cases, such as reliability predictions based on vibration monitoring. Furthermore, they do not consider the fact that machines may deteriorate non-uniformly, depending on external influences (e.g., the work piece material in a milling machine or the changing fruit acid concentration in a bottling plant). In this paper two concepts for a generic predictive maintenance solution which also considers non-uniform aging behaviour are introduced. The first concept is based on system models representing the health state of a technical system. As these models are usually static (i.e., without a time dimension), their coefficients are determined periodically and the resulting time series is used as an aging indicator. The second concept focuses on external influences (contexts) which change the behaviour of the previously mentioned aging indicators, in order to increase the accuracy of reliability predictions. To this end, context-dependent time series models are determined and used to predict machine reliability. Both concepts were evaluated on data from an air ventilation system. It could be shown that they are suitable for determining aging indicators in a generic way and for incorporating external influences into the reliability prediction. Through this, the quality of reliability predictions can be significantly increased, which in practice leads to more accurate scheduling of maintenance actions. Furthermore, the generic character of the solutions makes the concepts suitable for a wide range of aging processes.
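The aging-indicator idea — track a periodically re-estimated model coefficient over time and predict when it crosses a failure threshold — can be sketched in its simplest, context-free form as a linear extrapolation of the indicator time series. The data, threshold, and linear trend below are synthetic; the paper's context-dependent time-series models are more elaborate.

```python
# Linear extrapolation of a synthetic aging-indicator time series to a
# failure threshold, yielding a remaining-useful-life (RUL) estimate.

def linear_fit(ts, ys):
    """Ordinary least-squares line through (t, y) points."""
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    slope = (sum((t - mt) * (y - my) for t, y in zip(ts, ys))
             / sum((t - mt) ** 2 for t in ts))
    return slope, my - slope * mt

ts = list(range(10))                # inspection times
ys = [1.00 + 0.05 * t for t in ts]  # drifting model coefficient (aging indicator)
slope, intercept = linear_fit(ts, ys)
threshold = 2.0                     # indicator level at which maintenance is due
rul = (threshold - intercept) / slope - ts[-1]
print(f"predicted remaining useful life: {rul:.1f} time units")
```

Incorporating contexts, as the second concept proposes, would amount to fitting a separate (or covariate-dependent) trend per operating condition instead of this single global line.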

  8. Experimental approaches for evaluating the invasion risk of biofuel crops

    International Nuclear Information System (INIS)

    Luke Flory, S; Sollenberger, Lynn E; Lorentz, Kimberly A; Gordon, Doria R

    2012-01-01

    There is growing concern that non-native plants cultivated for bioenergy production might escape and result in harmful invasions in natural areas. Literature-derived assessment tools used to evaluate invasion risk are beneficial for screening, but cannot be used to assess novel cultivars or genotypes. Experimental approaches are needed to help quantify invasion risk but protocols for such tools are lacking. We review current methods for evaluating invasion risk and make recommendations for incremental tests from small-scale experiments to widespread, controlled introductions. First, local experiments should be performed to identify conditions that are favorable for germination, survival, and growth of candidate biofuel crops. Subsequently, experimental introductions in semi-natural areas can be used to assess factors important for establishment and performance such as disturbance, founder population size, and timing of introduction across variable habitats. Finally, to fully characterize invasion risk, experimental introductions should be conducted across the expected geographic range of cultivation over multiple years. Any field-based testing should be accompanied by safeguards and monitoring for early detection of spread. Despite the costs of conducting experimental tests of invasion risk, empirical screening will greatly improve our ability to determine if the benefits of a proposed biofuel species outweigh the projected risks of invasions. (letter)

  9. Semi-Automatic Registration of Airborne and Terrestrial Laser Scanning Data Using Building Corner Matching with Boundaries as Reliability Check

    Directory of Open Access Journals (Sweden)

    Liang Cheng

    2013-11-01

    Full Text Available Data registration is a prerequisite for the integration of multi-platform laser scanning in various applications. A new approach is proposed for the semi-automatic registration of airborne and terrestrial laser scanning data with buildings without eaves. Firstly, an automatic calculation procedure for thresholds in density of projected points (DoPP method is introduced to extract boundary segments from terrestrial laser scanning data. A new algorithm, using a self-extending procedure, is developed to recover the extracted boundary segments, which then intersect to form the corners of buildings. The building corners extracted from airborne and terrestrial laser scanning are reliably matched through an automatic iterative process in which boundaries from two datasets are compared for the reliability check. The experimental results illustrate that the proposed approach provides both high reliability and high geometric accuracy (average error of 0.44 m/0.15 m in horizontal/vertical direction for corresponding building corners for the final registration of airborne laser scanning (ALS and tripod mounted terrestrial laser scanning (TLS data.

  10. Proceedings of the SRESA national conference on reliability and safety engineering

    International Nuclear Information System (INIS)

    Varde, P.V.; Vaishnavi, P.; Sujatha, S.; Valarmathi, A.

    2014-01-01

    The objective of this conference was to provide a forum for technical discussions on recent developments in the area of risk-based approaches and Prognostic Health Management of critical systems in decision making. The reliability and safety engineering methods are concerned with the ways in which a product can fail and with the effects of failure; the aim is to understand how a product works and to assure acceptable levels of safety. Reliability engineering addresses all the anticipated, and possibly unanticipated, causes of failure to ensure that the occurrence of failure is prevented or minimized. The topics discussed in the conference were: Reliability in Engineering Design, Safety Assessment and Management, Reliability Analysis and Assessment, Stochastic Petri Nets for Reliability Modeling, Dynamic Reliability, Reliability Prediction, Hardware Reliability, Software Reliability in Safety Critical Issues, Probabilistic Safety Assessment, Risk Informed Approach, Dynamic Models for Reliability Analysis, Reliability Based Design and Analysis, Prognostics and Health Management, Remaining Useful Life (RUL), Human Reliability Modeling, Risk Based Applications, Hazard and Operability Study (HAZOP), Reliability in Network Security, and Quality Assurance and Management. The papers relevant to INIS are indexed separately

  11. Prediction of software operational reliability using testing environment factor

    International Nuclear Information System (INIS)

    Jung, Hoan Sung

    1995-02-01

    Software reliability is especially important to customers these days. The need to quantify the software reliability of safety-critical systems has received very special attention, and reliability is rated as one of software's most important attributes. Since software is an intellectual product of human activity, and since it is logically complex, failures are inevitable. No standard models have been established to prove correctness and to estimate the reliability of software systems by analysis and/or testing. For many years, research has focused on the quantification of software reliability, and many models have been developed to quantify it. Most software reliability models estimate reliability using the failure data collected during testing, assuming that the test environment well represents the operational profile. The user's interest, however, is in the operational reliability rather than the test reliability. Experience shows that operational reliability is higher than test reliability. Under the assumption that the difference in reliability results from the change of environment, a testing environment factor, comprising an aging factor and a coverage factor, is defined in this work to predict the ultimate operational reliability from the failure data. This is done by incorporating test environments applied beyond the operational profile into the testing environment factor. Test reliability can also be estimated with this approach without any model change. The application results are close to the actual data. The approach used in this thesis is expected to be applicable to ultra-high-reliability software systems that are used in nuclear power plants, airplanes, and other safety-critical applications

  12. Effectiveness of different approaches to disseminating traveler information on travel time reliability.

    Science.gov (United States)

    2014-01-01

    The second Strategic Highway Research Program (SHRP 2) Reliability program aims to improve trip time reliability by reducing the frequency and effects of events that cause travel times to fluctuate unpredictably. Congestion caused by unreliable, or n...

  13. The REPAS approach to the evaluation of passive safety systems reliability

    International Nuclear Information System (INIS)

    Bianchi, F.; Burgazzi, L.; D'Auria, F.; Ricotti, M.E.

    2002-01-01

    The scope of this research, carried out by ENEA in collaboration with the University of Pisa and the Polytechnic of Milano since 1999, is the identification of a methodology allowing the evaluation of the reliability of passive systems as a whole, in a more physical and phenomenological way. The paper describes the study, named REPAS (Reliability Evaluation of Passive Safety systems), carried out by the partners and aimed at the development and validation of such a procedure. The strategy starts from the consideration that a passive system should be theoretically more reliable than an active one. In fact it does not need any external input or energy to operate and it relies only upon natural physical laws (e.g. gravity, natural circulation, internally stored energy, etc.) and/or 'intelligent' use of the energy inherently available in the system (e.g. chemical reaction, decay heat, etc.). Nevertheless the passive system may fail its mission not only as a consequence of classical mechanical failure of components, but also through deviation from the expected behaviour, due to physical phenomena mainly related to thermal-hydraulics or to different boundary and initial conditions. The main sources of physical failure are identified and a probability of occurrence is assigned. The reliability analysis is performed on a passive system which operates in two-phase natural circulation. The selected system is a loop including a heat source and a heat sink where condensation occurs. The system behaviour under different configurations has been simulated via a best-estimate code (Relap5 mod3.2). The results are shown and can be treated in such a way as to give qualitative and quantitative information on the system reliability. Main routes for development of the methodology are also depicted. The analysis of the results shows that the procedure is suitable for evaluating the performance of a passive system on a probabilistic/deterministic basis. Important information can also be

  14. Vanadium supersaturated silicon system: a theoretical and experimental approach

    Science.gov (United States)

    Garcia-Hemme, Eric; García, Gregorio; Palacios, Pablo; Montero, Daniel; García-Hernansanz, Rodrigo; Gonzalez-Diaz, Germán; Wahnon, Perla

    2017-12-01

    The effect of high dose vanadium ion implantation and pulsed laser annealing on the crystal structure and sub-bandgap optical absorption features of V-supersaturated silicon samples has been studied through the combination of experimental and theoretical approaches. Interest in V-supersaturated Si focusses on its potential as a material having a new band within the Si bandgap. Rutherford backscattering spectrometry measurements and formation energies computed through quantum calculations provide evidence that V atoms are mainly located at interstitial positions. The response of sub-bandgap spectral photoconductance is extended far into the infrared region of the spectrum. Theoretical simulations (based on density functional theory and many-body perturbation in GW approximation) bring to light that, in addition to V atoms at interstitial positions, Si defects should also be taken into account in explaining the experimental profile of the spectral photoconductance. The combination of experimental and theoretical methods provides evidence that the improved spectral photoconductance up to 6.2 µm (0.2 eV) is due to new sub-bandgap transitions, for which the new band due to V atoms within the Si bandgap plays an essential role. This enables the use of V-supersaturated silicon in the third generation of photovoltaic devices.

  15. Reliability of Beam Loss Monitor Systems for the Large Hadron Collider

    International Nuclear Information System (INIS)

    Guaglio, G.; Dehning, B.; Santoni, C.

    2005-01-01

    The increase of beam energy and beam intensity, together with the use of superconducting magnets, opens new failure scenarios and brings new criticalities for the whole accelerator protection system. For the LHC beam loss protection system, the failure rate and the availability requirements have been evaluated using the Safety Integrity Level (SIL) approach. A downtime cost evaluation is used as input for the SIL approach. The most critical systems which contribute to the final SIL value are the dump system, the interlock system, the beam loss monitors system, and the energy monitor system. The Beam Loss Monitors System (BLMS) is critical for short and intense particle losses at 7 TeV and is assisted by the Fast Beam Current Decay Monitors at 450 GeV. At medium and longer loss times it is assisted by other systems, such as the quench protection system and the cryogenic system. For BLMS, hardware and software have been evaluated in detail. The reliability input figures have been collected using historical data from the SPS, using temperature and radiation damage experimental data, as well as using standard databases. All the data have been processed by reliability software (Isograph). The analysis spans from the component data to the system configuration

  16. Reliability of Beam Loss Monitor Systems for the Large Hadron Collider

    Science.gov (United States)

    Guaglio, G.; Dehning, B.; Santoni, C.

    2005-06-01

    The increase of beam energy and beam intensity, together with the use of superconducting magnets, opens new failure scenarios and brings new criticalities for the whole accelerator protection system. For the LHC beam loss protection system, the failure rate and the availability requirements have been evaluated using the Safety Integrity Level (SIL) approach. A downtime cost evaluation is used as input for the SIL approach. The most critical systems which contribute to the final SIL value are the dump system, the interlock system, the beam loss monitors system, and the energy monitor system. The Beam Loss Monitors System (BLMS) is critical for short and intense particle losses at 7 TeV and is assisted by the Fast Beam Current Decay Monitors at 450 GeV. At medium and longer loss times it is assisted by other systems, such as the quench protection system and the cryogenic system. For BLMS, hardware and software have been evaluated in detail. The reliability input figures have been collected using historical data from the SPS, using temperature and radiation damage experimental data, as well as using standard databases. All the data have been processed by reliability software (Isograph). The analysis spans from the component data to the system configuration.

  17. Experimental approach to explosive nucleosynthesis

    International Nuclear Information System (INIS)

    Kubono, S.

    1991-07-01

    Recent developments in experimental studies of explosive nucleosynthesis, especially the rapid proton process and primordial nucleosynthesis, were discussed with emphasis on unstable nuclei. New developments in the experimental methods for nuclear astrophysics that use unstable nuclear beams are also discussed. (author)

  18. Reliability of nuclear power plants and equipment

    International Nuclear Information System (INIS)

    1985-01-01

    The standard sets the general principles, a list of reliability indexes, and the requirements for their selection. Reliability indexes of nuclear power plants include the simple indexes of fail-safe operation, life and maintainability, and of storage capability. All terms and notions are explained, and the methods of evaluating the indexes are briefly listed: statistical, and computational-experimental. The dates when the standard comes into force in the individual CMEA countries are given. (M.D.)

  19. A DFT+nonhomogeneous DMFT approach for finite systems

    International Nuclear Information System (INIS)

    Kabir, Alamgir; Turkowski, Volodymyr; Rahman, Talat S

    2015-01-01

    For reliable and efficient inclusion of electron–electron correlation effects in nanosystems we formulate a combined density functional theory/nonhomogeneous dynamical mean-field theory (DFT+DMFT) approach which employs an approximate iterated perturbation theory impurity solver. We further apply the method to examine the size-dependent magnetic properties of iron nanoparticles containing 11–100 atoms. We show that for the majority of clusters the DFT+DMFT solution is in very good agreement with experimental data, much better than the DFT and DFT+U results. In particular, it reproduces the oscillations in magnetic moment with size as observed experimentally. We thus demonstrate that the DFT+DMFT approach can be used for an accurate and realistic description of nanosystems containing about a hundred atoms. (paper)

  20. Proposed Reliability/Cost Model

    Science.gov (United States)

    Delionback, L. M.

    1982-01-01

    New technique estimates cost of improvement in reliability for complex system. Model format/approach is dependent upon use of subsystem cost-estimating relationships (CER's) in devising cost-effective policy. Proposed methodology should have application in broad range of engineering management decisions.

  1. Peltier cells as temperature control elements: Experimental characterization and modeling

    International Nuclear Information System (INIS)

    Mannella, Gianluca A.; La Carrubba, Vincenzo; Brucato, Valerio

    2014-01-01

    The use of Peltier cells to realize compact and precise temperature-controlled devices has been continuously extended in recent years. In order to support the design of temperature control systems, a simplified model of the heat transfer dynamics of thermoelectric devices is presented. Following a macroscopic approach, the heat flux removed at the cold side of a Peltier cell can be expressed as Q̇_c = γ(T_c − T_c^eq), where γ is a coefficient dependent on the electric current, and T_c and T_c^eq are the actual and steady-state cold-side temperatures, respectively. On the other hand, a microscopic modelling approach was pursued via finite element analysis software packages. To validate the models, an experimental apparatus was designed and built, consisting of a sample vial with its surfaces in direct contact with Peltier cells. Both modelling approaches led to reliable prediction of the transient and steady-state sample temperature. -- Highlights: • Simplified modelling of heat transfer dynamics in Peltier cells. • Coupled macroscopic and microscopic approaches. • Experimental apparatus: temperature control of a sample vial. • Both modelling approaches predict accurately the transient and steady-state sample temperature
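The macroscopic model makes the heat flux removed at the cold side proportional to the gap between the actual and steady-state cold-side temperatures. Under the additional assumption of a lumped thermal mass C, this gives the first-order relaxation dT_c/dt = −(γ/C)(T_c − T_c^eq), sketched below by forward-Euler integration. The values of γ, C, and the temperatures are illustrative, not parameters fitted in the paper.

```python
# Forward-Euler integration of the lumped macroscopic Peltier model:
# dT_c/dt = -(gamma / C) * (T_c - T_c^eq). All parameter values are
# illustrative assumptions, not fitted values from the paper.

def simulate_cold_side(t0, t_eq, gamma=0.8, heat_cap=40.0, dt=0.1, t_end=300.0):
    temps, temp = [], t0
    for _ in range(int(t_end / dt)):
        temp += -(gamma / heat_cap) * (temp - t_eq) * dt
        temps.append(temp)
    return temps

trace = simulate_cold_side(t0=25.0, t_eq=5.0)
print(f"cold-side temperature after 300 s: {trace[-1]:.2f} °C")
```

With these numbers the time constant is C/γ = 50 s, so after 300 s the trace has essentially relaxed to the steady-state temperature.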

  2. Characteristics and application study of AP1000 NPPs equipment reliability classification method

    International Nuclear Information System (INIS)

    Guan Gao

    2013-01-01

    The AP1000 nuclear power plant applies an integrated approach to establish equipment reliability classification, which includes probabilistic risk assessment techniques, the maintenance rule administrative process, power production reliability classification and the functional equipment group bounding method, and eventually classifies equipment reliability into four levels. This classification process and its results are very different from those of classical RCM and streamlined RCM. This paper studies the characteristics of the AP1000 equipment reliability classification approach, argues that equipment reliability classification should effectively support maintenance strategy development and work process control, and recommends using a combined RCM method to establish the future equipment reliability programs of AP1000 nuclear power plants. (authors)

  3. Rapid, Reliable Shape Setting of Superelastic Nitinol for Prototyping Robots.

    Science.gov (United States)

    Gilbert, Hunter B; Webster, Robert J

    Shape setting Nitinol tubes and wires in a typical laboratory setting for use in superelastic robots is challenging. Obtaining samples that remain superelastic and exhibit desired precurvatures currently requires many iterations, which is time consuming and consumes a substantial amount of Nitinol. To provide a more accurate and reliable method of shape setting, in this paper we propose an electrical technique that uses Joule heating to attain the necessary shape setting temperatures. The resulting high power heating prevents unintended aging of the material and yields consistent and accurate results for the rapid creation of prototypes. We present a complete algorithm and system together with an experimental analysis of temperature regulation. We experimentally validate the approach on Nitinol tubes that are shape set into planar curves. We also demonstrate the feasibility of creating general space curves by shape setting a helical tube. The system demonstrates a mean absolute temperature error of 10°C.

  4. Reliability assessment of restructured power systems using reliability network equivalent and pseudo-sequential simulation techniques

    International Nuclear Information System (INIS)

    Ding, Yi; Wang, Peng; Goel, Lalit; Billinton, Roy; Karki, Rajesh

    2007-01-01

    This paper presents a technique to evaluate reliability of a restructured power system with a bilateral market. The proposed technique is based on the combination of the reliability network equivalent and pseudo-sequential simulation approaches. The reliability network equivalent techniques have been implemented in the Monte Carlo simulation procedure to reduce the computational burden of the analysis. Pseudo-sequential simulation has been used to increase the computational efficiency of the non-sequential simulation method and to model the chronological aspects of market trading and system operation. Multi-state Markov models for generation and transmission systems are proposed and implemented in the simulation. A new load shedding scheme is proposed during generation inadequacy and network congestion to minimize the load curtailment. The IEEE reliability test system (RTS) is used to illustrate the technique. (author)

  5. [Study of the relationship between human quality and reliability].

    Science.gov (United States)

    Long, S; Wang, C; Wang, Li; Yuan, J; Liu, H; Jiao, X

    1997-02-01

    To clarify the relationship between human quality and reliability, 1925 experiments with 20 subjects were carried out to study the relationship between disposition character, digital memory, graphic memory, multi-reaction time and education level on the one hand and simulated aircraft operation on the other. The effects of task difficulty and environmental factors on human reliability were also studied. The results showed that human quality can be predicted and evaluated through experimental methods: the better the human quality, the higher the human reliability.

  6. A hybrid load flow and event driven simulation approach to multi-state system reliability evaluation

    International Nuclear Information System (INIS)

    George-Williams, Hindolo; Patelli, Edoardo

    2016-01-01

    Structural complexity of systems, coupled with their multi-state characteristics, renders their reliability and availability evaluation difficult. Notwithstanding the emergence of various techniques dedicated to complex multi-state system analysis, simulation remains the only approach applicable to realistic systems. However, most simulation algorithms are either system specific or limited to simple systems since they require enumerating all possible system states, defining the cut-sets associated with each state and monitoring their occurrence. In addition to being extremely tedious for large complex systems, state enumeration and cut-set definition require a detailed understanding of the system's failure mechanism. In this paper, a simple and generally applicable simulation approach, enhanced for multi-state systems of any topology is presented. Here, each component is defined as a Semi-Markov stochastic process and via discrete-event simulation, the operation of the system is mimicked. The principles of flow conservation are invoked to determine flow across the system for every performance level change of its components using the interior-point algorithm. This eliminates the need for cut-set definition and overcomes the limitations of existing techniques. The methodology can also be exploited to account for effects of transmission efficiency and loading restrictions of components on system reliability and performance. The principles and algorithms developed are applied to two numerical examples to demonstrate their applicability. - Highlights: • A discrete event simulation model based on load flow principles. • Model does not require system path or cut sets. • Applicable to binary and multi-state systems of any topology. • Supports multiple output systems with competing demand. • Model is intuitive and generally applicable.
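    The component model described above can be illustrated with a minimal sketch: one repairable multi-state component (full, derated, failed) simulated by discrete events, with exponential holding times as a simple special case of the semi-Markov model. The states, capacities and transition rates below are illustrative assumptions, not values from the paper, and the load-flow/interior-point part of the methodology is omitted.

```python
import random

# state -> list of (next_state, transition rate); illustrative values only
RATES = {
    2: [(1, 0.02), (0, 0.005)],   # full -> derated / failed
    1: [(2, 0.10), (0, 0.01)],    # derated -> repaired / failed
    0: [(2, 0.05)],               # failed -> repaired
}
CAPACITY = {2: 1.0, 1: 0.5, 0: 0.0}  # fractional output per state

def simulate(horizon=100000.0):
    """Discrete-event simulation of one multi-state component."""
    t, state = 0.0, 2
    delivered = 0.0
    while t < horizon:
        # competing exponential clocks: the earliest event wins
        events = [(random.expovariate(rate), nxt) for nxt, rate in RATES[state]]
        dwell, nxt = min(events)
        dwell = min(dwell, horizon - t)
        delivered += CAPACITY[state] * dwell
        t += dwell
        state = nxt
    return delivered / horizon  # long-run fractional output

random.seed(1)
availability = simulate()
print(round(availability, 3))
```

The long-run fractional output converges to the stationary expected capacity of the underlying stochastic process; extending this to a network of such components is where the paper's flow-conservation step comes in.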

  7. A unified approach to validation, reliability, and education study design for surgical technical skills training.

    Science.gov (United States)

    Sweet, Robert M; Hananel, David; Lawrenz, Frances

    2010-02-01

    To present modern educational psychology theory and apply these concepts to validity and reliability of surgical skills training and assessment. In a series of cross-disciplinary meetings, we applied a unified approach of behavioral science principles and theory to medical technical skills education given the recent advances in the theories in the field of behavioral psychology and statistics. While validation of the individual simulation tools is important, it is only one piece of a multimodal curriculum that in and of itself deserves examination and study. We propose concurrent validation throughout the design of simulation-based curriculum rather than once it is complete. We embrace the concept that validity and curriculum development are interdependent, ongoing processes that are never truly complete. Individual predictive, construct, content, and face validity aspects should not be considered separately but as interdependent and complementary toward an end application. Such an approach could help guide our acceptance and appropriate application of these exciting new training and assessment tools for technical skills training in medicine.

  8. Experimental Methods for the Analysis of Optimization Algorithms

    DEFF Research Database (Denmark)

    In operations research and computer science it is common practice to evaluate the performance of optimization algorithms on the basis of computational results, and the experimental approach should follow accepted principles that guarantee the reliability and reproducibility of results. However, computational experiments differ from those in other sciences, and the last decade has seen considerable methodological research devoted to understanding the particular features of such experiments and assessing the related statistical methods. This book consists of methodological contributions on different aspects of experimental work in algorithm design, statistical design, optimization and heuristics; most chapters provide theoretical background and are enriched with case studies. The book is written for researchers and practitioners in operations research and computer science who wish to improve the experimental assessment of their algorithms.

  9. Experimental approach to Chernobyl hot particles

    International Nuclear Information System (INIS)

    Tcherkezian, V.; Shkinev, V.; Khitrov, L.; Kolesov, G.

    1994-01-01

    An experimental approach to the investigation of Chernobyl hot particles and some results are presented in this study. Hot particles (HP) were picked out from soil samples collected during the 1986-1990 radiogeochemical expeditions in the contaminated zone (within 30 km of the Nuclear Power Plant). A number of hot particles were studied to estimate their contribution to the total activity, investigate their surface morphology and determine the size distribution. Hot particles contribution to the total activity in the 30 km zone was found to be not less than 65%. Investigation of HP element composition (by neutron activation analysis and EPMA) and radionuclide composition (direct alpha- and gamma-spectrometry, including determination of Pu and Am in Hp) revealed certain peculiarities of HP, collected in the vicinity of the damaged Nuclear Power Plant. Some particles were shown to contain uranium and fission products in proportion to one another, correlating with those in the partially burnt fuel, which proves their 'fuel' origin. Another part of the HP samples has revealed element fractionation as well as the presence of some terrestrial components. (Author)

  10. Issues in cognitive reliability

    International Nuclear Information System (INIS)

    Woods, D.D.; Hitchler, M.J.; Rumancik, J.A.

    1984-01-01

    This chapter examines some problems in current methods to assess reactor operator reliability at cognitive tasks and discusses new approaches to solve these problems. The two types of human failures are errors in the execution of an intention and errors in the formation/selection of an intention. Topics considered include the types of description, error correction, cognitive performance and response time, the speed-accuracy tradeoff function, function-based task analysis, and cognitive task analysis. One problem of human reliability analysis (HRA) techniques in general is the question of what the units of behavior are whose reliability is to be determined. A second problem for HRA is that people often detect and correct their errors. The use of function-based analysis, which maps the problem space for plant control, is recommended.

  11. The DYLAM approach to systems safety and reliability assessment

    International Nuclear Information System (INIS)

    Amendola, A.

    1988-01-01

    A survey of the principal features and applications of DYLAM (Dynamic Logical Analytical Methodology) is presented. Its basic principles can be summarized as follows: after a particular modelling of the component states, computerized heuristic procedures generate stochastic configurations of the system, while the resulting physical processes are simultaneously simulated, both to account for the possible interactions between the physics and the states and to search for dangerous system configurations and their related probabilities. The association of probabilistic techniques for describing the states with physical equations for describing the process results in a very powerful tool for the safety and reliability assessment of systems potentially subject to dangerous incidental transients. A comprehensive picture of DYLAM's capability for manifold applications can be obtained from a review of the case studies analyzed (LMFBR core accident, systems reliability assessment, accident simulation, man-machine interaction analysis, chemical reactor safety, etc.)

  12. Control and Reliability of Optical Networks in Multiprocessors

    Science.gov (United States)

    Olsen, James Jonathan

    1993-01-01

    Optical communication links have great potential to improve the performance of interconnection networks within large parallel multiprocessors, but the problems of semiconductor laser drive control and reliability inhibit their wide use. These problems have been solved in the telecommunications context, but the telecommunications solutions, based on a small number of links, are often too bulky, complex, power-hungry, and expensive to be feasible for use in a multiprocessor network with thousands of optical links. The main problems with the telecommunications approaches are that they are, by definition, designed for long-distance communication and therefore deal with communications links in isolation, instead of in an overall systems context. By taking a system-level approach to solving the laser reliability problem in a multiprocessor, and by exploiting the short-distance nature of the links, one can achieve small, simple, low-power, and inexpensive solutions, practical for implementation in the thousands of optical links that might be used in a multiprocessor. Through modeling and experimentation, I demonstrate that such system-level solutions exist, and are feasible for use in a multiprocessor network. I divide semiconductor laser reliability problems into two classes: transient errors and hard failures, and develop solutions to each type of problem in the context of a large multiprocessor. I find that for transient errors, the computer system would require a very low bit-error-rate (BER), such as 10^{-23}, if no provision were made for error control. Optical links cannot achieve such rates directly, but I find that a much more reasonable link-level BER (such as 10^{-7}) would be acceptable with simple error detection coding. I then propose a feedback system that will enable lasers to achieve these error levels even when laser threshold current varies. Instead of telecommunications techniques, which require laser output power monitors, I describe a software

  13. Nonparametric predictive inference in reliability

    International Nuclear Information System (INIS)

    Coolen, F.P.A.; Coolen-Schrijner, P.; Yan, K.J.

    2002-01-01

    We introduce a recently developed statistical approach, called nonparametric predictive inference (NPI), to reliability. Bounds for the survival function of a future observation are presented. We illustrate how NPI can deal with right-censored data, and discuss aspects of competing risks. We present possible applications of NPI for Bernoulli data, and we briefly outline applications of NPI for replacement decisions. The emphasis is on the introduction and illustration of NPI in reliability contexts; detailed mathematical justifications are presented elsewhere.
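    For complete (uncensored) data, the NPI bounds on the survival function of the next observation can be sketched directly from Hill's assumption A_(n): the next value falls in each of the n+1 intervals formed by the ordered data with probability 1/(n+1). The handling of right-censored data and competing risks mentioned above is more involved and omitted here; the failure times are made up for illustration.

```python
def npi_survival_bounds(data, t):
    """Lower and upper NPI bounds for P(X_{n+1} > t), complete data."""
    n = len(data)
    above = sum(1 for x in data if x > t)
    lower = above / (n + 1)                     # mass certainly above t
    upper = min((above + 1) / (n + 1), 1.0)     # plus the interval containing t
    return lower, upper

failure_times = [12.0, 17.5, 21.0, 30.2, 44.8]  # hypothetical data
lower_b, upper_b = npi_survival_bounds(failure_times, 20.0)
print(lower_b, upper_b)  # 2 of the 6 intervals are ambiguous-free above t
```

The gap between the bounds (here 1/(n+1)) shrinks as more data are observed, which is the imprecise-probability flavour of NPI.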

  14. Breaking Through the Glass Ceiling: Recent Experimental Approaches to Probe the Properties of Supercooled Liquids near the Glass Transition.

    Science.gov (United States)

    Smith, R Scott; Kay, Bruce D

    2012-03-15

    Experimental measurements of the properties of supercooled liquids at temperatures near their glass transition temperatures, Tg, are requisite for understanding the behavior of glasses and amorphous solids. Unfortunately, many supercooled molecular liquids rapidly crystallize at temperatures far above their Tg, making such measurements difficult to nearly impossible. In this Perspective, we discuss some recent alternative approaches to obtain experimental data in the temperature regime near Tg. These new approaches may yield the additional experimental data necessary to test current theoretical models of the dynamical slowdown that occurs in supercooled liquids approaching the glass transition.

  15. Reliability of power system with open access

    International Nuclear Information System (INIS)

    Ehsani, A.; Ranjbar, A. M.; Fotuhi Firuzabad, M.; Ehsani, M.

    2003-01-01

    In many countries, the electric utility industry is currently undergoing considerable changes in its structure and regulation. The thrust towards privatization and deregulation or re-regulation of the electric utility industry will clearly introduce numerous reliability problems that require new criteria and analytical tools recognizing the residual uncertainties of the new environment. In this paper, the different risks and uncertainties in competitive electricity markets are briefly introduced; the approaches of customers, operators, planners, generation bodies and network providers to the reliability of the deregulated system are studied; the impact of dispersed generation on system reliability is evaluated; and finally, the reliability cost/reliability worth issues in the new competitive environment are considered.

  16. Reliability-oriented energy storage sizing in wind power systems

    DEFF Research Database (Denmark)

    Qin, Zian; Liserre, Marco; Blaabjerg, Frede

    2014-01-01

    Energy storage can be used to suppress the power fluctuations in wind power systems, and thereby reduce the thermal excursion and improve the reliability. Since the cost of energy storage in large power applications is high, it is crucial to have a better understanding of the relationship between the size of the energy storage and the reliability benefit it can generate. Therefore, a reliability-oriented energy storage sizing approach is proposed for wind power systems, where the power, energy, cost and control strategy of the energy storage are all taken into account. With the proposed approach, the computational effort is reduced and the impact of the energy storage system on the reliability of the wind power converter can be quantified.

  17. Operator reliability analysis during NPP small break LOCA

    International Nuclear Information System (INIS)

    Zhang Jiong; Chen Shenglin

    1990-01-01

    To assess the human factors characteristics of an NPP main control room (MCR) design, MCR operator reliability during a small-break LOCA is analyzed, and some approaches for improving MCR operator reliability are proposed based on the results of the analysis.

  18. Power system reliability in supplying nuclear reactors

    International Nuclear Information System (INIS)

    Gad, M.M.M.

    2007-01-01

    This thesis presents a simple technique for deducing the minimal cut sets (MCS) from the defined minimal path sets (MPS) of a generic distribution system; the technique has been used to evaluate the basic reliability indices of the electrical distribution network of Egypt's second research reactor (ETRR-2). Alternative system configurations are then studied to evaluate their impact on service reliability. The proposed MCS approach considers both sustained and temporary outages. Temporary outages constitute an important parameter in characterizing the system reliability indices for critical load points in a distribution system. The impact of power quality on the reliability indices is also considered.
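    The MPS-to-MCS deduction can be sketched by brute force for a small network: a cut set is any set of components that intersects every minimal path set, and only the minimal such sets are kept. The components and path sets below are hypothetical, not taken from the ETRR-2 network.

```python
from itertools import combinations

def minimal_cut_sets(path_sets):
    """Derive minimal cut sets from minimal path sets by enumeration."""
    components = sorted(set().union(*path_sets))
    cuts = []
    for r in range(1, len(components) + 1):
        for combo in combinations(components, r):
            s = set(combo)
            if all(s & p for p in path_sets):        # s blocks every path
                if not any(c <= s for c in cuts):    # keep only minimal sets
                    cuts.append(s)
    return cuts

# Hypothetical network: two redundant feeders behind a shared breaker 'b'
mps = [{'b', 'f1'}, {'b', 'f2'}]
cuts = minimal_cut_sets(mps)
print(cuts)  # the shared breaker alone, or both feeders together
```

Brute-force enumeration is exponential in the number of components, so this only stands in for the thesis's simpler deduction technique on small examples.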

  19. Fast Monte Carlo reliability evaluation using support vector machine

    International Nuclear Information System (INIS)

    Rocco, Claudio M.; Moreno, Jose Ali

    2002-01-01

    This paper deals with the feasibility of using support vector machine (SVM) to build empirical models for use in reliability evaluation. The approach takes advantage of the speed of SVM in the numerous model calculations typically required to perform a Monte Carlo reliability evaluation. The main idea is to develop an estimation algorithm, by training a model on a restricted data set, and replace system performance evaluation by a simpler calculation, which provides reasonably accurate model outputs. The proposed approach is illustrated by several examples. Excellent system reliability results are obtained by training a SVM with a small amount of information
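    The surrogate idea — train a cheap classifier on a restricted data set, then let it replace the expensive system evaluation inside the Monte Carlo loop — can be sketched as follows. To stay self-contained, a simple perceptron stands in for the SVM, and the component capacities, demand and failure probability are made-up values, not data from the paper.

```python
import random
from itertools import product

# "True" system model (stand-in for an expensive network evaluation):
# the system succeeds if the capacity of the working components meets demand.
CAPS = [30, 25, 25, 20]
DEMAND = 50

def true_success(state):
    return sum(c for c, up in zip(CAPS, state) if up) >= DEMAND

def train_surrogate(epochs=5000, lr=0.1):
    """Perceptron surrogate trained on a small labelled state set
    (the paper uses an SVM; this is just a runnable stand-in)."""
    data = [(s, 1 if true_success(s) else -1)
            for s in product([1, 0], repeat=len(CAPS))]
    w, b = [0.0] * len(CAPS), 0.0
    for _ in range(epochs):
        for s, y in data:
            if y * (sum(wi * si for wi, si in zip(w, s)) + b) <= 0:
                w = [wi + lr * y * si for wi, si in zip(w, s)]
                b += lr * y
    return w, b

def surrogate_success(state, model):
    w, b = model
    return sum(wi * si for wi, si in zip(w, state)) + b > 0

# Monte Carlo reliability estimate using only the cheap surrogate model.
random.seed(42)
model = train_surrogate()
N = 20000
hits = sum(surrogate_success(tuple(random.random() > 0.1 for _ in CAPS), model)
           for _ in range(N))
reliability = hits / N
print(round(reliability, 3))
```

In the paper's approach an SVM would replace the perceptron; the Monte Carlo loop itself is unchanged.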

  20. Design for Reliability of Power Electronic Systems

    DEFF Research Database (Denmark)

    Wang, Huai; Ma, Ke; Blaabjerg, Frede

    2012-01-01

    Advances in power electronics enable efficient and flexible processing of electric power in the application of renewable energy sources, electric vehicles, adjustable-speed drives, etc. More and more effort is devoted to improving power electronic systems in terms of reliability to ensure high availability. A collection of methodologies based on the Physics-of-Failure (PoF) approach and mission profile analysis is presented in this paper to perform reliability-oriented design of power electronic systems. The corresponding design procedures and reliability prediction models are provided. Further on, a case study on a 2.3 MW wind power converter is discussed with emphasis on the reliability-critical IGBT components. Different aspects of improving the reliability of the power converter are mapped. Finally, the challenges and opportunities in achieving more reliable power electronic systems are addressed.

  1. Re-analysis of fatigue data for welded joints using the notch stress approach

    DEFF Research Database (Denmark)

    Pedersen, Mikkel Melters; Mouritsen, Ole Ø.; Hansen, Michael Rygaard

    2010-01-01

    Experimental fatigue data for welded joints have been collected and subjected to re-analysis using the notch stress approach according to IIW recommendations. This leads to an overview of the reliability of the approach, based on a large number of results (767 specimens). Evidently, the as-welded joints agree quite well with the FAT 225 curve; however, a reduction to FAT 200 is suggested in order to achieve approximately the same safety as observed in the nominal stress approach.

  2. Experimental annotation of the human genome using microarray technology.

    Science.gov (United States)

    Shoemaker, D D; Schadt, E E; Armour, C D; He, Y D; Garrett-Engele, P; McDonagh, P D; Loerch, P M; Leonardson, A; Lum, P Y; Cavet, G; Wu, L F; Altschuler, S J; Edwards, S; King, J; Tsang, J S; Schimmack, G; Schelter, J M; Koch, J; Ziman, M; Marton, M J; Li, B; Cundiff, P; Ward, T; Castle, J; Krolewski, M; Meyer, M R; Mao, M; Burchard, J; Kidd, M J; Dai, H; Phillips, J W; Linsley, P S; Stoughton, R; Scherer, S; Boguski, M S

    2001-02-15

    The most important product of the sequencing of a genome is a complete, accurate catalogue of genes and their products, primarily messenger RNA transcripts and their cognate proteins. Such a catalogue cannot be constructed by computational annotation alone; it requires experimental validation on a genome scale. Using 'exon' and 'tiling' arrays fabricated by ink-jet oligonucleotide synthesis, we devised an experimental approach to validate and refine computational gene predictions and define full-length transcripts on the basis of co-regulated expression of their exons. These methods can provide more accurate gene numbers and allow the detection of mRNA splice variants and identification of the tissue- and disease-specific conditions under which genes are expressed. We apply our technique to chromosome 22q under 69 experimental condition pairs, and to the entire human genome under two experimental conditions. We discuss implications for more comprehensive, consistent and reliable genome annotation, more efficient, full-length complementary DNA cloning strategies and application to complex diseases.

  3. Operational present status and reliability analysis of the upgraded EAST cryogenic system

    Science.gov (United States)

    Zhou, Z. W.; Zhang, Q. Y.; Lu, X. F.; Hu, L. B.; Zhu, P.

    2017-12-01

    Since its first commissioning in 2005, the cryogenic system for EAST (Experimental Advanced Superconducting Tokamak) has been cooled down and warmed up for thirteen experimental campaigns. In order to improve refrigeration efficiency and reliability, the EAST cryogenic system was gradually upgraded between 2012 and 2015 with new helium screw compressors and new dynamic-gas-bearing helium turbine expanders with eddy-current brakes, addressing the poor mechanical and operational performance of the original equipment. The fully upgraded cryogenic system was put into operation in the eleventh cool-down experiment and has operated through the latest several experimental campaigns. It has successfully coped with the various normal operational modes during cool-down and 4.5 K steady-state operation under pulsed heat load from the tokamak, as well as with abnormal fault modes including turbine protection stops. In this paper, the upgraded EAST cryogenic system, including its functional analysis and new cryogenic control networks, is presented in detail. Its operational status in the latest cool-down experiments is presented and the system reliability is analyzed, showing high reliability and a low fault rate after the upgrade. Finally, the work needed to meet the higher reliability requirements of future uninterrupted long-term experimental operation is proposed.

  4. Reliability testing of failed fuel location system

    International Nuclear Information System (INIS)

    Vieru, G.

    1996-01-01

    This paper presents the experimental reliability tests performed in order to prove the reliability parameters for Failed Fuel Location System (FFLS), equipment used to detect in which channel of a particular heat transport loop a fuel failure is located, and to find in which channel what particular bundle pair is failed. To do so, D20 samples from each reactor channel are sequentially monitored to detect a comparatively high level of delayed neutron activity. 15 refs, 8 figs, 2 tabs

  5. Improved microbial conversion of de-oiled Jatropha waste into biohydrogen via inoculum pretreatment: process optimization by experimental design approach

    Directory of Open Access Journals (Sweden)

    Gopalakrishnan Kumar

    2015-03-01

    In this study, various pretreatment methods for sewage sludge inoculum and the statistical process optimization of de-oiled jatropha waste are reported. A peak hydrogen production rate (HPR) and hydrogen yield (HY) of 0.36 L H2/L-d and 20 mL H2/g volatile solid (VS) were obtained when heat-shock pretreatment (95 °C, 30 min) was employed. Afterwards, an experimental design was applied to find the optimal conditions for H2 production using the heat-pretreated seed culture. The optimal substrate concentration, pH and temperature were determined using response surface methodology as 205 g/L, 6.53 and 55.1 °C, respectively. Under these conditions, the highest HPR of 1.36 L H2/L-d was predicted. Verification tests proved the reliability of the statistical approach. As a result of the heat pretreatment and fermentation optimization, a significant (~4-fold) increase in HPR was achieved. PCR-DGGE results revealed that Clostridium sp. were predominantly present under the optimal conditions.

  6. Reliability analysis and operator modelling

    International Nuclear Information System (INIS)

    Hollnagel, Erik

    1996-01-01

    The paper considers the state of operator modelling in reliability analysis. Operator models are needed in reliability analysis because operators are needed in process control systems. HRA methods must therefore be able to account both for human performance variability and for the dynamics of the interaction. A selected set of first generation HRA approaches is briefly described in terms of the operator model they use, their classification principle, and the actual method they propose. In addition, two examples of second generation methods are also considered. It is concluded that first generation HRA methods generally have very simplistic operator models, either referring to the time-reliability relationship or to elementary information processing concepts. It is argued that second generation HRA methods must recognise that cognition is embedded in a context, and be able to account for that in the way human reliability is analysed and assessed

  7. Reliability concepts applied to cutting tool change time

    Energy Technology Data Exchange (ETDEWEB)

    Patino Rodriguez, Carmen Elena, E-mail: cpatino@udea.edu.c [Department of Industrial Engineering, University of Antioquia, Medellin (Colombia); Department of Mechatronics and Mechanical Systems, Polytechnic School, University of Sao Paulo, Sao Paulo (Brazil); Francisco Martha de Souza, Gilberto [Department of Mechatronics and Mechanical Systems, Polytechnic School, University of Sao Paulo, Sao Paulo (Brazil)

    2010-08-15

    This paper presents a reliability-based analysis for calculating critical tool life in machining processes. It is possible to determine the running time for each tool involved in the process by obtaining the operations sequence for the machining procedure. Usually, the reliability of an operation depends on three independent factors: operator, machine-tool and cutting tool. The reliability of a part manufacturing process is mainly determined by the cutting time for each job and by the sequence of operations, defined by the series configuration. An algorithm is presented to define when the cutting tool must be changed. The proposed algorithm is used to evaluate the reliability of a manufacturing process composed of turning and drilling operations. The reliability of the turning operation is modeled based on data presented in the literature, and from experimental results, a statistical distribution of drilling tool wear was defined, and the reliability of the drilling process was modeled.

  8. Reliability concepts applied to cutting tool change time

    International Nuclear Information System (INIS)

    Patino Rodriguez, Carmen Elena; Francisco Martha de Souza, Gilberto

    2010-01-01

    This paper presents a reliability-based analysis for calculating critical tool life in machining processes. It is possible to determine the running time for each tool involved in the process by obtaining the operations sequence for the machining procedure. Usually, the reliability of an operation depends on three independent factors: operator, machine-tool and cutting tool. The reliability of a part manufacturing process is mainly determined by the cutting time for each job and by the sequence of operations, defined by the series configuration. An algorithm is presented to define when the cutting tool must be changed. The proposed algorithm is used to evaluate the reliability of a manufacturing process composed of turning and drilling operations. The reliability of the turning operation is modeled based on data presented in the literature, and from experimental results, a statistical distribution of drilling tool wear was defined, and the reliability of the drilling process was modeled.
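    The series-configuration model described above can be sketched as follows: operator, machine-tool and cutting-tool reliabilities multiply, the tool term wears out over cutting time (a Weibull model is a common choice for tool wear), and the tool is changed once system reliability drops below a target. All parameter values are illustrative assumptions, not results from the paper.

```python
import math

def tool_reliability(t, beta=2.5, eta=60.0):
    """Weibull reliability of the cutting tool: R(t) = exp(-(t/eta)**beta)."""
    return math.exp(-((t / eta) ** beta))

def system_reliability(t, r_operator=0.999, r_machine=0.995):
    """Series configuration: operator, machine-tool and cutting tool
    must all function, so their reliabilities multiply."""
    return r_operator * r_machine * tool_reliability(t)

def tool_change_time(target=0.90, dt=0.1):
    """Accumulated cutting time at which the tool should be changed."""
    t = 0.0
    while system_reliability(t) >= target:
        t += dt
    return t

t_change = tool_change_time()
print(round(t_change, 1))  # minutes of cutting before a tool change
```

With these parameters the tool is changed after roughly 24 time units; tightening the reliability target or a steeper wear-out shape parameter shortens the interval.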

  9. [Animal experimentation, computer simulation and surgical research].

    Science.gov (United States)

    Carpentier, Alain

    2009-11-01

    We live in a digital world. In medicine, computers are providing new tools for data collection, imaging, and treatment. During research and development of complex technologies and devices such as artificial hearts, computer simulation can provide more reliable information than experimentation on large animals. In these specific settings, animal experimentation should serve more to validate computer models of complex devices than to demonstrate their reliability.

  10. Reliability assurance for regulation of advanced reactors

    International Nuclear Information System (INIS)

    Fullwood, R.; Lofaro, R.; Samanta, P.

    1992-01-01

    The advanced nuclear power plants must achieve higher levels of safety than the first generation of plants. Showing that this is indeed true provides new challenges to reliability and risk assessment methods in the analysis of the designs employing passive and semi-passive protection. Reliability assurance of the advanced reactor systems is important for determining the safety of the design and for determining the plant operability. Safety is the primary concern, but operability is considered indicative of good and safe operation. This paper discusses several concerns for reliability assurance of the advanced design encompassing reliability determination, level of detail required in advanced reactor submittals, data for reliability assurance, systems interactions and common cause effects, passive component reliability, PRA-based configuration control system, and inspection, training, maintenance and test requirements. Suggested approaches are provided for addressing each of these topics.

  11. Reliability assurance for regulation of advanced reactors

    International Nuclear Information System (INIS)

    Fullwood, R.; Lofaro, R.; Samanta, P.

    1991-01-01

    The advanced nuclear power plants must achieve higher levels of safety than the first generation of plants. Showing that this is indeed true provides new challenges to reliability and risk assessment methods in the analysis of the designs employing passive and semi-passive protection. Reliability assurance of the advanced reactor systems is important for determining the safety of the design and for determining the plant operability. Safety is the primary concern, but operability is considered indicative of good and safe operation. This paper discusses several concerns for reliability assurance of the advanced design encompassing reliability determination, level of detail required in advanced reactor submittals, data for reliability assurance, systems interactions and common cause effects, passive component reliability, PRA-based configuration control system, and inspection, training, maintenance and test requirements. Suggested approaches are provided for addressing each of these topics

  12. Managing relationships between electric power industry restructuring and grid reliability

    International Nuclear Information System (INIS)

    Thomas, R.J.

    2005-01-01

    The electricity system is a critical infrastructure, and its continued and reliable functioning is essential to the nation's economy and well-being. However, the inter-dependency of electricity networks is not completely understood. The economic impact of outages was discussed in this white paper. It was suggested that moving to a restructured environment has degraded the reliability of the bulk system. New institutional arrangements and approaches to information management are needed. It was suggested that the 2003 blackout was caused by reliability practices rather than by technical failures. Uncertainties in the restructured market were discussed, as well as incentives to maintain system adequacy. Examples of deregulation in other countries were presented. Organizational complexities were reviewed, including the Federal Energy Regulatory Commission's (FERC) requirements and the new layers of complexity that have been added to the decision-making process in the light of restructuring. Planning and connectivity issues were reviewed. The need for design standards in power grid control centres was discussed. Difficulties in collecting data from different control centres were considered. Issues concerning the lack of investment in research and development were discussed, with particular reference to the urgent need for coordinated research programs. The looming manpower crisis in the electric power industry was also discussed. Recommendations included ensuring that the transmission system can support a market structure; building a national reliability centre; solving the manpower crisis; and testing market designs before deploying them. It was concluded that good engineering design principles, including experimental economic testing, should be required of any new electricity market design before authorizing its use. 31 refs., 1 tab., 6 figs

  13. Treatment of secondary burn wound progression in contact burns-a systematic review of experimental approaches.

    Science.gov (United States)

    Schmauss, Daniel; Rezaeian, Farid; Finck, Tom; Machens, Hans-Guenther; Wettstein, Reto; Harder, Yves

    2015-01-01

After a burn injury, superficial partial-thickness burn wounds may progress to deep partial-thickness or full-thickness burn wounds if left untreated. This phenomenon is called secondary burn wound progression or conversion. Burn wound depth is an important determinant of patient morbidity and mortality. Therefore, reduction or even prevention of secondary burn wound progression is one goal of the acute care of burned patients. The objective of this study was to review preclinical approaches evaluating therapies to reduce burn wound progression. A systematic review of experimental approaches in animals that aim at reducing or preventing secondary burn wound progression was performed in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The selected references consist of all the peer-reviewed studies performed in vivo in animals and review articles published in English, German, Italian, Spanish, or French language relevant to the topic of secondary burn wound progression. We searched MEDLINE, Cochrane Library, and Google Scholar including all the articles published from the beginning of notations to the present. The search was conducted between May 3, 2012 and December 26, 2013. We included 29 experimental studies in this review, investigating agents that maintain or increase local perfusion conditions, as well as agents that exhibit an anti-coagulatory, an anti-inflammatory, or an anti-apoptotic property. Warm water, simvastatin, EPO, or cerium nitrate may represent particularly promising approaches for translation into clinical use in the near future. This review demonstrates promising experimental approaches that might reduce secondary burn wound progression. Nevertheless, translation into clinical application needs to confirm the results compiled in experimental animal studies.

  14. Química geral experimental: uma nova abordagem didática Experimental general chemistry: a new teaching approach

    Directory of Open Access Journals (Sweden)

    Geraldo Eduardo da Luz Júnior

    2004-02-01

Full Text Available This essay describes a new didactic approach, in accordance with the national curriculum guidelines for chemistry undergraduate courses in Brazil, employed during the one-semester course "Experimental General Chemistry" for chemistry undergraduate students at the Federal University of Piauí. The new approach has positively helped students' training by improving their reading skills and their understanding of scientific reports, by developing the use of electronic tools to search for and retrieve the knowledge required for their learning activities, and by improving their skills in understanding published texts and dealing with digital sources. At the same time, the students are strongly stimulated to enter the research program for undergraduate students available at the University.

  15. Estimating Between-Person and Within-Person Subscore Reliability with Profile Analysis.

    Science.gov (United States)

    Bulut, Okan; Davison, Mark L; Rodriguez, Michael C

    2017-01-01

    Subscores are of increasing interest in educational and psychological testing due to their diagnostic function for evaluating examinees' strengths and weaknesses within particular domains of knowledge. Previous studies about the utility of subscores have mostly focused on the overall reliability of individual subscores and ignored the fact that subscores should be distinct and have added value over the total score. This study introduces a profile reliability approach that partitions the overall subscore reliability into within-person and between-person subscore reliability. The estimation of between-person reliability and within-person reliability coefficients is demonstrated using subscores from number-correct scoring, unidimensional and multidimensional item response theory scoring, and augmented scoring approaches via a simulation study and a real data study. The effects of various testing conditions, such as subtest length, correlations among subscores, and the number of subtests, are examined. Results indicate that there is a substantial trade-off between within-person and between-person reliability of subscores. Profile reliability coefficients can be useful in determining the extent to which subscores provide distinct and reliable information under various testing conditions.
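    As a toy illustration of the between-/within-person split the abstract describes (with made-up subscores, not the authors' estimator), the total variation in an examinee-by-subtest score matrix can be partitioned exactly into a between-person part (each examinee's mean level) and a within-person part (each examinee's profile shape around that level):

    ```python
    # Illustrative variance partition for subscore profiles (hypothetical data):
    # total sum of squares = between-person ("level") + within-person ("shape").

    scores = [  # rows = examinees, columns = subtest subscores
        [12, 15, 9],
        [18, 17, 19],
        [10, 8, 14],
        [16, 12, 11],
    ]

    n_sub = len(scores[0])
    grand_mean = sum(sum(row) for row in scores) / (len(scores) * n_sub)

    ss_between = ss_within = 0.0
    for row in scores:
        level = sum(row) / n_sub  # person's mean: the "level" component
        ss_between += n_sub * (level - grand_mean) ** 2
        ss_within += sum((x - level) ** 2 for x in row)  # profile "shape"

    ss_total = sum((x - grand_mean) ** 2 for row in scores for x in row)
    print(ss_between, ss_within, ss_total)  # ss_between + ss_within == ss_total
    ```

    Reliability coefficients can then be attached separately to the two components, which is the trade-off the study quantifies.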

  16. Notes on human factors problems in process plant reliability and safety prediction

    International Nuclear Information System (INIS)

    Rasmussen, J.; Taylor, J.R.

    1976-09-01

    The basis for plant operator reliability evaluation is described. Principles for plant design, necessary to permit reliability evaluation, are outlined. Five approaches to the plant operator reliability problem are described. Case stories, illustrating operator reliability problems, are given. (author)

  17. Innovations in power systems reliability

    CERN Document Server

    Santora, Albert H; Vaccaro, Alfredo

    2011-01-01

    Electrical grids are among the world's most reliable systems, yet they still face a host of issues, from aging infrastructure to questions of resource distribution. Here is a comprehensive and systematic approach to tackling these contemporary challenges.

  18. Bayesian Reliability Estimation for Deteriorating Systems with Limited Samples Using the Maximum Entropy Approach

    Directory of Open Access Journals (Sweden)

    Ning-Cong Xiao

    2013-12-01

Full Text Available In this paper the combination of the maximum entropy method and Bayesian inference for the reliability assessment of deteriorating systems is proposed. Due to various uncertainties, scarce data and incomplete information, system parameters usually cannot be determined precisely. These uncertain parameters can be modeled by fuzzy set theory and Bayesian inference, which have been proved to be useful for deteriorating systems under small sample sizes. The maximum entropy approach can be used to calculate the maximum entropy density function of the uncertain parameters more accurately, for it does not need any additional information or assumptions. Finally, two optimization models are presented which can be used to determine the lower and upper bounds of the system's probability of failure under vague environmental conditions. Two numerical examples are investigated to demonstrate the proposed method.
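    A minimal sketch of the maximum entropy principle the paper builds on (the support and target mean are hypothetical, not taken from the paper): among all distributions on a finite support with a prescribed mean, the entropy-maximising one has the Gibbs form p(x) ∝ exp(λx), and λ can be found by bisection because the mean is increasing in λ.

    ```python
    import math

    # Maximum entropy pmf on {1,...,6} subject only to a known mean of 4.5.
    support = [1, 2, 3, 4, 5, 6]
    target_mean = 4.5

    def mean_for(lam):
        """Mean of the Gibbs distribution p(x) ∝ exp(lam * x)."""
        w = [math.exp(lam * x) for x in support]
        z = sum(w)
        return sum(x * wi for x, wi in zip(support, w)) / z

    lo, hi = -10.0, 10.0      # bracket for the Lagrange multiplier
    for _ in range(100):      # bisection: mean_for is increasing in lam
        mid = (lo + hi) / 2
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid

    lam = (lo + hi) / 2
    w = [math.exp(lam * x) for x in support]
    z = sum(w)
    p = [wi / z for wi in w]
    print(lam, p)  # maximum-entropy pmf matching the mean constraint
    ```

    With a positive λ the pmf tilts toward larger values, which is the least-biased way to honour a mean above the midpoint of the support.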

  19. Human Reliability Analysis in Support of Risk Assessment for Positive Train Control

    Science.gov (United States)

    2003-06-01

    This report describes an approach to evaluating the reliability of human actions that are modeled in a probabilistic risk assessment : (PRA) of train control operations. This approach to human reliability analysis (HRA) has been applied in the case o...

  20. Reliable computation from contextual correlations

    Science.gov (United States)

    Oestereich, André L.; Galvão, Ernesto F.

    2017-12-01

An operational approach to the study of computation based on correlations considers black boxes with one-bit inputs and outputs, controlled by a limited classical computer capable only of performing sums modulo-two. In this setting, it was shown that noncontextual correlations do not provide any extra computational power, while contextual correlations were found to be necessary for the deterministic evaluation of nonlinear Boolean functions. Here we investigate the requirements for reliable computation in this setting; that is, the evaluation of any Boolean function with success probability bounded away from 1/2. We show that bipartite CHSH quantum correlations suffice for reliable computation. We also prove that an arbitrarily small violation of a multipartite Greenberger-Horne-Zeilinger noncontextuality inequality also suffices for reliable computation.
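    A toy numerical illustration of the claim (the box below is a stand-in that encodes only the relevant property of the optimal quantum CHSH strategy, namely that a XOR b = x AND y holds with probability cos²(π/8) on every input pair; it is not a full quantum simulation):

    ```python
    import math
    import random

    # With the Tsirelson-optimal CHSH strategy, the two box outputs satisfy
    # a XOR b = x AND y with probability cos^2(pi/8) ≈ 0.854 for every input
    # pair, so an XOR-only classical control evaluates the nonlinear AND
    # with success bounded away from 1/2.
    p_win = math.cos(math.pi / 8) ** 2

    def chsh_box(x, y, rng):
        """Toy CHSH box: returns (a, b) with a XOR b = x AND y w.p. p_win."""
        target = x & y
        a = rng.randint(0, 1)
        b = a ^ target if rng.random() < p_win else a ^ target ^ 1
        return a, b

    rng = random.Random(0)
    trials = 20000
    wins = 0
    for _ in range(trials):
        x, y = rng.randint(0, 1), rng.randint(0, 1)
        a, b = chsh_box(x, y, rng)
        wins += (a ^ b) == (x & y)
    print(wins / trials)  # empirically close to 0.854, well above 1/2
    ```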

  1. Fault detection and reliability, knowledge based and other approaches

    International Nuclear Information System (INIS)

    Singh, M.G.; Hindi, K.S.; Tzafestas, S.G.

    1987-01-01

    These proceedings are split up into four major parts in order to reflect the most significant aspects of reliability and fault detection as viewed at present. The first part deals with knowledge-based systems and comprises eleven contributions from leading experts in the field. The emphasis here is primarily on the use of artificial intelligence, expert systems and other knowledge-based systems for fault detection and reliability. The second part is devoted to fault detection of technological systems and comprises thirteen contributions dealing with applications of fault detection techniques to various technological systems such as gas networks, electric power systems, nuclear reactors and assembly cells. The third part of the proceedings, which consists of seven contributions, treats robust, fault tolerant and intelligent controllers and covers methodological issues as well as several applications ranging from nuclear power plants to industrial robots to steel grinding. The fourth part treats fault tolerant digital techniques and comprises five contributions. Two papers, one on reactor noise analysis, the other on reactor control system design, are indexed separately. (author)

  2. Fuzzy Goal Programming Approach in Selective Maintenance Reliability Model

    Directory of Open Access Journals (Sweden)

    Neha Gupta

    2013-12-01

Full Text Available In the present paper, we have considered the allocation problem of repairable components for a parallel-series system as a multi-objective optimization problem and have discussed two different models. In the first model the reliabilities of the subsystems are considered as different objectives. In the second model the cost and the time spent on repairing the components are considered as two different objectives. These two models are formulated as a multi-objective Nonlinear Programming Problem (MONLPP), and a fuzzy goal programming method is used to work out the compromise allocation in the multi-objective selective maintenance reliability model, in which we define the membership functions of each objective function, transform the membership functions into equivalent linear membership functions by first-order Taylor series, and finally, by forming a fuzzy goal programming model, obtain a desired compromise allocation of maintenance components. A numerical example is also worked out to illustrate the computational details of the method.

  3. Reliability of equipments and theory of frequency statistics and Bayesian decision

    International Nuclear Information System (INIS)

    Procaccia, H.; Clarotti, C.A.

    1992-01-01

The rapid development of the use of Bayesian techniques in the domain of industrial risk is a recent phenomenon linked to the development of powerful computers. These techniques involve a reasoning well adapted to experimental logic, based on the dynamic enrichment of knowledge with experience data. In the framework of reliability studies and statistical decision making, these methods differ slightly from the methods commonly used to evaluate the reliability of systems and from classical frequentist statistics. This particular approach is described in this book and illustrated with many examples of application (power plants, pressure vessels, industrial installations etc.). These examples generally concern risk management in cases where the application of rules and the respect of norms become insufficient. It is now well known that risk cannot be reduced to zero and that its evaluation must be performed using statistics, taking into account the possible accident processes and also the investments necessary to avoid them (service life, failure, maintenance costs and availability of materials). The result is the optimization of a decision process about rare or uncertain events. (J.S.)
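    A hedged sketch of the Bayesian reasoning the book advocates (all numbers are made up): prior engineering judgement about a component's failure probability, encoded as a Beta distribution, is updated with observed service data, and the posterior mean replaces the bare frequentist point estimate.

    ```python
    # Conjugate beta-binomial update of a component failure probability.

    # Beta(alpha, beta) prior encoding prior engineering judgement:
    # prior mean = alpha / (alpha + beta) = 0.05.
    alpha, beta = 1.0, 19.0

    failures, demands = 2, 100  # hypothetical service experience

    # Posterior is Beta(alpha + failures, beta + successes).
    post_a = alpha + failures
    post_b = beta + (demands - failures)

    posterior_mean = post_a / (post_a + post_b)
    print(posterior_mean)  # 0.025: the raw rate 2/100 shrunk toward the prior 0.05
    ```

    The posterior mean lies between the observed rate and the prior mean, which is exactly the "enrichment of knowledge with experience data" the abstract describes.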

  4. Comparing Conventional Bank Credit Vis A Vis Shariah Bank Musharakah: Experimental Economic Approach

    Directory of Open Access Journals (Sweden)

    Muhamad Abduh

    2008-01-01

Full Text Available Central Bank of Indonesia, with its dual banking system – i.e. Shariah and Conventional Bank – keeps on developing a system that is considered an answer to generate national economic growth. One of the banking activities emphasized by the Central Bank of Indonesia is fund distribution through either conventional bank credit or shariah bank financing. Applying the Experimental Economic Approach based on Induced Value Theory and employing ANOVA, this paper found that the shariah bank musharakah financing system would come up with a higher profit opportunity compared to the conventional credit system. One main reason is that musharakah financing in shariah banks applies a profit and loss sharing (PLS) scheme so that it will not be a burden to the customer when he finds low profit. Keywords: Credit Loan, Musharakah Financing, Induced Value Theory, Experimental Economic Approach, Analysis of Variance (ANOVA).

  5. System reliability developments in structural engineering

    International Nuclear Information System (INIS)

    Moses, F.

    1982-01-01

Two major limitations occur in present structural design code developments utilizing reliability theory. The notional system reliabilities may differ significantly from calibrated component reliabilities. Secondly, actual failures are often due to gross errors not reflected in most present code formats. A review is presented of system reliability methods and further new concepts are developed. The incremental load approach for identifying and expressing collapse modes is expanded by employing a strategy to identify and enumerate the significant structural collapse modes. It further isolates the importance of critical components in the system performance. Ductile and brittle component behavior and strength correlation is reflected in the system model and illustrated in several examples. Modal combinations for the system reliability are also reviewed. From these developments a system factor can be appended to component safety checking equations. Values may be derived from system behavior by substituting in a damage model which accounts for the response range from component failure to collapse. Other strategies are discussed which emphasize quality assurance during design and in-service inspection for components whose behavior is critical to the system reliability. (Auth.)

  6. On Bayesian System Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen Ringi, M

    1995-05-01

The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs.

  7. On Bayesian System Reliability Analysis

    International Nuclear Information System (INIS)

    Soerensen Ringi, M.

    1995-01-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs

  8. Reliability issues at the LHC

    CERN Multimedia

    CERN. Geneva. Audiovisual Unit; Gillies, James D

    2002-01-01

    The Lectures on reliability issues at the LHC will be focused on five main Modules on five days. Module 1: Basic Elements in Reliability Engineering Some basic terms, definitions and methods, from components up to the system and the plant, common cause failures and human factor issues. Module 2: Interrelations of Reliability & Safety (R&S) Reliability and risk informed approach, living models, risk monitoring. Module 3: The ideal R&S Process for Large Scale Systems From R&S goals via the implementation into the system to the proof of the compliance. Module 4: Some Applications of R&S on LHC Master logic, anatomy of risk, cause - consequence diagram, decomposition and aggregation of the system. Module 5: Lessons learned from R&S Application in various Technologies Success stories, pitfalls, constrains in data and methods, limitations per se, experienced in aviation, space, process, nuclear, offshore and transport systems and plants. The Lectures will reflect in summary the compromise in...

  9. Equipment Reliability Process in Krsko NPP

    International Nuclear Information System (INIS)

    Gluhak, M.

    2016-01-01

To ensure long-term safe and reliable plant operation, equipment operability and availability must be ensured by a group of processes established within the nuclear power plant. The equipment reliability process represents the integration and coordination of the important equipment reliability activities into one process, which enables equipment performance and condition monitoring, the development, implementation and optimization of preventive maintenance activities, continuous improvement of the processes, and long-term planning. The initiative to introduce a systematic approach to assuring equipment reliability came from the US nuclear industry, guided by INPO (Institute of Nuclear Power Operations) and with the participation of several US nuclear utilities. As a result of the initiative, the first edition of INPO document AP-913, 'Equipment Reliability Process Description', was issued, and it became the basic document for the implementation of the equipment reliability process for the whole nuclear industry. The scope of the equipment reliability process in Krsko NPP consists of the following programs: equipment criticality classification, the preventive maintenance program, the corrective action program, system health reports, and the long-term investment plan. By implementing, supervising and continuously improving those programs, guided by more than thirty years of operating experience, Krsko NPP will continue on a track of safe and reliable operation until the end of its prolonged lifetime. (author).

  10. Features of applying systems approach for evaluating the reliability of cryogenic systems for special purposes

    Directory of Open Access Journals (Sweden)

    E. D. Chertov

    2016-01-01

Full Text Available Summary. The analysis of cryogenic installations confirms an objective trend: the number of tasks solved by special-purpose systems keeps increasing. One of the most important directions in the development of cryogenics is the creation of installations for producing air separation products, namely oxygen and nitrogen. Modern aviation complexes require these gases in large quantities, both in the gaseous and in the liquid state. The onboard gas systems used in aircraft of the Russian Federation are subdivided into: the oxygen system; the air (nitrogen) system; the neutral gas system; and the fire-suppression system. The technological schemes of air separation installations (ADI) are largely determined by the pressure of the compressed air or, more generally, by the refrigerating cycle. In the majority of ADI the working body of the refrigerating cycle is the separated air itself, i.e. the technological and refrigerating cycles of the installation are integrated. By this principle one distinguishes installations of: low pressure; medium and high pressure; with expander; and with precooling. There is also a small number of ADI types in which the refrigerating and technological cycles are separated; these are installations with external cooling. To solve the tasks of monitoring the technical condition of the BRV hardware in real time and estimating reliability indicators, it is proposed to use multi-agent technologies. The multi-agent approach is the most suitable for building a decision support system for reliability assessment because it allows: redistributing information processing across the elements of the system, which increases overall performance; solving the problem of accumulating, storing and reusing knowledge, which significantly increases the efficiency of solving reliability assessment tasks; and considerably reducing human intervention in the functioning of the system, which saves the time of the decision maker (PMD) and does not require special skills in working with it.

  11. A Bayesian reliability approach to the performance assessment of a geological waste repository

    International Nuclear Information System (INIS)

    Flueck, J.A.; Singh, A.K.

    1992-01-01

This paper discusses the task of selecting a suitable site for a high-level waste disposal repository (HLWR), a complex task in which one must address engineering and economic factors of the proposed facility and site as well as environmental, public health, safety, and sociopolitical factors. Acknowledging the complexity of the siting problem for a HLWR leads one to readily conclude that a formal analysis, including the use of a performance assessment model (PAM), is needed to assist the designated decision makers in their task of selecting a suitable site. The overall goal of a PAM is to aid the decision makers in making the best possible technical siting decision. For a number of reasons, the authors believe that combining Bayesian decision theory and reliability methodology provides the best approach to constructing a useful PAM for assisting in the siting of a HLWR. This combination allows one to formally integrate existing relevant information, professional judgement, and component model outputs to produce conditionally estimated probabilities for a decision tree approach to the radionuclide release problem of a proposed HLWR. If loss functions are available, this also allows one to calculate the expected costs or losses from possible radionuclide releases. This latter calculation may be very important in selecting the final site from among a number of alternative sites

  12. Nanoscale deformation measurements for reliability assessment of material interfaces

    Science.gov (United States)

    Keller, Jürgen; Gollhardt, Astrid; Vogel, Dietmar; Michel, Bernd

    2006-03-01

With the development and application of micro/nano electronic mechanical systems (MEMS, NEMS) for a variety of market segments, new reliability issues will arise. The understanding of material interfaces is the key for a successful design for reliability of MEMS/NEMS and sensor systems. Furthermore, in the field of BIOMEMS, newly developed advanced materials and well-known engineering materials are combined without fully developed reliability concepts for such devices and components. In addition, the increasing interface-to-volume ratio in highly integrated systems and nanoparticle-filled materials are challenges for experimental reliability evaluation. New strategies for reliability assessment on the submicron scale are essential to fulfil the needs of future devices. In this paper a nanoscale-resolution experimental method for the measurement of thermo-mechanical deformation at material interfaces is introduced. The determination of displacement fields is based on scanning probe microscopy (SPM) data. In-situ SPM scans of the analyzed object (i.e. material interface) are carried out at different thermo-mechanical load states. The obtained images are compared by grayscale cross-correlation algorithms. This allows the tracking of local image patterns of the analyzed surface structure. The measurement results are full-field displacement fields with nanometer resolution. With the obtained data the mixed-mode type of loading at material interfaces can be analyzed with highest resolution for future needs in microsystems and nanotechnology.
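    A simplified one-dimensional analogue of the grayscale cross-correlation tracking used on the SPM images (the intensity profile and shift are invented for illustration): the displacement between a reference profile and a deformed one is the shift that maximises their correlation.

    ```python
    # Recover a pixel shift between two intensity profiles by maximising
    # cross-correlation (1-D stand-in for the 2-D image correlation).

    reference = [0, 1, 3, 7, 9, 7, 3, 1, 0, 0, 0, 0]
    shift_true = 3
    deformed = reference[-shift_true:] + reference[:-shift_true]  # cyclic shift

    def correlation(a, b, shift):
        """Circular cross-correlation of a and b at the given shift."""
        n = len(a)
        return sum(a[i] * b[(i + shift) % n] for i in range(n))

    # The displacement estimate is the shift that best aligns the profiles.
    best = max(range(len(reference)),
               key=lambda s: correlation(reference, deformed, s))
    print(best)  # recovered displacement: 3 pixels
    ```

    In the 2-D case the same argmax is taken over small image patches, yielding the full-field displacement maps the paper describes.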

  13. Olive pomace based lightweight concrete, an experimental approach and contribution

    Directory of Open Access Journals (Sweden)

    Lynda Amel Chaabane

    2018-01-01

Full Text Available Due to the depletion of conventional aggregate resources, material recycling has become an economic and ecological alternative. In this paper, locally available natural residues such as olive pomace were investigated when partially incorporated in the concrete formulation, since the mechanical characteristics of lightweight aggregate concrete strongly depend on the aggregate's properties and proportions. Lightweight aggregates are more deformable than the cement matrix because of their high porosity, and their influence on concrete strength remains complex. The purpose of this paper is to investigate, through an experimental approach, the influence of aggregate properties on the mechanical behaviour of lightweight concrete. In addition, the effects of the different substitution sequences and of the W/C ratio on lightweight concrete behaviour were evaluated, in order to determine the influence of the W/C ratio on the improvement of the lightweight concrete's mechanical properties, knowing that the quantity of mixing water governs the workability of the cement paste and affects its mechanical strength. The last part of this paper therefore provides a statistical survey for estimating the strength and weight reduction achieved through the different natural aggregate substitutions to improve the lightweight concrete properties. The results showed a significantly lower adhesion of the olive pomace to the matrix after cement setting, which weakens the mechanical strength of the lightweight concrete. However, this work opens several perspectives: modeling of the results and correlation with the experimental approach, and the determination of the evolution of lightweight concrete characteristics when exposed to high temperatures, as well as its thermohydric properties.

  14. Integrated Markov-neural reliability computation method: A case for multiple automated guided vehicle system

    International Nuclear Information System (INIS)

    Fazlollahtabar, Hamed; Saidi-Mehrabad, Mohammad; Balakrishnan, Jaydeep

    2015-01-01

This paper proposes an integrated Markovian and back-propagation neural network approach to compute the reliability of a system. Since the states of failure occurrences are significant elements for accurate reliability computation, a Markovian-based reliability assessment method is designed. Due to the drawbacks shown by the Markovian model for steady-state reliability computations and by the neural network for the initial training pattern, an integration called Markov-neural is developed and evaluated. To show the efficiency of the proposed approach, comparative analyses are performed. Also, for managerial implication purposes, an application case for multiple automated guided vehicles (AGVs) in manufacturing networks is conducted. - Highlights: • Integrated Markovian and back-propagation neural network approach to compute reliability. • Markovian-based reliability assessment method. • Managerial implication is shown in an application case for multiple automated guided vehicles (AGVs) in manufacturing networks
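    A minimal sketch of the Markovian half of such an approach (the rates are hypothetical, and this two-state model is far simpler than the paper's AGV network): a repairable component alternates between an up and a down state; iterating the discrete-time chain converges to the steady-state availability, which matches the analytic value μ/(λ+μ).

    ```python
    # Two-state Markov availability model: up --lam--> down, down --mu--> up.

    lam, mu = 0.01, 0.5   # failure and repair rates per hour (hypothetical)
    dt = 0.1              # time step small enough that rate * dt < 1

    p_up, p_down = 1.0, 0.0  # start in the operating state
    for _ in range(100000):
        up = p_up * (1 - lam * dt) + p_down * (mu * dt)
        down = p_up * (lam * dt) + p_down * (1 - mu * dt)
        p_up, p_down = up, down

    print(p_up)             # steady-state availability from the iteration
    print(mu / (lam + mu))  # analytic steady state, ≈ 0.9804
    ```

    The steady-state blind spot the abstract mentions is visible here: the iteration says nothing about transient behaviour unless intermediate probabilities are recorded, which is one motivation for augmenting the Markov model with a trained neural network.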

  15. Reliability Assessment for Low-cost Unmanned Aerial Vehicles

    Science.gov (United States)

    Freeman, Paul Michael

    Existing low-cost unmanned aerospace systems are unreliable, and engineers must blend reliability analysis with fault-tolerant control in novel ways. This dissertation introduces the University of Minnesota unmanned aerial vehicle flight research platform, a comprehensive simulation and flight test facility for reliability and fault-tolerance research. An industry-standard reliability assessment technique, the failure modes and effects analysis, is performed for an unmanned aircraft. Particular attention is afforded to the control surface and servo-actuation subsystem. Maintaining effector health is essential for safe flight; failures may lead to loss of control incidents. Failure likelihood, severity, and risk are qualitatively assessed for several effector failure modes. Design changes are recommended to improve aircraft reliability based on this analysis. Most notably, the control surfaces are split, providing independent actuation and dual-redundancy. The simulation models for control surface aerodynamic effects are updated to reflect the split surfaces using a first-principles geometric analysis. The failure modes and effects analysis is extended by using a high-fidelity nonlinear aircraft simulation. A trim state discovery is performed to identify the achievable steady, wings-level flight envelope of the healthy and damaged vehicle. Tolerance of elevator actuator failures is studied using familiar tools from linear systems analysis. This analysis reveals significant inherent performance limitations for candidate adaptive/reconfigurable control algorithms used for the vehicle. Moreover, it demonstrates how these tools can be applied in a design feedback loop to make safety-critical unmanned systems more reliable. Control surface impairments that do occur must be quickly and accurately detected. This dissertation also considers fault detection and identification for an unmanned aerial vehicle using model-based and model-free approaches and applies those

  16. Experimental approach towards shell structure at 100Sn and 78Ni

    International Nuclear Information System (INIS)

    Grawe, H.; Gorska, M.; Fahlander, C.

    2000-07-01

The status of the experimental approach to 100Sn and 78Ni is reviewed. Revised single-particle energies for neutrons are deduced for the N=Z=50 shell closure and evidence for low-lying I^π = 2^+ and 3^- states is presented. Moderate E2 polarisation charges of 0.1 e and 0.6 e are found to reproduce the experimental data when core excitation of 100Sn is properly accounted for in the shell model. For the neutron-rich Ni region no conclusive evidence for a N=40 subshell is found, whereas firm evidence for the persistence of the N=50 shell at 78Ni is inferred from the existence of seniority isomers. The disappearance of this isomerism in the mid νg_9/2 shell is discussed. (orig.)

  17. Experimental approaches for the development of gamma spectroscopy well logging system

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Jehyun; Hwang, Seho; Kim, Jongman [Korea Institute of Geoscience and Mineral Resources (124 Gwahang-no, Yuseong-gu, Daejeon, Korea) (Korea, Republic of); Won, Byeongho [Heesong Geotek Co., Ltd (146-8 Sangdaewon-dong, Jungwon-gu, Seongnam-si, Gyeonggi-do, Korea) (Korea, Republic of)

    2015-03-10

    This article discusses experimental approaches for the development of a gamma spectroscopy well logging system. Considering the size of the borehole sonde, we customize 2 x 2 inch inorganic scintillators and a system including high voltage, preamplifier, amplifier and multichannel analyzer (MCA). The calibration chart is made by tests using standard radioactive sources so that the measured count rates are expressed as an energy spectrum. Optimum high-voltage supplies and the measurement parameters of each detector are set up by experimental investigation. Also, the responses of the scintillation detectors have been examined according to the distance between source and detector. Because gamma spectroscopy well logging needs a broad spectrum with high sensitivity and resolution, the energy resolution and sensitivity as a function of gamma-ray energy are investigated by analyzing the gamma-ray activities of the radioactive sources.

  18. Probabilistic assessment of pressure vessel and piping reliability

    International Nuclear Information System (INIS)

    Sundararajan, C.

    1986-01-01

    The paper presents a critical review of the state-of-the-art in probabilistic assessment of pressure vessel and piping reliability. First the differences in assessing the reliability directly from historical failure data and indirectly by a probabilistic analysis of the failure phenomenon are discussed and the advantages and disadvantages are pointed out. The rest of the paper deals with the latter approach of reliability assessment. Methods of probabilistic reliability assessment are described and major projects where these methods are applied for pressure vessel and piping problems are discussed. An extensive list of references is provided at the end of the paper

  19. Cortical projection of the inferior choroidal point as a reliable landmark to place the corticectomy and reach the temporal horn through a middle temporal gyrus approach.

    Science.gov (United States)

    Frigeri, Thomas; Rhoton, Albert; Paglioli, Eliseu; Azambuja, Ney

    2014-10-01

    To establish preoperatively the localization of the cortical projection of the inferior choroidal point (ICP) and use it as a reliable landmark when approaching the temporal horn through a middle temporal gyrus access, and to review relevant anatomical features regarding selective amygdalohippocampectomy (AH) for treatment of mesial temporal lobe epilepsy (MTLE). The cortical projection of the inferior choroidal point was used in more than 300 surgeries by one of the authors as a reliable landmark to reach the temporal horn. In the laboratory, forty cerebral hemispheres were examined. The cortical projection of the ICP is a reliable landmark for reaching the temporal horn.

  20. "A Comparison of Consensus, Consistency, and Measurement Approaches to Estimating Interrater Reliability"

    OpenAIRE

    Steven E. Stemler

    2004-01-01

    This article argues that the general practice of describing interrater reliability as a single, unified concept is at best imprecise, and at worst potentially misleading. Rather than representing a single concept, different statistical methods for computing interrater reliability can be more accurately classified into one of three categories based upon the underlying goals of analysis. The three general categories introduced and described in this paper are: 1) consensus estimates, 2) consistency estimates, and 3) measurement estimates.

  1. Benchmarking Experimental and Computational Thermochemical Data: A Case Study of the Butane Conformers.

    Science.gov (United States)

    Barna, Dóra; Nagy, Balázs; Csontos, József; Császár, Attila G; Tasi, Gyula

    2012-02-14

    Due to its crucial importance, numerous studies have been conducted to determine the enthalpy difference between the conformers of butane. However, it is shown here that the most reliable experimental values are biased due to the statistical model utilized during the evaluation of the raw experimental data. In this study, using the appropriate statistical model, both the experimental expectation values and the associated uncertainties are revised. For the 133-196 and 223-297 K temperature ranges, 668 ± 20 and 653 ± 125 cal mol(-1), respectively, are recommended as reference values. Furthermore, to show that present-day quantum chemistry is a favorable alternative to experimental techniques in the determination of enthalpy differences of conformers, a focal-point analysis, based on coupled-cluster electronic structure computations, has been performed that included contributions of up to perturbative quadruple excitations as well as small correction terms beyond the Born-Oppenheimer and nonrelativistic approximations. For the 133-196 and 223-297 K temperature ranges, in exceptional agreement with the corresponding revised experimental data, our computations yielded 668 ± 3 and 650 ± 6 cal mol(-1), respectively. The most reliable enthalpy difference values for 0 and 298.15 K are also provided by the computational approach, 680.9 ± 2.5 and 647.4 ± 7.0 cal mol(-1), respectively.

  2. Systematic approach to integration of a human reliability analysis into an NPP probabilistic risk assessment

    International Nuclear Information System (INIS)

    Fragola, J.R.

    1984-01-01

    This chapter describes the human reliability analysis tasks which were employed in the evaluation of the overall probability of an internal flood sequence and its consequences in terms of disabling vulnerable risk significant equipment. Topics considered include the problem familiarization process, the identification and classification of key human interactions, a human interaction review of potential initiators, a maintenance and operations review, human interaction identification, quantification model selection, the definition of operator-induced sequences, the quantification of specific human interactions, skill- and rule-based interactions, knowledge-based interactions, and the incorporation of human interaction-related events into the event tree structure. It is concluded that an integrated approach to the analysis of human interaction within the context of a Probabilistic Risk Assessment (PRA) is feasible

  3. Reliability Analysis for Adhesive Bonded Composite Stepped Lap Joints Loaded in Fatigue

    DEFF Research Database (Denmark)

    Kimiaeifar, Amin; Sørensen, John Dalsgaard; Lund, Erik

    2012-01-01

    This paper describes a probabilistic approach to calculate the reliability of adhesive bonded composite stepped lap joints loaded in fatigue using three-dimensional finite element analysis (FEA). A method for progressive damage modelling is used to assess fatigue damage accumulation and residual strength. Asymptotic sampling is used to estimate the reliability with support points generated by randomized Sobol sequences. The predicted reliability level is compared with the implicitly required target reliability level defined by the wind turbine standard IEC 61400-1, where partial safety factors are introduced together with characteristic values. Finally, an approach for the assessment of the reliability of adhesive bonded composite stepped lap joints loaded in fatigue is presented. The introduced methodology can be applied in the same way to calculate the reliability level of wind turbine blade components...

  4. A new approach for interexaminer reliability data analysis on dental caries calibration

    Directory of Open Access Journals (Sweden)

    Andréa Videira Assaf

    2007-12-01

    Full Text Available Objectives: a to evaluate the interexaminer reliability in caries detection considering different diagnostic thresholds and b to indicate, by using Kappa statistics, the best way of measuring interexaminer agreement during the calibration process in dental caries surveys. Methods: Eleven dentists participated in the initial training, which was divided into theoretical discussions and practical activities, and calibration exercises, performed at baseline, 3 and 6 months after the initial training. For the examinations of 6-7-year-old schoolchildren, the World Health Organization (WHO recommendations were followed and different diagnostic thresholds were used: WHO (decayed/missing/filled teeth - DMFT index and WHO + IL (initial lesion diagnostic thresholds. The interexaminer reliability was calculated by Kappa statistics, according to WHO and WHO+IL thresholds considering: a the entire dentition; b upper/lower jaws; c sextants; d each tooth individually. Results: Interexaminer reliability was high for both diagnostic thresholds; nevertheless, it decreased in all calibration sections when considering teeth individually. Conclusion: The interexaminer reliability was possible during the period of 6 months, under both caries diagnosis thresholds. However, great disagreement was observed for posterior teeth, especially using the WHO+IL criteria. Analysis considering dental elements individually was the best way of detecting interexaminer disagreement during the calibration sections.
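    The record above reports interexaminer agreement via Kappa statistics at several levels of aggregation. As an illustration only (the category labels and scores below are invented, not data from the study), Cohen's kappa for two examiners can be computed as:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    # Chance-corrected agreement between two raters scoring the same items:
    # kappa = (observed agreement - expected agreement) / (1 - expected).
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement under independence: sum of products of marginals.
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb.get(k, 0) for k in ca) / (n * n)
    return (observed - expected) / (1 - expected)

# Invented scores: two examiners rating 10 teeth as sound (0),
# initial lesion (1) or decayed (2).
a = [0, 0, 1, 2, 2, 0, 1, 1, 0, 2]
b = [0, 0, 1, 2, 1, 0, 1, 2, 0, 2]
print(round(cohens_kappa(a, b), 3))  # → 0.697
```

    Computing kappa per tooth rather than over the whole dentition, as the study recommends, simply means applying the same formula to each tooth's ratings across examiner pairs.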

  5. Reliability of Beam Loss Monitors System for the Large Hadron Collider

    CERN Document Server

    Guaglio, Gianluca; Santoni, C

    2004-01-01

    The employment of superconducting magnets in high-energy colliders opens challenging failure scenarios and brings new criticalities for the whole system protection. For the LHC beam loss protection system, the failure rate and the availability requirements have been evaluated using the Safety Integrity Level (SIL) approach. A downtime cost evaluation is used as input for the SIL approach. The most critical systems, which contribute to the final SIL value, are the dump system, the interlock system, the beam loss monitors system and the energy monitor system. The Beam Loss Monitors System (BLMS) is critical for short and intense particle losses, while at medium and longer loss times it is assisted by other systems, such as the quench protection system and the cryogenic system. For BLMS, hardware and software have been evaluated in detail. The reliability input figures have been collected using historical data from the SPS, using temperature and radiation damage experimental data as well as using standard...

  6. Reliability analysis of component of affination centrifugal 1 machine by using reliability engineering

    Science.gov (United States)

    Sembiring, N.; Ginting, E.; Darnello, T.

    2017-12-01

    A company producing refined sugar has not reached the required level of critical machine availability on the production floor because the machines frequently suffer breakdowns, resulting in sudden losses of production time and production opportunities. This problem can be addressed with the Reliability Engineering method, in which a statistical approach to historical failure data is used to identify the pattern of the distribution. The method provides values for the reliability, failure rate, and availability of a machine over the scheduled maintenance interval. The distribution test on the time-between-failures (MTTF) data shows that the flexible hose component follows a lognormal distribution while the teflon cone lifting component follows a Weibull distribution; for the mean time to repair (MTTR) data, the flexible hose component follows an exponential distribution while the teflon cone lifting component follows a Weibull distribution. On a replacement schedule of 720 hours, the flexible hose component has a reliability of 0.2451 and an availability of 0.9960, while on a replacement schedule of 1944 hours, the critical teflon cone lifting component has a reliability of 0.4083 and an availability of 0.9927.
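    The reliability and availability figures quoted above follow from standard lifetime-distribution formulas. A minimal sketch of those formulas; the parameter values used in the demo are hypothetical, since the abstract does not report the fitted distribution parameters:

```python
import math

def weibull_reliability(t, eta, beta):
    # Survival probability of a Weibull time-to-failure (scale eta, shape beta).
    return math.exp(-((t / eta) ** beta))

def lognormal_reliability(t, mu, sigma):
    # Survival probability of a lognormal time-to-failure via the normal CDF.
    z = (math.log(t) - mu) / sigma
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def availability(mttf, mttr):
    # Steady-state availability: mean uptime fraction over a failure/repair cycle.
    return mttf / (mttf + mttr)

# Hypothetical parameters, for illustration only:
r = weibull_reliability(1944, eta=2000, beta=1.3)
a = availability(mttf=2000, mttr=15)
```

    With fitted MTTF/MTTR distributions in hand, evaluating the survival function at the replacement interval gives the reliability figure, and the MTTF/(MTTF+MTTR) ratio gives the availability.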

  7. A novel reliability evaluation method for large engineering systems

    Directory of Open Access Journals (Sweden)

    Reda Farag

    2016-06-01

    Full Text Available A novel reliability evaluation method for large nonlinear engineering systems excited by dynamic loading applied in the time domain is presented. For this class of problems, the performance functions are expected to be functions of time and implicit in nature. Available first- or second-order reliability methods (FORM/SORM) are challenging to apply to estimate the reliability of such systems. Because of its inefficiency, the classical Monte Carlo simulation (MCS) method also cannot be used for large nonlinear dynamic systems. In the proposed approach, only tens instead of hundreds or thousands of deterministic evaluations at intelligently selected points are used to extract the reliability information. A hybrid approach, consisting of the stochastic finite element method (SFEM) developed by the author and his research team using FORM, the response surface method (RSM), an interpolation scheme, and advanced factorial schemes, is proposed. The method is illustrated with the help of several numerical examples.
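    As a rough illustration of the baseline the abstract argues against, a crude Monte Carlo estimate of failure probability for a toy limit state (not the paper's large nonlinear dynamic systems, and with invented distributions) can be sketched as:

```python
import random

def mc_failure_probability(g, sample, n=100_000, seed=1):
    # Crude Monte Carlo estimate of P(g(X) <= 0), the failure probability:
    # draw n samples and count the fraction falling in the failure domain.
    rng = random.Random(seed)
    failures = sum(g(sample(rng)) <= 0 for _ in range(n))
    return failures / n

# Toy limit state: resistance R ~ N(10, 1) minus load S ~ N(7, 1);
# failure occurs when g = R - S <= 0.
pf = mc_failure_probability(lambda x: x[0] - x[1],
                            lambda rng: (rng.gauss(10, 1), rng.gauss(7, 1)))
# Exact value is Phi(-3/sqrt(2)), roughly 0.017.
```

    Each point of accuracy costs a multiple of samples, which is why the abstract's point-selection strategy (tens of deterministic evaluations instead of thousands) matters for expensive finite element models.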

  8. Human factors reliability benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1989-08-01

    The Joint Research Centre of the European Commission has organised a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim of assessing the state of the art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participated in the HF-RBE. The HF-RBE was organised around two study cases: (1) analysis of routine functional Test and Maintenance (TPM) procedures, with the aim of assessing the probability of test-induced failures, the probability of failures remaining unrevealed, and the potential to initiate transients because of errors performed in the test; (2) analysis of human actions during an operational transient, with the aim of assessing the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. This report summarises the contributions received from the participants and analyses them on a comparative basis. The aims of this analysis were to compare the procedures, modelling techniques and quantification methods used, to obtain insight into the causes and magnitude of the variability observed in the results, to try to identify preferred human reliability assessment approaches, and to gain an understanding of the current state of the art in the field, identifying the limitations that are still inherent in the different approaches

  9. An approach based on defense-in-depth and diversity (3D) for the reliability assessment of digital instrument and control systems of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Paulo Adriano da; Saldanha, Pedro L.C., E-mail: pasilva@cnen.gov.b, E-mail: Saldanha@cnen.gov.b [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil). Coord. Geral de Reatores Nucleares; Melo, Paulo F. Frutuoso e, E-mail: frutuoso@nuclear.ufrj.b [Universidade Federal do Rio de Janeiro (PEN/COPPE/UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-Graduacao em Engenharia. Programa de Engenharia Nuclear; Araujo, Ademir L. de [Associacao Brasileira de Ensino Universitario (UNIABEU), Angra dos Reis, RJ (Brazil)

    2011-07-01

    The adoption of digital instrumentation and control (I and C) technology has been slower in nuclear power plants. The reason has been unfruitful efforts to obtain evidence proving that digital I and C systems can be used in nuclear safety systems, for example the Reactor Protection System (RPS), while ensuring the proper operation of all their functions. This technology offers a potential improvement in safety and reliability. However, there is still no consensus about the model to be adopted for digital system software to be used in reliability studies. This paper presents the 3D methodology approach to assess digital I and C reliability. It is based on the study of operational events occurring in NPPs. It is easy to identify, in general, the level of I and C system reliability, showing its key vulnerabilities and enabling regulatory actions to be traced to minimize or avoid them. This approach makes it possible to identify the main types of digital I and C system failure, with the potential for common cause failures, as well as to evaluate the dominant failure modes. The MAFIC-D software was developed to assist the implementation of the relationships between the reliability criteria, the analysis of these relationships, and data collection. The results obtained through this tool proved to be satisfactory; they support the process of regulatory decision-making in licensing digital I and C of NPPs and can still be used to monitor the performance of digital I and C post-licensing during the lifetime of the system, providing the basis for the elaboration of checklists for regulatory inspections. (author)

  10. An approach based on defense-in-depth and diversity (3D) for the reliability assessment of digital instrument and control systems of nuclear power plants

    International Nuclear Information System (INIS)

    Silva, Paulo Adriano da; Saldanha, Pedro L.C.

    2011-01-01

    The adoption of digital instrumentation and control (I and C) technology has been slower in nuclear power plants. The reason has been unfruitful efforts to obtain evidence proving that digital I and C systems can be used in nuclear safety systems, for example the Reactor Protection System (RPS), while ensuring the proper operation of all their functions. This technology offers a potential improvement in safety and reliability. However, there is still no consensus about the model to be adopted for digital system software to be used in reliability studies. This paper presents the 3D methodology approach to assess digital I and C reliability. It is based on the study of operational events occurring in NPPs. It is easy to identify, in general, the level of I and C system reliability, showing its key vulnerabilities and enabling regulatory actions to be traced to minimize or avoid them. This approach makes it possible to identify the main types of digital I and C system failure, with the potential for common cause failures, as well as to evaluate the dominant failure modes. The MAFIC-D software was developed to assist the implementation of the relationships between the reliability criteria, the analysis of these relationships, and data collection. The results obtained through this tool proved to be satisfactory; they support the process of regulatory decision-making in licensing digital I and C of NPPs and can still be used to monitor the performance of digital I and C post-licensing during the lifetime of the system, providing the basis for the elaboration of checklists for regulatory inspections. (author)

  11. Reliable Portfolio Selection Problem in Fuzzy Environment: An mλ Measure Based Approach

    Directory of Open Access Journals (Sweden)

    Yuan Feng

    2017-04-01

    Full Text Available This paper investigates a fuzzy portfolio selection problem with guaranteed reliability, in which the fuzzy variables are used to capture the uncertain returns of different securities. To effectively handle the fuzziness in a mathematical way, a new expected value operator and variance of fuzzy variables are defined based on the mλ measure, which is a linear combination of the possibility measure and necessity measure to balance the pessimism and optimism in the decision-making process. To formulate the reliable portfolio selection problem, we particularly adopt the expected total return and standard variance of the total return to evaluate the reliability of the investment strategies, producing three risk-guaranteed reliable portfolio selection models. To solve the proposed models, an effective genetic algorithm is designed to generate the approximate optimal solution to the considered problem. Finally, numerical examples are given to show the performance of the proposed models and algorithm.

  12. Reliability Analysis of Retaining Walls Subjected to Blast Loading by Finite Element Approach

    Science.gov (United States)

    GuhaRay, Anasua; Mondal, Stuti; Mohiuddin, Hisham Hasan

    2018-02-01

    Conventional design methods adopt factor of safety as per practice and experience, which are deterministic in nature. The limit state method, though not completely deterministic, does not take into account effect of design parameters, which are inherently variable such as cohesion, angle of internal friction, etc. for soil. Reliability analysis provides a measure to consider these variations into analysis and hence results in a more realistic design. Several studies have been carried out on reliability of reinforced concrete walls and masonry walls under explosions. Also, reliability analysis of retaining structures against various kinds of failure has been done. However, very few research works are available on reliability analysis of retaining walls subjected to blast loading. Thus, the present paper considers the effect of variation of geotechnical parameters when a retaining wall is subjected to blast loading. However, it is found that the variation of geotechnical random variables does not have a significant effect on the stability of retaining walls subjected to blast loading.

  13. Development of student performance assessment based on scientific approach for a basic physics practicum in simple harmonic motion materials

    Science.gov (United States)

    Serevina, V.; Muliyati, D.

    2018-05-01

    This research aims to develop a student performance assessment instrument based on the scientific approach that is valid and reliable for assessing the performance of students in a basic physics laboratory on Simple Harmonic Motion (SHM). The study uses the ADDIE model, consisting of the stages Analyze, Design, Development, Implementation, and Evaluation. The student performance assessment developed can be used to measure students’ skills in observing, asking, conducting experiments, associating and communicating experimental results, which are the ‘5M’ stages of the scientific approach. Each assessment item in the instrument was validated by an instrument expert, with all items judged eligible for use (100% eligibility). The instrument was then tested for the quality of its construction, material, and language by a panel of lecturers, with the results: 85% (very good) for the construction aspect, 87.5% (very good) for the material aspect, and 83% (very good) for the language aspect. The small-group trial yielded an instrument reliability of 0.878, in the high category (r-table = 0.707); the large-group trial yielded 0.889, also in the high category (r-table = 0.320). The instrument was declared valid and reliable at the 5% significance level. Based on these results, it can be concluded that the student performance assessment instrument based on the scientific approach is valid and reliable for assessing student skills in SHM experimental activities.

  14. A New Approach for Analyzing the Reliability of the Repair Facility in a Series System with Vacations

    Directory of Open Access Journals (Sweden)

    Renbin Liu

    2012-01-01

    Full Text Available Based on renewal process theory, we develop a decomposition method to analyze the reliability of the repair facility in an n-unit series system with vacations. Using this approach, we study the unavailability and the mean replacement number during (0,t] of the repair facility. The method proposed in this work is novel and concise, and it reveals clearly that the facility indices of a series system with an unreliable repair facility have the structure of two convolution relations. Special cases and numerical examples are given to show the validity of our method.

  15. Reliability analysis and updating of deteriorating systems with subset simulation

    DEFF Research Database (Denmark)

    Schneider, Ronald; Thöns, Sebastian; Straub, Daniel

    2017-01-01

    An efficient approach to reliability analysis of deteriorating structural systems is presented, which considers stochastic dependence among element deterioration. Information on a deteriorating structure obtained through inspection or monitoring is included in the reliability assessment through Bayesian updating. Subset simulation is an efficient and robust sampling-based algorithm suitable for such analyses. The approach is demonstrated in two case studies considering a steel frame structure and a Daniels system subjected to high-cycle fatigue.

  16. Damage Model for Reliability Assessment of Solder Joints in Wind Turbines

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2012-01-01

    The reliability of solder joints in wind turbine electrical components is affected by temperature loadings and other environmental factors. Reliability assessment for such types of products is conventionally performed by classical reliability techniques based on test data, which are usually time- and resource-consuming activities. Thus, in this paper we choose a physics-of-failure approach to define a damage model based on Miner’s rule. Our attention is focused on crack propagation in solder joints of electrical components due to temperature loadings. Based on the proposed method, it is described how to find the damage level for a given temperature loading profile. The proposed method is discussed...
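    Miner's rule, the linear damage-accumulation rule named in the abstract, can be sketched as follows; the cycle counts and cycles-to-failure values are invented for illustration, not taken from the study:

```python
def miners_damage(cycle_counts, cycles_to_failure):
    # Linear damage accumulation: D = sum(n_i / N_i); failure is predicted
    # once the accumulated damage D reaches 1.
    return sum(n / N for n, N in zip(cycle_counts, cycles_to_failure))

# Hypothetical thermal-cycling profile: applied cycles per loading range
# versus cycles-to-failure for that range (values invented).
d = miners_damage([10_000, 2_000, 500], [1_000_000, 50_000, 5_000])
# d = 0.01 + 0.04 + 0.10 = 0.15, i.e. 15% of the fatigue life consumed
```

    In a physics-of-failure setting, the cycles-to-failure terms would come from a crack-propagation model of the solder joint under each temperature loading range.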

  17. An accurate and efficient reliability-based design optimization using the second order reliability method and improved stability transformation method

    Science.gov (United States)

    Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo

    2018-05-01

    The first-order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it shows inaccuracy in calculating the failure probability with highly nonlinear performance functions. Thus, the second-order reliability method is required to evaluate the reliability accurately. However, its application to RBDO is quite challenging owing to the expensive computational cost incurred by the repeated reliability evaluations and Hessian calculations of probabilistic constraints. In this article, a new improved stability transformation method is proposed to search for the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared to existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for RBDO of engineering structures.

  18. Design for Reliability in Renewable Energy Systems

    DEFF Research Database (Denmark)

    Blaabjerg, Frede; Zhou, Dao; Sangwongwanich, Ariya

    2017-01-01

    Power electronics are widely used in renewable energy systems to achieve lower cost of energy, higher efficiency and high power density. At the same time, high reliability of the power electronics products is demanded, in order to reduce the failure rates and ensure cost-effective operation of the renewable energy systems. This paper thus describes the basic concepts used in reliability engineering, and presents the status and future trends of Design for Reliability (DfR) in power electronics, which is currently undergoing a paradigm shift to a physics-of-failure approach. Two case studies of a 2 MW...

  19. Random Fuzzy Extension of the Universal Generating Function Approach for the Reliability Assessment of Multi-State Systems Under Aleatory and Epistemic Uncertainties

    DEFF Research Database (Denmark)

    Li, Yan-Fu; Ding, Yi; Zio, Enrico

    2014-01-01

    In this work, we extend the traditional universal generating function (UGF) approach for multi-state system (MSS) availability and reliability assessment to account for both aleatory and epistemic uncertainties. First, a theoretical extension, named hybrid UGF (HUGF), is made to introduce the use of random fuzzy variables (RFVs) in the approach. Second, the composition operator of HUGF is defined by considering simultaneously the probabilistic convolution and the fuzzy extension principle. Finally, an efficient algorithm is designed to extract probability boxes (p-boxes) from the system HUGF, which...

  20. A generic method for assignment of reliability scores applied to solvent accessibility predictions

    Directory of Open Access Journals (Sweden)

    Nielsen Morten

    2009-07-01

    Full Text Available Abstract Background Estimation of the reliability of specific real-value predictions is nontrivial and the efficacy of this is often questionable. It is important to know if you can trust a given prediction and therefore the best methods associate a prediction with a reliability score or index. For discrete qualitative predictions, the reliability is conventionally estimated as the difference between output scores of selected classes. Such an approach is not feasible for methods that predict a biological feature as a single real value rather than a classification. As a solution to this challenge, we have implemented a method that predicts the relative surface accessibility of an amino acid and simultaneously predicts the reliability for each prediction, in the form of a Z-score. Results An ensemble of artificial neural networks has been trained on a set of experimentally solved protein structures to predict the relative exposure of the amino acids. The method assigns a reliability score to each surface accessibility prediction as an inherent part of the training process. This is in contrast to the most commonly used procedures, where reliabilities are obtained by post-processing the output. Conclusion The performance of the neural networks was evaluated on a commonly used set of sequences known as the CB513 set. An overall Pearson's correlation coefficient of 0.72 was obtained, which is comparable to the performance of the currently best publicly available method, Real-SPINE. Both methods associate a reliability score with the individual predictions. However, our implementation of reliability scores in the form of a Z-score is shown to be the more informative measure for discriminating good predictions from bad ones in the entire range from completely buried to fully exposed amino acids. This is evident when comparing the Pearson's correlation coefficient for the upper 20% of predictions sorted according to reliability.
For this subset, values of 0
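    The comparison described above ranks predictions by their reliability score and correlates only the top fraction with the observed values. A sketch of that evaluation idea, with invented function names and toy data (not the CB513 evaluation itself):

```python
import math

def pearson(x, y):
    # Pearson's correlation coefficient between two equal-length sequences.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def pearson_top_fraction(pred, obs, reliability, frac=0.2):
    # Correlation restricted to the most reliable predictions, mirroring
    # the "upper 20% sorted by reliability" comparison in the abstract.
    ranked = sorted(zip(reliability, pred, obs), reverse=True)
    k = max(2, int(len(ranked) * frac))
    top = ranked[:k]
    return pearson([p for _, p, _ in top], [o for _, _, o in top])
```

    A reliability score is informative precisely when this restricted correlation is clearly higher than the correlation over all predictions.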

  1. Towards Reliable Multi-Hop Broadcast in VANETs: An Analytical Approach

    NARCIS (Netherlands)

    Gholibeigi, Mozhdeh; Baratchi, Mitra; van den Berg, Hans Leo; Heijenk, Geert

    2016-01-01

    Intelligent Transportation Systems in the domain of vehicular networking, have recently been subject to rapid development. In vehicular ad hoc networks, data broadcast is one of the main communication types and its reliability is crucial for high performance applications. However, due to the lack of

  2. Towards reliable multi-hop broadcast in VANETs : An analytical approach

    NARCIS (Netherlands)

    Gholibeigi, M.; Baratchi, M.; Berg, J.L. van den; Heijenk, G.

    2017-01-01

    Intelligent Transportation Systems in the domain of vehicular networking, have recently been subject to rapid development. In vehicular ad hoc networks, data broadcast is one of the main communication types and its reliability is crucial for high performance applications. However, due to the lack of

  3. Software engineering practices for control system reliability

    International Nuclear Information System (INIS)

    S. K. Schaffner; K. S White

    1999-01-01

    This paper will discuss software engineering practices used to improve control system reliability. The authors begin with a brief discussion of the Software Engineering Institute's Capability Maturity Model (CMM), which is a framework for evaluating and improving key practices used to enhance software development and maintenance capabilities. The software engineering processes developed and used by the Controls Group at the Thomas Jefferson National Accelerator Facility (Jefferson Lab), using the Experimental Physics and Industrial Control System (EPICS) for accelerator control, are described. Examples are given of how their procedures have been used to minimize control system downtime and improve reliability. While their examples are primarily drawn from their experience with EPICS, these practices are equally applicable to any control system. Specific issues addressed include resource allocation, developing reliable software lifecycle processes and risk management

  4. Reliability factors of tube-to-tubesheet joints

    International Nuclear Information System (INIS)

    Sang, Z.F.; Zhu, Y.Z.; Widera, G.E.O.

    1992-01-01

    The main purpose of this paper is to provide an applicable method to establish reliability factors for expanded tube-to-tubesheet joints. The paper also reports on the results of a preliminary study to validate experimentally the reliability efficiencies listed in Table A-2 of Appendix A of Section VIII, Division 1, of the ASME Boiler and Pressure Vessel Code. A comparison between the actual reliability factors f_r, determined from testing the damage strength of the joint and calculated according to Appendix A-4 of the ASME Code, and those of Table A-2 is carried out. The results are discussed in light of the restrictions inherent in Table A-2. It is confirmed that some existing values of f_r are conservative while others are less so. (orig.)

  5. Reliable software for unreliable hardware a cross layer perspective

    CERN Document Server

    Rehman, Semeen; Henkel, Jörg

    2016-01-01

    This book describes novel software concepts to increase reliability under user-defined constraints. The authors' approach bridges, for the first time, the reliability gap between hardware and software. Readers will learn how to achieve increased soft error resilience on unreliable hardware, while exploiting the inherent error-masking characteristics and the error-mitigation potential (for errors stemming from soft errors, aging, and process variations) at different software layers. · Provides a comprehensive overview of reliability modeling and optimization techniques at different hardware and software levels; · Describes novel optimization techniques for software cross-layer reliability, targeting unreliable hardware.

  6. An interval-valued reliability model with bounded failure rates

    DEFF Research Database (Denmark)

    Kozine, Igor; Krymsky, Victor

    2012-01-01

    The approach to deriving interval-valued reliability measures described in this paper is distinctive from other imprecise reliability models in that it overcomes the issue of having to impose an upper bound on time to failure. It rests on the presupposition that a constant interval-valued failure rate is known, possibly along with other reliability measures, precise or imprecise. The Lagrange method is used to solve the constrained optimization problem to derive new reliability measures of interest. The obtained results call for an exponential-wise approximation of failure probability density...

  7. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

    This book begins with the question of what reliability is, covering the origin of reliability problems, the definition of reliability, and the uses of reliability. It also deals with probability and the calculation of reliability, the reliability function and failure rate, probability distributions in reliability, estimation of MTBF, probability distribution processes, downtime, maintainability and availability, breakdown maintenance and preventive maintenance, design for reliability, reliability prediction and statistics, reliability testing, and reliability data, as well as the design and management of reliability.
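
    The basic quantities this record lists (reliability function, failure rate, MTBF, availability) relate through the constant-failure-rate exponential model; a minimal sketch, with an assumed failure rate:

    ```python
    import math

    def reliability(t, failure_rate):
        """Reliability function R(t) = exp(-lambda * t) for a constant failure rate."""
        return math.exp(-failure_rate * t)

    def mtbf(failure_rate):
        """Mean time between failures for the exponential model: MTBF = 1 / lambda."""
        return 1.0 / failure_rate

    def availability(mtbf_hours, mttr_hours):
        """Steady-state availability = MTBF / (MTBF + MTTR)."""
        return mtbf_hours / (mtbf_hours + mttr_hours)

    lam = 1e-4                       # assumed: failures per hour
    print(reliability(1000, lam))    # survival probability over 1000 h, ~0.905
    print(mtbf(lam))                 # 10000.0 hours
    print(availability(10000, 24))   # ~0.998 with a 24 h mean repair time
    ```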

  8. Reliability-based performance simulation for optimized pavement maintenance

    International Nuclear Information System (INIS)

    Chou, Jui-Sheng; Le, Thanh-Son

    2011-01-01

    Roadway pavement maintenance is essential for driver safety and highway infrastructure efficiency. However, regular preventive maintenance and rehabilitation (M and R) activities are extremely costly. Unfortunately, the funds available for the M and R of highway pavement are often given lower priority compared to other national development policies; therefore, available funds must be allocated wisely. Maintenance strategies are typically implemented by optimizing only the cost whilst the reliability of facility performance is neglected. This study proposes a novel algorithm using the multi-objective particle swarm optimization (MOPSO) technique to evaluate the cost-reliability tradeoff in a flexible maintenance strategy based on non-dominated solutions. Moreover, a probabilistic model for regression parameters is employed to assess reliability-based performance. A numerical example of a highway pavement project is illustrated to demonstrate the efficacy of the proposed MOPSO algorithms. The analytical results show that the proposed approach can help decision makers to optimize roadway maintenance plans. - Highlights: → A novel algorithm using the multi-objective particle swarm optimization technique. → Evaluation of the cost-reliability tradeoff in a flexible maintenance strategy. → A probabilistic model for regression parameters is employed to assess reliability-based performance. → The proposed approach can help decision makers to optimize roadway maintenance plans.
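
    The non-dominance idea behind the cost-reliability tradeoff can be illustrated with a plain Pareto filter (the maintenance-plan data are hypothetical, and this is only the dominance check, not the particle swarm optimizer itself):

    ```python
    def dominates(a, b):
        """a dominates b if a is no worse in every objective and strictly
        better in at least one. Objectives: (cost, failure probability),
        both to be minimized."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(points):
        """Return the non-dominated subset -- the cost-reliability tradeoff front."""
        return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

    # hypothetical maintenance plans: (cost in $M, probability of performance failure)
    plans = [(1.0, 0.20), (1.5, 0.10), (2.0, 0.12), (2.5, 0.05), (3.0, 0.05)]
    print(pareto_front(plans))  # keeps only the plans no other plan beats on both axes
    ```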

  9. Reliability-based performance simulation for optimized pavement maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Chou, Jui-Sheng, E-mail: jschou@mail.ntust.edu.tw [Department of Construction Engineering, National Taiwan University of Science and Technology (Taiwan Tech), 43 Sec. 4, Keelung Rd., Taipei 106, Taiwan (China); Le, Thanh-Son [Department of Construction Engineering, National Taiwan University of Science and Technology (Taiwan Tech), 43 Sec. 4, Keelung Rd., Taipei 106, Taiwan (China)

    2011-10-15

    Roadway pavement maintenance is essential for driver safety and highway infrastructure efficiency. However, regular preventive maintenance and rehabilitation (M and R) activities are extremely costly. Unfortunately, the funds available for the M and R of highway pavement are often given lower priority compared to other national development policies; therefore, available funds must be allocated wisely. Maintenance strategies are typically implemented by optimizing only the cost whilst the reliability of facility performance is neglected. This study proposes a novel algorithm using the multi-objective particle swarm optimization (MOPSO) technique to evaluate the cost-reliability tradeoff in a flexible maintenance strategy based on non-dominated solutions. Moreover, a probabilistic model for regression parameters is employed to assess reliability-based performance. A numerical example of a highway pavement project is illustrated to demonstrate the efficacy of the proposed MOPSO algorithms. The analytical results show that the proposed approach can help decision makers to optimize roadway maintenance plans. - Highlights: > A novel algorithm using the multi-objective particle swarm optimization technique. > Evaluation of the cost-reliability tradeoff in a flexible maintenance strategy. > A probabilistic model for regression parameters is employed to assess reliability-based performance. > The proposed approach can help decision makers to optimize roadway maintenance plans.

  10. Probabilistic risk assessment for a loss of coolant accident in McMaster Nuclear Reactor and application of reliability physics model for modeling human reliability

    Science.gov (United States)

    Ha, Taesung

    A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequence identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors is significantly different from that in power reactors, a time-oriented HRA model (reliability physics model) was applied for the human error probability (HEP) estimation of the core relocation. This model is based on two competing random variables: phenomenological time and performance time. The response surface and direct Monte Carlo simulation with Latin Hypercube sampling were applied for estimating the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned by statistical goodness-of-fit tests. The human error probability (HEP) for the core relocation was estimated from these two competing quantities: phenomenological time and operators' performance time. The sensitivity of each probability distribution in human reliability estimation was investigated. In order to quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected due to its capability of incorporating uncertainties in the model itself and in its parameters. The HEP from the current time-oriented model was compared with that from the ASEP approach. Both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability and its potential
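
    The competing-random-variables idea (HEP = probability that operator performance time exceeds the available phenomenological time) can be sketched with direct Monte Carlo sampling; the lognormal parameters below are illustrative assumptions, not the study's fitted distributions:

    ```python
    import random

    def estimate_hep(n=100_000, seed=1):
        """Monte Carlo estimate of HEP = P(performance time > phenomenological time).

        Assumed distributions (for illustration only):
        phenomenological time ~ lognormal(3.0, 0.3)  -- time available before damage
        performance time      ~ lognormal(2.5, 0.5)  -- time operators need to act
        """
        rng = random.Random(seed)
        failures = 0
        for _ in range(n):
            t_phen = rng.lognormvariate(3.0, 0.3)
            t_perf = rng.lognormvariate(2.5, 0.5)
            if t_perf > t_phen:
                failures += 1
        return failures / n

    hep = estimate_hep()
    print(round(hep, 3))  # fraction of trials where operators are too slow
    ```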

  11. Fatigue Reliability of Offshore Wind Turbine Systems

    DEFF Research Database (Denmark)

    Marquez-Dominguez, Sergio; Sørensen, John Dalsgaard

    2012-01-01

    of appropriate partial safety factors / fatigue design factors (FDF) for steel substructures of offshore wind turbines (OWTs). The fatigue life is modeled by the SN approach. Design and limit state equations are established based on the accumulated fatigue damage. The acceptable reliability level for optimal fatigue design of OWTs is discussed and results for reliability assessment of typical fatigue critical design of offshore steel support structures are presented.
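
    The accumulated-damage limit state underlying the SN approach is conventionally the Palmgren-Miner sum; a minimal sketch with a hypothetical two-block load spectrum:

    ```python
    def miner_damage(applied_cycles, cycles_to_failure):
        """Palmgren-Miner accumulated fatigue damage D = sum(n_i / N_i).
        The fatigue limit state is reached when D >= 1; fatigue design
        factors (FDF) scale the design life to keep D safely below 1."""
        return sum(n / N for n, N in zip(applied_cycles, cycles_to_failure))

    # hypothetical spectrum: applied cycles n_i vs. SN-curve capacities N_i
    d = miner_damage([1.0e6, 5.0e5], [1.0e7, 2.0e6])
    print(d)  # ~0.35: well below the D = 1 limit state
    ```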

  12. Product quality, service reliability and management of operations at ...

    African Journals Online (AJOL)

    High product quality, service reliability, and management of operations are key factors in business growth and sustainability. Analyzing “The Starbucks Experience” is a pedagogical approach to reinforcing the concepts of control and management of quality, service reliability, and efficient operations in action. The objective ...

  13. Reliability of poly 3,4-ethylenedioxythiophene strain gauge

    DEFF Research Database (Denmark)

    Mateiu, Ramona Valentina; Lillemose, Michael; Hansen, Thomas Steen

    2007-01-01

    We report on the experimentally observed reliability of the piezoresistive effect in strained poly 3,4-ethylenedioxythiophene (PEDT). PEDT is an intrinsic conductive polymer which can be patterned by conventional cleanroom processing, and thus presents a promising material for all-polymer Microsystems. The measurements are made on microfabricated test chips with PEDT resistors patterned by conventional UV-lithography and reactive ion etching (RIE). We determine a gauge factor of 3.41 ± 0.42 for the strained PEDT and we see an increase in resistivity from 1.98 · 10^4 Ω m to 2.22 · 10^4 Ω m when...
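
    The gauge factor reported above is defined as the relative resistance change per unit strain; a minimal sketch with made-up resistance values (not the chip measurements):

    ```python
    def gauge_factor(r_unstrained, r_strained, strain):
        """Gauge factor GF = (delta_R / R) / strain, the piezoresistive
        sensitivity of a strain gauge such as a patterned PEDT resistor."""
        delta_r = r_strained - r_unstrained
        return (delta_r / r_unstrained) / strain

    # hypothetical resistor: 1 kOhm at rest, measured again at 0.1% strain
    gf = gauge_factor(1000.0, 1003.41, 0.001)
    print(round(gf, 2))  # 3.41
    ```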

  14. Energy requirements during sponge cake baking: Experimental and simulated approach

    International Nuclear Information System (INIS)

    Ureta, M. Micaela; Goñi, Sandro M.; Salvadori, Viviana O.; Olivera, Daniela F.

    2017-01-01

    Highlights: • Sponge cake energy consumption during baking was studied. • High oven temperature and forced convection mode favour oven energy savings. • Forced convection produced higher weight loss, and thus a higher product energy demand. • Product energy demand was satisfactorily estimated by the baking model applied. • The greatest energy efficiency corresponded to the forced convection mode. - Abstract: Baking is a highly energy-demanding process, which requires special attention in order to understand and improve its efficiency. In this work, the energy consumption associated with sponge cake baking is investigated. A wide range of operative conditions (two ovens, three convection modes, three oven temperatures) were compared. Experimental oven energy consumption was estimated taking into account the heating resistances' power and a usage factor. Product energy demand was estimated from both experimental and modeling approaches, considering sensible and latent heat. Oven energy consumption results showed that high oven temperature and forced convection mode favour energy savings. Regarding product energy demand, forced convection produced faster and greater weight loss, inducing a higher energy demand. Besides, this parameter was satisfactorily estimated by the baking model applied, with an average error between experimental and simulated values in the range of 8.0–10.1%. Finally, the energy efficiency results indicated that efficiency increased linearly with the effective oven temperature and that the greatest efficiency corresponded to the forced convection mode.
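
    Product energy demand as sensible plus latent heat can be sketched as follows; the batter mass, specific heat, temperature rise, and water loss are assumed values for illustration, not the paper's measurements:

    ```python
    def product_energy_demand(mass_kg, cp_j_per_kg_k, t_initial_c, t_final_c,
                              water_loss_kg, latent_j_per_kg=2.257e6):
        """Product energy demand Q = sensible heat + latent heat of evaporation:
        Q = m * cp * (T_final - T_initial) + m_evaporated * L_v."""
        sensible = mass_kg * cp_j_per_kg_k * (t_final_c - t_initial_c)
        latent = water_loss_kg * latent_j_per_kg
        return sensible + latent

    # hypothetical batch: 0.5 kg batter heated from 20 to 98 degC, 50 g water lost
    q = product_energy_demand(0.5, 2800.0, 20.0, 98.0, 0.05)
    print(q)  # joules; the latent term dominates when weight loss is high
    ```

    This decomposition is why the forced convection mode, despite its shorter baking time, raises the product energy demand: faster evaporation increases the latent term.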

  15. Reliability of mechanisms with periodic random modal frequencies using an extreme value-based approach

    International Nuclear Information System (INIS)

    Savage, Gordon J.; Zhang, Xufang; Son, Young Kap; Pandey, Mahesh D.

    2016-01-01

    Resonance in a dynamic system is to be avoided since it often leads to impaired performance, overstressing, fatigue fracture and adverse human reactions. Thus, it is necessary to know the modal frequencies and ensure they do not coincide with any applied periodic loadings. For a rotating planar mechanism, the coefficients in the mass and stiffness matrices are periodically varying, and if the underlying geometry and material properties are treated as random variables then the modal frequencies are both position-dependent and probabilistic. The avoidance of resonance is now a complex problem. Herein, free vibration analysis helps determine ranges of modal frequencies that in turn, identify the running speeds of the mechanism to be avoided. This paper presents an efficient and accurate sample-based approach to determine probabilistic minimum and maximum extremes of the fundamental frequencies and the angular positions of their occurrence. Then, given critical lower and upper frequency constraints it is straightforward to determine reliability in terms of probability of exceedance. The novelty of the proposed approach is that the original expensive and implicit mechanistic model is replaced by an explicit meta-model that captures the tolerances of the design variables over the entire range of angular positions: position-dependent eigenvalues can be found easily and quickly. Extreme-value statistics of the modal frequencies and extreme-value statistics of the angular positions are readily computed through MCS. Limit-state surfaces that connect the frequencies to the design variables may be easily constructed. Error analysis identifies three errors and the paper presents ways to control them so the methodology can be sufficiently accurate. A numerical example of a flexible four-bar linkage shows the proposed methodology has engineering applications. The impact of the proposed methodology is two-fold: it presents a safe-side analysis based on free vibration methods to
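
    A sample-based scan for extreme modal frequencies over angular position can be sketched as below; the closed-form surrogate frequency model and the random variables are illustrative stand-ins for the paper's meta-model, not its actual formulation:

    ```python
    import math
    import random

    def sampled_min_frequencies(n_samples=2000, seed=0):
        """Monte Carlo samples of the minimum fundamental frequency over
        angular position for a mechanism with periodic stiffness.

        Toy surrogate (an assumption): omega(theta) = sqrt(k(theta) / m),
        with k(theta) = k0 * (1 + 0.1 * cos(2 * theta)) and random k0, m.
        Each sample scans theta for its minimum frequency."""
        rng = random.Random(seed)
        thetas = [math.radians(d) for d in range(0, 360, 5)]
        mins = []
        for _ in range(n_samples):
            k0 = rng.gauss(1.0e6, 5.0e4)   # stiffness coefficient, N/m
            m = rng.gauss(10.0, 0.2)       # effective mass, kg
            mins.append(min(math.sqrt(k0 * (1.0 + 0.1 * math.cos(2.0 * t)) / m)
                            for t in thetas))
        return mins

    mins = sampled_min_frequencies()
    mean_min = sum(mins) / len(mins)
    # reliability as probability of exceedance below a critical lower bound (rad/s)
    p_fail = sum(f < 285.0 for f in mins) / len(mins)
    print(mean_min, p_fail)
    ```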

  16. Assessment of landslide distribution map reliability in Niigata prefecture - Japan using frequency ratio approach

    Science.gov (United States)

    Rahardianto, Trias; Saputra, Aditya; Gomez, Christopher

    2017-07-01

    Research on landslide susceptibility has evolved rapidly over the last few decades thanks to the availability of large databases. Landslide research used to be focused on discrete events, but the usage of large inventory datasets has become a central pillar of landslide susceptibility, hazard, and risk assessment. Indeed, extracting meaningful information from large databases is now at the forefront of geoscientific research, following the big-data research trend. The more comprehensive the information on past landslides available for a particular area, the better the produced map will be at supporting effective decision making, planning, and engineering practice. Landslide inventory data that is freely accessible online gives many researchers and decision makers an opportunity to prevent casualties and economic loss caused by future landslides. This data is advantageous especially for areas with poor landslide historical data. Since the construction criteria for landslide inventory maps and their quality evaluation remain poorly defined, assessment of the reliability of open-source landslide inventory maps is required. The present contribution aims to assess the reliability of open-source landslide inventory data based on the particular topographical setting of the observed area in Niigata prefecture, Japan. A Geographic Information System (GIS) platform and a statistical approach are applied to analyze the data. The frequency ratio method is utilized to model and assess the landslide map. The outcomes of the generated model were unsatisfactory, with an AUC value of 0.603, indicating low prediction accuracy and the unreliability of the model.
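
    The frequency ratio approach compares each terrain class's share of landslide cells with its share of all cells in the study area; a minimal sketch on made-up slope classes:

    ```python
    from collections import Counter

    def frequency_ratio(landslide_cells, all_cells):
        """Frequency ratio per terrain class:
        FR = (landslide cells in class / all landslide cells)
             / (cells in class / all cells).
        FR > 1 means the class is over-represented among landslides."""
        ls = Counter(landslide_cells)
        tot = Counter(all_cells)
        n_ls, n_tot = len(landslide_cells), len(all_cells)
        return {c: (ls[c] / n_ls) / (tot[c] / n_tot) for c in tot}

    # made-up 100-cell study area with 10 mapped landslide cells
    fr = frequency_ratio(['steep'] * 6 + ['gentle'] * 4,
                         ['steep'] * 20 + ['gentle'] * 80)
    print(fr)  # steep over-represented (FR ~ 3), gentle under-represented (~0.5)
    ```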

  17. Model-based experimental design for assessing effects of mixtures of chemicals

    Energy Technology Data Exchange (ETDEWEB)

    Baas, Jan, E-mail: jan.baas@falw.vu.n [Vrije Universiteit of Amsterdam, Dept of Theoretical Biology, De Boelelaan 1085, 1081 HV Amsterdam (Netherlands); Stefanowicz, Anna M., E-mail: anna.stefanowicz@uj.edu.p [Institute of Environmental Sciences, Jagiellonian University, Gronostajowa 7, 30-387 Krakow (Poland); Klimek, Beata, E-mail: beata.klimek@uj.edu.p [Institute of Environmental Sciences, Jagiellonian University, Gronostajowa 7, 30-387 Krakow (Poland); Laskowski, Ryszard, E-mail: ryszard.laskowski@uj.edu.p [Institute of Environmental Sciences, Jagiellonian University, Gronostajowa 7, 30-387 Krakow (Poland); Kooijman, Sebastiaan A.L.M., E-mail: bas@bio.vu.n [Vrije Universiteit of Amsterdam, Dept of Theoretical Biology, De Boelelaan 1085, 1081 HV Amsterdam (Netherlands)

    2010-01-15

    We exposed flour beetles (Tribolium castaneum) to a mixture of four polyaromatic hydrocarbons (PAHs). The experimental setup was chosen such that the emphasis was on assessing partial effects. We interpreted the effects of the mixture by a process-based model, with a threshold concentration for effects on survival. The behavior of the threshold concentration was one of the key features of this research. We showed that the threshold concentration is shared by toxicants with the same mode of action, which gives a mechanistic explanation for the observation that toxic effects in mixtures may occur in concentration ranges where the individual components do not show effects. Our approach gives reliable predictions of partial effects on survival and allows for a reduction of experimental effort in assessing effects of mixtures, extrapolations to other mixtures, other points in time, or in a wider perspective to other organisms. - We show a mechanistic approach to assess effects of mixtures in low concentrations.
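
    The shared-threshold idea, under which a mixture can act where each component alone would not, can be sketched with a simplified hazard model; the functional form and the numbers are illustrative, not the fitted process-based model from the study:

    ```python
    import math

    def survival_prob(conc, t, threshold, killing_rate):
        """Survival under a hazard model with a no-effect threshold:
        below the threshold the hazard is zero; above it, the hazard is
        proportional to the excess concentration (a simplified sketch)."""
        hazard = killing_rate * max(conc - threshold, 0.0)
        return math.exp(-hazard * t)

    # two components with the same mode of action: their concentrations add
    # before the shared threshold is applied, so the mixture can act where
    # each component alone would not
    c0 = 1.0  # shared threshold (arbitrary units)
    single = survival_prob(0.6, 10.0, c0, 0.5)            # below threshold
    mixture = survival_prob(0.6 + 0.6, 10.0, c0, 0.5)     # sum exceeds threshold
    print(single, mixture)  # 1.0 for each component alone, ~0.37 for the mixture
    ```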

  18. Model-based experimental design for assessing effects of mixtures of chemicals

    International Nuclear Information System (INIS)

    Baas, Jan; Stefanowicz, Anna M.; Klimek, Beata; Laskowski, Ryszard; Kooijman, Sebastiaan A.L.M.

    2010-01-01

    We exposed flour beetles (Tribolium castaneum) to a mixture of four polyaromatic hydrocarbons (PAHs). The experimental setup was chosen such that the emphasis was on assessing partial effects. We interpreted the effects of the mixture by a process-based model, with a threshold concentration for effects on survival. The behavior of the threshold concentration was one of the key features of this research. We showed that the threshold concentration is shared by toxicants with the same mode of action, which gives a mechanistic explanation for the observation that toxic effects in mixtures may occur in concentration ranges where the individual components do not show effects. Our approach gives reliable predictions of partial effects on survival and allows for a reduction of experimental effort in assessing effects of mixtures, extrapolations to other mixtures, other points in time, or in a wider perspective to other organisms. - We show a mechanistic approach to assess effects of mixtures in low concentrations.

  19. Cortical projection of the inferior choroidal point as a reliable landmark to place the corticectomy and reach the temporal horn through a middle temporal gyrus approach

    Directory of Open Access Journals (Sweden)

    Thomas Frigeri

    2014-10-01

    Full Text Available Objective: To establish preoperatively the localization of the cortical projection of the inferior choroidal point (ICP) and use it as a reliable landmark when approaching the temporal horn through a middle temporal gyrus access, and to review relevant anatomical features regarding selective amygdalohippocampectomy (AH) for treatment of mesial temporal lobe epilepsy (MTLE). Method: The cortical projection of the inferior choroidal point was used in more than 300 surgeries by one of the authors as a reliable landmark to reach the temporal horn. In the laboratory, forty cerebral hemispheres were examined. Conclusion: The cortical projection of the ICP is a reliable landmark for reaching the temporal horn.

  20. Seeking high reliability in primary care: Leadership, tools, and organization.

    Science.gov (United States)

    Weaver, Robert R

    2015-01-01

    Leaders in health care increasingly recognize that improving health care quality and safety requires developing an organizational culture that fosters high reliability and continuous process improvement. For various reasons, a reliability-seeking culture is lacking in most health care settings. Developing a reliability-seeking culture requires leaders' sustained commitment to reliability principles using key mechanisms to embed those principles widely in the organization. The aim of this study was to examine how key mechanisms used by a primary care practice (PCP) might foster a reliability-seeking, system-oriented organizational culture. A case study approach was used to investigate the PCP's reliability culture. The study examined four cultural artifacts used to embed reliability-seeking principles across the organization: leadership statements, decision support tools, and two organizational processes. To decipher their effects on reliability, the study relied on observations of work patterns and the tools' use, interactions during morning huddles and process improvement meetings, interviews with clinical and office staff, and a "collective mindfulness" questionnaire. The five reliability principles framed the data analysis. Leadership statements articulated principles that oriented the PCP toward a reliability-seeking culture of care. Reliability principles became embedded in the everyday discourse and actions through the use of "problem knowledge coupler" decision support tools and daily "huddles." Practitioners and staff were encouraged to report unexpected events or close calls that arose, which often initiated a formal "process change" used to adjust routines and prevent adverse events from recurring. Activities that foster reliable patient care became part of the taken-for-granted routine at the PCP. The analysis illustrates the role leadership, tools, and organizational processes play in developing and embedding a reliability-seeking culture across an

  1. Culture Representation in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    David Gertman; Julie Marble; Steven Novack

    2006-12-01

    Understanding human-system response is critical to being able to plan and predict mission success in the modern battlespace. Commonly, human reliability analysis has been used to predict failures of human performance in complex, critical systems. However, most human reliability methods fail to take culture into account. This paper takes an easily understood, state-of-the-art human reliability analysis method and extends that method to account for the influence of culture, including acceptance of new technology, upon performance. The cultural parameters used to modify the human reliability analysis were determined from two standard industry approaches to cultural assessment: Hofstede's (1991) cultural factors and Davis' (1989) technology acceptance model (TAM). The result is called the Culture Adjustment Method (CAM). An example is presented that (1) reviews human reliability assessment with and without cultural attributes for a Supervisory Control and Data Acquisition (SCADA) system attack, (2) demonstrates how country-specific information can be used to increase the realism of HRA modeling, and (3) discusses the differences in human error probability estimates arising from cultural differences.

  2. A G-function-based reliability-based design methodology applied to a cam roller system

    International Nuclear Information System (INIS)

    Wang, W.; Sui, P.; Wu, Y.T.

    1996-01-01

    Conventional reliability-based design optimization methods treat the reliability function as an ordinary function and apply existing mathematical programming techniques to solve the design problem. As a result, the conventional approach requires nested loops with respect to the g-function, and is very time consuming. A new reliability-based design method is proposed in this paper that deals with the g-function directly instead of the reliability function. This approach has the potential of significantly reducing the number of calls for g-function calculations since it requires only one full reliability analysis per design iteration. A cam roller system in a typical high-pressure fuel injection diesel engine is designed using both the proposed and the conventional approach. The proposed method is much more efficient for this application.

  3. A probabilistic approach to safety/reliability of space nuclear power systems

    International Nuclear Information System (INIS)

    Medford, G.; Williams, K.; Kolaczkowski, A.

    1989-01-01

    An ongoing effort is investigating the feasibility of using probabilistic risk assessment (PRA) modeling techniques to construct a living model of a space nuclear power system. This is being done in conjunction with a traditional reliability and survivability analysis of the SP-100 space nuclear power system. The initial phase of the project consists of three major parts with the overall goal of developing a top-level system model and defining initiating events of interest for the SP-100 system. The three major tasks were performing a traditional survivability analysis, performing a simple system reliability analysis, and constructing a top-level system fault-tree model. Each of these tasks and their interim results are discussed in this paper. Initial results from the study support the conclusion that PRA modeling techniques can provide a valuable design and decision-making tool for space reactors. The ability of the model to rank and calculate relative contributions from various failure modes allows design optimization for maximum safety and reliability. Future efforts in the SP-100 program will see data development and quantification of the model to allow parametric evaluations of the SP-100 system. Current efforts have shown the need for formal data development and test programs within such a modeling framework

  4. Reliability of the sliding scale for collecting affective responses to words.

    Science.gov (United States)

    Imbault, C; Shore, D; Kuperman, V

    2018-01-25

    Warriner, Shore, Schmidt, Imbault, and Kuperman (Canadian Journal of Experimental Psychology, 71, 71-88, 2017) have recently proposed a slider task in which participants move a manikin on a computer screen toward or further away from a word, and the distance (in pixels) is a measure of the word's valence. Warriner et al. (2017) showed this task to be more valid than the widely used rating task, but they did not examine the reliability of the new methodology. In this study we investigated multiple aspects of this task's reliability. In Experiment 1 (Exps. 1.1-1.6), we showed that the sliding scale has high split-half reliability (r = .868 to .931). In Experiments 2 and 3, we also showed that the slider task elicits consistent repeated responses both within a single session (Exp. 2: r = .804) and across two sessions separated by one week (Exp. 3: r = .754). Overall, the slider task, in addition to having high validity, is highly reliable.
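
    Split-half reliability of the kind reported here is conventionally computed by correlating odd- and even-item half scores and applying the Spearman-Brown correction; a minimal sketch on toy score data (not the study's dataset):

    ```python
    def pearson_r(xs, ys):
        """Pearson correlation between two equal-length score lists."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        var_x = sum((x - mx) ** 2 for x in xs)
        var_y = sum((y - my) ** 2 for y in ys)
        return cov / (var_x * var_y) ** 0.5

    def split_half_reliability(item_scores):
        """Correlate odd- vs. even-item half scores per participant, then apply
        the Spearman-Brown correction to estimate full-length reliability."""
        odd = [sum(row[0::2]) for row in item_scores]
        even = [sum(row[1::2]) for row in item_scores]
        r = pearson_r(odd, even)
        return 2 * r / (1 + r)

    # toy data: four participants x four items (hypothetical slider distances)
    scores = [[1, 2, 1, 2], [2, 3, 2, 3], [3, 4, 3, 4], [4, 5, 4, 5]]
    rel = split_half_reliability(scores)
    print(rel)  # 1.0 for perfectly consistent halves
    ```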

  5. Reference gene identification for reliable normalisation of quantitative RT-PCR data in Setaria viridis.

    Science.gov (United States)

    Nguyen, Duc Quan; Eamens, Andrew L; Grof, Christopher P L

    2018-01-01

    Quantitative real-time polymerase chain reaction (RT-qPCR) is the key platform for the quantitative analysis of gene expression in a wide range of experimental systems and conditions. However, the accuracy and reproducibility of gene expression quantification via RT-qPCR is entirely dependent on the identification of reliable reference genes for data normalisation. Green foxtail (Setaria viridis) has recently been proposed as a potential experimental model for the study of C4 photosynthesis and is closely related to many economically important crop species of the Panicoideae subfamily of grasses, including Zea mays (maize), Sorghum bicolor (sorghum) and Saccharum officinarum (sugarcane). Setaria viridis (Accession 10) possesses a number of key traits as an experimental model, namely: (i) a small, sequenced and well annotated genome; (ii) short stature and generation time; (iii) prolific seed production; and (iv) amenability to Agrobacterium tumefaciens-mediated transformation. There is currently, however, a lack of reference gene expression information for Setaria viridis (S. viridis). We therefore aimed to identify a cohort of suitable S. viridis reference genes for accurate and reliable normalisation of S. viridis RT-qPCR expression data. Eleven putative candidate reference genes were identified and examined across thirteen different S. viridis tissues. Of these, the geNorm and NormFinder analysis software identified SERINE/THREONINE-PROTEIN PHOSPHATASE 2A (PP2A), 5'-ADENYLYLSULFATE REDUCTASE 6 (ASPR6) and DUAL SPECIFICITY PHOSPHATASE (DUSP) as the most suitable combination of reference genes for the accurate and reliable normalisation of S. viridis RT-qPCR expression data. To demonstrate the suitability of the three selected reference genes, PP2A, ASPR6 and DUSP were used to normalise the expression of CINNAMYL ALCOHOL DEHYDROGENASE (CAD) genes across the same tissues. This approach readily demonstrated the suitability of the three

  6. Experimental Approach to Teaching Fluids

    Science.gov (United States)

    Stern, Catalina

    2015-11-01

    For the last 15 years we have promoted experimental work even in the theoretical courses. Fluids appear in the Physics curriculum of the National University of Mexico in two courses: Collective Phenomena in their sophomore year and Continuum Mechanics in their senior year. In both, students are asked for a final project. Surprisingly, at least 85% choose an experimental subject even though this means working extra hours every week. Some of the experiments were shown in this congress two years ago. This time we present some new results and the methodology we use in the classroom. I acknowledge support from the Physics Department, Facultad de Ciencias, UNAM.

  7. Monolithic QCL design approaches for improved reliability and affordability

    Science.gov (United States)

    Law, K. K.

    2013-12-01

    Many advances have been made recently in mid-wave infrared and long-wave infrared quantum cascade laser (QCL) technologies, and there is an increasing demand for these laser sources for ever-expanding Naval, DoD and homeland security applications. In this paper we discuss a portfolio of the Naval Air Warfare Center Weapons Division's current and future small business innovative research programs and efforts aimed at significantly improving QCLs' performance, affordability, and reliability.

  8. Chip-Level Electromigration Reliability for Cu Interconnects

    International Nuclear Information System (INIS)

    Gall, M.; Oh, C.; Grinshpon, A.; Zolotov, V.; Panda, R.; Demircan, E.; Mueller, J.; Justison, P.; Ramakrishna, K.; Thrasher, S.; Hernandez, R.; Herrick, M.; Fox, R.; Boeck, B.; Kawasaki, H.; Haznedar, H.; Ku, P.

    2004-01-01

    Even after the successful introduction of Cu-based metallization, the electromigration (EM) failure risk has remained one of the most important reliability concerns for most advanced process technologies. Ever increasing operating current densities and the introduction of low-k materials in the backend process scheme are some of the issues that threaten reliable, long-term operation at elevated temperatures. The traditional method of verifying EM reliability only through current density limit checks is proving to be inadequate in general, or quite expensive at best. A Statistical EM Budgeting (SEB) methodology has been proposed to assess more realistic chip-level EM reliability from the complex statistical distribution of currents in a chip. To be valuable, this approach requires accurate estimation of currents for all interconnect segments in a chip. However, no efficient technique to manage the complexity of such a task for very large chip designs is known. We present an efficient method to estimate currents exhaustively for all interconnects in a chip. The proposed method uses pre-characterization of cells and macros, and steps to identify and filter out symmetrically bi-directional interconnects. We illustrate the strength of the proposed approach using a high-performance microprocessor design for embedded applications as a case study.

  9. A hybrid computational-experimental approach for automated crystal structure solution

    Science.gov (United States)

    Meredig, Bryce; Wolverton, C.

    2013-02-01

    Crystal structure solution from diffraction experiments is one of the most fundamental tasks in materials science, chemistry, physics and geology. Unfortunately, numerous factors render this process labour intensive and error prone. Experimental conditions, such as high pressure or structural metastability, often complicate characterization. Furthermore, many materials of great modern interest, such as batteries and hydrogen storage media, contain light elements such as Li and H that only weakly scatter X-rays. Finally, structural refinements generally require significant human input and intuition, as they rely on good initial guesses for the target structure. To address these many challenges, we demonstrate a new hybrid approach, first-principles-assisted structure solution (FPASS), which combines experimental diffraction data, statistical symmetry information and first-principles-based algorithmic optimization to automatically solve crystal structures. We demonstrate the broad utility of FPASS to clarify four important crystal structure debates: the hydrogen storage candidates MgNH and NH3BH3; Li2O2, relevant to Li-air batteries; and high-pressure silane, SiH4.

  10. Multiobjective Reliable Cloud Storage with Its Particle Swarm Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Xiyang Liu

    2016-01-01

    Full Text Available Information abounds in all fields of real life and is often recorded as digital data in computer systems, where it is treated as an increasingly important resource. Its rapid growth in volume causes great difficulties in both storage and analysis. Massive data storage in cloud environments has a significant impact on the quality of service (QoS) of the systems, which is becoming an increasingly challenging problem. In this paper, we propose a multiobjective optimization model for reliable data storage in clouds that considers both the cost and the reliability of the storage service simultaneously. In the proposed model, the total cost is composed of storage space occupation cost, data migration cost, and communication cost. Based on an analysis of the storage process, the transmission reliability, equipment stability, and software reliability are taken into account in the storage reliability evaluation. To solve the proposed multiobjective model, a Constrained Multiobjective Particle Swarm Optimization (CMPSO) algorithm is designed. Finally, experiments are designed to validate the proposed model and its PSO-based solution algorithm. In the experiments, the proposed model is tested in combination with 3 storage strategies. Experimental results show that the proposed model is effective, and that it performs much better when combined with appropriate file-splitting methods.
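    As a rough illustration of the optimisation step, the sketch below runs a plain single-swarm PSO on a weighted-sum scalarisation of an invented cost curve and an invented unreliability curve. This is not the paper's CMPSO algorithm or storage model; the objectives, weights, and parameter values are all assumptions for illustration.

```python
import random

random.seed(1)

# Toy stand-in objectives: cost grows with the replication level x,
# failure probability shrinks with it.
def cost(x):
    return 1.0 * x

def unreliability(x):
    return 0.2 ** x

def fitness(x):
    # Weighted sum as a simple scalarisation of the two goals.
    return cost(x) + 50.0 * unreliability(x)

def pso(n=20, iters=100, lo=1.0, hi=10.0):
    """Minimal particle swarm: inertia 0.7, cognitive/social weights 1.5,
    positions clamped to the feasible box [lo, hi]."""
    pos = [random.uniform(lo, hi) for _ in range(n)]
    vel = [0.0] * n
    pbest = pos[:]
    gbest = min(pos, key=fitness)
    for _ in range(iters):
        for i in range(n):
            r1, r2 = random.random(), random.random()
            vel[i] = (0.7 * vel[i]
                      + 1.5 * r1 * (pbest[i] - pos[i])
                      + 1.5 * r2 * (gbest - pos[i]))
            pos[i] = min(max(pos[i] + vel[i], lo), hi)  # clamp to constraints
            if fitness(pos[i]) < fitness(pbest[i]):
                pbest[i] = pos[i]
        gbest = min(pbest, key=fitness)
    return gbest

best = pso()
```

    A true multiobjective solver would instead maintain an archive of non-dominated (cost, reliability) trade-offs rather than collapsing them into one weighted score.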

  11. A Statistical Approach for Selecting Buildings for Experimental Measurement of HVAC Needs

    Directory of Open Access Journals (Sweden)

    Malinowski Paweł

    2017-03-01

    Full Text Available This article presents a statistical methodology for selecting representative buildings for experimentally evaluating the performance of HVAC systems, especially in terms of energy consumption. The proposed approach is based on the k-means method. The algorithm for this method is conceptually simple, allowing it to be easily implemented. The method can be applied to large quantities of data with unknown distributions. The method was tested using numerical experiments to determine the hourly, daily, and yearly heat values and the domestic hot water demands of residential buildings in Poland. Due to its simplicity, the proposed approach is very promising for use in engineering applications and is applicable to testing the performance of many HVAC systems.
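    The selection idea can be sketched as follows: cluster buildings by a demand feature with k-means, then pick the building nearest each centroid as the representative one to instrument. The demand figures and the quantile-based initialisation are illustrative assumptions, not the article's data or exact algorithm.

```python
def kmeans_1d(values, k, iters=50):
    """Plain k-means on a single feature (e.g. annual heat demand per
    building), with quantile-spread initial centroids so the sketch
    stays deterministic."""
    svals = sorted(values)
    centroids = [svals[(2 * i + 1) * len(svals) // (2 * k)] for i in range(k)]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:                      # assign to nearest centroid
            j = min(range(k), key=lambda c: abs(v - centroids[c]))
            clusters[j].append(v)
        centroids = [sum(c) / len(c) if c else centroids[j]
                     for j, c in enumerate(clusters)]
    # The representative building of each cluster is the member closest
    # to its centroid -- the one to measure experimentally.
    return [min(c, key=lambda v: abs(v - m))
            for c, m in zip(clusters, centroids) if c]

# Invented annual heat demands (MWh) for nine buildings:
reps = kmeans_1d([95, 100, 105, 190, 200, 210, 290, 300, 310], k=3)
```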

  12. How reliable are Functional Movement Screening scores? A systematic review of rater reliability.

    Science.gov (United States)

    Moran, Robert W; Schneiders, Anthony G; Major, Katherine M; Sullivan, S John

    2016-05-01

    Several physical assessment protocols to identify intrinsic risk factors for injury aetiology related to movement quality have been described. The Functional Movement Screen (FMS) is a standardised, field-expedient test battery intended to assess movement quality and has been used clinically in preparticipation screening and in sports injury research. To critically appraise and summarise research investigating the reliability of scores obtained using the FMS battery. Systematic literature review. Systematic search of Google Scholar, Scopus (including ScienceDirect and PubMed), EBSCO (including Academic Search Complete, AMED, CINAHL, Health Source: Nursing/Academic Edition), MEDLINE and SPORTDiscus. Studies meeting eligibility criteria were assessed by 2 reviewers for risk of bias using the Quality Appraisal of Reliability Studies checklist. Overall quality of evidence was determined using van Tulder's levels of evidence approach. 12 studies were appraised. Overall, there was a 'moderate' level of evidence in favour of 'acceptable' (intraclass correlation coefficient ≥0.6) inter-rater and intra-rater reliability for composite scores derived from live scoring. For inter-rater reliability of composite scores derived from video recordings there was 'conflicting' evidence, and 'limited' evidence for intra-rater reliability. For inter-rater reliability based on live scoring of individual subtests there was 'moderate' evidence of 'acceptable' reliability (κ≥0.4) for 4 subtests (Deep Squat, Shoulder Mobility, Active Straight-leg Raise, Trunk Stability Push-up) and 'conflicting' evidence for the remaining 3 (Hurdle Step, In-line Lunge, Rotary Stability). This review found 'moderate' evidence that raters can achieve acceptable levels of inter-rater and intra-rater reliability of composite FMS scores when using live ratings. Overall, there were few high-quality studies, and the quality of several studies was impacted by poor study reporting particularly in relation to
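    For the subtest-level agreement statistics mentioned above, unweighted Cohen's kappa between two raters can be sketched as below; the review treats κ≥0.4 as 'acceptable'. The scores are invented, not data from any appraised study.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two raters' categorical scores
    (FMS subtests are scored on a small ordinal scale)."""
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n       # observed agreement
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    p_exp = sum(count_a[c] * count_b[c] for c in count_a) / (n * n)  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# Invented subtest scores from two raters across eight athletes:
kappa = cohens_kappa([2, 2, 3, 1, 2, 3, 0, 2], [2, 2, 3, 1, 1, 3, 0, 2])
```

    Kappa corrects raw percentage agreement for the agreement expected by chance, which is why it, rather than simple agreement, underpins the review's thresholds.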

  13. Reliability payments to generation capacity in electricity markets

    International Nuclear Information System (INIS)

    Olsina, Fernando; Pringles, Rolando; Larisson, Carlos; Garcés, Francisco

    2014-01-01

    Electric power is a critical input to modern economies. Generation adequacy and security of supply in power systems running under competition are currently topics of high concern for consumers, regulators and governments. In a market setting, generation investments and adequacy can only be achieved by an appropriate regulatory framework that sets efficient remuneration for power capacity. Theoretically, energy-only electricity markets are efficient and no additional mechanism is needed. Nonetheless, the energy-only market design suffers from serious drawbacks. Therefore, jointly with the evolution of electricity markets, many remunerating mechanisms for generation capacity have been proposed. Explicit capacity payment was the first remunerating approach implemented and is perhaps still the most applied. However, this price-based regulation has not been applied without severe difficulties and criticism. In this paper, a new reliability payment mechanism is envisioned. The capacity of each generating unit is paid according to its effective contribution to overall system reliability. The proposed scheme has many attractive features and preserves the theoretical efficiency properties of energy-only markets. Fairness, incentive compatibility, market power mitigation and settlement rules are investigated in this work. The article also examines the requirements for system data and models in order to implement the proposed capacity mechanism. A numerical example on a real hydrothermal system illustrates the practicability of the proposed approach and the resulting reliability payments to the generation units. - Highlights: • A new approach for remunerating supply reliability provided by generation units is proposed. • The contribution of each generating unit to lessening power shortfalls is determined by simulations. • Efficiency, fairness and incentive compatibility of the proposed reliability payment are assessed
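    The unit-by-unit reliability contribution described above can be approximated by simulation: compare the expected shortfall with and without each unit. The capacities, outage rates, and the single-period constant-demand model below are invented and far simpler than the paper's hydrothermal simulations.

```python
import random

random.seed(42)

# Hypothetical generating units: (capacity in MW, forced-outage rate)
UNITS = [(500, 0.05), (300, 0.08), (200, 0.12)]
DEMAND = 800  # MW, held constant for simplicity

def expected_shortfall(units, trials=20000):
    """Monte Carlo estimate of expected unserved demand (MW)."""
    total = 0.0
    for _ in range(trials):
        available = sum(cap for cap, outage in units if random.random() > outage)
        total += max(0, DEMAND - available)
    return total / trials

base = expected_shortfall(UNITS)
# Each unit's reliability contribution: how much the expected shortfall
# grows when that unit is removed from the system.
contrib = {i: expected_shortfall(UNITS[:i] + UNITS[i + 1:]) - base
           for i in range(len(UNITS))}
```

    A payment scheme of this kind would then allocate the capacity remuneration in proportion to each unit's `contrib` value.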

  14. Multivariate performance reliability prediction in real-time

    International Nuclear Information System (INIS)

    Lu, S.; Lu, H.; Kolarik, W.J.

    2001-01-01

    This paper presents a technique for predicting system performance reliability in real-time considering multiple failure modes. The technique includes on-line multivariate monitoring and forecasting of selected performance measures and conditional performance reliability estimates. The performance measures across time are treated as a multivariate time series. A state-space approach is used to model the multivariate time series. Recursive forecasting is performed by adopting Kalman filtering. The predicted mean vectors and covariance matrix of performance measures are used for the assessment of system survival/reliability with respect to the conditional performance reliability. The technique and modeling protocol discussed in this paper provide a means to forecast and evaluate the performance of an individual system in a dynamic environment in real-time. The paper also presents an example to demonstrate the technique
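    A minimal sketch of the recursive forecasting step, using a one-dimensional local-level Kalman filter rather than the paper's multivariate state-space model; the noise variances are assumed values, not estimated ones.

```python
def kalman_forecast(observations, q=0.01, r=0.25):
    """One-dimensional local-level Kalman filter: the hidden state is the
    underlying performance measure, q the process-noise variance, r the
    measurement-noise variance (both assumed here). Returns the filtered
    level and its variance, which also serve as the one-step-ahead
    forecast of the level."""
    x, p = observations[0], 1.0       # initialise from the first observation
    for z in observations[1:]:
        p = p + q                     # predict: uncertainty grows by process noise
        k = p / (p + r)               # Kalman gain
        x = x + k * (z - x)           # update with the innovation
        p = (1.0 - k) * p
    return x, p

# Invented noisy readings of a performance measure:
level, var = kalman_forecast([5.1, 4.9, 5.0, 5.2, 4.8, 5.0])
```

    The filtered mean and variance then feed directly into a conditional reliability statement, e.g. the probability that the measure stays inside its specification band over the next step.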

  15. Polynomial-time computability of the edge-reliability of graphs using Gilbert's formula

    Directory of Open Access Journals (Sweden)

    Marlowe Thomas J.

    1998-01-01

    Full Text Available Reliability is an important consideration in analyzing computer and other communication networks, but current techniques are extremely limited in the classes of graphs which can be analyzed efficiently. While Gilbert's formula establishes a theoretically elegant recursive relationship between the edge reliability of a graph and the reliability of its subgraphs, naive evaluation requires consideration of all sequences of deletions of individual vertices, and for many graphs has time complexity essentially Θ(N!). We discuss a general approach which significantly reduces complexity by encoding subgraph isomorphism in a finer partition by invariants and recursing through the set of invariants. We illustrate this approach using threshold graphs, and show that any computation of reliability using Gilbert's formula will be polynomial-time if and only if the number of invariants considered is polynomial; we then show families of graphs with polynomial-time and non-polynomial reliability computation, and show that these encompass most previously known results. We then codify our approach to indicate how it can be used for other classes of graphs, and suggest several classes to which the technique can be applied.
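    For contrast with Gilbert's recursion, all-terminal reliability can be computed by brute force over edge states, which is exponential in the number of edges and is exactly what motivates polynomial-time schemes. This generic definition-level sketch is not the paper's algorithm.

```python
from itertools import combinations

def connected(nodes, edges):
    """Depth-first check that the surviving edges connect all nodes."""
    nodes = set(nodes)
    if not nodes:
        return True
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        u = stack.pop()
        if u in seen:
            continue
        seen.add(u)
        for a, b in edges:
            if u == a and b not in seen:
                stack.append(b)
            elif u == b and a not in seen:
                stack.append(a)
    return seen == nodes

def all_terminal_reliability(nodes, edges, p):
    """Probability that the graph stays connected when each edge survives
    independently with probability p. Brute force over all 2^|E| edge
    subsets -- feasible only for tiny graphs."""
    rel = 0.0
    for k in range(len(edges) + 1):
        for up in combinations(edges, k):
            if connected(nodes, up):
                rel += p ** k * (1 - p) ** (len(edges) - k)
    return rel

# Triangle graph: connected whenever at least two of its three edges survive.
r = all_terminal_reliability({1, 2, 3}, [(1, 2), (2, 3), (1, 3)], p=0.9)
```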

  16. Reliability Prediction Approaches For Domestic Intelligent Electric Energy Meter Based on IEC62380

    Science.gov (United States)

    Li, Ning; Tong, Guanghua; Yang, Jincheng; Sun, Guodong; Han, Dongjun; Wang, Guixian

    2018-01-01

    The reliability of the intelligent electric energy meter is a crucial issue considering its large-scale deployment and the safety of the national smart grid. This paper develops a reliability prediction procedure for the domestic intelligent electric energy meter according to IEC 62380, in particular determining the model parameters under domestic working conditions. A case study is provided to demonstrate the effectiveness and validity of the approach.
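    A parts-count sketch in the spirit of such predictions: sum the component failure rates and apply a mission-profile factor. All component names, rates, and the factor below are invented for illustration and are not taken from IEC 62380.

```python
# Illustrative parts-count reliability prediction. All numbers are made up.
COMPONENTS = {
    # name: (count, base failure rate in FIT, i.e. failures per 1e9 hours)
    "MCU":            (1, 50.0),
    "metering_IC":    (1, 40.0),
    "LCD":            (1, 120.0),
    "electrolytic_C": (4, 15.0),
    "relay":          (1, 200.0),
}

def predicted_failure_rate(components, mission_factor=1.3):
    """Total failure rate in FIT under a mission-profile multiplier."""
    return mission_factor * sum(n * lam for n, lam in components.values())

lam_total = predicted_failure_rate(COMPONENTS)  # FIT
mtbf_hours = 1e9 / lam_total                    # mean time between failures
```

    Real IEC 62380 models are considerably richer: they adjust each component's rate for thermal cycling, junction temperature, and the declared mission profile rather than using a single flat multiplier.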

  17. Bayesian approach in the power electric systems study of reliability ...

    African Journals Online (AJOL)

    Subsequently, Bayesian methodologies are framed within a broader class of problems, based on the definition of a suitable "state vector" and of a vector describing the system performance, aiming at the definition and the calculation or estimation of system reliability. The purpose of our work is to establish a useful model ...

  18. Software reliability studies

    Science.gov (United States)

    Hoppa, Mary Ann; Wilson, Larry W.

    1994-01-01

    There are many software reliability models which try to predict future performance of software based on data generated by the debugging process. Our research has shown that by improving the quality of the data one can greatly improve the predictions. We are working on methodologies which control some of the randomness inherent in the standard data generation processes in order to improve the accuracy of predictions. Our contribution is twofold in that we describe an experimental methodology using a data structure called the debugging graph and apply this methodology to assess the robustness of existing models. The debugging graph is used to analyze the effects of various fault recovery orders on the predictive accuracy of several well-known software reliability algorithms. We found that, along a particular debugging path in the graph, the predictive performance of different models can vary greatly. Similarly, just because a model 'fits' a given path's data well does not guarantee that the model would perform well on a different path. Further we observed bug interactions and noted their potential effects on the predictive process. We saw that not only do different faults fail at different rates, but that those rates can be affected by the particular debugging stage at which the rates are evaluated. Based on our experiment, we conjecture that the accuracy of a reliability prediction is affected by the fault recovery order as well as by fault interaction.

  19. An examination of reliability critical items in liquid metal reactors: An analysis by the Centralized Reliability Data Organization (CREDO)

    International Nuclear Information System (INIS)

    Humphrys, B.L.; Haire, M.J.; Koger, K.H.; Manneschmidt, J.F.; Setoguchi, K.; Nakai, R.; Okubo, Y.

    1987-01-01

    The Centralized Reliability Data Organization (CREDO) is the largest repository of liquid metal reactor (LMR) component reliability data in the world. It is jointly sponsored by the US Department of Energy (DOE) and the Power Reactor and Nuclear Fuel Development Corporation (PNC) of Japan. The CREDO data base contains information on a population of more than 21,000 components and approximately 1300 event records. A conservative estimate is that total component operating time is approaching 3.5 billion hours. Because data gathering for CREDO concentrates on event (failure) information, the work reported here focuses on the reliability information contained in CREDO and the development of reliability critical items lists. That is, components are ranked in prioritized lists from worst to best performers from a reliability standpoint. For the data contained in the CREDO data base, FFTF and JOYO show reliability growth; EBR-II reveals a slight decline in reliability for those components tracked by CREDO. However, tabulations of events which cause reactor shutdowns decrease with time at each site.
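    The prioritised "critical items" ranking described above can be sketched by estimating each component's failure rate from event counts and accumulated operating hours, then sorting worst-first. All component names and figures below are invented, not CREDO data.

```python
# Sketch of a reliability-critical-items list (all figures invented).
records = [
    # (component, observed failures, cumulative operating hours)
    ("sodium_pump",       14, 2.1e6),
    ("control_rod_drive",  3, 4.0e6),
    ("heat_exchanger",     1, 3.5e6),
    ("flow_meter",         9, 1.2e6),
]

# Point estimate of failure rate = failures / hours; worst performers first.
ranked = sorted(records, key=lambda rec: rec[1] / rec[2], reverse=True)
worst = ranked[0][0]
```

    A production analysis would attach confidence bounds (e.g. chi-squared intervals on the rate) before ranking, since components with few operating hours have very uncertain point estimates.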

  20. Recommendations for certification or measurement of reliability for reliable digital archival repositories with emphasis on access

    Directory of Open Access Journals (Sweden)

    Paula Regina Ventura Amorim Gonçalez

    2017-04-01

    Full Text Available Introduction: Considering the guidelines of ISO 16363:2012 (Space data and information transfer systems -- Audit and certification of trustworthy digital repositories) and the text of CONARQ Resolution 39 for certification of Reliable Digital Archival Repositories (RDC-Arq), this study verifies which technical recommendations should be used as the basis for a digital archival repository to be considered reliable. Objective: To identify requirements for the creation of Reliable Digital Archival Repositories, with emphasis on access to information, based on ISO 16363:2012 and CONARQ Resolution 39. Methodology: The study is an exploratory, descriptive and documentary theoretical investigation, since it is grounded in ISO 16363:2012 and CONARQ Resolution 39. From the perspective of the problem approach, the study is qualitative and quantitative, since the data were collected, tabulated and analyzed through interpretation of their contents. Results: We present a checklist of recommendations for reliability measurement and/or certification of RDC-Arq, focused on the identification of requirements with emphasis on access to information. Conclusions: The right to information, as well as access to reliable information, is a premise for digital archival repositories; the set of recommendations is therefore directed at archivists who work in digital repositories and wish to verify the requirements necessary to evaluate the reliability of a digital repository, and may also guide information professionals in collecting requirements for repository reliability certification.

  1. Reliability-based evaluation of bridge components for consistent safety margins.

    Science.gov (United States)

    2010-10-01

    The Load and Resistance Factor Design (LRFD) approach is based on the concept of structural reliability. The approach is more rational than former design approaches such as Load Factor Design or Allowable Stress Design. The LRFD Specification fo...
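    The structural-reliability concept behind LRFD can be illustrated with the classical first-order reliability index for independent normal resistance and load effect; the numeric values below are arbitrary, not from any specification.

```python
from math import sqrt, erf

def reliability_index(mu_r, sigma_r, mu_q, sigma_q):
    """First-order reliability index for independent normal resistance R
    and load effect Q: beta = (muR - muQ) / sqrt(sigmaR^2 + sigmaQ^2)."""
    return (mu_r - mu_q) / sqrt(sigma_r ** 2 + sigma_q ** 2)

def failure_probability(beta):
    """P(R - Q < 0) for the normal case, via the standard normal CDF."""
    return 0.5 * (1 - erf(beta / sqrt(2)))

# Arbitrary illustrative statistics (e.g. kN) for a bridge component:
beta = reliability_index(mu_r=500.0, sigma_r=50.0, mu_q=300.0, sigma_q=40.0)
```

    Calibrating load and resistance factors so that components of different types reach a similar target beta is what produces the "consistent safety margins" the record refers to.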

  2. Standards in reliability and safety engineering

    International Nuclear Information System (INIS)

    O'Connor, Patrick

    1998-01-01

    This article explains how the highest 'world class' levels of reliability and safety are achieved, by adherence to the basic principles of excellence in design, production, support and maintenance, by continuous improvement, and by understanding that excellence and improvement lead to reduced costs. These principles are contrasted with the methods that have been developed and standardised, particularly military standards for reliability, ISO9000, and safety case regulations. The article concludes that the formal, standardised approaches are misleading and counterproductive, and recommends that they be replaced by a philosophy based on the realities of human performance

  3. Scyllac equipment reliability analysis

    International Nuclear Information System (INIS)

    Gutscher, W.D.; Johnson, K.J.

    1975-01-01

    Most of the failures in Scyllac can be related to crowbar trigger cable faults. A new cable has been designed, procured, and is currently undergoing evaluation. When the new cable has been proven, it will be worked into the system as quickly as possible without causing too much additional down time. The cable-tip problem may not be easy or even desirable to solve. A tightly fastened permanent connection that maximizes contact area would be more reliable than the plug-in type of connection in use now, but it would make system changes and repairs much more difficult. The balance of the failures have such a low occurrence rate that they do not cause much down time and no major effort is underway to eliminate them. Even though Scyllac was built as an experimental system and has many thousands of components, its reliability is very good. Because of this the experiment has been able to progress at a reasonable pace

  4. Optimal Bi-Objective Redundancy Allocation for Systems Reliability and Risk Management.

    Science.gov (United States)

    Govindan, Kannan; Jafarian, Ahmad; Azbari, Mostafa E; Choi, Tsan-Ming

    2016-08-01

    In the big data era, systems reliability is critical to effective systems risk management. In this paper, a novel multiobjective approach, with hybridization of a known algorithm called NSGA-II and an adaptive population-based simulated annealing (APBSA) method is developed to solve the systems reliability optimization problems. In the first step, to create a good algorithm, we use a coevolutionary strategy. Since the proposed algorithm is very sensitive to parameter values, the response surface method is employed to estimate the appropriate parameters of the algorithm. Moreover, to examine the performance of our proposed approach, several test problems are generated, and the proposed hybrid algorithm and other commonly known approaches (i.e., MOGA, NRGA, and NSGA-II) are compared with respect to four performance measures: 1) mean ideal distance; 2) diversification metric; 3) percentage of domination; and 4) data envelopment analysis. The computational studies have shown that the proposed algorithm is an effective approach for systems reliability and risk management.
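    The multiobjective machinery above rests on Pareto dominance; the sketch below extracts the first non-dominated front, the ranking primitive that NSGA-II-style algorithms build on. The (reliability, cost) pairs are hypothetical redundancy configurations, not the paper's test problems.

```python
def dominates(a, b):
    """a dominates b when a is no worse in both objectives
    (reliability maximised, cost minimised) and strictly better in one."""
    rel_a, cost_a = a
    rel_b, cost_b = b
    return (rel_a >= rel_b and cost_a <= cost_b) and (rel_a > rel_b or cost_a < cost_b)

def pareto_front(solutions):
    """Return the non-dominated solutions (the first front)."""
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t != s)]

# Hypothetical (reliability, cost) pairs for candidate redundancy allocations:
configs = [(0.90, 10.0), (0.95, 14.0), (0.99, 25.0), (0.93, 16.0), (0.95, 13.0)]
front = pareto_front(configs)
```

    The naive all-pairs check here is O(n^2) per front; NSGA-II's fast non-dominated sort amortises this bookkeeping across the whole population.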

  5. Electrochemical production and use of free chlorine for pollutant removal: an experimental design approach.

    Science.gov (United States)

    Antonelli, Raissa; de Araújo, Karla Santos; Pires, Ricardo Francisco; Fornazari, Ana Luiza de Toledo; Granato, Ana Claudia; Malpass, Geoffroy Roger Pointer

    2017-10-28

    The present paper presents (1) the optimization of electrochemical free-chlorine production using an experimental design approach, and (2) the application of the optimum conditions obtained to the photo-assisted electrochemical degradation of simulated textile effluent. In the experimental design, the influence of inter-electrode gap, pH, NaCl concentration and current was considered. It was observed that all four variables studied are significant for the process, with NaCl concentration and current being the most significant for free chlorine production. The maximum free chlorine production was obtained at a current of 2.33 A and an NaCl concentration of 0.96 mol dm-3. The application of the optimized conditions with simultaneous UV irradiation resulted in up to 83.1% Total Organic Carbon removal and 100% colour removal over 180 min of electrolysis. The results indicate that a systematic (statistical) approach to the electrochemical treatment of pollutants can save time and reagents.
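    The experimental-design step can be illustrated with a two-level factorial main-effect calculation on coded factors; the runs and responses below are invented for illustration, not the paper's measurements.

```python
# Two-level factorial screen with coded factors (-1 = low, +1 = high).
runs = [
    # (current, NaCl, free-chlorine response)
    (-1, -1, 1.2),
    (+1, -1, 2.8),
    (-1, +1, 2.0),
    (+1, +1, 4.1),
]

def main_effect(runs, factor_index):
    """Average response at the high level minus the average at the low level."""
    hi = [y for *x, y in runs if x[factor_index] == +1]
    lo = [y for *x, y in runs if x[factor_index] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effect_current = main_effect(runs, 0)  # effect of stepping current low -> high
effect_nacl = main_effect(runs, 1)
```

    In a full design-of-experiments analysis these effects would be tested against replicate error, and a response-surface model would then locate optima such as the 2.33 A / 0.96 mol dm-3 point reported above.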

  6. A GC/MS-based metabolomic approach for reliable diagnosis of phenylketonuria.

    Science.gov (United States)

    Xiong, Xiyue; Sheng, Xiaoqi; Liu, Dan; Zeng, Ting; Peng, Ying; Wang, Yichao

    2015-11-01

    Although the phenylalanine/tyrosine ratio in blood has been the gold standard for diagnosis of phenylketonuria (PKU), the disadvantages of invasive sample collection and false positive error limited the application of this discriminator in the diagnosis of PKU to some extent. The aim of this study was to develop a new standard with high sensitivity and specificity in a less invasive manner for diagnosing PKU. In this study, an improved oximation-silylation method together with GC/MS was utilized to obtain the urinary metabolomic information in 47 PKU patients compared with 47 non-PKU controls. Compared with conventional oximation-silylation methods, the present approach possesses the advantages of shorter reaction time and higher reaction efficiency at a considerably lower temperature, which is beneficial to the derivatization of some thermally unstable compounds, such as phenylpyruvic acid. Ninety-seven peaks in the chromatograms were identified as endogenous metabolites by the National Institute of Standards and Technology (NIST) mass spectra library, including amino acids, organic acids, carbohydrates, amides, and fatty acids. After normalization of data using creatinine as internal standard, 19 differentially expressed compounds with p values of <0.05 were selected by independent-sample t test for the separation of the PKU group and the control group. A principal component analysis (PCA) model constructed by these differentially expressed compounds showed that the PKU group can be discriminated from the control group. Receiver-operating characteristic (ROC) analysis with area under the curve (AUC), specificity, and sensitivity of each PKU marker obtained from these differentially expressed compounds was used to evaluate the possibility of using these markers for diagnosing PKU. 
The largest value of AUC (0.987) with high specificity (0.936) and sensitivity (1.000) was obtained from the ROC curve of phenylacetic acid at its cutoff value (17.244 mmol/mol creatinine).
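    The ROC/AUC evaluation used for the candidate markers can be sketched with the Mann-Whitney formulation of AUC; the scores below are invented, not the study's metabolite levels.

```python
def roc_auc(scores_pos, scores_neg):
    """AUC as the probability that a randomly chosen positive (e.g. PKU)
    sample scores higher than a randomly chosen negative (control),
    with ties counted as half (the Mann-Whitney formulation)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical marker levels for patients vs controls:
auc = roc_auc([2.1, 3.5, 4.0], [1.0, 2.1])
```

    Sweeping a cutoff over the pooled scores and recording sensitivity/specificity at each threshold recovers the full ROC curve from which such an AUC is read off.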

  7. Final Report for the Virtual Reliability Realization System LDRD

    Energy Technology Data Exchange (ETDEWEB)

    DELLIN, THEODORE A.; HENDERSON, CHRISTOPHER L.; O' TOOLE, EDWARD J.

    2000-12-01

    Current approaches to reliability are not adequate to keep pace with the need for faster, better and cheaper products and systems. This is especially true in high-consequence-of-failure applications. The original proposal for the LDRD was to look at this challenge and see if there was a new paradigm that could make reliability predictions, along with a quantitative estimate of the risk in that prediction, in a way that was faster, better and cheaper. Such an approach would be based on the underlying science models that are the backbone of reliability predictions. The new paradigm would be implemented in two software tools: the Virtual Reliability Realization System (VRRS) and the Reliability Expert System (REX). The three-year LDRD was funded at a reduced level for the first year ($120K vs. $250K) and not renewed. Because of the reduced funding, we concentrated on the initial development of the expert system. We developed an interactive semiconductor calculation tool needed for reliability analyses. We were also able to generate a basic functional system using Microsoft Site Server Commerce Edition and Microsoft SQL Server. The base system has the capability to store Office documents from multiple authors, and has the ability to track and charge for usage. The full outline of the knowledge model has been incorporated, as well as examples of various types of content.

  8. Techniques for increasing the reliability of accelerator control system electronics

    International Nuclear Information System (INIS)

    Utterback, J.

    1993-09-01

    As the physical size of modern accelerators becomes larger and larger, the number of required control system circuit boards increases, and the probability of one of those circuit boards failing while in service also increases. In order to do physics, the experimenters need the accelerator to provide beam reliably with as little down time as possible. With the advent of colliding beams physics, reliability becomes even more important due to the fact that a control system failure can cause the loss of painstakingly produced antiprotons. These facts prove the importance of keeping reliability in mind when designing and maintaining accelerator control system electronics

  9. Optimal Bi-Objective Redundancy Allocation for Systems Reliability and Risk Management

    DEFF Research Database (Denmark)

    Govindan, Kannan; Jafarian, Ahmad; Azbari, Mostafa E.

    2016-01-01

    In the big data era, systems reliability is critical to effective systems risk management. In this paper, a novel multiobjective approach, with hybridization of a known algorithm called NSGA-II and an adaptive population-based simulated annealing (APBSA) method is developed to solve the systems...... of domination; and 4) data envelopment analysis. The computational studies have shown that the proposed algorithm is an effective approach for systems reliability and risk management....

  10. Development of a morphology-based modeling technique for tracking solid-body displacements: examining the reliability of a potential MRI-only approach for joint kinematics assessment

    International Nuclear Information System (INIS)

    Mahato, Niladri K.; Montuelle, Stephane; Cotton, John; Williams, Susan; Thomas, James; Clark, Brian

    2016-01-01

    Single or biplanar video radiography and Roentgen stereophotogrammetry (RSA) techniques used for the assessment of in-vivo joint kinematics involve the application of ionizing radiation, which is a limitation for clinical research involving human subjects. To overcome this limitation, our long-term goal is to develop a magnetic resonance imaging (MRI)-only, three-dimensional (3-D) modeling technique that permits dynamic imaging of joint motion in humans. Here, we present our initial findings, as well as reliability data, for an MRI-only protocol and modeling technique. We developed a morphology-based motion-analysis technique that uses MRI of custom-built solid-body objects to animate and quantify experimental displacements between them. The technique involved four major steps. First, the imaging volume was calibrated using a custom-built grid. Second, 3-D models were segmented from axial scans of two custom-built solid-body cubes. Third, these cubes were positioned at pre-determined relative displacements (translation and rotation) in the magnetic resonance coil and scanned with a T1 and a fast contrast-enhanced pulse sequence. The digital imaging and communications in medicine (DICOM) images were then processed for animation. The fourth step involved importing these processed images into animation software, where they were displayed as background scenes. In the same step, 3-D models of the cubes were imported into the animation software, where the user manipulated the models to match their outlines in the scene (rotoscoping) and registered the models into an anatomical joint system. Measurements of displacements obtained from two different rotoscoping sessions were tested for reliability using coefficients of variation (CV), intraclass correlation coefficients (ICC), Bland-Altman plots, and Limits of Agreement analyses. Between-session reliability was high for both the T1 and the contrast-enhanced sequences.
Specifically, the average CVs for translation were 4

  11. Development of a morphology-based modeling technique for tracking solid-body displacements: examining the reliability of a potential MRI-only approach for joint kinematics assessment.

    Science.gov (United States)

    Mahato, Niladri K; Montuelle, Stephane; Cotton, John; Williams, Susan; Thomas, James; Clark, Brian

    2016-05-18

Single or biplanar video radiography and Roentgen stereophotogrammetry (RSA) techniques used for the assessment of in-vivo joint kinematics involve the application of ionizing radiation, which is a limitation for clinical research involving human subjects. To overcome this limitation, our long-term goal is to develop a magnetic resonance imaging (MRI)-only, three-dimensional (3-D) modeling technique that permits dynamic imaging of joint motion in humans. Here, we present our initial findings, as well as reliability data, for an MRI-only protocol and modeling technique. We developed a morphology-based motion-analysis technique that uses MRI of custom-built solid-body objects to animate and quantify experimental displacements between them. The technique involved four major steps. First, the imaging volume was calibrated using a custom-built grid. Second, 3-D models were segmented from axial scans of two custom-built solid-body cubes. Third, these cubes were positioned at pre-determined relative displacements (translation and rotation) in the magnetic resonance coil and scanned with a T1 and a fast contrast-enhanced pulse sequence. The digital imaging and communications in medicine (DICOM) images were then processed for animation. The fourth step involved importing these processed images into an animation software, where they were displayed as background scenes. In the same step, 3-D models of the cubes were imported into the animation software, where the user manipulated the models to match their outlines in the scene (rotoscoping) and registered the models into an anatomical joint system. Measurements of displacements obtained from two different rotoscoping sessions were tested for reliability using coefficients of variation (CV), intraclass correlation coefficients (ICC), Bland-Altman plots, and Limits of Agreement analyses. Between-session reliability was high for both the T1 and the contrast-enhanced sequences. 
Specifically, the average CVs for translation were 4
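The agreement statistics named in this record are straightforward to compute. Below is a minimal sketch (not the authors' code; the paired session values are hypothetical) of the between-session coefficient of variation and Bland-Altman limits of agreement:

```python
from statistics import mean, stdev

def cv_percent(a, b):
    """Average within-pair coefficient of variation (%); pair SD = |diff|/sqrt(2)."""
    cvs = []
    for x, y in zip(a, b):
        m = (x + y) / 2
        s = abs(x - y) / 2 ** 0.5
        cvs.append(100 * s / m)
    return mean(cvs)

def bland_altman(a, b):
    """Mean difference (bias) and 95% limits of agreement between sessions."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical translations (mm) measured in two rotoscoping sessions.
session1 = [10.1, 20.3, 29.8, 40.2]
session2 = [10.4, 19.9, 30.1, 39.8]
cv = cv_percent(session1, session2)
bias, (lo, hi) = bland_altman(session1, session2)
```

A low CV and limits of agreement tightly bracketing a near-zero bias are what "high between-session reliability" means operationally here.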

  12. Reliability-based sensitivity of mechanical components with arbitrary distribution parameters

    International Nuclear Information System (INIS)

    Zhang, Yi Min; Yang, Zhou; Wen, Bang Chun; He, Xiang Dong; Liu, Qiaoling

    2010-01-01

    This paper presents a reliability-based sensitivity method for mechanical components with arbitrary distribution parameters. Techniques from the perturbation method, the Edgeworth series, the reliability-based design theory, and the sensitivity analysis approach were employed directly to calculate the reliability-based sensitivity of mechanical components on the condition that the first four moments of the original random variables are known. The reliability-based sensitivity information of the mechanical components can be accurately and quickly obtained using a practical computer program. The effects of the design parameters on the reliability of mechanical components were studied. The method presented in this paper provides the theoretic basis for the reliability-based design of mechanical components
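The two-moment special case of such a reliability-based sensitivity is easy to illustrate. The sketch below is a simplification (normal variables and a limit state g = R - S, rather than the paper's four-moment Edgeworth treatment) that computes a reliability index and its sensitivity to the mean strength:

```python
from math import sqrt, erf

def beta_index(mu_r, sig_r, mu_s, sig_s):
    """Reliability index for limit state g = R - S, with R, S independent."""
    return (mu_r - mu_s) / sqrt(sig_r**2 + sig_s**2)

def reliability(beta):
    """P(g > 0) = Phi(beta) under the normal assumption."""
    return 0.5 * (1 + erf(beta / sqrt(2)))

def dbeta_dmu_r(sig_r, sig_s):
    """Sensitivity of the reliability index to the mean strength mu_r."""
    return 1.0 / sqrt(sig_r**2 + sig_s**2)

# Hypothetical strength (R) and stress (S) moments for a component.
beta = beta_index(mu_r=500.0, sig_r=40.0, mu_s=350.0, sig_s=30.0)  # 3.0
```

The sensitivity tells the designer how much reliability is bought per unit change in a design parameter, which is the practical use case the abstract describes.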

  13. A semi-supervised learning approach for RNA secondary structure prediction.

    Science.gov (United States)

    Yonemoto, Haruka; Asai, Kiyoshi; Hamada, Michiaki

    2015-08-01

    RNA secondary structure prediction is a key technology in RNA bioinformatics. Most algorithms for RNA secondary structure prediction use probabilistic models, in which the model parameters are trained with reliable RNA secondary structures. Because of the difficulty of determining RNA secondary structures by experimental procedures, such as NMR or X-ray crystal structural analyses, there are still many RNA sequences that could be useful for training whose secondary structures have not been experimentally determined. In this paper, we introduce a novel semi-supervised learning approach for training parameters in a probabilistic model of RNA secondary structures in which we employ not only RNA sequences with annotated secondary structures but also ones with unknown secondary structures. Our model is based on a hybrid of generative (stochastic context-free grammars) and discriminative models (conditional random fields) that has been successfully applied to natural language processing. Computational experiments indicate that the accuracy of secondary structure prediction is improved by incorporating RNA sequences with unknown secondary structures into training. To our knowledge, this is the first study of a semi-supervised learning approach for RNA secondary structure prediction. This technique will be useful when the number of reliable structures is limited. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Physics of Failure as a Basis for Solder Elements Reliability Assessment in Wind Turbines

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2012-01-01

description of the reliability. A physics of failure approach is applied. A SnAg solder component used in power electronics is used as an example. Crack propagation in the SnAg solder is modeled, and a model to assess the accumulated plastic strain is proposed based on a physics of failure approach. Based on the proposed model, it is described how to find the accumulated linear damage and reliability levels for a given temperature loading profile. Using structural reliability methods, the reliability levels of the electrical components are assessed by introducing scale factors for stresses.
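The accumulated linear damage referred to above follows the Palmgren-Miner rule. A minimal sketch (the cycle counts are hypothetical, not the paper's data):

```python
def linear_damage(cycles_applied, cycles_to_failure):
    """Palmgren-Miner accumulated linear damage: D = sum(n_i / N_i)."""
    return sum(n / N for n, N in zip(cycles_applied, cycles_to_failure))

# Hypothetical thermal-cycling profile for a SnAg solder joint:
# cycles applied at each loading level vs. cycles to failure there.
n = [2.0e4, 5.0e3, 1.0e3]
N = [1.0e6, 1.0e5, 1.0e4]
D = linear_damage(n, N)   # failure is predicted when D reaches 1.0
```

In the structural-reliability setting, D is then treated as a random variable and the probability of D exceeding 1 over the load profile gives the component's failure probability.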

  15. Interaction of CREDO [Centralized Reliability Data Organization] with the EBR-II [Experimental Breeder Reactor II] PRA [probabilistic risk assessment] development

    International Nuclear Information System (INIS)

    Smith, M.S.; Ragland, W.A.

    1989-01-01

The National Academy of Sciences review of US Department of Energy (DOE) class 1 reactors recommended that the Experimental Breeder Reactor II (EBR-II), operated by Argonne National Laboratory (ANL), develop a level 1 probabilistic risk assessment (PRA) and make provisions for level 2 and level 3 PRAs based on the results of the level 1 PRA. The PRA analysis group at ANL will utilize the Centralized Reliability Data Organization (CREDO) at Oak Ridge National Laboratory to support the PRA data needs. CREDO contains many years of empirical liquid-metal reactor component data from EBR-II. CREDO is a mutual data- and cost-sharing system sponsored by DOE and the Power Reactor and Nuclear Fuels Development Corporation of Japan. CREDO is a component-based data system; data are collected on components that are liquid-metal specific, associated with a liquid-metal environment, contained in systems that interface with liquid-metal environments, or are safety related for use in reliability/availability/maintainability (RAM) analyses of advanced reactors. The links between the EBR-II PRA development effort and the CREDO data collection at EBR-II extend beyond the sharing of data. The PRA provides a measure of the relative contribution to risk of the various components. This information can be used to prioritize future CREDO data collection activities at EBR-II and other sites

  16. Trading Water Conservation Credits: A Coordinative Approach for Enhanced Urban Water Reliability

    Science.gov (United States)

    Gonzales, P.; Ajami, N. K.

    2016-12-01

Water utilities in arid and semi-arid regions are increasingly relying on water use efficiency and conservation to extend the availability of supplies. Despite spatial and institutional inter-dependency of many service providers, these demand-side management initiatives have traditionally been tackled by individual utilities operating in a silo. In this study, we introduce a new approach to water conservation that addresses regional synergies: a novel system of tradable water conservation credits. Under the proposed approach, utilities have the flexibility to invest in water conservation measures that are appropriate for their specific service area. When utilities have insufficient capacity for local cost-effective measures, they may opt to purchase credits, contributing to fund subsidies for utilities that do have that capacity and can provide the credits, while the region as a whole benefits from more reliable water supplies. While similar programs have been used to address water quality concerns, to our knowledge this is one of the first studies proposing tradable credits for incentivizing water conservation. Through mathematical optimization, this study estimates the potential benefits of a trading program and demonstrates the institutional and economic characteristics needed for such a policy to be viable, including a proposed web platform to facilitate transparent regional planning, data-driven decision-making, and enhanced coordination of utilities. We explore the impacts of defining conservation targets tailored to local realities of utilities, setting credit prices, and different policy configurations. We apply these models to the case study of water utility members of the Bay Area Water Supply and Conservation Agency. Preliminary work shows that the diverse characteristics of these utilities present opportunities for the region to achieve conservation goals while maximizing the benefits to individual utilities through more flexible coordinative efforts.
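The trading mechanism can be caricatured in a few lines. The sketch below is purely illustrative (the decision rule, utility names, and numbers are invented; this is not the study's optimization model): at a given credit price, a utility conserves when its marginal cost beats the price, and its surplus or shortfall against the target becomes its credit position.

```python
def credit_positions(utilities, target, price):
    """Each utility conserves up to capacity when its marginal cost beats
    the credit price; its surplus/shortfall vs. the per-utility target is
    its credit position (positive = sells credits, negative = buys)."""
    return {name: (cap if cost < price else 0.0) - target
            for name, (cost, cap) in utilities.items()}

# name: (marginal conservation cost in $/MG, conservable capacity in MG)
utilities = {"A": (100.0, 50.0), "B": (400.0, 10.0), "C": (150.0, 30.0)}
pos = credit_positions(utilities, target=20.0, price=200.0)
```

Cheap conservers (A, C) over-comply and sell; the high-cost utility (B) buys its whole target, so regional savings are achieved where they are cheapest.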

  17. The cost of reliability

    International Nuclear Information System (INIS)

    Ilic, M.

    1998-01-01

In this article the restructuring process under way in the US power industry is revisited from the point of view of transmission system provision and reliability. While in the past the cost of reliability was rolled into the average cost of electricity to all users, it is not so obvious how this cost is managed in the new industry. A new MIT approach to transmission pricing is suggested here as a possible solution.

  18. Impact of Climate Change on Natural Snow Reliability, Snowmaking Capacities, and Wind Conditions of Ski Resorts in Northeast Turkey: A Dynamical Downscaling Approach

    Directory of Open Access Journals (Sweden)

    Osman Cenk Demiroglu

    2016-04-01

Full Text Available Many ski resorts worldwide are going through deteriorating snow cover conditions due to anthropogenic warming trends. As the natural and the artificially supported, i.e., technical, snow reliability of ski resorts diminishes, the industry approaches a deadlock. For this reason, impact assessment studies have become vital for understanding the vulnerability of ski tourism. This study considers three resorts at one of the rapidly emerging ski destinations, Northeast Turkey, for snow reliability analyses. Initially, one global circulation model is dynamically downscaled using the regional climate model RegCM4.4 for the 1971–2000 and 2021–2050 periods along the RCP4.5 greenhouse gas concentration pathway. Next, the projected climate outputs are converted into indicators of natural snow reliability, snowmaking capacity, and wind conditions. The results show an overall decline in the frequencies of naturally snow-reliable days and in snowmaking capacities between the two periods. Despite the decrease, only the lower altitudes of one ski resort would face the risk of losing natural snow reliability, and snowmaking could still compensate for forming the base layer before the critical New Year's week. On the other hand, adverse high-wind conditions improve so as to reduce the number of lift closure days at all resorts. Overall, this particular region seems to be relatively resilient against climate change.

  19. System reliability, performance and trust in adaptable automation.

    Science.gov (United States)

    Chavaillaz, Alain; Wastell, David; Sauer, Jürgen

    2016-01-01

    The present study examined the effects of reduced system reliability on operator performance and automation management in an adaptable automation environment. 39 operators were randomly assigned to one of three experimental groups: low (60%), medium (80%), and high (100%) reliability of automation support. The support system provided five incremental levels of automation which operators could freely select according to their needs. After 3 h of training on a simulated process control task (AutoCAMS) in which the automation worked infallibly, operator performance and automation management were measured during a 2.5-h testing session. Trust and workload were also assessed through questionnaires. Results showed that although reduced system reliability resulted in lower levels of trust towards automation, there were no corresponding differences in the operators' reliance on automation. While operators showed overall a noteworthy ability to cope with automation failure, there were, however, decrements in diagnostic speed and prospective memory with lower reliability. Copyright © 2015. Published by Elsevier Ltd.

  20. Systems reliability/structural reliability

    International Nuclear Information System (INIS)

    Green, A.E.

    1980-01-01

    The question of reliability technology using quantified techniques is considered for systems and structures. Systems reliability analysis has progressed to a viable and proven methodology whereas this has yet to be fully achieved for large scale structures. Structural loading variants over the half-time of the plant are considered to be more difficult to analyse than for systems, even though a relatively crude model may be a necessary starting point. Various reliability characteristics and environmental conditions are considered which enter this problem. The rare event situation is briefly mentioned together with aspects of proof testing and normal and upset loading conditions. (orig.)

  1. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

This book is about reliability engineering. It covers the definition and importance of reliability; the development of reliability engineering; failure rates and failure probability density functions and their types; CFR and the exponential distribution; IFR and the normal and Weibull distributions; maintainability and availability; reliability testing and reliability estimation for the exponential, normal, and Weibull distribution types; reliability sampling tests; system reliability; design for reliability; and functional failure analysis by FTA.
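The Weibull distribution listed above subsumes the CFR and IFR cases through its shape parameter. A minimal sketch of the reliability and failure-rate functions (the parameter values are illustrative):

```python
from math import exp

def weibull_reliability(t, beta, eta):
    """R(t) = exp(-(t/eta)**beta): probability of surviving to time t."""
    return exp(-((t / eta) ** beta))

def weibull_hazard(t, beta, eta):
    """Failure (hazard) rate h(t) = (beta/eta) * (t/eta)**(beta - 1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

# beta < 1: decreasing failure rate (early failures); beta = 1: constant
# failure rate, the exponential/CFR case; beta > 1: IFR, i.e. wear-out.
R = weibull_reliability(t=1000.0, beta=1.0, eta=2000.0)  # exp(-0.5)
```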

  2. Reliability-Based Decision Fusion in Multimodal Biometric Verification Systems

    Directory of Open Access Journals (Sweden)

    Kryszczuk Krzysztof

    2007-01-01

Full Text Available We present a methodology of reliability estimation in the multimodal biometric verification scenario. Reliability estimation has been shown to be an efficient and accurate way of predicting and correcting erroneous classification decisions in both unimodal (speech, face, online signature) and multimodal (speech and face) systems. While the initial research results indicate the high potential of the proposed methodology, the performance of the reliability estimation in a multimodal setting has not been sufficiently studied or evaluated. In this paper, we demonstrate the advantages of using the unimodal reliability information in order to perform an efficient biometric fusion of two modalities. We further show the presented method to be superior to state-of-the-art multimodal decision-level fusion schemes. The experimental evaluation presented in this paper is based on the popular benchmarking bimodal BANCA database.
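The core idea of exploiting unimodal reliability information in fusion can be sketched as a reliability-weighted score combination. This is a simplification of the paper's method, with hypothetical scores and reliability estimates:

```python
def fuse(scores, reliabilities):
    """Reliability-weighted sum of per-modality match scores in [0, 1]."""
    total = sum(reliabilities)
    return sum(s * r for s, r in zip(scores, reliabilities)) / total

def decide(fused, threshold=0.5):
    return "accept" if fused >= threshold else "reject"

# Face weakly suggests a genuine user but its decision is estimated as
# unreliable (e.g. poor illumination); speech confidently says impostor,
# so the more reliable modality dominates the fused decision.
fused = fuse(scores=[0.6, 0.2], reliabilities=[0.3, 0.9])  # about 0.3
```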

  3. Trends in Control Area of PLC Reliability and Safety Parameters

    Directory of Open Access Journals (Sweden)

    Juraj Zdansky

    2008-01-01

Full Text Available Extending the range of PLC applications is closely tied to improving PLC reliability and safety parameters. If the reliability and safety parameters are adequate, PLCs can be deployed in specific applications such as the control of safety-related processes. The first goal of this article is to show how producers are approaching the improvement of PLC reliability and safety parameters. The second goal is to analyze these parameters across the range of PLCs currently available and to describe how the reliability and safety parameters can be affected.

  4. Predicting Cost/Reliability/Maintainability of Advanced General Aviation Avionics Equipment

    Science.gov (United States)

    Davis, M. R.; Kamins, M.; Mooz, W. E.

    1978-01-01

A methodology is provided for assisting NASA in estimating the cost, reliability, and maintenance (CRM) requirements for general aviation avionics equipment operating in the 1980's. Practical problems of predicting these factors are examined. The usefulness and shortcomings of different approaches for modeling cost and reliability estimates are discussed, together with special problems caused by the lack of historical data on the cost of maintaining general aviation avionics. Suggestions are offered on how NASA might proceed in assessing CRM implications in the absence of reliable generalized predictive models.

  5. Safety and reliability analysis based on nonprobabilistic methods

    International Nuclear Information System (INIS)

    Kozin, I.O.; Petersen, K.E.

    1996-01-01

Imprecise probabilities, developed over the last two decades, offer a considerably more general theory with many advantages that make it very promising for reliability and safety analysis. The objective of the paper is to argue that imprecise probabilities are a more appropriate tool for reliability and safety analysis, that they allow the behavior of nuclear industry objects to be modeled more comprehensively, and that they make it possible to solve some problems that cannot be solved within the conventional approach. Furthermore, some specific examples are given from which the usefulness of the tool for solving reliability tasks can be seen
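A small example of what interval-valued probabilities buy in practice: when independent component failure probabilities are only known to within intervals, a series system's failure probability is itself bounded rather than a single number. The numbers are illustrative, and this sketch is only the simplest flavor of the imprecise-probability machinery the paper advocates:

```python
def series_failure_bounds(components):
    """Bounds on the failure probability of a series system whose
    independent components have failure probabilities known only to
    lie in intervals (p_lo, p_hi). The system fails if any component fails."""
    surv_best = 1.0    # survival with every component at its best (p_lo)
    surv_worst = 1.0   # survival with every component at its worst (p_hi)
    for p_lo, p_hi in components:
        surv_best *= (1 - p_lo)
        surv_worst *= (1 - p_hi)
    return 1 - surv_best, 1 - surv_worst   # (lower, upper) failure bounds

# Hypothetical pump and valve with imprecise per-demand failure probabilities.
lower, upper = series_failure_bounds([(0.01, 0.05), (0.02, 0.04)])
```

A safety case can then be judged against the upper bound, avoiding the false precision of a single point estimate.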

  6. Approach to Operational Experimental Estimation of Static Stresses of Elements of Mechanical Structures

    Science.gov (United States)

    Sedov, A. V.; Kalinchuk, V. V.; Bocharova, O. V.

    2018-01-01

The evaluation of static stresses and the strength of units and components is a crucial task for increasing the operational reliability of vehicles and equipment and for preventing emergencies, especially in structures made of metal and composite materials. At the stage of creation and commissioning of structures, diagnostic methods such as acoustic, ultrasonic, X-ray, and radiation techniques are widely used to control the manufacturing quality of individual elements and components. Using these methods to monitor the residual life and the degree of static stress of units and parts during operation is fraught with great difficulties, both methodological and instrumental. In this paper, the authors propose an effective approach for the operational control of the degree of static stress in units and parts of mechanical structures in working condition, based on recording changes in the surface-wave properties of a system consisting of a sensor and a controlled medium (unit or part). The proposed approach to low-frequency diagnostics of static stresses relies on a new adaptive spectral analysis of a surface wave created by an external action (impact), which makes it possible to estimate implicit structural stresses experimentally.

  7. Maximum Entropy Discrimination Poisson Regression for Software Reliability Modeling.

    Science.gov (United States)

    Chatzis, Sotirios P; Andreou, Andreas S

    2015-11-01

Reliably predicting software defects is one of the most significant tasks in software engineering. Two of the major components of modern software reliability modeling approaches are: 1) extraction of salient features for software system representation, based on appropriately designed software metrics and 2) development of intricate regression models for count data, to allow effective software reliability data modeling and prediction. Surprisingly, research in the latter frontier of count data regression modeling has been rather limited. More specifically, a lack of simple and efficient algorithms for posterior computation has made the Bayesian approaches appear unattractive, and thus underdeveloped in the context of software reliability modeling. In this paper, we try to address these issues by introducing a novel Bayesian regression model for count data, based on the concept of max-margin data modeling, effected in the context of a fully Bayesian model treatment with simple and efficient posterior distribution updates. Our novel approach yields a more discriminative learning technique, making more effective use of our training data during model inference. In addition, it allows better handling of uncertainty in the modeled data, which can be a significant problem when the training data are limited. We derive elegant inference algorithms for our model under the mean-field paradigm and exhibit its effectiveness using the publicly available benchmark data sets.
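For contrast with the max-margin Bayesian model the paper proposes, the baseline it improves upon, a plain maximum-likelihood Poisson regression, is compact enough to sketch in pure Python (Newton's method on the concave log-likelihood; the defect-count data are hypothetical):

```python
from math import exp, log

def poisson_fit(x, y, iters=30):
    """Newton's-method MLE for a Poisson GLM: log E[y] = b0 + b1*x.
    (A plain maximum-likelihood fit, not the paper's max-margin model.)"""
    b0, b1 = log(sum(y) / len(y)), 0.0           # start at the constant model
    for _ in range(iters):
        lam = [exp(b0 + b1 * xi) for xi in x]
        g0 = sum(yi - li for yi, li in zip(y, lam))                # score
        g1 = sum(xi * (yi - li) for xi, yi, li in zip(x, y, lam))
        a = sum(lam)                                               # Fisher
        b = sum(xi * li for xi, li in zip(x, lam))                 # information
        c = sum(xi * xi * li for xi, li in zip(x, lam))            # entries
        det = a * c - b * b
        b0 += (c * g0 - b * g1) / det            # solve 2x2 Newton system
        b1 += (a * g1 - b * g0) / det
    return b0, b1

# Hypothetical defect counts against a single complexity metric.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [1, 2, 4, 9, 16]
b0, b1 = poisson_fit(x, y)
```

At the MLE the score equations hold: total predicted counts match total observed counts, which is a convenient self-check for any Poisson-regression implementation.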

  8. A framework for reliability and risk centered maintenance

    International Nuclear Information System (INIS)

    Selvik, J.T.; Aven, T.

    2011-01-01

    Reliability centered maintenance (RCM) is a well-established analysis method for preventive maintenance planning. As its name indicates, reliability is the main point of reference for the planning, but consequences of failures are also assessed. However, uncertainties and risk are to a limited extent addressed by the RCM method, and in this paper we suggest an extension of the RCM to reliability and risk centered maintenance (RRCM) by also considering risk as the reference for the analysis in addition to reliability. A broad perspective on risk is adopted where uncertainties are the main component of risk in addition to possible events and associated consequences. A case from the offshore oil and gas industry is presented to illustrate and discuss the suggested approach.

  9. Experimental oligopolies modeling: A dynamic approach based on heterogeneous behaviors

    Science.gov (United States)

    Cerboni Baiardi, Lorenzo; Naimzada, Ahmad K.

    2018-05-01

Among behavioral rules, imitation-based heuristics have received special attention in economics (see [14] and [12]). In particular, imitative behavior is considered in order to understand the evidence arising in experimental oligopolies, which reveals that the Cournot-Nash equilibrium does not emerge as the unique outcome and shows that an important component of production at the competitive level is observed (see e.g. [1,3,9] or [7,10]). Building on the pioneering approach of [2], we construct a dynamical model of linear oligopolies in which the heterogeneous decision mechanisms of players are made explicit. In particular, we consider two different types of quantity-setting players, characterized by different decision mechanisms, that coexist and operate simultaneously: agents that adaptively adjust their choices in the direction that increases their profit are embedded with imitator agents. The latter use a particular form of the proportional imitation rule that takes into account awareness of the presence of strategic interactions. It is noteworthy that the Cournot-Nash outcome is a stationary state of our model. Our thesis is that the chaotic dynamics arising from a dynamical model with heterogeneous players can qualitatively reproduce the outcomes of experimental oligopolies.
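A stylized sketch of such heterogeneous dynamics (not the paper's exact specification; the demand and cost parameters are invented) illustrates two of the abstract's claims: the Cournot-Nash output, (a - c)/(3b) = 3 with the values below, is a stationary state, yet non-Nash rest points can also emerge when the imitator is already out-earning its rival and therefore never moves:

```python
def profit(q_own, q_other, a=10.0, b=1.0, c=1.0):
    """Cournot profit with linear demand p = a - b*(q1 + q2), unit cost c."""
    price = max(a - b * (q_own + q_other), 0.0)
    return (price - c) * q_own

def step(q1, q2, k1=0.3, k2=0.5, a=10.0, b=1.0, c=1.0):
    """One period: firm 1 adjusts toward its best response; firm 2
    imitates (moves toward the rival's output) only when the rival
    is earning a strictly higher profit."""
    best = (a - c - b * q2) / (2 * b)
    q1_new = q1 + k1 * (best - q1)
    q2_new = q2 + k2 * (q1 - q2) if profit(q1, q2) > profit(q2, q1) else q2
    return q1_new, q2_new

# From an asymmetric start the imitator keeps its large output (it always
# earns more here) while the adaptive firm converges to its best response.
q1, q2 = 1.0, 4.0
for _ in range(200):
    q1, q2 = step(q1, q2)
```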

  10. Reliability of salivary testosterone measurements in diagnosis of Polycystic Ovarian Syndrome

    Directory of Open Access Journals (Sweden)

    Omnia Youssef

    2010-07-01

    Conclusion: Determination of salivary testosterone is a reliable method to detect changes in the concentration of available biologically active testosterone in the serum. Salivary testosterone provides a sensitive, simple, reliable, non-invasive and uncomplicated diagnostic approach for PCOS.

  11. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility.

    Science.gov (United States)

    Zaballos, Agustín; Navarro, Joan; Martín De Pozuelo, Ramon

    2018-02-28

Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid's data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. The experiments conducted endorse the feasibility of this solution and encourage practitioners to direct their efforts in this direction.

  12. Humidity adsorption and transfer in hygroscopic materials. Percolation-type approach and experimentation

    International Nuclear Information System (INIS)

    Quenard, Daniel

    1989-01-01

Water vapor adsorption and transfer in microporous media are studied using a three-level hierarchical approach. At the microscopic level (pore size), we describe the basic phenomena (adsorption/desorption, capillary condensation, molecular and Knudsen diffusion, Hagen-Poiseuille flow) that occur during isothermal water vapor transport in a single cylindrical pore at steady state. Transport through a condensed pore is taken into account by its 'vapor equivalent flow', and we underline that capillary condensation may amplify vapor flow by several orders of magnitude. We suggest an electrical analogy between a cylindrical pore and a Zener diode. Then, at the mesoscopic level (material size), we introduce pore networks to provide a simplified description of the microstructure. Three types of networks are studied: square, triangular, and honeycomb. By randomly distributing single cylindrical pores on the 2D networks, we are able to estimate the sorption isotherms and the water vapor permeability, which are the two essential characteristics for understanding the behaviour of materials towards humidity. To develop this approach we refer to the percolation concept and use most of its principal results. To estimate the adsorption isotherms we introduce a surface adsorption model and use the KELVIN-LAPLACE equation. Hysteresis appears naturally through the 'ink-bottle' phenomenon, and it is all the more pronounced the more poorly connected the network is. The water vapor permeability is calculated using the electrical analogy (cylindrical pore-Zener diode). We emphasize a substantial amplification of the equivalent permeability when the relative humidity reaches a threshold value, which provides a possible explanation of numerous experimental results. The respective effects of pore size distribution and temperature on sorption isotherms and permeability are presented. We present several

  13. Application of fault tree analysis for customer reliability assessment of a distribution power system

    International Nuclear Information System (INIS)

    Abdul Rahman, Fariz; Varuttamaseni, Athi; Kintner-Meyer, Michael; Lee, John C.

    2013-01-01

    A new method is developed for predicting customer reliability of a distribution power system using the fault tree approach with customer weighted values of component failure frequencies and downtimes. Conventional customer reliability prediction of the electric grid employs the system average (SA) component failure frequency and downtime that are weighted by only the quantity of the components in the system. These SA parameters are then used to calculate the reliability and availability of components in the system, and eventually to find the effect on customer reliability. Although this approach is intuitive, information is lost regarding customer disturbance experiences when customer information is not utilized in the SA parameter calculations, contributing to inaccuracies when predicting customer reliability indices in our study. Hence our new approach directly incorporates customer disturbance information in component failure frequency and downtime calculations by weighting these parameters with information of customer interruptions. This customer weighted (CW) approach significantly improves the prediction of customer reliability indices when applied to our reliability model with fault tree and two-state Markov chain formulations. Our method has been successfully applied to an actual distribution power system that serves over 2.1 million customers. Our results show an improved benchmarking performance on the system average interruption frequency index (SAIFI) by 26% between the SA-based and CW-based reliability calculations. - Highlights: ► We model the reliability of a power system with fault tree and two-state Markov chain. ► We propose using customer weighted component failure frequencies and downtimes. ► Results show customer weighted values perform superior to component average values. ► This method successfully incorporates customer disturbance information into the model.
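The SA-versus-CW distinction above reduces to how failure frequencies are averaged. A minimal sketch with hypothetical feeder data (for a whole system, the customer-weighted frequency is exactly SAIFI: customer interruptions per customer served per year):

```python
def system_average(freqs):
    """Component failure frequency averaged over components only (SA)."""
    return sum(freqs) / len(freqs)

def customer_weighted(freqs, customers):
    """Failure frequency weighted by the customers each component
    interrupts (CW); system-wide this equals SAIFI."""
    return sum(f * n for f, n in zip(freqs, customers)) / sum(customers)

# Hypothetical feeders: failures per year and customers served downstream.
freqs = [0.1, 0.5, 0.2]
customers = [10000, 500, 4000]
sa = system_average(freqs)                  # ~0.267 failures/yr
cw = customer_weighted(freqs, customers)    # ~0.141 interruptions/customer-yr
```

Because the unreliable feeder here serves few customers, the SA figure overstates what customers actually experience, which is the kind of bias the customer-weighted approach corrects.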

  14. Reliability and safety of nuclear power stations

    International Nuclear Information System (INIS)

    Stepanek, S.

    1979-01-01

The main problems associated with assessing the safety and reliability of reactor pressure vessels are briefly discussed. Two approaches are being applied to the assessment: one is based on the crack arrest temperature, the other on the determination of conditions corresponding to brittle fracture formation and on the determination of the critical defect size. The importance is stressed of continuous in-service inspection, which may increase the reliability factor by up to 10⁴ times. (Z.M.)

  15. Telecommunications system reliability engineering theory and practice

    CERN Document Server

    Ayers, Mark L

    2012-01-01

"Increasing system complexity requires new, more sophisticated tools for system modeling and metric calculation. Bringing the field up to date, this book provides telecommunications engineers with practical tools for analyzing, calculating, and reporting availability, reliability, and maintainability metrics. It gives the background in system reliability theory and covers in-depth applications in fiber optic networks, microwave networks, satellite networks, power systems, and facilities management. Computer programming tools for simulating the approaches presented, using the Matlab software suite, are also provided"

  16. The development of a nuclear chemical plant human reliability management approach: HRMS and JHEDI

    International Nuclear Information System (INIS)

    Kirwan, Barry

    1997-01-01

    In the late 1980's, amidst the qualitative and quantitative validation of certain Human Reliability Assessment (HRA) techniques, there was a desire for a new technique specifically for a nuclear reprocessing plant being designed. The technique was to have the following attributes: it should be data-based rather than involving pure expert judgement; it was to be flexible, so that it would allow both relatively rapid screening and more detailed assessment; and it was to have sensitivity analysis possibilities, so that Human Factors design-related parameters, albeit at a gross level, could be brought into the risk assessment equation. The techniques and literature were surveyed, and it was decided that no one technique fulfilled these requirements, and so a new approach was developed. Two techniques were devised, the Human Reliability Management System (HRMS), and the Justification of Human Error Data Information (JHEDI) technique, the latter being essentially a quicker screening version of the former. Both techniques carry out task analysis, error analysis, and Performance Shaping Factor-based quantification, but JHEDI involves less detailed assessment than HRMS. Additionally, HRMS can be utilised to determine error reduction mechanisms, based on the way the Performance Shaping Factors are contributing to the assessed error probabilities. Both techniques are fully computerised and assessments are highly documentable and auditable, which was seen as a useful feature both by the company developing the techniques, and by the regulatory authorities assessing the final output risk assessments into which these two techniques fed data. This paper focuses in particular on the quantification process used by these techniques. The quantification approach for both techniques was principally one of extrapolation from real data to the desired Human Error Probability (HEP), based on a comparison between Performance Shaping Factor (PSF) profiles for the real, and the to

  17. DECISION USEFULNESS: TRADE-OFF ANTARA RELIABILITY DAN RELEVANCE

    Directory of Open Access Journals (Sweden)

    AGUS INDRA TENAYA

    2007-07-01

    Full Text Available The purpose of this article is to search for a trade-off solution between reliability and relevance. An approach that can be used to obtain more reliable and relevant financial statements is decision usefulness. This approach suggests that financial statements must be useful as a basis for investors' decision making. The change in the function of financial statements, from merely a tool of responsibility to a tool of decision making, has meant that historical cost-based financial statements can no longer be used to predict the future value of a firm. This problem could be solved by presenting financial statements with full disclosure. The discussion session shows that full disclosure results in more useful and reliable accounting information for use in the decision-making processes of various users.

  18. Equipment reliability process improvement and preventive maintenance optimization

    International Nuclear Information System (INIS)

    Darragi, M.; Georges, A.; Vaillancourt, R.; Komljenovic, D.; Croteau, M.

    2004-01-01

    The Gentilly-2 Nuclear Power Plant wants to optimize its preventive maintenance program through an Integrated Equipment Reliability Process. All equipment reliability related activities should be reviewed and optimized in a systematic approach especially for aging plants such as G2. This new approach has to be founded on best practices methods with the purpose of the rationalization of the preventive maintenance program and the performance monitoring of on-site systems, structures and components (SSC). A rational preventive maintenance strategy is based on optimized task scopes and frequencies depending on their applicability, critical effects on system safety and plant availability as well as cost-effectiveness. Preventive maintenance strategy efficiency is systematically monitored through degradation indicators. (author)

  19. Field Programmable Gate Array Reliability Analysis Guidelines for Launch Vehicle Reliability Block Diagrams

    Science.gov (United States)

    Al Hassan, Mohammad; Britton, Paul; Hatfield, Glen Spencer; Novack, Steven D.

    2017-01-01

    Field Programmable Gate Array (FPGA) integrated circuits (ICs) are among the key electronic components in today's sophisticated launch and space vehicle complex avionic systems, largely due to their superb reprogrammable and reconfigurable capabilities combined with relatively low non-recurring engineering (NRE) costs and a short design cycle. Consequently, FPGAs are prevalent ICs in communication protocols and control signal commands. This paper will identify reliability concerns and high-level guidelines to estimate FPGA total failure rates in a launch vehicle application. The paper will discuss hardware, hardware description language, and radiation-induced failures. The hardware contribution of the approach accounts for physical failures of the IC. The hardware description language portion will discuss the high-level FPGA programming languages and software/code reliability growth. The radiation portion will discuss FPGA susceptibility to space environment radiation.

  20. An Intuitionistic Fuzzy Methodology for Component-Based Software Reliability Optimization

    DEFF Research Database (Denmark)

    Madsen, Henrik; Grigore, Albeanu; Popenţiuvlǎdicescu, Florin

    2012-01-01

    Component-based software development is the current methodology facilitating agility in project management, software reuse in design and implementation, promoting quality and productivity, and increasing the reliability and performability. This paper illustrates the usage of intuitionistic fuzzy...... degree approach in modelling the quality of entities in imprecise software reliability computing in order to optimize management results. Intuitionistic fuzzy optimization algorithms are proposed to be used for complex software systems reliability optimization under various constraints....

  1. User's guide to the Reliability Estimation System Testbed (REST)

    Science.gov (United States)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.

  2. A novel optimization approach to estimating kinetic parameters of the enzymatic hydrolysis of corn stover

    Directory of Open Access Journals (Sweden)

    Fenglei Qi

    2016-01-01

    Full Text Available Enzymatic hydrolysis is an integral step in the conversion of lignocellulosic biomass to ethanol. The conversion of cellulose to fermentable sugars in the presence of inhibitors is a complex kinetic problem. In this study, we describe a novel approach to estimating the kinetic parameters underlying this process. The study employs experimental data measuring substrate and enzyme loadings and sugar and acid inhibitions for the production of glucose. Multiple objectives that minimize the difference between model predictions and experimental observations are formulated and optimized by adopting a multi-objective particle swarm optimization method. Model reliability is assessed by exploring the likelihood profile in each parameter space. Compared to previous studies, this approach improved the prediction of sugar yields by reducing the mean squared errors by 34% for glucose and 2.7% for cellobiose, indicating improved agreement between model predictions and the experimental data. Furthermore, kinetic parameters such as K2IG2, K1IG, K2IG, K1IA, and K3IA are identified as contributors to model non-identifiability and wide parameter confidence intervals. Model reliability analysis indicates possible ways to reduce model non-identifiability and tighten parameter confidence intervals. These results could help improve the design of lignocellulosic biorefineries by providing higher fidelity predictions of fermentable sugars under inhibitory conditions.

  3. Gearbox Reliability Collaborative Bearing Calibration

    Energy Technology Data Exchange (ETDEWEB)

    van Dam, J.

    2011-10-01

    NREL has initiated the Gearbox Reliability Collaborative (GRC) to investigate the root cause of the low wind turbine gearbox reliability. The GRC follows a multi-pronged approach based on a collaborative of manufacturers, owners, researchers and consultants. The project combines analysis, field testing, dynamometer testing, condition monitoring, and the development and population of a gearbox failure database. At the core of the project are two 750kW gearboxes that have been redesigned and rebuilt so that they are representative of the multi-megawatt gearbox topology currently used in the industry. These gearboxes are heavily instrumented and are tested in the field and on the dynamometer. This report discusses the bearing calibrations of the gearboxes.

  4. Limits on reliable information flows through stochastic populations.

    Science.gov (United States)

    Boczkowski, Lucas; Natale, Emanuele; Feinerman, Ofer; Korman, Amos

    2018-06-06

    Biological systems can share and collectively process information to yield emergent effects, despite inherent noise in communication. While man-made systems often employ intricate structural solutions to overcome noise, the structure of many biological systems is more amorphous. It is not well understood how communication noise may affect the computational repertoire of such groups. To approach this question we consider the basic collective task of rumor spreading, in which information from few knowledgeable sources must reliably flow into the rest of the population. We study the effect of communication noise on the ability of groups that lack stable structures to efficiently solve this task. We present an impossibility result which strongly restricts reliable rumor spreading in such groups. Namely, we prove that, in the presence of even moderate levels of noise that affect all facets of the communication, no scheme can significantly outperform the trivial one in which agents have to wait until directly interacting with the sources-a process which requires linear time in the population size. Our results imply that in order to achieve efficient rumor spread a system must exhibit either some degree of structural stability or, alternatively, some facet of the communication which is immune to noise. We then corroborate this claim by providing new analyses of experimental data regarding recruitment in Cataglyphis niger desert ants. Finally, in light of our theoretical results, we discuss strategies to overcome noise in other biological systems.

  5. SGHWR fuel performance, safety and reliability

    International Nuclear Information System (INIS)

    Pickman, D.O.; Inglis, G.H.

    1977-05-01

    The design principles involved in fuel pins and elements need to take account of the sometimes conflicting requirements of safety and reliability. The principal factors involved in this optimisation are discussed, and it is shown from fuel irradiation experience in the Winfrith SGHWR that the necessary bias towards safety has not resulted in a reliability level lower than that shown by other successful water reactor designs. Reliability has important economic implications. A detailed evaluation of SGHWR fuel defects shows that very few defects can be related to design, rating, or burn-up. This demonstrates that economic aspects have not over-ridden the criteria that must be met to achieve the desirable reliability level. It is possible that large scale experience with SGHWR fuel may eventually demonstrate that the balance is too much in favour of reliability, and consideration may be given to whether design changes favouring economy could be achieved without compromising safety. The safety criteria applied to SGHWR fuel are designed to avoid any possibility of a temperature runaway in any credible accident situation. The philosophy and the supporting experimental work programme are outlined, as are the fuel design features which particularly contribute to maximising safety margins. Reference is made to the new 60-pin fuel element to be used in the commercial SGHWRs and to its comparison in design and performance aspects with the 36-pin element that has been used to date in the Winfrith SGHWR. (author)

  6. Reliability analysis of idealized tunnel support system using probability-based methods with case studies

    Science.gov (United States)

    Gharouni-Nik, Morteza; Naeimi, Meysam; Ahadi, Sodayf; Alimoradi, Zahra

    2014-06-01

    In order to determine the overall safety of a tunnel support lining, a reliability-based approach is presented in this paper. Support elements in jointed rock tunnels are provided to control the ground movement caused by stress redistribution during the tunnel drive. The main support elements that contribute to the stability of the tunnel structure are identified in order to address various aspects of reliability and sustainability in the system. The selection of efficient support methods for rock tunneling is a key factor in reducing the number of problems during construction and keeping the project cost and time within the limited budget and planned schedule. This paper introduces a smart approach by which decision-makers will be able to find the overall reliability of the tunnel support system before selecting the final scheme of the lining system. Engineering reliability, a branch of statistics and probability, is applied to this field, and much effort has been made to use it in tunneling by investigating the reliability of the lining support system for the tunnel structure. Reliability analysis for evaluating the tunnel support performance is therefore the main idea used in this research. Decomposition approaches are used for producing the system block diagram and determining the failure probability of the whole system. The effectiveness of the proposed reliability model of the tunnel lining, together with the recommended approaches, is examined using several case studies, and the final value of reliability is obtained for different design scenarios. Considering the idea of a linear correlation between safety factors and reliability parameters, the values of isolated reliabilities are determined for the different structural components of the tunnel support system. In order to determine individual safety factors, finite element modeling is employed for different structural subsystems and the results of numerical analyses are obtained in

  7. A practical approach for solving multi-objective reliability redundancy allocation problems using extended bare-bones particle swarm optimization

    International Nuclear Information System (INIS)

    Zhang, Enze; Wu, Yifei; Chen, Qingwei

    2014-01-01

    This paper proposes a practical approach, combining bare-bones particle swarm optimization and sensitivity-based clustering for solving multi-objective reliability redundancy allocation problems (RAPs). A two-stage process is performed to identify promising solutions. Specifically, a new bare-bones multi-objective particle swarm optimization algorithm (BBMOPSO) is developed and applied in the first stage to identify a Pareto-optimal set. This algorithm mainly differs from other multi-objective particle swarm optimization algorithms in the parameter-free particle updating strategy, which is especially suitable for handling the complexity and nonlinearity of RAPs. Moreover, by utilizing an approach based on the adaptive grid to update the global particle leaders, a mutation operator to improve the exploration ability and an effective constraint handling strategy, the integrated BBMOPSO algorithm can generate excellent approximation of the true Pareto-optimal front for RAPs. This is followed by a data clustering technique based on difference sensitivity in the second stage to prune the obtained Pareto-optimal set and obtain a small, workable sized set of promising solutions for system implementation. Two illustrative examples are presented to show the feasibility and effectiveness of the proposed approach
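The parameter-free particle updating strategy at the core of bare-bones PSO can be illustrated in single-objective form. This is a hedged sketch only: the function name is invented, and the paper's BBMOPSO additionally maintains a Pareto archive with adaptive-grid leader selection, a mutation operator, and a constraint handling strategy, none of which is reproduced here. Each coordinate is simply resampled from a Gaussian centred midway between the particle's personal best and the global best, with standard deviation equal to their separation, so no velocity or inertia parameters need tuning.

```python
import random

def bare_bones_pso(f, dim, n_particles=20, iters=200, lo=-5.0, hi=5.0, seed=1):
    """Minimal single-objective bare-bones PSO sketch: positions are drawn
    from N((pbest + gbest)/2, |pbest - gbest|) per coordinate, so the search
    contracts automatically as personal and global bests converge."""
    rng = random.Random(seed)
    pbest = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    pval = [f(p) for p in pbest]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for _ in range(iters):
        for i in range(n_particles):
            x = [rng.gauss((pbest[i][d] + gbest[d]) / 2.0,
                           abs(pbest[i][d] - gbest[d]))
                 for d in range(dim)]
            fx = f(x)
            if fx < pval[i]:                 # update personal best
                pbest[i], pval[i] = x, fx
                if fx < gval:                # update global best
                    gbest, gval = x[:], fx
    return gbest, gval

sphere = lambda x: sum(v * v for v in x)
best, val = bare_bones_pso(sphere, dim=3)
print(val)  # a small value near 0 for this smooth test function
```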

  8. Hemorrhoids: an experimental model in monkeys

    Directory of Open Access Journals (Sweden)

    Plapler Hélio

    2006-01-01

    Full Text Available PURPOSE: Hemorrhoids are a matter of concern due to a painful outcome. We describe a simple, easy and reliable experimental model to produce hemorrhoids in monkeys. METHODS: 14 monkeys (Cebus apella) were used. After general anesthesia, hemorrhoids were induced by ligation of the inferior hemorrhoidal vein, whose anatomy is very similar to that of humans. The vein was located through a perianal incision, dissected and ligated with a 3-0 vicryl suture. The skin was sutured with a 4-0 catgut thread. Animals were kept in appropriate cages and evaluated daily. RESULTS: Nine days later there were hemorrhoidal piles in the anus in fifty percent (50%) of the animals. The outcome was unremarkable. There was no bleeding and all animals showed no signs of pain or suffering. CONCLUSION: This is an affordable and reliable experimental model for inducing hemorrhoids for experimental studies.

  9. Electrical system design and reliability at Ontario Hydro nuclear generating stations

    Energy Technology Data Exchange (ETDEWEB)

    Royce, C. J. [Ontario Hydro, 700 University Avenue, Toronto, Ontario M5G 1X6 (Canada)

    1986-02-15

    This paper provides an overview of design practice and the predicted and actual reliability of electrical station service systems at Ontario Hydro nuclear generating stations. Operational experience and licensing changes have indicated the desirability of improving reliability in certain instances. For example, the requirement to start large emergency coolant injection pumps resulted in the turbine generator units in a multi-unit station being used as a back-up power supply. Results of reliability analyses are discussed. To mitigate the effects of common mode events, Ontario Hydro adopted a 'two group' approach to the design of safety-related systems. This 'two group' approach is reviewed, and a single, fully environmentally qualified standby power supply is proposed for future use. (author)

  10. Reliability allocation problem in a series-parallel system

    International Nuclear Information System (INIS)

    Yalaoui, Alice; Chu, Chengbin; Chatelet, Eric

    2005-01-01

    In order to improve system reliability, designers may introduce different technologies in parallel in a system. When each technology is composed of components in series, the configuration belongs to the class of series-parallel systems. This type of system has not been studied as much as the parallel-series architecture, and no existing methods are dedicated to reliability allocation in series-parallel systems with different technologies. We propose in this paper theoretical and practical results for the allocation problem in a series-parallel system. Two resolution approaches are developed. First, a one-stage problem is studied and the results are exploited for the multi-stage problem. A theoretical condition for obtaining the optimal allocation is derived. Since this condition is too restrictive, we then propose an alternative approach based on an approximated function and the results of the one-stage study. This second approach is applied to numerical examples
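The structure function of such a system is standard: with technologies (branches) in parallel and each technology a series of components, a branch works only if all of its components work, and the system fails only if every branch fails. A minimal sketch with made-up reliability numbers:

```python
from math import prod

def series_parallel_reliability(technologies):
    """System reliability for parallel branches of series components:
    R = 1 - prod_j(1 - prod_i r_ij)."""
    # reliability of each parallel branch (series of components)
    branch = [prod(r) for r in technologies]
    # the system fails only if every branch fails
    return 1.0 - prod(1.0 - rb for rb in branch)

techs = [
    [0.95, 0.90],        # technology A: two components in series -> 0.855
    [0.85, 0.92, 0.99],  # technology B: three components in series -> ~0.774
]
print(series_parallel_reliability(techs))  # ~0.9673
```

The allocation problem the paper studies is then to choose the component reliabilities r_ij (at some cost) so that this system reliability meets a target.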

  11. An analytical framework for reliability growth of one-shot systems

    International Nuclear Information System (INIS)

    Hall, J. Brian; Mosleh, Ali

    2008-01-01

    In this paper, we introduce a new reliability growth methodology for one-shot systems that is applicable to the case where all corrective actions are implemented at the end of the current test phase. The methodology consists of four model equations for assessing: expected reliability, the expected number of failure modes observed in testing, the expected probability of discovering new failure modes, and the expected portion of system unreliability associated with repeat failure modes. These model equations provide an analytical framework for which reliability practitioners can estimate reliability improvement, address goodness-of-fit concerns, quantify programmatic risk, and assess reliability maturity of one-shot systems. A numerical example is given to illustrate the value and utility of the presented approach. This methodology is useful to program managers and reliability practitioners interested in applying the techniques above in their reliability growth program

  12. Reliability assessment of Port Harcourt 33/11kv Distribution System ...

    African Journals Online (AJOL)

    This makes reliability studies an important task besides all the other analyses required for assessing the system performance. The paper presents an analytical approach in the reliability assessment of the Port Harcourt 33/11kV power distribution system. The assessment was performed with the 2009 power outage data ...

  13. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  14. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  15. Advanced solutions for operational reliability improvements

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, K [VTT Manufacturing Technology, Espoo (Finland)

    1998-12-31

    A great number of new technical tools have recently been developed for improving the operational reliability of machines and industrial equipment. Examples of such techniques and tools developed at the Technical Research Centre of Finland (VTT) are: a metallographic approach for steam-piping lifetime estimation, an expert system, AURORA, for corrosion prediction and material selection, an automatic image-processing-based on-line wear particle analysis system, microsensors for condition monitoring, a condition monitoring and expert system, CEPDIA, for the diagnosis of centrifugal pumps, a machine tool analysis and diagnostic expert system, non-leakage magnetic fluid seals with extended lifetimes, and diamond-like surface coatings that give components decreased friction and wear. A hyperbook-supported holistic approach to problem solving in maintenance and reliability engineering has been developed to help the user achieve a holistic understanding of the problem and its relationships, to navigate among the several technical tools and methods available, and to find those suitable for his application. (orig.)

  16. Advanced solutions for operational reliability improvements

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, K. [VTT Manufacturing Technology, Espoo (Finland)

    1997-12-31

    A great number of new technical tools have recently been developed for improving the operational reliability of machines and industrial equipment. Examples of such techniques and tools developed at the Technical Research Centre of Finland (VTT) are: a metallographic approach for steam-piping lifetime estimation, an expert system, AURORA, for corrosion prediction and material selection, an automatic image-processing-based on-line wear particle analysis system, microsensors for condition monitoring, a condition monitoring and expert system, CEPDIA, for the diagnosis of centrifugal pumps, a machine tool analysis and diagnostic expert system, non-leakage magnetic fluid seals with extended lifetimes, and diamond-like surface coatings that give components decreased friction and wear. A hyperbook-supported holistic approach to problem solving in maintenance and reliability engineering has been developed to help the user achieve a holistic understanding of the problem and its relationships, to navigate among the several technical tools and methods available, and to find those suitable for his application. (orig.)

  17. Reliable Gait Recognition Using 3D Reconstructions and Random Forests - An Anthropometric Approach

    DEFF Research Database (Denmark)

    Sandau, Martin; Heimbürger, Rikke V.; Jensen, Karl E.

    2016-01-01

    reliable recognition. Sixteen participants performed normal walking where 3D reconstructions were obtained continually. Segment lengths and kinematics from the extremities were manually extracted by eight expert observers. The results showed that all the participants were recognized, assuming the same...... expert annotated the data. Recognition based on data annotated by different experts was less reliable achieving 72.6% correct recognitions as some parameters were heavily affected by interobserver variability. This study verified that 3D reconstructions are feasible for forensic gait analysis...

  18. Methodology for allocating reliability and risk

    International Nuclear Information System (INIS)

    Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.

    1986-05-01

    This report describes a methodology for reliability and risk allocation in nuclear power plants. The work investigates the technical feasibility of allocating reliability and risk, which are expressed in a set of global safety criteria and which may not necessarily be rigid, to various reactor systems, subsystems, components, operations, and structures in a consistent manner. The report also provides a general discussion of the problem of reliability and risk allocation. The problem is formulated as a multiattribute decision analysis paradigm. The work mainly addresses the first two steps of a typical decision analysis, i.e., (1) identifying alternatives, and (2) generating information on outcomes of the alternatives, by performing a multiobjective optimization on a PRA model and reliability cost functions. The multiobjective optimization serves as the guiding principle for reliability and risk allocation. The concept of 'noninferiority' is used in the multiobjective optimization problem, and finding the noninferior solution set is the main theme of the current approach. The final step of decision analysis, i.e., assessment of the decision maker's preferences, could then be performed more easily on the noninferior solution set. Some results of applying the methodology to a nontrivial risk model are provided, and several outstanding issues such as generic allocation, preference assessment, and uncertainty are discussed. 29 refs., 44 figs., 39 tabs
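A noninferior (Pareto-optimal) set can be extracted from any finite candidate set with a simple dominance filter. A hedged sketch with two assumed objectives, maximizing reliability and minimizing cost (the report's actual PRA model and reliability cost functions are far richer):

```python
def noninferior(solutions):
    """Filter a candidate set down to its noninferior (Pareto-optimal)
    members. Objectives assumed for illustration: maximize reliability,
    minimize cost. A solution is dropped if some other solution is at
    least as good in both objectives and strictly better in one."""
    def dominates(a, b):
        return (a["reliability"] >= b["reliability"] and a["cost"] <= b["cost"]
                and (a["reliability"] > b["reliability"] or a["cost"] < b["cost"]))
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t is not s)]

candidates = [
    {"reliability": 0.90, "cost": 10},
    {"reliability": 0.95, "cost": 14},
    {"reliability": 0.92, "cost": 16},  # dominated by (0.95, 14)
    {"reliability": 0.99, "cost": 30},
]
frontier = noninferior(candidates)
print(frontier)  # the three undominated solutions
```

The decision maker's preferences then only need to be assessed over this reduced frontier rather than over the full candidate set.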

  19. Reliability Correction for Functional Connectivity: Theory and Implementation

    Science.gov (United States)

    Mueller, Sophia; Wang, Danhong; Fox, Michael D.; Pan, Ruiqi; Lu, Jie; Li, Kuncheng; Sun, Wei; Buckner, Randy L.; Liu, Hesheng

    2016-01-01

    Network properties can be estimated using functional connectivity MRI (fcMRI). However, regional variation of the fMRI signal causes systematic biases in network estimates including correlation attenuation in regions of low measurement reliability. Here we computed the spatial distribution of fcMRI reliability using longitudinal fcMRI datasets and demonstrated how pre-estimated reliability maps can correct for correlation attenuation. As a test case of reliability-based attenuation correction we estimated properties of the default network, where reliability was significantly lower than average in the medial temporal lobe and higher in the posterior medial cortex, heterogeneity that impacts estimation of the network. Accounting for this bias using attenuation correction revealed that the medial temporal lobe’s contribution to the default network is typically underestimated. To render this approach useful to a greater number of datasets, we demonstrate that test-retest reliability maps derived from repeated runs within a single scanning session can be used as a surrogate for multi-session reliability mapping. Using data segments with different scan lengths between 1 and 30 min, we found that test-retest reliability of connectivity estimates increases with scan length while the spatial distribution of reliability is relatively stable even at short scan lengths. Finally, analyses of tertiary data revealed that reliability distribution is influenced by age, neuropsychiatric status and scanner type, suggesting that reliability correction may be especially important when studying between-group differences. Collectively, these results illustrate that reliability-based attenuation correction is an easily implemented strategy that mitigates certain features of fMRI signal nonuniformity. PMID:26493163
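The underlying correction is the classical Spearman disattenuation formula, which the approach applies voxel-wise using pre-estimated reliability maps. A one-function scalar sketch (the function name and clamping choice are illustrative, not the paper's implementation):

```python
import math

def attenuation_corrected_r(r_observed, reliability_a, reliability_b):
    """Classical Spearman correction for attenuation: an observed
    correlation between two noisy signals is scaled up by the square root
    of the product of their test-retest reliabilities."""
    corrected = r_observed / math.sqrt(reliability_a * reliability_b)
    return max(-1.0, min(1.0, corrected))  # clamp to a valid correlation

# A low-reliability seed region's connectivity is underestimated before
# correction (numbers are made up for illustration).
print(attenuation_corrected_r(0.30, 0.45, 0.80))  # 0.5
```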

  20. An experimental approach to validating a theory of human error in complex systems

    Science.gov (United States)

    Morris, N. M.; Rouse, W. B.

    1985-01-01

    The problem of 'human error' is pervasive in engineering systems in which the human is involved. In contrast to the common engineering approach of dealing with error probabilistically, the present research seeks to alleviate problems associated with error by gaining a greater understanding of causes and contributing factors from a human information processing perspective. The general approach involves identifying conditions which are hypothesized to contribute to errors, and experimentally creating the conditions in order to verify the hypotheses. The conceptual framework which serves as the basis for this research is discussed briefly, followed by a description of upcoming research. Finally, the potential relevance of this research to design, training, and aiding issues is discussed.

  1. Efficient surrogate models for reliability analysis of systems with multiple failure modes

    International Nuclear Information System (INIS)

    Bichon, Barron J.; McFarland, John M.; Mahadevan, Sankaran

    2011-01-01

    Despite many advances in the field of computational reliability analysis, the efficient estimation of the reliability of a system with multiple failure modes remains a persistent challenge. Various sampling and analytical methods are available, but they typically require accepting a tradeoff between accuracy and computational efficiency. In this work, a surrogate-based approach is presented that simultaneously addresses the issues of accuracy, efficiency, and unimportant failure modes. The method is based on the creation of Gaussian process surrogate models that are required to be locally accurate only in the regions of the component limit states that contribute to system failure. This approach to constructing surrogate models is demonstrated to be both an efficient and accurate method for system-level reliability analysis. - Highlights: → Extends efficient global reliability analysis to systems with multiple failure modes. → Constructs locally accurate Gaussian process models of each response. → Highly efficient and accurate method for assessing system reliability. → Effectiveness is demonstrated on several test problems from the literature.
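The system-level failure criterion underlying this approach (the system fails if any contributing mode fails) can be illustrated with a plain Monte Carlo estimate. The sketch below substitutes direct evaluation of two toy limit states for the paper's locally accurate Gaussian process surrogates; the limit-state functions and input distributions are invented for illustration:

```python
import random

random.seed(1)

# Component limit states: g_i(x) <= 0 denotes failure of mode i.
# The series system fails if EITHER mode fails.
def g1(x1, x2):
    return 3.0 - (x1 + x2)      # sum-type failure mode

def g2(x1, x2):
    return 2.5 - (x1 - x2)      # difference-type failure mode

def system_pf(n=100_000):
    """Monte Carlo estimate of the system failure probability with
    standard normal inputs."""
    fails = 0
    for _ in range(n):
        x1, x2 = random.gauss(0, 1), random.gauss(0, 1)
        if g1(x1, x2) <= 0 or g2(x1, x2) <= 0:
            fails += 1
    return fails / n

print(f"estimated system P(failure) ~ {system_pf():.4f}")
```

The surrogate-based method in the abstract replaces the expensive `g1`/`g2` calls with Gaussian process models refined only near the portions of each limit state that actually contribute to system failure.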

  2. The Threat of Uncertainty: Why Using Traditional Approaches for Evaluating Spacecraft Reliability are Insufficient for Future Human Mars Missions

    Science.gov (United States)

    Stromgren, Chel; Goodliff, Kandyce; Cirillo, William; Owens, Andrew

    2016-01-01

Through the Evolvable Mars Campaign (EMC) study, the National Aeronautics and Space Administration (NASA) continues to evaluate potential approaches for sending humans beyond low Earth orbit (LEO). A key aspect of these missions is the strategy that is employed to maintain and repair the spacecraft systems, ensuring that they continue to function and support the crew. Long duration missions beyond LEO present unique and severe maintainability challenges due to a variety of factors, including: limited to no opportunities for resupply, the distance from Earth, mass and volume constraints of spacecraft, high sensitivity of transportation element designs to variation in mass, the lack of abort opportunities to Earth, limited hardware heritage information, and the operation of human-rated systems in a radiation environment with little to no experience. The current approach to maintainability, as implemented on ISS, which includes a large number of spares pre-positioned on ISS, a larger supply sitting on Earth waiting to be flown to ISS, and on-demand delivery of logistics from Earth, is not feasible for future deep space human missions. For missions beyond LEO, significant modifications to the maintainability approach will be required. Through the EMC evaluations, several key findings related to the reliability and safety of the Mars spacecraft have been made. The nature of random and induced failures presents significant issues for deep space missions. Because spare parts cannot be flown as needed for Mars missions, all required spares must be flown with the mission or pre-positioned. These spares must cover all anticipated failure modes and provide a level of overall reliability and safety that is satisfactory for human missions. This will require a large amount of mass and volume to be dedicated to the storage and transport of spares for the mission. Further, there is, and will continue to be, a significant amount of uncertainty regarding failure rates for spacecraft

  3. Novel approach based on microcontroller to online protection of induction motors

    International Nuclear Information System (INIS)

    Bayindir, Ramazan; Sefa, Ibrahim

    2007-01-01

The study presents a combined protection approach for induction motors (IMs). To achieve this, the current, voltage, speed and temperature values of the IM were measured with sensors and processed automatically with software developed in C, which was then embedded in a microcontroller. The experimental results have shown that the IM was protected against the possible problems encountered in online operation. The approach presented in this work provides flexibility, accuracy and reliability for smooth protection. Moreover, the protection system can easily be applied to larger motors after small modifications to the software. It is also expected that the proposed system will provide faster and lower-cost protection

  4. To the problem of reliability standardization in computer-aided manufacturing at NPP units

    International Nuclear Information System (INIS)

    Yastrebenetskij, M.A.; Shvyryaev, Yu.V.; Spektor, L.I.; Nikonenko, I.V.

    1989-01-01

The problems of reliability standardization in computer-aided manufacturing at NPP units are analyzed, considering the following approaches: computer-aided manufacturing of NPP units as part of an automated technological complex, and computer-aided manufacturing of NPP units as a multi-functional system. The selection of the composition of reliability indices for computer-aided manufacturing of NPP units is substantiated for each of the approaches considered

  5. Reliability Evaluation of Service-Oriented Architecture Systems Considering Fault-Tolerance Designs

    Directory of Open Access Journals (Sweden)

    Kuan-Li Peng

    2014-01-01

strategies. Sensitivity analysis of SOA at both coarse and fine grain levels is also studied, which can be used to efficiently identify the critical parts within the system. Two SOA system scenarios based on real industrial practices are studied. Experimental results show that the proposed SOA model can be used to accurately depict the behavior of SOA systems. Additionally, a sensitivity analysis that quantifies the effects of system structure as well as fault tolerance on the overall reliability is also studied. On the whole, the proposed reliability modeling and analysis framework may help the SOA system service provider to evaluate the overall system reliability effectively and also make smarter improvement plans by focusing resources on enhancing reliability-sensitive parts within the system.
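Coarse-grain analyses of this kind typically compose service reliabilities in series and model a fault-tolerance design as parallel redundancy; the sensitivity of the system reliability to one service is then the product of the other services' reliabilities. A minimal sketch under those standard assumptions (the service names and reliability values are hypothetical, not taken from the paper):

```python
# Series composition of services, with one service hardened by a
# fault-tolerance (parallel redundancy) design.
def series(*rs):
    p = 1.0
    for r in rs:
        p *= r
    return p

def parallel(*rs):
    q = 1.0
    for r in rs:
        q *= (1.0 - r)
    return 1.0 - q

base = {"auth": 0.999, "catalog": 0.99, "payment": 0.98}

# Without redundancy:
r_plain = series(*base.values())

# With the payment service replicated twice (fault tolerance):
r_ft = series(base["auth"], base["catalog"], parallel(0.98, 0.98))

# Coarse-grain sensitivity: for a series system, dR_sys/dR_i is the
# product of the other components' reliabilities.
sens = {k: series(*(v for kk, v in base.items() if kk != k))
        for k in base}
print(round(r_plain, 4), round(r_ft, 4))
print(max(sens, key=sens.get))  # the most reliability-sensitive service
```

Here the least reliable service ("payment") has the highest sensitivity, which is the kind of result that lets a provider focus improvement resources on the reliability-sensitive parts of the system.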

  6. An experimental approach to improve the basin type solar still using an integrated natural circulation loop

    International Nuclear Information System (INIS)

    Rahmani, Ahmed; Boutriaa, Abdelouahab; Hadef, Amar

    2015-01-01

Highlights: • A new experimental approach to improve the performance of the conventional solar still is proposed. • A passive natural circulation loop is integrated into the conventional solar still. • Natural circulation of humid air in a closed loop is studied. • The capability of natural circulation to drive air convection in the still is demonstrated. • The air convection created inside the still increases the evaporative heat and mass transfer. - Abstract: In this paper, a new experimental approach is proposed to enhance the performance of the conventional solar still using the natural circulation effect inside the still. The idea consists of generating air flow by a rectangular natural circulation loop appended to the rear side of the still. The proposed still was tested during the summer period, and the experimental data presented in this paper concern four typical days. The convective heat transfer coefficient is evaluated and compared with Dunkle's model. The comparison shows that convective heat transfer is considerably improved by the air convection created inside the still. The natural circulation phenomenon in the still is studied and good agreement between the experimental data and Vijayan's laminar correlation is found. Natural circulation is therefore found to have a beneficial effect on the still's performance: the daily productivity is 3.72 kg/m² and the maximum efficiency is 45.15%
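Dunkle's model, against which the measured coefficient is compared, is commonly written as h_cw = 0.884·[(T_w − T_g) + (P_w − P_g)(T_w + 273)/(268.9×10³ − P_w)]^(1/3), with temperatures in °C and pressures in Pa. A sketch of that textbook relation (the saturation-pressure correlation below is one commonly used in the solar-still literature, not taken from this paper, and the sample temperatures are illustrative):

```python
import math

def p_sat(t_c: float) -> float:
    """Saturation vapour pressure (Pa); correlation widely used in
    solar-still studies: P = exp(25.317 - 5144 / (T + 273))."""
    return math.exp(25.317 - 5144.0 / (t_c + 273.0))

def h_dunkle(t_water: float, t_glass: float) -> float:
    """Convective heat transfer coefficient (W/m^2 K) between the
    water surface and the glass cover, per Dunkle's relation.
    Assumes t_water > t_glass."""
    pw, pg = p_sat(t_water), p_sat(t_glass)
    dt = (t_water - t_glass) + (pw - pg) * (t_water + 273.0) / (268.9e3 - pw)
    return 0.884 * dt ** (1.0 / 3.0)

# Example: 60 °C water under a 45 °C glass cover.
print(round(h_dunkle(60.0, 45.0), 2))
```

The enhancement reported in the abstract corresponds to measured coefficients exceeding what this free-convection relation predicts, because the circulation loop adds forced air motion inside the still.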

  7. Reliability demonstration of imaging surveillance systems

    International Nuclear Information System (INIS)

    Sheridan, T.F.; Henderson, J.T.; MacDiarmid, P.R.

    1979-01-01

    Security surveillance systems which employ closed circuit television are being deployed with increasing frequency for the protection of property and other valuable assets. A need exists to demonstrate the reliability of such systems before their installation to assure that the deployed systems will operate when needed with only the scheduled amount of maintenance and support costs. An approach to the reliability demonstration of imaging surveillance systems which employ closed circuit television is described. Failure definitions based on industry television standards and imaging alarm assessment criteria for surveillance systems are discussed. Test methods which allow 24 hour a day operation without the need for numerous test scenarios, test personnel and elaborate test facilities are presented. Existing reliability demonstration standards are shown to apply which obviate the need for elaborate statistical tests. The demonstration methods employed are shown to have applications in other types of imaging surveillance systems besides closed circuit television
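The point that standard demonstration methods obviate elaborate statistical tests can be illustrated with the simplest such test: a zero-failure reliability demonstration under an exponential failure model, where the required total test time follows directly from the target MTBF and the confidence level. This is a generic illustration of the technique, not the specific demonstration method of the paper:

```python
import math

def zero_failure_test_time(mtbf_target: float, confidence: float) -> float:
    """Total test time (device-hours) needed to demonstrate the target
    MTBF at the given confidence, assuming exponentially distributed
    failures and zero failures observed during the test:
    T = MTBF * ln(1 / (1 - confidence))."""
    alpha = 1.0 - confidence
    return mtbf_target * math.log(1.0 / alpha)

# Demonstrating a 2000 h MTBF at 90 % confidence:
print(round(zero_failure_test_time(2000.0, 0.90)))  # 4605
```

Running several cameras in parallel around the clock, as the test method in the abstract suggests, accumulates these device-hours quickly without elaborate scenarios.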

  8. Fuzzy QFD for supply chain management with reliability consideration

    International Nuclear Information System (INIS)

    Sohn, So Young; Choi, In Su

    2001-01-01

Although many products are made through several tiers of supply chains, a systematic way of handling reliability issues in various product planning stages has drawn attention only recently, in the context of supply chain management (SCM). The main objective of this paper is to develop a fuzzy quality function deployment (QFD) model in order to convey the fuzzy relationship between customers' needs and design specifications for reliability in the context of SCM. A fuzzy multi-criteria decision-making procedure is proposed and applied to find a set of optimal solutions with respect to the performance of the reliability test needed in CRT design. It is expected that the proposed approach can make significant contributions in the following areas: effectively communicating with technical personnel and users; developing a relatively error-free reliability review system; and creating consistent and complete documentation for design for reliability

  9. Fuzzy QFD for supply chain management with reliability consideration

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, So Young; Choi, In Su

    2001-06-01

Although many products are made through several tiers of supply chains, a systematic way of handling reliability issues in various product planning stages has drawn attention only recently, in the context of supply chain management (SCM). The main objective of this paper is to develop a fuzzy quality function deployment (QFD) model in order to convey the fuzzy relationship between customers' needs and design specifications for reliability in the context of SCM. A fuzzy multi-criteria decision-making procedure is proposed and applied to find a set of optimal solutions with respect to the performance of the reliability test needed in CRT design. It is expected that the proposed approach can make significant contributions in the following areas: effectively communicating with technical personnel and users; developing a relatively error-free reliability review system; and creating consistent and complete documentation for design for reliability.

  10. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility

    Science.gov (United States)

    2018-01-01

Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid’s data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. Conducted experimentations endorse the feasibility of this solution and encourage practitioners to point their efforts in this direction. PMID:29495599

  11. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility

    Directory of Open Access Journals (Sweden)

    Agustín Zaballos

    2018-02-01

Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid’s data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. Conducted experimentations endorse the feasibility of this solution and encourage practitioners to point their efforts in this direction.

  12. Human factors perspective on the reliability of NDT in nuclear applications

    International Nuclear Information System (INIS)

    Bertovic, Marija; Mueller, Christina; Fahlbruch, Babette

    2013-01-01

A series of research studies have been conducted over the course of five years venturing into the fields of in-service inspections (ISI) in nuclear power plants (NPPs) and inspection of manufactured components to be used for permanent nuclear waste disposal. This paper will provide an overview of four research studies, present selected experimental results and suggest ways for optimization of the NDT process, procedures, and training. The experimental results have shown that time pressure and mental workload negatively influence the quality of the manual inspection performance. Noticeable were influences of the organization of the working schedule, communication, procedures, supervision, and demonstration task. Customized Failure Mode and Effects Analysis (FMEA) was used to identify potential human risks, arising during acquisition and evaluation of NDT data. Several preventive measures were suggested and furthermore discussed, with respect to problems that could arise from their application. Experimental results show that implementing human redundancy in critical tasks, such as defect identification, as well as using an automated aid (software) to help operators in decision making about the existence and size of defects, could lead to other kinds of problems, namely social loafing and automation bias that might affect the reliability of NDT in an undesired manner. Shifting focus from the operator, as the main source of errors, to the organization, as the underlying source, is a recommended approach to ensure safety. (orig.)

  13. Statistical approach for uncertainty quantification of experimental modal model parameters

    DEFF Research Database (Denmark)

    Luczak, M.; Peeters, B.; Kahsin, M.

    2014-01-01

Composite materials are widely used in the manufacture of aerospace and wind energy structural components. These load carrying structures are subjected to dynamic time-varying loading conditions, and robust structural dynamics identification procedures impose tight constraints on the quality of modal models. This paper aims at a systematic approach for uncertainty quantification of the parameters of the modal models estimated from experimentally obtained data. Statistical analysis of modal parameters is implemented to derive an assessment of the entire modal model uncertainty measure. Investigated structures represent different complexity levels ranging from coupon, through sub-component, up to fully assembled aerospace and wind energy structural components made of composite materials. The proposed method is demonstrated on two application cases of a small and a large wind turbine blade.

  14. Independent component analysis for the extraction of reliable protein signal profiles from MALDI-TOF mass spectra.

    Science.gov (United States)

    Mantini, Dante; Petrucci, Francesca; Del Boccio, Piero; Pieragostino, Damiana; Di Nicola, Marta; Lugaresi, Alessandra; Federici, Giorgio; Sacchetta, Paolo; Di Ilio, Carmine; Urbani, Andrea

    2008-01-01

Independent component analysis (ICA) is a signal processing technique that can be utilized to recover independent signals from a set of their linear mixtures. We propose ICA for the analysis of signals obtained from large proteomics investigations such as clinical multi-subject studies based on MALDI-TOF MS profiling. The method is validated on simulated and experimental data, demonstrating its capability to correctly extract protein profiles from MALDI-TOF mass spectra. A comparison on peak detection with one open-source and two commercial methods shows its superior reliability in reducing the false discovery rate of protein peak masses. Moreover, the integration of ICA with statistical tests for detecting differences in peak intensities between experimental groups allows the identification of protein peaks that could be indicators of a diseased state. This data-driven approach proves to be a promising tool for biomarker-discovery studies based on MALDI-TOF MS technology. The MATLAB implementation of the method described in the article and both simulated and experimental data are freely available at http://www.unich.it/proteomica/bioinf/.

  15. Pipeline integrity model-a formative approach towards reliability and life assessment

    International Nuclear Information System (INIS)

    Sayed, A.M.; Jaffery, M.A.

    2005-01-01

Pipe forms an integral part of the transmission medium in the oil and gas industry. This holds true for both the upstream and downstream segments of this global energy business. With the aging of this asset base, its operational aspects have been under immense consideration from operators and regulators. Moreover, the milieu of the information era and the enhancement of global trade have lifted barriers on the means to forge forward towards better utilization of resources. This has made optimized solutions a priority for business and technical managers worldwide. There is a paradigm shift from the mere development of 'smart materials' to 'low life cycle cost materials'. The force inducing this change is a rational one: the recovery of development costs is no longer a problem in a global community; rather, it is the pay-off time which matters most to the materials' end users. This means that decision makers are not evaluating just the price offered but are keen to judge the entire life cycle cost of a product. The integrity of pipes is affected by factors such as corrosion, fatigue-crack growth, stress-corrosion cracking, and mechanical damage. Extensive research in the area of reliability and life assessment has been carried out. A number of models concerned with the reliability issues of pipes have been developed and are being used by a number of pipeline operators worldwide. Yet, it is emphasised that there is no substitute for sound engineering judgment and allowance for factors of safety. The ability of a laid pipe network to transport the intended fluid under pre-defined conditions for the entire envisaged project life is referred to as the reliability of the system. Reliability is built into the product through extensive benchmarking against industry standard codes. The construction of pipes for oil and gas service is regulated through the American Petroleum Institute's Specification for Line Pipe. Subsequently, specific programs have been

  16. Simulated patient training: Using inter-rater reliability to evaluate simulated patient consistency in nursing education.

    Science.gov (United States)

    MacLean, Sharon; Geddes, Fiona; Kelly, Michelle; Della, Phillip

    2018-03-01

Simulated patients (SPs) are frequently used for training nursing students in communication skills. An acknowledged benefit of using SPs is the opportunity to provide a standardized approach by which participants can demonstrate and develop communication skills. However, relatively little evidence is available on how best to facilitate and evaluate the reliability and accuracy of SPs' performances. The aim of this study is to investigate the effectiveness of an evidence-based SP training framework to ensure standardization of SPs. The training framework was employed to improve inter-rater reliability of SPs. A quasi-experimental study was employed to assess SP post-training understanding of simulation scenario parameters using inter-rater reliability agreement indices. Data collection took place in two phases. In the initial trial phase, audio-visual (AV) recordings of two undergraduate nursing students completing a simulation scenario were rated by eight SPs using the Interpersonal Communication Assessment Scale (ICAS) and the Quality of Discharge Teaching Scale (QDTS). In phase 2, eight SP raters and four nursing faculty raters independently evaluated students' (N=42) communication practices using the QDTS. Intraclass correlation coefficients (ICCs) for clinical communication skills were >0.80 in both phases of the study. The results support the premise that, if trained appropriately, SPs offer a high degree of reliability and validity to both facilitate and evaluate student performance in nurse education.
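Inter-rater agreement of the kind reported above is commonly quantified with the intraclass correlation coefficient; one standard variant is ICC(2,1) (two-way random effects, absolute agreement, single rater). A self-contained sketch using the classic Shrout and Fleiss (1979) example data rather than the study's own ratings:

```python
def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single
    rater. `data` is a list of rows (subjects) x columns (raters)."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_m = [sum(row) / k for row in data]
    col_m = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    # Mean squares for rows (subjects), columns (raters) and residual.
    msr = k * sum((r - grand) ** 2 for r in row_m) / (n - 1)
    msc = n * sum((c - grand) ** 2 for c in col_m) / (k - 1)
    sse = sum((data[i][j] - row_m[i] - col_m[j] + grand) ** 2
              for i in range(n) for j in range(k))
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Shrout & Fleiss (1979) example: 6 subjects rated by 4 raters.
ratings = [[9, 2, 5, 8], [6, 1, 3, 2], [8, 4, 6, 8],
           [7, 1, 2, 6], [10, 5, 6, 9], [6, 2, 4, 7]]
print(round(icc_2_1(ratings), 2))  # 0.29
```

An ICC above 0.80, as in the study, indicates that the standardized SP training produced raters who are largely interchangeable.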

  17. The crab Carcinus maenas as a suitable experimental model in ecotoxicology.

    Science.gov (United States)

    Rodrigues, Elsa Teresa; Pardal, Miguel Ângelo

    2014-09-01

Aquatic ecotoxicology broadly focuses on how aquatic organisms interact with pollutants in their environment in order to determine environmental hazard and potential risks to humans. Research has produced increasing evidence on the pivotal role of aquatic invertebrates in the assessment of the impact of pollutants on the environment. Their potential use to replace fish bioassays, which offers ethical advantages, has already been widely studied. Nevertheless, the selection of adequate invertebrate experimental models, appropriate experimental designs and bioassays, as well as the control of potential confounding factors in toxicity testing, are of major importance to obtain scientifically valid results. Therefore, the present study reviews more than four decades of published research papers in which the Green crab Carcinus maenas was used as an experimental test organism. In general, the surveyed literature indicates that C. maenas is sensitive to a wide range of aquatic pollutants and that its biological responses are linked to exposure concentrations or doses. Current scientific knowledge regarding the biology and ecology of C. maenas and the extensive toxicological studies found for the present review recognise the Green crab as a reliable estuarine/marine model for routine testing in ecotoxicology research and environmental quality assessment, especially with regard to the application of the biomarker approach. The data gathered provide valuable information for the selection of adequate and trustworthy bioassays for use in C. maenas toxicity testing. Since the ultimate expression of high-quality testing is a reliable outcome, the present review recommends gender, size and morphotype separation in C. maenas experimental designs and data evaluation. Moreover, the organisms' nutritional status should be taken into account, especially in long-term studies. Studies should also consider the crabs' resilience when facing historical and concurrent contamination. Finally

  18. Reliable quantum communication over a quantum relay channel

    Energy Technology Data Exchange (ETDEWEB)

    Gyongyosi, Laszlo, E-mail: gyongyosi@hit.bme.hu [Quantum Technologies Laboratory, Department of Telecommunications, Budapest University of Technology and Economics, 2 Magyar tudosok krt, Budapest, H-1117, Hungary and Information Systems Research Group, Mathematics and Natural Sciences, Hungarian Ac (Hungary); Imre, Sandor [Quantum Technologies Laboratory, Department of Telecommunications, Budapest University of Technology and Economics, 2 Magyar tudosok krt, Budapest, H-1117 (Hungary)

    2014-12-04

We show that reliable quantum communication over an unreliable quantum relay channel is possible. The coding scheme combines results on the superadditivity of quantum channels with efficient quantum coding approaches.

  19. The flaws and human harms of animal experimentation.

    Science.gov (United States)

    Akhtar, Aysha

    2015-10-01

    Nonhuman animal ("animal") experimentation is typically defended by arguments that it is reliable, that animals provide sufficiently good models of human biology and diseases to yield relevant information, and that, consequently, its use provides major human health benefits. I demonstrate that a growing body of scientific literature critically assessing the validity of animal experimentation generally (and animal modeling specifically) raises important concerns about its reliability and predictive value for human outcomes and for understanding human physiology. The unreliability of animal experimentation across a wide range of areas undermines scientific arguments in favor of the practice. Additionally, I show how animal experimentation often significantly harms humans through misleading safety studies, potential abandonment of effective therapeutics, and direction of resources away from more effective testing methods. The resulting evidence suggests that the collective harms and costs to humans from animal experimentation outweigh potential benefits and that resources would be better invested in developing human-based testing methods.

  20. The effect of the spatial positioning of items on the reliability of questionnaires measuring affect

    Directory of Open Access Journals (Sweden)

    Leigh Leo

    2016-08-01

Orientation: Extant research has shown that the relationship between spatial location and affect may have pervasive effects on evaluation. In particular, experimental findings on embodied cognition indicate that a person is spatially orientated to position what is positive at the top and what is negative at the bottom (vertical spatial orientation) and, to a lesser extent, to position what is positive on the left and what is negative on the right (horizontal spatial orientation). It is therefore hypothesised that when there is congruence between a respondent’s spatial orientation (related to affect) and the spatial positioning (layout) of a questionnaire, the reliability will be higher than in the case of incongruence. Research purpose: The principal objective of the two studies reported here was to ascertain the extent to which congruence between a respondent’s spatial orientation (related to affect) and the layout of the questionnaire (spatial positioning of questionnaire items) may impact the reliability of a questionnaire measuring affect. Motivation for the study: The spatial position of items on a questionnaire measuring affect may indirectly impact the reliability of the questionnaire. Research approach, design and method: In both studies, a controlled experimental research design was conducted using a sample of university students (n = 1825). Major findings: In both experiments, evidence was found to support the hypothesis that greater congruence between a respondent’s spatial orientation (related to affect) and the spatial positioning (layout) of a questionnaire leads to higher reliability of a questionnaire measuring affect. Practical implications: These findings may serve to create awareness of the influence of the spatial positioning of items as a confounding variable in questionnaire design. Contribution/value-add: Overall, this research complements previous studies by confirming the metaphorical representation of affect and
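Questionnaire reliability in studies like this is typically quantified with Cronbach's alpha, computed from the item variances and the variance of the total scores. A minimal sketch with invented item scores (the formula is standard; the data are not from the study):

```python
def cronbach_alpha(items):
    """Cronbach's alpha: items is a list of item-score columns, each a
    list of scores over respondents."""
    k = len(items)
    n = len(items[0])
    def var(xs):                       # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    # Total score per respondent across all items.
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(col) for col in items) / var(totals))

# Three affect items answered by five respondents (toy data):
a = cronbach_alpha([[4, 5, 3, 4, 2],
                    [4, 4, 3, 5, 2],
                    [5, 5, 2, 4, 1]])
print(round(a, 2))  # 0.92
```

The hypothesis in the abstract amounts to predicting a higher alpha for the congruent layout than for the incongruent one.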

  1. Competing risk models in reliability systems, a Weibull distribution model with Bayesian analysis approach

    International Nuclear Information System (INIS)

    Iskandar, Ismed; Gondokaryono, Yudi Satria

    2016-01-01

In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that systems are described and explained as simply functioning or failed. In many real situations, failures may have many causes, depending upon the age and the environment of the system and its components. Another problem in reliability theory is that of estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using Bayesian and maximum likelihood analyses. The simulation results show that a change in the true value of one parameter relative to another changes the standard deviation in the opposite direction. Given perfect information on the prior distribution, the Bayesian estimates are better than the maximum likelihood ones. The sensitivity analyses show some amount of sensitivity to shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range
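A competing-risks system with independent Weibull causes, as assumed in the paper, fails at the minimum of the cause-specific failure times, and simulation attributes each failure to the cause that produced that minimum. A sketch of that simulation step (the shape/scale values and cause names are illustrative, not the paper's):

```python
import math
import random

random.seed(7)

def weibull_draw(shape, scale):
    """Inverse-CDF sample from Weibull(shape, scale):
    t = scale * (-ln(1 - U)) ** (1/shape)."""
    return scale * (-math.log(1.0 - random.random())) ** (1.0 / shape)

# Two independent causes of failure with different Weibull parameters.
causes = {"wear-out": (2.5, 1500.0), "electrical": (1.0, 3000.0)}

def simulate(n=20_000):
    """Estimate the proportion of system failures due to each cause."""
    counts = {c: 0 for c in causes}
    for _ in range(n):
        times = {c: weibull_draw(*p) for c, p in causes.items()}
        counts[min(times, key=times.get)] += 1
    return {c: k / n for c, k in counts.items()}

print(simulate())
```

Datasets generated this way (failure time plus observed cause) are what the paper's Bayesian and maximum likelihood estimators are fitted to.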

  2. Developing a certifiable UAS reliability assessment approach through algorithmic redundancy, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Manned aircraft, civilian or military, are required to meet certain reliability standards specified by the FAA in order to operate in the US national airspace. These...

  3. The impact of reliability on the productivity of railroad companies

    DEFF Research Database (Denmark)

    Abate, Megersa Abera; Lijesen, Mark; Pels, Eric

    2013-01-01

    This paper studies the relationship between reliability (proxied by punctuality) and productivity in passenger railroad services. Increasing reliability may lower productivity, as it requires inputs that cannot be used to produce outputs. The relationship between reliability and productivity also runs through other factors, in which case a positive relationship may be expected. We apply data envelopment analysis and the Malmquist index approach to a panel of seven European railway systems to explore this relationship. Our empirical results suggest that increasing reliability does not harm the productivity of railway operations, and aiming to improve both may be a feasible strategy. © 2013 Elsevier Ltd. All rights reserved.

  4. Wave Energy Converters : An experimental approach to onshore testing, deployments and offshore monitoring

    OpenAIRE

    Ulvgård, Liselotte

    2017-01-01

    The wave energy converter (WEC) concept developed at Uppsala University consists of a point absorbing buoy, directly connected to a permanent magnet linear generator. Since 2006, over a dozen full scale WECs have been deployed at the Lysekil Research Site, on the west coast of Sweden. Beyond the development of the WEC concept itself, the full scale approach enables, and requires, experimental and multidisciplinary research within several peripheral areas, such as instrumentation, offshore ope...

  5. Some developments in human reliability analysis approaches and tools

    Energy Technology Data Exchange (ETDEWEB)

    Hannaman, G W; Worledge, D H

    1988-01-01

    Since human actions have been recognized as an important contributor to safety of operating plants in most industries, research has been performed to better understand and account for the way operators interact during accidents through the control room and equipment interface. This paper describes the integration of a series of research projects sponsored by the Electric Power Research Institute to strengthen the methods for performing the human reliability analysis portion of the probabilistic safety studies. It focuses on the analytical framework used to guide the analysis, the development of the models for quantifying time-dependent actions, and simulator experiments used to validate the models.

  6. EDF/EPRI collaborative program on operator reliability experiments

    International Nuclear Information System (INIS)

    Villemeur, A.; Meslin, T.; Mosneron, F.; Worledge, D.H.; Joksimovich, V.; Spurgin, A.J.

    1988-01-01

    Electricite de France (EDF) and the Electric Power Research Institute (EPRI) have been involved in human reliability studies over the last few years, in the context of improvements in human reliability assessment (HRA) methodologies, and have been following a systematic process since 1982 which consists of addressing the following five ingredients: - First, classify human interactions into a limited number of classes. - Second, introduce an acceptable framework to organize the application of HRA to PRA studies. - Third, select approach(es) to quantification. - Fourth, test promising models. - Fifth, establish an appropriate database for tested model(s) with regard to specific applications. EPRI has just recently completed Phase I of the fourth topic, which primarily focused on testing the fundamental hypotheses behind the human cognitive reliability (HCR) correlation, using power plant simulators. EDF has been carrying out simulator studies since 1980, both for man-machine interface validation and HRA data collection. This background of experience provided a stepping stone for the EPRI project. On the other hand, before 1986, EDF had mainly concentrated on getting qualitative insights from the tests and lacked experience in quantitative analysis and modeling, while EPRI had made advances in this latter area. Before the EPRI Operator Reliability Experiments (ORE) project was initiated, it was abundantly clear to EPRI and EDF that cooperation between the two could be useful and that both parties could gain from it.

  7. Study of Monte Carlo approach to experimental uncertainty propagation with MSTW 2008 PDFs

    CERN Document Server

    Watt, G.

    2012-01-01

    We investigate the Monte Carlo approach to propagation of experimental uncertainties within the context of the established 'MSTW 2008' global analysis of parton distribution functions (PDFs) of the proton at next-to-leading order in the strong coupling. We show that the Monte Carlo approach using replicas of the original data gives PDF uncertainties in good agreement with the usual Hessian approach using the standard Delta(chi^2) = 1 criterion, then we explore potential parameterisation bias by increasing the number of free parameters, concluding that any parameterisation bias is likely to be small, with the exception of the valence-quark distributions at low momentum fractions x. We motivate the need for a larger tolerance, Delta(chi^2) > 1, by making fits to restricted data sets and idealised consistent or inconsistent pseudodata. Instead of using data replicas, we alternatively produce PDF sets randomly distributed according to the covariance matrix of fit parameters including appropriate tolerance values,...
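    The replica method the abstract describes can be illustrated on a toy fit: smear pseudodata within its errors, refit each replica, and compare the spread of the fitted parameter with the standard Hessian (Delta(chi^2) = 1) error. The straight-line model and all numbers below are illustrative assumptions, not the MSTW analysis itself:

    ```python
    import math
    import random

    random.seed(1)

    # Pseudodata: y = a*x + b with Gaussian noise of known size sigma
    a_true, b_true, sigma = 2.0, 1.0, 0.5
    xs = [0.1 * i for i in range(1, 21)]
    ys = [a_true * x + b_true + random.gauss(0, sigma) for x in xs]

    def fit_line(xs, ys):
        # Ordinary least squares for y = a*x + b
        n = len(xs)
        sx, sy = sum(xs), sum(ys)
        sxx = sum(x * x for x in xs)
        sxy = sum(x * y for x, y in zip(xs, ys))
        a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
        return a, (sy - a * sx) / n

    a_hat, b_hat = fit_line(xs, ys)

    # Hessian-style (Delta chi^2 = 1) uncertainty on the slope: standard OLS formula
    xbar = sum(xs) / len(xs)
    sigma_a_hessian = sigma / math.sqrt(sum((x - xbar) ** 2 for x in xs))

    # Monte Carlo replicas: smear the data within its errors and refit each replica
    slopes = []
    for _ in range(2000):
        ys_rep = [y + random.gauss(0, sigma) for y in ys]
        slopes.append(fit_line(xs, ys_rep)[0])
    mean_a = sum(slopes) / len(slopes)
    sigma_a_mc = math.sqrt(sum((s - mean_a) ** 2 for s in slopes) / len(slopes))

    print(round(sigma_a_hessian, 3), round(sigma_a_mc, 3))
    ```

    For a well-behaved linear fit the two uncertainties agree closely, which is the baseline against which the paper probes parameterisation bias and the need for a larger tolerance.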

  8. Final report on reliability and lifetime prediction

    Energy Technology Data Exchange (ETDEWEB)

    Gillen, Kenneth T; Wise, Jonathan; Jones, Gary D.; Causa, Al G.; Terrill, Edward R.; Borowczak, Marc

    2012-12-01

    This document highlights the important results obtained from the subtask of the Goodyear CRADA devoted to better understanding reliability of tires and to developing better lifetime prediction methods. The overall objective was to establish the chemical and physical basis for the degradation of tires using standard as well as unique models and experimental techniques. Of particular interest was the potential application of our unique modulus profiling apparatus for assessing tire properties and for following tire degradation. During the course of this complex investigation, extensive relevant information was generated, including experimental results, data analyses and development of models and instruments. Detailed descriptions of the findings are included in this report.

  9. Social Studies Oriented Achievement Goal Scale (SOAGS): Validity and Reliability Study

    Directory of Open Access Journals (Sweden)

    Melehat GEZER

    2016-12-01

    Full Text Available This study aims to develop a valid and reliable instrument for measuring students' social studies achievement goals. The research was conducted on a study group consisting of 374 middle school students studying in the central district of Diyarbakır in the fall semester of the 2014-2015 school year. Expert opinion was consulted with regard to the scale's content and face validity. Exploratory Factor Analysis (EFA) and Confirmatory Factor Analysis (CFA) were performed in order to assess the scale's construct validity. As a result of EFA, a 29-item, six-factor structure explaining 50.82% of the total variance was obtained. The emerging factors were named self-approach, task-approach, other-approach, task-avoidance, other-avoidance and self-avoidance, respectively. The findings from CFA indicated that the 29-item, six-factor structure of the social studies oriented achievement goal scale has acceptable goodness-of-fit indices. The scale's reliability coefficients were calculated by means of the internal consistency method. As a result of the reliability analysis, it was determined that the reliability coefficients were within admissible limits. The findings of the item correlations and the comparisons of the upper and lower 27% groups demonstrated that all of the items in the scale should remain. In light of these results, it could be argued that the scale is a reliable and valid instrument and can be used to test students' social studies achievement goals.

  10. A Closed-Form Technique for the Reliability and Risk Assessment of Wind Turbine Systems

    Directory of Open Access Journals (Sweden)

    Leonardo Dueñas-Osorio

    2012-06-01

    Full Text Available This paper proposes a closed-form method to evaluate wind turbine system reliability and the associated failure consequences. Monte Carlo simulation, a widely used approach for system reliability assessment, usually requires large numbers of computational experiments, while existing analytical methods are limited to simple system event configurations with a focus on average values of reliability metrics. By analyzing a wind turbine system and its components in a combinatorial yet computationally efficient form, the proposed approach provides an entire probability distribution of system failure that contains all possible configurations of component failure and survival events. The approach is also capable of handling unique component attributes, such as the downtime and repair cost needed for risk estimation, and enables sensitivity analysis for quantifying the criticality of individual components to wind turbine system reliability. Applications of the technique are illustrated by assessing the reliability of a 12-subassembly turbine system. In addition, component downtimes and repair costs are embedded in the formulation to compute the expected annual wind turbine unavailability, repair cost probabilities, and component importance metrics useful for maintenance planning and research prioritization. Furthermore, this paper introduces a recursive solution to the closed-form method and applies it to a 45-component turbine system. The proposed approach proves to be computationally efficient and yields vital reliability information that can be readily used by wind farm stakeholders for decision making and risk management.
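    The combinatorial idea — an entire probability distribution over component failure/survival configurations, computed without enumerating them — can be sketched with a standard dynamic-programming recursion over components. The failure probabilities and repair costs below are invented for illustration, not the paper's 12-subassembly data:

    ```python
    # Hypothetical failure probabilities and repair costs for a 12-subassembly turbine
    p_fail = [0.02, 0.05, 0.01, 0.03, 0.04, 0.02, 0.06, 0.01, 0.02, 0.03, 0.05, 0.02]
    repair_cost = [10, 40, 8, 15, 25, 12, 60, 5, 9, 18, 30, 11]  # arbitrary units

    def failure_count_distribution(probs):
        # Dynamic programming over components: dist[k] = P(exactly k failures).
        # Covers all 2^n failure/survival configurations implicitly, in O(n^2) time.
        dist = [1.0]
        for p in probs:
            new = [0.0] * (len(dist) + 1)
            for k, q in enumerate(dist):
                new[k] += q * (1 - p)   # this component survives
                new[k + 1] += q * p     # this component fails
            dist = new
        return dist

    dist = failure_count_distribution(p_fail)
    p_any_failure = 1 - dist[0]  # failure probability of a series system
    expected_repair_cost = sum(p * c for p, c in zip(p_fail, repair_cost))
    print(round(p_any_failure, 4), round(expected_repair_cost, 2))
    ```

    The same recursion extends naturally to per-component attributes such as downtime, which is the kind of closed-form bookkeeping the paper formalizes.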

  11. A semantic web approach applied to integrative bioinformatics experimentation: a biological use case with genomics data.

    NARCIS (Netherlands)

    Post, L.J.G.; Roos, M.; Marshall, M.S.; van Driel, R.; Breit, T.M.

    2007-01-01

    The numerous public data resources make integrative bioinformatics experimentation increasingly important in life sciences research. However, it is severely hampered by the way the data and information are made available. The semantic web approach enhances data exchange and integration by providing

  12. Reliability analysis of self-actuated shutdown system

    International Nuclear Information System (INIS)

    Itooka, S.; Kumasaka, K.; Okabe, A.; Satoh, K.; Tsukui, Y.

    1991-01-01

    An analytical study was performed of the reliability of a self-actuated shutdown system (SASS) under the unprotected loss of flow (ULOF) event in a typical loop-type liquid metal fast breeder reactor (LMFBR), using the response surface Monte Carlo analysis method. Dominant parameters for the SASS, such as Curie point characteristics, subassembly outlet coolant temperature, electromagnetic surface condition, etc., were selected and their probability density functions (PDFs) were determined from design study information and experimental data. To obtain the response surface function (RSF) for the maximum coolant temperature, transient analyses of ULOF were performed, utilizing the experimental design method to determine the analytical cases. The RSF was then derived by multi-variable regression analysis. The unreliability of the SASS was evaluated as the probability that the maximum coolant temperature exceeds an acceptable level, employing a Monte Carlo calculation using the above PDFs and RSF. In this study, sensitivities to the dominant parameters were compared. The dispersion of the subassembly outlet coolant temperature near the SASS was found to be one of the most sensitive parameters. Fault tree analysis was performed using this value for the SASS in order to evaluate the shutdown system reliability. As a result of this study, the effectiveness of the SASS in improving the reliability of the LMFBR shutdown system was analytically confirmed. This study has been performed as a part of joint research and development projects for DFBR under the sponsorship of the nine Japanese electric power companies, Electric Power Development Company and the Japan Atomic Power Company. (author)
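    The final step of such an analysis — sampling the parameter PDFs through a response surface function and counting exceedances of the acceptable temperature — can be sketched as follows. The RSF coefficients, parameter distributions, and 700 °C limit are illustrative assumptions, not values from the study:

    ```python
    import random

    random.seed(2)

    # Assumed response surface function (RSF) for the peak coolant temperature (deg C)
    # in terms of two dominant parameters: Curie-point deviation dTc and outlet
    # coolant temperature dispersion dTo. Coefficients are illustrative only.
    def response_surface(dTc, dTo):
        return 650.0 + 0.8 * dTc + 1.5 * dTo + 0.002 * dTo ** 2

    LIMIT = 700.0  # assumed acceptable peak temperature

    def unreliability(n_samples):
        # Monte Carlo over the parameter PDFs (independent normals, assumed widths)
        exceed = 0
        for _ in range(n_samples):
            dTc = random.gauss(0.0, 10.0)
            dTo = random.gauss(0.0, 15.0)
            if response_surface(dTc, dTo) > LIMIT:
                exceed += 1
        return exceed / n_samples

    print(unreliability(100_000))  # estimated P(peak temperature > limit)
    ```

    Because the expensive transient analyses are only used to fit the RSF, the Monte Carlo loop itself is cheap, which is the point of the response surface approach.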

  13. Reliability Modeling of Electromechanical System with Meta-Action Chain Methodology

    Directory of Open Access Journals (Sweden)

    Genbao Zhang

    2018-01-01

    Full Text Available To establish a more flexible and accurate reliability model, this thesis uses reliability modeling and a solving algorithm based on the meta-action chain concept. Instead of estimating the reliability of the whole system only in the standard operating mode, it adopts the structure chain and the operating action chain for system reliability modeling. The failure information and structure information for each component are integrated into the model to overcome the fixed assumptions applied in traditional modeling. In industrial applications, a multicomponent system may have several different operating modes. The meta-action chain methodology can estimate the system reliability under different operating modes by modeling the components with a variety of failure sensitivities. This approach has been verified on several electromechanical system cases. The results indicate that the process can improve system reliability estimation, and that it is an effective tool for solving the reliability estimation problem for systems under various operating modes.

  14. Incorporating Cyber Layer Failures in Composite Power System Reliability Evaluations

    Directory of Open Access Journals (Sweden)

    Yuqi Han

    2015-08-01

    Full Text Available This paper proposes a novel approach to analyzing the impacts of cyber layer failures (i.e., protection failures and monitoring failures) on the reliability evaluation of composite power systems. The reliability and availability of the cyber layer and its protection and monitoring functions with various topologies are derived based on a reliability block diagram method. The availability of the physical layer components is modified via a multi-state Markov chain model, in which the component protection and monitoring strategies, as well as the cyber layer topology, are simultaneously considered. Reliability indices of composite power systems are calculated through non-sequential Monte Carlo simulation. Case studies demonstrate that operational reliability degrades when cyber layer functions fail. Moreover, protection function failures have a more significant impact on the degraded reliability than monitoring function failures do, and the reliability indices are especially sensitive to changes in cyber layer function availability in the range from 0.95 to 1.
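    The reliability block diagram calculation used for the cyber layer reduces to series and parallel availability combinations. A minimal sketch, with a hypothetical topology and invented availability values:

    ```python
    def series(*avail):
        # A series arrangement is up only if every block is up
        a = 1.0
        for x in avail:
            a *= x
        return a

    def parallel(*avail):
        # A redundant (parallel) arrangement is up if at least one block is up
        down = 1.0
        for x in avail:
            down *= 1 - x
        return 1 - down

    # Hypothetical cyber layer: two redundant communication links in series with
    # a protection unit and a monitoring unit (availabilities are invented)
    link = parallel(0.98, 0.98)
    cyber_availability = series(link, 0.995, 0.99)
    print(round(cyber_availability, 4))  # → 0.9847
    ```

    Nesting these two operators over the actual topology yields the cyber layer availability that then modifies the physical layer component states.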

  15. Sodium component reliability data collection at CREDO

    International Nuclear Information System (INIS)

    Bott, T.F.; Haas, P.M.; Manning, J.J.

    1979-01-01

    The Centralized Reliability Data Organization (CREDO) has been established at Oak Ridge National Laboratory (ORNL) by the Department of Energy to provide a national center for the collection, evaluation and dissemination of reliability data for advanced reactors. While the system is being developed and continuous data collection at the two U.S. reactor sites (EBR-II and FFTF) is being established, data on advanced reactor components which have been in use at U.S. test loops and experimental reactors have been collected and analyzed. Engineering, operating and event data on sodium valves, pumps, flow meters, rupture discs, heat exchangers and cold traps have been collected from more than a dozen sites. The results of the analyses of the data performed to date are presented.

  16. Reliable RANSAC Using a Novel Preprocessing Model

    Directory of Open Access Journals (Sweden)

    Xiaoyan Wang

    2013-01-01

    Full Text Available Geometric assumption and verification with RANSAC has become a crucial step for matching local features, owing to its wide applications in biomedical feature analysis and vision computing. However, conventional RANSAC is very time-consuming due to redundant sampling, especially when dealing with numerous matching pairs. This paper presents a novel preprocessing model that extracts a reduced set of reliable correspondences from the initial matching dataset. Both geometric model generation and verification are carried out on this reduced set, which leads to considerable speedups. The paper then proposes a reliable RANSAC framework using this preprocessing model, which was implemented and verified using Harris and SIFT features, respectively. Compared with traditional RANSAC, experimental results show that our method is more efficient.
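    Conventional RANSAC, the baseline this paper speeds up, can be sketched in a few lines for 2D line fitting; the paper's preprocessing model itself is not reproduced here, and all data, counts, and thresholds are illustrative:

    ```python
    import random

    random.seed(3)

    # Synthetic matching data: inliers near y = 2x + 1 plus gross outliers
    inliers = [(x, 2 * x + 1 + random.gauss(0, 0.05))
               for x in (random.uniform(0, 10) for _ in range(80))]
    outliers = [(random.uniform(0, 10), random.uniform(-20, 40)) for _ in range(40)]
    points = inliers + outliers

    def ransac_line(points, n_iter=200, tol=0.2):
        # Repeatedly fit a line to a minimal sample and keep the model
        # with the largest consensus set
        best_model, best_support = None, 0
        for _ in range(n_iter):
            (x1, y1), (x2, y2) = random.sample(points, 2)
            if abs(x2 - x1) < 1e-9:
                continue  # degenerate sample
            a = (y2 - y1) / (x2 - x1)
            b = y1 - a * x1
            support = sum(1 for x, y in points if abs(a * x + b - y) < tol)
            if support > best_support:
                best_model, best_support = (a, b), support
        return best_model, best_support

    (model_a, model_b), support = ransac_line(points)
    print(round(model_a, 2), round(model_b, 2), support)
    ```

    Since cost grows with both the iteration count and the size of the verification set, pruning unreliable correspondences before this loop — the paper's contribution — cuts both factors at once.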

  17. A realistic approach to modeling an in-duct desulfurization process based on an experimental pilot plant study

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz, F.J.G.; Ollero, P. [University of Seville, Seville (Spain)

    2008-07-15

    This paper provides a realistic approach to modeling an in-duct desulfurization process, motivated by the disagreement between the results predicted by published kinetic models of the reaction between hydrated lime and SO₂ at low temperature and the experimental results obtained in pilot plants where this process takes place. Results were obtained from an experimental program carried out in a 3-MWe pilot plant. Additionally, five kinetic models from the literature for the sulfation reaction of Ca(OH)₂ at low temperatures were assessed by simulation; the desulfurization efficiencies they predict are clearly lower than those obtained experimentally in our own pilot plant as well as others. Next, a general model was fitted by minimizing the difference between the calculated and the experimental results from the pilot plant, using Matlab. The parameters were reduced as much as possible, to only two. Finally, after implementing this model in a simulation tool of the in-duct sorbent injection process, it was validated and shown to yield a realistic approach useful both for analyzing results and for aiding in the design of an in-duct desulfurization process.

  18. Optimization of the representativeness and transposition approach, for the neutronic design of experimental programs in critical mock-up

    International Nuclear Information System (INIS)

    Dos-Santos, N.

    2013-01-01

    The work performed during this thesis focused on the propagation of uncertainties (nuclear data, technological uncertainties, calculation biases, ...) to integral parameters, and on the development of a novel approach enabling this uncertainty to be reduced a priori, directly from the design phase of a new experimental program. This approach is based on a multi-parameter, multi-criteria extension of representativeness and transposition theories. The first part of this PhD work covers an optimization study of sensitivity and uncertainty calculation schemes at different modeling scales (cell, assembly and whole core) for LWRs and FBRs. A degraded scheme, based on standard and generalized perturbation theories, has been validated for the calculation of uncertainty propagation to various integral quantities of interest. It demonstrated the good a posteriori representativeness of the EPICURE experiment for the validation of mixed UOX-MOX loadings, as well as the importance of some nuclear data in the power tilt phenomenon in large LWR cores. The second part of this work was devoted to the development of methods and tools for the optimized design of experimental programs in ZPRs. These methods are based on multi-parameter representativeness, using several quantities of interest simultaneously. Finally, an original study has been conducted on the rigorous estimation of correlations between experimental programs in the transposition process. Coupling experimental correlations with the multi-parameter representativeness approach makes it possible to efficiently design new programs able to answer additional qualification requirements on calculation tools. (author) [fr]

  19. Human reliability analysis of control room operators

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Isaac J.A.L.; Carvalho, Paulo Victor R.; Grecco, Claudio H.S. [Instituto de Engenharia Nuclear (IEN), Rio de Janeiro, RJ (Brazil)

    2005-07-01

    Human reliability is the probability that a person correctly performs a system-required action in a required time period and performs no extraneous action that can degrade the system. Human reliability analysis (HRA) is the analysis, prediction and evaluation of work-oriented human performance using indices such as human error likelihood and probability of task accomplishment. Significant progress has been made in the HRA field during the last years, mainly in the nuclear area. Several first-generation HRA methods were developed, such as THERP (Technique for Human Error Rate Prediction). Now, an array of so-called second-generation methods is emerging as alternatives, for instance ATHEANA (A Technique for Human Event Analysis). The ergonomics approach uses ergonomic work analysis as its tool. It focuses on the study of operators' activities in both physical and mental form, considering at the same time the observed characteristics of the operators and the elements of the work environment as they are presented to and perceived by the operators. The aim of this paper is to propose a methodology for analyzing the human reliability of operators of industrial plant control rooms, using a framework that includes the approaches used by ATHEANA, THERP and ergonomic work analysis. (author)
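    As an example of the kind of quantification THERP performs, its dependence model adjusts a nominal human error probability (HEP) for a subsequent action conditioned on failure of the first, at five discrete dependence levels. The formulas follow the standard THERP handbook (NUREG/CR-1278); the nominal HEP value is illustrative:

    ```python
    def conditional_hep(hep, level):
        # THERP dependence model: conditional probability of error on a
        # subsequent action, given error on the preceding one, at the
        # five discrete dependence levels
        formulas = {
            "zero":     lambda p: p,
            "low":      lambda p: (1 + 19 * p) / 20,
            "moderate": lambda p: (1 + 6 * p) / 7,
            "high":     lambda p: (1 + p) / 2,
            "complete": lambda p: 1.0,
        }
        return formulas[level](hep)

    nominal_hep = 0.01  # illustrative nominal human error probability
    for level in ("zero", "low", "moderate", "high", "complete"):
        print(level, round(conditional_hep(nominal_hep, level), 4))
    ```

    Even low dependence raises a 0.01 HEP to roughly 0.06, which is why dependence assessment dominates many THERP quantifications.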

  20. Pre-Proposal Assessment of Reliability for Spacecraft Docking with Limited Information

    Science.gov (United States)

    Brall, Aron

    2013-01-01

    This paper addresses the problem of estimating the reliability of a critical system function, as well as its impact on system reliability, when limited information is available. The approach addresses the basic function reliability, and then the impact of multiple attempts to accomplish the function. The dependence of subsequent attempts on prior failure to accomplish the function is also addressed. The autonomous docking of two spacecraft was the specific example that generated the inquiry, and the resultant impact on total reliability generated substantial interest in presenting the results, owing to the relative insensitivity of overall performance to basic function reliability and moderate degradation, given sufficient attempts to accomplish the required goal. Applying the methodology allows proper emphasis on the characteristics that can be estimated with some knowledge, and insulates the integrity of the design from those characteristics that cannot be properly estimated with any rational value of uncertainty. The nature of NASA's missions involves a great deal of uncertainty, due to the pursuit of new science or operations. This approach can be applied to any function where multiple attempts at success, with or without degradation, are allowed.
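    The multiple-attempt logic described above — overall success from repeated tries, with optional degradation of each subsequent attempt — can be sketched as follows; the probabilities and degradation factor are illustrative, not the paper's docking estimates:

    ```python
    def mission_success(p_first, n_attempts, degradation=1.0):
        # Probability that at least one of n attempts succeeds; each retry's
        # success probability is the previous one scaled by `degradation`
        p_fail_all = 1.0
        p = p_first
        for _ in range(n_attempts):
            p_fail_all *= 1 - p
            p *= degradation
        return 1 - p_fail_all

    print(round(mission_success(0.80, 3), 4))       # independent, undegraded retries
    print(round(mission_success(0.80, 3, 0.9), 4))  # 10% degradation per retry
    ```

    With three attempts, a modest 0.80 single-attempt reliability already yields better than 0.98 overall success even with degradation, illustrating the paper's point that overall performance is relatively insensitive to the basic function reliability.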