WorldWideScience

Sample records for reliability estimation capability

  1. Structural reliability assessment capability in NESSUS

    Science.gov (United States)

    Millwater, H.; Wu, Y.-T.

    1992-07-01

The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessment capability is described. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  2. A Method of Nuclear Software Reliability Estimation

    International Nuclear Information System (INIS)

    Park, Gee Yong; Eom, Heung Seop; Cheon, Se Woo; Jang, Seung Cheol

    2011-01-01

A method for estimating software reliability for nuclear safety software is proposed. This method is based on the software reliability growth model (SRGM), where the behavior of software failure is assumed to follow a non-homogeneous Poisson process. Several modeling schemes are presented in order to estimate and predict more precisely the number of software defects from a small amount of software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating the software test cases into the model. It is identified that this method is capable of accurately estimating the remaining number of software defects for on-demand type software directly affecting safety trip functions. The software reliability can be estimated from a model equation, and one method of obtaining the software reliability is proposed.
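
The abstract names the model class (an NHPP-based SRGM) but not its exact form. A minimal sketch, assuming the common Goel-Okumoto mean value function m(t) = a(1 - exp(-bt)) and maximum likelihood in place of the paper's Bayesian scheme, with invented failure times:

```python
# Sketch: fitting a Goel-Okumoto NHPP software reliability growth model
# by maximum likelihood. The paper's exact SRGM and Bayesian scheme are
# not given in the abstract; this illustrates the general idea only.
import numpy as np
from scipy.optimize import minimize

# Hypothetical cumulative failure times (test-hours), invented for illustration.
t = np.array([12.0, 30.0, 55.0, 96.0, 150.0, 245.0, 410.0])
T = 500.0  # total observed test time

def neg_log_lik(params):
    a, b = params  # a: expected total defects, b: detection rate
    if a <= 0 or b <= 0:
        return np.inf
    # NHPP log-likelihood with mean value function m(t) = a(1 - exp(-bt))
    return -(len(t) * np.log(a) + len(t) * np.log(b)
             - b * t.sum() - a * (1.0 - np.exp(-b * T)))

res = minimize(neg_log_lik, x0=[10.0, 0.01], method="Nelder-Mead")
a_hat, b_hat = res.x
remaining = a_hat - len(t)  # estimated residual defects
# probability of no failure over the next 100 h: exp(-(m(T+100) - m(T)))
rel = np.exp(-a_hat * (np.exp(-b_hat * T) - np.exp(-b_hat * (T + 100.0))))
print(f"a={a_hat:.1f}, b={b_hat:.4f}, remaining defects ~ {remaining:.1f}")
print(f"P(no failure in next 100 h) ~ {rel:.3f}")
```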

  3. Estimation of Bridge Reliability Distributions

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

In this paper it is shown how the so-called reliability distributions can be estimated using crude Monte Carlo simulation. The main purpose is to demonstrate the methodology. Therefore, very exact data concerning reliability and deterioration are not needed. However, it is intended in the paper to ...
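
Crude Monte Carlo as used here reduces to counting failures among random draws. A minimal sketch, with an invented lognormal resistance and Gumbel load in place of the bridge deterioration data:

```python
# Sketch: crude Monte Carlo estimation of a reliability (survival)
# probability, here P(R > S) for an illustrative resistance/load pair.
# Distribution choices are assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(seed=1)
n = 1_000_000
R = rng.lognormal(mean=np.log(30.0), sigma=0.10, size=n)  # resistance
S = rng.gumbel(loc=20.0, scale=2.0, size=n)               # load effect

fails = S >= R
pf = fails.mean()                    # failure probability estimate
se = np.sqrt(pf * (1.0 - pf) / n)    # standard error of the estimate
print(f"reliability ~ {1.0 - pf:.5f}  (Pf = {pf:.2e} +/- {1.96*se:.1e})")
```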

  4. System Reliability Analysis Capability and Surrogate Model Application in RAVEN

    Energy Technology Data Exchange (ETDEWEB)

    Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Alfonsi, Andrea [Idaho National Lab. (INL), Idaho Falls, ID (United States); Huang, Dongli [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gleicher, Frederick [Idaho National Lab. (INL), Idaho Falls, ID (United States); Wang, Bei [Idaho National Lab. (INL), Idaho Falls, ID (United States); Adbel-Khalik, Hany S. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Pascucci, Valerio [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-11-01

This report collects the work performed to improve the reliability analysis capabilities of the RAVEN code and to explore new opportunities in the usage of surrogate models, by extending the current RAVEN capabilities to multi-physics surrogate models and the construction of surrogate models for high-dimensionality fields.

  5. Reliability Estimation Based Upon Test Plan Results

    National Research Council Canada - National Science Library

    Read, Robert

    1997-01-01

    The report contains a brief summary of aspects of the Maximus reliability point and interval estimation technique as it has been applied to the reliability of a device whose surveillance tests contain...

  6. Dependent systems reliability estimation by structural reliability approach

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2014-01-01

Estimation of system reliability by classical system reliability methods generally assumes that the components are statistically independent, thus limiting its applicability in many practical situations. A method is proposed for estimation of the system reliability with dependent components, where … the leading failure mechanism(s) is described by physics of failure model(s). The proposed method is based on structural reliability techniques and accounts for both statistical and failure effect correlations. It is assumed that failure of any component is due to increasing damage (fatigue phenomena) … identification. Application of the proposed method can be found in many real world systems.

  7. A SOFTWARE RELIABILITY ESTIMATION METHOD TO NUCLEAR SAFETY SOFTWARE

    Directory of Open Access Journals (Sweden)

    GEE-YONG PARK

    2014-02-01

Full Text Available A method for estimating software reliability for nuclear safety software is proposed in this paper. This method is based on the software reliability growth model (SRGM), where the behavior of software failure is assumed to follow a non-homogeneous Poisson process. Two types of modeling schemes based on a particular underlying method are proposed in order to more precisely estimate and predict the number of software defects based on very rare software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating software test cases as a covariate into the model. It was identified that these models are capable of reasonably estimating the remaining number of software defects, which directly affects the reactor trip functions. The software reliability might be estimated from these modeling equations, and one approach to obtaining a software reliability value is proposed in this paper.

  8. Reliability Estimates for Undergraduate Grade Point Average

    Science.gov (United States)

    Westrick, Paul A.

    2017-01-01

    Undergraduate grade point average (GPA) is a commonly employed measure in educational research, serving as a criterion or as a predictor depending on the research question. Over the decades, researchers have used a variety of reliability coefficients to estimate the reliability of undergraduate GPA, which suggests that there has been no consensus…

  9. Mission Reliability Estimation for Repairable Robot Teams

    Science.gov (United States)

    Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen

    2010-01-01

    A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares. This suggests that the

  10. Adaptive Response Surface Techniques in Reliability Estimation

    DEFF Research Database (Denmark)

    Enevoldsen, I.; Faber, M. H.; Sørensen, John Dalsgaard

    1993-01-01

    Problems in connection with estimation of the reliability of a component modelled by a limit state function including noise or first order discontinuitics are considered. A gradient free adaptive response surface algorithm is developed. The algorithm applies second order polynomial surfaces...

  11. Reliability of Estimation Pile Load Capacity Methods

    Directory of Open Access Journals (Sweden)

    Yudhi Lastiasih

    2014-04-01

Full Text Available For none of the numerous previous methods for predicting pile capacity is it known how accurate they are when compared with the actual ultimate capacity of piles tested to failure. The authors of the present paper have conducted such an analysis, based on 130 data sets of field loading tests. Out of these 130 data sets, only 44 could be analysed, of which 15 were conducted until the piles actually reached failure. The pile prediction methods used were: Brinch Hansen's method (1963), Chin's method (1970), Decourt's Extrapolation Method (1999), Mazurkiewicz's method (1972), Van der Veen's method (1953), and the Quadratic Hyperbolic Method proposed by Lastiasih et al. (2012). It was found that all the above methods were sufficiently reliable when applied to data from pile loading tests that were loaded to failure. However, when applied to data from pile loading tests that did not reach failure, the methods that yielded lower values of the correction factor N are more recommended. Finally, the empirical method of Reese and O'Neill (1988) was found to be reliable enough to be used to estimate the Qult of a pile foundation based on soil data only.
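
Chin's (1970) method, one of the extrapolation schemes compared above, fits a hyperbola by regressing settlement/load on settlement; the inverse slope estimates the ultimate capacity. A sketch with invented load-test data:

```python
# Sketch of Chin's (1970) hyperbolic extrapolation: settlement/load
# (s/Q) is regressed on settlement s, and the inverse slope estimates
# the ultimate capacity. Data values below are invented for illustration.
import numpy as np

s = np.array([2.0, 4.0, 7.0, 11.0, 16.0, 22.0])        # settlement, mm
Q = np.array([480., 760., 980., 1130., 1230., 1300.])  # load, kN

slope, intercept = np.polyfit(s, s / Q, 1)  # linear fit of s/Q vs s
Q_ult = 1.0 / slope
print(f"Chin extrapolated ultimate capacity ~ {Q_ult:.0f} kN")
```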

  12. Demonstrating the capability and reliability of NDT inspections

    International Nuclear Information System (INIS)

    Wooldridge, A.B.

    1996-01-01

    This paper discusses some recent developments in demonstrating the capability of ultrasonics, eddy currents and radiography both theoretically and in practice, and indicates where further evidence is desirable. Magnox Electric has been involved with development of theoretical models for all three of these inspection methods. Feedback from experience on plant is also important to avoid overlooking any practical limitations of the inspections, and to ensure that the metallurgical characteristics of potential defects have been properly taken into account when designing and qualifying the inspections. For critical applications, inspection techniques are often supported by a Technical Justification which draws on all the relevant theoretical and experimental evidence, as well as experience of inspections on plant. The role of technical justifications is discussed in the context of inspection qualification. (author)

  13. Methodology for uranium resource estimates and reliability

    International Nuclear Information System (INIS)

    Blanchfield, D.M.

    1980-01-01

The NURE uranium assessment method has evolved from a small group of geologists estimating resources on a few lease blocks, to a national survey involving an interdisciplinary system consisting of the following: (1) geology and geologic analogs; (2) engineering and cost modeling; (3) mathematics and probability theory, psychology and elicitation of subjective judgments; and (4) computerized calculations, computer graphics, and data base management. The evolution has been spurred primarily by two objectives: (1) quantification of uncertainty, and (2) elimination of simplifying assumptions. This has resulted in a tremendous data-gathering effort and the involvement of hundreds of technical experts, many in uranium geology, but many from other fields as well. The rationality of the methods is still largely based on the concept of an analog and the observation that the results are reasonable. The reliability, or repeatability, of the assessments is reasonably guaranteed by the series of peer and superior technical reviews which has been formalized under the current methodology. The optimism or pessimism of individual geologists who make the initial assessments is tempered by the review process, resulting in a series of assessments which are a consistent, unbiased reflection of the facts. Despite the many improvements over past methods, several objectives for future development remain, primarily to reduce subjectivity in utilizing factual information in the estimation of endowment, and to improve the recognition of cost uncertainties in the assessment of economic potential. The 1980 NURE assessment methodology will undoubtedly be improved, but the reader is reminded that resource estimates are and always will be a forecast for the future.

  14. Lower bounds to the reliabilities of factor score estimators

    NARCIS (Netherlands)

    Hessen, D.J.

    2017-01-01

    Under the general common factor model, the reliabilities of factor score estimators might be of more interest than the reliability of the total score (the unweighted sum of item scores). In this paper, lower bounds to the reliabilities of Thurstone’s factor score estimators, Bartlett’s factor score

  15. Lower Bounds to the Reliabilities of Factor Score Estimators.

    Science.gov (United States)

    Hessen, David J

    2016-10-06

    Under the general common factor model, the reliabilities of factor score estimators might be of more interest than the reliability of the total score (the unweighted sum of item scores). In this paper, lower bounds to the reliabilities of Thurstone's factor score estimators, Bartlett's factor score estimators, and McDonald's factor score estimators are derived and conditions are given under which these lower bounds are equal. The relative performance of the derived lower bounds is studied using classic example data sets. The results show that estimates of the lower bounds to the reliabilities of Thurstone's factor score estimators are greater than or equal to the estimates of the lower bounds to the reliabilities of Bartlett's and McDonald's factor score estimators.

  16. Reliability: How much is it worth? Beyond its estimation or prediction, the (net) present value of reliability

    International Nuclear Information System (INIS)

    Saleh, J.H.; Marais, K.

    2006-01-01

In this article, we link an engineering concept, reliability, to a financial and managerial concept, net present value, by exploring the impact of a system's reliability on its revenue generation capability. The framework here developed for non-repairable systems quantitatively captures the value of reliability from a financial standpoint. We show that traditional present value calculations of engineering systems do not account for system reliability, thus over-estimate a system's worth and can therefore lead to flawed investment decisions. It is therefore important to involve reliability engineers upfront before investment decisions are made in technical systems. In addition, the analyses here developed help designers identify the optimal level of reliability that maximizes a system's net present value: the financial value reliability provides to the system minus the cost to achieve this level of reliability. Although we recognize that there are numerous considerations driving the specification of an engineering system's reliability, we contend that the financial analysis of reliability here developed should be made available to decision-makers to support in part, or at least be factored into, the system reliability specification.
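
The paper's central claim can be illustrated in a few lines: weighting each instant's revenue by the survival probability lowers the present value relative to the naive calculation. A sketch assuming an exponential lifetime and a constant revenue rate, both invented:

```python
# Sketch: present value of revenue from a non-repairable system whose
# survival gates the cash flow, versus a naive PV that ignores failure.
# Exponential lifetime and constant revenue rate are assumptions made
# here for illustration; the paper's framework is more general.
import numpy as np

rate = 1.0e6      # revenue per year while the system operates (assumed)
r = 0.08          # continuous discount rate (assumed)
lam = 1.0 / 12.0  # failure rate, i.e. MTTF = 12 years (assumed)
T = 15.0          # design life, years

t = np.linspace(0.0, T, 10_001)
naive_pv = np.trapz(rate * np.exp(-r * t), t)
# expected PV weights each instant's revenue by survival R(t) = exp(-lam*t)
expected_pv = np.trapz(rate * np.exp(-lam * t) * np.exp(-r * t), t)
print(f"naive PV    = ${naive_pv/1e6:.2f}M")
print(f"expected PV = ${expected_pv/1e6:.2f}M (reliability-adjusted)")
```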

17. Measurement: Accounting for Reliability in Performance Estimates.

    Science.gov (United States)

    Waterman, Brian; Sutter, Robert; Burroughs, Thomas; Dunagan, W Claiborne

    2014-01-01

    When evaluating physician performance measures, physician leaders are faced with the quandary of determining whether departures from expected physician performance measurements represent a true signal or random error. This uncertainty impedes the physician leader's ability and confidence to take appropriate performance improvement actions based on physician performance measurements. Incorporating reliability adjustment into physician performance measurement is a valuable way of reducing the impact of random error in the measurements, such as those caused by small sample sizes. Consequently, the physician executive has more confidence that the results represent true performance and is positioned to make better physician performance improvement decisions. Applying reliability adjustment to physician-level performance data is relatively new. As others have noted previously, it's important to keep in mind that reliability adjustment adds significant complexity to the production, interpretation and utilization of results. Furthermore, the methods explored in this case study only scratch the surface of the range of available Bayesian methods that can be used for reliability adjustment; further study is needed to test and compare these methods in practice and to examine important extensions for handling specialty-specific concerns (e.g., average case volumes, which have been shown to be important in cardiac surgery outcomes). Moreover, it's important to note that the provider group average as a basis for shrinkage is one of several possible choices that could be employed in practice and deserves further exploration in future research. With these caveats, our results demonstrate that incorporating reliability adjustment into physician performance measurements is feasible and can notably reduce the incidence of "real" signals relative to what one would expect to see using more traditional approaches. A physician leader who is interested in catalyzing performance improvement

  18. Structural Reliability Using Probability Density Estimation Methods Within NESSUS

    Science.gov (United States)

Chamis, Christos C. (Technical Monitor); Godines, Cody Ric

    2003-01-01

A reliability analysis studies a mathematical model of a physical system taking into account uncertainties of design variables, and common results are estimations of a response density, which also implies estimations of its parameters. Some common density parameters include the mean value, the standard deviation, and specific percentile(s) of the response, which are measures of central tendency, variation, and probability regions, respectively. Reliability analyses are important since the results can lead to different designs by calculating the probability of observing safe responses in each of the proposed designs. All of this is done at the expense of added computational time as compared to a single deterministic analysis which will result in one value of the response out of many that make up the density of the response. Sampling methods, such as Monte Carlo (MC) and Latin hypercube sampling (LHS), can be used to perform reliability analyses and can compute nonlinear response density parameters even if the response is dependent on many random variables. Hence, both methods are very robust; however, they are computationally expensive to use in the estimation of the response density parameters. Both methods are two of the 13 stochastic methods that are contained within the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) program. NESSUS is a probabilistic finite element analysis (FEA) program that was developed through funding from NASA Glenn Research Center (GRC). It has the additional capability of being linked to other analysis programs; therefore, probabilistic fluid dynamics, fracture mechanics, and heat transfer are only a few of what is possible with this software. The LHS method is the newest addition to the stochastic methods within NESSUS. Part of this work was to enhance NESSUS with the LHS method. The new LHS module is complete, has been successfully integrated with NESSUS, and has been used to study four different test cases that have been
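
The variance advantage of LHS over plain MC sampling that motivates the new module can be shown generically. A sketch of per-variable stratified sampling on a toy response (not the NESSUS implementation):

```python
# Sketch: Latin hypercube sampling versus plain Monte Carlo for
# estimating a response mean. Generic illustration of the LHS idea only.
import numpy as np
from scipy import stats

def lhs_normal(n, rng):
    # one stratum per sample: invert the normal CDF at a jittered
    # point inside each of n equal-probability slices
    u = (rng.permutation(n) + rng.uniform(size=n)) / n
    return stats.norm.ppf(u)

rng = np.random.default_rng(7)
g = lambda x1, x2: x1**2 + 3.0 * x2  # toy response function (invented)
n, reps = 200, 500
mc = [g(rng.standard_normal(n), rng.standard_normal(n)).mean()
      for _ in range(reps)]
lhs = [g(lhs_normal(n, rng), lhs_normal(n, rng)).mean()
       for _ in range(reps)]
print(f"MC  std of mean estimate: {np.std(mc):.4f}")
print(f"LHS std of mean estimate: {np.std(lhs):.4f}")  # typically smaller
```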

  19. Influences on and Limitations of Classical Test Theory Reliability Estimates.

    Science.gov (United States)

    Arnold, Margery E.

    It is incorrect to say "the test is reliable" because reliability is a function not only of the test itself, but of many factors. The present paper explains how different factors affect classical reliability estimates such as test-retest, interrater, internal consistency, and equivalent forms coefficients. Furthermore, the limits of classical test…

  20. Estimation of structural reliability under combined loads

    International Nuclear Information System (INIS)

    Shinozuka, M.; Kako, T.; Hwang, H.; Brown, P.; Reich, M.

    1983-01-01

    For the overall safety evaluation of seismic category I structures subjected to various load combinations, a quantitative measure of the structural reliability in terms of a limit state probability can be conveniently used. For this purpose, the reliability analysis method for dynamic loads, which has recently been developed by the authors, was combined with the existing standard reliability analysis procedure for static and quasi-static loads. The significant parameters that enter into the analysis are: the rate at which each load (dead load, accidental internal pressure, earthquake, etc.) will occur, its duration and intensity. All these parameters are basically random variables for most of the loads to be considered. For dynamic loads, the overall intensity is usually characterized not only by their dynamic components but also by their static components. The structure considered in the present paper is a reinforced concrete containment structure subjected to various static and dynamic loads such as dead loads, accidental pressure, earthquake acceleration, etc. Computations are performed to evaluate the limit state probabilities under each load combination separately and also under all possible combinations of such loads

  1. Reliability Estimation for Digital Instrument/Control System

    International Nuclear Information System (INIS)

    Yang, Yaguang; Sydnor, Russell

    2011-01-01

Digital instrumentation and controls (DI and C) systems are widely adopted in various industries because of their flexibility and ability to implement various functions that can be used to automatically monitor, analyze, and control complicated systems. It is anticipated that DI and C will replace the traditional analog instrumentation and controls (AI and C) systems in all future nuclear reactor designs. There is increasing interest in reliability and risk analyses for safety-critical DI and C systems in regulatory organizations, such as the United States Nuclear Regulatory Commission. Developing reliability models and reliability estimation methods for digital reactor control and protection systems will involve every part of the DI and C system, such as sensors, signal conditioning and processing components, transmission lines and digital communication systems, D/A and A/D converters, the computer system, signal processing software, control and protection software, the power supply system, and actuators. Some of these components, such as sensors and actuators, are hardware; their failure mechanisms are well understood, and the traditional reliability model and estimation methods can be directly applied. But many of these components are firmware, which has software embedded in the hardware, and software needs special consideration because its failure mechanism is unique, and the reliability estimation method for a software system will be different from the ones used for hardware systems. In this paper, we propose a reliability estimation method for the entire DI and C system using a recently developed software reliability estimation method and a traditional hardware reliability estimation method.

  2. Reliability Estimation for Digital Instrument/Control System

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yaguang; Sydnor, Russell [U.S. Nuclear Regulatory Commission, Washington, D.C. (United States)

    2011-08-15

Digital instrumentation and controls (DI and C) systems are widely adopted in various industries because of their flexibility and ability to implement various functions that can be used to automatically monitor, analyze, and control complicated systems. It is anticipated that DI and C will replace the traditional analog instrumentation and controls (AI and C) systems in all future nuclear reactor designs. There is increasing interest in reliability and risk analyses for safety-critical DI and C systems in regulatory organizations, such as the United States Nuclear Regulatory Commission. Developing reliability models and reliability estimation methods for digital reactor control and protection systems will involve every part of the DI and C system, such as sensors, signal conditioning and processing components, transmission lines and digital communication systems, D/A and A/D converters, the computer system, signal processing software, control and protection software, the power supply system, and actuators. Some of these components, such as sensors and actuators, are hardware; their failure mechanisms are well understood, and the traditional reliability model and estimation methods can be directly applied. But many of these components are firmware, which has software embedded in the hardware, and software needs special consideration because its failure mechanism is unique, and the reliability estimation method for a software system will be different from the ones used for hardware systems. In this paper, we propose a reliability estimation method for the entire DI and C system using a recently developed software reliability estimation method and a traditional hardware reliability estimation method.

  3. Estimation of structural reliability under combined loads

    International Nuclear Information System (INIS)

    Shinozuka, M.; Kako, T.; Hwang, H.; Brown, P.; Reich, M.

    1983-01-01

    For the overall safety evaluation of seismic category I structures subjected to various load combinations, a quantitative measure of the structural reliability in terms of a limit state probability can be conveniently used. For this purpose, the reliability analysis method for dynamic loads, which has recently been developed by the authors, was combined with the existing standard reliability analysis procedure for static and quasi-static loads. The significant parameters that enter into the analysis are: the rate at which each load (dead load, accidental internal pressure, earthquake, etc.) will occur, its duration and intensity. All these parameters are basically random variables for most of the loads to be considered. For dynamic loads, the overall intensity is usually characterized not only by their dynamic components but also by their static components. The structure considered in the present paper is a reinforced concrete containment structure subjected to various static and dynamic loads such as dead loads, accidental pressure, earthquake acceleration, etc. Computations are performed to evaluate the limit state probabilities under each load combination separately and also under all possible combinations of such loads. Indeed, depending on the limit state condition to be specified, these limit state probabilities can indicate which particular load combination provides the dominant contribution to the overall limit state probability. On the other hand, some of the load combinations contribute very little to the overall limit state probability. These observations provide insight into the complex problem of which load combinations must be considered for design, for which limit states and at what level of limit state probabilities. (orig.)

  4. Estimation of the Reliability of Plastic Slabs

    DEFF Research Database (Denmark)

Pirzada, G. B.

In this thesis, work related to fundamental conditions has been extended to non-fundamental or the general case of probabilistic analysis. Finally, using the β-unzipping technique a door has been opened to system reliability analysis of plastic slabs. An attempt has been made in this thesis … to give a probabilistic treatment of plastic slabs which is parallel to the deterministic and systematic treatment of plastic slabs by Nielsen (3). The fundamental reason is that in Nielsen (3) the treatment is based on a deterministic modelling of the basic material properties for the reinforced…

  5. Neglect Of Parameter Estimation Uncertainty Can Significantly Overestimate Structural Reliability

    Directory of Open Access Journals (Sweden)

    Rózsás Árpád

    2015-12-01

    Full Text Available Parameter estimation uncertainty is often neglected in reliability studies, i.e. point estimates of distribution parameters are used for representative fractiles, and in probabilistic models. A numerical example examines the effect of this uncertainty on structural reliability using Bayesian statistics. The study reveals that the neglect of parameter estimation uncertainty might lead to an order of magnitude underestimation of failure probability.
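
The effect the paper quantifies with Bayesian statistics can be reproduced in miniature: with a small sample, the plug-in estimate of a failure probability is markedly smaller than the posterior predictive estimate. A sketch with invented normal resistance data:

```python
# Sketch: effect of parameter estimation uncertainty on a failure
# probability. A small resistance sample is fitted as normal; the
# plug-in estimate uses the point estimates, while the predictive
# estimate (Bayesian, noninformative prior) yields a Student-t and a
# larger failure probability. All numbers are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
data = rng.normal(100.0, 10.0, size=10)  # small resistance sample
demand = 70.0

m, s = data.mean(), data.std(ddof=1)
n = len(data)
pf_plugin = stats.norm.cdf(demand, loc=m, scale=s)
# posterior predictive for a normal with unknown mean and variance:
# Student-t with n-1 dof and scale s*sqrt(1 + 1/n)
pf_pred = stats.t.cdf((demand - m) / (s * np.sqrt(1 + 1 / n)), df=n - 1)
print(f"plug-in    Pf = {pf_plugin:.2e}")
print(f"predictive Pf = {pf_pred:.2e}  (parameter uncertainty included)")
```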

  6. Reliability estimation of semi-Markov systems: a case study

    International Nuclear Information System (INIS)

    Ouhbi, Brahim; Limnios, Nikolaos

    1997-01-01

In this article, we are concerned with the estimation of the reliability and the availability of a turbo-generator rotor using a set of data observed in a real engineering situation provided by Electricite De France (EDF). The rotor is modeled by a semi-Markov process, which is used to estimate the rotor's reliability and availability. To do this, we present a method for estimating the semi-Markov kernel from censored data

  7. Reliabilities of genomic estimated breeding values in Danish Jersey

    DEFF Research Database (Denmark)

    Thomasen, Jørn Rind; Guldbrandtsen, Bernt; Su, Guosheng

    2012-01-01

In order to optimize the use of genomic selection in breeding plans, it is essential to have reliable estimates of the genomic breeding values. This study investigated reliabilities of direct genomic values (DGVs) in the Jersey population estimated by three different methods. The validation methods … were (i) fivefold cross-validation and (ii) validation on the most recent 3 years of bulls. The reliability of DGV was assessed using squared correlations between DGV and deregressed proofs (DRPs). In the recent 3-year validation model, estimated reliabilities were also used to assess the reliabilities … of DGV. The data set consisted of 1003 Danish Jersey bulls with conventional estimated breeding values (EBVs) for 14 different traits included in the Nordic selection index. The bulls were genotyped for Single-nucleotide polymorphism (SNP) markers using the Illumina 54 K chip. A Bayesian method was used…

  8. Temperature dependent power capability estimation of lithium-ion batteries for hybrid electric vehicles

    International Nuclear Information System (INIS)

    Zheng, Fangdan; Jiang, Jiuchun; Sun, Bingxiang; Zhang, Weige; Pecht, Michael

    2016-01-01

The power capability of lithium-ion batteries affects the safety and reliability of hybrid electric vehicles, and the estimate of power by battery management systems provides operating information for drivers. In this paper, lithium-ion manganese oxide batteries are studied to illustrate the temperature dependency of power capability, and an operating map of power capability is presented. Both parametric and non-parametric models are established in conditions of temperature, state of charge, and cell resistance to estimate the power capability. Six cells were tested and used for model development, training, and validation. Three samples underwent hybrid pulse power characterization tests at varied temperatures and were used for model parameter identification and model training. The other three were used for model validation. By comparison, the mean absolute error of the parametric model is about 29 W, and that of the non-parametric model is around 20 W. The mean relative errors of the two models are 0.076 and 0.397, respectively. The parametric model has higher accuracy in low temperature and state of charge conditions, while the non-parametric model gives better estimation results in high temperature and state of charge conditions. Thus, the two models can be utilized together to achieve a higher accuracy of power capability estimation. - Highlights: • The temperature dependency of power capability of lithium-ion battery is investigated. • The parametric and non-parametric power capability estimation models are proposed. • An exponential function is put forward to compensate the effects of temperature. • A comparative study on the accuracy of two models using statistical metrics is presented.

  9. Software Estimation: Developing an Accurate, Reliable Method

    Science.gov (United States)

    2011-08-01

based and size-based estimates is able to accurately plan, launch, and execute on schedule. Bob Sinclair, NAWCWD; Chris Rickets, NAWCWD; Brad Hodgins, NAWCWD.

  10. Basics of Bayesian reliability estimation from attribute test data

    International Nuclear Information System (INIS)

    Martz, H.F. Jr.; Waller, R.A.

    1975-10-01

    The basic notions of Bayesian reliability estimation from attribute lifetest data are presented in an introductory and expository manner. Both Bayesian point and interval estimates of the probability of surviving the lifetest, the reliability, are discussed. The necessary formulas are simply stated, and examples are given to illustrate their use. In particular, a binomial model in conjunction with a beta prior model is considered. Particular attention is given to the procedure for selecting an appropriate prior model in practice. Empirical Bayes point and interval estimates of reliability are discussed and examples are given. 7 figures, 2 tables
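
The binomial/beta model described here is fully conjugate, so the update is one line. A sketch assuming a uniform prior and invented attribute test counts:

```python
# Sketch of the binomial/beta conjugate update described above:
# s survivals in n attribute lifetests with a Beta(a, b) prior give a
# Beta(a + s, b + n - s) posterior for the reliability.
from scipy import stats

a, b = 1.0, 1.0  # uniform prior (this choice is an assumption)
n, s = 20, 19    # 19 of 20 units survived the lifetest (invented)

post = stats.beta(a + s, b + n - s)
point = post.mean()
lo, hi = post.ppf(0.05), post.ppf(0.95)
print(f"posterior mean reliability = {point:.3f}")
print(f"90% credible interval = ({lo:.3f}, {hi:.3f})")
```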

  11. IRT-Estimated Reliability for Tests Containing Mixed Item Formats

    Science.gov (United States)

    Shu, Lianghua; Schwarz, Richard D.

    2014-01-01

As a global measure of precision, item response theory (IRT) estimated reliability is derived for four coefficients (Cronbach's α, Feldt-Raju, stratified α, and marginal reliability). Models with different underlying assumptions concerning test-part similarity are discussed. A detailed computational example is presented for the targeted…

  12. A Latent Class Approach to Estimating Test-Score Reliability

    Science.gov (United States)

    van der Ark, L. Andries; van der Palm, Daniel W.; Sijtsma, Klaas

    2011-01-01

    This study presents a general framework for single-administration reliability methods, such as Cronbach's alpha, Guttman's lambda-2, and method MS. This general framework was used to derive a new approach to estimating test-score reliability by means of the unrestricted latent class model. This new approach is the latent class reliability…
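
Two of the single-administration coefficients named above can be computed directly from an item-score matrix; the latent class approach itself needs more machinery than fits here. A sketch with invented data:

```python
# Sketch: Cronbach's alpha and Guttman's lambda-2 from an item-score
# matrix (rows = persons, columns = items). Scores are invented.
import numpy as np

X = np.array([[3, 4, 3, 5],
              [2, 2, 3, 2],
              [4, 5, 4, 4],
              [1, 2, 2, 1],
              [3, 3, 4, 4],
              [5, 4, 5, 5]], dtype=float)

k = X.shape[1]
C = np.cov(X, rowvar=False)  # item covariance matrix
var_total = C.sum()          # variance of the sum score
alpha = k / (k - 1) * (1.0 - np.trace(C) / var_total)

off = C - np.diag(np.diag(C))  # off-diagonal covariances
lambda2 = (off.sum() + np.sqrt(k / (k - 1) * (off**2).sum())) / var_total
print(f"alpha = {alpha:.3f}, lambda-2 = {lambda2:.3f}  (lambda-2 >= alpha)")
```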

  13. Processes and Procedures for Estimating Score Reliability and Precision

    Science.gov (United States)

    Bardhoshi, Gerta; Erford, Bradley T.

    2017-01-01

Precision is a key facet of test development, with score reliability determined primarily according to the types of error one wants to approximate and demonstrate. This article identifies and discusses several primary forms of reliability estimation: internal consistency (i.e., split-half, KR-20, α), test-retest, alternate forms, interscorer, and…

  14. Assessment of Equipment Capability to Perform Reliably under Severe Accident Conditions

    International Nuclear Information System (INIS)

    2017-07-01

    The experience from the last 40 years has shown that severe accidents can subject electrical and instrumentation and control (I&C) equipment to environmental conditions exceeding the equipment’s original design basis assumptions. Severe accident conditions can then cause rapid degradation or damage to various degrees up to complete failure of such equipment. This publication provides the technical basis to consider when assessing the capability of electrical and I&C equipment to perform reliably during a severe accident. It provides examples of calculation tools to determine the environmental parameters as well as examples and methods that Member States can apply to assess equipment reliability.

  15. User's guide to the Reliability Estimation System Testbed (REST)

    Science.gov (United States)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
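
The modularization idea, stripped of RML's failure-mode messaging, reduces to composing per-module reliabilities. A generic sketch, not the REST implementation:

```python
# Sketch: combining module reliabilities into a system figure, the
# generic form of the modularization approach described above. The
# example topology and numbers are invented.
def series(*rs):
    out = 1.0
    for r in rs:  # all modules must work
        out *= r
    return out

def parallel(*rs):
    out = 1.0
    for r in rs:  # at least one module must work
        out *= (1.0 - r)
    return 1.0 - out

# hypothetical system: two redundant sensors feeding a processor
r_system = series(parallel(0.95, 0.95), 0.999)
print(f"system reliability = {r_system:.5f}")
```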

  16. Solution-verified reliability analysis and design of bistable MEMS using error estimation and adaptivity.

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, Michael Scott; Subia, Samuel Ramirez; Neckels, David; Hopkins, Matthew Morgan; Notz, Patrick K.; Adams, Brian M.; Carnes, Brian; Wittwer, Jonathan W.; Bichon, Barron J.; Copps, Kevin D.

    2006-10-01

    This report documents the results for an FY06 ASC Algorithms Level 2 milestone combining error estimation and adaptivity, uncertainty quantification, and probabilistic design capabilities applied to the analysis and design of bistable MEMS. Through the use of error estimation and adaptive mesh refinement, solution verification can be performed in an automated and parameter-adaptive manner. The resulting uncertainty analysis and probabilistic design studies are shown to be more accurate, efficient, reliable, and convenient.

  17. Estimation of some stochastic models used in reliability engineering

    International Nuclear Information System (INIS)

    Huovinen, T.

    1989-04-01

The work aims to study the estimation of some stochastic models used in reliability engineering. In reliability engineering, continuous probability distributions have been used as models for the lifetime of technical components. We consider here the following distributions: exponential, 2-mixture exponential, conditional exponential, Weibull, lognormal and gamma. The maximum likelihood method is used to estimate distributions from observed data, which may be either complete or censored. We consider models based on homogeneous Poisson processes, such as gamma-Poisson and lognormal-Poisson models, for analysis of failure intensity. We also study a beta-binomial model for analysis of failure probability. The parameters of three models are estimated by the matching moments method and, in the case of the gamma-Poisson and beta-binomial models, also by the maximum likelihood method. A great deal of mathematical or statistical problems that arise in reliability engineering can be solved by utilizing point processes. Here we consider the statistical analysis of non-homogeneous Poisson processes to describe the failure phenomena of a set of components with a Weibull intensity function. We use the method of maximum likelihood to estimate the parameters of the Weibull model. A common cause failure can seriously reduce the reliability of a system. We consider a binomial failure rate (BFR) model as an application of marked point processes for modelling common cause failure in a system. The parameters of the binomial failure rate model are estimated with the maximum likelihood method.
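
One of the estimation tasks described, Weibull maximum likelihood with censored observations, fits in a short script. A sketch with invented lifetimes, three of them right-censored:

```python
# Sketch: maximum likelihood fit of a Weibull lifetime model with
# right-censored observations. Data below are invented for illustration.
import numpy as np
from scipy.optimize import minimize

t = np.array([150., 230., 310., 470., 520., 800., 800., 800.])  # hours
event = np.array([1, 1, 1, 1, 1, 0, 0, 0])  # 0 = still running (censored)

def neg_log_lik(params):
    k, lam = params  # shape, scale
    if k <= 0 or lam <= 0:
        return np.inf
    z = t / lam
    # failures contribute the density, censored units the survival function
    ll = (event * (np.log(k / lam) + (k - 1) * np.log(z)) - z**k).sum()
    return -ll

res = minimize(neg_log_lik, x0=[1.0, 500.0], method="Nelder-Mead")
k_hat, lam_hat = res.x
print(f"shape = {k_hat:.2f}, scale = {lam_hat:.0f} h")
print(f"R(400 h) ~ {np.exp(-(400.0 / lam_hat) ** k_hat):.3f}")
```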

  18. Investigation of MLE in nonparametric estimation methods of reliability function

    International Nuclear Information System (INIS)

    Ahn, Kwang Won; Kim, Yoon Ik; Chung, Chang Hyun; Kim, Kil Yoo

    2001-01-01

There have been many attempts to estimate a reliability function. In the ESReDA 20th seminar, a new method in a nonparametric way was proposed. The major point of that paper is how to use censored data efficiently. Generally there are three kinds of approaches to estimate a reliability function in a nonparametric way, i.e., the Reduced Sample Method, the Actuarial Method and the Product-Limit (PL) Method. The above three methods have some limits. So we suggest an advanced method that reflects censored information more efficiently. In many instances there will be a unique maximum likelihood estimator (MLE) of an unknown parameter, and often it may be obtained by the process of differentiation. It is well known that the three methods generally used to estimate a reliability function in a nonparametric way have maximum likelihood estimators that uniquely exist. So, the MLE of the new method is derived in this study. The procedure to calculate the MLE is similar to that of the PL-estimator. The difference between the two is that in the new method the mass (or weight) of each observation has an influence on the others, whereas in the PL-estimator it does not
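
The Product-Limit method referred to above is the Kaplan-Meier estimator; writing it out makes the role of censored units explicit. A sketch with invented failure and censoring times (the paper's modified weighting is not reproduced):

```python
# Sketch of the Product-Limit (Kaplan-Meier) estimator mentioned above,
# written out directly rather than via a library. Times with event = 0
# are right-censored. Data are invented.
import numpy as np

times = np.array([50., 80., 80., 120., 160., 200., 240.])
event = np.array([1, 1, 0, 1, 0, 1, 0])

R = 1.0
order = np.argsort(times)  # stable sort keeps ties in given order
at_risk = len(times)
print("t      R(t)")
for t_i, d_i in zip(times[order], event[order]):
    if d_i:  # only failures reduce the product-limit estimate
        R *= (at_risk - 1) / at_risk
        print(f"{t_i:5.0f}  {R:.3f}")
    at_risk -= 1  # censored units simply leave the risk set
```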

  19. Case Study: Zutphen : Estimates of levee system reliability

    NARCIS (Netherlands)

    Roscoe, K.; Kothuis, Baukje; Kok, Matthijs

    2017-01-01

    Estimates of levee system reliability can conflict with experience and intuition. For example, a very high failure probability may be computed while no evidence of failure has been observed, or a very low failure probability when signs of failure have been detected.

  20. A generic method for estimating system reliability using Bayesian networks

    International Nuclear Information System (INIS)

    Doguc, Ozge; Ramirez-Marquez, Jose Emmanuel

    2009-01-01

    This study presents a holistic method for constructing a Bayesian network (BN) model for estimating system reliability. BN is a probabilistic approach that is used to model and predict the behavior of a system based on observed stochastic events. The BN model is a directed acyclic graph (DAG) where the nodes represent system components and arcs represent relationships among them. Although recent studies on using BN for estimating system reliability have been proposed, they are based on the assumption that a pre-built BN has been designed to represent the system. In these studies, the task of building the BN is typically left to a group of specialists who are BN and domain experts. The BN experts should learn about the domain before building the BN, which is generally very time consuming and may lead to incorrect deductions. As there are no existing studies to eliminate the need for a human expert in the process of system reliability estimation, this paper introduces a method that uses historical data about the system to be modeled as a BN and provides efficient techniques for automated construction of the BN model, and hence estimation of the system reliability. In this respect K2, a data mining algorithm, is used for finding associations between system components, and thus building the BN model. This algorithm uses a heuristic to provide efficient and accurate results while searching for associations. Moreover, no human intervention is necessary during the process of BN construction and reliability estimation. The paper provides a step-by-step illustration of the method and evaluation of the approach with literature case examples
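
The end product of the method, a BN whose nodes carry component-state probabilities, can be evaluated by enumeration once built. A toy sketch with an invented two-component structure and conditional probability table, standing in for a K2-learned network:

```python
# Sketch: system reliability from a tiny hand-built Bayesian network,
# evaluated by enumeration. The paper's point is that the structure can
# be learned from historical data with K2 instead of being built by an
# expert; the structure and probabilities below are invented.
import itertools

p_work = {"pump": 0.95, "valve": 0.98}  # root component nodes

def p_system(pump_ok, valve_ok):
    # CPT: system almost surely works if both work; degraded otherwise
    if pump_ok and valve_ok:
        return 0.999
    return 0.10 if pump_ok or valve_ok else 0.0

rel = 0.0
for pump_ok, valve_ok in itertools.product([True, False], repeat=2):
    prior = ((p_work["pump"] if pump_ok else 1 - p_work["pump"]) *
             (p_work["valve"] if valve_ok else 1 - p_work["valve"]))
    rel += prior * p_system(pump_ok, valve_ok)
print(f"P(system works) = {rel:.4f}")
```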

  1. A generic method for estimating system reliability using Bayesian networks

    Energy Technology Data Exchange (ETDEWEB)

    Doguc, Ozge [Stevens Institute of Technology, Hoboken, NJ 07030 (United States); Ramirez-Marquez, Jose Emmanuel [Stevens Institute of Technology, Hoboken, NJ 07030 (United States)], E-mail: jmarquez@stevens.edu

    2009-02-15

    This study presents a holistic method for constructing a Bayesian network (BN) model for estimating system reliability. BN is a probabilistic approach that is used to model and predict the behavior of a system based on observed stochastic events. The BN model is a directed acyclic graph (DAG) where the nodes represent system components and arcs represent relationships among them. Although recent studies on using BN for estimating system reliability have been proposed, they are based on the assumption that a pre-built BN has been designed to represent the system. In these studies, the task of building the BN is typically left to a group of specialists who are BN and domain experts. The BN experts should learn about the domain before building the BN, which is generally very time consuming and may lead to incorrect deductions. As there are no existing studies to eliminate the need for a human expert in the process of system reliability estimation, this paper introduces a method that uses historical data about the system to be modeled as a BN and provides efficient techniques for automated construction of the BN model, and hence estimation of the system reliability. In this respect K2, a data mining algorithm, is used for finding associations between system components, and thus building the BN model. This algorithm uses a heuristic to provide efficient and accurate results while searching for associations. Moreover, no human intervention is necessary during the process of BN construction and reliability estimation. The paper provides a step-by-step illustration of the method and evaluation of the approach with literature case examples.

  2. An integrated approach to estimate storage reliability with initial failures based on E-Bayesian estimates

    International Nuclear Information System (INIS)

    Zhang, Yongjin; Zhao, Ming; Zhang, Shitao; Wang, Jiamei; Zhang, Yanjun

    2017-01-01

Storage reliability, which measures the ability of products in a dormant state to keep their required functions, is studied in this paper. For certain types of products, storage reliability may not be 100% at the beginning of storage, unlike operational reliability: possible initial failures exist that are normally neglected in models of storage reliability. In this paper, a new integrated technique is proposed to estimate and predict the storage reliability of products with possible initial failures, in which a non-parametric measure based on E-Bayesian estimates of current failure probabilities is combined with a parametric measure based on the exponential reliability function. The non-parametric method is used to estimate the number of failed products and the reliability at each testing time, and the parametric method is used to estimate the initial reliability and the failure rate of the stored product. The proposed method takes into consideration that reliability test data of storage products, including units unexamined before and during the storage process, are available for providing more accurate estimates of both the initial failure probability and the storage failure probability. When storage reliability prediction, the main concern in this field, is to be made, the non-parametric estimates of failure numbers can be fed into the parametric models for the failure process in storage. For the case of exponential models, the assessment and prediction method for storage reliability is presented. Finally, a numerical example is given to illustrate the method, and a detailed comparison between the proposed and traditional methods, examining the rationality of assessment and prediction of storage reliability, is investigated. The results should be useful for planning a storage environment, decision-making concerning the maximum length of storage, and identifying production quality.
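
The E-Bayesian step admits a compact numerical illustration: the Bayes estimate of a failure probability is averaged over a hyperprior. A sketch assuming a Beta(1, b) prior with a uniform hyperprior on b, which is a standard E-Bayesian setup rather than the paper's exact specification; the counts are invented:

```python
# Sketch of the E-Bayesian idea for a failure probability at one
# inspection time: the Bayes estimate under a Beta(1, b) prior is
# averaged over a hyperprior for b. Prior choices are assumptions here.
import numpy as np

n, r = 40, 2  # tested units and observed failures at this time (invented)
c = 5.0       # upper limit of the uniform hyperprior on b (assumed)

b = np.linspace(1e-6, c, 10_001)
bayes = (r + 1.0) / (n + b + 1.0)  # posterior mean under Beta(1, b)
p_EB = np.trapz(bayes, b) / c      # expectation over b ~ U(0, c)
print(f"E-Bayesian failure probability estimate: {p_EB:.4f}")
print(f"storage reliability at this time ~ {1.0 - p_EB:.4f}")
```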

  3. Reliability of Bluetooth Technology for Travel Time Estimation

    DEFF Research Database (Denmark)

    Araghi, Bahar Namaki; Olesen, Jonas Hammershøj; Krishnan, Rajesh

    2015-01-01

… However, their corresponding impacts on accuracy and reliability of estimated travel time have not been evaluated. In this study, a controlled field experiment is conducted to collect both Bluetooth and GPS data for 1000 trips to be used as the basis for evaluation. Data obtained by the GPS logger is used to calculate actual travel time, referred to as ground truth, and to geo-code the Bluetooth detection events. In this setting, reliability is defined as the percentage of devices captured per trip during the experiment. It is found that, on average, Bluetooth-enabled devices will be detected 80% of the time. …-range antennae detect Bluetooth-enabled devices in a location closer to the sensor, thus providing a more accurate travel time estimate. However, the smaller the size of the detection zone, the lower the penetration rate, which could itself influence the accuracy of estimates. Therefore, there has to be a trade…

  4. Computer Model to Estimate Reliability Engineering for Air Conditioning Systems

    International Nuclear Information System (INIS)

    Afrah Al-Bossly, A.; El-Berry, A.; El-Berry, A.

    2012-01-01

Reliability engineering is used to predict the performance and optimize the design and maintenance of air conditioning systems. Air conditioning systems are exposed to a number of failures. Failures of an air conditioner, such as failure to turn on, loss of cooling capacity, reduced output temperatures, loss of cool air supply and loss of air flow entirely, can be due to a variety of problems with one or more components of an air conditioner or air conditioning system. Forecasts of system failure rates are very important for maintenance. This paper focused on the reliability of air conditioning systems. Statistical distributions commonly applied in reliability settings were used: the standard (2-parameter) Weibull and Gamma distributions. After the distribution parameters had been estimated, reliability estimations and predictions were used for evaluations. To evaluate good operating condition in a building, the reliability of the air conditioning system that supplies conditioned air to the company's several departments was studied. This air conditioning system is divided into two parts, namely the main chilled water system and the ten air handling systems that serve the ten departments. In a chilled-water system the air conditioner cools water down to 40-45 degree F (4-7 degree C). The chilled water is distributed throughout the building in a piping system and connected to air conditioning cooling units wherever needed. Data analysis was done with the support of computer-aided reliability software; the Weibull and Gamma distributions indicated that the reliability of the systems equals 86.012% and 77.7%, respectively. A comparison between the two important families of distribution functions, namely the Weibull and Gamma families, was studied. It was found that the Weibull method performed better for decision making.

  5. ARTIFICIAL INTELLIGENCE CAPABILITIES FOR INCREASING ORGANIZATIONAL-TECHNOLOGICAL RELIABILITY OF CONSTRUCTION

    Directory of Open Access Journals (Sweden)

    Ginzburg Alexander Vital`evich

    2018-02-01

Full Text Available The technology of artificial intelligence is actively being mastered around the world, but there is not much talk about its capabilities in the construction industry, and this issue requires additional elaboration. As a rule, the decision to invest in a particular construction project is made on the basis of an assessment of the organizational and technological reliability of the construction process. Artificial intelligence can be a convenient quality tool for identifying, analyzing and subsequently controlling the “pure” risks of a construction project, which will not only significantly reduce the financial and time expenditures of the investor's decision-making process but also improve the organizational-technological reliability of the construction process as a whole. Subject: an algorithm for the creation of artificial intelligence in the field of identification and analysis of potential risk events is presented, which will facilitate the creation of an independent analytical system for different stages of construction production: from the sketch to the working documentation and the conduct of works directly on the construction site. Research objectives: the study of the possibility, methods and planning of the algorithm of works for the creation of artificial intelligence technology in order to improve the organizational-technological reliability of the construction process. Materials and methods: developments in the field of improving the organizational and technological reliability of construction were studied through the analysis and control of potential “pure” risks of the construction project, and work was also carried out to integrate the technology of artificial intelligence into the area being studied. Results: an algorithm for creating artificial intelligence in the field of identification of potential “pure” risks of construction projects was presented. Conclusions: the obtained results are useful

  6. Reliability Estimation of the Pultrusion Process Using the First-Order Reliability Method (FORM)

    DEFF Research Database (Denmark)

    Baran, Ismet; Tutum, Cem Celal; Hattel, Jesper Henri

    2013-01-01

In the present study the reliability estimation of the pultrusion process of a flat plate is analyzed by using the first-order reliability method (FORM). The implementation of the numerical process model is validated by comparing the deterministic temperature and cure degree profiles … with corresponding analyses in the literature. The centerline degree of cure at the exit (CDOCE) being less than a critical value and the maximum composite temperature (Tmax) during the process being greater than a critical temperature are selected as the limit state functions (LSFs) for the FORM. The cumulative…
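
FORM itself is generic even though the limit states above are process-specific. A sketch of the Hasofer-Lind/Rackwitz-Fiessler iteration on a stand-in limit state in standard normal space (the pultrusion model is not reproduced):

```python
# Sketch: a first-order reliability method (FORM) iteration of the
# HL-RF type for a limit state g(u) in standard normal space. g below
# is an invented stand-in used only to show the mechanics.
import numpy as np
from scipy import stats

def g(u):
    # hypothetical limit state: failure when g < 0
    return 3.0 - u[0] - 0.5 * u[1] + 0.1 * u[1] ** 2

def grad_g(u, h=1e-6):
    # central finite-difference gradient
    return np.array([(g(u + h * e) - g(u - h * e)) / (2 * h)
                     for e in np.eye(len(u))])

u = np.zeros(2)
for _ in range(50):  # HL-RF fixed-point iteration
    grd = grad_g(u)
    u_new = (grd @ u - g(u)) * grd / (grd @ grd)
    if np.linalg.norm(u_new - u) < 1e-8:
        u = u_new
        break
    u = u_new

beta = np.linalg.norm(u)  # reliability index = distance to design point
print(f"reliability index beta = {beta:.3f}")
print(f"Pf ~ Phi(-beta) = {stats.norm.cdf(-beta):.2e}")
```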

  7. Automation of reliability evaluation procedures through CARE - The computer-aided reliability estimation program.

    Science.gov (United States)

    Mathur, F. P.

    1972-01-01

    Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation) which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. These equations, under program control, are then interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The mathematical model is then supplied with ground instances of its variables and is then evaluated to generate values for the reliability-theoretic functions applied to the model.

  8. Modelling and estimating degradation processes with application in structural reliability

    International Nuclear Information System (INIS)

    Chiquet, J.

    2007-06-01

The characteristic level of degradation of a given structure is modeled through a stochastic process called the degradation process. The random evolution of the degradation process is governed by a differential system with Markovian environment. We set up the associated reliability framework by considering failure of the structure once the degradation process reaches a critical threshold. A closed-form solution of the reliability function is obtained thanks to Markov renewal theory. Then, we build an estimation methodology for the parameters of the stochastic processes involved. The estimation methods and the theoretical results, as well as the associated numerical algorithms, are validated on simulated data sets. Our method is applied to the modelling of a real degradation mechanism, known as crack growth, for which an experimental data set is considered. (authors)
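
The threshold-crossing view of failure lends itself to simulation. A sketch that replaces the thesis's Markov-modulated dynamics with a plain stationary gamma degradation process, all parameters invented:

```python
# Sketch: Monte Carlo reliability of a degradation process hitting a
# critical threshold. A stationary gamma process stands in for the
# Markov-modulated dynamics of the thesis, which are more involved.
import numpy as np

rng = np.random.default_rng(11)
shape_rate = 0.8   # gamma-increment shape per unit time (assumed)
scale = 0.05       # gamma-increment scale, mm (assumed)
threshold = 2.0    # critical crack size, mm (assumed)
dt, horizon, n_paths = 0.5, 100.0, 20_000

steps = int(horizon / dt)
inc = rng.gamma(shape_rate * dt, scale, size=(n_paths, steps))
paths = inc.cumsum(axis=1)          # degradation level along each path
R = (paths < threshold).mean(axis=0)  # survival probability per step
for frac in (0.99, 0.9, 0.5):
    idx = np.argmax(R < frac)       # first step below this level
    print(f"R(t) drops below {frac:.2f} near t = {idx * dt:.1f}")
```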

  9. Application of subset simulation in reliability estimation of underground pipelines

    International Nuclear Information System (INIS)

    Tee, Kong Fah; Khan, Lutfor Rahman; Li, Hongshuang

    2014-01-01

    This paper presents a computational framework for implementing an advanced Monte Carlo simulation method, called Subset Simulation (SS), for time-dependent reliability prediction of underground flexible pipelines. The SS can provide better resolution for the low failure probabilities of rare failure events which are commonly encountered in pipeline engineering applications. Random samples of statistical variables are generated efficiently and used to compute the probabilistic reliability model. The method gains its efficiency by expressing a small probability event as a product of a sequence of intermediate events with larger conditional probabilities. The efficiency of SS has been demonstrated by numerical studies, and attention in this work is devoted to scrutinising the robustness of the SS application in pipe reliability assessment and to comparing it with the direct Monte Carlo simulation (MCS) method. Reliability of a buried flexible steel pipe with time-dependent failure modes, namely, corrosion-induced deflection, buckling, wall thrust and bending stress, has been assessed in this study. The analysis indicates that corrosion-induced excessive deflection is the most critical failure event whereas buckling is the least susceptible during the whole service life of the pipe. The study also shows that SS is a robust method for estimating the reliability of buried pipelines and is more efficient than MCS, especially for small failure probability prediction.
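
    The core of Subset Simulation can be sketched in a few dozen lines. The following is a minimal illustration under assumed standard-normal inputs and a hypothetical linear limit state (failure when g <= 0); the conditional levels are sampled here with a plain Metropolis step rather than the component-wise sampler usually used in production codes.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def g(x):
    """Hypothetical limit state (failure when g <= 0): x1 + x2 >= 5."""
    return 5.0 - x[..., 0] - x[..., 1]

def subset_simulation(g, dim, n=1000, p0=0.1, max_levels=10):
    """Estimate P(g(X) <= 0), X ~ N(0, I), as a product of conditional
    probabilities P(F_{k+1} | F_k) ~ p0, each estimated from n samples."""
    x = rng.standard_normal((n, dim))
    gx = g(x)
    pf, n_seed = 1.0, int(p0 * n)
    for _ in range(max_levels):
        order = np.argsort(gx)
        b = gx[order[n_seed - 1]]          # intermediate threshold
        if b <= 0:                         # failure domain reached
            return pf * np.mean(gx <= 0)
        pf *= p0
        seeds = x[order[:n_seed]]
        samples = []                       # Metropolis moves on N(0, I) | g <= b
        for s in seeds:
            cur = s.copy()
            for _ in range(n // n_seed):
                cand = cur + rng.uniform(-1.0, 1.0, dim)
                ok = rng.random() < np.exp(0.5 * (cur @ cur - cand @ cand))
                if ok and g(cand) <= b:
                    cur = cand
                samples.append(cur.copy())
        x = np.asarray(samples)
        gx = g(x)
    return pf * np.mean(gx <= 0)

est = subset_simulation(g, dim=2)
print(f"SS estimate {est:.2e} vs exact {norm.sf(5.0 / np.sqrt(2.0)):.2e}")
```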

  10. Generating human reliability estimates using expert judgment. Volume 2. Appendices

    International Nuclear Information System (INIS)

    Comer, M.K.; Seaver, D.A.; Stillwell, W.G.; Gaddy, C.D.

    1984-11-01

    The US Nuclear Regulatory Commission is conducting a research program to determine the practicality, acceptability, and usefulness of several different methods for obtaining human reliability data and estimates that can be used in nuclear power plant probabilistic risk assessments (PRA). One method, investigated as part of this overall research program, uses expert judgment to generate human error probability (HEP) estimates and associated uncertainty bounds. The project described in this document evaluated two techniques for using expert judgment: paired comparisons and direct numerical estimation. Volume 2 provides detailed procedures for using the techniques, detailed descriptions of the analyses performed to evaluate the techniques, and HEP estimates generated as part of this project. The results of the evaluation indicate that techniques using expert judgment should be given strong consideration for use in developing HEP estimates. Judgments were shown to be consistent and to provide HEP estimates with a good degree of convergent validity. Of the two techniques tested, direct numerical estimation appears to be preferable in terms of ease of application and quality of results
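
    As a minimal numerical illustration of aggregating direct numerical estimates (not the report's full procedure), HEPs elicited from several experts are often combined on a log scale, consistent with treating HEP uncertainty as lognormal; the figures below are hypothetical, and the expert spread is used as a stand-in for the uncertainty bounds.

```python
import numpy as np

# Hypothetical direct numerical estimates of one HEP from five experts.
heps = np.array([3e-3, 1e-2, 5e-3, 2e-3, 8e-3])

log_h = np.log(heps)                      # aggregate on the log scale
hep = np.exp(log_h.mean())                # geometric-mean point estimate
sd = log_h.std(ddof=1)
lo, hi = np.exp(log_h.mean() - 1.645 * sd), np.exp(log_h.mean() + 1.645 * sd)

print(f"HEP ~ {hep:.1e} (5th-95th percentile bounds {lo:.1e} .. {hi:.1e})")
```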

  11. Estimating passenger numbers in trains using existing weighing capabilities

    DEFF Research Database (Denmark)

    Nielsen, Bo Friis; Frølich, Laura; Nielsen, Otto Anker

    2013-01-01

    Knowing passenger numbers is important for the planning and operation of urban rail systems. Manual and electronic counting systems (typically infrared or video) are expensive and therefore entail small sample sizes. They usually count boarding and alighting passengers, which means that errors in estimates of total numbers of passengers propagate along train runs. Counting errors in manual and electronic counting systems are typically flow-dependent, making uncertainty a function of volume. This paper presents a new counting technique that exploits the weighing systems installed in most modern trains to control braking. This technique makes passenger counting cheaper and ensures a complete sample. The paper compares numbers estimated by this technique with manual counts and counts from an infrared system in trains in urban Copenhagen. It shows that the weighing system provides more accurate...

  12. Reliability Analysis and Overload Capability Assessment of Oil-Immersed Power Transformers

    Directory of Open Access Journals (Sweden)

    Chen Wang

    2016-01-01

    Full Text Available Smart grids have been constructed in recent years so as to guarantee the security and stability of the power grid. Power transformers are among the most vital components in the complex smart grid network. Any transformer failure can damage the whole power system, and failures caused by overloading cannot be ignored. This research gives a new insight into the overload capability assessment of transformers. The hot-spot temperature of the winding is the most critical factor in measuring the overload capacity of power transformers; thus, the hot-spot temperature is calculated to obtain the duration running time of the power transformers under overloading conditions. Then the overloading probability is fitted with the mature and widely accepted Weibull probability density function. To guarantee the accuracy of this fitting, a new objective function is proposed to obtain the desired parameters of the Weibull distribution. In addition, ten different mutation scenarios are adopted in the differential evolutionary algorithm to optimize the parameters of the Weibull distribution. The final comprehensive overload capability of the power transformer is assessed by the duration running time as well as the overloading probability. Compared with previous studies that take no account of the overloading probability, the assessment results obtained in this research are much more reliable.
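
    A minimal sketch of the fitting step described here: Weibull parameters chosen by differential evolution to minimize a least-squares distance between the empirical and Weibull CDFs. The data, the objective and the bounds are illustrative assumptions, not the paper's objective function.

```python
import numpy as np
from scipy.optimize import differential_evolution
from scipy.stats import weibull_min

# Hypothetical overload-duration data standing in for real transformer records.
data = np.sort(weibull_min.rvs(c=2.2, scale=150.0, size=200, random_state=1))
ecdf = (np.arange(1, len(data) + 1) - 0.5) / len(data)

def objective(params):
    """Squared distance between the empirical CDF and a candidate Weibull CDF."""
    shape, scale = params
    return np.sum((ecdf - weibull_min.cdf(data, c=shape, scale=scale)) ** 2)

res = differential_evolution(objective, bounds=[(0.1, 10.0), (1.0, 1000.0)],
                             seed=2, tol=1e-10)
print(f"fitted shape = {res.x[0]:.2f}, scale = {res.x[1]:.1f}  (true 2.2, 150)")
```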

  13. A note on reliability estimation of functionally diverse systems

    International Nuclear Information System (INIS)

    Littlewood, B.; Popov, P.; Strigini, L.

    1999-01-01

    It has been argued that functional diversity might be a plausible means of claiming independence of failures between two versions of a system. We present a model of functional diversity, in the spirit of earlier models of diversity such as those of Eckhardt and Lee, and Hughes. In terms of the model, we show that the claims for independence between functionally diverse systems seem rather unrealistic. Instead, it seems likely that functionally diverse systems will exhibit positively correlated failures, and thus will be less reliable than an assumption of independence would suggest. The result does not, of course, suggest that functional diversity is not worthwhile; instead, it places upon the evaluator of such a system the onus to estimate the degree of dependence so as to evaluate the reliability of the system

  14. Probabilistic confidence for decisions based on uncertain reliability estimates

    Science.gov (United States)

    Reid, Stuart G.

    2013-05-01

    Reliability assessments are commonly carried out to provide a rational basis for risk-informed decisions concerning the design or maintenance of engineering systems and structures. However, calculated reliabilities and associated probabilities of failure often have significant uncertainties associated with the possible estimation errors relative to the 'true' failure probabilities. For uncertain probabilities of failure, a measure of 'probabilistic confidence' has been proposed to reflect the concern that uncertainty about the true probability of failure could result in a system or structure that is unsafe and could subsequently fail. The paper describes how the concept of probabilistic confidence can be applied to evaluate and appropriately limit the probabilities of failure attributable to particular uncertainties such as design errors that may critically affect the dependability of risk-acceptance decisions. This approach is illustrated with regard to the dependability of structural design processes based on prototype testing with uncertainties attributable to sampling variability.

  15. Availability and Reliability of FSO Links Estimated from Visibility

    Directory of Open Access Journals (Sweden)

    M. Tatarko

    2012-06-01

    Full Text Available This paper focuses on estimating the availability and reliability of FSO systems. The abbreviation FSO stands for Free Space Optics, a system that allows optical transmission between two fixed points; it can be regarded as a last-mile communication system. It is an optical communication system, but the propagation medium is air. This last-mile solution does not require expensive optical fiber, and establishing a connection is very simple. However, there are some drawbacks which adversely affect the quality of service and the availability of the link. A number of atmospheric phenomena such as scattering, absorption and turbulence cause large variations in received optical power and laser beam attenuation. The influence of absorption and turbulence can be significantly reduced by an appropriate design of the FSO link, but visibility has the main influence on the quality of the optical transmission channel. Thus, in a typical continental area where rain, snow or fog occurs, it is important to know their values. This article describes a device for measuring weather conditions and presents an estimation of the availability and reliability of FSO links in Slovakia.
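
    For concreteness, visibility is commonly mapped to specific attenuation with the Kruse model (the record itself does not name the model it uses); a sketch:

```python
def fso_attenuation_db_per_km(visibility_km, wavelength_nm=1550.0):
    """Specific attenuation from visibility, Kruse model:
    alpha = (3.91 / V) * (lambda / 550 nm)^(-q), with q depending on V."""
    v = visibility_km
    if v > 50.0:
        q = 1.6
    elif v > 6.0:
        q = 1.3
    else:
        q = 0.585 * v ** (1.0 / 3.0)
    return (3.91 / v) * (wavelength_nm / 550.0) ** (-q)

for v in (0.2, 2.0, 20.0):   # dense fog, haze, clear air
    print(f"V = {v:4.1f} km -> {fso_attenuation_db_per_km(v):6.1f} dB/km")
```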

  16. Designing airport checked-baggage-screening strategies considering system capability and reliability

    International Nuclear Information System (INIS)

    Feng Qianmei; Sahin, Hande; Kapur, Kailash C.

    2009-01-01

    Emerging image-based technologies are critical components of airport security for screening checked baggage. Since these new technologies differ widely in cost and accuracy, a comprehensive mathematical framework should be developed for selecting a technology or combination of technologies for efficient 100% baggage screening. This paper addresses the problem of setting threshold values of these screening technologies and determining the optimal combination of technologies in a two-level screening system by considering system capability and human reliability. Probability and optimization techniques are used to quantify and evaluate the cost- and risk-effectiveness of various deployment configurations, which is captured by using a system life-cycle cost model that incorporates the deployment cost, operating cost, and costs associated with system decisions. Two system decision rules are studied for a two-level screening system. For each decision rule, two different optimization approaches are formulated and investigated from a practitioner's perspective. Numerical examples for different decision rules, optimization approaches and system arrangements are presented.

  17. Reliability of fish size estimates obtained from multibeam imaging sonar

    Science.gov (United States)

    Hightower, Joseph E.; Magowan, Kevin J.; Brown, Lori M.; Fox, Dewayne A.

    2013-01-01

    Multibeam imaging sonars have considerable potential for use in fisheries surveys because the video-like images are easy to interpret, and they contain information about fish size, shape, and swimming behavior, as well as characteristics of occupied habitats. We examined images obtained using a dual-frequency identification sonar (DIDSON) multibeam sonar for Atlantic sturgeon Acipenser oxyrinchus oxyrinchus, striped bass Morone saxatilis, white perch M. americana, and channel catfish Ictalurus punctatus of known size (20–141 cm) to determine the reliability of length estimates. For ranges up to 11 m, percent measurement error, computed as [(sonar estimate − total length)/total length] × 100, varied by species but was not related to the fish's range or aspect angle (orientation relative to the sonar beam). Least-square mean percent error was significantly different from 0.0 for Atlantic sturgeon (x̄ = −8.34, SE = 2.39) and white perch (x̄ = 14.48, SE = 3.99) but not striped bass (x̄ = 3.71, SE = 2.58) or channel catfish (x̄ = 3.97, SE = 5.16). Underestimating lengths of Atlantic sturgeon may be due to difficulty in detecting the snout or the longer dorsal lobe of the heterocercal tail. White perch was the smallest species tested, and it had the largest percent measurement errors (both positive and negative) and the lowest percentage of images classified as good or acceptable. Automated length estimates for the four species using Echoview software varied with position in the view-field. Estimates tended to be low at more extreme azimuthal angles (fish's angle off-axis within the view-field), but mean and maximum estimates were highly correlated with total length. Software estimates also were biased by fish images partially outside the view-field and when acoustic crosstalk occurred (when a fish perpendicular to the sonar and at relatively close range is detected in the side lobes of adjacent beams). These sources of

  18. Assessment of the Maximal Split-Half Coefficient to Estimate Reliability

    Science.gov (United States)

    Thompson, Barry L.; Green, Samuel B.; Yang, Yanyun

    2010-01-01

    The maximal split-half coefficient is computed by calculating all possible split-half reliability estimates for a scale and then choosing the maximal value as the reliability estimate. Osburn compared the maximal split-half coefficient with 10 other internal consistency estimates of reliability and concluded that it yielded the most consistently…
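
    The computation is straightforward to sketch: enumerate the splits, apply the Spearman-Brown step-up 2r/(1 + r) to each half-test correlation, and keep the maximum. The data below are simulated, and an even item count is assumed.

```python
import numpy as np
from itertools import combinations

def max_split_half(scores):
    """Maximum, over all equal splits of the items, of the Spearman-Brown
    stepped-up correlation between the two half-test scores."""
    n_items = scores.shape[1]          # assumes an even number of items
    best = -1.0
    for half in combinations(range(n_items), n_items // 2):
        other = [i for i in range(n_items) if i not in half]
        a = scores[:, list(half)].sum(axis=1)
        b = scores[:, other].sum(axis=1)
        r = np.corrcoef(a, b)[0, 1]
        best = max(best, 2.0 * r / (1.0 + r))
    return best

rng = np.random.default_rng(3)
true_score = rng.normal(size=(200, 1))
items = true_score + rng.normal(scale=1.0, size=(200, 6))   # 6 noisy items
print(f"maximal split-half = {max_split_half(items):.3f}")
```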

  19. COMPUTER SUPPORT SYSTEMS FOR ESTIMATING CHEMICAL TOXICITY: PRESENT CAPABILITIES AND FUTURE TRENDS

    Science.gov (United States)

    Computer Support Systems for Estimating Chemical Toxicity: Present Capabilities and Future Trends A wide variety of computer-based artificial intelligence (AI) and decision support systems exist currently to aid in the assessment of toxicity for environmental chemicals. T...

  20. Reliability Estimation for Single-unit Ceramic Crown Restorations

    Science.gov (United States)

    Lekesiz, H.

    2014-01-01

    The objective of this study was to evaluate the potential of a survival prediction method for the assessment of ceramic dental restorations. For this purpose, fast-fracture and fatigue reliabilities for 2 bilayer (metal ceramic alloy core veneered with fluorapatite leucite glass-ceramic, d.Sign/d.Sign-67, by Ivoclar; glass-infiltrated alumina core veneered with feldspathic porcelain, VM7/In-Ceram Alumina, by Vita) and 3 monolithic (leucite-reinforced glass-ceramic, Empress, and ProCAD, by Ivoclar; lithium-disilicate glass-ceramic, Empress 2, by Ivoclar) single posterior crown restorations were predicted, and fatigue predictions were compared with the long-term clinical data presented in the literature. Both perfectly bonded and completely debonded cases were analyzed for evaluation of the influence of the adhesive/restoration bonding quality on estimations. Material constants and stress distributions required for predictions were calculated from biaxial tests and finite element analysis, respectively. Based on the predictions, In-Ceram Alumina presents the best fast-fracture resistance, and ProCAD presents a comparable resistance for perfect bonding; however, ProCAD shows a significant reduction of resistance in case of complete debonding. Nevertheless, it is still better than Empress and comparable with Empress 2. In-Ceram Alumina and d.Sign have the highest long-term reliability, with almost 100% survivability even after 10 years. When compared with clinical failure rates reported in the literature, predictions show a promising match with clinical data, and this indicates the soundness of the settings used in the proposed predictions. PMID:25048249

  1. 1/f noise as a reliability estimation for solar panels

    Science.gov (United States)

    Alabedra, R.; Orsal, B.

    The purpose of this work is a study of the 1/f noise from a forward-biased solar cell in the dark, as a nondestructive reliability test for solar panels. It is shown that a cell with a given defect can be detected in a solar panel by low-frequency noise measurements in the dark. A real solar panel of 5 cells in parallel and 5 cells in series is tested by this method. The cells, intended for space applications, are n(+)p monocrystalline silicon junctions with an area of 8 sq cm and a base resistivity of 10 ohm·cm. In the first part of this paper it is shown that the I-V and Rd = f(I) characteristics of one cell, or of a panel, are not modified when a small defect is introduced by a mechanical constraint. In the second part, theoretical results on 1/f noise in a p-n junction under forward bias are recalled. It is shown that the noise of the cell with a defect is about 10 to 15 times higher than that of a good cell. If one good cell is replaced by a cell with a defect in the 5 x 5 panel, the noise level of the panel increases by about 30 percent.

  2. Power capability evaluation for lithium iron phosphate batteries based on multi-parameter constraints estimation

    Science.gov (United States)

    Wang, Yujie; Pan, Rui; Liu, Chang; Chen, Zonghai; Ling, Qiang

    2018-01-01

    The battery power capability is intimately correlated with the climbing, braking and accelerating performance of electric vehicles. Accurate power capability prediction not only guarantees safety but also helps regulate driving behavior and optimize battery energy usage. However, the battery model is highly nonlinear, especially for lithium iron phosphate batteries. Moreover, the hysteresis loop in the open-circuit voltage curve can easily cause large errors in model prediction. In this work, a multi-parameter-constraints dynamic estimation method is proposed to predict the battery's continuous-period power capability. A high-fidelity battery model that considers battery polarization and the hysteresis phenomenon is presented to approximate the strong nonlinearity of the lithium iron phosphate battery. Explicit analyses of power capability with multiple constraints are elaborated; in particular, the state of energy is considered in the power capability assessment. Furthermore, to solve the problem of nonlinear system state estimation and to suppress noise interference, a UKF-based state observer is employed for power capability prediction. The performance of the proposed methodology is demonstrated by experiments under different dynamic characterization schedules. The charge and discharge power capabilities of the lithium iron phosphate batteries are quantitatively assessed at different time scales and temperatures.
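
    A minimal sketch of multi-constraint power capability with a simple Rint model (the paper's model additionally includes polarization, hysteresis and a UKF observer); all parameter values below are hypothetical.

```python
def discharge_power_capability(v_oc, r_int, soc, v_min=2.0, i_max=200.0,
                               soc_min=0.1, capacity_ah=60.0, horizon_s=10.0):
    """Peak discharge power over a short horizon as the tightest of the
    voltage, current and state-of-charge constraints (Rint model)."""
    i_volt = (v_oc - v_min) / r_int                             # voltage floor
    i_soc = (soc - soc_min) * capacity_ah * 3600.0 / horizon_s  # usable charge
    i = min(i_volt, i_soc, i_max)                               # binding limit
    return (v_oc - r_int * i) * i                               # terminal power

print(f"P_max ~ {discharge_power_capability(3.3, 0.002, 0.5):.0f} W")
```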

  3. Estimation of reliability of an interleaving PFC boost converter

    Directory of Open Access Journals (Sweden)

    Gulam Amer Sandepudi

    2010-01-01

    Full Text Available Reliability plays an important role in power supplies. For other electronic equipment, a certain failure mode, at least for a part of the total system, can often be tolerated without serious (critical) effects. However, for a power supply no such condition can be accepted, since very high demands on its reliability must be met. At higher power levels, the continuous conduction mode (CCM) boost converter is the preferred topology for implementing a front end with PFC. As a result, significant efforts have been made to improve the performance of the boost converter. This paper is one such effort, improving the performance of the converter from the reliability point of view. In this paper, an interleaving boost power factor correction converter is simulated with a single switch in continuous conduction mode (CCM), discontinuous conduction mode (DCM) and critical conduction mode (CRM) under different output power ratings. The results are explored from the reliability point of view.
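
    As a minimal illustration of converter-level reliability bookkeeping (a generic part-count view, not the paper's simulation-based analysis): with the parts in series, failure rates add, and the MTBF and mission reliability follow. The rates below are placeholders.

```python
import math

# Hypothetical part-count failure rates (failures per million hours);
# with all parts in series, any single failure takes the supply down.
failure_rates_per_Mh = {
    "MOSFET switch": 0.35,
    "boost diode": 0.22,
    "boost inductor": 0.05,
    "output capacitor": 0.80,
    "PFC controller IC": 0.15,
}

lam = sum(failure_rates_per_Mh.values()) / 1e6   # failures per hour
mtbf_hours = 1.0 / lam
reliability_1yr = math.exp(-lam * 8760.0)        # R(t) = exp(-lambda * t)

print(f"MTBF ~ {mtbf_hours:,.0f} h, R(1 year) ~ {reliability_1yr:.4f}")
```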

  4. Reliability estimation for check valves and other components

    International Nuclear Information System (INIS)

    McElhaney, K.L.; Staunton, R.H.

    1996-01-01

    For years the nuclear industry has depended upon component operational reliability information compiled from reliability handbooks and other generic sources as well as private databases generated by recognized experts both within and outside the nuclear industry. Regrettably, these technical bases lacked the benefit of large-scale operational data and comprehensive data verification, and did not take into account the parameters and combinations of parameters that affect the determination of failure rates. This paper briefly examines the historic use of generic component reliability data, its sources, and its limitations. The concept of using a single failure rate for a particular component type is also examined. Particular emphasis is placed on check valves due to the information available on those components. The Appendix presents some of the results of the extensive analyses done by Oak Ridge National Laboratory (ORNL) on check valve performance

  5. A Survey of Software Reliability Modeling and Estimation

    Science.gov (United States)

    1983-09-01

    considered include: the Jelinski-Moranda Model, the Geometric Model, and Musa's Model. A Monte Carlo study of the behavior of the least squares ... Proceedings Number 261, 1979, pp. 34-1, 34-11. Sukert, Alan and Goel, Amrit, "A Guidebook for Software Reliability Assessment," 1980.
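
    For reference, the Jelinski-Moranda model surveyed here admits a compact maximum-likelihood fit: with inter-failure times t_i ~ Exponential(phi(N - i + 1)), phi has a closed form given N, and N solves a one-dimensional equation. The data below are hypothetical.

```python
import numpy as np
from scipy.optimize import brentq

def jelinski_moranda_mle(times):
    """ML fit of the Jelinski-Moranda model: t_i ~ Exp(phi * (N - i + 1)).
    Setting dlogL/dN = 0 gives one equation in N; phi then follows in
    closed form. A finite MLE for N exists only if the data show
    reliability growth (later inter-failure times tend to be longer)."""
    t = np.asarray(times, dtype=float)
    n = len(t)
    i = np.arange(1, n + 1)

    def dlogl_dn(N):
        return np.sum(1.0 / (N - i + 1)) - n * t.sum() / np.sum((N - i + 1) * t)

    N_hat = brentq(dlogl_dn, n - 1 + 1e-6, 100.0 * n)
    phi_hat = n / np.sum((N_hat - i + 1) * t)
    return N_hat, phi_hat

# Hypothetical inter-failure times (hours) with growing gaps.
times = [7, 11, 8, 10, 15, 22, 20, 25, 40, 35, 55, 70]
N, phi = jelinski_moranda_mle(times)
print(f"estimated initial faults N ~ {N:.1f}, per-fault rate phi ~ {phi:.4f}")
```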

  6. Evaluation and reliability of bone histological age estimation methods

    African Journals Online (AJOL)

    Estimating human age at death plays a vital role in forensic anthropology and bioarchaeology. Researchers have used morphological and histological methods to estimate human age from skeletal remains. This paper discusses different histological methods that use human long bones and ribs to determine age ...

  7. Engineer’s estimate reliability and statistical characteristics of bids

    Directory of Open Access Journals (Sweden)

    Fariborz M. Tehrani

    2016-12-01

    Full Text Available The objective of this report is to provide a methodology for examining bids and evaluating the performance of engineer's estimates in capturing the true cost of projects. This study reviews the cost development for transportation projects as well as two sources of uncertainty in a cost estimate: modeling errors and inherent variability. The sample projects are highway maintenance projects with a similar scope of work, size, and schedule. Statistical analysis of engineering estimates and bids examines the suitability of statistical models for the sample projects. Further, the variation of engineering cost estimates from inception to implementation is presented and discussed for selected projects. Moreover, the applicability of extreme value theory is assessed for the available data. The results indicate that the performance of the engineer's estimate is best evaluated against the trimmed average of bids, excluding discordant bids.
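
    The recommended benchmark, a trimmed average of bids that excludes discordant bids, is one line with standard tools; the bid figures below are hypothetical.

```python
import numpy as np
from scipy import stats

# Hypothetical bids (in $1000s) for one maintenance project, plus the
# engineer's estimate for the same project.
bids = np.array([812, 845, 870, 905, 1290])       # last bid is discordant
engineers_estimate = 990.0

trimmed = stats.trim_mean(bids, proportiontocut=0.2)   # drop top/bottom 20%
error_pct = 100.0 * (engineers_estimate - trimmed) / trimmed
print(f"trimmed mean of bids = {trimmed:.0f}, estimate off by {error_pct:+.1f}%")
```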

  8. Reliability estimates for selected sensors in fusion applications

    International Nuclear Information System (INIS)

    Cadwallader, L.C.

    1996-09-01

    This report presents the results of a study that defines several types of sensors in use and gives the qualitative reliability (failure modes) and quantitative reliability (average failure rates) for these types of process sensors. Temperature, pressure, flow, and level sensors are discussed for water coolant and for cryogenic coolants. The failure rates that have been found are useful for risk assessment and safety analysis. Repair times and calibration intervals are also given when found in the literature. All of these values can also be useful to plant operators and maintenance personnel. Designers may be able to make use of these data when planning systems. The final chapter in this report discusses failure rates for several types of personnel safety sensors, including ionizing radiation monitors, toxic and combustible gas detectors, humidity sensors, and magnetic field sensors. These data could be useful to industrial hygienists and other safety professionals when designing or auditing for personnel safety.

  9. Sensitivity of Reliability Estimates in Partially Damaged RC Structures subject to Earthquakes, using Reduced Hysteretic Models

    DEFF Research Database (Denmark)

    Iwankiewicz, R.; Nielsen, Søren R. K.; Skjærbæk, P. S.

    The subject of the paper is the investigation of the sensitivity of structural reliability estimation by a reduced hysteretic model for a reinforced concrete frame under an earthquake excitation.

  10. Reliability and precision of pellet-group counts for estimating landscape-level deer density

    Science.gov (United States)

    David S. deCalesta

    2013-01-01

    This study provides hitherto unavailable methodology for reliably and precisely estimating deer density within forested landscapes, enabling quantitative rather than qualitative deer management. Reliability and precision of the deer pellet-group technique were evaluated in 1 small and 2 large forested landscapes. Density estimates, adjusted to reflect deer harvest and...

  11. Alternative Estimates of the Reliability of College Grade Point Averages. Professional File. Article 130, Spring 2013

    Science.gov (United States)

    Saupe, Joe L.; Eimers, Mardy T.

    2013-01-01

    The purpose of this paper is to explore differences in the reliabilities of cumulative college grade point averages (GPAs), estimated for unweighted and weighted, one-semester, 1-year, 2-year, and 4-year GPAs. Using cumulative GPAs for a freshman class at a major university, we estimate internal consistency (coefficient alpha) reliabilities for…

  12. On estimation of reliability for pipe lines of heat power plants under cyclic loading

    International Nuclear Information System (INIS)

    Verezemskij, V.G.

    1986-01-01

    One possible method to obtain a quantitative estimate of the reliability of the pipe lines of welded heat power plants under cyclic loading, due both to heating-cooling and to vibration, is considered. The reliability estimate is carried out for the general case of loading by simultaneous cycles with different amplitudes and loading asymmetry. It is shown that scatter in the number of cycles to failure for the weld metal may perceptibly decrease the reliability of the welded pipe line.

  13. A reliable and economical method for gaining mouse embryonic fibroblasts capable of preparing feeder layers.

    Science.gov (United States)

    Jiang, Guangming; Wan, Xiaoju; Wang, Ming; Zhou, Jianhua; Pan, Jian; Wang, Baolong

    2016-08-01

    Mouse embryonic fibroblasts (MEFs) are widely used to prepare feeder layers for culturing embryonic stem cells (ESCs) or induced pluripotent stem cells (iPSCs) in vitro. Damage during transportation and exorbitant prices make commercially obtained MEFs unsuitable for long-term research. The aim of the present study is to establish a method that enables researchers to obtain MEFs from mice and prepare feeder layers themselves in ordinary laboratories. MEFs were isolated from ICR mouse embryos at 12.5-17.5 days post coitum (DPC) and cultured in vitro. At P2-P7, the cells were inactivated with mitomycin C or by X-ray irradiation and then used to prepare feeder layers. The key factors of the whole protocol were analyzed to determine the optimal conditions for the method. The results revealed that MEFs isolated at 12.5-13.5 DPC and cultured to P3 were the best choice for feeder preparation; P2 and P4-P5 MEFs were also suitable for the purpose. P3-P5 MEFs treated with 10 μg/ml of mitomycin C for 3 h, or irradiated with X-rays at 1.5 Gy/min to a total dose of 25 Gy, were the most suitable feeder cells. Treating MEFs with 10 μg/ml of mitomycin C for 2.5 h or with 15 μg/ml for 2.0 h, or irradiating the cells with 20 Gy of X-rays at 2.0 Gy/min, could all serve as alternative methods for P3-P4 cells. Our study provides a reliable and economical way to obtain large amounts of qualified MEFs for long-term research on ESCs or iPSCs.

  14. Approximate estimation of system reliability via fault trees

    International Nuclear Information System (INIS)

    Dutuit, Y.; Rauzy, A.

    2005-01-01

    In this article, we show how fault tree analysis, carried out by means of binary decision diagrams (BDD), is able to approximate the reliability of systems made of independent repairable components with good accuracy and good efficiency. We consider four algorithms: the Murchland lower bound, the Barlow-Proschan lower bound, the Vesely full approximation and the Vesely asymptotic approximation. For each of these algorithms, we consider an implementation based on the classical minimal cut sets/rare events approach and another one relying on the BDD technology. We present numerical results obtained with both approaches on various examples.
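
    A minimal sketch of the classical minimal cut sets/rare events approach that the BDD implementations are compared against: each repairable component has steady-state unavailability q = lambda/(lambda + mu), and the system unavailability is approximated by the sum over minimal cut sets of the product of their component unavailabilities. The system and the rates below are hypothetical.

```python
import math

# Component failure and repair rates (per hour), hypothetical values.
lam = {"pump A": 1e-4, "pump B": 1e-4, "valve": 5e-5}
mu = {"pump A": 1e-1, "pump B": 1e-1, "valve": 5e-2}
q = {c: lam[c] / (lam[c] + mu[c]) for c in lam}   # steady-state unavailability

# Minimal cut sets: both redundant pumps down, or the single valve down.
cut_sets = [("pump A", "pump B"), ("valve",)]

# Rare-event approximation: sum of cut-set products.
q_sys = sum(math.prod(q[c] for c in cs) for cs in cut_sets)
print(f"approximate system unavailability ~ {q_sys:.2e}")
```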

  15. Providing Reliability Services through Demand Response: A Prelimnary Evaluation of the Demand Response Capabilities of Alcoa Inc.

    Energy Technology Data Exchange (ETDEWEB)

    Starke, Michael R [ORNL; Kirby, Brendan J [ORNL; Kueck, John D [ORNL; Todd, Duane [Alcoa; Caulfield, Michael [Alcoa; Helms, Brian [Alcoa

    2009-02-01

    Demand response is the largest underutilized reliability resource in North America. Historic demand response programs have focused on reducing overall electricity consumption (increasing efficiency) and shaving peaks but have not typically been used for immediate reliability response. Many of these programs have been successful but demand response remains a limited resource. The Federal Energy Regulatory Commission (FERC) report, 'Assessment of Demand Response and Advanced Metering' (FERC 2006) found that only five percent of customers are on some form of demand response program. Collectively they represent an estimated 37,000 MW of response potential. These programs reduce overall energy consumption, lower green house gas emissions by allowing fossil fuel generators to operate at increased efficiency and reduce stress on the power system during periods of peak loading. As the country continues to restructure energy markets with sophisticated marginal cost models that attempt to minimize total energy costs, the ability of demand response to create meaningful shifts in the supply and demand equations is critical to creating a sustainable and balanced economic response to energy issues. Restructured energy market prices are set by the cost of the next incremental unit of energy, so that as additional generation is brought into the market, the cost for the entire market increases. The benefit of demand response is that it reduces overall demand and shifts the entire market to a lower pricing level. This can be very effective in mitigating price volatility or scarcity pricing as the power system responds to changing demand schedules, loss of large generators, or loss of transmission. As a global producer of alumina, primary aluminum, and fabricated aluminum products, Alcoa Inc., has the capability to provide demand response services through its manufacturing facilities and uniquely through its aluminum smelting facilities. For a typical aluminum smelter

  16. Numerical Model based Reliability Estimation of Selective Laser Melting Process

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2014-01-01

    Selective laser melting is developing into a standard manufacturing technology with applications in various sectors. However, the process is still far from being on par with conventional processes such as welding and casting, the primary reason being the unreliability of the process. ... A validated 3D finite-volume alternating-direction-implicit numerical technique is used to model the selective laser melting process and is calibrated against results from single-track formation experiments. Correlation coefficients are determined for process input parameters such as laser power, speed, beam profile, etc. Subsequently, uncertainties in the processing parameters are utilized to predict a range for the various outputs, using a Monte Carlo method based uncertainty analysis methodology, and the reliability of the process is established.

  17. Estimating the reliability of eyewitness identifications from police lineups.

    Science.gov (United States)

    Wixted, John T; Mickes, Laura; Dunn, John C; Clark, Steven E; Wells, William

    2016-01-12

    Laboratory-based mock crime studies have often been interpreted to mean that (i) eyewitness confidence in an identification made from a lineup is a weak indicator of accuracy and (ii) sequential lineups are diagnostically superior to traditional simultaneous lineups. Largely as a result, juries are increasingly encouraged to disregard eyewitness confidence, and up to 30% of law enforcement agencies in the United States have adopted the sequential procedure. We conducted a field study of actual eyewitnesses who were assigned to simultaneous or sequential photo lineups in the Houston Police Department over a 1-y period. Identifications were made using a three-point confidence scale, and a signal detection model was used to analyze and interpret the results. Our findings suggest that (i) confidence in an eyewitness identification from a fair lineup is a highly reliable indicator of accuracy and (ii) if there is any difference in diagnostic accuracy between the two lineup formats, it likely favors the simultaneous procedure.

  18. The Locomotor Capabilities Index; validity and reliability of the Swedish version in adults with lower limb amputation

    Directory of Open Access Journals (Sweden)

    Andersson Ingemar H

    2009-05-01

    Full Text Available Background: The Locomotor Capabilities Index (LCI) is a validated measure of lower-limb amputees' ability to perform activities with a prosthesis. We have developed the LCI Swedish version and evaluated its validity and reliability. Methods: Cross-cultural adaptation to Swedish included forward/backward translations and field testing. The Swedish LCI was then administered to 144 amputees (55 women, mean age 74 (40-93) years) attending post-rehabilitation prosthetic training. Construct validity was assessed by examining the relationship between the LCI and the Timed "Up-and-Go" (TUG) test and between the LCI and the EQ-5D health utility index in two subgroups of 40 and 20 amputees, respectively. Discriminative validity was assessed by comparing scores in different age groups and in unilateral and bilateral amputees. Test-retest reliability (1-2 weeks) was evaluated in 20 amputees (14 unilateral). Results: The Swedish LCI showed good construct convergent validity, with high correlations with the TUG (r = -0.75) and the EQ-5D (r = 0.84), and discriminative validity, with significantly worse mean scores for older than younger and for bilateral than unilateral amputees. Conclusion: The Swedish version of the LCI demonstrated good validity and internal consistency in adult amputees. Test-retest reliability in a small subsample appears to be acceptable. The high ceiling effect of the LCI may imply that it would be most useful in assessing amputees with low to moderate functional abilities.

  19. Empirical Study of Travel Time Estimation and Reliability

    OpenAIRE

    Li, Ruimin; Chai, Huajun; Tang, Jin

    2013-01-01

    This paper explores the travel time distribution of different types of urban roads, the link and path average travel time, and variance estimation methods by analyzing the large-scale travel time dataset detected from automatic number plate readers installed throughout Beijing. The results show that the best-fitting travel time distribution for different road links in 15 min time intervals differs for different traffic congestion levels. The average travel time for all links on all days can b...

  20. Statistical estimation Monte Carlo for unreliability evaluation of highly reliable system

    International Nuclear Information System (INIS)

    Xiao Gang; Su Guanghui; Jia Dounan; Li Tianduo

    2000-01-01

    Based on analog Monte Carlo simulation, statistical estimation Monte Carlo methods for the unreliability evaluation of highly reliable systems are constructed, including a direct statistical estimation Monte Carlo method and a weighted statistical estimation Monte Carlo method. The basic element is given, and the statistical estimation Monte Carlo estimators are derived. Direct Monte Carlo simulation, the bounding-sampling method, the forced-transitions Monte Carlo method, direct statistical estimation Monte Carlo and weighted statistical estimation Monte Carlo are used to evaluate the unreliability of the same system. By comparison, the weighted statistical estimation Monte Carlo estimator has the smallest variance and the highest computational efficiency.
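
    A minimal illustration of why weighted estimators pay off for highly reliable systems (a generic importance-sampling example, not the paper's estimator): the direct hit-or-miss estimator wastes almost all samples on the common event, while sampling near the rare event and reweighting by the density ratio keeps the estimate unbiased with far lower variance.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
a, n = 4.0, 100_000                    # rare event: P(X > 4), X ~ N(0, 1)

# Direct (analog) estimator: hit-or-miss on raw samples.
x = rng.standard_normal(n)
direct = np.mean(x > a)

# Weighted estimator: sample from N(a, 1) and reweight by the density ratio.
y = rng.normal(loc=a, size=n)
w = np.where(y > a, norm.pdf(y) / norm.pdf(y, loc=a), 0.0)
weighted = w.mean()

print(f"exact    {norm.sf(a):.3e}")
print(f"direct   {direct:.3e}")
print(f"weighted {weighted:.3e}  (rel. std {w.std() / w.mean() / np.sqrt(n):.1%})")
```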

  1. The push-off test: development of a simple, reliable test of upper extremity weight-bearing capability.

    Science.gov (United States)

    Vincent, Joshua I; MacDermid, Joy C; Michlovitz, Susan L; Rafuse, Richard; Wells-Rowsell, Christina; Wong, Owen; Bisbee, Leslie

    2014-01-01

    Longitudinal clinical measurement study. The push-off test (POT) is a novel and simple measure of upper extremity weight-bearing that can be measured with a grip dynamometer. There are no published studies on the validity and reliability of the POT. The relationship between upper extremity self-report activity/participation and impairment measures remains an unexplored realm. The primary purpose of this study is to estimate the intra- and inter-rater reliability and construct validity of the POT. The secondary purpose is to estimate the relationship between upper extremity self-report activity/participation questionnaires and impairment measures. A convenience sample of 22 patients with wrist or elbow injuries was tested for the POT, wrist/elbow range of motion (ROM), isometric wrist extension strength (WES) and grip strength, and completed two self-report activity/participation questionnaires: the Disabilities of the Arm, Shoulder and Hand (DASH) and the Work Limitations Questionnaire (WLQ-26). The POT's inter- and intra-rater reliability and construct validity were tested. Pearson's correlations were run between the impairment measures and the self-report questionnaires to examine the relationships among them. The POT demonstrated high inter-rater reliability (ICC affected = 0.97; 95% C.I. 0.93-0.99; ICC unaffected = 0.85; 95% C.I. 0.68-0.94) and intra-rater reliability (ICC affected = 0.96; 95% C.I. 0.92-0.97; ICC unaffected = 0.92; 95% C.I. 0.85-0.97). The POT correlated moderately with the DASH (r = -0.47; p = 0.03). In examining the relationship between upper extremity self-reported activity/participation questionnaires and impairment measures, the strongest correlation was between the DASH and the POT (r = -0.47; p = 0.03); none of the correlations with the other physical impairment measures reached significance. At-work disability demonstrated insignificant correlations with physical impairments. The POT test provides a reliable and easily

  2. Scoring the Icecap-A Capability Instrument. Estimation of a UK General Population Tariff†

    Science.gov (United States)

    Flynn, Terry N; Huynh, Elisabeth; Peters, Tim J; Al-Janabi, Hareth; Clemens, Sam; Moody, Alison; Coast, Joanna

    2015-01-01

    This paper reports the results of a best–worst scaling (BWS) study to value the Investigating Choice Experiments Capability Measure for Adults (ICECAP-A), a new capability measure among adults, in a UK setting. A main effects plan plus its foldover was used to estimate weights for each of the four levels of all five attributes. The BWS study was administered to 413 randomly sampled individuals, together with sociodemographic and other questions. Scale-adjusted latent class analyses identified two preference and two (variance) scale classes. Ability to characterize preference and scale heterogeneity was limited, but data quality was good, and the final model exhibited a high pseudo-r-squared. After adjusting for heterogeneity, a population tariff was estimated. This showed that ‘attachment’ and ‘stability’ each account for around 22% of the space, and ‘autonomy’, ‘achievement’ and ‘enjoyment’ account for around 18% each. Across all attributes, greater value was placed on the difference between the lowest levels of capability than between the highest. This tariff will enable ICECAP-A to be used in economic evaluation both within the field of health and across public policy generally. © 2013 The Authors. Health Economics published by John Wiley & Sons Ltd. PMID:24254584

  3. Scoring the Icecap-a capability instrument. Estimation of a UK general population tariff.

    Science.gov (United States)

    Flynn, Terry N; Huynh, Elisabeth; Peters, Tim J; Al-Janabi, Hareth; Clemens, Sam; Moody, Alison; Coast, Joanna

    2015-03-01

    This paper reports the results of a best-worst scaling (BWS) study to value the Investigating Choice Experiments Capability Measure for Adults (ICECAP-A), a new capability measure among adults, in a UK setting. A main effects plan plus its foldover was used to estimate weights for each of the four levels of all five attributes. The BWS study was administered to 413 randomly sampled individuals, together with sociodemographic and other questions. Scale-adjusted latent class analyses identified two preference and two (variance) scale classes. Ability to characterize preference and scale heterogeneity was limited, but data quality was good, and the final model exhibited a high pseudo-r-squared. After adjusting for heterogeneity, a population tariff was estimated. This showed that 'attachment' and 'stability' each account for around 22% of the space, and 'autonomy', 'achievement' and 'enjoyment' account for around 18% each. Across all attributes, greater value was placed on the difference between the lowest levels of capability than between the highest. This tariff will enable ICECAP-A to be used in economic evaluation both within the field of health and across public policy generally. © 2013 The Authors. Health Economics published by John Wiley & Sons Ltd.

  4. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.

  5. Estimating the Parameters of Software Reliability Growth Models Using the Grey Wolf Optimization Algorithm

    OpenAIRE

    Alaa F. Sheta; Amal Abdel-Raouf

    2016-01-01

    In this age of technology, building quality software is essential to competing in the business market. One of the major principles required for any quality and business software product for value fulfillment is reliability. Estimating software reliability early during the software development life cycle saves time and money as it prevents spending larger sums fixing a defective software product after deployment. The Software Reliability Growth Model (SRGM) can be used to predict the number of...

  6. Investing American Recovery and Reinvestment Act Funds to Advance Capability, Reliability, and Performance in NASA Wind Tunnels

    Science.gov (United States)

    Sydnor, George H.

    2010-01-01

    The National Aeronautics and Space Administration's (NASA) Aeronautics Test Program (ATP) is implementing five significant ground-based test facility projects across the nation with funding provided by the American Recovery and Reinvestment Act (ARRA). The projects were selected as the best candidates within the constraints of the ARRA and the strategic plan of ATP. They are a combination of much-needed large scale maintenance, reliability, and system upgrades plus creating new test beds for upcoming research programs. The projects are: 1.) Re-activation of a large compressor to provide a second source for compressed air and vacuum to the Unitary Plan Wind Tunnel at the Ames Research Center (ARC) 2.) Addition of high-altitude ice crystal generation at the Glenn Research Center Propulsion Systems Laboratory Test Cell 3, 3.) New refrigeration system and tunnel heat exchanger for the Icing Research Tunnel at the Glenn Research Center, 4.) Technical viability improvements for the National Transonic Facility at the Langley Research Center, and 5.) Modifications to conduct Environmentally Responsible Aviation and Rotorcraft research at the 14 x 22 Subsonic Tunnel at Langley Research Center. The selection rationale, problem statement, and technical solution summary for each project is given here. The benefits and challenges of the ARRA funded projects are discussed. Indirectly, this opportunity provides the advantages of developing experience in NASA's workforce in large projects and maintaining corporate knowledge in that very unique capability. It is envisioned that improved facilities will attract a larger user base and capabilities that are needed for current and future research efforts will offer revenue growth and future operations stability. Several of the chosen projects will maximize wind tunnel reliability and maintainability by using newer, proven technologies in place of older and obsolete equipment and processes. The projects will meet NASA's goal of

  7. Fault-tolerant embedded system design and optimization considering reliability estimation uncertainty

    International Nuclear Information System (INIS)

    Wattanapongskorn, Naruemon; Coit, David W.

    2007-01-01

    In this paper, we model embedded system design and optimization, considering component redundancy and uncertainty in the component reliability estimates. The systems being studied consist of software embedded in associated hardware components. Very often, component reliability values are not known exactly. Therefore, for reliability analysis studies and system optimization, it is meaningful to consider component reliability estimates as random variables with associated estimation uncertainty. In this new research, the system design process is formulated as a multiple-objective optimization problem to maximize an estimate of system reliability, and also, to minimize the variance of the reliability estimate. The two objectives are combined by penalizing the variance for prospective solutions. The two most common fault-tolerant embedded system architectures, N-Version Programming and Recovery Block, are considered as strategies to improve system reliability by providing system redundancy. Four distinct models are presented to demonstrate the proposed optimization techniques with or without redundancy. For many design problems, multiple functionally equivalent software versions have failure correlation even if they have been independently developed. The failure correlation may result from faults in the software specification, faults from a voting algorithm, and/or related faults from any two software versions. Our approach considers this correlation in formulating practical optimization models. Genetic algorithms with a dynamic penalty function are applied in solving this optimization problem, and reasonable and interesting results are obtained and discussed

  8. Estimating power capability of aged lithium-ion batteries in presence of communication delays

    Science.gov (United States)

    Fridholm, Björn; Wik, Torsten; Kuusisto, Hannes; Klintberg, Anton

    2018-04-01

    Efficient control of electrified powertrains requires accurate estimation of the power capability of the battery for the next few seconds into the future. When implemented in a vehicle, the power estimation is part of a control loop that may contain several networked controllers which introduces time delays that may jeopardize stability. In this article, we present and evaluate an adaptive power estimation method that robustly can handle uncertain health status and time delays. A theoretical analysis shows that stability of the closed loop system can be lost if the resistance of the model is under-estimated. Stability can, however, be restored by filtering the estimated power at the expense of slightly reduced bandwidth of the signal. The adaptive algorithm is experimentally validated in lab tests using an aged lithium-ion cell subject to a high power load profile in temperatures from -20 to +25 °C. The upper voltage limit was set to 4.15 V and the lower voltage limit to 2.6 V, where significant non-linearities are occurring and the validity of the model is limited. After an initial transient when the model parameters are adapted, the prediction accuracy is within ± 2 % of the actually available power.
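
    The stability result is easy to motivate numerically: with a simple Rint model (a stand-in for the paper's model), the peak discharge power down to a voltage floor scales as 1/R, so under-estimating the resistance inflates the predicted power. All values below are hypothetical.

```python
# Peak discharge power down to a terminal-voltage floor with an Rint model:
# the current limit is (v_oc - v_min) / R, so P(R) = v_min * (v_oc - v_min) / R.
v_oc, v_min, r_true = 3.9, 2.6, 0.0030          # hypothetical values (V, V, Ohm)
p_true = v_min * (v_oc - v_min) / r_true

for r_est in (0.5 * r_true, r_true, 1.5 * r_true):
    p_est = v_min * (v_oc - v_min) / r_est
    print(f"R_est = {1e3 * r_est:.1f} mOhm -> P_est = {p_est:6.0f} W "
          f"({p_est / p_true:.2f} x true)")
```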

  9. Investigation of the Capability of Compact Polarimetric SAR Interferometry to Estimate Forest Height

    Science.gov (United States)

    Zhang, Hong; Xie, Lei; Wang, Chao; Chen, Jiehong

    2013-08-01

    The main objective of this paper is to investigate the capability of compact Polarimetric SAR Interferometry (C-PolInSAR) for forest height estimation. For this, the pseudo fully polarimetric interferometric (F-PolInSAR) covariance matrix is first reconstructed; then the three-stage inversion algorithm, the hybrid algorithm, and the MUSIC and Capon algorithms are applied to both the C-PolInSAR covariance matrix and the pseudo F-PolInSAR covariance matrix. The feasibility of forest height estimation is demonstrated using L-band data generated by the PolSARProSim simulator and X-band airborne data acquired by the East China Research Institute of Electronic Engineering, China Electronics Technology Group Corporation.

  10. Reliability/Cost Evaluation on Power System connected with Wind Power for the Reserve Estimation

    DEFF Research Database (Denmark)

    Lee, Go-Eun; Cha, Seung-Tae; Shin, Je-Seok

    2012-01-01

    Wind power is ideally a renewable energy source with no fuel cost, but it risks reducing the reliability of the whole system because of the uncertainty of its output. If the reserve of the system is increased, the reliability of the system may be improved; however, the cost would also increase. Therefore, the reserve needs to be estimated considering the trade-off between reliability and economic aspects. This paper suggests a methodology to estimate the appropriate reserve when wind power is connected to the power system. As a case study, when wind power is connected to the power system of Korea, the effects

  11. A large-eddy simulation based power estimation capability for wind farms over complex terrain

    Science.gov (United States)

    Senocak, I.; Sandusky, M.; Deleon, R.

    2017-12-01

    There has been an increasing interest in predicting wind fields over complex terrain at the micro-scale for resource assessment, turbine siting, and power forecasting. These capabilities are made possible by advancements in computational speed from a new generation of computing hardware, numerical methods and physics modelling. The micro-scale wind prediction model presented in this work is based on the large-eddy simulation paradigm with surface-stress parameterization. The complex terrain is represented using an immersed-boundary method that takes into account the parameterization of the surface stresses. Governing equations of incompressible fluid flow are solved using a projection method with second-order accurate schemes in space and time. We use actuator disk models with rotation to simulate the influence of turbines on the wind field. Data regarding power production from individual turbines are mostly restricted because of proprietary nature of the wind energy business. Most studies report percentage drop of power relative to power from the first row. There have been different approaches to predict power production. Some studies simply report available wind power in the upstream, some studies estimate power production using power curves available from turbine manufacturers, and some studies estimate power as torque multiplied by rotational speed. In the present work, we propose a black-box approach that considers a control volume around a turbine and estimate the power extracted from the turbine based on the conservation of energy principle. We applied our wind power prediction capability to wind farms over flat terrain such as the wind farm over Mower County, Minnesota and the Horns Rev offshore wind farm in Denmark. The results from these simulations are in good agreement with published data. We also estimate power production from a hypothetical wind farm in complex terrain region and identify potential zones suitable for wind power production.

  12. An adaptive neuro fuzzy model for estimating the reliability of component-based software systems

    Directory of Open Access Journals (Sweden)

    Kirti Tyagi

    2014-01-01

    Full Text Available Although many algorithms and techniques have been developed for estimating the reliability of component-based software systems (CBSSs), much more research is needed. Accurate estimation of the reliability of a CBSS is difficult because it depends on two factors: component reliability and glue code reliability. Moreover, reliability is a real-world phenomenon with many associated real-time problems. Soft computing techniques can help to solve problems whose solutions are uncertain or unpredictable. A number of soft computing approaches for estimating CBSS reliability have been proposed. These techniques learn from the past and capture existing patterns in data. The two basic elements of soft computing are neural networks and fuzzy logic. In this paper, we propose a model for estimating CBSS reliability, known as an adaptive neuro fuzzy inference system (ANFIS), which is based on these two basic elements of soft computing, and we compare its performance with that of a plain FIS (fuzzy inference system) for different data sets.

  13. Reliance on and Reliability of the Engineer’s Estimate in Heavy Civil Projects

    Directory of Open Access Journals (Sweden)

    George Okere

    2017-06-01

    Full Text Available To the contractor, the engineer’s estimate is the target number to aim for, and the basis for a contractor to evaluate the accuracy of their estimate. To the owner, the engineer’s estimate is the basis for funding, evaluation of bids, and for predicting project costs. As such, the engineer’s estimate is the benchmark. This research sought to investigate the reliance on, and the reliability of, the engineer’s estimate in heavy civil cost estimating. The research objective was to characterize the engineer’s estimate and allow owners and contractors to re-evaluate or affirm their reliance on it. A literature review was conducted to understand the reliance on the engineer’s estimate, and secondary data from the Washington State Department of Transportation were used to investigate the reliability of the engineer’s estimate. The findings show the need for practitioners to re-evaluate their reliance on the engineer’s estimate. The empirical data showed that, within various contexts, the engineer’s estimate fell outside the expected accuracy range of the low bids or the cost to complete projects. The study recommends direct tracking of costs by project owners while projects are under construction, the use of a second estimate to improve the accuracy of estimates, and the use of the cost estimating practices found in highly reputable construction companies.

  14. Sample size planning for composite reliability coefficients: accuracy in parameter estimation via narrow confidence intervals.

    Science.gov (United States)

    Terry, Leann; Kelley, Ken

    2012-11-01

    Composite measures play an important role in psychology and related disciplines. Composite measures almost always have error. Correspondingly, it is important to understand the reliability of the scores from any particular composite measure. However, the point estimates of the reliability of composite measures are fallible and thus all such point estimates should be accompanied by a confidence interval. When confidence intervals are wide, there is much uncertainty in the population value of the reliability coefficient. Given the importance of reporting confidence intervals for estimates of reliability, coupled with the undesirability of wide confidence intervals, we develop methods that allow researchers to plan sample size in order to obtain narrow confidence intervals for population reliability coefficients. We first discuss composite reliability coefficients and then provide a discussion on confidence interval formation for the corresponding population value. Using the accuracy in parameter estimation approach, we develop two methods to obtain accurate estimates of reliability by planning sample size. The first method provides a way to plan sample size so that the expected confidence interval width for the population reliability coefficient is sufficiently narrow. The second method ensures that the confidence interval width will be sufficiently narrow with some desired degree of assurance (e.g., 99% assurance that the 95% confidence interval for the population reliability coefficient will be less than W units wide). The effectiveness of our methods was verified with Monte Carlo simulation studies. We demonstrate how to easily implement the methods with easy-to-use and freely available software. ©2011 The British Psychological Society.

  15. Reliance on and Reliability of the Engineer’s Estimate in Heavy Civil Projects

    OpenAIRE

    Okere, George

    2017-01-01

    To the contractor, the engineer’s estimate is the target number to aim for, and the basis for a contractor to evaluate the accuracy of their estimate. To the owner, the engineer’s estimate is the basis for funding, evaluation of bids, and for predicting project costs. As such the engineer’s estimate is the benchmark. This research sought to investigate the reliance on, and the reliability of the engineer’s estimate in heavy civil cost estimate. The research objective was to characterize the e...

  16. An Energy-Based Limit State Function for Estimation of Structural Reliability in Shock Environments

    Directory of Open Access Journals (Sweden)

    Michael A. Guthrie

    2013-01-01

A limit state function is developed for the estimation of structural reliability in shock environments. This limit state function uses peak modal strain energies to characterize environmental severity and modal strain energies at failure to characterize the structural capacity. The Hasofer-Lind reliability index is briefly reviewed and its computation for the energy-based limit state function is discussed. Applications to two-degree-of-freedom mass-spring systems and to a simple finite element model are considered. For these examples, computation of the reliability index requires little effort beyond a modal analysis, but still accounts for relevant uncertainties in both the structure and environment. For both examples, the reliability index is observed to agree well with the results of Monte Carlo analysis. In situations where fast, qualitative comparison of several candidate designs is required, the reliability index based on the proposed limit state function provides an attractive metric which can be used to compare and control reliability.
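
    For the special case of a linear limit state with independent normal variables, the Hasofer-Lind index reduces to a one-line computation; the capacity and demand statistics below are hypothetical energy values, not numbers from the paper.

```python
from math import sqrt
from scipy.stats import norm

# Capacity C: modal strain energy at failure; demand D: peak modal strain energy.
mu_c, sd_c = 12.0, 1.5   # hypothetical mean / standard deviation of capacity
mu_d, sd_d = 8.0, 2.0    # hypothetical mean / standard deviation of demand

# Hasofer-Lind index for the linear limit state g = C - D
beta = (mu_c - mu_d) / sqrt(sd_c**2 + sd_d**2)
print(beta, norm.cdf(-beta))   # index and corresponding failure probability
```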

  17. Advanced RESTART method for the estimation of the probability of failure of highly reliable hybrid dynamic systems

    International Nuclear Information System (INIS)

    Turati, Pietro; Pedroni, Nicola; Zio, Enrico

    2016-01-01

The efficient estimation of system reliability characteristics is of paramount importance for many engineering applications. Real world system reliability modeling calls for the capability of treating systems that are: i) dynamic, ii) complex, iii) hybrid and iv) highly reliable. Advanced Monte Carlo (MC) methods offer a way to solve these types of problems at computational costs that remain feasible. In this paper, the REpetitive Simulation Trials After Reaching Thresholds (RESTART) method is employed, extending it to hybrid systems for the first time (to the authors’ knowledge). The estimation accuracy and precision of RESTART highly depend on the choice of the Importance Function (IF) indicating how close the system is to failure: in this respect, proper IFs are here originally proposed to improve the performance of RESTART for the analysis of hybrid systems. The resulting overall simulation approach is applied to estimate the probability of failure of the control system of a liquid hold-up tank and of a pump-valve subsystem subject to degradation induced by fatigue. The results are compared to those obtained by standard MC simulation and by RESTART with classical IFs available in the literature. The comparison shows the improvement in the performance obtained by our approach. - Highlights: • We consider the issue of estimating small failure probabilities in dynamic systems. • We employ the RESTART method to estimate the failure probabilities. • New Importance Functions (IFs) are introduced to increase the method performance. • We adopt two dynamic, hybrid, highly reliable systems as case studies. • A comparison with literature IFs proves the effectiveness of the new IFs.
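
    RESTART itself is more elaborate (trajectories are split and killed on the fly), but the fixed-effort multilevel-splitting sketch below conveys the core idea: the importance function here is simply the current value of a random walk, and the rare-event probability is the product of conditional level-crossing fractions. All thresholds and sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def advance(x, t, level, T):
    """Run a standard-normal random walk from state (x, t) until it crosses
    `level` (the importance-function threshold) or the horizon T."""
    while t < T:
        x += rng.normal()
        t += 1
        if x >= level:
            return True, x, t
    return False, x, t

def splitting(levels, T=50, n=2000):
    """Fixed-effort splitting estimate of P(max of the walk >= levels[-1])."""
    particles = [(0.0, 0)] * n
    p_hat = 1.0
    for level in levels:
        hits = []
        for x0, t0 in particles:
            ok, x, t = advance(x0, t0, level, T)
            if ok:
                hits.append((x, t))
        if not hits:
            return 0.0
        p_hat *= len(hits) / n
        idx = rng.integers(0, len(hits), size=n)   # resample survivors back to n
        particles = [hits[i] for i in idx]
    return p_hat

print(splitting(levels=[2.0, 4.0, 6.0, 8.0]))
```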

  18. Root biomass in cereals, catch crops and weeds can be reliably estimated without considering aboveground biomass

    DEFF Research Database (Denmark)

    Hu, Teng; Sørensen, Peter; Wahlström, Ellen Margrethe

    2018-01-01

    and management factors may affect this allometric relationship making such estimates uncertain and biased. Therefore, we aimed to explore how root biomass for typical cereal crops, catch crops and weeds could most reliably be estimated. Published and unpublished data on aboveground and root biomass (corrected...

  19. Parameter estimation of component reliability models in PSA model of Krsko NPP

    International Nuclear Information System (INIS)

    Jordan Cizelj, R.; Vrbanic, I.

    2001-01-01

In the paper, the uncertainty analysis of component reliability models for independent failures is shown. The present approach for parameter estimation of component reliability models in NPP Krsko is presented. Mathematical approaches for different types of uncertainty analyses are introduced and used in accordance with some predisposed requirements. Results of the uncertainty analyses are shown in an example for time-related components. Bayesian estimation with numerical evaluation of the posterior, which can then be approximated by an appropriate probability distribution (in this paper, the lognormal distribution), proved to be the most appropriate uncertainty analysis. (author)
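
    A minimal conjugate stand-in for the Bayesian step (the paper evaluates the posterior numerically and approximates it by a lognormal): a Gamma prior on a time-related failure rate updated with a Poisson failure count. Prior and evidence values are hypothetical.

```python
from scipy import stats

# Conjugate Gamma prior for a constant failure rate (Poisson failure counts)
a0, b0 = 1.5, 3.0e5          # hypothetical prior shape and rate [h]
failures, hours = 2, 1.2e5   # hypothetical plant-specific evidence

post = stats.gamma(a=a0 + failures, scale=1.0 / (b0 + hours))
print(post.mean(), post.interval(0.90))   # posterior mean and 90% interval
```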

  20. Location estimation of approaching objects is modulated by the observer's inherent and momentary action capabilities

    NARCIS (Netherlands)

    Kandula, Manasa; Hofman, Dennis; Dijkerman, H Chris

    2016-01-01

    Action capability may be one of the factors that can influence our percept of the world. A distinction can be made between momentary action capability (action capability at that particular moment) and inherent action capability (representing a stable action capability). In the current study, we

  1. How Many Sleep Diary Entries Are Needed to Reliably Estimate Adolescent Sleep?

    Science.gov (United States)

    Arora, Teresa; Gradisar, Michael; Taheri, Shahrad; Carskadon, Mary A.

    2017-01-01

Study Objectives: To investigate (1) how many nights of sleep diary entries are required for reliable estimates of five sleep-related outcomes (bedtime, wake time, sleep onset latency [SOL], sleep duration, and wake after sleep onset [WASO]) and (2) the test–retest reliability of sleep diary estimates of school night sleep across 12 weeks. Methods: Data were drawn from four adolescent samples (Australia [n = 385], Qatar [n = 245], United Kingdom [n = 770], and United States [n = 366]), who provided 1766 eligible sleep diary weeks for reliability analyses. We performed reliability analyses for each cohort using complete data (7 days), one to five school nights, and one to two weekend nights. We also performed test–retest reliability analyses on 12-week sleep diary data available from a subgroup of 55 US adolescents. Results: Intraclass correlation coefficients for bedtime, SOL, and sleep duration indicated good-to-excellent reliability from five weekday nights of sleep diary entries across all adolescent cohorts. Four school nights was sufficient for wake times in the Australian and UK samples, but not the US or Qatari samples. Only Australian adolescents showed good reliability for two weekend nights of bedtime reports; estimates of SOL were adequate for UK adolescents based on two weekend nights. WASO was not reliably estimated using 1 week of sleep diaries. We observed excellent test–retest reliability across 12 weeks of sleep diary data in a subsample of US adolescents. Conclusion: We recommend that at least five weekday nights of sleep diary entries be made when studying adolescent bedtimes, SOL, and sleep duration. Adolescent sleep patterns were stable across 12 consecutive school weeks. PMID:28199718
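
    The intraclass correlation coefficients reported above can be computed directly from a subjects-by-nights matrix; a sketch of Shrout-Fleiss ICC(2,1) with simulated diary data (all numbers hypothetical):

```python
import numpy as np

def icc_2_1(X):
    """Shrout-Fleiss ICC(2,1): two-way random effects, single measure,
    absolute agreement. X: (n_subjects, k_nights) array."""
    n, k = X.shape
    grand = X.mean()
    rows = X.mean(axis=1)   # subject means
    cols = X.mean(axis=0)   # night means
    msr = k * ((rows - grand) ** 2).sum() / (n - 1)
    msc = n * ((cols - grand) ** 2).sum() / (k - 1)
    mse = ((X - rows[:, None] - cols[None, :] + grand) ** 2).sum() / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

rng = np.random.default_rng(0)
# hypothetical: 30 adolescents x 5 school nights of sleep duration (minutes)
sleep = rng.normal(480, 40, size=(30, 1))[:, [0] * 5] + rng.normal(0, 20, (30, 5))
print(icc_2_1(sleep))
```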

  2. A Data-Driven Reliability Estimation Approach for Phased-Mission Systems

    Directory of Open Access Journals (Sweden)

    Hua-Feng He

    2014-01-01

We attempt to address the issues associated with reliability estimation for phased-mission systems (PMS) and present a novel data-driven approach to achieve reliability estimation for PMS using the condition monitoring information and degradation data of such systems under dynamic operating scenarios. In this sense, this paper differs from the existing methods, which consider only the static scenario without using real-time information and aim to estimate the reliability for a population rather than for an individual. In the presented approach, to establish a linkage between the historical data and real-time information of the individual PMS, we adopt a stochastic filtering model to model the phase duration and obtain the updated estimation of the mission time by Bayesian law at each phase. Meanwhile, the lifetime of the PMS is estimated from degradation data, which are modeled by an adaptive Brownian motion. As such, the mission reliability can be obtained in real time through the estimated distribution of the mission time in conjunction with the estimated lifetime distribution. We demonstrate the usefulness of the developed approach via a numerical example.
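
    For the degradation half of such an approach, the first passage time of a drifted Brownian motion to a failure threshold has a closed-form inverse-Gaussian distribution, so a lifetime-based reliability can be read off directly. The drift, noise, and threshold below are assumptions, and the paper's adaptive model goes beyond this fixed-parameter sketch.

```python
from scipy import stats

# Lifetime = first passage of X(t) = drift*t + sigma*W(t) to threshold D:
#   T ~ InverseGaussian(mean = D/drift, shape = D**2 / sigma**2)
drift, sigma, D = 0.05, 0.2, 10.0          # hypothetical degradation parameters
m, lam = D / drift, D**2 / sigma**2
T = stats.invgauss(mu=m / lam, scale=lam)  # scipy's (mu, scale) parameterization

t_mission = 150.0
print("R(t_mission) =", T.sf(t_mission))   # reliability at the mission time
```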

  3. Bayesian and Classical Estimation of Stress-Strength Reliability for Inverse Weibull Lifetime Models

    Directory of Open Access Journals (Sweden)

    Qixuan Bi

    2017-06-01

In this paper, we consider the problem of estimating stress-strength reliability for inverse Weibull lifetime models having the same shape parameters but different scale parameters. We obtain the maximum likelihood estimator and its asymptotic distribution. Since the classical estimator does not have an explicit form, we propose an approximate maximum likelihood estimator. The asymptotic confidence interval and two bootstrap intervals are obtained. Using the Gibbs sampling technique, the Bayesian estimator and the corresponding credible interval are obtained. The Metropolis-Hastings algorithm is used to generate random variates. Monte Carlo simulations are conducted to compare the proposed methods. Analysis of a real dataset is performed.
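
    For the equal-shape case the stress-strength reliability has a simple closed form, R = P(Y < X) = λx/(λx + λy) for survival scales λx, λy, which a quick Monte Carlo check reproduces; the parameter values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def rinvweibull(n, shape, lam):
    """Sample from the inverse Weibull CDF F(x) = exp(-lam * x**(-shape))."""
    u = rng.uniform(size=n)
    return (-np.log(u) / lam) ** (-1.0 / shape)

beta, lam_x, lam_y = 2.0, 3.0, 1.5   # common shape, different scales
x = rinvweibull(100_000, beta, lam_x)   # strength
y = rinvweibull(100_000, beta, lam_y)   # stress

print("Monte Carlo estimate:", (y < x).mean())
print("closed form lam_x/(lam_x+lam_y):", lam_x / (lam_x + lam_y))
```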

  4. Estimating Between-Person and Within-Person Subscore Reliability with Profile Analysis.

    Science.gov (United States)

    Bulut, Okan; Davison, Mark L; Rodriguez, Michael C

    2017-01-01

    Subscores are of increasing interest in educational and psychological testing due to their diagnostic function for evaluating examinees' strengths and weaknesses within particular domains of knowledge. Previous studies about the utility of subscores have mostly focused on the overall reliability of individual subscores and ignored the fact that subscores should be distinct and have added value over the total score. This study introduces a profile reliability approach that partitions the overall subscore reliability into within-person and between-person subscore reliability. The estimation of between-person reliability and within-person reliability coefficients is demonstrated using subscores from number-correct scoring, unidimensional and multidimensional item response theory scoring, and augmented scoring approaches via a simulation study and a real data study. The effects of various testing conditions, such as subtest length, correlations among subscores, and the number of subtests, are examined. Results indicate that there is a substantial trade-off between within-person and between-person reliability of subscores. Profile reliability coefficients can be useful in determining the extent to which subscores provide distinct and reliable information under various testing conditions.

  5. Integrated Reliability Estimation of a Nuclear Maintenance Robot including a Software

    Energy Technology Data Exchange (ETDEWEB)

    Eom, Heung Seop; Kim, Jae Hee; Jeong, Kyung Min [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2011-10-15

Conventional reliability estimation techniques such as Fault Tree Analysis (FTA), Reliability Block Diagram (RBD), Markov Model, and Event Tree Analysis (ETA) have been used widely and are approved in some industries. However, they have limitations when applied to complicated robot systems that include software, such as intelligent reactor inspection robots. Therefore, an expert's judgment plays an important role in estimating the reliability of a complicated system in practice, because experts can deal with diverse evidence related to the reliability and then perform an inference based on it. The method proposed in this paper combines qualitative and quantitative evidence and performs an inference like experts do. Furthermore, unlike human experts, it does the work in a formal and quantitative way, by the benefits of Bayesian Nets (BNs)
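
    A toy version of the evidence-combination idea, reduced to two conditionally independent evidence sources feeding one node of a miniature Bayesian net; every probability below is a made-up placeholder, not a value from the paper.

```python
# Prior belief that the software is defect-free, updated by two evidence
# sources assumed conditionally independent given the true state.
p_good = 0.9                                       # hypothetical prior
p_pass_given_good, p_pass_given_bad = 0.95, 0.40   # test-result likelihoods
p_ok_given_good, p_ok_given_bad = 0.85, 0.30       # expert-judgment likelihoods

# Evidence observed: the test passed and the expert judged the design sound.
num = p_good * p_pass_given_good * p_ok_given_good
den = num + (1 - p_good) * p_pass_given_bad * p_ok_given_bad
print("posterior P(good | evidence) =", num / den)
```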

  6. Human error data collection as a precursor to the development of a human reliability assessment capability in air traffic management

    International Nuclear Information System (INIS)

    Kirwan, Barry; Gibson, W. Huw; Hickling, Brian

    2008-01-01

    Quantified risk and safety assessments are now required for safety cases for European air traffic management (ATM) services. Since ATM is highly human-dependent for its safety, this suggests a need for formal human reliability assessment (HRA), as carried out in other industries such as nuclear power. Since the fundamental aspect of HRA is human error data, in the form of human error probabilities (HEPs), it was decided to take a first step towards development of an ATM HRA approach by deriving some HEPs in an ATM context. This paper reports a study, which collected HEPs via analysing the results of a real-time simulation involving air traffic controllers (ATCOs) and pilots, with a focus on communication errors. This study did indeed derive HEPs that were found to be concordant with other known communication human error data. This is a first step, and shows promise for HRA in ATM, since HEPs have been derived which could be used in safety assessments, although these HEPs are for only one (albeit critical) aspect of ATCOs' tasks (communications). The paper discusses options and potential ways forward for the development of a full HRA capability in ATM
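
    The basic quantity involved is straightforward to compute from tallies of errors and opportunities; the sketch below adds a Jeffreys-prior interval, one common convention for rare-event probabilities. The counts are hypothetical, not the study's data.

```python
from scipy import stats

errors, opportunities = 14, 28_000   # hypothetical communication-error tally
hep = errors / opportunities         # point estimate of the HEP

# Jeffreys (Beta(0.5, 0.5) prior) 95% interval for a rare-event probability
lo, hi = stats.beta(errors + 0.5, opportunities - errors + 0.5).interval(0.95)
print(f"HEP = {hep:.2e}, 95% interval = ({lo:.2e}, {hi:.2e})")
```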

  7. "A Comparison of Consensus, Consistency, and Measurement Approaches to Estimating Interrater Reliability"

    OpenAIRE

    Steven E. Stemler

    2004-01-01

This article argues that the general practice of describing interrater reliability as a single, unified concept is at best imprecise, and at worst potentially misleading. Rather than representing a single concept, different statistical methods for computing interrater reliability can be more accurately classified into one of three categories based upon the underlying goals of analysis. The three general categories introduced and described in this paper are: 1) consensus estimates, 2) cons...
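
    The distinction can be made concrete with two of the categories: a consensus estimate (exact percent agreement) and a consistency estimate (inter-rater correlation) computed on the same toy ratings can differ noticeably.

```python
import numpy as np

r1 = np.array([1, 2, 2, 3, 1, 2, 3, 1])   # hypothetical ratings, rater 1
r2 = np.array([1, 2, 3, 3, 1, 1, 3, 1])   # hypothetical ratings, rater 2

agreement = (r1 == r2).mean()              # consensus estimate
consistency = np.corrcoef(r1, r2)[0, 1]    # consistency estimate

print(agreement, consistency)
```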

  8. Reliability estimation for multiunit nuclear and fossil-fired industrial energy systems

    International Nuclear Information System (INIS)

    Sullivan, W.G.; Wilson, J.V.; Klepper, O.H.

    1977-01-01

    As petroleum-based fuels grow increasingly scarce and costly, nuclear energy may become an important alternative source of industrial energy. Initial applications would most likely include a mix of fossil-fired and nuclear sources of process energy. A means for determining the overall reliability of these mixed systems is a fundamental aspect of demonstrating their feasibility to potential industrial users. Reliability data from nuclear and fossil-fired plants are presented, and several methods of applying these data for calculating the reliability of reasonably complex industrial energy supply systems are given. Reliability estimates made under a number of simplifying assumptions indicate that multiple nuclear units or a combination of nuclear and fossil-fired plants could provide adequate reliability to meet industrial requirements for continuity of service
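
    Under the simplifying assumption of independent units, the probability of meeting a process-energy demand from a nuclear/fossil mix can be found by enumerating unit states; the capacities, availabilities, and demand below are invented for illustration.

```python
from itertools import product

# Hypothetical unit mix: (label, capacity in MW(th), availability)
units = [("nuclear", 400, 0.90), ("nuclear", 400, 0.90), ("fossil", 250, 0.85)]
demand = 650.0   # hypothetical process-energy demand

p_meet = 0.0
for states in product([0, 1], repeat=len(units)):   # all up/down combinations
    p, cap = 1.0, 0.0
    for up, (_, mw, avail) in zip(states, units):
        p *= avail if up else 1 - avail
        cap += mw * up
    if cap >= demand:
        p_meet += p
print("P(demand met) =", p_meet)
```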

  9. Reliability estimation system: its application to the nuclear geophysical sampling of ore deposits

    International Nuclear Information System (INIS)

    Khaykovich, I.M.; Savosin, S.I.

    1992-01-01

The reliability estimation system accepted in the Soviet Union for sampling data in nuclear geophysics is based on unique requirements in metrology and methodology. It involves estimating characteristic errors in calibration, as well as errors in measurement and interpretation. This paper describes the methods of estimating the levels of systematic and random errors at each stage of the problem. The data of nuclear geophysics sampling are considered to be reliable if there are no statistically significant, systematic differences between ore intervals determined by this method and by geological control, or by other methods of sampling whose reliability has been verified, and if the difference between the random errors is statistically insignificant. The system allows one to obtain information on the parameters of ore intervals with a guaranteed random error and without systematic errors. (Author)

  10. Stochastic models and reliability parameter estimation applicable to nuclear power plant safety

    International Nuclear Information System (INIS)

    Mitra, S.P.

    1979-01-01

A set of stochastic models and related estimation schemes for reliability parameters are developed. The models are applicable for evaluating reliability of nuclear power plant systems. Reliability information is extracted from model parameters which are estimated from the type and nature of failure data that is generally available or could be compiled in nuclear power plants. Principally, two aspects of nuclear power plant reliability have been investigated: (1) The statistical treatment of inplant component and system failure data; (2) The analysis and evaluation of common mode failures. The model inputs are failure data which have been classified as either the time type of failure data or the demand type of failure data. Failures of components and systems in nuclear power plants are, in general, rare events. This gives rise to sparse failure data. Estimation schemes for treating sparse data, whenever necessary, have been considered. The following five problems have been studied: 1) Distribution of sparse failure rate component data. 2) Failure rate inference and reliability prediction from time type of failure data. 3) Analyses of demand type of failure data. 4) Common mode failure model applicable to time type of failure data. 5) Estimation of common mode failures from 'near-miss' demand type of failure data
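
    For the demand-type data, a conjugate Beta update is one compact way to handle sparse failure counts; the prior and counts here are hypothetical, and this is a generic sketch rather than the models developed in the work.

```python
from scipy import stats

# Demand-type failures: Beta update for a failure-on-demand probability
a0, b0 = 0.5, 49.5        # hypothetical prior (mean 0.01)
fails, demands = 1, 320   # hypothetical sparse plant records

post = stats.beta(a0 + fails, b0 + demands - fails)
print(post.mean(), post.interval(0.90))
```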

  11. The relationship between cost estimates reliability and BIM adoption: SEM analysis

    Science.gov (United States)

    Ismail, N. A. A.; Idris, N. H.; Ramli, H.; Rooshdi, R. R. Raja Muhammad; Sahamir, S. R.

    2018-02-01

This paper presents the usage of the Structural Equation Modelling (SEM) approach in analysing the effects of Building Information Modelling (BIM) technology adoption on improving the reliability of cost estimates. Based on the questionnaire survey results, SEM analysis using the SPSS-AMOS application examined the relationships between BIM-improved information and cost estimates reliability factors, leading to BIM technology adoption. Six hypotheses were established prior to SEM analysis employing two types of SEM models, namely the Confirmatory Factor Analysis (CFA) model and the full structural model. The SEM models were then validated through the assessment of their uni-dimensionality, validity, reliability, and fitness index, in line with the hypotheses tested. The final SEM model fit measures are: P-value=0.000, RMSEA=0.079 (<0.08), TLI=0.956 (>0.90), NFI=0.935 (>0.90) and ChiSq/df=2.259, indicating that the overall index values achieved the required level of model fitness. The model supports all the hypotheses evaluated, confirming that all relationships amongst the constructs are positive and significant. Ultimately, the analysis verified that most of the respondents foresee a better understanding of project input information through BIM visualization, its reliable database and coordinated data, in developing more reliable cost estimates. They also expect to accelerate their cost estimating tasks through BIM adoption.

  12. Estimated Value of Service Reliability for Electric Utility Customers in the United States

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, M.J.; Mercurio, Matthew; Schellenberg, Josh

    2009-06-01

Information on the value of reliable electricity service can be used to assess the economic efficiency of investments in generation, transmission and distribution systems, to strategically target investments to customer segments that receive the most benefit from system improvements, and to numerically quantify the risk associated with different operating, planning and investment strategies. This paper summarizes research designed to provide estimates of the value of service reliability for electricity customers in the US. These estimates were obtained by analyzing the results from 28 customer value of service reliability studies conducted by 10 major US electric utilities over the 16-year period from 1989 to 2005. Because these studies used nearly identical interruption cost estimation or willingness-to-pay/accept methods, it was possible to integrate their results into a single meta-database describing the value of electric service reliability observed in all of them. Once the datasets from the various studies were combined, a two-part regression model was used to estimate customer damage functions that can be generally applied to calculate customer interruption costs per event by season, time of day, day of week, and geographical regions within the US for industrial, commercial, and residential customers. Estimated interruption costs for different types of customers and for interruptions of different durations are provided. Finally, additional research and development designed to expand the usefulness of this powerful database and analysis are suggested.

  13. Integration of Active and Passive Safety Technologies--A Method to Study and Estimate Field Capability.

    Science.gov (United States)

    Hu, Jingwen; Flannagan, Carol A; Bao, Shan; McCoy, Robert W; Siasoco, Kevin M; Barbat, Saeed

    2015-11-01

    The objective of this study is to develop a method that uses a combination of field data analysis, naturalistic driving data analysis, and computational simulations to explore the potential injury reduction capabilities of integrating passive and active safety systems in frontal impact conditions. For the purposes of this study, the active safety system is actually a driver assist (DA) feature that has the potential to reduce delta-V prior to a crash, in frontal or other crash scenarios. A field data analysis was first conducted to estimate the delta-V distribution change based on an assumption of 20% crash avoidance resulting from a pre-crash braking DA feature. Analysis of changes in driver head location during 470 hard braking events in a naturalistic driving study found that drivers' head positions were mostly in the center position before the braking onset, while the percentage of time drivers leaning forward or backward increased significantly after the braking onset. Parametric studies with a total of 4800 MADYMO simulations showed that both delta-V and occupant pre-crash posture had pronounced effects on occupant injury risks and on the optimal restraint designs. By combining the results for the delta-V and head position distribution changes, a weighted average of injury risk reduction of 17% and 48% was predicted by the 50th percentile Anthropomorphic Test Device (ATD) model and human body model, respectively, with the assumption that the restraint system can adapt to the specific delta-V and pre-crash posture. This study demonstrated the potential for further reducing occupant injury risk in frontal crashes by the integration of a passive safety system with a DA feature. Future analyses considering more vehicle models, various crash conditions, and variations of occupant characteristics, such as age, gender, weight, and height, are necessary to further investigate the potential capability of integrating passive and DA or active safety systems.

  14. What are estimated reimbursements for lower extremity prostheses capable of surgical and nonsurgical lengthening?

    Science.gov (United States)

    Henderson, Eric R; Pepper, Andrew M; Letson, G Douglas

    2012-04-01

    Growing prostheses accommodate skeletally immature patients with bone tumors undergoing limb-preserving surgery. Early devices required surgical procedures for lengthening; recent devices lengthen without surgery. Expenses for newer expandable devices that lengthen without surgery are more than for their predecessors but overall reimbursement amounts are not known. We sought to determine reimbursement amounts associated with lengthening of growing prostheses requiring surgical and nonsurgical lengthening. We retrospectively reviewed 17 patients with growing prostheses requiring surgical expansion and eight patients with prostheses capable of nonsurgical expansion. Insurance documents were reviewed to determine the reimbursement for implantation, lengthening, and complications. Growth data were obtained from the literature. Mean reimbursement amounts of surgical and nonsurgical lengthenings were $9950 and $272, respectively. Estimated reimbursements associated with implantation of a growing prosthesis varied depending on age, sex, and location. The largest difference was found for 4-year-old boys with distal femoral replacement where reimbursement for expansion to maturity for surgical and nonsurgical lengthening prostheses would be $379,000 and $208,000, respectively. For children requiring more than one surgical expansion, net reimbursements were lower when a noninvasive lengthening device was used. Annual per-prosthesis maintenance reimbursements to address complications for surgical and nonsurgical lengthening prostheses were $3386 and $1856, respectively. This study showed that reimbursements for lengthening of growing endoprostheses capable of nonsurgical expansion may be less expensive in younger patients, particularly male patients undergoing distal femur replacement, than endoprostheses requiring surgical lengthening. Longer outcomes studies are required to see if reimbursements for complications differ between devices. Level III, economic and decision

  15. Feasibility and reliability of digital imaging for estimating food selection and consumption from students' packed lunches.

    Science.gov (United States)

    Taylor, Jennifer C; Sutter, Carolyn; Ontai, Lenna L; Nishina, Adrienne; Zidenberg-Cherr, Sheri

    2018-01-01

Although increasing attention is placed on the quality of foods in children's packed lunches, few studies have examined the capacity of observational methods to reliably determine both what is selected and consumed from these lunches. The objective of this project was to assess the feasibility and inter-rater reliability of digital imaging for determining selection and consumption from students' packed lunches, by adapting approaches previously applied to school lunches. Study 1 assessed feasibility and reliability of data collection among a sample of packed lunches (n = 155), while Study 2 further examined reliability in a larger sample of packed (n = 386) as well as school (n = 583) lunches. Based on the results from Study 1, it was feasible to collect and code most items in packed lunch images; missing data were most commonly attributed to packaging that limited visibility of contents. Across both studies, there was satisfactory reliability for determining food types selected, quantities selected, and quantities consumed in the eight food categories examined (weighted kappa coefficients 0.68-0.97 for packed lunches, 0.74-0.97 for school lunches), with lowest reliability for estimating condiments and meats/meat alternatives in packed lunches. In extending methods predominately applied to school lunches, these findings demonstrate the capacity of digital imaging for the objective estimation of selection and consumption from both school and packed lunches.

  16. Reliability of Semiautomated Computational Methods for Estimating Tibiofemoral Contact Stress in the Multicenter Osteoarthritis Study

    Directory of Open Access Journals (Sweden)

    Donald D. Anderson

    2012-01-01

Recent findings suggest that contact stress is a potent predictor of subsequent symptomatic osteoarthritis development in the knee. However, much larger numbers of knees (likely on the order of hundreds, if not thousands) need to be reliably analyzed to achieve the statistical power necessary to clarify this relationship. This study assessed the reliability of new semiautomated computational methods for estimating contact stress in knees from large population-based cohorts. Ten knees of subjects from the Multicenter Osteoarthritis Study were included. Bone surfaces were manually segmented from sequential 1.0 Tesla magnetic resonance imaging slices by three individuals on two nonconsecutive days. Four individuals then registered the resulting bone surfaces to corresponding bone edges on weight-bearing radiographs, using a semi-automated algorithm. Discrete element analysis methods were used to estimate contact stress distributions for each knee. Segmentation and registration reliabilities (day-to-day and interrater) for peak and mean medial and lateral tibiofemoral contact stress were assessed with Shrout-Fleiss intraclass correlation coefficients (ICCs). The segmentation and registration steps of the modeling approach were found to have excellent day-to-day (ICCs 0.93–0.99) and good inter-rater (ICCs 0.84–0.97) reliability. This approach for estimating compartment-specific tibiofemoral contact stress appears to be sufficiently reliable for use in large population-based cohorts.

  17. Safeprops: A Software for Fast and Reliable Estimation of Safety and Environmental Properties for Organic Compounds

    DEFF Research Database (Denmark)

    Jones, Mark Nicholas; Frutiger, Jerome; Abildskov, Jens

We present a new software tool called SAFEPROPS which is able to estimate major safety-related and environmental properties for organic compounds. SAFEPROPS provides accurate, reliable and fast predictions using the Marrero-Gani group contribution (MG-GC) method. It is implemented using Python as the main programming language, while the necessary parameters together with their correlation matrix are obtained from an SQLite database which has been populated using off-line parameter and error estimation routines (Eq. 3-8).
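
    In spirit, a group-contribution estimate is a weighted sum over a molecule's functional groups; the sketch below shows that shape only, with invented group names and values rather than actual Marrero-Gani parameters.

```python
# First-order group-contribution shape only; values are NOT real MG-GC parameters.
contributions = {"CH3": 23.6, "CH2": 22.9, "OH": 92.9}   # hypothetical group values
molecule = {"CH3": 1, "CH2": 2, "OH": 1}                  # group counts in the molecule

estimate = sum(contributions[g] * n for g, n in molecule.items())
print(estimate)
```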

  18. ESTIMATING RELIABILITY OF DISTURBANCES IN SATELLITE TIME SERIES DATA BASED ON STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Z.-G. Zhou

    2016-06-01

Normally, the status of land cover is inherently dynamic and changes continuously on the temporal scale. However, disturbances or abnormal changes of land cover (caused by, for example, forest fire, flood, deforestation, and plant diseases) occur worldwide at unknown times and locations. Timely detection and characterization of these disturbances is of importance for land cover monitoring. Recently, many time-series-analysis methods have been developed for near real-time or online disturbance detection using satellite image time series. However, most of the present methods only label the detection results as "change/no change", while few methods focus on estimating the reliability (or confidence level) of the detected disturbances in image time series. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series, through analysis of the full temporal information contained in the time series data. The method consists of three main steps: (1) segmenting and modelling historical time series data based on Breaks for Additive Seasonal and Trend (BFAST); (2) forecasting and detecting disturbances in new time series data; (3) estimating the reliability of each detected disturbance using statistical analysis based on Confidence Intervals (CI) and Confidence Levels (CL). The method was validated by estimating the reliability of disturbance regions caused by a recent severe flood that occurred around the border of Russia and China. Results demonstrated that the method can estimate the reliability of disturbances detected in satellite images with an estimation error of less than 5% and an overall accuracy of up to 90%.
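
    One way to express step (3): score each detected disturbance by the confidence level at which the observation falls outside the forecast's symmetric prediction interval, assuming Gaussian forecast errors. The sample values below are hypothetical.

```python
from scipy import stats

def disturbance_confidence(observed, forecast_mean, forecast_std):
    """Confidence level at which `observed` lies outside the symmetric
    prediction interval of the forecast (a CI-based reliability score)."""
    z = abs(observed - forecast_mean) / forecast_std
    return 2 * stats.norm.cdf(z) - 1

# hypothetical NDVI values: observed drop vs. forecast
print(disturbance_confidence(0.31, 0.55, 0.08))
```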

  19. Estimating the Reliability of Aggregated and Within-Person Centered Scores in Ecological Momentary Assessment

    Science.gov (United States)

    Huang, Po-Hsien; Weng, Li-Jen

    2012-01-01

    A procedure for estimating the reliability of test scores in the context of ecological momentary assessment (EMA) was proposed to take into account the characteristics of EMA measures. Two commonly used test scores in EMA were considered: the aggregated score (AGGS) and the within-person centered score (WPCS). Conceptually, AGGS and WPCS represent…

  20. Nonparametric Estimation of Interval Reliability for Discrete-Time Semi-Markov Systems

    DEFF Research Database (Denmark)

    Georgiadis, Stylianos; Limnios, Nikolaos

    2016-01-01

    In this article, we consider a repairable discrete-time semi-Markov system with finite state space. The measure of the interval reliability is given as the probability of the system being operational over a given finite-length time interval. A nonparametric estimator is proposed for the interval...

  1. Uncertainty in reliability estimation : when do we know everything we know?

    NARCIS (Netherlands)

    Houben, M.J.H.A.; Sonnemans, P.J.M.; Newby, M.J.; Bris, R.; Guedes Soares, C.; Martorell, S.

    2009-01-01

In this paper we demonstrate the use of an adapted Grounded Theory approach through interviews and their analysis to determine explicit uncertainty (known unknowns) for reliability estimation in the early phases of product development. We have applied the adapted Grounded Theory approach in a case

  2. Estimating reliability coefficients with heterogeneous item weightings using Stata: A factor based approach

    NARCIS (Netherlands)

    Boermans, M.A.; Kattenberg, M.A.C.

    2011-01-01

We show how to estimate a Cronbach's alpha reliability coefficient in Stata after running a principal component or factor analysis. Alpha evaluates to what extent items measure the same underlying content when the items are combined into a scale or used for a latent variable. Stata allows for testing
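
    For reference, the unweighted coefficient alpha that such routines generalize takes only a few lines in any language; a plain Python version on a respondents-by-items matrix:

```python
import numpy as np

def cronbach_alpha(X):
    """Coefficient alpha for an (n_respondents, k_items) score matrix."""
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = X.sum(axis=1).var(ddof=1)      # variance of the scale score
    return k / (k - 1) * (1 - item_var / total_var)
```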

  3. Can a sample of Landsat sensor scenes reliably estimate the global extent of tropical deforestation?

    Science.gov (United States)

    R. L. Czaplewski

    2003-01-01

Tucker and Townshend (2000) conclude that wall-to-wall coverage is needed to avoid gross errors in estimations of deforestation rates, because tropical deforestation is concentrated along roads and rivers. They specifically question the reliability of the 10% sample of Landsat sensor scenes used in the global remote sensing survey conducted by the Food and...

  4. Perceptual and Acoustic Reliability Estimates for the Speech Disorders Classification System (SDCS)

    Science.gov (United States)

    Shriberg, Lawrence D.; Fourakis, Marios; Hall, Sheryl D.; Karlsson, Heather B.; Lohmeier, Heather L.; McSweeny, Jane L.; Potter, Nancy L.; Scheer-Cohen, Alison R.; Strand, Edythe A.; Tilkens, Christie M.; Wilson, David L.

    2010-01-01

    A companion paper describes three extensions to a classification system for paediatric speech sound disorders termed the Speech Disorders Classification System (SDCS). The SDCS uses perceptual and acoustic data reduction methods to obtain information on a speaker's speech, prosody, and voice. The present paper provides reliability estimates for…

  5. Integration of external estimated breeding values and associated reliabilities using correlations among traits and effects

    NARCIS (Netherlands)

    Vandenplas, J.; Colinet, F.G.; Glorieux, G.; Bertozzi, C.; Gengler, N.

    2015-01-01

    Based on a Bayesian view of linear mixed models, several studies showed the possibilities to integrate estimated breeding values (EBV) and associated reliabilities (REL) provided by genetic evaluations performed outside a given evaluation system into this genetic evaluation. Hereafter, the term

  6. Reliability of stellar inclination estimated from asteroseismology: analytical criteria, mock simulations and Kepler data analysis

    Science.gov (United States)

    Kamiaka, Shoya; Benomar, Othman; Suto, Yasushi

    2018-05-01

Advances in asteroseismology of solar-like stars now provide a unique method to estimate the stellar inclination i⋆. This makes it possible to evaluate the spin-orbit angle of transiting planetary systems, in a complementary fashion to the Rossiter-McLaughlin effect, a well-established method to estimate the projected spin-orbit angle λ. Although the asteroseismic method has been broadly applied to the Kepler data, its reliability has yet to be assessed intensively. In this work, we evaluate the accuracy of i⋆ from asteroseismology of solar-like stars using 3000 simulated power spectra. We find that the low signal-to-noise ratio of the power spectra induces a systematic under-estimate (over-estimate) bias for stars with high (low) inclinations. We derive analytical criteria for a reliable asteroseismic estimate, which indicate that reliable measurements are possible in the range of 20° ≲ i⋆ ≲ 80° only for stars with a high signal-to-noise ratio. We also analyse and measure the stellar inclination of 94 Kepler main-sequence solar-like stars, among which 33 are planetary hosts. According to our reliability criteria, a third of them (9 with planets, 22 without) have accurate stellar inclinations. Comparison of our asteroseismic estimates of v sin i⋆ against spectroscopic measurements indicates that the latter suffer from a large uncertainty, possibly due to the modeling of macro-turbulence, especially for stars with projected rotation speed v sin i⋆ ≲ 5 km/s. This reinforces earlier claims, and the stellar inclination estimated from the combination of measurements from spectroscopy and photometric variation for slowly rotating stars needs to be interpreted with caution.

  7. Estimations of parameters in Pareto reliability model in the presence of masked data

    International Nuclear Information System (INIS)

    Sarhan, Ammar M.

    2003-01-01

Estimation of the parameters included in the individual distributions of the lifetimes of system components in a series system is considered in this paper based on masked system life test data. We consider a series system of two independent components, each having a Pareto-distributed lifetime. The maximum likelihood and Bayes estimators for the parameters and for the values of the reliability of the system's components at a specific time are obtained. Symmetrical triangular prior distributions are assumed for the unknown parameters when obtaining their Bayes estimators. Large simulation studies are done in order to: (i) explain how one can utilize the theoretical results obtained; (ii) compare the maximum likelihood and Bayes estimates of the underlying parameters; and (iii) study the influence of the masking level and the sample size on the accuracy of the estimates obtained
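
    The series-system backbone of such a model is simple: the system survives time t only if both components do, so the component survival functions multiply; a sketch with classical Pareto (Type I) survival functions and invented parameters:

```python
def pareto_sf(t, alpha, beta):
    """Pareto Type I survival: R(t) = (beta/t)**alpha for t >= beta."""
    return 1.0 if t < beta else (beta / t) ** alpha

def series_reliability(t, params):
    """Series system: product of component survival probabilities."""
    r = 1.0
    for alpha, beta in params:
        r *= pareto_sf(t, alpha, beta)
    return r

# two hypothetical components with shapes 1.8 and 2.5, scale 1.0
print(series_reliability(5.0, [(1.8, 1.0), (2.5, 1.0)]))
```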

  8. Estimation of the human error probabilities in the human reliability analysis

    International Nuclear Information System (INIS)

    Liu Haibin; He Xuhong; Tong Jiejuan; Shen Shifei

    2006-01-01

Human error data are an important issue in human reliability analysis (HRA). Bayesian parameter estimation, which can combine multiple sources of information, such as NPP historical data and expert judgment data, to modify the human error data, can produce human error data that more truly reflect the real situation of an NPP. Using a numerical computation program developed by the authors, this paper presents some typical examples to illustrate the process of Bayesian parameter estimation in HRA and discusses the effect of different modification data on the Bayesian parameter estimation. (authors)

  9. Reliability analysis based on a novel density estimation method for structures with correlations

    Directory of Open Access Journals (Sweden)

    Baoyu LI

    2017-06-01

Estimating the Probability Density Function (PDF) of the performance function is a direct way to perform structural reliability analysis, and the failure probability can be easily obtained by integration over the failure domain. However, efficiently estimating the PDF is still an urgent problem to be solved. The existing fractional-moment-based maximum entropy approach provides a very advanced method for PDF estimation, but its main shortcoming is that it limits the application of the reliability analysis method to structures with independent inputs. In fact, structures with correlated inputs are common in engineering. This paper therefore improves the maximum entropy method and applies the Unscented Transformation (UT) technique, a very efficient moment estimation method for models with any inputs, to compute the fractional moments of the performance function for structures with correlations. The proposed method can precisely estimate the probability distributions of performance functions for structures with correlations. Besides, the number of function evaluations of the proposed method in reliability analysis, which is determined by UT, is really small. Several examples are employed to illustrate the accuracy and advantages of the proposed method.
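
    A minimal unscented transform for the moment-estimation step, here for ordinary (integer) moments of a performance function with correlated Gaussian inputs; the paper's fractional-moment use and input models go beyond this sketch, and all numbers are illustrative.

```python
import numpy as np

def unscented_moments(g, mean, cov, kappa=1.0):
    """Mean and variance of g(X), X ~ N(mean, cov), from 2n+1 sigma points."""
    n = len(mean)
    L = np.linalg.cholesky((n + kappa) * cov)
    pts = [mean] + [mean + L[:, i] for i in range(n)] \
                 + [mean - L[:, i] for i in range(n)]
    w = np.r_[kappa / (n + kappa), np.full(2 * n, 0.5 / (n + kappa))]
    vals = np.array([g(p) for p in pts])
    m = w @ vals
    return m, w @ (vals - m) ** 2

# correlated inputs enter through the off-diagonal covariance terms:
mean = np.array([1.0, 2.0])
cov = np.array([[0.04, 0.018],
                [0.018, 0.09]])
print(unscented_moments(lambda x: x[0] * x[1] ** 2, mean, cov))
```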

  10. An automated method for estimating reliability of grid systems using Bayesian networks

    International Nuclear Information System (INIS)

    Doguc, Ozge; Emmanuel Ramirez-Marquez, Jose

    2012-01-01

Grid computing has become relevant due to its applications to large-scale resource sharing, wide-area information transfer, and multi-institutional collaboration. In general, in grid computing a service requests the use of a set of resources, available in a grid, to complete certain tasks. Although analysis tools and techniques for these types of systems have been studied, grid reliability analysis is generally computation-intensive due to the complexity of the system. Moreover, conventional reliability models have some common assumptions that cannot be applied to grid systems. Therefore, new analytical methods are needed for effective and accurate assessment of grid reliability. This study presents a new method for estimating grid service reliability which, unlike previous studies, does not require prior knowledge about the grid system structure. Moreover, the proposed method does not rely on any assumptions about the link and node failure rates. This approach uses a data-mining algorithm, the K2, to discover the grid system structure from raw historical system data, which makes it possible to find minimum resource spanning trees (MRST) within the grid; it then uses Bayesian networks (BN) to model the MRST and estimate grid service reliability.

  11. Confidence Estimation of Reliability Indices of the System with Elements Duplication and Recovery

    Directory of Open Access Journals (Sweden)

    I. V. Pavlov

    2017-01-01

The article considers the problem of estimating confidence intervals for the main reliability indices, such as the availability rate, mean time between failures, and operative availability (in the stationary state), for the model of a system with duplication and independent recovery of elements. A solution is presented for a situation that often arises in practice, when the exact values of the reliability parameters of the elements are unknown, and only reliability test data for the system or its individual parts (elements, subsystems) are known. It should be noted that the problem of confidence estimation of reliability indices of complex systems based on the test results of their individual elements is fairly common in engineering practice when designing and running various engineering systems; the available papers consider this problem mainly for non-recoverable systems. A solution is described for the important particular case when the system elements are duplicated by reserved elements, and elements that fail in the course of system operation are recovered (regardless of the state of other elements). An approximate solution of this problem is obtained for the case of high reliability or "fast recovery" of elements, on the assumption that the average recovery time of elements is small compared to the average time between failures.
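
    The point estimate underlying the simplest duplication case takes one line under independence assumptions (the rates below are hypothetical); the article's contribution is the confidence interval around such quantities, which this sketch does not cover.

```python
# Steady-state availability of one element and of a duplicated pair,
# assuming independent failures and independent recovery.
lam, mu = 1.0e-4, 0.1            # hypothetical failure and repair rates [1/h]
a_unit = mu / (lam + mu)         # single-element availability
a_pair = 1 - (1 - a_unit) ** 2   # the pair is down only if both are down
print(a_unit, a_pair)
```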

  12. Online Estimation of Peak Power Capability of Li-Ion Batteries in Electric Vehicles by a Hardware-in-Loop Approach

    Directory of Open Access Journals (Sweden)

    Fengchun Sun

    2012-05-01

Battery peak power capability estimations play an important theoretical role in the proper use of the battery in electric vehicles. To address the shortcomings of traditional peak power capability calculation methods, including failures in handling relaxation effects and real-time performance as well as neglect of the battery's design limits, a new approach based on the dynamic electrochemical-polarization (EP) battery model is proposed, taking into consideration constraints of current, voltage, state of charge (SoC) and power. A hardware-in-the-loop (HIL) system is built for validating the online model-based peak power capability estimation approach for batteries used in hybrid electric vehicles (HEVs), and a HIL test based on the Federal Urban Driving Schedules (FUDS) is used to verify and evaluate its real-time computation performance, reliability and robustness. The results show the proposed approach gives a more accurate estimate compared with the hybrid pulse power characterization (HPPC) method, avoiding over-charging or over-discharging and providing a powerful guarantee for the optimization of HEV power systems. Furthermore, the HIL test provides valuable data and critical guidance for evaluating the accuracy of the developed battery algorithms.
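
    A stripped-down version of the constrained peak-power logic, using a bare internal-resistance model instead of the paper's EP model; all cell values and limits below are hypothetical.

```python
# Voltage- and current-constrained peak discharge power of a single cell.
ocv, r_int = 3.70, 0.002     # open-circuit voltage [V], internal resistance [ohm]
v_min, i_lim = 2.80, 300.0   # design limits: cutoff voltage [V], max current [A]

i_max = min((ocv - v_min) / r_int, i_lim)   # enforce both constraints
v_at_peak = ocv - i_max * r_int             # terminal voltage at that current
print("peak power [W]:", v_at_peak * i_max)
```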

  13. Methods for estimating the reliability of the RBMK fuel assemblies and elements

    International Nuclear Information System (INIS)

    Klemin, A.I.; Sitkarev, A.G.

    1985-01-01

Applied non-parametric methods for calculating point and interval estimates for the basic set of reliability factors for the RBMK fuel assemblies and elements are described. As the fuel assembly and element reliability factors, the average lifetime is considered at a preset operating time up to unloading due to fuel burnout, as well as the average lifetime under reactor transient operation and under the steady-state fuel reloading mode of reactor operation. The formulae obtained are included in the special standardized engineering documentation

  14. ARA and ARI imperfect repair models: Estimation, goodness-of-fit and reliability prediction

    International Nuclear Information System (INIS)

    Toledo, Maria Luíza Guerra de; Freitas, Marta A.; Colosimo, Enrico A.; Gilardoni, Gustavo L.

    2015-01-01

An appropriate maintenance policy is essential to reduce expenses and risks related to equipment failures. A fundamental aspect to be considered when specifying such policies is to be able to predict the reliability of the systems under study, based on a well fitted model. In this paper, the classes of models Arithmetic Reduction of Age and Arithmetic Reduction of Intensity are explored. Likelihood functions for such models are derived, and a graphical method is proposed for model selection. A real data set involving failures in trucks used by a Brazilian mining company is analyzed considering models with different memories. The parameters, namely the shape and scale of the Power Law Process and the repair efficiency, were estimated for the best fitted model. Estimation of model parameters allowed us to derive reliability estimators to predict the behavior of the failure process. These results are valuable information for the mining company and can be used to support decision making regarding preventive maintenance policy. - Highlights: • Likelihood functions for imperfect repair models are derived. • A goodness-of-fit technique is proposed as a tool for model selection. • Failures in trucks owned by a Brazilian mining company are modeled. • Estimation allowed deriving reliability predictors to forecast the future failure process of the trucks
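
    The Power Law Process baseline that ARA/ARI models generalize has closed-form maximum likelihood estimates for time-truncated data; a sketch with invented failure times (this is the minimal-repair special case, not the imperfect-repair models themselves):

```python
import numpy as np

def plp_mle(times, T):
    """MLE for the Power Law Process intensity lam(t) = (b/a)*(t/a)**(b-1),
    observed up to time T (time-truncated, minimal repair assumed)."""
    times = np.asarray(times, dtype=float)
    n = len(times)
    b = n / np.log(T / times).sum()    # shape estimate
    a = T / n ** (1.0 / b)             # scale estimate
    return b, a

# hypothetical truck failure times [h], observation truncated at T = 420 h
print(plp_mle([52, 120, 211, 297, 340, 389], T=420.0))
```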

  15. Threshold Estimation of Generalized Pareto Distribution Based on Akaike Information Criterion for Accurate Reliability Analysis

    International Nuclear Information System (INIS)

    Kang, Seunghoon; Lim, Woochul; Cho, Su-gil; Park, Sanghyun; Lee, Tae Hee; Lee, Minuk; Choi, Jong-su; Hong, Sup

    2015-01-01

    In order to perform estimations with high reliability, it is necessary to deal with the tail part of the cumulative distribution function (CDF) in greater detail compared to an overall CDF. The use of a generalized Pareto distribution (GPD) to model the tail part of a CDF is receiving more research attention with the goal of performing estimations with high reliability. Current studies on GPDs focus on ways to determine the appropriate number of sample points and their parameters. However, even if a proper estimation is made, it can be inaccurate as a result of an incorrect threshold value. Therefore, in this paper, a GPD based on the Akaike information criterion (AIC) is proposed to improve the accuracy of the tail model. The proposed method determines an accurate threshold value using the AIC with the overall samples before estimating the GPD over the threshold. To validate the accuracy of the method, its reliability is compared with that obtained using a general GPD model with an empirical CDF
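
    A minimal version of AIC-guided threshold choice: fit a GPD to the exceedances of each candidate threshold and keep the lowest-AIC fit. This is a simplification of the paper's procedure, and the data are synthetic.

```python
import numpy as np
from scipy import stats

def best_threshold(data, candidates):
    """Rank candidate GPD thresholds by the AIC of the fit to the exceedances."""
    best = None
    for u in candidates:
        exc = data[data > u] - u
        if len(exc) < 30:              # require enough tail samples
            continue
        c, _, scale = stats.genpareto.fit(exc, floc=0.0)
        ll = stats.genpareto.logpdf(exc, c, loc=0.0, scale=scale).sum()
        aic = 2 * 2 - 2 * ll           # two free parameters: shape, scale
        if best is None or aic < best[0]:
            best = (aic, u, c, scale)
    return best

data = stats.weibull_min.rvs(1.5, size=5000, random_state=0)   # synthetic sample
print(best_threshold(data, np.quantile(data, [0.80, 0.85, 0.90, 0.95])))
```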

  16. Threshold Estimation of Generalized Pareto Distribution Based on Akaike Information Criterion for Accurate Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seunghoon; Lim, Woochul; Cho, Su-gil; Park, Sanghyun; Lee, Tae Hee [Hanyang University, Seoul (Korea, Republic of); Lee, Minuk; Choi, Jong-su; Hong, Sup [Korea Research Insitute of Ships and Ocean Engineering, Daejeon (Korea, Republic of)

    2015-02-15

    In order to perform estimations with high reliability, it is necessary to deal with the tail part of the cumulative distribution function (CDF) in greater detail compared to an overall CDF. The use of a generalized Pareto distribution (GPD) to model the tail part of a CDF is receiving more research attention with the goal of performing estimations with high reliability. Current studies on GPDs focus on ways to determine the appropriate number of sample points and their parameters. However, even if a proper estimation is made, it can be inaccurate as a result of an incorrect threshold value. Therefore, in this paper, a GPD based on the Akaike information criterion (AIC) is proposed to improve the accuracy of the tail model. The proposed method determines an accurate threshold value using the AIC with the overall samples before estimating the GPD over the threshold. To validate the accuracy of the method, its reliability is compared with that obtained using a general GPD model with an empirical CDF.

  17. Reliability Estimation of Aero-engine Based on Mixed Weibull Distribution Model

    Science.gov (United States)

    Yuan, Zhongda; Deng, Junxiang; Wang, Dawei

    2018-02-01

An aero-engine is a complex mechanical-electronic system, and in the reliability analysis of such systems the Weibull distribution model plays an irreplaceable role. Until now, only the two-parameter and three-parameter Weibull distribution models have been widely used. Due to the diversity of engine failure modes, a single Weibull distribution model carries a large error. By contrast, a variety of engine failure modes can be taken into account with a mixed Weibull distribution model, making it a good statistical analysis model. In addition to the concept of a dynamic weight coefficient, a three-parameter correlation coefficient optimization method is applied to enhance the Weibull distribution model and make the reliability estimation more accurate, so the precision of the mixed distribution reliability model is greatly improved. All of this helps popularize the Weibull distribution model in engineering applications.
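
    The mixture itself is a weighted sum of Weibull survival functions, which makes multiple failure modes (e.g., infant mortality plus wear-out) easy to encode; the weights and parameters below are illustrative, not fitted engine values.

```python
import numpy as np

def mixed_weibull_sf(t, weights, shapes, scales):
    """Survival of a finite Weibull mixture:
    R(t) = sum_i w_i * exp(-(t/eta_i)**beta_i), with weights summing to 1."""
    t = np.asarray(t, dtype=float)
    return sum(w * np.exp(-(t / eta) ** beta)
               for w, beta, eta in zip(weights, shapes, scales))

# two hypothetical modes: infant mortality (beta < 1) and wear-out (beta > 1)
print(mixed_weibull_sf([100, 1000],
                       weights=[0.3, 0.7],
                       shapes=[0.8, 3.2],
                       scales=[500, 2000]))
```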

  18. Online Reliable Peak Charge/Discharge Power Estimation of Series-Connected Lithium-Ion Battery Packs

    Directory of Open Access Journals (Sweden)

    Bo Jiang

    2017-03-01

The accurate peak power estimation of a battery pack is essential to the power-train control of electric vehicles (EVs). It helps to evaluate the maximum charge and discharge capability of the battery system, and thus to optimally control the power-train system to meet the requirements of acceleration, gradient climbing and regenerative braking while achieving high energy efficiency. A novel online peak power estimation method for series-connected lithium-ion battery packs is proposed, which considers the influence of cell difference on the peak power of the battery packs. A new parameter identification algorithm based on adaptive ratio vectors is designed to identify online the parameters of each individual cell in a series-connected battery pack. The ratio vectors reflecting cell difference are deduced strictly based on the analysis of battery characteristics. Based on the online parameter identification, the peak power estimation considering cell difference is further developed. Validation experiments in different battery aging conditions and with different current profiles have been implemented to verify the proposed method. The results indicate that the ratio-vector-based identification algorithm can achieve the same accuracy as repetitive RLS (recursive least squares) identification while evidently reducing the computational cost, and the proposed peak power estimation method is more effective and reliable for series-connected battery packs due to the consideration of cell difference.
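
    For comparison, the repetitive RLS baseline mentioned above amounts to running one recursive least squares update per cell; a generic RLS step with a forgetting factor, applied to a synthetic regression (not the paper's battery model):

```python
import numpy as np

def rls_step(theta, P, phi, y, lam=0.99):
    """One recursive least squares update with forgetting factor lam.
    theta: parameters, P: covariance, phi: regressor vector, y: measurement."""
    k = P @ phi / (lam + phi @ P @ phi)
    theta = theta + k * (y - phi @ theta)
    P = (P - np.outer(k, phi @ P)) / lam
    return theta, P

# identify y = 2.0*x1 - 0.5*x2 online from noisy samples
rng = np.random.default_rng(0)
theta, P = np.zeros(2), np.eye(2) * 100.0
for _ in range(500):
    phi = rng.normal(size=2)
    y = phi @ np.array([2.0, -0.5]) + rng.normal(0, 0.01)
    theta, P = rls_step(theta, P, phi, y)
print(theta)   # converges near [2.0, -0.5]
```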

  19. Reliability of piping system components. Framework for estimating failure parameters from service data

    International Nuclear Information System (INIS)

    Nyman, R.; Hegedus, D.; Tomic, B.; Lydell, B.

    1997-12-01

This report summarizes results and insights from the final phase of a R and D project on piping reliability sponsored by the Swedish Nuclear Power Inspectorate (SKI). The technical scope includes the development of an analysis framework for estimating piping reliability parameters from service data. The R and D has produced a large database on the operating experience with piping systems in commercial nuclear power plants worldwide. It covers the period 1970 to the present. The scope of the work emphasized pipe failures (i.e., flaws/cracks, leaks and ruptures) in light water reactors (LWRs). Pipe failures are rare events. A data reduction format was developed to ensure that homogenous data sets are prepared from scarce service data. This data reduction format distinguishes between reliability attributes and reliability influence factors. The quantitative results of the analysis of service data are in the form of conditional probabilities of pipe rupture given failures (flaws/cracks, leaks or ruptures) and frequencies of pipe failures. Finally, the R and D by SKI produced an analysis framework in support of practical applications of service data in PSA. This multi-purpose framework, termed 'PFCA' (Pipe Failure Cause and Attribute), defines minimum requirements on piping reliability analysis. The application of service data should reflect the requirements of an application. Together with raw data summaries, this analysis framework enables the development of a prior and a posterior pipe rupture probability distribution. The framework supports LOCA frequency estimation, steam line break frequency estimation, as well as the development of strategies for optimized in-service inspection
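
    The headline quantity such a framework supports is a rupture frequency assembled from a failure frequency and a conditional rupture probability; both numbers below are placeholders, not values from the database.

```python
# Pipe rupture frequency from service-data building blocks (hypothetical values):
failure_rate = 2.1e-3            # pipe failures per weld-year
p_rupture_given_failure = 5e-3   # conditional probability of rupture given failure
rupture_frequency = failure_rate * p_rupture_given_failure
print(rupture_frequency)         # ruptures per weld-year
```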

  20. Reliability of piping system components. Framework for estimating failure parameters from service data

    Energy Technology Data Exchange (ETDEWEB)

    Nyman, R [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Hegedus, D; Tomic, B [ENCONET Consulting GesmbH, Vienna (Austria); Lydell, B [RSA Technologies, Vista, CA (United States)

    1997-12-01

    This report summarizes results and insights from the final phase of an R and D project on piping reliability sponsored by the Swedish Nuclear Power Inspectorate (SKI). The technical scope includes the development of an analysis framework for estimating piping reliability parameters from service data. The R and D has produced a large database on the operating experience with piping systems in commercial nuclear power plants worldwide, covering the period 1970 to the present. The scope of the work emphasized pipe failures (i.e., flaws/cracks, leaks and ruptures) in light water reactors (LWRs). Pipe failures are rare events. A data reduction format was developed to ensure that homogeneous data sets are prepared from scarce service data. This data reduction format distinguishes between reliability attributes and reliability influence factors. The quantitative results of the analysis of service data are in the form of conditional probabilities of pipe rupture given failures (flaws/cracks, leaks or ruptures) and frequencies of pipe failures. Finally, the R and D by SKI produced an analysis framework in support of practical applications of service data in PSA. This multi-purpose framework, termed 'PFCA' (Pipe Failure Cause and Attribute), defines minimum requirements on piping reliability analysis. The application of service data should reflect the requirements of an application. Together with raw data summaries, this analysis framework enables the development of prior and posterior pipe rupture probability distributions. The framework supports LOCA frequency estimation and steam line break frequency estimation, as well as the development of strategies for optimized in-service inspection. 63 refs, 30 tabs, 22 figs.

  1. Model uncertainty and multimodel inference in reliability estimation within a longitudinal framework.

    Science.gov (United States)

    Alonso, Ariel; Laenen, Annouschka

    2013-05-01

    Laenen, Alonso, and Molenberghs (2007) and Laenen, Alonso, Molenberghs, and Vangeneugden (2009) proposed a method to assess the reliability of rating scales in a longitudinal context. The methodology is based on hierarchical linear models, and reliability coefficients are derived from the corresponding covariance matrices. However, finding a good parsimonious model to describe complex longitudinal data is a challenging task. Frequently, several models fit the data equally well, raising the problem of model selection uncertainty. When model uncertainty is high, one may resort to model averaging, where inferences are based not on one but on an entire set of models. We explored the use of different model building strategies, including model averaging, in reliability estimation. We found that the approach introduced by Laenen et al. (2007, 2009), combined with some of these strategies, may yield meaningful results in the presence of high model selection uncertainty and when all models are misspecified, insofar as some of them manage to capture the most salient features of the data. Nonetheless, when all models omit prominent regularities in the data, misleading results may be obtained. The main ideas are further illustrated in a case study in which the reliability of the Hamilton Anxiety Rating Scale is estimated. Importantly, the ambit of model selection uncertainty and model averaging transcends the specific setting studied in the paper and may be of interest in other areas of psychometrics.
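
    One standard way to implement the model-averaging step is to weight each candidate model by its Akaike weight and average the model-specific reliability coefficients. The sketch below assumes the hierarchical models have already been fitted; the AIC values and coefficients are invented.

```python
import numpy as np

def akaike_weights(aic):
    """Akaike weights: relative plausibility of each candidate model."""
    d = np.asarray(aic) - np.min(aic)
    w = np.exp(-0.5 * d)
    return w / w.sum()

# Hypothetical candidate hierarchical models fitted to the same data:
aic = [412.3, 413.1, 415.8]   # AIC of each fitted model (invented)
rel = [0.78, 0.74, 0.69]      # reliability coefficient from each model's covariance matrices

w = akaike_weights(aic)
r_avg = np.dot(w, rel)        # model-averaged reliability estimate
print("weights:", w.round(3), " averaged reliability:", round(r_avg, 3))
```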

  2. Validity and reliability of Nike + Fuelband for estimating physical activity energy expenditure.

    Science.gov (United States)

    Tucker, Wesley J; Bhammar, Dharini M; Sawyer, Brandon J; Buman, Matthew P; Gaesser, Glenn A

    2015-01-01

    The Nike + Fuelband is a commercially available, wrist-worn accelerometer used to track physical activity energy expenditure (PAEE) during exercise. However, validation studies assessing the accuracy of this device for estimating PAEE are lacking. Therefore, this study examined the validity and reliability of the Nike + Fuelband for estimating PAEE during physical activity in young adults. Secondarily, we compared PAEE estimation of the Nike + Fuelband with the previously validated SenseWear Armband (SWA). Twenty-four participants (n = 24) completed two 60-min semi-structured routines consisting of sedentary/light-intensity, moderate-intensity, and vigorous-intensity physical activity. Participants wore a Nike + Fuelband and a SWA, while oxygen uptake was measured continuously with an Oxycon Mobile (OM) metabolic measurement system (criterion). The Nike + Fuelband (ICC = 0.77) and SWA (ICC = 0.61) both demonstrated moderate to good validity. PAEE estimates provided by the Nike + Fuelband (246 ± 67 kcal) and SWA (238 ± 57 kcal) were not statistically different from OM (243 ± 67 kcal). Both devices also displayed similar mean absolute percent errors for PAEE estimates (Nike + Fuelband = 16 ± 13%; SWA = 18 ± 18%). Test-retest reliability for PAEE indicated good stability for the Nike + Fuelband (ICC = 0.96) and SWA (ICC = 0.90). The Nike + Fuelband provided valid and reliable estimates of PAEE, similar to the previously validated SWA, during a routine that included approximately equal amounts of sedentary/light-, moderate- and vigorous-intensity physical activity.
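
    The test-retest figures reported above are intraclass correlations. A minimal sketch of ICC(2,1) (two-way random effects, absolute agreement, single measure), computed directly from its ANOVA decomposition, is shown below on invented PAEE data.

```python
import numpy as np

def icc_2_1(X):
    """ICC(2,1): two-way random effects, absolute agreement, single measure.
    X: (n_subjects, k_measurements)."""
    n, k = X.shape
    grand = X.mean()
    ms_r = k * np.sum((X.mean(axis=1) - grand) ** 2) / (n - 1)   # subjects
    ms_c = n * np.sum((X.mean(axis=0) - grand) ** 2) / (k - 1)   # trials
    resid = X - X.mean(axis=1, keepdims=True) - X.mean(axis=0) + grand
    ms_e = np.sum(resid ** 2) / ((n - 1) * (k - 1))              # residual
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Hypothetical test-retest PAEE estimates (kcal) for 6 participants:
X = np.array([[250., 244.], [199., 210.], [301., 291.],
              [275., 268.], [230., 241.], [180., 176.]])
print("ICC(2,1) =", round(icc_2_1(X), 3))
mape = np.mean(np.abs(X[:, 0] - X[:, 1]) / X.mean(axis=1)) * 100
print("between-trial absolute percent error ~", round(mape, 1), "%")
```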

  3. Validity and reliability of central blood pressure estimated by upper arm oscillometric cuff pressure.

    Science.gov (United States)

    Climie, Rachel E D; Schultz, Martin G; Nikolic, Sonja B; Ahuja, Kiran D K; Fell, James W; Sharman, James E

    2012-04-01

    Noninvasive central blood pressure (BP) independently predicts mortality, but current methods are operator-dependent, requiring skill to obtain quality recordings. The aims of this study were, first, to determine the validity of an automatic, upper arm oscillometric cuff method for estimating central BP (O(CBP)) by comparison with the noninvasive reference standard of radial tonometry (T(CBP)); second, to determine the intratest and intertest reliability of O(CBP). To assess validity, central BP was estimated by O(CBP) (Pulsecor R6.5B monitor) and compared with T(CBP) (SphygmoCor) in 47 participants free from cardiovascular disease (aged 57 ± 9 years) in supine, seated, and standing positions. Brachial mean arterial pressure (MAP) and diastolic BP (DBP) from the O(CBP) device were used for calibration in both devices. Duplicate measures were recorded in each position on the same day to assess intratest reliability, and participants returned within 10 ± 7 days for repeat measurements to assess intertest reliability. There was a strong intraclass correlation (ICC = 0.987) and a small mean difference (1.2 ± 2.2 mm Hg) for central systolic BP (SBP) determined by O(CBP) compared with T(CBP). Ninety-six percent of all comparisons (n = 495 acceptable recordings) were within 5 mm Hg. With respect to reliability, there were strong correlations but wider limits of agreement for the intratest (ICC = 0.975, mean difference 0.6 ± 4.5 mm Hg) and intertest (ICC = 0.895, mean difference 4.3 ± 8.0 mm Hg) comparisons. Estimation of central SBP using cuff oscillometry is comparable to radial tonometry and has good reproducibility. As a noninvasive, relatively operator-independent method, O(CBP) may be as useful as T(CBP) for estimating central BP in clinical practice.

  4. Expanding Reliability Generalization Methods with KR-21 Estimates: An RG Study of the Coopersmith Self-Esteem Inventory.

    Science.gov (United States)

    Lane, Ginny G.; White, Amy E.; Henson, Robin K.

    2002-01-01

    Conducted a reliability generalizability study on the Coopersmith Self-Esteem Inventory (CSEI; S. Coopersmith, 1967) to examine the variability of reliability estimates across studies and to identify study characteristics that may predict this variability. Results show that reliability for CSEI scores can vary considerably, especially at the…
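
    KR-21 itself is a one-line computation: it estimates internal-consistency reliability from only the number of items, the mean, and the variance of total scores, assuming items of equal difficulty. A minimal sketch with invented summary statistics:

```python
def kr21(k, mean, var):
    """Kuder-Richardson formula 21 for a k-item test with total-score
    mean and variance (assumes items of roughly equal difficulty)."""
    return (k / (k - 1.0)) * (1.0 - mean * (k - mean) / (k * var))

# Hypothetical 50-item inventory summary statistics:
print(round(kr21(k=50, mean=34.2, var=61.5), 3))
```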

  5. Empiric reliability of diagnostic and prognostic estimations of physical standards of children, going in for sports.

    Directory of Open Access Journals (Sweden)

    Zaporozhanov V.A.

    2012-12-01

    In sports pedagogy, the objective assessment of the potential of athletes already at the initial stages of long-term preparation is regarded as a topical problem. Appropriate quantitative information allows training to be individualized according to the requirements of the managed process. The purpose of the research was to demonstrate, both logically and metrologically, the expedience of metric calculations of the reliability of control measurements used for diagnosing psychophysical fitness and for predicting the athletic progress of trainees in their chosen sport. Material and methods: the results of control measurements of four indices of psychophysical preparedness, together with expert ratings of fitness, were analysed for 24 children of a gymnastics school. The results of the initial and final examinations of the gymnasts on the same control tests were processed by methods of mathematical statistics, and metric estimates of measurement reliability (stability, consistency and informativeness of the control information) were computed for current diagnostics and for the prognosis of sporting potential. Results: the expedience of using these metric operations to calculate a composite estimate of the psychophysical state of trainees is metrologically substantiated. Conclusions: the results confirm the expedience of calculating a composite estimate of the psychophysical characteristics of trainees for diagnosing fitness in the chosen sport and for predicting progress at subsequent stages of preparation.

  6. Reliability Estimation of Parameters of Helical Wind Turbine with Vertical Axis

    Directory of Open Access Journals (Sweden)

    Adela-Eliza Dumitrascu

    2015-01-01

    Because wind turbines are used over prolonged periods, they must be characterized by high reliability. This can be achieved through a rigorous design, appropriate simulation and testing, and proper construction. The reliability prediction and analysis of these systems will lead to identifying the critical components, increasing the operating time, minimizing the failure rate, and minimizing maintenance costs. To estimate the energy produced by the wind turbine, an evaluation approach based on a Monte Carlo simulation model is developed, which enables us to estimate the probability of minimum and maximum parameters. In our simulation process we used triangular distributions. The analysis of simulation results has been focused on the interpretation of the relative frequency histograms and the cumulative distribution curve (ogive diagram), which indicates the probability of obtaining the daily or annual energy output depending on wind speed. The experimental research consists of estimating the reliability and unreliability functions and the hazard rate of the helical vertical-axis wind turbine designed and patented for the climatic conditions of Romanian regions. Also, the variation of power produced for different wind speeds, the Weibull distribution of wind probability, and the power generated were determined. The analysis of experimental results indicates that this type of wind turbine is efficient at low wind speed.

  7. Reliability Estimation of Parameters of Helical Wind Turbine with Vertical Axis.

    Science.gov (United States)

    Dumitrascu, Adela-Eliza; Lepadatescu, Badea; Dumitrascu, Dorin-Ion; Nedelcu, Anisor; Ciobanu, Doina Valentina

    2015-01-01

    Because wind turbines are used over prolonged periods, they must be characterized by high reliability. This can be achieved through a rigorous design, appropriate simulation and testing, and proper construction. The reliability prediction and analysis of these systems will lead to identifying the critical components, increasing the operating time, minimizing the failure rate, and minimizing maintenance costs. To estimate the energy produced by the wind turbine, an evaluation approach based on a Monte Carlo simulation model is developed, which enables us to estimate the probability of minimum and maximum parameters. In our simulation process we used triangular distributions. The analysis of simulation results has been focused on the interpretation of the relative frequency histograms and the cumulative distribution curve (ogive diagram), which indicates the probability of obtaining the daily or annual energy output depending on wind speed. The experimental research consists of estimating the reliability and unreliability functions and the hazard rate of the helical vertical-axis wind turbine designed and patented for the climatic conditions of Romanian regions. Also, the variation of power produced for different wind speeds, the Weibull distribution of wind probability, and the power generated were determined. The analysis of experimental results indicates that this type of wind turbine is efficient at low wind speed.
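
    The Monte Carlo step described in records 6 and 7 — triangular input distributions, relative-frequency histogram, and ogive of the energy output — can be sketched as follows. The turbine and distribution parameters are invented placeholders, not the patented turbine's data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical triangular distributions (min, mode, max) for the uncertain inputs:
wind_speed = rng.triangular(2.0, 5.5, 12.0, n)    # m/s
hours      = rng.triangular(18.0, 22.0, 24.0, n)  # productive hours per day
cp         = rng.triangular(0.25, 0.32, 0.40, n)  # power coefficient

rho, area = 1.225, 3.8                            # air density (kg/m^3), swept area (m^2), assumed
power_kw   = 0.5 * rho * area * cp * wind_speed**3 / 1000.0
energy_kwh = power_kw * hours

# Relative-frequency histogram and ogive (cumulative curve) of daily output:
hist, edges = np.histogram(energy_kwh, bins=30)
ogive = np.cumsum(hist) / n
print("P(daily energy <= 5 kWh) ~", np.mean(energy_kwh <= 5.0))
```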

  8. The Reliability Estimation for the Open Function of Cabin Door Affected by the Imprecise Judgment Corresponding to Distribution Hypothesis

    Science.gov (United States)

    Yu, Z. P.; Yue, Z. F.; Liu, W.

    2018-05-01

    With the development of artificial intelligence, more and more reliability experts have noticed the role of subjective information in the reliability design of complex systems. Therefore, based on a limited number of experimental data points and on expert judgments, we have divided reliability estimation under a distribution hypothesis into a cognition process and a reliability calculation. Consequently, to illustrate this modification, we have taken information fusion based on intuitionistic fuzzy belief functions as the diagnosis model of the cognition process, and completed the reliability estimation for the open function of a cabin door affected by imprecise judgments about the distribution hypothesis.

  9. PROCESS CAPABILITY ESTIMATION FOR NON-NORMALLY DISTRIBUTED DATA USING ROBUST METHODS - A COMPARATIVE STUDY

    Directory of Open Access Journals (Sweden)

    Yerriswamy Wooluru

    2016-06-01

    Process capability indices are very important process quality assessment tools in the automotive industry. The common process capability indices (PCIs) Cp, Cpk and Cpm are widely used in practice. The use of these PCIs is based on the assumption that the process is in control and its output is normally distributed. In practice, normality is not always fulfilled. Indices developed under the normality assumption are very sensitive to non-normal processes. When the distribution of a product quality characteristic is non-normal, Cp and Cpk indices calculated using conventional methods often lead to erroneous interpretation of process capability. In the literature, various methods have been proposed for surrogate process capability indices under non-normality, but few literature sources offer a comprehensive evaluation and comparison of their ability to capture the true capability in non-normal situations. In this paper, five methods have been reviewed and a capability evaluation is carried out for data pertaining to the resistivity of silicon wafers. The final results revealed that the Burr-based percentile method is better than Clements' method. Modelling of non-normal data and the Box-Cox transformation method using statistical software (Minitab 14) provide reasonably good results, as they are very promising methods for non-normal and moderately skewed data (skewness <= 1.5).
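
    The contrast at the heart of the study can be seen in a few lines: a conventional Cpk against an empirical-percentile index in the spirit of Clements' method (the original uses Pearson-curve percentiles; plain empirical percentiles are used here for brevity). The data and specification limits are synthetic.

```python
import numpy as np

def cpk_conventional(x, lsl, usl):
    mu, sigma = np.mean(x), np.std(x, ddof=1)
    return min(usl - mu, mu - lsl) / (3 * sigma)

def cpk_percentile(x, lsl, usl):
    """Percentile-based index: replaces mu +/- 3*sigma with the empirical
    median and the 0.135/99.865 percentiles of the data."""
    p_lo, med, p_hi = np.percentile(x, [0.135, 50, 99.865])
    return min((usl - med) / (p_hi - med), (med - lsl) / (med - p_lo))

rng = np.random.default_rng(7)
x = rng.lognormal(mean=0.0, sigma=0.35, size=2000)  # skewed "resistivity-like" data (synthetic)
lsl, usl = 0.3, 3.0                                  # hypothetical specification limits
print("conventional Cpk:", round(cpk_conventional(x, lsl, usl), 3))
print("percentile Cpk:  ", round(cpk_percentile(x, lsl, usl), 3))
```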

  10. Generating human reliability estimates using expert judgment. Volume 1. Main report

    International Nuclear Information System (INIS)

    Comer, M.K.; Seaver, D.A.; Stillwell, W.G.; Gaddy, C.D.

    1984-11-01

    The US Nuclear Regulatory Commission is conducting a research program to determine the practicality, acceptability, and usefulness of several different methods for obtaining human reliability data and estimates that can be used in nuclear power plant probabilistic risk assessment (PRA). One method, investigated as part of this overall research program, uses expert judgment to generate human error probability (HEP) estimates and associated uncertainty bounds. The project described in this document evaluated two techniques for using expert judgment: paired comparisons and direct numerical estimation. Volume 1 of this report provides a brief overview of the background of the project, the procedure for using psychological scaling techniques to generate HEP estimates, and conclusions from the evaluation of the techniques. Results of the evaluation indicate that techniques using expert judgment should be given strong consideration for use in developing HEP estimates. In addition, HEP estimates for 35 tasks related to boiling water reactors (BWRs) were obtained as part of the evaluation. These HEP estimates are also included in the report.

  11. Uncertainty analysis methods for estimation of reliability of passive system of VHTR

    International Nuclear Information System (INIS)

    Han, S.J.

    2012-01-01

    An approach for estimating the reliability of passive systems for the probabilistic safety assessment (PSA) of a very high temperature reactor (VHTR) is under development in Korea. The essence of this estimation is to measure the uncertainty of the system performance under a specific accident condition. Uncertainty propagation through the simulation of phenomenological models (computer codes) is adopted as a typical method for estimating this uncertainty. This presentation introduced uncertainty propagation and discussed the related issues, focusing on the propagation object and its surrogates. To achieve a sufficient depth of uncertainty results, the applicability of the propagation should be carefully reviewed. As an example study, the Latin hypercube sampling (LHS) method was tested as a direct propagation technique for a specific accident sequence of a VHTR. The reactor cavity cooling system (RCCS) developed by KAERI was considered for this example study. This is an air-cooled passive system that has no active components for its operation. The accident sequence is a low pressure conduction cooling (LPCC) accident, which is considered a design basis accident in the safety design of the VHTR. This sequence results from a large failure of the pressure boundary of the reactor system, such as a guillotine break of the coolant pipe lines. The presentation discussed the insights obtained (benefits and weaknesses) regarding the application of passive system reliability estimation.
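
    A minimal sketch of the direct-propagation step with Latin hypercube sampling: LHS points on the unit cube are mapped to the physical input distributions and pushed through a performance model, and the failure probability is the fraction of samples violating the criterion. The toy surrogate, input distributions and temperature limit below stand in for the actual RCCS thermal-hydraulic code and are assumptions.

```python
import numpy as np
from scipy.stats import qmc, norm

# Hypothetical uncertain inputs of a cavity-cooling performance model:
# surface emissivity, air inlet temperature (degC), decay power factor.
sampler = qmc.LatinHypercube(d=3, seed=42)
u = sampler.random(n=200)                 # 200 LHS points on the unit cube

mean = np.array([0.80, 40.0, 1.00])
sd   = np.array([0.05,  5.0, 0.05])
x = norm.ppf(u) * sd + mean               # map to (assumed normal) physical space

def peak_fuel_temp(eps, t_in, q):         # toy surrogate of the thermal code (assumed)
    return 900.0 + 450.0 * q - 220.0 * (eps - 0.8) + 1.8 * (t_in - 40.0)

t = peak_fuel_temp(x[:, 0], x[:, 1], x[:, 2])
limit = 1400.0                            # hypothetical temperature criterion (degC)
print("estimated P(failure) =", np.mean(t > limit))
```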

  12. ESTIMATION OF PARAMETERS AND RELIABILITY FUNCTION OF EXPONENTIATED EXPONENTIAL DISTRIBUTION: BAYESIAN APPROACH UNDER GENERAL ENTROPY LOSS FUNCTION

    Directory of Open Access Journals (Sweden)

    Sanjay Kumar Singh

    2011-06-01

    In this paper we propose Bayes estimators of the parameters of the Exponentiated Exponential distribution and of its reliability function under the General Entropy loss function for Type-II censored samples. The proposed estimators have been compared with the corresponding Bayes estimators obtained under the Squared Error loss function and with the maximum likelihood estimators in terms of their simulated risks (average loss over the sample space).
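
    Under the general entropy loss L(delta, theta) proportional to (delta/theta)^c - c*ln(delta/theta) - 1, the Bayes estimator has the closed form delta* = (E[theta^(-c) | data])^(-1/c), while squared error loss gives the posterior mean. The sketch below applies both rules to generic posterior draws; the gamma stand-in posterior is an illustrative assumption, not the paper's exponentiated exponential posterior for censored data.

```python
import numpy as np

def bayes_general_entropy(theta_draws, c):
    """Bayes estimator under general entropy loss:
    delta* = (E[theta^(-c)])^(-1/c); c > 0 penalizes overestimation more."""
    return np.mean(theta_draws ** (-c)) ** (-1.0 / c)

def bayes_squared_error(theta_draws):
    """Bayes estimator under squared error loss: the posterior mean."""
    return np.mean(theta_draws)

# Stand-in posterior draws of a positive parameter (in practice these would
# come from an MCMC fit of the exponentiated exponential model).
rng = np.random.default_rng(0)
theta = rng.gamma(shape=25.0, scale=0.04, size=50_000)  # hypothetical posterior

for c in (0.5, 1.0, 2.0):
    print(f"GE loss, c={c}: {bayes_general_entropy(theta, c):.4f}")
print(f"SEL:           {bayes_squared_error(theta):.4f}")
```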

  13. How to use an optimization-based method capable of balancing safety, reliability, and weight in an aircraft design process

    Energy Technology Data Exchange (ETDEWEB)

    Johansson, Cristina [Mendeley, Broderna Ugglasgatan, Linkoping (Sweden); Derelov, Micael; Olvander, Johan [Linkoping University, IEI, Dept. of Machine Design, Linkoping (Sweden)

    2017-03-15

    In order to help decision-makers in the early design phase to improve and make more cost-efficient system safety and reliability baselines of aircraft design concepts, a method (Multi-objective Optimization for Safety and Reliability Trade-off) that is able to handle trade-offs such as system safety, system reliability, and other characteristics, for instance weight and cost, is used. Multi-objective Optimization for Safety and Reliability Trade-off has been developed and implemented at SAAB Aeronautics. The aim of this paper is to demonstrate how the implemented method might work to aid the selection of optimal design alternatives. The method is a three-step method: step 1 involves the modelling of each considered target, step 2 is optimization, and step 3 is the visualization and selection of results (results processing). The analysis is performed within the Architecture Design and Preliminary Design steps, according to the company's Product Development Process. The lessons learned regarding the use of the implemented trade-off method in three cases are presented. The results are a handful of solutions, a basis to aid in the selection of a design alternative. While the implementation of the trade-off method was performed at one company, there is nothing to prevent adapting this method, with minimal modifications, for use in other industrial applications.

  14. How to use an optimization-based method capable of balancing safety, reliability, and weight in an aircraft design process

    International Nuclear Information System (INIS)

    Johansson, Cristina; Derelov, Micael; Olvander, Johan

    2017-01-01

    In order to help decision-makers in the early design phase to improve and make more cost-efficient system safety and reliability baselines of aircraft design concepts, a method (Multi-objective Optimization for Safety and Reliability Trade-off) that is able to handle trade-offs such as system safety, system reliability, and other characteristics, for instance weight and cost, is used. Multi-objective Optimization for Safety and Reliability Trade-off has been developed and implemented at SAAB Aeronautics. The aim of this paper is to demonstrate how the implemented method might work to aid the selection of optimal design alternatives. The method is a three-step method: step 1 involves the modelling of each considered target, step 2 is optimization, and step 3 is the visualization and selection of results (results processing). The analysis is performed within the Architecture Design and Preliminary Design steps, according to the company's Product Development Process. The lessons learned regarding the use of the implemented trade-off method in three cases are presented. The results are a handful of solutions, a basis to aid in the selection of a design alternative. While the implementation of the trade-off method was performed at one company, there is nothing to prevent adapting this method, with minimal modifications, for use in other industrial applications.

  15. Pulse Power Capability Estimation of Lithium Titanate Oxide-based Batteries

    DEFF Research Database (Denmark)

    Stroe, Ana-Irina; Swierczynski, Maciej Jozef; Stroe, Daniel Loan

    2016-01-01

    The pulse power capability (PPC) is one of the parameters that describe the performance behavior of lithium-ion batteries independently of the application. Consequently, extended information about the Li-ion battery PPC and its dependence on the operating conditions becomes necessary. Thus, this paper analyzes the power capability characteristic of a 13 Ah high-power Lithium Titanate Oxide-based battery and its dependence on temperature, load current and state-of-charge. Furthermore, a model to predict the discharging PPC of the battery cell at different temperatures and load currents for three...
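
    A common way to compute a discharge PPC from an equivalent-circuit view is to take the pulse current as the smaller of the rated maximum current and the current that drives the terminal voltage down to its minimum limit. The cell numbers below are invented, not measurements of the 13 Ah cell in the record.

```python
def pulse_power_capability(ocv, r_total, v_min, i_max_rated):
    """Discharge pulse power capability from a simple equivalent-circuit
    view: the pulse current is limited either by the minimum allowed
    terminal voltage or by the rated maximum current."""
    i_voltage_limited = (ocv - v_min) / r_total  # current at which V hits v_min
    i = min(i_voltage_limited, i_max_rated)
    v = ocv - i * r_total                        # terminal voltage during the pulse
    return v * i                                 # watts

# Hypothetical LTO cell at 25 degC and 50% state-of-charge:
print(pulse_power_capability(ocv=2.35, r_total=0.0018, v_min=1.70, i_max_rated=200.0))
```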

  16. Reliability estimation of safety-critical software-based systems using Bayesian networks

    International Nuclear Information System (INIS)

    Helminen, A.

    2001-06-01

    Due to the nature of software faults and the way they cause system failures, new methods are needed for the safety and reliability evaluation of software-based safety-critical automation systems in nuclear power plants. In the research project 'Programmable automation system safety integrity assessment (PASSI)', belonging to the Finnish Nuclear Safety Research Programme (FINNUS, 1999-2002), various safety assessment methods and tools for software-based systems are developed and evaluated. The project is financed jointly by the Radiation and Nuclear Safety Authority (STUK), the Ministry of Trade and Industry (KTM) and the Technical Research Centre of Finland (VTT). In this report the applicability of Bayesian networks to the reliability estimation of software-based systems is studied. The applicability is evaluated by building Bayesian network models for the systems of interest and performing simulations with these models. In the simulations, hypothetical evidence is used for defining the parameter relations and for determining the models' ability to compensate for disparate evidence. Based on the experience from modelling and simulations, we are able to conclude that Bayesian networks provide a good method for the reliability estimation of software-based systems. (orig.)

  17. Estimating reliability of degraded system based on the probability density evolution with multi-parameter

    Directory of Open Access Journals (Sweden)

    Jiang Ge

    2017-01-01

    System degradation is usually caused by the degradation of multiple parameters. Reliability assessment by the universal generating function is less accurate than Monte Carlo simulation, and it cannot provide the probability density function of the system output performance. Therefore, a reliability assessment method based on multi-parameter probability density evolution is presented for systems with complex degradation. First, the system output function is formulated from the transitive relation between component parameters and the system output performance. Then, the probability density evolution equation is established based on the probability conservation principle and the system output function. Furthermore, the probability distribution of the system output performance is obtained by solving this differential equation. Finally, the reliability of the degraded system is estimated. This method does not require discretizing the performance parameters and can establish a continuous probability density function of the system output performance with high computational efficiency and low cost. A numerical example shows that the method is applicable to evaluating the reliability of multi-parameter degraded systems.

  18. Estimation of the reliability function for two-parameter exponentiated Rayleigh or Burr type X distribution

    Directory of Open Access Journals (Sweden)

    Anupam Pathak

    2014-11-01

    Problem Statement: The two-parameter exponentiated Rayleigh distribution has been widely used, especially in the modelling of lifetime event data. It provides a statistical model which has a wide variety of applications in many areas, and its main advantage is its ability in the context of lifetime events among other distributions. The uniformly minimum variance unbiased and maximum likelihood estimation methods are two ways to estimate the parameters of the distribution. In this study we explore and compare the performance of the uniformly minimum variance unbiased estimators (UMVUEs) and maximum likelihood estimators (MLEs) of the reliability function R(t) = P(X > t) and of P = P(X > Y) for the two-parameter exponentiated Rayleigh distribution. Approach: A new technique for obtaining these parametric functions is introduced, in which a major role is played by the powers of the parameter(s), and the functional forms of the parametric functions to be estimated are not needed. We explore the performance of these estimators numerically under varying conditions. Through a simulation study, a comparison is made of the performance of these estimators with respect to bias, mean square error (MSE), 95% confidence length and the corresponding coverage percentage. Conclusion: Based on the results of the simulation study, the UMVUEs of R(t) and P for the two-parameter exponentiated Rayleigh distribution were found to be superior to the MLEs of R(t) and P.
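
    For the MLE side of the comparison, the sketch below fits the two-parameter exponentiated Rayleigh (Burr type X) distribution, with CDF F(x) = (1 - exp(-lambda*x^2))^alpha, to a complete synthetic sample and plugs the estimates into R(t); the paper's UMVUE construction is not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, x):
    """Negative log-likelihood of the exponentiated Rayleigh distribution,
    with density f(x) = 2*a*l*x * exp(-l*x^2) * (1 - exp(-l*x^2))^(a-1)."""
    a, l = params
    if a <= 0 or l <= 0:
        return np.inf
    z = 1.0 - np.exp(-l * x**2)
    return -np.sum(np.log(2 * a * l * x) - l * x**2 + (a - 1) * np.log(z))

def reliability(t, a, l):
    return 1.0 - (1.0 - np.exp(-l * t**2)) ** a   # R(t) = P(X > t)

# Synthetic sample via inverse-CDF sampling (true parameters invented):
rng = np.random.default_rng(3)
a_true, l_true = 2.0, 0.5
u = rng.uniform(size=300)
x = np.sqrt(-np.log(1.0 - u ** (1.0 / a_true)) / l_true)

res = minimize(neg_log_lik, x0=[1.0, 1.0], args=(x,), method="Nelder-Mead")
a_hat, l_hat = res.x
print("MLE of R(1.0):", round(reliability(1.0, a_hat, l_hat), 4))
```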

  19. A rapid reliability estimation method for directed acyclic lifeline networks with statistically dependent components

    International Nuclear Information System (INIS)

    Kang, Won-Hee; Kliese, Alyce

    2014-01-01

    Lifeline networks, such as transportation, water supply, sewers, telecommunications, and electrical and gas networks, are essential elements for the economic and societal functions of urban areas, but their components are highly susceptible to natural or man-made hazards. In this context, it is essential to provide effective pre-disaster hazard mitigation strategies and prompt post-disaster risk management efforts based on rapid system reliability assessment. This paper proposes a rapid reliability estimation method for node-pair connectivity analysis of lifeline networks especially when the network components are statistically correlated. Recursive procedures are proposed to compound all network nodes until they become a single super node representing the connectivity between the origin and destination nodes. The proposed method is applied to numerical network examples and benchmark interconnected power and water networks in Memphis, Shelby County. The connectivity analysis results show the proposed method's reasonable accuracy and remarkable efficiency as compared to the Monte Carlo simulations

  20. Reliability assessment using Bayesian networks. Case study on quantative reliability estimation of a software-based motor protection relay

    International Nuclear Information System (INIS)

    Helminen, A.; Pulkkinen, U.

    2003-06-01

    In this report a quantitative reliability assessment of the motor protection relay SPAM 150 C has been carried out. The assessment focuses on the methodological analysis of quantitative reliability assessment, using the software-based motor protection relay as a case study. The assessment method is based on Bayesian networks and takes full advantage of the previous work done in the project Programmable Automation System Safety Integrity assessment (PASSI). The results and experience gained during the work justify the claim that the assessment method presented here enables flexible use of qualitative and quantitative elements of reliability-related evidence in a single reliability assessment. At the same time, the assessment method is a coherent way of reasoning about one's beliefs and evidence concerning the reliability of the system. Full advantage of the assessment method is gained when it is used as a way to cultivate the information related to the reliability of software-based systems. The method can also be used as a communication instrument in the licensing process of software-based systems. (orig.)

  1. Updated Value of Service Reliability Estimates for Electric Utility Customers in the United States

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, Michael [Nexant Inc., Burlington, MA (United States); Schellenberg, Josh [Nexant Inc., Burlington, MA (United States); Blundell, Marshall [Nexant Inc., Burlington, MA (United States)

    2015-01-01

    This report updates the 2009 meta-analysis that provides estimates of the value of service reliability for electricity customers in the United States (U.S.). The meta-dataset now includes 34 different datasets from surveys fielded by 10 different utility companies between 1989 and 2012. Because these studies used nearly identical interruption cost estimation or willingness-to-pay/accept methods, it was possible to integrate their results into a single meta-dataset describing the value of electric service reliability observed in all of them. Once the datasets from the various studies were combined, a two-part regression model was used to estimate customer damage functions that can be generally applied to calculate customer interruption costs per event by season, time of day, day of week, and geographical regions within the U.S. for industrial, commercial, and residential customers. This report focuses on the backwards stepwise selection process that was used to develop the final revised model for all customer classes. Across customer classes, the revised customer interruption cost model has improved significantly because it incorporates more data and does not include the many extraneous variables that were in the original specification from the 2009 meta-analysis. The backwards stepwise selection process led to a more parsimonious model that only included key variables, while still achieving comparable out-of-sample predictive performance. In turn, users of interruption cost estimation tools such as the Interruption Cost Estimate (ICE) Calculator will have less customer characteristics information to provide and the associated inputs page will be far less cumbersome. The upcoming new version of the ICE Calculator is anticipated to be released in 2015.
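
    The two-part structure — a model for whether an interruption causes any cost at all, times a model for the cost magnitude when it does — can be sketched as a logit plus a log-linear regression. The variables, coefficients and data below are synthetic stand-ins for the meta-dataset.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical meta-dataset rows: duration (h), summer flag, interruption cost ($).
rng = np.random.default_rng(5)
n = 500
X = np.column_stack([rng.uniform(0, 8, n), rng.integers(0, 2, n)])
cost = np.where(rng.uniform(size=n) < 0.3, 0.0,
                np.exp(1.5 + 0.4 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(0, 0.5, n)))

Xc = sm.add_constant(X)

# Part 1: probability that an interruption causes any cost (logit).
part1 = sm.Logit((cost > 0).astype(float), Xc).fit(disp=0)

# Part 2: log-linear regression of cost, conditional on cost > 0.
pos = cost > 0
part2 = sm.OLS(np.log(cost[pos]), Xc[pos]).fit()

# Expected cost = P(cost > 0) * E[cost | cost > 0], with log-normal correction.
p = part1.predict(Xc)
mu = part2.predict(Xc)
expected = p * np.exp(mu + 0.5 * part2.mse_resid)
print(expected[:5].round(2))
```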

  2. Lifetime Reliability Estimate and Extreme Permanent Deformations of Randomly Excited Elasto-Plastic Structures

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1983-01-01

    A method is presented for lifetime reliability estimates of randomly excited yielding systems, assuming the structure to be safe when the plastic deformations are confined below certain limits. The accumulated plastic deformations during any single significant loading history are considered. The accumulated plastic deformation during several loadings can be modelled as a filtered Poisson process. Using the Markov property of this quantity, the considered first-passage problem as well as the related extreme distribution problems are then solved numerically, and the results are compared to simulation studies.

  3. Preliminary investigation on reliability of genomic estimated breeding values in the Danish and Swedish Holstein Population

    DEFF Research Database (Denmark)

    Su, G; Guldbrandtsen, B; Gregersen, V R

    2010-01-01

    This study investigated the reliability of genomic estimated breeding values (GEBV) in the Danish Holstein population. The data in the analysis included 3,330 bulls with both published conventional EBV and single nucleotide polymorphism (SNP) markers. After data editing, 38,134 SNP markers were available. In the analysis, all SNP were fitted simultaneously as random effects in a Bayesian variable selection model, which allows heterogeneous variances for different SNP markers. The response variables were the official EBV. Direct GEBV were calculated as the sum of individual SNP effects. The priors compared included mixture priors, under which individual SNP may have small or no effects, and a single prior distribution common for all SNP. It was found that, in general, the model with a common prior distribution of scaling factors had better predictive ability than any mixture prior models. Therefore, a common prior model was used to estimate SNP effects and breeding values.

  4. Between-day reliability of a method for non-invasive estimation of muscle composition.

    Science.gov (United States)

    Simunič, Boštjan

    2012-08-01

    Tensiomyography is a method for valid and non-invasive estimation of skeletal muscle fibre type composition. The validity of selected temporal tensiomyographic measures has been well established recently; there is, however, no evidence regarding the method's between-day reliability. It is therefore the aim of this paper to establish the between-day repeatability of tensiomyographic measures in three skeletal muscles. For three consecutive days, 10 healthy male volunteers (mean ± SD: age 24.6 ± 3.0 years; height 177.9 ± 3.9 cm; weight 72.4 ± 5.2 kg) were examined in a supine position. Four temporal measures (delay, contraction, sustain, and half-relaxation time) and the maximal amplitude were extracted from the displacement-time tensiomyogram. A reliability analysis was performed with calculations of bias, random error, coefficient of variation (CV), standard error of measurement, and intra-class correlation coefficient (ICC) with a 95% confidence interval. The analysis of ICC demonstrated excellent agreement (ICCs were over 0.94 in 14 out of 15 tested parameters). However, a lower CV was observed for half-relaxation time, presumably because of the specifics of the parameter definition itself. These data indicate that for the three muscles tested, tensiomyographic measurements were reproducible across consecutive test days. Furthermore, we indicated the most likely origin of the lowest reliability, detected for half-relaxation time.

  5. Reliability estimate of unconfined compressive strength of black cotton soil stabilized with cement and quarry dust

    Directory of Open Access Journals (Sweden)

    Dayo Oluwatoyin AKANBI

    2017-06-01

    Reliability estimates were developed for unconfined compressive strength (UCS) values of quarry dust treated black cotton soil stabilized with cement, compacted at British Standard Light (BSL) effort, for use as a road sub-base material; laboratory UCS test data were incorporated into a predictive model. The data were then fed into a FORTRAN-based first-order reliability program to obtain reliability index values. Variable factors such as water content relative to optimum (WRO), hydraulic modulus (HM), quarry dust (QD), cement (C), tri-calcium silicate (C3S), di-calcium silicate (C2S), tri-calcium aluminate (C3A), and maximum dry density (MDD) produced an acceptable safety index value of 1.0, achieved at coefficient of variation (COV) ranges of 10-100%. Observed trends indicate that WRO, C3S, C2S and MDD are greatly influenced by the COV and therefore must be strictly controlled in QD/C-treated black cotton soil for use as a sub-base material in road pavements. Stochastically, BSL compaction can be used to model the 7-day unconfined compressive strength of compacted quarry dust/cement treated black cotton soil as a sub-base material for road pavement over the entire COV range of 10-100%, because the safety index values obtained are higher than the acceptable value of 1.0.
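
    The reliability index produced by a first-order reliability program can be illustrated with the Hasofer-Lind/Rackwitz-Fiessler iteration. The limit state and distribution parameters below are invented (a simple strength-minus-stress margin with independent normal variables), not the soil model of the study; for this linear case the iteration reproduces the analytic index.

```python
import numpy as np

def form_beta(g, grad_g, mean, sd, tol=1e-6, max_iter=100):
    """Hasofer-Lind/Rackwitz-Fiessler iteration for independent normal
    variables. g(x) < 0 denotes failure; returns the safety index beta."""
    u = np.zeros_like(mean)
    for _ in range(max_iter):
        x = mean + sd * u
        grad_u = grad_g(x) * sd                  # chain rule to standard space
        a = grad_u / np.linalg.norm(grad_u)
        u_new = (u @ a - g(x) / np.linalg.norm(grad_u)) * a
        if np.linalg.norm(u_new - u) < tol:
            break
        u = u_new
    return np.linalg.norm(u)

# Toy limit state: margin g = R - S with normal strength R and stress S.
mean = np.array([850.0, 500.0])     # kPa, hypothetical
sd   = np.array([120.0,  80.0])
g      = lambda x: x[0] - x[1]
grad_g = lambda x: np.array([1.0, -1.0])

beta = form_beta(g, grad_g, mean, sd)
print("safety index beta =", round(beta, 3))   # analytic: 350/sqrt(120^2+80^2)
```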

  6. Hybrid time-variant reliability estimation for active control structures under aleatory and epistemic uncertainties

    Science.gov (United States)

    Wang, Lei; Xiong, Chuang; Wang, Xiaojun; Li, Yunlong; Xu, Menghui

    2018-04-01

    Considering that multi-source uncertainties, arising both from a system's inherent nature and from its external environment, are unavoidable and severely affect controller performance, dynamic safety assessment with high confidence is of great significance for scientists and engineers. In view of this, uncertainty quantification analysis and time-variant reliability estimation for closed-loop control problems are conducted in this study under a mixture of random, interval, and convex uncertainties. By combining the state-space transformation and the natural set expansion, the bounding laws of the controlled response histories are first established, with the random terms implemented explicitly. For nonlinear cases, the collocation set methodology and the fourth-order Runge-Kutta algorithm are introduced as well. Inspired by the first-passage model in random process theory as well as by static probabilistic reliability ideas, a new definition of the hybrid time-variant reliability measurement is provided for vibration control systems, and the related solution details are further expounded. Two engineering examples are finally presented to demonstrate the validity and applicability of the developed methodology.

  7. Regional inversion of CO2 ecosystem fluxes from atmospheric measurements. Reliability of the uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Broquet, G.; Chevallier, F.; Breon, F.M.; Yver, C.; Ciais, P.; Ramonet, M.; Schmidt, M. [Laboratoire des Sciences du Climat et de l' Environnement, CEA-CNRS-UVSQ, UMR8212, IPSL, Gif-sur-Yvette (France); Alemanno, M. [Servizio Meteorologico dell' Aeronautica Militare Italiana, Centro Aeronautica Militare di Montagna, Monte Cimone/Sestola (Italy); Apadula, F. [Research on Energy Systems, RSE, Environment and Sustainable Development Department, Milano (Italy); Hammer, S. [Universitaet Heidelberg, Institut fuer Umweltphysik, Heidelberg (Germany); Haszpra, L. [Hungarian Meteorological Service, Budapest (Hungary); Meinhardt, F. [Federal Environmental Agency, Kirchzarten (Germany); Necki, J. [AGH University of Science and Technology, Krakow (Poland); Piacentino, S. [ENEA, Laboratory for Earth Observations and Analyses, Palermo (Italy); Thompson, R.L. [Max Planck Institute for Biogeochemistry, Jena (Germany); Vermeulen, A.T. [Energy research Centre of the Netherlands ECN, EEE-EA, Petten (Netherlands)

    2013-07-01

    The Bayesian framework of CO2 flux inversions permits estimates of the retrieved flux uncertainties. Here, the reliability of these theoretical estimates is studied through a comparison against the misfits between the inverted fluxes and independent measurements of the CO2 Net Ecosystem Exchange (NEE) made by the eddy covariance technique at local (few hectares) scale. Regional inversions at 0.5° resolution are applied for the western European domain where approximately 50 eddy covariance sites are operated. These inversions are conducted for the period 2002-2007. They use a mesoscale atmospheric transport model, a prior estimate of the NEE from a terrestrial ecosystem model and rely on the variational assimilation of in situ continuous measurements of CO2 atmospheric mole fractions. Averaged over monthly periods and over the whole domain, the misfits are in good agreement with the theoretical uncertainties for prior and inverted NEE, and pass the chi-square test for the variance at the 30% and 5% significance levels respectively, despite the scale mismatch and the independence between the prior (respectively inverted) NEE and the flux measurements. The theoretical uncertainty reduction for the monthly NEE at the measurement sites is 53% while the inversion decreases the standard deviation of the misfits by 38%. These results build confidence in the NEE estimates at the European/monthly scales and in their theoretical uncertainty from the regional inverse modelling system. However, the uncertainties at the monthly (respectively annual) scale remain larger than the amplitude of the inter-annual variability of monthly (respectively annual) fluxes, so that this study does not engender confidence in the inter-annual variations. The uncertainties at the monthly scale are significantly smaller than the seasonal variations. The seasonal cycle of the inverted fluxes is thus reliable. In particular, the CO2 sink period over the European continent likely ends later than

  8. Improving reliability of state estimation programming and computing suite based on analyzing a fault tree

    Directory of Open Access Journals (Sweden)

    Kolosok Irina

    2017-01-01

    Reliable information on current state parameters, obtained by processing SCADA and WAMS measurements with state estimation (SE) methods, is a precondition for successfully managing an electric power system (EPS). SCADA and WAMS systems themselves, like any technical systems, are subject to failures and faults that lead to distortion and loss of information. The SE procedure is able to detect erroneous measurements and therefore acts as a barrier that keeps distorted information from penetrating into control problems. At the same time, the programming and computing suite (PCS) implementing the SE functions may itself produce a wrong decision, owing to imperfections and errors in the software algorithms. In this study, we propose to use a fault tree to analyze the consequences of failures and faults in SCADA and WAMS and in the SE procedure itself. Based on the analysis of the obtained measurement information and of the SE results, we determine the fault-tolerance level of the state estimation PCS, which characterizes its reliability.

  9. Estimating the Optimal Capacity for Reservoir Dam based on Reliability Level for Meeting Demands

    Directory of Open Access Journals (Sweden)

    Mehrdad Taghian

    2017-02-01

    Introduction: One of the practical and classic problems in water resource studies is the estimation of the optimal reservoir capacity to satisfy demands. Fully supplying demands over the entire period, however, would require a very high dam capable of sustaining supply through severe drought conditions. That means a major part of the reservoir capacity, and of its cost, would be usable only for a short fraction of the reservoir lifetime, which would be unjustified in an economic analysis. Thus, in the proposed method and model, demands are fully met only for a percentage of the statistical period, in accordance with a reliability constraint. Although this concept appears simple, the general methods require adding binary variables, representing meeting or not meeting demands, to the linear programming model structure; with many binary variables, solving the problem becomes time-consuming and difficult. Another way to solve the problem is to apply the yield model, but this model involves simplifying assumptions and makes it difficult to represent the details of a water resource system. The application of evolutionary algorithms to problems with many constraints is also very complicated. Therefore, this study pursues another solution. Materials and Methods: In this study, to develop and improve the usual methods, instead of mixed integer linear programming (MILP) and the methods above, a simulation model based on network flow linear programming is used, coupled with interface code written in Matlab that computes the reliability from the simulation model's output file. The Acres Reservoir Simulation Program (ARSP) has been utilized as the simulation model. A major advantage of the ARSP is its inherent flexibility in defining operating policies through a penalty structure specified by the user. The ARSP utilizes network flow optimization techniques to handle a subset of general linear programming (LP) problems for individual time intervals.

  10. An integrated model for reliability estimation of digital nuclear protection system based on fault tree and software control flow methodologies

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Seong, Poong Hyun

    2000-01-01

    In the nuclear industry, the difficulty of proving the reliabilities of digital systems prohibits their widespread use in various nuclear applications such as plant protection systems. Even though there exist a few models which are used to estimate the reliabilities of digital systems, we develop a new integrated model which is more realistic than the existing models. We divide the process of estimating the reliability of a digital system into two phases, a high-level phase and a low-level phase, and the boundary between the two phases is the reliabilities of the subsystems. We apply the software control flow method to the low-level phase and fault tree analysis to the high-level phase. The application of the model to the Dynamic Safety System (DSS) shows that the estimated reliability of the system is quite reasonable and realistic.

  11. An integrated model for reliability estimation of digital nuclear protection system based on fault tree and software control flow methodologies

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Seong, Poong Hyun

    2000-01-01

    In the nuclear industry, the difficulty of proving the reliabilities of digital systems prohibits their widespread use in various nuclear applications such as plant protection systems. Even though there exist a few models which are used to estimate the reliabilities of digital systems, we develop a new integrated model which is more realistic than the existing models. We divide the process of estimating the reliability of a digital system into two phases, a high-level phase and a low-level phase, and the boundary between the two phases is the reliabilities of the subsystems. We apply the software control flow method to the low-level phase and fault tree analysis to the high-level phase. The application of the model to the Dynamic Safety System (DSS) shows that the estimated reliability of the system is quite reasonable and realistic. (author)
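
    In the integrated model, subsystem reliabilities obtained at the low level feed a fault tree at the high level. A minimal fault-tree evaluation over independent subsystem failure probabilities (structure and numbers invented, independence assumed) looks like this:

```python
# Minimal sketch: combining subsystem failure probabilities (e.g., obtained
# from a software control-flow analysis) through a small fault tree.

def and_gate(*p_fail):
    """All inputs must fail for the gate to fail (independence assumed)."""
    out = 1.0
    for p in p_fail:
        out *= p
    return out

def or_gate(*p_fail):
    """Any input failing fails the gate (independence assumed)."""
    ok = 1.0
    for p in p_fail:
        ok *= (1.0 - p)
    return 1.0 - ok

# Hypothetical subsystem failure probabilities per demand:
p_sensor_a, p_sensor_b = 1e-3, 1e-3      # redundant sensors (AND gate)
p_logic, p_actuator = 5e-5, 2e-4         # in series with the rest (OR gate)

p_system = or_gate(and_gate(p_sensor_a, p_sensor_b), p_logic, p_actuator)
print(f"system failure probability per demand: {p_system:.2e}")
```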

  12. Bayesian reliability analysis for non-periodic inspection with estimation of uncertain parameters; Bayesian shinraisei kaiseki wo tekiyoshita hiteiki kozo kensa ni kansuru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    Itagaki, H. [Yokohama National University, Yokohama (Japan). Faculty of Engineering; Asada, H.; Ito, S. [National Aerospace Laboratory, Tokyo (Japan); Shinozuka, M.

    1996-12-31

    Structural locations in the pressurized fuselage of a transport-type aircraft designed to damage-tolerance criteria, whose risk has been assessed, are taken as the subject of discussion. A small set of inspection data from these locations is used to discuss a Bayesian reliability analysis that can estimate an appropriate non-periodic inspection schedule while also estimating proper values for the uncertain factors. As a result, the period over which fatigue cracks initiate was determined from the records of detailed visual inspections. The analysis method proved capable of estimating values that appear reasonable, and a proper inspection schedule based on them, despite expressing fatigue crack growth in a very simple form and treating both factors as uncertain. The effectiveness of the present analysis method was thus verified. The study also discusses, from several viewpoints, the structural locations, the modelling of fatigue cracks that initiate and grow at these locations, the failure conditions, the damage factors, and the capability of the inspection. This reliability analysis method is thought to be effective for other structures as well, such as offshore structures. 18 refs., 8 figs., 1 tab.

  13. Bayesian reliability analysis for non-periodic inspection with estimation of uncertain parameters; Bayesian shinraisei kaiseki wo tekiyoshita hiteiki kozo kensa ni kansuru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    Itagaki, H [Yokohama National University, Yokohama (Japan). Faculty of Engineering; Asada, H; Ito, S [National Aerospace Laboratory, Tokyo (Japan); Shinozuka, M

    1997-12-31

    Structural locations in the pressurized fuselage of a transport-type aircraft designed to damage-tolerance criteria, whose risk has been assessed, are taken as the subject of discussion. A small set of inspection data from these locations is used to discuss a Bayesian reliability analysis that can estimate an appropriate non-periodic inspection schedule while also estimating proper values for the uncertain factors. As a result, the period over which fatigue cracks initiate was determined from the records of detailed visual inspections. The analysis method proved capable of estimating values that appear reasonable, and a proper inspection schedule based on them, despite expressing fatigue crack growth in a very simple form and treating both factors as uncertain. The effectiveness of the present analysis method was thus verified. The study also discusses, from several viewpoints, the structural locations, the modelling of fatigue cracks that initiate and grow at these locations, the failure conditions, the damage factors, and the capability of the inspection. This reliability analysis method is thought to be effective for other structures as well, such as offshore structures. 18 refs., 8 figs., 1 tab.

  14. Evaluating detection and estimation capabilities of magnetometer-based vehicle sensors

    Science.gov (United States)

    Slater, David M.; Jacyna, Garry M.

    2013-05-01

    In an effort to secure the northern and southern United States borders, MITRE has been tasked with developing Modeling and Simulation (M&S) tools that accurately capture the mapping between algorithm-level Measures of Performance (MOP) and system-level Measures of Effectiveness (MOE) for current/future surveillance systems deployed by the Customs and Border Protection Office of Technology Innovations and Acquisitions (OTIA). This analysis is part of a larger M&S undertaking. The focus is on two MOPs for magnetometer-based Unattended Ground Sensors (UGS). UGS are placed near roads to detect passing vehicles and estimate properties of the vehicle's trajectory, such as bearing and speed. The first MOP considered is the probability of detection. We derive probabilities of detection for a network of sensors over an arbitrary number of observation periods and explore how the probability of detection changes when multiple sensors are employed. The performance of UGS is also evaluated based on the level of variance in the estimation of trajectory parameters. We derive the Cramer-Rao bounds for the variances of the estimated parameters in two cases: when no a priori information is known and when the parameters are assumed to be Gaussian with known variances. Sample results show that UGS perform significantly better in the latter case.

  15. Estimating Ordinal Reliability for Likert-Type and Ordinal Item Response Data: A Conceptual, Empirical, and Practical Guide

    Science.gov (United States)

    Gadermann, Anne M.; Guhn, Martin; Zumbo, Bruno D.

    2012-01-01

    This paper provides a conceptual, empirical, and practical guide for estimating ordinal reliability coefficients for ordinal item response data (also referred to as Likert, Likert-type, ordered categorical, or rating scale item responses). Conventionally, reliability coefficients, such as Cronbach's alpha, are calculated using a Pearson…

  16. Using the modern CNC controllers capabilities for estimating the machining forces during the milling process

    Directory of Open Access Journals (Sweden)

    Breaz Radu-Eugen

    2017-01-01

    Machining forces can nowadays be measured using 3D dynamometers, which are usually very expensive devices and hardly available to most CNC machine-tool users. On the other hand, modern CNC controllers have the ability to display and save many outputs of the machining process, such as the currents, or even the torques at shaft level, of the feed motors on each axis. These outputs can be used for estimating the machining forces, but it should be noted that the above-mentioned currents and torques are proportional to the overall resistant forces, which include not only the technological forces but also friction, inertial and pre-tensioning forces. This paper presents an approach for estimating the machining forces during a milling process by using the outputs stored in the CNC controller and separating the effects of the technological forces from the other forces involved in the process. The separation was made by running two sets of experiments, one set in dry-run regime and the other in machining regime.
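
    The separation idea reduces to subtracting the dry-run torque (friction, inertia, pre-tensioning) from the machining torque and converting the remainder to an axial force through the drive kinematics. The ball-screw conversion, pitch and efficiency below are assumptions for illustration, not the paper's machine data.

```python
import numpy as np

# Hypothetical X-axis feed-motor torque logs (Nm) sampled from the CNC
# controller for the same tool path, with and without material engagement.
torque_machining = np.array([1.92, 2.10, 2.35, 2.28, 2.05])
torque_dry_run   = np.array([0.81, 0.84, 0.86, 0.85, 0.83])

# Dry-run torque bundles friction, inertia and pre-tensioning effects;
# subtracting it isolates the torque due to the cutting process.
torque_cut = torque_machining - torque_dry_run

pitch = 0.010   # ball-screw pitch (m/rev), assumed
eff = 0.90      # drive efficiency, assumed
# For a ball-screw axis, F = 2*pi*T*eff / pitch converts torque to axial force.
force_cut = 2 * np.pi * torque_cut * eff / pitch
print(force_cut.round(1), "N")
```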

  17. Reliable Dual Tensor Model Estimation in Single and Crossing Fibers Based on Jeffreys Prior

    Science.gov (United States)

    Yang, Jianfei; Poot, Dirk H. J.; Caan, Matthan W. A.; Su, Tanja; Majoie, Charles B. L. M.; van Vliet, Lucas J.; Vos, Frans M.

    2016-01-01

    Purpose This paper presents and studies a framework for reliable modeling of diffusion MRI using a data-acquisition adaptive prior. Methods Automated relevance determination estimates the mean of the posterior distribution of a rank-2 dual tensor model exploiting Jeffreys prior (JARD). This data-acquisition prior is based on the Fisher information matrix and enables the assessment whether two tensors are mandatory to describe the data. The method is compared to Maximum Likelihood Estimation (MLE) of the dual tensor model and to FSL’s ball-and-stick approach. Results Monte Carlo experiments demonstrated that JARD’s volume fractions correlated well with the ground truth for single and crossing fiber configurations. In single fiber configurations JARD automatically reduced the volume fraction of one compartment to (almost) zero. The variance in fractional anisotropy (FA) of the main tensor component was thereby reduced compared to MLE. JARD and MLE gave a comparable outcome in data simulating crossing fibers. On brain data, JARD yielded a smaller spread in FA along the corpus callosum compared to MLE. Tract-based spatial statistics demonstrated a higher sensitivity in detecting age-related white matter atrophy using JARD compared to both MLE and the ball-and-stick approach. Conclusions The proposed framework offers accurate and precise estimation of diffusion properties in single and dual fiber regions. PMID:27760166

  18. Girsanov's transformation based variance reduced Monte Carlo simulation schemes for reliability estimation in nonlinear stochastic dynamics

    Science.gov (United States)

    Kanjilal, Oindrila; Manohar, C. S.

    2017-07-01

    The study considers the problem of simulation based time variant reliability analysis of nonlinear randomly excited dynamical systems. Attention is focused on importance sampling strategies based on the application of Girsanov's transformation method. Controls which minimize the distance function, as in the first order reliability method (FORM), are shown to minimize a bound on the sampling variance of the estimator for the probability of failure. Two schemes based on the application of calculus of variations for selecting control signals are proposed: the first obtains the control force as the solution of a two-point nonlinear boundary value problem, and, the second explores the application of the Volterra series in characterizing the controls. The relative merits of these schemes, vis-à-vis the method based on ideas from the FORM, are discussed. Illustrative examples, involving archetypal single degree of freedom (dof) nonlinear oscillators, and a multi-degree of freedom nonlinear dynamical system, are presented. The credentials of the proposed procedures are established by comparing the solutions with pertinent results from direct Monte Carlo simulations.
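
    The variance-reduction idea can be seen in a static caricature: sample from a density shifted toward the failure region and reweight by the likelihood ratio, which is what the Girsanov transformation accomplishes for the excitation process in the paper's dynamical setting. Everything below (threshold beta, sample size, seed) is an invented illustration.

        import numpy as np
        from math import erf, sqrt

        rng = np.random.default_rng(1)
        beta, n = 4.0, 10_000                     # failure: standard normal U > beta

        u = rng.normal(loc=beta, size=n)          # biased sampling density N(beta, 1)
        lr = np.exp(-beta * u + 0.5 * beta**2)    # likelihood ratio phi(u)/phi(u - beta)
        pf_is = np.mean((u > beta) * lr)          # unbiased failure-probability estimate

        pf_exact = 0.5 * (1.0 - erf(beta / sqrt(2.0)))
        print(pf_is, pf_exact)   # both ~3e-5; crude Monte Carlo would need ~1e7 samples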

  19. Reliability of using nondestructive tests to estimate compressive strength of building stones and bricks

    Directory of Open Access Journals (Sweden)

    Ali Abd Elhakam Aliabdo

    2012-09-01

    This study aims to investigate the relationships between the Schmidt hardness rebound number (RN) and the ultrasonic pulse velocity (UPV) versus the compressive strength (fc) of stones and bricks. Four types of rocks (marble, pink limestone, white limestone and basalt) and two types of bricks (burned bricks and lime-sand bricks) were studied. Linear and non-linear models were proposed. High correlations were found between RN and UPV versus compressive strength. Validation of the proposed models was assessed using other specimens of each material. Linear models for each material showed better correlations than non-linear models. A general model between RN and the compressive strength of the tested stones and bricks showed a high correlation, with a regression coefficient R2 value of 0.94. Estimating the compressive strength of the studied stones and bricks from their rebound number and ultrasonic pulse velocity in a combined method was generally more reliable than using the rebound number or the ultrasonic pulse velocity alone.
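
    A minimal sketch of the combined (SonReb-style) two-variable model the record favours: fit fc = a + b*RN + c*UPV by least squares to calibration specimens. The calibration numbers below are invented for illustration; the paper's own coefficients are not reproduced here.

        import numpy as np

        # Combined model fc = a + b*RN + c*UPV fit by least squares.
        rn = np.array([30.0, 34.0, 38.0, 42.0, 46.0, 50.0])   # rebound number
        upv = np.array([3.1, 3.4, 3.7, 4.0, 4.2, 4.5])        # pulse velocity [km/s]
        fc = np.array([18.0, 24.0, 31.0, 39.0, 45.0, 54.0])   # compressive strength [MPa]

        X = np.column_stack([np.ones_like(rn), rn, upv])
        (a, b, c), *_ = np.linalg.lstsq(X, fc, rcond=None)
        print(f"fc ~ {a:.1f} + {b:.2f}*RN + {c:.1f}*UPV")

        resid = fc - X @ np.array([a, b, c])
        print("R2 =", 1 - resid @ resid / ((fc - fc.mean()) @ (fc - fc.mean())))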

  20. Reliability estimation of structures under stochastic loading—A case study on nuclear piping

    International Nuclear Information System (INIS)

    Hari Prasad, M.; Rami Reddy, G.; Dubey, P.N.; Srividya, A.; Verma, A.K.

    2013-01-01

    Highlights: ► Structures are generally subjected to different types of loadings. ► One such type of loading is random sequence and has been treated as a stochastic fatigue loading. ► In this methodology both stress amplitude and number of cycles to failure have been considered as random variables. ► The methodology has been demonstrated with a case study on nuclear piping. ► The failure probability of piping has been estimated as a function of time. - Abstract: Generally structures are subjected to different types of loadings throughout their life time. These loads can be either discrete in nature or continuous in nature and also these can be either stationary or non stationary processes. This means that the structural reliability analysis not only considers random variables but also considers random variables which are functions of time, referred to as stochastic processes. A stochastic process can be viewed as a family of random variables. When a structure is subjected to a random loading, based on the stresses developed in the structure and failure criteria the failure probability can be estimated. In practice the structures are designed with higher factor of safety to take care of such random loads. In such cases the structure will fail only when the random loads are cyclic in nature. In traditional reliability analysis, the variation in the load is treated as a random variable and to account for the number of occurrences of the loading the concept of extreme value theory is used. But with this method one is neglecting the damage accumulation that will take place from one loading to another loading. Hence, in this paper, a new way of dealing with these types of problems has been discussed by using the concept of stochastic fatigue loading. The random loading has been considered as earthquake loading. The methodology has been demonstrated with a case study on nuclear power plant piping.

  1. Estimation of process capability indices from the results of limit gauge inspection of dimensional parameters in machining industry

    Science.gov (United States)

    Masterenko, Dmitry A.; Metel, Alexander S.

    2018-03-01

    The process capability indices Cp and Cpk are widely used in modern quality management as statistical measures of the ability of a process to produce output X within specification limits. The customer's requirement to ensure Cp ≥ 1.33 is often applied in contracts. Capability index estimates may be calculated from estimates of the mean µ and the variability 6σ, and for this, the quality characteristic in a sample of pieces should be measured. This requires, in turn, advanced measuring devices and well-qualified staff. On the other hand, quality inspection by attributes, fulfilled with limit gauges (go/no-go), is much simpler and has a higher performance, but it does not give the numerical values of the quality characteristic. The described method allows estimating the mean and the variability of the process on the basis of the results of limit gauge inspection with a certain lower limit LCL and upper limit UCL, which separate the pieces into three groups: X < LCL, LCL ≤ X ≤ UCL, and X > UCL. This enables statistical control of the manufacturing process, which is very important for improving the quality of articles in the machining industry through control of their tolerances.
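
    The mean and spread can be recovered from the three gauge counts by inverting the normal CDF at the observed tail fractions, after which Cp and Cpk follow from their usual definitions. A sketch under the assumption X ~ Normal, with invented counts and limits:

        from statistics import NormalDist

        def fit_normal_from_gauge(n_low, n_pass, n_high, lcl, ucl):
            """Recover (mu, sigma) from go/no-go counts, assuming X ~ Normal.

            Needs at least one piece in each tail (n_low > 0 and n_high > 0).
            """
            n = n_low + n_pass + n_high
            z1 = NormalDist().inv_cdf(n_low / n)        # (LCL - mu) / sigma
            z2 = NormalDist().inv_cdf(1 - n_high / n)   # (UCL - mu) / sigma
            sigma = (ucl - lcl) / (z2 - z1)
            return lcl - sigma * z1, sigma

        def capability(mu, sigma, lsl, usl):
            cp = (usl - lsl) / (6 * sigma)
            cpk = min(usl - mu, mu - lsl) / (3 * sigma)
            return cp, cpk

        # Invented counts: 1000 pieces gauged, 12 undersize, 8 oversize.
        mu, sigma = fit_normal_from_gauge(12, 980, 8, lcl=9.95, ucl=10.05)
        print(capability(mu, sigma, lsl=9.95, usl=10.05))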

  2. Evaluating the capabilities of vegetation spectral indices on chlorophyll content estimation at Sentinel-2 spectral resolutions

    Science.gov (United States)

    Sun, Qi; Jiao, Quanjun; Dai, Huayang

    2018-03-01

    Chlorophyll is an important pigment in green plants for photosynthesis and for obtaining the energy for growth and development. Rapid, nondestructive and accurate estimation of chlorophyll content is significant for understanding crop growth, monitoring disease and insect damage, and assessing crop yield. Sentinel-2 is equipped with the Multi-Spectral Instrument (MSI), which provides images with high spatial, spectral and temporal resolution. It covers the VNIR/SWIR spectral region in 13 bands and incorporates two new spectral bands in the red-edge region at a spatial resolution of 20 m, which can be used to derive vegetation indices using red-edge bands. In this paper, we focus on assessing the potential of vegetation spectral indices for retrieving chlorophyll content from Sentinel-2 at different observation angles. We used in-situ spectral data and Sentinel-2 data to test the relationship between vegetation indices (VIs) and chlorophyll content. The REP, MTCI, CIred-edge, CIgreen, Macc01, TCARI/OSAVI [705,750], NDRE1 and NDRE2 indices were calculated. The NDRE2 index displays strongly similar results for hyperspectral and simulated Sentinel-2 spectral bands (R2 = 0.53 and R2 = 0.51, respectively). Across observation angles, NDRE2 shows the smallest difference in performance (R2 = 0.51 and R2 = 0.64 at 0° and 15°, respectively).

  3. On the Reliability of Source Time Functions Estimated Using Empirical Green's Function Methods

    Science.gov (United States)

    Gallegos, A. C.; Xie, J.; Suarez Salas, L.

    2017-12-01

    The Empirical Green's Function (EGF) method (Hartzell, 1978) has been widely used to extract source time functions (STFs). In this method, seismograms generated by collocated events with different magnitudes are deconvolved. Under the fundamental assumption that the STF of the small event is a delta function, the deconvolved Relative Source Time Function (RSTF) yields the large event's STF. While this assumption can be empirically justified by examining differences in event size and in the frequency content of the seismograms, it can lack rigorous justification. In practice, a small event might have a finite duration, so that the retrieved RSTF, interpreted as the large event's STF, carries a bias. In this study, we rigorously analyze this bias using synthetic waveforms generated by convolving a realistic Green's function waveform with pairs of finite-duration triangular or parabolic STFs. The RSTFs are found using a time-domain matrix deconvolution. We find that when the STFs of smaller events are finite, the RSTFs are a series of narrow non-physical spikes. Interpreting these RSTFs as a series of high-frequency source radiations would be very misleading. The only reliable and unambiguous information we can retrieve from these RSTFs is the difference in durations and the moment ratio of the two STFs. We can apply Tikhonov smoothing to obtain a single-pulse RSTF, but its duration depends on the choice of weighting, which may be subjective. We then test the Multi-Channel Deconvolution (MCD) method (Plourde & Bostock, 2017), which assumes that both STFs have finite durations to be solved for. A concern about the MCD method is that the number of unknown parameters is larger, which would tend to make the problem rank-deficient. Because the kernel matrix is dependent on the STFs to be solved for under a positivity constraint, we can only estimate the rank-deficiency with a semi-empirical approach. Based on the results so far, we find that the
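
    For concreteness, a small sketch of the two ingredients the record discusses: time-domain deconvolution written as a Toeplitz least-squares problem, with a Tikhonov weight lam whose choice is exactly the subjective step noted above. The signals are synthetic and illustrative; the rstf_deconvolve helper is hypothetical and does not impose the positivity constraint used by MCD.

        import numpy as np
        from scipy.linalg import toeplitz

        def rstf_deconvolve(big, small, lam=0.1):
            """Time-domain RSTF by Tikhonov-regularized least squares.

            Solves small * rstf = big (where * is convolution) by building the
            convolution (Toeplitz) matrix of the small event's seismogram.
            """
            n = len(big)
            col = np.r_[small, np.zeros(n - len(small))]
            A = toeplitz(col, np.zeros(n))            # convolution matrix
            return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ big)

        # Synthetic example: a triangular STF convolved with a short wavelet.
        wavelet = np.array([0.0, 1.0, 0.5, -0.3, 0.0])
        stf = np.array([0, 1, 2, 3, 2, 1, 0], dtype=float)
        big = np.convolve(wavelet, stf)
        rstf = rstf_deconvolve(big, wavelet, lam=0.01)
        print(np.round(rstf[:7], 2))                  # approximates the triangle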

  4. Reliability of different sampling densities for estimating and mapping lichen diversity in biomonitoring studies

    International Nuclear Information System (INIS)

    Ferretti, M.; Brambilla, E.; Brunialti, G.; Fornasier, F.; Mazzali, C.; Giordani, P.; Nimis, P.L.

    2004-01-01

    Sampling requirements related to lichen biomonitoring include optimal sampling density for obtaining precise and unbiased estimates of population parameters and maps of known reliability. Two available datasets on a sub-national scale in Italy were used to determine a cost-effective sampling density to be adopted in medium-to-large-scale biomonitoring studies. As expected, the relative error in the mean Lichen Biodiversity (Italian acronym: BL) values and the error associated with the interpolation of BL values for (unmeasured) grid cells increased as the sampling density decreased. However, the increase in size of the error was not linear and even a considerable reduction (up to 50%) in the original sampling effort led to a far smaller increase in errors in the mean estimates (<6%) and in mapping (<18%) as compared with the original sampling densities. A reduction in the sampling effort can result in considerable savings of resources, which can then be used for a more detailed investigation of potentially problematic areas. It is, however, necessary to decide the acceptable level of precision at the design stage of the investigation, so as to select the proper sampling density. - An acceptable level of precision must be decided before determining a sampling design

  5. Reliability of third molar development for age estimation in Gujarati population: A comparative study.

    Science.gov (United States)

    Gandhi, Neha; Jain, Sandeep; Kumar, Manish; Rupakar, Pratik; Choyal, Kanaram; Prajapati, Seema

    2015-01-01

    Age assessment may be a crucial step in postmortem profiling leading to confirmative identification. In children, Demirjian's method, based on eight developmental stages, was developed to determine maturity scores as a function of age, and polynomial functions to determine age as a function of score. The aim of this study was to evaluate the reliability of age estimation using Demirjian's eight-teeth method, following the French maturity scores and an India-specific formula, from the developmental stages of the third molar on orthopantomograms. Dental panoramic tomograms from 30 subjects each of known chronological age and sex were collected and evaluated according to Demirjian's criteria. Age calculations were performed using Demirjian's formula and the Indian formula. Statistical analysis used the Chi-square and ANOVA tests, and the P values obtained were statistically significant. There was an average underestimation of age with both the Indian and Demirjian's formulas. The mean absolute error was lower using the Indian formula; hence, it can be applied for age estimation in the present Gujarati population. Also, females were ahead of males in achieving dental maturity; thus, dental development is completed earlier in females. Greater accuracy can be obtained if population-specific formulas considering ethnic and environmental variation are derived by performing regression analysis.

  6. Reliability analysis of road network for estimation of public evacuation time around NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Bang, Sun-Young; Lee, Gab-Bock; Chung, Yang-Geun [Korea Electric Power Research Institute, Daejeon (Korea, Republic of)

    2007-07-01

    The strongest protective measure in radiation emergency preparedness is the evacuation of the public when a great deal of radioactivity is released to the environment. After the Three Mile Island (TMI) nuclear power plant meltdown in the United States and the Chernobyl nuclear power plant disaster in the U.S.S.R., many advanced countries, including the United States and Japan, have continued research on the estimation of public evacuation time as one of the emergency countermeasure technologies. In South Korea as well, the 'Framework Act on Civil Defense: Radioactive Disaster Preparedness Plan' was established in 1983, and nuclear power plants have set up radiation emergency plans and regularly carried out radiation emergency preparedness trainings. Nonetheless, there is still a need to improve the technology for estimating public evacuation time by precisely analyzing traffic flow, so as to prepare practical and efficient ways to protect the public. In this research, the road network around the Wolsong and Kori NPPs was constructed with the CORSIM code, and a reliability analysis of this road network was performed.

  7. Reliability of Nationwide Prevalence Estimates of Dementia: A Critical Appraisal Based on Brazilian Surveys.

    Directory of Open Access Journals (Sweden)

    Flávio Chaimowicz

    The nationwide dementia prevalence is usually calculated by applying the results of local surveys to a country's population. To evaluate the reliability of such estimations in developing countries, we chose Brazil as an example. We carried out a systematic review of dementia surveys, ascertained their risk of bias, and present the best estimate of the occurrence of dementia in Brazil. We carried out an electronic search of PubMed, Latin-American databases, and a Brazilian thesis database for surveys focusing on dementia prevalence in Brazil. The systematic review was registered at PROSPERO (CRD42014008815). Among the 35 studies found, 15 analyzed population-based random samples. However, most of them utilized inadequate criteria for diagnostics. Six studies without these limitations were further analyzed to assess the risk of selection, attrition, outcome and population bias as well as several statistical issues. All the studies presented moderate or high risk of bias in at least two domains due to the following features: high non-response, inaccurate cut-offs, and doubtful accuracy of the examiners. Two studies had limited external validity due to high rates of illiteracy or low income. The three studies with adequate generalizability and the lowest risk of bias presented a prevalence of dementia between 7.1% and 8.3% among subjects aged 65 years and older. However, after adjustment for the accuracy of screening, the best available evidence points towards a figure between 15.2% and 16.3%. The risk of bias may strongly limit the generalizability of dementia prevalence estimates in developing countries. Extrapolations that have already been made for Brazil and Latin America were based on a prevalence that should have been adjusted for screening accuracy or not used at all due to severe bias. Similar evaluations regarding other developing countries are needed in order to verify the scope of these limitations.
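
    The screening-accuracy adjustment the record refers to is conventionally done with the Rogan-Gladen estimator, sketched below. The sensitivity, specificity and apparent prevalence used here are invented for illustration and are not the review's figures.

        def rogan_gladen(apparent_prev, sensitivity, specificity):
            """True prevalence from apparent prevalence and screening accuracy."""
            return (apparent_prev + specificity - 1.0) / (sensitivity + specificity - 1.0)

        # Invented figures: with a perfectly specific but insensitive screen
        # (Se = 0.50, Sp = 1.00), an apparent prevalence of 8% corresponds to
        # a true prevalence of 16% -- the direction of the adjustment at issue.
        print(rogan_gladen(0.08, 0.50, 1.00))   # 0.16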

  8. Human decomposition and the reliability of a 'Universal' model for post mortem interval estimations.

    Science.gov (United States)

    Cockle, Diane L; Bell, Lynne S

    2015-08-01

    Human decomposition is a complex biological process driven by an array of variables which are not clearly understood. The medico-legal community has long been searching for a reliable method to establish the post-mortem interval (PMI) for those whose deaths have either been hidden or gone unnoticed. To date, attempts to develop a PMI estimation method based on the state of the body either at the scene or at autopsy have been unsuccessful. One recent study proposed that two simple formulae, based on the level of decomposition, humidity and temperature, could be used to accurately calculate the PMI for bodies outside, on or under the surface worldwide. This study attempted to validate 'Formula I' [1] (for bodies on the surface) using 42 Canadian cases with known PMIs. The results indicated that Formula I estimations consistently overestimated the known PMI by a large and inconsistent margin for bodies exposed to warm temperatures, while for bodies exposed to cold and freezing temperatures (less than 4°C) the PMI was dramatically underestimated. The ability of 'Formula II' to estimate the PMI for buried bodies was also examined using a set of 22 known Canadian burial cases. As the cases used in this study are retrospective, some of the data needed for Formula II were not available. The 4.6 value used in Formula II to represent the standard ratio by which burial decelerates the rate of decomposition was examined. The average time taken to achieve each stage of decomposition both on and under the surface was compared for the 118 known cases. It was found that the rate of decomposition was not consistent throughout all stages of decomposition. The rates of autolysis above and below the ground were equivalent, with the buried cases staying in a state of putrefaction for a prolonged period of time. It is suggested that differences in temperature extremes and humidity levels between geographic regions may make it impractical to apply formulas developed in one geographic region to another.

  9. Simple and Reliable Method to Estimate the Fingertip Static Coefficient of Friction in Precision Grip.

    Science.gov (United States)

    Barrea, Allan; Bulens, David Cordova; Lefevre, Philippe; Thonnard, Jean-Louis

    2016-01-01

    The static coefficient of friction (µ_static) plays an important role in dexterous object manipulation. Minimal normal force (i.e., grip force) needed to avoid dropping an object is determined by the tangential force at the fingertip-object contact and the frictional properties of the skin-object contact. Although frequently assumed to be constant for all levels of normal force (NF, the force normal to the contact), µ_static actually varies nonlinearly with NF and increases at low NF levels. No method is currently available to measure the relationship between µ_static and NF easily. Therefore, we propose a new method allowing the simple and reliable measurement of the fingertip µ_static at different NF levels, as well as an algorithm for determining µ_static from measured forces and torques. Our method is based on active, back-and-forth movements of a subject's finger on the surface of a fixed six-axis force and torque sensor. µ_static is computed as the ratio of the tangential to the normal force at slip onset. A negative power law captures the relationship between µ_static and NF. Our method allows the continuous estimation of µ_static as a function of NF during dexterous manipulation, based on the relationship between µ_static and NF measured before manipulation.
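
    The negative power law mentioned in the record can be fit by linear regression in log-log space; a sketch with invented (NF, µ_static) pairs:

        import numpy as np

        # Fit the negative power law mu_static = a * NF**b (b < 0).
        nf = np.array([0.5, 1.0, 2.0, 4.0, 8.0])       # normal force [N]
        mu = np.array([1.9, 1.5, 1.25, 1.05, 0.9])     # static coefficient at slip onset

        b, log_a = np.polyfit(np.log(nf), np.log(mu), 1)
        print(f"mu_static ~ {np.exp(log_a):.2f} * NF^({b:.2f})")   # b comes out negative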

  11. Using operational data to estimate the reliable yields of water-supply wells

    Science.gov (United States)

    Misstear, Bruce D. R.; Beeson, Sarah

    The reliable yield of a water-supply well depends on many different factors, including the properties of the well and the aquifer; the capacities of the pumps, raw-water mains, and treatment works; the interference effects from other wells; and the constraints imposed by abstraction licences, water quality, and environmental issues. A relatively simple methodology for estimating reliable yields has been developed that takes into account all of these factors. The methodology is based mainly on an analysis of water-level and source-output data, where such data are available. Good operational data are especially important when dealing with wells in shallow, unconfined, fissure-flow aquifers, where actual well performance may vary considerably from that predicted using a more analytical approach. Key issues in the yield-assessment process are the identification of a deepest advisable pumping water level, and the collection of the appropriate well, aquifer, and operational data. Although developed for water-supply operators in the United Kingdom, this approach to estimating the reliable yields of water-supply wells using operational data should be applicable to a wide range of hydrogeological conditions elsewhere.

  12. A Comparison of the Approaches of Generalizability Theory and Item Response Theory in Estimating the Reliability of Test Scores for Testlet-Composed Tests

    Science.gov (United States)

    Lee, Guemin; Park, In-Yong

    2012-01-01

    Previous assessments of the reliability of test scores for testlet-composed tests have indicated that item-based estimation methods overestimate reliability. This study was designed to address issues related to the extent to which item-based estimation methods overestimate the reliability of test scores composed of testlets and to compare several…

  13. Estimating the reliability of glycemic index values and potential sources of methodological and biological variability.

    Science.gov (United States)

    Matthan, Nirupa R; Ausman, Lynne M; Meng, Huicui; Tighiouart, Hocine; Lichtenstein, Alice H

    2016-10-01

    The utility of glycemic index (GI) values for chronic disease risk management remains controversial. Although absolute GI value determinations for individual foods have been shown to vary significantly in individuals with diabetes, there is a dearth of data on the reliability of GI value determinations and potential sources of variability among healthy adults. We examined the intra- and inter-individual variability in glycemic response to a single food challenge and the methodologic and biological factors that potentially mediate this response. The GI value for white bread was determined by using standardized methodology in 63 volunteers free from chronic disease and recruited to differ by sex, age (18-85 y), and body mass index [BMI (in kg/m²): 20-35]. Volunteers randomly underwent 3 sets of food challenges involving glucose (reference) and white bread (test food), both providing 50 g available carbohydrates. Serum glucose and insulin were monitored for 5 h postingestion, and GI values were calculated by using different area under the curve (AUC) methods. Biochemical variables were measured by using standard assays and body composition by dual-energy X-ray absorptiometry. The mean ± SD GI value for white bread was 62 ± 15 when calculated by using the recommended method. Mean intra- and interindividual CVs were 20% and 25%, respectively. Increasing sample size, replication of reference and test foods, and length of blood sampling, as well as the AUC calculation method, did not improve the CVs. Among the biological factors assessed, insulin index and glycated hemoglobin values explained 15% and 16% of the variability in the mean GI value for white bread, respectively. These data indicate that there is substantial variability in individual responses to GI value determinations, demonstrating that it is unlikely to be a good approach to guiding food choices. Additionally, even in healthy individuals, glycemic status significantly contributes to the variability in GI value determinations.
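
    The GI computation the record alludes to is an incremental AUC ratio; a minimal sketch with invented 2-h glucose profiles (the helper names and numbers are illustrative, not the study's data):

        import numpy as np

        def incremental_auc(t, glucose):
            """Trapezoidal area above the fasting baseline (negative increments ignored)."""
            inc = np.clip(np.asarray(glucose, dtype=float) - glucose[0], 0, None)
            return np.trapz(inc, t)

        def glycemic_index(t, test, reference):
            return 100.0 * incremental_auc(t, test) / incremental_auc(t, reference)

        # Invented 2-h profiles sampled every 30 min (mmol/L):
        t = np.array([0, 30, 60, 90, 120])
        ref = [5.0, 8.2, 7.1, 6.0, 5.2]     # 50 g glucose (reference)
        test = [5.0, 7.0, 6.6, 5.8, 5.1]    # white bread, 50 g available CHO
        print(round(glycemic_index(t, test, ref)))   # ~70 for these made-up curves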

  14. Integration of external estimated breeding values and associated reliabilities using correlations among traits and effects.

    Science.gov (United States)

    Vandenplas, J; Colinet, F G; Glorieux, G; Bertozzi, C; Gengler, N

    2015-12-01

    Based on a Bayesian view of linear mixed models, several studies showed the possibilities to integrate estimated breeding values (EBV) and associated reliabilities (REL) provided by genetic evaluations performed outside a given evaluation system into this genetic evaluation. Hereafter, the term "internal" refers to this given genetic evaluation system, and the term "external" refers to all other genetic evaluations performed outside the internal evaluation system. Bayesian approaches integrate external information (i.e., external EBV and associated REL) by altering both the mean and (co)variance of the prior distributions of the additive genetic effects based on the knowledge of this external information. Extensions of the Bayesian approaches to multivariate settings are interesting because external information expressed on other scales, measurement units, or trait definitions, or associated with different heritabilities and genetic parameters than the internal traits, could be integrated into a multivariate genetic evaluation without the need to convert external information to the internal traits. Therefore, the aim of this study was to test the integration of external EBV and associated REL, expressed on a 305-d basis and genetically correlated with a trait of interest, into a multivariate genetic evaluation using a random regression test-day model for the trait of interest. The approach we used was a multivariate Bayesian approach. Results showed that the integration of external information led to a genetic evaluation for the trait of interest for, at least, animals associated with external information, as accurate as a bivariate evaluation including all available phenotypic information. In conclusion, the multivariate Bayesian approaches have the potential to integrate external information correlated with the internal phenotypic traits, and potentially to the different random regressions, into a multivariate genetic evaluation. This allows the use of different

  15. Reliability Estimation with Uncertainties Consideration for High Power IGBTs in 2.3 MW Wind Turbine Converter System

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Ma, Ke

    2012-01-01

    This paper investigates the lifetime of high power IGBTs (insulated gate bipolar transistors) used in large wind turbine applications. Since the IGBTs are critical components in a wind turbine power converter, it is of great importance to assess their reliability in the design phase of the turbine.... Minimum, maximum and average junction temperature profiles for the grid-side IGBTs are estimated at each wind speed input value. The selected failure mechanism is crack propagation in the solder joint under the silicon die. Based on the junction temperature profiles and a physics-of-failure model..., the probabilistic and deterministic damage models are presented with estimated fatigue lives. Reliability levels were assessed by means of the First Order Reliability Method, taking into account uncertainties....

  16. A flexible latent class approach to estimating test-score reliability

    NARCIS (Netherlands)

    van der Palm, D.W.; van der Ark, L.A.; Sijtsma, K.

    2014-01-01

    The latent class reliability coefficient (LCRC) is improved by using the divisive latent class model instead of the unrestricted latent class model. This results in the divisive latent class reliability coefficient (DLCRC), which unlike LCRC avoids making subjective decisions about the best solution

  17. On estimation of reliability of a nuclear power plant with tokamak reactor

    International Nuclear Information System (INIS)

    Klemin, A.I.; Smetannikov, V.P.; Shiverskij, E.A.

    1982-01-01

    The results of an analysis of INTOR plant reliability are presented. The first stage of the analysis consists in calculating the INTOR plant structural reliability factors (15 of its main systems have been considered). For each system, the failure flow parameter W (1/h) and the operational readiness K_r have been determined; for the plant as a whole, the technological utilization coefficient K_TU and the mean time between failures T_o have been determined in addition to these factors. The second stage of the reliability analysis consists in investigating methods of improving the plant's reliability factors relative to those calculated at the first stage. It is shown that the reliability of the whole plant is determined to the greatest extent by the reliability of the power supply system. Next in extent of influence on INTOR plant reliability is the cryogenic system. Calculations of the INTOR plant reliability factors have given the following values: W = 4.5x10^-3 1/h, T_o = 152 h, K_r = 0.71, K_TU = 0.4.

  18. Lifetime prediction and reliability estimation methodology for Stirling-type pulse tube refrigerators by gaseous contamination accelerated degradation testing

    Science.gov (United States)

    Wan, Fubin; Tan, Yuanyuan; Jiang, Zhenhua; Chen, Xun; Wu, Yinong; Zhao, Peng

    2017-12-01

    Lifetime and reliability are the two performance parameters of premium importance for modern space Stirling-type pulse tube refrigerators (SPTRs), which are required to operate in excess of 10 years. Demonstration of these parameters provides a significant challenge. This paper proposes a lifetime prediction and reliability estimation method that utilizes accelerated degradation testing (ADT) for SPTRs related to gaseous contamination failure. The method was experimentally validated via three groups of gaseous contamination ADT. First, a performance degradation model based on the mechanism of contamination failure and the material outgassing characteristics of SPTRs was established. Next, a preliminary test was performed to determine whether the mechanism of contamination failure of the SPTRs during ADT is consistent with normal life testing. Subsequently, the experimental program of ADT was designed for SPTRs. Then, three groups of gaseous contamination ADT were performed at elevated ambient temperatures of 40 °C, 50 °C, and 60 °C, respectively, and the estimated lifetimes of the SPTRs under normal conditions were obtained through an acceleration model (the Arrhenius model). The results show a good fit of the degradation model to the experimental data. Finally, we obtained the reliability estimate of the SPTRs by using the Weibull distribution. The proposed methodology makes it possible to estimate, in less than one year of testing, the reliability of SPTRs designed for more than 10 years of operation.

  19. Nuclear reactor component populations, reliability data bases, and their relationship to failure rate estimation and uncertainty analysis

    International Nuclear Information System (INIS)

    Martz, H.F.; Beckman, R.J.

    1981-12-01

    Probabilistic risk analyses are used to assess the risks inherent in the operation of existing and proposed nuclear power reactors. In performing such risk analyses the failure rates of various components which are used in a variety of reactor systems must be estimated. These failure rate estimates serve as input to fault trees and event trees used in the analyses. Component failure rate estimation is often based on relevant field failure data from different reliability data sources such as LERs, NPRDS, and the In-Plant Data Program. Various statistical data analysis and estimation methods have been proposed over the years to provide the required estimates of the component failure rates. This report discusses the basis and extent to which statistical methods can be used to obtain component failure rate estimates. The report is expository in nature and focuses on the general philosophical basis for such statistical methods. Various terms and concepts are defined and illustrated by means of numerous simple examples
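
    As a concrete instance of the estimation problem the report surveys, the sketch below gives a Bayesian point estimate and uncertainty bounds for a constant failure rate from field data, using the gamma-Poisson conjugate pair common in probabilistic risk analysis. The prior choice and the counts are illustrative assumptions, not the report's data.

        from scipy import stats

        # r failures observed in T component-hours; a Jeffreys-type Gamma(0.5)
        # prior gives the posterior Gamma(r + 0.5, scale = 1/T) for the rate.
        r, T = 3, 2.6e5
        posterior = stats.gamma(a=r + 0.5, scale=1.0 / T)
        print("posterior mean rate:", posterior.mean())          # ~1.3e-5 per hour
        print("90% credible bounds:", posterior.ppf([0.05, 0.95]))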

  20. Reliability of different mark-recapture methods for population size estimation tested against reference population sizes constructed from field data.

    Directory of Open Access Journals (Sweden)

    Annegret Grimm

    Reliable estimates of population size are fundamental in many ecological studies and biodiversity conservation. Selecting appropriate methods to estimate abundance is often very difficult, especially if data are scarce. Most studies concerning the reliability of different estimators used simulation data based on assumptions about capture variability that do not necessarily reflect conditions in natural populations. Here, we used data from an intensively studied closed population of the arboreal gecko Gehyra variegata to construct reference population sizes for assessing twelve different population size estimators in terms of bias, precision, accuracy, and their 95%-confidence intervals. Two of the reference populations reflect natural biological entities, whereas the other reference populations reflect artificial subsets of the population. Since individual heterogeneity was assumed, we tested modifications of the Lincoln-Petersen estimator, a set of models in programs MARK and CARE-2, and a truncated geometric distribution. Ranking of methods was similar across criteria. Models accounting for individual heterogeneity performed best in all assessment criteria. For populations from heterogeneous habitats without obvious covariates explaining individual heterogeneity, we recommend using the moment estimator or the interpolated jackknife estimator (both implemented in CAPTURE/MARK). If data for capture frequencies are substantial, we recommend the sample coverage or the estimating equation (both implemented in CARE-2). Depending on the distribution of catchabilities, our proposed multiple Lincoln-Petersen and a truncated geometric distribution obtained comparably good results. The former usually resulted in a minimum population size and the latter can be recommended when there is a long tail of low capture probabilities. Models with covariates and mixture models performed poorly. Our approach identified suitable methods and extended options to

  1. Reliability of single aliquot regenerative protocol (SAR) for dose estimation in quartz at different burial temperatures: A simulation study

    International Nuclear Information System (INIS)

    Koul, D.K.; Pagonis, V.; Patil, P.

    2016-01-01

    The single aliquot regenerative protocol (SAR) is a well-established technique for estimating naturally acquired radiation doses in quartz. This simulation work examines the reliability of the SAR protocol for samples which experienced different ambient temperatures in nature, in the range of −10 to 40 °C. The contribution of various experimental variables used in SAR protocols to the accuracy and precision of the method is simulated for different ambient temperatures. Specifically, the effects of paleo-dose, test dose, pre-heat temperature and cut-heat temperature on the accuracy of equivalent dose (ED) estimation are simulated using random combinations of the concentrations of traps and centers in a previously published comprehensive quartz model. The findings suggest that the ambient temperature has a significant bearing on the reliability of natural dose estimation using the SAR protocol, especially for ambient temperatures above 0 °C. The main source of these inaccuracies seems to be thermal sensitization of the quartz samples caused by the well-known thermal transfer of holes between luminescence centers in quartz. The simulations suggest that most of this inaccuracy in dose estimation can be removed by delivering the laboratory doses in pulses (pulsed irradiation procedures). - Highlights: • Ambient temperature affects the reliability of SAR. • SAR overestimates the dose as burial temperature and burial time increase. • Elevated-temperature irradiation does not correct for these overestimations. • Inaccuracies in dose estimation can be removed by incorporating pulsed irradiation procedures.

  2. Anti-deception: reliable EEG-based biometrics with real-time capability from the neural response of face rapid serial visual presentation.

    Science.gov (United States)

    Wu, Qunjian; Yan, Bin; Zeng, Ying; Zhang, Chi; Tong, Li

    2018-05-03

    The electroencephalogram (EEG) signal represents a subject's specific brain activity patterns and is considered an ideal biometric given its superior invisibility, non-clonality, and non-coercion. In order to enhance its applicability in identity authentication, a novel EEG-based identity authentication method is proposed based on self- or non-self-face rapid serial visual presentation. In contrast to previous studies that extracted EEG features from the rest state or motor imagery, the designed paradigm can obtain a distinct and stable biometric trait at a lower time cost. Channel selection was applied to select specific channels for each user to enhance system portability and improve discriminability between users and imposters. Two different imposter scenarios were designed to test system security, demonstrating the capability of anti-deception. Fifteen users and thirty imposters participated in the experiment. The mean authentication accuracy values for the two scenarios were 91.31% and 91.61%, with a 6 s time cost, which illustrates the precision and real-time capability of the system. Furthermore, in order to estimate the repeatability and stability of our paradigm, another data acquisition session was conducted for each user. Using the classification models generated from the previous sessions, a mean false rejection rate of 7.27% was achieved, which demonstrates the robustness of our paradigm. Experimental results reveal that the proposed paradigm and methods are effective for EEG-based identity authentication.

  3. An application of the fault tree analysis for the power system reliability estimation

    International Nuclear Information System (INIS)

    Volkanovski, A.; Cepin, M.; Mavko, B.

    2007-01-01

    The power system is a complex system with its main function to produce, transfer and provide consumers with electrical energy. Combinations of failures of components in the system can result in a failure of power delivery to certain load points and in some cases in a full blackout of power system. The power system reliability directly affects safe and reliable operation of nuclear power plants because the loss of offsite power is a significant contributor to the core damage frequency in probabilistic safety assessments of nuclear power plants. The method, which is based on the integration of the fault tree analysis with the analysis of the power flows in the power system, was developed and implemented for power system reliability assessment. The main contributors to the power system reliability are identified, both quantitatively and qualitatively. (author)
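
    The kernel of any such fault-tree evaluation is the gate arithmetic over basic-event probabilities; a minimal sketch for independent events, with invented unavailability numbers standing in for two redundant lines and a substation feeding one load point:

        # OR gates union probabilities, AND gates multiply them
        # (independent basic events assumed).
        def or_gate(*p):
            q = 1.0
            for pi in p:
                q *= (1.0 - pi)
            return 1.0 - q

        def and_gate(*p):
            q = 1.0
            for pi in p:
                q *= pi
            return q

        line_a, line_b = 1e-3, 1e-3      # unavailability of two redundant supply paths
        substation = 5e-5
        # Load point lost if both lines fail, or the substation fails:
        print(or_gate(and_gate(line_a, line_b), substation))   # ~5.1e-5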

  4. A Simplified Procedure for Reliability Estimation of Underground Concrete Barriers against Normal Missile Impact

    Directory of Open Access Journals (Sweden)

    N. A. Siddiqui

    2011-06-01

    Underground concrete barriers are frequently used to protect strategic structures, such as nuclear power plants (NPPs), deep under the soil against any possible high-velocity missile impact. For a given range and type of missile (or projectile), it is of paramount importance to examine the reliability of underground concrete barriers under the expected uncertainties in the missile, concrete, and soil parameters. In this paper, a simple procedure for the reliability assessment of underground concrete barriers against normal missile impact is presented using the First Order Reliability Method (FORM). The presented procedure is illustrated by applying it to a concrete barrier that lies at a certain depth in the soil. Some parametric studies are also conducted to obtain the design values that make the barrier as reliable as desired.
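
    FORM reduces the failure probability calculation to finding the reliability index β, the distance from the origin to the limit-state surface in standard normal space. Below is a minimal sketch of the standard Hasofer-Lind/Rackwitz-Fiessler iteration on an invented linear limit state; the barrier's actual perforation limit state is not reproduced here.

        import numpy as np
        from statistics import NormalDist

        def form_beta(g, grad_g, u0, tol=1e-8, itmax=50):
            """Hasofer-Lind/Rackwitz-Fiessler iteration in standard normal space."""
            u = np.asarray(u0, dtype=float)
            for _ in range(itmax):
                a = grad_g(u)
                u_new = (a @ u - g(u)) * a / (a @ a)   # HL-RF update
                if np.linalg.norm(u_new - u) < tol:
                    u = u_new
                    break
                u = u_new
            return np.linalg.norm(u)

        # Invented linear limit state g(u) = 3 - u1 + 0.5*u2 (failure when g < 0):
        g = lambda u: 3.0 - u[0] + 0.5 * u[1]
        grad_g = lambda u: np.array([-1.0, 0.5])

        beta = form_beta(g, grad_g, u0=[0.0, 0.0])
        print(beta, NormalDist().cdf(-beta))   # reliability index and Pf estimate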

  5. On the q-Weibull distribution for reliability applications: An adaptive hybrid artificial bee colony algorithm for parameter estimation

    International Nuclear Information System (INIS)

    Xu, Meng; Droguett, Enrique López; Lins, Isis Didier; Chagas Moura, Márcio das

    2017-01-01

    The q-Weibull model is based on the Tsallis non-extensive entropy and is able to model various behaviors of the hazard rate function, including bathtub curves, by using a single set of parameters. Despite its flexibility, the q-Weibull has not been widely used in reliability applications, partly because of its complicated parameter estimation. In this work, the parameters of the q-Weibull are estimated by the maximum likelihood (ML) method. Due to the intricate system of nonlinear equations, derivative-based optimization methods may fail to converge. Thus, the heuristic optimization method of artificial bee colony (ABC) is used instead. To deal with the slow convergence of ABC, an adaptive hybrid ABC (AHABC) algorithm that dynamically combines the Nelder-Mead simplex search method with ABC is proposed for the ML estimation of the q-Weibull parameters. Interval estimates for the q-Weibull parameters, including confidence intervals based on the ML asymptotic theory and on bootstrap methods, are also developed. The AHABC is validated via numerical experiments involving the q-Weibull ML for reliability applications, and results show that it produces faster and more accurate convergence when compared to ABC and similar approaches. The estimation procedure is applied to real reliability failure data characterized by a bathtub-shaped hazard rate. - Highlights: • Development of an Adaptive Hybrid ABC (AHABC) algorithm for the q-Weibull distribution. • AHABC combines the local Nelder-Mead simplex method with ABC to enhance local search. • AHABC efficiently finds the optimal solution for the q-Weibull ML problem. • AHABC outperforms ABC and self-adaptive hybrid ABC in accuracy and convergence speed. • Useful model for reliability data with non-monotonic hazard rates.
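
    A sketch of the likelihood being maximized, using SciPy's plain Nelder-Mead in place of the paper's hybrid ABC search. The failure times and starting point are invented; q = 1, the ordinary Weibull limit, is excluded for simplicity.

        import numpy as np
        from scipy.optimize import minimize

        def qweibull_loglik(t, q, beta, eta):
            """q-Weibull log-likelihood; -inf outside the valid parameter region."""
            if not (q < 2.0 and q != 1.0 and beta > 0.0 and eta > 0.0):
                return -np.inf
            z = (t / eta) ** beta
            bracket = 1.0 - (1.0 - q) * z
            if np.any(bracket <= 0.0):
                return -np.inf
            return np.sum(np.log(2.0 - q) + np.log(beta / eta)
                          + (beta - 1.0) * np.log(t / eta)
                          + np.log(bracket) / (1.0 - q))

        # Invented failure times; Nelder-Mead stands in for the paper's AHABC.
        t = np.array([12.0, 35.0, 58.0, 90.0, 130.0, 200.0, 310.0, 500.0])
        res = minimize(lambda p: -qweibull_loglik(t, *p),
                       x0=[1.2, 1.0, t.mean()], method="Nelder-Mead")
        print("q, beta, eta =", res.x)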

  6. Application of Fault Tree Analysis for Estimating Temperature Alarm Circuit Reliability

    International Nuclear Information System (INIS)

    El-Shanshoury, A.I.; El-Shanshoury, G.I.

    2011-01-01

    Fault Tree Analysis (FTA) is one of the most widely used methods in system reliability analysis. It is a graphical technique that provides a systematic description of the combinations of possible occurrences in a system which can result in an undesirable outcome. The presented paper deals with the application of the FTA method to analyzing a temperature alarm circuit. The critical failure of this circuit is failing to alarm when the temperature exceeds a certain limit. In order for the circuit to be safe, a detailed analysis of the faults causing circuit failure is performed by constructing a fault tree diagram (qualitative analysis). Quantitative circuit reliability parameters such as the Failure Rate (FR) and the Mean Time Between Failures (MTBF) are also calculated using the Relex 2009 computer program. Benefits of FTA include assessing system reliability or safety during operation, improving understanding of the system, and identifying root causes of equipment failures.

  7. Estimating Value of Congestion and of Reliability from Observation of Route Choice Behavior of Car Drivers

    DEFF Research Database (Denmark)

    Prato, Carlo Giacomo; Rasmussen, Thomas Kjær; Nielsen, Otto Anker

    2014-01-01

    In recent years, a consensus has been reached about the relevance of calculating the value of congestion and the value of reliability for better understanding and therefore better prediction of travel behavior. The current study proposed a revealed preference approach that used a large amount...... both congestion and reliability terms. Results illustrated that the value of time and the value of congestion were significantly higher in the peak period because of possible higher penalties for drivers being late and consequently possible higher time pressure. Moreover, results showed...... that the marginal rate of substitution between travel time reliability and total travel time did not vary across periods and traffic conditions, with the obvious caveat that the absolute values were significantly higher for the peak period. Last, results showed the immense potential of exploiting the growing...

  8. Efficient Estimation of Extreme Non-linear Roll Motions using the First-order Reliability Method (FORM)

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher

    2007-01-01

    In on-board decision support systems efficient procedures are needed for real-time estimation of the maximum ship responses to be expected within the next few hours, given on-line information on the sea state and user defined ranges of possible headings and speeds. For linear responses standard...... frequency domain methods can be applied. To non-linear responses like the roll motion, standard methods like direct time domain simulations are not feasible due to the required computational time. However, the statistical distribution of non-linear ship responses can be estimated very accurately using...... the first-order reliability method (FORM), well-known from structural reliability problems. To illustrate the proposed procedure, the roll motion is modelled by a simplified non-linear procedure taking into account non-linear hydrodynamic damping, time-varying restoring and wave excitation moments...

  9. Estimating The Reliability of the Lawrence Livermore National Laboratory (LLNL) Flash X-ray (FXR) Machine

    International Nuclear Information System (INIS)

    Ong, M M; Kihara, R; Zentler, J M; Kreitzer, B R; DeHope, W J

    2007-01-01

    At Lawrence Livermore National Laboratory (LLNL), our flash X-ray accelerator (FXR) is used on multi-million dollar hydrodynamic experiments. Because of the importance of the radiographs, FXR must be ultra-reliable. Flash linear accelerators that can generate a 3 kA beam at 18 MeV are very complex. They have thousands, if not millions, of critical components that could prevent the machine from performing correctly. For the last five years, we have quantified and are tracking component failures. From this data, we have determined that the reliability of the high-voltage gas-switches that initiate the pulses, which drive the accelerator cells, dominates the statistics. The failure mode is a single-switch pre-fire that reduces the energy of the beam and degrades the X-ray spot-size. The unfortunate result is a lower resolution radiograph. FXR is a production machine that allows only a modest number of pulses for testing. Therefore, reliability switch testing that requires thousands of shots is performed on our test stand. Study of representative switches has produced pre-fire statistical information and probability distribution curves. This information is applied to FXR to develop test procedures and determine individual switch reliability using a minimal number of accelerator pulses

  10. R&D program benefits estimation: DOE Office of Electricity Delivery and Energy Reliability

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2006-12-04

    The overall mission of the U.S. Department of Energy’s Office of Electricity Delivery and Energy Reliability (OE) is to lead national efforts to modernize the electric grid, enhance the security and reliability of the energy infrastructure, and facilitate recovery from disruptions to the energy supply. In support of this mission, OE conducts a portfolio of research and development (R&D) activities to advance technologies to enhance electric power delivery. Multiple benefits are anticipated to result from the deployment of these technologies, including higher quality and more reliable power, energy savings, and lower cost electricity. In addition, OE engages State and local government decision-makers and the private sector to address issues related to the reliability and security of the grid, including responding to national emergencies that affect energy delivery. The OE R&D activities are comprised of four R&D lines: High Temperature Superconductivity (HTS), Visualization and Controls (V&C), Energy Storage and Power Electronics (ES&PE), and Distributed Systems Integration (DSI).

  11. Estimation of electromagnetic pumps reliability based on the results of their exploitation

    International Nuclear Information System (INIS)

    Vitkovskij, I.V.; Kirillov, I.R.; Chajka, P.Yu.; Kryuchkov, E.A.; Poplavskij, V.M.; Nosov, Yu.V.; Oshkanov, N.N.

    2007-01-01

    The main factors determining the service life of induction electromagnetic pumps (IEPs) are analyzed. It is shown that IEP serviceability depends mainly on winding reliability. The main damaging factors acting on the windings are noted. Expressions for calculating the failure intensity of the coil and case insulation are obtained.

  12. Attenuation of the Squared Canonical Correlation Coefficient under Varying Estimates of Score Reliability

    Science.gov (United States)

    Wilson, Celia M.

    2010-01-01

    Research pertaining to the distortion of the squared canonical correlation coefficient has traditionally been limited to the effects of sampling error and associated correction formulas. The purpose of this study was to compare the degree of attenuation of the squared canonical correlation coefficient under varying conditions of score reliability.…
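
    The attenuation at issue follows from Spearman's classical correction: observed correlations shrink with score unreliability. For single-variable sets, where the canonical correlation reduces to a Pearson correlation, the squared coefficient scales directly with the two reliabilities (a standard psychometric identity, stated here for orientation rather than taken from the study):

        % Spearman's correction for attenuation, and its consequence for
        % the squared (canonical) correlation of single-variate sets:
        r^{\mathrm{obs}}_{xy} = r^{\mathrm{true}}_{xy}\,\sqrt{r_{xx}\, r_{yy}}
        \qquad\Longrightarrow\qquad
        \bigl(R^{\mathrm{obs}}_c\bigr)^2 = \bigl(R^{\mathrm{true}}_c\bigr)^2\, r_{xx}\, r_{yy}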

  13. Reliability and validity of food portion size estimation from images using manual flexible digital virtual meshes

    Science.gov (United States)

    The eButton takes frontal images at 4 second intervals throughout the day. A three-dimensional (3D) manually administered wire mesh procedure has been developed to quantify portion sizes from the two-dimensional (2D) images. This paper reports a test of the interrater reliability and validity of use...

  14. Application of fuzzy-MOORA method: Ranking of components for reliability estimation of component-based software systems

    Directory of Open Access Journals (Sweden)

    Zeeshan Ali Siddiqui

    2016-01-01

    Component-based software system (CBSS) development is an emerging discipline that promises to take software development into a new era. As hardware systems are presently constructed from kits of parts, software systems may also be assembled from components. It is more reliable to reuse software than to create it anew. It is the glue code and the reliability of the individual components that contribute to the reliability of the overall system. Every component contributes to overall system reliability according to the number of times it is used, known as the usage frequency of the component; some components are of critical usage. The usage frequency decides the weight of each component, and according to these weights each component contributes to the overall reliability of the system. Therefore, a ranking of components may be obtained by analyzing their reliability impacts on the overall application. In this paper, we propose the application of fuzzy multi-objective optimization on the basis of ratio analysis (Fuzzy-MOORA). The method helps find the most suitable alternative (software component) from a set of available feasible alternatives. It is an accurate and easy-to-understand tool for solving multi-criteria decision-making problems that have imprecise and vague evaluation data. By the use of ratio analysis, the proposed method determines the most suitable alternative among all possible alternatives, and the dimensionless measurement realizes the ranking of components for estimating CBSS reliability in a non-subjective way. Finally, three case studies are shown to illustrate the use of the proposed technique.
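
    For orientation, the crisp kernel of the MOORA ratio analysis the record builds on: vector-normalize the decision matrix, then score each alternative as the sum of its benefit criteria minus the sum of its cost criteria. The paper's fuzzy version wraps this in fuzzy arithmetic; the criteria and values below are invented.

        import numpy as np

        # Invented criteria: reliability (benefit), usage frequency (benefit), cost.
        X = np.array([[0.90, 120.0, 0.30],    # component 1
                      [0.95, 300.0, 0.45],    # component 2
                      [0.85,  80.0, 0.20]])   # component 3
        benefit = np.array([True, True, False])

        N = X / np.sqrt((X ** 2).sum(axis=0))              # ratio-system normalization
        scores = N[:, benefit].sum(axis=1) - N[:, ~benefit].sum(axis=1)
        print("ranking, best first:", np.argsort(scores)[::-1])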

  15. Automatic training and reliability estimation for 3D ASM applied to cardiac MRI segmentation.

    Science.gov (United States)

    Tobon-Gomez, Catalina; Sukno, Federico M; Butakoff, Constantine; Huguet, Marina; Frangi, Alejandro F

    2012-07-07

    Training active shape models requires collecting manual ground-truth meshes in a large image database. While shape information can be reused across multiple imaging modalities, intensity information needs to be imaging modality and protocol specific. In this context, this study has two main purposes: (1) to test the potential of using intensity models learned from MRI simulated datasets and (2) to test the potential of including a measure of reliability during the matching process to increase robustness. We used a population of 400 virtual subjects (XCAT phantom), and two clinical populations of 40 and 45 subjects. Virtual subjects were used to generate simulated datasets (MRISIM simulator). Intensity models were trained both on simulated and real datasets. The trained models were used to segment the left ventricle (LV) and right ventricle (RV) from real datasets. Segmentations were also obtained with and without reliability information. Performance was evaluated with point-to-surface and volume errors. Simulated intensity models obtained average accuracy comparable to inter-observer variability for LV segmentation. The inclusion of reliability information reduced volume errors in hypertrophic patients (EF errors from 17 ± 57% to 10 ± 18%; LV MASS errors from -27 ± 22 g to -14 ± 25 g), and in heart failure patients (EF errors from -8 ± 42% to -5 ± 14%). The RV model of the simulated images needs further improvement to better resemble image intensities around the myocardial edges. Both for real and simulated models, reliability information increased segmentation robustness without penalizing accuracy.

  16. Bayesian Reliability Estimation for Deteriorating Systems with Limited Samples Using the Maximum Entropy Approach

    OpenAIRE

    Xiao, Ning-Cong; Li, Yan-Feng; Wang, Zhonglai; Peng, Weiwen; Huang, Hong-Zhong

    2013-01-01

    In this paper, a combination of the maximum entropy method and Bayesian inference for the reliability assessment of deteriorating systems is proposed. Due to various uncertainties, scarce data and incomplete information, system parameters usually cannot be determined precisely. These uncertain parameters can be modeled by fuzzy set theory and Bayesian inference, which have been proved useful for deteriorating systems under small sample sizes. The maximum entropy approach can be used to cal...

  17. Availability estimation of repairable systems using reliability graph with general gates (RGGG)

    International Nuclear Information System (INIS)

    Goh, Gyoung Tae

    2009-02-01

    By performing risk analysis, we may obtain sufficient information about a system to redesign it and lower the probability of the occurrence of an accident or mitigate the ensuing consequences. The concept of reliability is widely used to express the risk of systems, but reliability applies to non-repairable systems, whereas nuclear power plant systems are repairable. With repairable systems, repairable components can improve the availability of a system because faults that occur in components can be recovered; hence, availability is the more appropriate concept for repairable systems. The reliability graph with general gates (RGGG) is one of the system reliability analysis methods, and a very intuitive one compared with other methods. However, the RGGG has not yet been applied to repairable systems. The objective of this study is to extend the RGGG to enable the analysis of repairable systems. Determining the probability table for each node is a critical step in calculating system availability in the RGGG method; therefore, finding the proper algorithms and constructing probability tables for various situations is a major part of this study. We find the proper algorithms and probability tables for independent repairable systems, dependent series repairable systems, and k-out-of-m (K/M) redundant parallel repairable systems, so that the availability of a real system can be evaluated using these probability tables. The other part is an example of applying the RGGG method to a real system: for the purpose of this analysis, the charging pumps subsystem of the chemical and volume control system (CVCS) was selected. The RGGG method extended for repairable systems has the same intuitiveness as the original RGGG method, and the example confirms that the availability analysis result from the repairable RGGG method is exact.
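
    For orientation, a minimal sketch of the availability quantities the record contrasts with reliability: a repairable component's steady-state availability from MTBF/MTTR, combined for a k-out-of-m redundant group such as the charging pumps. The pump figures are invented, not from the study.

        from math import comb

        def availability(mtbf, mttr):
            """Steady-state availability of a repairable component."""
            return mtbf / (mtbf + mttr)

        def k_out_of_m(k, m, a):
            """Availability of a k-out-of-m group of identical, independent components."""
            return sum(comb(m, j) * a**j * (1 - a)**(m - j) for j in range(k, m + 1))

        # Invented figures for a 2-out-of-3 charging-pump group:
        a = availability(mtbf=2000.0, mttr=24.0)
        print(a, k_out_of_m(2, 3, a))   # ~0.988 per pump, ~0.9995 for the group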

  18. Reliable Quantification of the Potential for Equations Based on Spot Urine Samples to Estimate Population Salt Intake

    DEFF Research Database (Denmark)

    Huang, Liping; Crino, Michelle; Wu, Jason Hy

    2016-01-01

    BACKGROUND: Methods based on spot urine samples (a single sample at one time-point) have been identified as a possible alternative approach to 24-hour urine samples for determining mean population salt intake. OBJECTIVE: The aim of this study is to identify a reliable method for estimating mean population salt intake from spot urine samples. This will be done by comparing the performance of existing equations against one another and against estimates derived from 24-hour urine samples. The effects of factors such as ethnicity, sex, age, body mass index, antihypertensive drug use, health status ... to a standard format. Individual participant records will be compiled and a series of analyses will be completed to: (1) compare existing equations for estimating 24-hour salt intake from spot urine samples with 24-hour urine samples, and assess the degree of bias according to key demographic and clinical ...

  19. Validity and Reliability of the Brazilian Version of the Rapid Estimate of Adult Literacy in Dentistry – BREALD-30.

    Science.gov (United States)

    Junkes, Monica C; Fraiz, Fabian C; Sardenberg, Fernanda; Lee, Jessica Y; Paiva, Saul M; Ferreira, Fernanda M

    2015-01-01

    The aim of the present study was to translate and cross-culturally adapt the Rapid Estimate of Adult Literacy in Dentistry into Brazilian Portuguese and to test the reliability and validity of this version. After translation and cross-cultural adaptation, interviews were conducted with 258 parents/caregivers of children in treatment at the pediatric dentistry clinics and health units in Curitiba, Brazil. To test the instrument's validity, the scores of the Brazilian Rapid Estimate of Adult Literacy in Dentistry (BREALD-30) were compared based on occupation, monthly household income, educational attainment, general literacy, use of dental services and three dental outcomes. The BREALD-30 demonstrated good internal reliability. Cronbach's alpha ranged from 0.88 to 0.89 when words were deleted individually. The analysis of test-retest reliability revealed excellent reproducibility (intraclass correlation coefficient = 0.983 and Kappa coefficients ranging from moderate to nearly perfect). In the bivariate analysis, BREALD-30 scores were significantly correlated with the level of general literacy (rs = 0.593) and income (rs = 0.327) and significantly associated with occupation, educational attainment, use of dental services, self-rated oral health and the respondent's perception regarding his/her child's oral health. However, only the association between the BREALD-30 score and the respondent's perception regarding his/her child's oral health remained significant in the multivariate analysis. The BREALD-30 demonstrated satisfactory psychometric properties and is therefore applicable to adults in Brazil.
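
    For readers unfamiliar with the internal-consistency statistic reported above, a minimal sketch of Cronbach's alpha on a respondents-by-items score matrix (synthetic data, not the BREALD-30 responses):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    n_items = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return n_items / (n_items - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(258, 1))                           # shared literacy trait
items = (latent + rng.normal(scale=0.8, size=(258, 30)) > 0).astype(float)
print(f"alpha = {cronbach_alpha(items):.2f}")
```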

  20. Estimation of reliability on digital plant protection system in nuclear power plants using fault simulation with self-checking

    International Nuclear Information System (INIS)

    Lee, Jun Seok; Kim, Suk Joon; Seong, Poong Hyun

    2004-01-01

    Safety-critical digital systems in nuclear power plants require high design reliability, so reliable software design and accurate prediction methods for system reliability are important problems. In reliability analysis, the error detection coverage of the system is one of the crucial factors; however, it is difficult to evaluate the error detection coverage of digital instrumentation and control systems in nuclear power plants due to the complexity of such systems. To evaluate the error detection coverage with high efficiency and low cost, simulation-based fault injection with self-checking is needed for digital instrumentation and control systems in nuclear power plants. The target system is the local coincidence logic in the digital plant protection system, and a simplified software model of this target system is used in this work. A C++-based hardware description of a microcomputer simulator system is used to evaluate the error detection coverage of the system. From the simulation results, it is possible to estimate the error detection coverage of the digital plant protection system in nuclear power plants using the simulation-based fault injection method with self-checking. (author)
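
    A toy illustration of how such a coverage figure is typically estimated and bounded; the fault injector below is a hypothetical stand-in, not the paper's simulator model:

```python
import random
from math import sqrt

def inject_and_check(rng: random.Random) -> bool:
    """Stand-in for one fault-injection run: returns True if the
    self-checking mechanism detected the injected fault."""
    return rng.random() < 0.93  # assumed true coverage of 93%

rng = random.Random(42)
n = 10_000
detected = sum(inject_and_check(rng) for _ in range(n))
c_hat = detected / n
half = 1.96 * sqrt(c_hat * (1 - c_hat) / n)   # normal-approximation 95% CI
print(f"estimated coverage = {c_hat:.3f} +/- {half:.3f}")
```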

  1. Reliable methods for computer simulation error control and a posteriori estimates

    CERN Document Server

    Neittaanmäki, P

    2004-01-01

    Recent decades have seen very rapid progress in developing numerical methods based on explicit control over approximation errors. It may be said that nowadays a new direction is forming in numerical analysis, the main goal of which is to develop methods of reliable computation. In general, a reliable numerical method must solve two basic problems: (a) generate a sequence of approximations that converges to a solution and (b) verify the accuracy of these approximations. A computer code for such a method must consist of two respective blocks: solver and checker. In this book, we are chie...

  2. Baseline Assessment of the Department of the Army Cost Estimating and Analysis (CE/A) and Cost Management (CM) Capabilities

    National Research Council Canada - National Science Library

    Doyle, Michael C

    2005-01-01

    .../A) and cost management (CM) capabilities. In particular, it supports the Deputy Assistant Secretary of the Army - Cost and Economics' mission to provide DA with cost, performance and economic analysis in the form of expertise, models, data...

  3. A new lifetime estimation model for a quicker LED reliability prediction

    Science.gov (United States)

    Hamon, B. H.; Mendizabal, L.; Feuillet, G.; Gasse, A.; Bataillou, B.

    2014-09-01

    LED reliability and lifetime prediction is a key point for Solid State Lighting adoption. For this purpose, one hundred and fifty LEDs were aged for a reliability analysis. The LEDs were grouped into nine current-temperature stress conditions, with the stress driving current fixed between 350 mA and 1 A and the ambient temperature between 85°C and 120°C. Using integrating-sphere and I(V) measurements, a cross-study of the evolution of electrical and optical characteristics was performed. The results show two main failure mechanisms regarding lumen maintenance: the first is the typically observed lumen depreciation, and the second is a much quicker depreciation related to an increase in the leakage and non-radiative currents. Models of the typical lumen depreciation and of the leakage resistance depreciation were built using the electrical and optical measurements taken during the aging tests. The combination of these models enables a new method for quicker LED lifetime prediction.
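
    One common way to turn lumen-maintenance data into a lifetime figure (an exponential-decay fit in the spirit of TM-21-style extrapolation, not necessarily the authors' model; the aging data below are invented) is to fit the depreciation curve and solve for the time to 70% output:

```python
import numpy as np

# Hypothetical aging data: hours vs. normalized luminous flux.
t = np.array([0, 1000, 2000, 3000, 4000, 5000, 6000], dtype=float)
phi = np.array([1.00, 0.97, 0.95, 0.92, 0.90, 0.88, 0.86])

# Fit phi(t) = B * exp(-alpha * t) by linear regression on -log(phi).
alpha, neg_ln_b = np.polyfit(t, -np.log(phi), 1)  # slope = alpha, intercept = -ln(B)
b = np.exp(-neg_ln_b)

# L70 lifetime: time until flux drops to 70% of initial output.
l70 = np.log(b / 0.70) / alpha
print(f"alpha = {alpha:.3e} /h, B = {b:.3f}, L70 ~ {l70:,.0f} h")
```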

  4. Limits to the reliability of size-based fishing status estimation for data-poor stocks

    DEFF Research Database (Denmark)

    Kokkalis, Alexandros; Thygesen, Uffe Høgsbro; Nielsen, Anders

    2015-01-01

    For stocks considered "data-poor", no knowledge exists about growth, mortality or recruitment. The only available information comes from catches. Here we examine the ability to assess the level of exploitation of a data-poor stock based only on information about the size of individuals in the catches ... The model is a formulation of the classic Beverton–Holt theory in terms of size, where stock parameters describing growth, natural mortality, recruitment, etc. are determined from life-history invariants. A simulation study was used to compare the reliability of assessments performed under different ... to a considerable improvement in the assessment. Overall, the simulation study demonstrates that it may be possible to classify a data-poor stock as undergoing over- or under-fishing, while the exact status, i.e., how much the fishing mortality is above or below Fmsy, can only be assessed with a substantial ...

  5. Bayesian Reliability Estimation for Deteriorating Systems with Limited Samples Using the Maximum Entropy Approach

    Directory of Open Access Journals (Sweden)

    Ning-Cong Xiao

    2013-12-01

    In this paper, a combination of the maximum entropy method and Bayesian inference for reliability assessment of deteriorating systems is proposed. Due to various uncertainties, scarce data and incomplete information, system parameters usually cannot be determined precisely. These uncertain parameters can be modeled by fuzzy set theory and Bayesian inference, which have proved useful for deteriorating systems under small sample sizes. The maximum entropy approach can be used to calculate the maximum entropy density function of the uncertain parameters more accurately, because it does not require any additional information or assumptions. Finally, two optimization models are presented which can be used to determine the lower and upper bounds of the system's probability of failure under vague environmental conditions. Two numerical examples are investigated to demonstrate the proposed method.
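
    As a much-simplified illustration of Bayesian reliability updating under small samples, the sketch below performs a plain Beta-binomial conjugate update; it omits the paper's fuzzy-set and maximum-entropy machinery, and all numbers are invented:

```python
from scipy import stats

# Prior belief about failure probability p: Beta(a, b).
a, b = 1.0, 19.0              # prior mean 0.05, assumed from past experience

# Small observed sample: 2 failures in 15 demands.
failures, demands = 2, 15
post = stats.beta(a + failures, b + demands - failures)

print(f"posterior mean p  = {post.mean():.4f}")
lo, hi = post.ppf(0.05), post.ppf(0.95)
print(f"90% credible band = [{lo:.4f}, {hi:.4f}]")
print(f"reliability (1-p) = {1 - post.mean():.4f}")
```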

  6. Quick and reliable estimation of power distribution in a PHWR by ANN

    International Nuclear Information System (INIS)

    Dubey, B.P.; Jagannathan, V.; Kataria, S.K.

    1998-01-01

    Knowledge of the distribution of power in all the channels of a Pressurised Heavy Water Reactor (PHWR), as a result of a perturbation caused by one or more of the regulating devices, is very important from the operation and maintenance point of view. Theoretical design codes available for this purpose take several minutes to calculate the channel power distribution on modern PCs. Artificial neural networks (ANNs) have been employed to predict the channel power distribution of Indian PHWRs for any given configuration of the reactor's regulating devices; they produce the result much faster and with good accuracy. This paper describes the ANN methodology, its reliability, the validation range, and the scope for its possible on-line use in the actual reactor.
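
    As an illustration of the general approach (not the authors' network), a generic multi-output regressor can be trained offline on design-code results and then queried in milliseconds; the configuration and channel dimensions below are placeholders:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(4)

# Placeholder training data: 500 device configurations (e.g. 14 regulating
# device positions in [0, 1]) -> 306 channel powers from a design code.
X = rng.random((500, 14))
W = rng.normal(size=(14, 306))
y = X @ W + 0.01 * rng.normal(size=(500, 306))   # stand-in for code output

net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
net.fit(X, y)                                    # slow once, done offline

powers = net.predict(X[:1])                      # fast at run time
print(powers.shape)                              # (1, 306)
```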

  7. Reliability-based fatigue life estimation of shear riveted connections considering dependency of rivet hole failures

    Directory of Open Access Journals (Sweden)

    Leonetti, Davide

    2018-01-01

    Standards and guidelines for the fatigue design of riveted connections make use of a stress range-endurance (S-N) curve based on the net-section stress range, regardless of the number and position of the rivets. Almost all tests on which S-N curves are based are performed with a minimum number of rivets. However, the number of rivets in a row is expected to increase the fail-safe behaviour of the connection, whereas the number of rows is expected to decrease the theoretical stress concentration at the critical locations; these aspects are therefore not considered in the S-N curves. This paper presents a numerical model predicting the fatigue life of riveted connections by performing a system reliability analysis on a double cover plated riveted butt joint. The connection is considered in three geometries, with different numbers of rivets per row and different numbers of rows. The stress state in the connection is evaluated using a finite element model in which the friction coefficient and the clamping force in the rivets are treated deterministically. The probability of failure is evaluated for the main plate, and fatigue failure is assumed to originate at the sides of the rivet holes, the critical locations or hot-spots. The notch stress approach is applied to assess the fatigue life, which is considered a stochastic quantity. Unlike other system reliability models available in the literature, the evaluation of the probability of failure takes into account the stochastic dependence between failures at each critical location, modelled as a parallel system; that is, it considers the change in the state of stress in the connection when a ligament between two rivets fails. A sensitivity study is performed to evaluate the effect of the pretension in the rivets and the friction coefficient on the fatigue life.

  8. Reliability of estimating the room volume from a single room impulse response

    OpenAIRE

    Kuster, M.

    2008-01-01

    The methods investigated for the room volume estimation are based on geometrical acoustics, eigenmode, and diffuse field models and no data other than the room impulse response are available. The measurements include several receiver positions in a total of 12 rooms of vastly different sizes and acoustic characteristics. The limitations in identifying the pivotal specular reflections of the geometrical acoustics model in measured room impulse responses are examined both theoretically and expe...

  9. Evaluation of environmental management cost estimating capabilities for the subject area "Life-cycle economics for radioactive waste management and environmental remediation"

    International Nuclear Information System (INIS)

    Hombach, W.G.

    1995-01-01

    This paper provides a comprehensive perspective on the scope of Environmental Management (EM) activities and on the existing capability to estimate their costs. The scope is defined in terms of both activities and associated cost-driving factors. The capability to estimate this scope was determined by evaluating existing cost estimating tools identified through a survey of the US Department of Energy (DOE), the US Department of Defense (DoD), the US Environmental Protection Agency, and private industry. This paper is largely based on the results of a report produced for the Office of the Secretary of Defense, US Department of Defense, entitled Evaluation of Environmental Management Cost-Estimating Capabilities of Major Defense Acquisition Programs, March 22, 1995. The DoD-sponsored report was designed to have a broad application relevant not only to DoD, but also to other government agencies and industry. In addition to DoD, it has particular application to DOE because significant portions of the analyses and data were derived from DOE environmental management databases, cost models, reports, and work breakdown structures. This paper provides the basis used and methodology employed to conduct an evaluation of selected EM cost estimating tools. The following topics are discussed: Life Cycle of EM Activities; Major Elements of EM Activities; Cost Tool Evaluation Matrix; Results of Cost Tool Evaluations; Cost Tool Development Plan.

  10. Development of a reliable estimation procedure of radioactivity inventory in a BWR plant due to neutron irradiation for decommissioning

    Directory of Open Access Journals (Sweden)

    Tanaka Ken-ichi

    2017-01-01

    Reliable information on the radioactivity inventory resulting from radiological characterization is important for decommissioning planning and is also crucial for carrying out decommissioning effectively and safely. The information is referred to when planning the decommissioning strategy and in applications to the regulator. Reliable information on the radioactivity inventory can be used to optimize the decommissioning processes. In order to perform the radiological characterization reliably, we improved a procedure for evaluating neutron-activated materials in a Boiling Water Reactor (BWR). Neutron-activated materials are calculated with calculation codes, and their validity should be verified with measurements. The evaluation of neutron-activated materials can be divided into two processes: a distribution calculation of the neutron flux and an activation calculation of the materials. The distribution calculation of the neutron flux is performed with neutron transport codes and an appropriate cross-section library to simulate neutron transport phenomena well. Using the neutron-flux distribution, we perform distribution calculations of the radioactivity concentration. We also estimate a time-dependent distribution of the radioactivity classification and the radioactive-waste classification. The information obtained from the evaluation is utilized by other preparatory tasks to make the decommissioning plan and its activities safe and rational.

  11. Estimates of the burst reliability of thin-walled cylinders designed to meet the ASME Code allowables

    International Nuclear Information System (INIS)

    Stancampiano, P.A.; Zemanick, P.P.

    1976-01-01

    Pressure containment components in nuclear power plants are designed by the conventional deterministic safety-factor approach to meet the requirements of the ASME Pressure Vessel Code, Section III. The inevitable variabilities and uncertainties associated with the design, manufacture, installation, and service processes suggest that a probabilistic design approach may also be pertinent. Accordingly, the burst reliabilities of two thin-walled 304 SS cylindrical vessels, such as might be employed in liquid metal plants, are estimated. A large vessel fabricated from rolled plate per ASME SA-240 and a smaller pipe-sized vessel also fabricated from rolled plate per ASME SA-358 are considered. The vessels are sized to just meet the allowable ASME Code primary membrane stresses at 800°F (427°C). The probability that the operating pressure exceeds the burst strength of the cylinders is calculated using stress-strength interference theory by direct Monte Carlo simulation on a high-speed digital computer. A sensitivity study is employed to identify those design parameters which have the greatest effect on the reliability. The effects of pre-service quality assurance defect inspections on the reliability are also evaluated parametrically.
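
    A minimal sketch of the stress-strength interference calculation described above; the distributions and parameters are illustrative assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000

# Illustrative assumptions: lognormal operating pressure (stress) and
# normal burst strength, both in MPa.
stress = rng.lognormal(mean=np.log(8.0), sigma=0.10, size=n)
strength = rng.normal(loc=14.0, scale=1.2, size=n)

# Failure whenever a sampled stress exceeds the sampled strength.
p_burst = np.mean(stress > strength)
se = np.sqrt(p_burst * (1 - p_burst) / n)
print(f"P(burst) ~ {p_burst:.2e} (MC std. err. {se:.1e})")
```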

  12. Development of a reliable estimation procedure of radioactivity inventory in a BWR plant due to neutron irradiation for decommissioning

    Science.gov (United States)

    Tanaka, Ken-ichi; Ueno, Jun

    2017-09-01

    Reliable information on the radioactivity inventory resulting from radiological characterization is important for decommissioning planning and is also crucial for carrying out decommissioning effectively and safely. The information is referred to when planning the decommissioning strategy and in applications to the regulator. Reliable information on the radioactivity inventory can be used to optimize the decommissioning processes. In order to perform the radiological characterization reliably, we improved a procedure for evaluating neutron-activated materials in a Boiling Water Reactor (BWR). Neutron-activated materials are calculated with calculation codes, and their validity should be verified with measurements. The evaluation of neutron-activated materials can be divided into two processes: a distribution calculation of the neutron flux and an activation calculation of the materials. The distribution calculation of the neutron flux is performed with neutron transport codes and an appropriate cross-section library to simulate neutron transport phenomena well. Using the neutron-flux distribution, we perform distribution calculations of the radioactivity concentration. We also estimate a time-dependent distribution of the radioactivity classification and the radioactive-waste classification. The information obtained from the evaluation is utilized by other preparatory tasks to make the decommissioning plan and its activities safe and rational.

  13. Systematic review of survival time in experimental mouse stroke with impact on reliability of infarct estimation

    DEFF Research Database (Denmark)

    Klarskov, Carina Kirstine; Klarskov, Mikkel Buster; Hasseldam, Henrik

    2016-01-01

    A number of reasons for the translational problems from mouse experimental stroke to clinical trials probably exists, including infarct size estimations around the peak time of edema formation. Furthermore, edema is a more prominent feature of stroke in mice than in humans, because of the tendency to produce larger infarcts with more substantial edema. Purpose: This paper will give an overview of previous studies of experimental mouse stroke and correlate survival time to the peak time of edema formation. Furthermore, it investigates whether the included studies corrected the infarct measurements for edema ... of the investigated process. Our findings indicate a need for more research in this area, and for the establishment of a common correction methodology.

  14. Formulating informative, data-based priors for failure probability estimation in reliability analysis

    International Nuclear Information System (INIS)

    Guikema, Seth D.

    2007-01-01

    Priors play an important role in the use of Bayesian methods in risk analysis, and using all available information to formulate an informative prior can lead to more accurate posterior inferences. This paper examines the practical implications of using five different methods for formulating an informative prior for a failure probability based on past data. These methods are the method of moments, maximum likelihood (ML) estimation, maximum entropy estimation, starting from a non-informative 'pre-prior', and fitting a prior based on confidence/credible interval matching. The priors resulting from the use of these different methods are compared qualitatively, and the posteriors are compared quantitatively based on a number of different scenarios of observed data used to update the priors. The results show that the amount of information assumed in the prior makes a critical difference in the accuracy of the posterior inferences. For situations in which the data used to formulate the informative prior are an accurate reflection of the data that are later observed, the ML approach yields the minimum-variance posterior. However, the maximum entropy approach is more robust to differences between the data used to formulate the prior and the observed data, because it maximizes the uncertainty in the prior subject to the constraints imposed by the past data.
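
    A small sketch of one of the five approaches discussed, the method of moments, used to turn past failure-probability observations into an informative Beta prior (all numbers invented):

```python
import numpy as np

# Past observed failure probabilities from comparable systems (invented).
past = np.array([0.02, 0.05, 0.03, 0.04, 0.06])
m, v = past.mean(), past.var(ddof=1)

# Method of moments for Beta(a, b): match mean and variance.
# mean = a/(a+b), var = mean*(1-mean)/(a+b+1)  =>  a+b = m(1-m)/v - 1
common = m * (1 - m) / v - 1
a, b = m * common, (1 - m) * common
print(f"prior: Beta(a={a:.2f}, b={b:.2f})")

# Bayesian update with newly observed data: 1 failure in 40 trials.
a_post, b_post = a + 1, b + 39
print(f"posterior mean failure probability = {a_post / (a_post + b_post):.4f}")
```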

  15. Framework for estimating response time data to conduct a seismic human reliability analysis - its feasibility

    International Nuclear Information System (INIS)

    Park, Jinkyun; Kim, Yochan; Jung, Wondea; Jang, Seung Cheol

    2014-01-01

    This is because the PSA has been used for several decades as the representative tool to evaluate the safety of NPPs. To this end, it is necessary to evaluate human error probabilities (HEPs) for the important tasks considered in the PSA framework (i.e., HFEs, human failure events), which can significantly affect the safety of NPPs. In addition, it should be emphasized that the provision of realistic human performance data is an important precondition for calculating HEPs under a seismic condition. Unfortunately, it seems that the HRA methods currently used for calculating HEPs under a seismic event do not properly consider the performance variation of human operators. For this reason, this paper suggests a systematic framework for estimating response time data, which are among the most critical inputs for calculating HEPs, with respect to seismic intensity. Although an extensive review of the existing literature is indispensable for identifying the response times of human operators who have to conduct a series of tasks prescribed in procedures based on a couple of wrong indications, it is highly expected that response time data for seismic HRA can be properly secured by revisiting response time data collected from diverse situations that do not involve a seismic event.

  16. MARINE CORPS ASIA PACIFIC REALIGNMENT: DOD Should Resolve Capability Deficiencies and Infrastructure Risks and Revise Cost Estimates

    Science.gov (United States)

    2017-04-01

    Hawaii beginning in 2027. The addition of the Marines will likely cause additional strain on already stressed training ranges in Hawaii. As of April ... controlled - identified as best practices in the GAO Schedule Assessment Guide. A reliable schedule allows program management to decide between ... possible sequences of activities, determine the flexibility of the schedule according to available resources, and predict the consequences of managerial ...

  17. A fast and reliable method for simultaneous waveform, amplitude and latency estimation of single-trial EEG/MEG data.

    Directory of Open Access Journals (Sweden)

    Wouter D Weeda

    The amplitude and latency of single-trial EEG/MEG signals may provide valuable information concerning human brain functioning. In this article we propose a new method to reliably estimate the single-trial amplitude and latency of EEG/MEG signals. The advantages of the method are fourfold. First, no a priori specified template function is required. Second, the method allows for multiple signals that may vary independently in amplitude and/or latency. Third, the method is less sensitive to noise as it models data with a parsimonious set of basis functions. Finally, the method is very fast since it is based on an iterative linear least squares algorithm. A simulation study shows that the method yields reliable estimates under different levels of latency variation and signal-to-noise ratios. Furthermore, it shows that the existence of multiple signals can be correctly determined. An application to empirical data from a choice reaction time study indicates that the method describes these data accurately.

  18. Validity and Reliability of the Brazilian Version of the Rapid Estimate of Adult Literacy in Dentistry – BREALD-30

    Science.gov (United States)

    Junkes, Monica C.; Fraiz, Fabian C.; Sardenberg, Fernanda; Lee, Jessica Y.; Paiva, Saul M.; Ferreira, Fernanda M.

    2015-01-01

    Objective The aim of the present study was to translate and cross-culturally adapt the Rapid Estimate of Adult Literacy in Dentistry into Brazilian Portuguese and to test the reliability and validity of this version. Methods After translation and cross-cultural adaptation, interviews were conducted with 258 parents/caregivers of children in treatment at the pediatric dentistry clinics and health units in Curitiba, Brazil. To test the instrument's validity, the scores of the Brazilian Rapid Estimate of Adult Literacy in Dentistry (BREALD-30) were compared based on occupation, monthly household income, educational attainment, general literacy, use of dental services and three dental outcomes. Results The BREALD-30 demonstrated good internal reliability. Cronbach's alpha ranged from 0.88 to 0.89 when words were deleted individually. The analysis of test-retest reliability revealed excellent reproducibility (intraclass correlation coefficient = 0.983 and Kappa coefficients ranging from moderate to nearly perfect). In the bivariate analysis, BREALD-30 scores were significantly correlated with the level of general literacy (rs = 0.593) and income (rs = 0.327) and significantly associated with occupation, educational attainment, use of dental services, self-rated oral health and the respondent's perception regarding his/her child's oral health. However, only the association between the BREALD-30 score and the respondent's perception regarding his/her child's oral health remained significant in the multivariate analysis. Conclusion The BREALD-30 demonstrated satisfactory psychometric properties and is therefore applicable to adults in Brazil. PMID:26158724

  19. Reliability and validity of the Turkish version of the Rapid Estimate of Adult Literacy in Dentistry (TREALD-30).

    Science.gov (United States)

    Peker, Kadriye; Köse, Taha Emre; Güray, Beliz; Uysal, Ömer; Erdem, Tamer Lütfi

    2017-04-01

    To culturally adapt the Turkish version of the Rapid Estimate of Adult Literacy in Dentistry (TREALD-30) for Turkish-speaking adult dental patients and to evaluate its psychometric properties. After translation and cross-cultural adaptation, TREALD-30 was tested in a sample of 127 adult patients who attended a dental school clinic in Istanbul. Data were collected through clinical examinations and self-completed questionnaires, including TREALD-30, the Oral Health Impact Profile (OHIP), the Rapid Estimate of Adult Literacy in Medicine (REALM), two health literacy screening questions, and socio-behavioral characteristics. Psychometric properties were examined using Classical Test Theory (CTT) and Rasch analysis. Internal consistency (Cronbach's alpha = 0.91) and test-retest reliability (intraclass correlation coefficient = 0.99) were satisfactory for TREALD-30. It exhibited good convergent and predictive validity. Monthly family income, years of education, dental flossing, health literacy, and health literacy skills were found to be stronger predictors of patients' oral health literacy (OHL). Confirmatory factor analysis (CFA) confirmed a two-factor model. The Rasch model explained 37.9% of the total variance in this dataset. In addition, TREALD-30 had eleven misfitting items, which indicated evidence of multidimensionality. The reliability indices provided by the Rasch analysis (person separation reliability = 0.91 and expected-a-posteriori/plausible reliability = 0.94) indicated that TREALD-30 had acceptable reliability. TREALD-30 showed satisfactory psychometric properties. It may be used to identify patients with low OHL. Socio-demographic factors, oral health behaviors and health literacy skills should be taken into account when planning future studies to assess OHL in both clinical and community settings.

  20. Intra-rater reliability of motor unit number estimation and quantitative motor unit analysis in subjects with amyotrophic lateral sclerosis.

    Science.gov (United States)

    Ives, Colleen T; Doherty, Timothy J

    2014-01-01

    To assess the intra-rater reliability of decomposition-enhanced spike-triggered averaging (DE-STA) motor unit number estimation (MUNE) and quantitative motor unit potential analysis in the upper trapezius (UT) and biceps brachii (BB) of subjects with amyotrophic lateral sclerosis (ALS), and to compare the results from the UT to control data. Patients diagnosed with clinically probable or definite ALS completed the experimental protocol twice with the same evaluator for the UT (n=10) and BB (n=9). Intra-rater reliability for the UT was good for the maximum compound muscle action potential (CMAP) (ICC=0.88), mean surface-detected motor unit potential (S-MUP) (ICC=0.87) and MUNE (ICC=0.88), and for the BB was moderate for maximum CMAP (ICC=0.61) and excellent for mean S-MUP (ICC=0.94) and MUNE (ICC=0.93). A significant difference between tests was found for UT MUNE. Comparing subjects with ALS to control subjects, UT maximum CMAP (p<0.01) and MUNE (p<0.001) values were significantly lower, and mean S-MUP values significantly greater (p<0.05), in subjects with ALS. This study has demonstrated the ability of the DE-STA MUNE technique to collect highly reliable data from two separate muscle groups and to detect the underlying pathophysiology of the disease. This was the first study to examine the reliability of this technique in subjects with ALS, and it demonstrates its potential for future use as an outcome measure in ALS clinical trials and studies of ALS disease severity and natural history.

  1. Assimilation of Remotely Sensed Soil Moisture Profiles into a Crop Modeling Framework for Reliable Yield Estimations

    Science.gov (United States)

    Mishra, V.; Cruise, J.; Mecikalski, J. R.

    2017-12-01

    Much effort has been expended recently on the assimilation of remotely sensed soil moisture into operational land surface models (LSMs). These efforts have normally focused on data derived from the microwave bands, and results have often shown that improvements to model simulations are limited by the fact that microwave signals only penetrate the top 2-5 cm of the soil surface. It is possible that model simulations could be further improved through the introduction of geostationary satellite thermal infrared (TIR) based root zone soil moisture in addition to the microwave-derived surface estimates. In this study, root zone soil moisture estimates from the TIR-based Atmospheric Land Exchange Inverse (ALEXI) model were merged with NASA Soil Moisture Active Passive (SMAP) based surface estimates through the application of informational entropy. Entropy can be used to characterize the movement of moisture within the vadose zone and accounts for both advection and diffusion processes. The Principle of Maximum Entropy (POME) can be used to derive complete soil moisture profiles and, fortuitously, only requires a surface boundary condition as well as the overall mean moisture content of the soil column. A lower boundary can be treated as a soil parameter or obtained from the LSM itself. In this study, SMAP provided the surface boundary while ALEXI supplied the mean, and the entropy integral was used to tie the two together and produce the vertical profile. However, prior to the merging, the coarse-resolution (9 km) SMAP data were downscaled to the finer-resolution (4.7 km) ALEXI grid. The disaggregation scheme followed the soil evaporative efficiency approach and, again, all necessary inputs were available from the TIR model. The profiles were then assimilated into a standard agricultural crop model (Decision Support System for Agrotechnology Transfer, DSSAT) via the ensemble Kalman filter. The study was conducted over the Southeastern United States for the ...
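
    For readers unfamiliar with the assimilation step, a generic perturbed-observation ensemble Kalman filter update on a toy layered soil-moisture state (unrelated to the study's grids or to DSSAT):

```python
import numpy as np

rng = np.random.default_rng(3)
n_state, n_ens = 5, 100                     # soil-moisture layers, ensemble size

# Forecast ensemble: layers share a common signal so the surface
# observation can inform the root zone (all numbers are toy values).
base = rng.normal(0.25, 0.05, size=(1, n_ens))
ens = base + rng.normal(0.0, 0.02, size=(n_state, n_ens))

H = np.zeros((1, n_state)); H[0, 0] = 1.0   # observe the surface layer only
r = 0.02 ** 2                               # observation error variance
y = 0.30                                    # SMAP-like surface retrieval

P = np.cov(ens)                             # ensemble covariance
K = P @ H.T / (H @ P @ H.T + r)             # Kalman gain

y_pert = y + rng.normal(0.0, np.sqrt(r), size=n_ens)   # perturbed observations
ens_a = ens + K @ (y_pert - H @ ens)

print("prior layer means    :", ens.mean(axis=1).round(3))
print("posterior layer means:", ens_a.mean(axis=1).round(3))
```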

  2. Precipitation estimates and comparison of satellite rainfall data to in situ rain gauge observations to further develop the watershed-modeling capabilities for the Lower Mekong River Basin

    Science.gov (United States)

    Dandridge, C.; Lakshmi, V.; Sutton, J. R. P.; Bolten, J. D.

    2017-12-01

    This study focuses on the lower region of the Mekong River Basin (MRB), an area including Burma, Cambodia, Vietnam, Laos, and Thailand. This region is home to expansive agriculture that relies heavily on annual precipitation over the basin for its prosperity. Annual precipitation amounts are regulated by the global monsoon system and therefore vary throughout the year. This research will lead to improved prediction of floods and management of floodwaters for the MRB. We compare different satellite estimates of precipitation to each other and to in-situ precipitation estimates for the basin. These comparisons will help us determine which satellite precipitation estimates are better at predicting precipitation in the MRB and will help further our understanding of watershed-modeling capabilities for the basin. In this study we use: 1) NOAA's PERSIANN Climate Data Record (CDR), a daily 0.25° precipitation estimate; 2) NASA's Tropical Rainfall Measuring Mission (TRMM) daily 0.25° estimate; 3) NASA's Global Precipitation Measurement (GPM) daily 0.1° estimate; and 4) daily precipitation estimates from 488 in-situ stations located in the lower MRB. The PERSIANN CDR provided the longest data record, being available from 1983 to present; the TRMM estimate is available from 2000 to present and the GPM estimate from 2015 to present. It is for this reason that we provide several comparisons between our precipitation estimates. Comparisons were done between each satellite product and the in-situ precipitation estimates based on geographical location and date, using the entire available data record for each satellite product, for daily, monthly, and yearly precipitation estimates. We found that monthly PERSIANN precipitation estimates were able to explain up to 90% of the variability in station precipitation, depending on station location.

  3. CO₂-recycling by plants: how reliable is the carbon isotope estimation?

    Energy Technology Data Exchange (ETDEWEB)

    Siegwolf, R.T.W.; Saurer, M. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)]; Koerner, C. [Basel Univ., Basel (Switzerland)]

    1997-06-01

    In the study of plant carbon relations, the amount of respiratory losses from the soil was estimated by determining the gradient of the stable isotope ¹³C with increasing plant canopy height. According to the literature, 8-26% of the CO₂ released in forests by soil and plant respiratory processes is reassimilated (recycled) by photosynthesis during the day. Our own measurements, however, which we conducted in grassland, showed diverging results, ranging from no indication of carbon recycling to a considerable δ¹³C gradient suggesting a high carbon recycling rate. The role of other factors that also influence δ¹³C in a canopy, such as air humidity and irradiation, is discussed. (author) 3 figs., 4 refs.

  4. Classification of Amazonian rosewood essential oil by Raman spectroscopy and PLS-DA with reliability estimation.

    Science.gov (United States)

    Almeida, Mariana R; Fidelis, Carlos H V; Barata, Lauro E S; Poppi, Ronei J

    2013-12-15

    The Amazon tree Aniba rosaeodora Ducke (rosewood) provides an essential oil valuable for the perfume industry, but after decades of predatory extraction it is at risk of extinction. The extraction of the essential oil from wood requires cutting down the tree, so the study of oil extracted from the leaves is important as a sustainable alternative. The goal of this study was to test the applicability of Raman spectroscopy and Partial Least Squares Discriminant Analysis (PLS-DA) as a means to classify the essential oil extracted from different parts (wood, leaves and branches) of the Brazilian tree A. rosaeodora. For the development of the classification models, the Raman spectra were split into two sets: training and test. The value of the limit that separates the classes was calculated based on the distribution of the training samples, in such a way that the classes are divided with a low probability of incorrect classification for future estimates. The best model presented sensitivity and specificity of 100%, and predictive accuracy and efficiency of 100%. These results give an overall vision of the behavior of the model but do not give information about individual samples; therefore, a confidence interval for the classification of each sample was also calculated using the bootstrap resampling technique. The methodology developed has the potential to be an alternative to standard procedures used for oil analysis, and it can be employed as a screening method, since it is fast, non-destructive and robust.
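
    A schematic of PLS-DA classification with a bootstrap-style reliability check in the spirit of the approach described, using synthetic spectra and scikit-learn's PLSRegression as the PLS-DA engine:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)

# Synthetic "spectra": 40 samples x 200 wavenumbers, two classes with a
# class-specific band; y is one-hot encoded for PLS-DA.
X = rng.normal(size=(40, 200))
labels = np.repeat([0, 1], 20)
X[labels == 1, 50:60] += 0.8
Y = np.eye(2)[labels]

pls = PLSRegression(n_components=2).fit(X, Y)
pred = pls.predict(X).argmax(axis=1)          # assign class with highest score
print("training accuracy:", (pred == labels).mean())

# Bootstrap the prediction for one spectrum to gauge its reliability.
x_new = X[0:1]
votes = []
for _ in range(200):
    idx = rng.integers(0, len(X), size=len(X))   # resample the training set
    m = PLSRegression(n_components=2).fit(X[idx], Y[idx])
    votes.append(m.predict(x_new).argmax(axis=1)[0])
frac = np.mean(np.array(votes) == pred[0])
print(f"bootstrap agreement with original assignment: {frac:.2f}")
```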

  5. Reliability of estimated glomerular filtration rate in patients treated with platinum-containing therapy

    DEFF Research Database (Denmark)

    Lauritsen, Jakob; Gundgaard, Maria G; Mortensen, Mette S

    2014-01-01

    ... (median percentage error), precision (median absolute percentage error) and accuracy (p10 and p30). The precision of carboplatin dosage based on eGFR was calculated. Data on mGFR, eGFR, and PCr were available in 390 patients, with a total of ∼1,600 measurements. Median PCr and mGFR synchronously decreased after chemotherapy, yielding high bias and low precision for most estimates. Post-chemotherapy, bias ranged from -0.2% (MDRD after four cycles) to 33.8% (CKD-EPI after five cycles+), precision ranged from 11.6% (MDRD after four cycles) to 33.8% (CKD-EPI after five cycles+), and accuracy (p30) ranged from 37.5% (CKD-EPI after five cycles+) to 86.9% (MDRD after four cycles). Although MDRD appeared acceptable after chemotherapy because of its high accuracy, this equation underestimated GFR in all other measurements. Before and years after treatment, Cockcroft-Gault and Wright offered the best results ...
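
    The bias, precision and accuracy metrics quoted above can be computed as follows (a generic sketch with invented paired measurements, not the study data):

```python
import numpy as np

# Paired measurements: measured GFR (mGFR) and estimated GFR (eGFR).
mgfr = np.array([95.0, 80.0, 110.0, 70.0, 60.0, 88.0])
egfr = np.array([90.0, 86.0, 100.0, 75.0, 52.0, 95.0])

pct_err = 100.0 * (egfr - mgfr) / mgfr
bias = np.median(pct_err)                     # median percentage error
precision = np.median(np.abs(pct_err))        # median absolute percentage error
p10 = np.mean(np.abs(pct_err) <= 10) * 100    # % of estimates within 10% of mGFR
p30 = np.mean(np.abs(pct_err) <= 30) * 100    # % within 30%

print(f"bias {bias:+.1f}%, precision {precision:.1f}%, p10 {p10:.0f}%, p30 {p30:.0f}%")
```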

  6. Methods of Estimating the Reliability and Increasing the Informativeness of Laboratory Results (Analysis of a Laboratory Case of Measuring Indicators of Thyroid Function)

    OpenAIRE

    N A Kovyazina; N A Alhutova; N N Zybina; N M Kalinina

    2014-01-01

    The goal of the study was to demonstrate the multilevel laboratory quality management system and to point out methods of estimating the reliability and increasing the informativeness of laboratory results (using a laboratory case as an example). Results. The article examines the stages of laboratory quality management, which helped to estimate the reliability of the results of determining Free T3, Free T4 and TSH. The measurement results are presented with the expanded unce...

  7. Reliable estimation of antimicrobial use and its evolution between 2010 and 2013 in French swine farms.

    Science.gov (United States)

    Hémonic, Anne; Chauvin, Claire; Delzescaux, Didier; Verliat, Fabien; Corrégé, Isabelle

    2018-01-01

    There has been a strong commitment by both the French swine industry and the national authorities to reducing the use of antimicrobials in swine production since 2010. The annual monitoring of antimicrobial sales by the French Veterinary Medicines Agency (Anses-ANMV) provides estimates, but not detailed figures, on actual on-farm usage of antimicrobials in swine production. In order to provide detailed information on antimicrobial use in the French swine industry in 2010 and 2013, a cross-sectional retrospective study on a representative sample of at least 150 farms was chosen as the methodology. The analysis of the collected data shows a strong and significant decrease in the antimicrobial exposure of pigs between 2010 and 2013. Over three years, the average number of days of treatment significantly decreased by 29% in suckling piglets and by 19% in weaned piglets. In fattening pigs, the drop (-29%) was not statistically significant. Only usage in sows increased over that period (+17%, non-significant), which might be associated with the transition to group-housing of pregnant sows that took place at the time. Also, over that period, the use of third- and fourth-generation cephalosporins decreased by 89% in suckling piglets and by 82% in sows, which confirms that the voluntary moratorium on these classes of antimicrobials decided at the end of 2010 has been effectively implemented. The methodology of random sampling of farms appears to be a precise and robust tool to monitor antimicrobial use within a production animal species, able to fulfil industry and national authorities' objectives and requirements to assess the outcome of concerted efforts on antimicrobial use reduction. It demonstrates that the use of antimicrobials decreased in the French swine industry between 2010 and 2013, including the classes considered critical for human medicine.

  8. Estimating the impact of structural directionality: How reliable are undirected connectomes?

    Directory of Open Access Journals (Sweden)

    Penelope Kale

    2018-06-01

    Directionality is a fundamental feature of network connections. Most structural brain networks are intrinsically directed because of the nature of chemical synapses, which comprise most neuronal connections. Because of the limitations of noninvasive imaging techniques, the directionality of connections between structurally connected regions of the human brain cannot be confirmed. Hence, connections are represented as undirected, and it is still unknown how this lack of directionality affects brain network topology. Using six directed brain networks from different species and parcellations (cat, mouse, C. elegans, and three macaque networks), we estimate the inaccuracies in network measures (degree, betweenness, clustering coefficient, path length, global efficiency, participation index, and small-worldness) associated with the removal of the directionality of connections. We employ three different methods to render directed brain networks undirected: (a) remove unidirectional connections, (b) add reciprocal connections, and (c) combine equal numbers of removed and added unidirectional connections. We quantify the extent of inaccuracy in network measures introduced by neglecting connection directionality, for individual nodes and across the network. We find that the coarse division between core and peripheral nodes remains accurate for undirected networks. However, hub nodes differ considerably when directionality is neglected. Comparing the different methods to generate undirected networks from directed ones, we generally find that the addition of reciprocal connections (false positives) causes larger errors in graph-theoretic measures than the removal of the same number of directed connections (false negatives). These findings suggest that directionality plays an essential role in shaping brain networks and highlight some limitations of undirected connectomes.
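
    A compact sketch of the kind of comparison performed: render a directed adjacency matrix undirected by adding reciprocal connections or by keeping only reciprocal ones, then measure the error in node degree (a random toy network, not one of the six connectomes):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50
A = (rng.random((n, n)) < 0.1).astype(int)   # directed adjacency matrix
np.fill_diagonal(A, 0)

sym = ((A + A.T) > 0).astype(int)            # add reciprocal connections
mut = ((A * A.T) > 0).astype(int)            # keep only reciprocal, drop the rest

deg_true = A.sum(0) + A.sum(1)               # in-degree + out-degree
for name, U in [("add reciprocals", sym), ("remove unidirectional", mut)]:
    deg_u = 2 * U.sum(0)                     # each undirected edge counts twice
    err = np.abs(deg_u - deg_true).mean()
    print(f"{name:>22}: mean |degree error| = {err:.2f}")
```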

  9. A practical approach for calculating reliable cost estimates from observational data: application to cost analyses in maternal and child health.

    Science.gov (United States)

    Salemi, Jason L; Comins, Meg M; Chandler, Kristen; Mogos, Mulubrhan F; Salihu, Hamisu M

    2013-08-01

    Comparative effectiveness research (CER) and cost-effectiveness analysis are valuable tools for informing health policy and clinical care decisions. Despite the increased availability of rich observational databases with economic measures, few researchers have the skills needed to conduct valid and reliable cost analyses for CER. The objectives of this paper are to (i) describe a practical approach for calculating cost estimates from hospital charges in discharge data using publicly available hospital cost reports, and (ii) assess the impact of using different methods for cost estimation in maternal and child health (MCH) studies by conducting economic analyses of gestational diabetes (GDM) and pre-pregnancy overweight/obesity. In Florida, we have constructed a clinically enhanced, longitudinal, encounter-level MCH database covering over 2.3 million infants (and their mothers) born alive from 1998 to 2009. Using this as a template, we describe a detailed methodology to use publicly available data to calculate hospital-wide and department-specific cost-to-charge ratios (CCRs), link them to the master database, and convert reported hospital charges to refined cost estimates. We then conduct an economic analysis as a case study on women by GDM and pre-pregnancy body mass index (BMI) status to compare the impact of using different methods on cost estimation. Over 60% of inpatient charges for birth hospitalizations came from the nursery/labor/delivery units, which have a very different cost-to-charge markup (CCR = 0.70) than the commonly substituted hospital average (CCR = 0.29). Using estimated mean per-person maternal hospitalization costs for women with GDM as an example, unadjusted charges ($US14,696) grossly overestimated actual cost compared with hospital-wide ($US3,498) and department-level ($US4,986) CCR adjustments. However, the refined cost estimation method, although more accurate, did not alter our conclusions that infant/maternal hospitalization costs ...
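
    The charge-to-cost conversion at the heart of the method reduces to multiplying each line-item charge by its department's cost-to-charge ratio; a small sketch with invented charges (the CCR values echo those quoted above):

```python
# Department-specific cost-to-charge ratios (CCRs), e.g. derived from
# publicly available hospital cost reports (the charges are invented,
# illustrating the paper's point that markups differ sharply by department).
ccr = {"nursery_labor_delivery": 0.70, "pharmacy": 0.25, "hospital_wide": 0.29}

charges = [
    ("nursery_labor_delivery", 9000.0),
    ("pharmacy", 2500.0),
]

dept_cost = sum(ccr[dept] * amt for dept, amt in charges)
naive_cost = ccr["hospital_wide"] * sum(amt for _, amt in charges)
print(f"department-level cost estimate: ${dept_cost:,.0f}")
print(f"hospital-wide CCR estimate:     ${naive_cost:,.0f}")
```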

  10. Estimation of gingival crevicular blood glucose level for the screening of diabetes mellitus: A simple yet reliable method.

    Science.gov (United States)

    Parihar, Sarita; Tripathi, Richik; Parihar, Ajit Vikram; Samadi, Fahad M; Chandra, Akhilesh; Bhavsar, Neeta

    2016-01-01

    This study was designed to assess the reliability of blood glucose level estimation in gingival crevicular blood (GCB) for screening diabetes mellitus. 70 patients were included in the study, and a randomized, double-blind clinical trial was performed. Among these, 39 patients were diabetic (including 4 patients who were diagnosed during the study) and the remaining 31 patients were non-diabetic. GCB obtained during routine periodontal examination was analyzed with a glucometer to determine the blood glucose level. The same patients underwent finger-stick blood (FSB) glucose estimation with a glucometer and venous blood (VB) glucose estimation with a standardized laboratory method, as per American Diabetes Association guidelines. All three blood glucose levels were compared. Periodontal parameters were also recorded, including the gingival index (GI) and probing pocket depth (PPD). A strong positive correlation (r) was observed between the glucose levels of GCB and those of FSB and VB, with values of 0.986 and 0.972 in the diabetic group and 0.820 and 0.721 in the non-diabetic group. As well, the mean values of GI and PPD were higher in the diabetic group than in the non-diabetic group, with a statistically significant difference (p < 0.05). GCB can be used for estimating the blood glucose level, as its values were closest to the glucose levels estimated by VB. The technique is safe, easy to perform and non-invasive to the patient, and can increase the frequency of diagnosing diabetes during routine periodontal therapy.

  11. Reliability of CKD-EPI predictive equation in estimating chronic kidney disease prevalence in the Croatian endemic nephropathy area.

    Science.gov (United States)

    Fuček, Mirjana; Dika, Živka; Karanović, Sandra; Vuković Brinar, Ivana; Premužić, Vedran; Kos, Jelena; Cvitković, Ante; Mišić, Maja; Samardžić, Josip; Rogić, Dunja; Jelaković, Bojan

    2018-02-15

    Chronic kidney disease (CKD) is a significant public health problem, and it is not possible to precisely predict its progression to terminal renal failure. According to current guidelines, CKD stages are classified based on the estimated glomerular filtration rate (eGFR) and albuminuria. The aims of this study were to determine the reliability of a predictive equation in estimating CKD prevalence in the Croatian areas with endemic nephropathy (EN), to compare the results with non-endemic areas, and to determine whether the prevalence of CKD stages 3-5 was increased in subjects with EN. A total of 1573 inhabitants of the Croatian Posavina rural area from 6 endemic and 3 non-endemic villages were enrolled. Participants were classified according to the modified criteria of the World Health Organization for EN. Estimated GFR was calculated using the Chronic Kidney Disease Epidemiology Collaboration equation (CKD-EPI). The results showed a very high CKD prevalence in the Croatian rural area (19%). CKD prevalence was significantly higher in EN than in non-EN villages, with the lowest eGFR value in the diseased subgroup. eGFR correlated significantly with the diagnosis of EN. Kidney function assessment using the CKD-EPI predictive equation proved to be a good marker in differentiating the study subgroups and remains one of the diagnostic criteria for EN.
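
    For reference, a sketch of the 2009 CKD-EPI creatinine equation commonly used for eGFR (coefficients as published by Levey et al.; verify against the original publication before any real use):

```python
def ckd_epi_2009(scr_mg_dl: float, age: float, female: bool, black: bool) -> float:
    """Estimated GFR (mL/min/1.73 m^2), 2009 CKD-EPI creatinine equation."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141.0
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

print(f"{ckd_epi_2009(1.1, 60, female=True, black=False):.1f} mL/min/1.73 m^2")
```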

  12. Girsanov's transformation based variance reduced Monte Carlo simulation schemes for reliability estimation in nonlinear stochastic dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Kanjilal, Oindrila, E-mail: oindrila@civil.iisc.ernet.in; Manohar, C.S., E-mail: manohar@civil.iisc.ernet.in

    2017-07-15

    The study considers the problem of simulation-based time-variant reliability analysis of nonlinear, randomly excited dynamical systems. Attention is focused on importance sampling strategies based on the application of Girsanov's transformation method. Controls which minimize the distance function, as in the first-order reliability method (FORM), are shown to minimize a bound on the sampling variance of the estimator for the probability of failure. Two schemes based on the application of the calculus of variations for selecting control signals are proposed: the first obtains the control force as the solution of a two-point nonlinear boundary value problem, and the second explores the application of the Volterra series in characterizing the controls. The relative merits of these schemes, vis-à-vis the method based on ideas from the FORM, are discussed. Illustrative examples, involving archetypal single degree of freedom (dof) nonlinear oscillators and a multi-degree of freedom nonlinear dynamical system, are presented. The credentials of the proposed procedures are established by comparing the solutions with pertinent results from direct Monte Carlo simulations. - Highlights: • The distance-minimizing control forces minimize a bound on the sampling variance. • Girsanov controls are established via the solution of a two-point boundary value problem. • Girsanov controls are obtained via the Volterra series representation of the transfer functions.
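
    The variance-reduction idea can be seen in a static analogue of the Girsanov shift: sample from a shifted density and reweight by the likelihood ratio. The toy rare-event problem below is far simpler than the paper's randomly excited oscillators:

```python
import math
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
level = 4.5                     # failure threshold for X ~ N(0, 1)

# Crude Monte Carlo: the failure region is almost never hit at this sample size.
x = rng.normal(size=n)
p_crude = np.mean(x > level)

# Importance sampling: shift the sampling density to the threshold (the
# static analogue of a distance-minimizing control) and reweight samples
# by the likelihood ratio N(0,1)/N(level,1) = exp(-level*z + level^2/2).
z = rng.normal(loc=level, size=n)
w = np.exp(-level * z + 0.5 * level**2)
p_is = np.mean((z > level) * w)

print(f"crude MC estimate  : {p_crude:.2e}")
print(f"importance sampling: {p_is:.2e}")
print(f"exact probability  : {0.5 * math.erfc(level / math.sqrt(2)):.2e}")
```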

  13. The capability of professional- and lay-rescuers to estimate the chest compression-depth target: a short, randomized experiment.

    Science.gov (United States)

    van Tulder, Raphael; Laggner, Roberta; Kienbacher, Calvin; Schmid, Bernhard; Zajicek, Andreas; Haidvogel, Jochen; Sebald, Dieter; Laggner, Anton N; Herkner, Harald; Sterz, Fritz; Eisenburger, Philip

    2015-04-01

    In CPR, sufficient compression depth is essential. The American Heart Association ("at least 5 cm", AHA-R) and the European Resuscitation Council ("at least 5 cm, but not to exceed 6 cm", ERC-R) recommendations differ, and both are hardly achieved. This study aims to investigate the effects of differing target-depth instructions on the compression depth performance of professional and lay rescuers. 110 professional rescuers and 110 lay rescuers were randomized (1:1, 4 groups) to estimate the AHA-R or ERC-R depth on a paper sheet (on a given horizontal axis) using a pencil and to perform chest compressions according to AHA-R or ERC-R on a manikin. Distance estimation and compression depth were the outcome variables. Professional rescuers estimated the distance correctly according to AHA-R in 19/55 (34.5%) and to ERC-R in 20/55 (36.4%) cases (p=0.84). Professional rescuers achieved correct compression depth according to AHA-R in 39/55 (70.9%) and to ERC-R in 36/55 (65.4%) cases (p=0.97). Lay rescuers estimated the distance correctly according to AHA-R in 18/55 (32.7%) and to ERC-R in 20/55 (36.4%) cases (p=0.59). Lay rescuers yielded correct compression depth according to AHA-R in 39/55 (70.9%) and to ERC-R in 26/55 (47.3%) cases (p=0.02). Professional and lay rescuers have severe difficulties in correctly estimating distance on a sheet of paper. Professional rescuers are able to meet AHA-R and ERC-R targets alike. In lay rescuers, the AHA-R was associated with significantly higher success rates. The inability to estimate distance could explain the failure to appropriately perform chest compressions. For teaching lay rescuers, the AHA-R with no upper limit of compression depth might be preferable.

  14. Determinants of the reliability of ultrasound tomography sound speed estimates as a surrogate for volumetric breast density

    Energy Technology Data Exchange (ETDEWEB)

    Khodr, Zeina G.; Pfeiffer, Ruth M.; Gierach, Gretchen L., E-mail: GierachG@mail.nih.gov [Department of Health and Human Services, Division of Cancer Epidemiology and Genetics, National Cancer Institute, 9609 Medical Center Drive MSC 9774, Bethesda, Maryland 20892 (United States); Sak, Mark A.; Bey-Knight, Lisa [Karmanos Cancer Institute, Wayne State University, 4100 John R, Detroit, Michigan 48201 (United States); Duric, Nebojsa; Littrup, Peter [Karmanos Cancer Institute, Wayne State University, 4100 John R, Detroit, Michigan 48201 and Delphinus Medical Technologies, 46701 Commerce Center Drive, Plymouth, Michigan 48170 (United States); Ali, Haythem; Vallieres, Patricia [Henry Ford Health System, 2799 W Grand Boulevard, Detroit, Michigan 48202 (United States); Sherman, Mark E. [Division of Cancer Prevention, National Cancer Institute, Department of Health and Human Services, 9609 Medical Center Drive MSC 9774, Bethesda, Maryland 20892 (United States)

    2015-10-15

    Purpose: High breast density, as measured by mammography, is associated with increased breast cancer risk, but standard methods of assessment have limitations including 2D representation of breast tissue, distortion due to breast compression, and use of ionizing radiation. Ultrasound tomography (UST) is a novel imaging method that averts these limitations and uses sound speed measures rather than x-ray imaging to estimate breast density. The authors evaluated the reproducibility of measures of speed of sound and changes in this parameter using UST. Methods: One experienced and five newly trained raters measured sound speed in serial UST scans for 22 women (two scans per person) to assess inter-rater reliability. Intrarater reliability was assessed for four raters. A random effects model was used to calculate the percent variation in sound speed and change in sound speed attributable to subject, scan, rater, and repeat reads. The authors estimated the intraclass correlation coefficients (ICCs) for these measures based on data from the authors’ experienced rater. Results: Median (range) time between baseline and follow-up UST scans was five (1–13) months. Contributions of factors to sound speed variance were differences between subjects (86.0%), baseline versus follow-up scans (7.5%), inter-rater evaluations (1.1%), and intrarater reproducibility (∼0%). When evaluating change in sound speed between scans, 2.7% and ∼0% of variation were attributed to inter- and intrarater variation, respectively. For the experienced rater’s repeat reads, agreement for sound speed was excellent (ICC = 93.4%) and for change in sound speed substantial (ICC = 70.4%), indicating very good reproducibility of these measures. Conclusions: UST provided highly reproducible sound speed measurements, which reflect breast density, suggesting that UST has utility in sensitively assessing change in density.
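
    A minimal sketch of the reproducibility statistic involved (a one-way random-effects ICC on a subjects-by-repeated-reads matrix; synthetic numbers, and simpler than the full variance-components model used in the paper):

```python
import numpy as np

rng = np.random.default_rng(9)
n_subj, k = 22, 2
truth = rng.normal(1510.0, 25.0, size=(n_subj, 1))       # per-subject sound speed (m/s)
reads = truth + rng.normal(0.0, 5.0, size=(n_subj, k))   # repeated reads

grand = reads.mean()
ms_between = k * ((reads.mean(axis=1) - grand) ** 2).sum() / (n_subj - 1)
ms_within = ((reads - reads.mean(axis=1, keepdims=True)) ** 2).sum() / (n_subj * (k - 1))

icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)
print(f"ICC(1) = {icc:.3f}")
```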

  15. Validating the absolute reliability of a fat free mass estimate equation in hemodialysis patients using near-infrared spectroscopy.

    Science.gov (United States)

    Kono, Kenichi; Nishida, Yusuke; Moriyama, Yoshihumi; Taoka, Masahiro; Sato, Takashi

    2015-06-01

    The assessment of nutritional state using fat free mass (FFM) measured with near-infrared spectroscopy (NIRS) is clinically useful. This measurement should incorporate the patient's post-dialysis weight ("dry weight") in order to exclude the effects of any change in water mass. We therefore used NIRS to investigate the regression, independent variables, and absolute reliability of FFM at dry weight. The study included 47 outpatients from the hemodialysis unit. Body weight was measured before dialysis, and FFM was measured using NIRS before and after dialysis treatment. Multiple regression analysis was used to estimate the FFM at dry weight as the dependent variable. The measured FFM before dialysis treatment (Mw-FFM) and the difference between measured and dry weight (Mw-Dw) were the independent variables. We performed Bland-Altman analysis to detect errors between the statistically estimated FFM and the measured FFM after dialysis treatment. The multiple regression equation to estimate the FFM at dry weight was: Dw-FFM = 0.038 + (0.984 × Mw-FFM) + (−0.571 × [Mw−Dw]); R² = 0.99. There was no systematic bias between the estimated and the measured values of FFM at dry weight. Using NIRS, FFM at dry weight can be calculated by an equation including FFM at measured weight and the difference between the measured weight and the dry weight. © 2015 The Authors. Therapeutic Apheresis and Dialysis © 2015 International Society for Apheresis.
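
    Since the abstract gives the fitted equation in full, it can be applied directly; the sketch below evaluates it and computes a Bland-Altman-style mean bias against post-dialysis measurements. The sample arrays are invented for illustration, not study data:

        import numpy as np

        def estimate_dw_ffm(mw_ffm, mw_minus_dw):
            # Equation from the abstract:
            # Dw-FFM = 0.038 + 0.984*(Mw-FFM) - 0.571*(Mw - Dw)
            return 0.038 + 0.984 * mw_ffm - 0.571 * mw_minus_dw

        mw_ffm = np.array([45.2, 50.1, 38.7])      # pre-dialysis FFM, kg (invented)
        mw_minus_dw = np.array([2.1, 1.5, 2.8])    # weight removed, kg (invented)
        measured = np.array([43.9, 49.4, 37.3])    # post-dialysis FFM, kg (invented)

        diff = estimate_dw_ffm(mw_ffm, mw_minus_dw) - measured
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)              # limits-of-agreement half-width
        print(f"bias = {bias:.2f} kg, limits = [{bias - loa:.2f}, {bias + loa:.2f}] kg")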

  16. Reliability estimation of an N-M-cold-standby redundancy system in a multicomponent stress-strength model with generalized half-logistic distribution

    Science.gov (United States)

    Liu, Yiming; Shi, Yimin; Bai, Xuchao; Zhan, Pei

    2018-01-01

    In this paper, we study estimation of the reliability of a multicomponent system, named the N-M-cold-standby redundancy system, based on a progressive Type-II censored sample. In the system, there are N subsystems consisting of M statistically independent, identically distributed strength components, and only one of these subsystems works under the impact of stresses at a time while the others remain as standbys. Whenever the working subsystem fails, one of the standbys takes its place. The system fails when all subsystems have failed. It is supposed that the underlying distributions of random strength and stress both belong to the generalized half-logistic distribution, with different shape parameters. The reliability of the system is estimated using both classical and Bayesian statistical inference. The uniformly minimum variance unbiased estimator and the maximum likelihood estimator of the system reliability are derived. Under a squared error loss function, the exact expression of the Bayes estimator of the system reliability is developed using the Gauss hypergeometric function. The asymptotic confidence intervals and corresponding coverage probabilities are derived based on both the Fisher and the observed information matrices. The approximate highest posterior density credible interval is constructed using the Monte Carlo method. Monte Carlo simulations are performed to compare the performance of the proposed reliability estimators. A real data set is also analyzed for an illustration of the findings.
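
    For intuition, the quantity being estimated is a stress-strength reliability, R = P(stress < strength). A crude Monte Carlo sketch follows; the Weibull placeholders stand in for the paper's generalized half-logistic distributions, and the standby structure is ignored, so this only illustrates the bare concept:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        strength = rng.weibull(2.0, n) * 10.0   # placeholder strength distribution
        stress = rng.weibull(2.0, n) * 7.0      # placeholder stress distribution

        r_hat = np.mean(stress < strength)      # Monte Carlo estimate of P(stress < strength)
        print(f"estimated R = {r_hat:.3f}")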

  17. Room at the Mountain: Estimated Maximum Amounts of Commercial Spent Nuclear Fuel Capable of Disposal in a Yucca Mountain Repository

    International Nuclear Information System (INIS)

    Kessler, John H.; Kemeny, John; King, Fraser; Ross, Alan M.; Ross, Benjamen

    2006-01-01

    The purpose of this paper is to present an initial analysis of the maximum amount of commercial spent nuclear fuel (CSNF) that could be emplaced in a geological repository at Yucca Mountain. This analysis identifies and uses the programmatic, material, and geological constraints and factors that affect this estimate of the maximum amount of CSNF for disposal. The conclusion of this initial analysis is that the current legislative limit on Yucca Mountain disposal capacity, 63,000 MTHM of CSNF, is a small fraction of the available physical capacity of the Yucca Mountain system, assuming the current high-temperature operating mode (HTOM) design. EPRI is confident that at least four times the legislative limit for CSNF (∼260,000 MTHM) can be emplaced in the Yucca Mountain system. It is possible that, with additional site characterization, upwards of nine times the legislative limit (∼570,000 MTHM) could be emplaced. (authors)

  18. Capability Paternalism

    NARCIS (Netherlands)

    Claassen, R.J.G.|info:eu-repo/dai/nl/269266224

    A capability approach prescribes paternalist government actions to the extent that it requires the promotion of specific functionings, instead of the corresponding capabilities. Capability theorists have argued that their theories do not have much of these paternalist implications, since promoting …

  19. Assessing the impact of uncertainty on flood risk estimates with reliability analysis using 1-D and 2-D hydraulic models

    Directory of Open Access Journals (Sweden)

    L. Altarejos-García

    2012-07-01

    This paper addresses the use of reliability techniques such as Rosenblueth's Point-Estimate Method (PEM) as a practical alternative to more precise Monte Carlo approaches for obtaining estimates of the mean and variance of the uncertain flood parameters water depth and velocity. These parameters define the flood severity, which is a concept used for decision-making in the context of flood risk assessment. The method proposed is particularly useful when the degree of complexity of the hydraulic models makes Monte Carlo inapplicable in terms of computing time, but when a measure of the variability of these parameters is still needed. The capacity of PEM, which is a special case of numerical quadrature based on orthogonal polynomials, to evaluate the first two moments of performance functions such as the water depth and velocity is demonstrated in the case of a single river reach using a 1-D HEC-RAS model. It is shown that in some cases, using a simple variable transformation, the statistical distributions of both water depth and velocity approximate the lognormal. As this distribution is fully defined by its mean and variance, PEM can be used to define the full probability distribution function of these flood parameters, thus allowing probability estimates of flood severity. Then, an application of the method to the same river reach using a 2-D Shallow Water Equations (SWE) model is performed. Flood maps of the mean and standard deviation of water depth and velocity are obtained, and uncertainty in the extension of flooded areas with different severity levels is assessed. It is recognized, though, that whenever application of the Monte Carlo method is practically feasible, it is the preferred approach.
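
    Rosenblueth's two-point PEM, as named above, evaluates the model at every mean-plus/minus-one-standard-deviation corner of the input space and averages the 2^n outcomes with equal weights (valid for uncorrelated, symmetric inputs). A sketch with a toy performance function standing in for the hydraulic model; the function and numbers are illustrative only:

        import itertools
        import numpy as np

        def pem_two_point(func, means, stds):
            # Evaluate func at all 2^n (mean +/- std) corners with equal weights,
            # returning the PEM estimates of the output mean and variance.
            outputs = []
            for signs in itertools.product((-1.0, 1.0), repeat=len(means)):
                x = [m + s * sd for m, s, sd in zip(means, signs, stds)]
                outputs.append(func(*x))
            outputs = np.asarray(outputs)
            return outputs.mean(), outputs.var()

        # Toy stand-in: water depth as a function of roughness and discharge
        depth = lambda n_manning, q: 0.5 * (n_manning * q) ** 0.6
        mean_d, var_d = pem_two_point(depth, means=[0.03, 150.0], stds=[0.005, 30.0])
        print(mean_d, var_d)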

  20. The transverse diameter of the chest on routine radiographs reliably estimates gestational age and weight in premature infants

    Energy Technology Data Exchange (ETDEWEB)

    Dietz, Kelly R. [University of Minnesota, Department of Radiology, Minneapolis, MN (United States); Zhang, Lei [University of Minnesota, Biostatistical Design and Analysis Center, Minneapolis, MN (United States); Seidel, Frank G. [Lucile Packard Children' s Hospital, Department of Radiology, Stanford, CA (United States)

    2015-08-15

    Prior to digital radiography it was possible for a radiologist to easily estimate the size of a patient on an analog film. Because variable magnification may be applied at the time of processing an image, it is now more difficult to visually estimate an infant's size on the monitor. Since gestational age and weight significantly impact the differential diagnosis of neonatal diseases and determine the expected size of kidneys or appearance of the brain by MRI or US, this information is useful to a pediatric radiologist. Although this information may be present in the electronic medical record, it is frequently not readily available to the pediatric radiologist at the time of image interpretation. To determine if there was a correlation between gestational age and weight of a premature infant with their transverse chest diameter (rib to rib) on admission chest radiographs. This retrospective study was approved by the institutional review board, which waived informed consent. The maximum transverse chest diameter outer rib to outer rib was measured on admission portable chest radiographs of 464 patients admitted to the neonatal intensive care unit (NICU) during the 2010 calendar year. Regression analysis was used to investigate the association between chest diameter and gestational age/birth weight. Quadratic term of chest diameter was used in the regression model. Chest diameter was statistically significantly associated with both gestational age (P < 0.0001) and birth weight (P < 0.0001). An infant's gestational age and birth weight can be reliably estimated by comparing a simple measurement of the transverse chest diameter on digital chest radiograph with the tables and graphs in our study. (orig.)

  1. Linear Interaction Energy Based Prediction of Cytochrome P450 1A2 Binding Affinities with Reliability Estimation.

    Directory of Open Access Journals (Sweden)

    Luigi Capoferri

    Prediction of human Cytochrome P450 (CYP) binding affinities of small ligands, i.e., substrates and inhibitors, represents an important task for predicting drug-drug interactions. A quantitative assessment of the ligand binding affinity towards different CYPs can provide an estimate of inhibitory activity or an indication of isoforms prone to interact with the substrate or inhibitor. However, the accuracy of global quantitative models for CYP substrate binding or inhibition based on traditional molecular descriptors can be limited, because of the lack of information on the structure and flexibility of the catalytic site of CYPs. Here we describe the application of a method that combines protein-ligand docking, Molecular Dynamics (MD) simulations and Linear Interaction Energy (LIE) theory to allow for quantitative CYP affinity prediction. Using this combined approach, a LIE model for human CYP 1A2 was developed and evaluated, based on a structurally diverse dataset for which the estimated experimental uncertainty was 3.3 kJ mol⁻¹. For the computed CYP 1A2 binding affinities, the model showed a root mean square error (RMSE) of 4.1 kJ mol⁻¹ and a standard error in prediction (SDEP) in cross-validation of 4.3 kJ mol⁻¹. A novel approach that includes information on both structural ligand description and protein-ligand interaction was developed for estimating the reliability of predictions, and was able to identify compounds from an external test set with an SDEP for the predicted affinities of 4.6 kJ mol⁻¹ (corresponding to 0.8 pKi units).
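
    For reference, the LIE estimate named above is conventionally written as a linear combination of ensemble-averaged ligand-surrounding interaction energies from the bound and free simulations; in standard notation (the coefficients α, β and the offset γ are fitted model parameters, not values from this study):

        \Delta G_{\mathrm{bind}} \approx
            \alpha \left( \langle V^{\mathrm{vdW}} \rangle_{\mathrm{bound}}
                        - \langle V^{\mathrm{vdW}} \rangle_{\mathrm{free}} \right)
          + \beta  \left( \langle V^{\mathrm{el}}  \rangle_{\mathrm{bound}}
                        - \langle V^{\mathrm{el}}  \rangle_{\mathrm{free}} \right)
          + \gamma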

  2. An Efficient and Reliable Statistical Method for Estimating Functional Connectivity in Large Scale Brain Networks Using Partial Correlation.

    Science.gov (United States)

    Wang, Yikai; Kang, Jian; Kemmer, Phebe B; Guo, Ying

    2016-01-01

    Currently, network-oriented analysis of fMRI data has become an important tool for understanding brain organization and brain networks. Among the range of network modeling methods, partial correlation has shown great promise in accurately detecting true brain network connections. However, the application of partial correlation in investigating brain connectivity, especially in large-scale brain networks, has been limited so far due to the technical challenges in its estimation. In this paper, we propose an efficient and reliable statistical method for estimating partial correlation in large-scale brain network modeling. Our method derives partial correlation based on the precision matrix estimated via the Constrained L1-minimization Approach (CLIME), a recently developed statistical method that is more efficient and demonstrates better performance than existing methods. To help select an appropriate tuning parameter for sparsity control in the network estimation, we propose a new Dens-based selection method that provides a more informative and flexible tool, allowing users to select the tuning parameter based on the desired sparsity level. Another appealing feature of the Dens-based method is that it is much faster than existing methods, which provides an important advantage in neuroimaging applications. Simulation studies show that the Dens-based method demonstrates comparable or better performance with respect to existing methods in network estimation. We applied the proposed partial correlation method to investigate resting-state functional connectivity using rs-fMRI data from the Philadelphia Neurodevelopmental Cohort (PNC) study. Our results show that partial correlation analysis removed considerable between-module marginal connections identified by full correlation analysis, suggesting these connections were likely caused by global effects or common connections to other nodes. Based on partial correlation, we find that the most significant …
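
    The underlying identity is that partial correlations follow from the precision (inverse covariance) matrix Ω as ρ_ij = −ω_ij/√(ω_ii·ω_jj). The sketch below uses scikit-learn's graphical lasso as the sparse-precision estimator, since CLIME itself is not available in standard Python libraries; the data and penalty are illustrative:

        import numpy as np
        from sklearn.covariance import GraphicalLasso

        rng = np.random.default_rng(1)
        X = rng.standard_normal((200, 10))   # toy node time series: 200 volumes x 10 nodes

        omega = GraphicalLasso(alpha=0.1).fit(X).precision_  # CLIME stand-in

        d = np.sqrt(np.diag(omega))
        partial_corr = -omega / np.outer(d, d)   # rho_ij = -omega_ij / sqrt(omega_ii * omega_jj)
        np.fill_diagonal(partial_corr, 1.0)
        print(partial_corr.round(2))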

  3. Reliability training

    Science.gov (United States)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  4. Theoretical basis, application, reliability, and sample size estimates of a Meridian Energy Analysis Device for Traditional Chinese Medicine Research

    Directory of Open Access Journals (Sweden)

    Ming-Yen Tsai

    OBJECTIVES: The Meridian Energy Analysis Device is currently a popular tool in the scientific research of meridian electrophysiology. In this field, it is generally believed that measuring the electrical conductivity of meridians provides information about the balance of bioenergy or Qi-blood in the body. METHODS AND RESULTS: The review draws on original articles in the PubMed database from 1956 to 2014 and on the author's clinical experience. In this short communication, we provide clinical examples of Meridian Energy Analysis Device application, especially in the field of traditional Chinese medicine, discuss the reliability of the measurements, and put the values obtained into context by considering items of considerable variability and by estimating sample size. CONCLUSION: The Meridian Energy Analysis Device is making a valuable contribution to the diagnosis of Qi-blood dysfunction. It can be assessed from short-term and long-term meridian bioenergy recordings. It is one of the few methods that allow outpatient traditional Chinese medicine diagnosis, monitoring of progress, assessment of therapeutic effect, and evaluation of patient prognosis. The holistic approaches underlying the practice of traditional Chinese medicine and new trends in modern medicine toward the use of objective instruments require in-depth knowledge of the mechanisms of meridian energy, and the Meridian Energy Analysis Device can feasibly be used for understanding and interpreting traditional Chinese medicine theory, especially in view of its expansion in Western countries.

  5. TCS: a new multiple sequence alignment reliability measure to estimate alignment accuracy and improve phylogenetic tree reconstruction.

    Science.gov (United States)

    Chang, Jia-Ming; Di Tommaso, Paolo; Notredame, Cedric

    2014-06-01

    Multiple sequence alignment (MSA) is a key modeling procedure when analyzing biological sequences. Homology and evolutionary modeling are the most common applications of MSAs. Both are known to be sensitive to the underlying MSA accuracy. In this work, we show how this problem can be partly overcome using the transitive consistency score (TCS), an extended version of the T-Coffee scoring scheme. Using this local evaluation function, we show that one can identify the most reliable portions of an MSA, as judged from BAliBASE and PREFAB structure-based reference alignments. We also show how this measure can be used to improve phylogenetic tree reconstruction using both an established simulated data set and a novel empirical yeast data set. For this purpose, we describe a novel lossless alternative to site filtering that involves overweighting the trustworthy columns. Our approach relies on the T-Coffee framework; it uses libraries of pairwise alignments to evaluate any third party MSA. Pairwise projections can be produced using fast or slow methods, thus allowing a trade-off between speed and accuracy. We compared TCS with Heads-or-Tails, GUIDANCE, Gblocks, and trimAl and found it to lead to significantly better estimates of structural accuracy and more accurate phylogenetic trees. The software is available from www.tcoffee.org/Projects/tcs. © The Author 2014. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  6. Factor structure and reliability of the childhood trauma questionnaire and prevalence estimates of trauma for male and female street youth.

    Science.gov (United States)

    Forde, David R; Baron, Stephen W; Scher, Christine D; Stein, Murray B

    2012-01-01

    This study examines the psychometric properties of the Childhood Trauma Questionnaire short form (CTQ-SF) with street youth who have run away or been expelled from their homes (N = 397). Internal reliability coefficients for the five clinical scales ranged from .65 to .95. Confirmatory Factor Analysis (CFA) was used to test the five-factor structure of the scales yielding acceptable fit for the total sample. Additional multigroup analyses were performed to consider items by gender. Results provided only evidence of weak factorial invariance. Constrained models showed invariance in configuration, factor loadings, and factor covariances but failed for equality of intercepts. Mean trauma scores for street youth tended to fall in the moderate to severe range on all abuse/neglect clinical scales. Females reported higher levels of abuse and neglect. Prevalence of child maltreatment of individual forms was very high with 98% of street youth reporting one or more forms; 27.4% of males and 48.9% of females reported all five forms. Results of this study support the viability of the CTQ-SF for screening maltreatment in a highly vulnerable street population. Caution is recommended when comparing prevalence estimates for male and female street youth given the failure of the strong factorial multigroup model.

  7. Frontiers of reliability

    CERN Document Server

    Basu, Asit P; Basu, Sujit K

    1998-01-01

    This volume presents recent results in reliability theory by leading experts in the world. It will prove valuable for researchers and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiments, and mixtures of Weibull …

  8. Methods of Estimation the Reliability and Increasing the Informativeness of the Laboratory Results (Analysis of the Laboratory Case of Measurement the Indicators of Thyroid Function

    Directory of Open Access Journals (Sweden)

    N A Kovyazina

    2014-06-01

    The goal of the study was to demonstrate a multilevel laboratory quality management system and to present methods of estimating the reliability and increasing the information content of laboratory results (using a laboratory case as an example). Results. The article examines the stages of laboratory quality management that helped to estimate the reliability of the results of determining Free T3, Free T4 and TSH. The measurement results are presented with their expanded uncertainty and an evaluation of their dynamics. Conclusion. Compliance with mandatory measures of a laboratory quality management system enables laboratories to obtain reliable results and to calculate parameters that can increase the information content of laboratory tests in clinical decision making.

  9. Reliability of the Core Items in the General Social Survey: Estimates from the Three-Wave Panels, 2006–2014

    Directory of Open Access Journals (Sweden)

    Michael Hout

    2016-11-01

    We used standard and multilevel models to assess the reliability of core items in the General Social Survey panel studies spanning 2006 to 2014. Most of the 293 core items scored well on the measure of reliability: 62 items (21 percent) had reliability measures greater than 0.85; another 71 (24 percent) had reliability measures between 0.70 and 0.85. Objective items, especially facts about demography and religion, were generally more reliable than subjective items. The economic recession of 2007–2009, the slow recovery afterward, and the election of Barack Obama in 2008 altered the social context in ways that may look like unreliability of items. For example, unemployment status, hours worked, and weeks worked have lower reliability than most work-related items, reflecting the consequences of the recession on the facts of people's lives. Items regarding racial and gender discrimination and racial stereotypes scored as particularly unreliable, accounting for most of the 15 items with reliability coefficients less than 0.40. Our results allow scholars to more easily take measurement reliability into consideration in their own research, while also highlighting the limitations of these approaches.
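
    With three waves, a standard way to separate unreliability from true change is the quasi-simplex identity usually attributed to Heise: assuming constant reliability and lag-one true-score change, reliability = r12·r23/r13. A worked example with invented wave-to-wave correlations follows; we assume, but cannot confirm, that the authors' multilevel models generalize this idea:

        def heise_reliability(r12, r23, r13):
            # Three-wave quasi-simplex estimate of item reliability
            return r12 * r23 / r13

        print(heise_reliability(r12=0.70, r23=0.72, r13=0.60))  # 0.84 (invented inputs)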

  10. Accuracy of a Classical Test Theory-Based Procedure for Estimating the Reliability of a Multistage Test. Research Report. ETS RR-17-02

    Science.gov (United States)

    Kim, Sooyeon; Livingston, Samuel A.

    2017-01-01

    The purpose of this simulation study was to assess the accuracy of a classical test theory (CTT)-based procedure for estimating the alternate-forms reliability of scores on a multistage test (MST) having 3 stages. We generated item difficulty and discrimination parameters for 10 parallel, nonoverlapping forms of the complete 3-stage test and…

  11. System Reliability Engineering

    International Nuclear Information System (INIS)

    Lim, Tae Jin

    2005-02-01

    This book covers reliability engineering: quality and reliability, reliability data, and the importance of reliability engineering; reliability measures; the Poisson process, including goodness-of-fit tests and the Poisson arrival model; reliability estimation, such as for the exponential distribution; reliability of systems; availability; preventive maintenance, including replacement policies, minimal repair policies, shock models, spares, group maintenance and periodic inspection; analysis of common cause failures; and analysis models of repair effect.
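
    As a pointer to the estimation topic mentioned above, the exponential model reduces reliability to a single parameter: R(t) = exp(-t/MTBF), with MTBF conventionally estimated as total operating time divided by the number of failures. A minimal sketch with invented test data:

        import math

        total_time = 12_000.0   # accumulated unit-hours on test (invented)
        failures = 8

        mtbf = total_time / failures          # point estimate of MTBF
        r_mission = math.exp(-1000.0 / mtbf)  # reliability over a 1000-h mission
        print(f"MTBF = {mtbf:.0f} h, R(1000 h) = {r_mission:.3f}")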

  12. Capability ethics

    OpenAIRE

    Robeyns, Ingrid

    2012-01-01

    The capability approach is one of the most recent additions to the landscape of normative theories in ethics and political philosophy. Yet in its present stage of development, the capability approach is not a full-blown normative theory, in contrast to utilitarianism, deontological theories, virtue ethics, or pragmatism. As I will argue in this chapter, at present the core of the capability approach is an account of value, which together with some other (more minor) normative comm...

  13. An overview of coefficient alpha and a reliability matrix for estimating adequacy of internal consistency coefficients with psychological research measures.

    Science.gov (United States)

    Ponterotto, Joseph G; Ruckdeschel, Daniel E

    2007-12-01

    The present article addresses issues in reliability assessment that are often neglected in psychological research such as acceptable levels of internal consistency for research purposes, factors affecting the magnitude of coefficient alpha (alpha), and considerations for interpreting alpha within the research context. A new reliability matrix anchored in classical test theory is introduced to help researchers judge adequacy of internal consistency coefficients with research measures. Guidelines and cautions in applying the matrix are provided.

  14. Dynamic Capabilities

    DEFF Research Database (Denmark)

    Grünbaum, Niels Nolsøe; Stenger, Marianne

    2013-01-01

    The findings reveal a positive relationship between dynamic capabilities and innovation performance in the case enterprises, as we would expect. It was, however, not possible to establish a positive relationship between innovation performance and profitability. Nor was there any positive relationship between dynamic capabilities and profitability.

  15. Capability ethics

    NARCIS (Netherlands)

    I.A.M. Robeyns (Ingrid)

    2012-01-01

    The capability approach is one of the most recent additions to the landscape of normative theories in ethics and political philosophy. Yet in its present stage of development, the capability approach is not a full-blown normative theory, in contrast to utilitarianism, deontological …

  16. Validation and reliability of the sex estimation of the human os coxae using freely available DSP2 software for bioarchaeology and forensic anthropology.

    Science.gov (United States)

    Brůžek, Jaroslav; Santos, Frédéric; Dutailly, Bruno; Murail, Pascal; Cunha, Eugenia

    2017-10-01

    A new tool for skeletal sex estimation based on measurements of the human os coxae is presented, using skeletons from a metapopulation of identified adult individuals from twelve independent population samples. For reliable sex estimation, a posterior probability greater than 0.95 was considered to be the classification threshold: below this value, estimates are considered indeterminate. By providing free software, we aim to develop an even more widely disseminated method for sex estimation. Ten metric variables collected from 2,040 ossa coxae of adult subjects of known sex were recorded between 1986 and 2002 (reference sample). To test both validity and reliability, a target sample consisting of two series of adult ossa coxae of known sex (n = 623) was used. The DSP2 software (Diagnose Sexuelle Probabiliste v2) is based on Linear Discriminant Analysis, and the posterior probabilities are calculated using an R script. For the reference sample, any combination of four dimensions provides a correct sex estimate in at least 99% of cases. The percentage of individuals for whom sex can be estimated depends on the number of dimensions; for all ten variables it is higher than 90%. These results are confirmed in the target sample. Our posterior probability threshold of 0.95 for sex estimation corresponds to the traditional sectioning point used in osteological studies. The DSP2 software replaces the former version, which should no longer be used. DSP2 is a robust and reliable technique for sexing adult ossa coxae, and is also user friendly. © 2017 Wiley Periodicals, Inc.
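
    The decision rule described, i.e. classify only when the posterior probability exceeds 0.95 and otherwise report the sex as indeterminate, is easy to mirror with any LDA implementation. A sketch on synthetic data follows; DSP2 itself is an R script, so this Python fragment only illustrates the thresholding logic, not the tool:

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(2)
        # Synthetic stand-ins for four pelvic measurements in two sexes
        X = np.vstack([rng.normal(0.0, 1.0, (100, 4)), rng.normal(1.0, 1.0, (100, 4))])
        y = np.array([0] * 100 + [1] * 100)

        post = LinearDiscriminantAnalysis().fit(X, y).predict_proba(X)
        labels = np.where(post.max(axis=1) >= 0.95, post.argmax(axis=1), -1)  # -1 = indeterminate
        print((labels == -1).mean(), "fraction indeterminate")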

  17. Feasibility Study of a Simulation Driven Approach for Estimating Reliability of Wind Turbine Fluid Power Pitch Systems

    DEFF Research Database (Denmark)

    Liniger, Jesper; Pedersen, Henrik Clemmensen; N. Soltani, Mohsen

    2018-01-01

    Recent field data indicate that pitch systems account for a substantial part of a wind turbine's downtime. Reducing downtime means increasing the total amount of energy produced during its lifetime. Both electrical and fluid power pitch systems are employed, with a roughly 50/50 distribution. Fluid power pitch systems generally show higher reliability and have been favored on larger offshore wind turbines. Still, general issues such as leakage, contamination and electrical faults make current systems work sub-optimally. Current field data for wind turbines present overall pitch system reliability and the reliability of component groups (valves, accumulators, pumps etc.). However, the failure modes of the components and, more importantly, the root causes are not evident. The root causes and failure mode probabilities are central for changing current pitch system designs and operational concepts to increase …

  18. Microelectronics Reliability

    Science.gov (United States)

    2017-01-17

    [Report fragments; figure captions: inverters connected in a chain; typical graph showing frequency versus square root of …] The report describes the development of an experimental reliability-estimating methodology that could both illuminate the lifetime reliability of advanced devices and circuits and provide an accurate estimate of device lifetime, and thus of the device's reliability or failure-in-time (FIT) rate.

  19. An investigation into the minimum accelerometry wear time for reliable estimates of habitual physical activity and definition of a standard measurement day in pre-school children.

    Science.gov (United States)

    Hislop, Jane; Law, James; Rush, Robert; Grainger, Andrew; Bulley, Cathy; Reilly, John J; Mercer, Tom

    2014-11-01

    The purpose of this study was to determine the number of hours and days of accelerometry data necessary to provide a reliable estimate of habitual physical activity in pre-school children. The impact of a weekend day on reliability estimates was also determined, and standard measurement days were defined for weekend days and weekdays. Accelerometry data were collected from 112 children (60 males, 52 females, mean (SD) age 3.7 (0.7) yr) over 7 d. The Spearman-Brown prophecy formula was used to predict the number of days and hours of data required to achieve an intraclass correlation coefficient (ICC) of 0.7. The impact of including a weekend day was evaluated by comparing the reliability coefficient (r) for any 4 d of data with that for 4 d including one weekend day. Our observations indicate that 3 d of accelerometry monitoring, regardless of whether it includes a weekend day, for at least 7 h·d⁻¹, offers sufficient reliability to characterise the total physical activity and sedentary behaviour of pre-school children. These findings offer an approach that addresses the underlying tension in epidemiologic surveillance studies between the need to maintain acceptable measurement rigour and retention of a representatively meaningful sample size.
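
    The Spearman-Brown prophecy formula named above scales a single-day reliability r to the average of k days, r_k = k*r / (1 + (k-1)*r), and can be inverted to give the number of days needed to reach a target ICC. A sketch with an invented single-day reliability:

        import math

        def sb_extend(r1, k):
            # Reliability of the average of k days, given single-day reliability r1
            return k * r1 / (1 + (k - 1) * r1)

        def days_needed(r1, target=0.7):
            # Inversion of the prophecy formula for the required number of days
            return math.ceil(target * (1 - r1) / (r1 * (1 - target)))

        print(sb_extend(0.45, 3), days_needed(0.45))  # 0.45 is an invented single-day ICC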

  20. Reliability of tumor volume estimation from MR images in patients with malignant glioma. Results from the American College of Radiology Imaging Network (ACRIN) 6662 Trial

    International Nuclear Information System (INIS)

    Ertl-Wagner, Birgit B.; Blume, Jeffrey D.; Herman, Benjamin; Peck, Donald; Udupa, Jayaram K.; Levering, Anthony; Schmalfuss, Ilona M.

    2009-01-01

    Reliable assessment of tumor growth in malignant glioma poses a common problem both clinically and when studying novel therapeutic agents. We aimed to evaluate two software-systems in their ability to estimate volume change of tumor and/or edema on magnetic resonance (MR) images of malignant gliomas. Twenty patients with malignant glioma were included from different sites. Serial post-operative MR images were assessed with two software systems representative of the two fundamental segmentation methods, single-image fuzzy analysis (3DVIEWNIX-TV) and multi-spectral-image analysis (Eigentool), and with a manual method by 16 independent readers (eight MR-certified technologists, four neuroradiology fellows, four neuroradiologists). Enhancing tumor volume and tumor volume plus edema were assessed independently by each reader. Intraclass correlation coefficients (ICCs), variance components, and prediction intervals were estimated. There were no significant differences in the average tumor volume change over time between the software systems (p > 0.05). Both software systems were much more reliable and yielded smaller prediction intervals than manual measurements. No significant differences were observed between the volume changes determined by fellows/neuroradiologists or technologists. Semi-automated software systems are reliable tools to serve as outcome parameters in clinical studies and the basis for therapeutic decision-making for malignant gliomas, whereas manual measurements are less reliable and should not be the basis for clinical or research outcome studies. (orig.)

  1. Gossiping Capabilities

    DEFF Research Database (Denmark)

    Mogensen, Martin; Frey, Davide; Guerraoui, Rachid

    Gossip-based protocols are now acknowledged as a sound basis to implement collaborative high-bandwidth content dissemination: content location is disseminated through gossip, the actual contents being subsequently pulled. In this paper, we present HEAP, HEterogeneity-Aware gossip Protocol, where nodes dynamically adjust their contribution to gossip dissemination according to their capabilities. Using a continuous, itself gossip-based, approximation of relative capabilities, HEAP dynamically leverages the most capable nodes by (a) increasing their fanouts (while decreasing by the same proportion …) … declare a high capability in order to augment their perceived quality without contributing accordingly. We evaluate HEAP in the context of a video streaming application on a 236-node PlanetLab testbed. Our results show that HEAP improves the quality of the streaming by 25% over a standard gossip …

  2. Online Identification with Reliability Criterion and State of Charge Estimation Based on a Fuzzy Adaptive Extended Kalman Filter for Lithium-Ion Batteries

    Directory of Open Access Journals (Sweden)

    Zhongwei Deng

    2016-06-01

    In the field of state of charge (SOC) estimation, the Kalman filter has been widely used for many years, although its performance strongly depends on the accuracy of the battery model as well as the noise covariance. The Kalman gain determines the confidence coefficient of the battery model by adjusting the weight of the open circuit voltage (OCV) correction, and has a strong correlation with the measurement noise covariance (R). In this paper, an online identification method is applied to acquire the real model parameters under different operating conditions. A criterion based on the OCV error is proposed to evaluate the reliability of the online parameters. Besides, the equivalent circuit model produces an intrinsic model error which is dependent on the load current, and the property that a high battery current or a large current change induces a large model error can be observed. Based on the above prior knowledge, a fuzzy model is established to compensate the model error by updating R. Combining the positive strategy (i.e., online identification) and the negative strategy (i.e., fuzzy model), a more reliable and robust SOC estimation algorithm is proposed. The experimental results verify the proposed reliability criterion and SOC estimation method under various conditions for LiFePO4 batteries.
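
    The coupling the authors exploit, namely that a larger measurement noise covariance R shrinks the Kalman gain and hence the weight of the OCV-based correction, is visible in a single scalar update step. A stripped-down sketch follows (not the paper's full SOC model; all numbers invented):

        def kf_update(x_pred, p_pred, z, z_model, h, r):
            # Scalar Kalman measurement update: larger r -> smaller gain ->
            # less trust in the voltage correction (what the fuzzy model tunes)
            s = h * p_pred * h + r           # innovation variance
            k = p_pred * h / s               # Kalman gain
            x = x_pred + k * (z - z_model)   # corrected SOC
            p = (1.0 - k * h) * p_pred       # corrected variance
            return x, p

        print(kf_update(0.60, 1e-4, 3.32, 3.30, 1.0, r=1e-4))  # strong correction
        print(kf_update(0.60, 1e-4, 3.32, 3.30, 1.0, r=1e-2))  # weak correction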

  3. The juvenile face as a suitable age indicator in child pornography cases: a pilot study on the reliability of automated and visual estimation approaches.

    Science.gov (United States)

    Ratnayake, M; Obertová, Z; Dose, M; Gabriel, P; Bröker, H M; Brauckmann, M; Barkus, A; Rizgeliene, R; Tutkuviene, J; Ritz-Timme, S; Marasciuolo, L; Gibelli, D; Cattaneo, C

    2014-09-01

    In cases of suspected child pornography, the age of the victim represents a crucial factor for legal prosecution. The conventional methods for age estimation provide unreliable age estimates, particularly if teenage victims are concerned. In this pilot study, the potential of age estimation for screening purposes is explored for juvenile faces. In addition to a visual approach, an automated procedure is introduced, which has the ability to rapidly scan through large numbers of suspicious image data in order to trace juvenile faces. Age estimations were performed by experts, non-experts and the Demonstrator of a developed software on frontal facial images of 50 females aged 10-19 years from Germany, Italy, and Lithuania. To test the accuracy, the mean absolute error (MAE) between the estimates and the real ages was calculated for each examiner and the Demonstrator. The Demonstrator achieved the lowest MAE (1.47 years) for the 50 test images. Decreased image quality had no significant impact on the performance and classification results. The experts delivered slightly less accurate MAE (1.63 years). Throughout the tested age range, both the manual and the automated approach led to reliable age estimates within the limits of natural biological variability. The visual analysis of the face produces reasonably accurate age estimates up to the age of 18 years, which is the legally relevant age threshold for victims in cases of pedo-pornography. This approach can be applied in conjunction with the conventional methods for a preliminary age estimation of juveniles depicted on images.

  4. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

    This book starts by asking what reliability is, covering the origin of reliability problems and the definition and use of reliability. It also deals with probability and the calculation of reliability; the reliability function and failure rate; probability distributions in reliability; estimation of MTBF; down time, maintainability and availability; breakdown and preventive maintenance; design for reliability, including prediction and statistics; reliability testing; reliability data; and the design and management of reliability.

  5. Reliability and maintainability assessment factors for reliable fault-tolerant systems

    Science.gov (United States)

    Bavuso, S. J.

    1984-01-01

    A long-term goal of the NASA Langley Research Center is the development of a reliability assessment methodology of sufficient power to enable the credible comparison of the stochastic attributes of one ultrareliable system design against others. This methodology, developed over a 10-year period, is a combined analytic and simulative technique. An analytic component is the Computer Aided Reliability Estimation capability, third generation, or simply CARE III. A simulative component is the Gate Logic Software Simulator capability, or GLOSS. The numerous factors that potentially have a degrading effect on system reliability are described, along with the ways in which these factors, peculiar to highly reliable fault-tolerant systems, are accounted for in credible reliability assessments. Also presented are the modeling difficulties that result from their inclusion and the ways in which CARE III and GLOSS mitigate the intractability of the heretofore unworkable mathematics.

  6. Capability approach

    DEFF Research Database (Denmark)

    Jensen, Niels Rosendal; Kjeldsen, Christian Christrup

    This textbook is the first comprehensive Danish presentation of the Capability Approach developed by Amartya Sen and Martha Nussbaum. The book contains a presentation and discussion of Sen's and Nussbaum's theoretical platform. It includes examples from education and education policy, pedagogy, and care …

  7. Standard error of measurement of 5 health utility indexes across the range of health for use in estimating reliability and responsiveness.

    Science.gov (United States)

    Palta, Mari; Chen, Han-Yang; Kaplan, Robert M; Feeny, David; Cherepanov, Dasha; Fryback, Dennis G

    2011-01-01

    Standard errors of measurement (SEMs) of health-related quality of life (HRQoL) indexes are not well characterized. The SEM is needed to estimate responsiveness statistics, and is a component of reliability. The aim was to estimate the SEM of 5 HRQoL indexes. The National Health Measurement Study (NHMS) was a population-based survey; the Clinical Outcomes and Measurement of Health Study (COMHS) provided repeated measures. Participants were 3844 randomly selected adults from the noninstitutionalized population aged 35 to 89 y in the contiguous United States and 265 cataract patients. The SF-36v2™, QWB-SA, EQ-5D, HUI2, and HUI3 were included. An item-response theory approach captured joint variation in indexes into a composite construct of health (theta). The authors estimated 1) the test-retest standard deviation (SEM-TR) from COMHS, 2) the structural standard deviation (SEM-S) around theta from NHMS, and 3) reliability coefficients. SEM-TR was 0.068 (SF-6D), 0.087 (QWB-SA), 0.093 (EQ-5D), 0.100 (HUI2), and 0.134 (HUI3), whereas SEM-S was 0.071, 0.094, 0.084, 0.074, and 0.117, respectively. These yield reliability coefficients of 0.66 (COMHS) and 0.71 (NHMS) for SF-6D, 0.59 and 0.64 for QWB-SA, 0.61 and 0.70 for EQ-5D, 0.64 and 0.80 for HUI2, and 0.75 and 0.77 for HUI3, respectively. The SEM varied across levels of health, especially for HUI2, HUI3, and EQ-5D, and was influenced by ceiling effects. Limitations: repeated measures were 5 mo apart, and the estimated theta contained measurement error. The two types of SEM are similar and substantial for all the indexes, and vary across health.
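
    The link between the two reported quantities is the classical identity SEM = σ·√(1 − ρ), so a reliability coefficient can be recovered from an SEM and the score standard deviation; this is a textbook relation, not a formula introduced by the paper:

        \mathrm{SEM} = \sigma_x \sqrt{1 - \rho_{xx'}}
        \qquad\Longleftrightarrow\qquad
        \rho_{xx'} = 1 - \frac{\mathrm{SEM}^2}{\sigma_x^2}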

  8. Concurrent validity and reliability of torso-worn inertial measurement unit for jump power and height estimation.

    Science.gov (United States)

    Rantalainen, Timo; Gastin, Paul B; Spangler, Rhys; Wundersitz, Daniel

    2018-09-01

    The purpose of the present study was to evaluate the concurrent validity and test-retest repeatability of torso-worn IMU-derived power and jump height in a counter-movement jump test. Twenty-seven healthy recreationally active males (age 21.9 [SD 2.0] y, height 1.76 [0.07] m, mass 73.7 [10.3] kg) wore an IMU and completed three counter-movement jumps a week apart. A force platform and a 3D motion analysis system were used to concurrently measure the jumps and subsequently derive power and jump height (based on take-off velocity and flight time). The IMU significantly overestimated power (mean difference = 7.3 W/kg; P < …). Take-off-velocity-derived jump heights exhibited poorer concurrent validity (ICC = 0.72 to 0.78) and repeatability (ICC = 0.68) than flight-time-derived jump heights, which exhibited excellent validity (ICC = 0.93 to 0.96) and reliability (ICC = 0.91). Since jump height and power are closely related, and flight-time-derived jump height exhibits excellent concurrent validity and reliability, flight-time-derived jump height could provide a more desirable measure than power when assessing athletic performance in a counter-movement jump with IMUs.

  9. The influence of different error estimates in the detection of postoperative cognitive dysfunction using reliable change indices with correction for practice effects.

    Science.gov (United States)

    Lewis, Matthew S; Maruff, Paul; Silbert, Brendan S; Evered, Lis A; Scott, David A

    2007-02-01

    The reliable change index (RCI) expresses change relative to its associated error, and is useful in the identification of postoperative cognitive dysfunction (POCD). This paper examines four common RCIs, each of which accounts for error in a different way. Three rules incorporate a constant correction for practice effects and are contrasted with the standard RCI, which has no correction for practice. These rules are applied, using error and reliability data from a comparable healthy nonsurgical control group, to 160 patients undergoing coronary artery bypass graft (CABG) surgery who completed neuropsychological assessments preoperatively and 1 week postoperatively. The rules all identify POCD in a similar proportion of patients, but the within-subject standard deviation (WSD), which expresses the effects of random error, is a theoretically appropriate denominator when a constant error correction, removing the effects of systematic error, is deducted from the numerator in an RCI.
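
    In the practice-corrected form discussed above, the RCI subtracts the control group's mean test-retest change from the observed change and divides by an error term; the four rules differ in the choice of that denominator. Schematically (notation ours):

        \mathrm{RCI} = \frac{(x_2 - x_1) - (\bar{c}_2 - \bar{c}_1)}{SE},
        \qquad SE = \mathrm{WSD}\ \text{(for the rule the authors favor)}

    where x_1 and x_2 are the patient's scores at the two assessments, and c̄_1 and c̄_2 are the corresponding control group means.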

  10. ENTREPRENEURIAL CAPABILITIES

    DEFF Research Database (Denmark)

    Rasmussen, Lauge Baungaard; Nielsen, Thorkild

    2003-01-01

    The aim of this article is to analyse entrepreneurship from an action research perspective. What is entrepreneurship about? Which are the fundamental capabilities and processes of entrepreneurship? To answer these questions the article includes a case study of a Danish entrepreneur and his network. Finally, the article discusses how more long-term action research methods could be integrated into the entrepreneurial processes, and the possible impacts of such an implementation.

  11. Assessment of the reliability of human corneal endothelial cell-density estimates using a noncontact specular microscope.

    Science.gov (United States)

    Doughty, M J; Müller, A; Zaman, M L

    2000-03-01

    We sought to determine the variance in endothelial cell density (ECD) estimates for human corneal endothelia. Noncontact specular micrographs were obtained from white subjects without any history of contact lens wear or major eye disease or surgery; subjects were within four age groups (children, young adults, older adults, senior citizens). The endothelial image was scanned, and the areas of ≥75 cells were measured from an overlay by planimetry. The cell-area values were used to calculate the ECD repeatedly, so that the intra- and intersubject variation in an average ECD estimate could be assessed using different numbers of cells (5, 10, 15, etc.). An average ECD of 3,519 cells/mm² (range, 2,598-5,312 cells/mm²) was obtained from counts of 75 cells/endothelium from individuals aged 6-83 years. Average ECD estimates in each age group were 4,124, 3,457, 3,360, and 3,113 cells/mm², respectively. Analysis of intersubject variance revealed that ECD estimates would be expected to be no better than ±10% if only 25 cells were measured per endothelium, but approach ±2% if 75 cells are measured. In assessing the corneal endothelium by noncontact specular microscopy, the cell count should be given, and it should be ≥75 cells/endothelium for the expected variance to be at a level close to that recommended for monitoring age-, stress-, or surgery-related changes.

  12. The reliability of grazing rate estimates from dilution experiments: Have we over-estimated rates of organic carbon consumption by microzooplankton?

    Directory of Open Access Journals (Sweden)

    J. R. Dolan

    2005-01-01

    According to a recent global analysis, microzooplankton grazing is surprisingly invariant, ranging only between 59 and 74% of phytoplankton primary production across systems differing in seasonality, trophic status, latitude, or salinity. Thus an important biological process in the world ocean, the daily consumption of recently fixed carbon, appears nearly constant. We believe this conclusion is an artefact, because dilution experiments are (1) prone to providing over-estimates of grazing rates and (2) unlikely to furnish evidence of low grazing rates. In our view the overall average rate of microzooplankton grazing probably does not exceed 50% of primary production and may be even lower in oligotrophic systems.
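
    For background, in a standard dilution experiment of the Landry-Hassett type, the apparent phytoplankton growth rate k is regressed on the fraction of unfiltered seawater D; under the linear model k = mu - g*D, the intercept estimates intrinsic growth mu and the negative slope the grazing rate g. A sketch with invented data:

        import numpy as np

        D = np.array([0.2, 0.4, 0.6, 0.8, 1.0])        # fraction of unfiltered water
        k = np.array([0.62, 0.51, 0.40, 0.31, 0.20])   # apparent growth rates, /d (invented)

        slope, intercept = np.polyfit(D, k, 1)
        mu, g = intercept, -slope
        print(f"growth mu = {mu:.2f} /d, grazing g = {g:.2f} /d")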

  13. Reliability of activation cross sections for estimation of shutdown dose rate in the ITER port cell and port interspace

    Science.gov (United States)

    García, Raquel; García, Mauricio; Ogando, Francisco; Pampin, Raúl; Sanz, Javier

    2017-09-01

    This paper explores the quality of available activation cross section (XS) data for accurate Shutdown Dose Rate (SDDR) prediction in the ITER Port Cell and Port Interspace areas, where different maintenance activities are foreseen. For this purpose the EAF library (2007 and 2010 versions) has been investigated, as it is typically used by the ITER community. Based on both reports/papers on SDDR in ITER and our own calculations, the major nuclides contributing to the SDDR from the activation of (i) relevant materials placed in ITER and (ii) candidate materials for the bioshield plug, such as L2N and barite concretes, are identified. Then, the relevant production pathways are obtained. The EAF XS quality for all pathways is checked following the procedure used for validating and testing the successive EAF versions. Also, possible improvements from using the TENDL-2015 library are assessed by comparing EAF and TENDL XS with available differential experimental data from EXFOR. Results point out that most of the activation XS related to materials currently placed in ITER are reliable, and only a few need improvement. Also, many of the XS related to both L2N and barite concretes need further validation work.

  14. Campus Capability Plan

    Energy Technology Data Exchange (ETDEWEB)

    Adams, C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Arsenlis, T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bailey, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bergman, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brase, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Brenner, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Camara, L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Carlton, H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Cheng, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chrzanowski, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Colson, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); East, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Farrell, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ferranti, L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gursahani, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hansen, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Helms, L. L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hernandez, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Jeffries, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Larson, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lu, K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McNabb, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mercer, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Skeate, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sueksdorf, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Zucca, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Le, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ancria, R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Scott, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Leininger, L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gagliardi, F. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Gash, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bronson, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chung, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hobson, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Meeker, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sanchez, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Zagar, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Quivey, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sommer, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Atherton, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-06-06

    Lawrence Livermore National Laboratory Campus Capability Plan for 2018-2028. Lawrence Livermore National Laboratory (LLNL) is one of three national laboratories that are part of the National Nuclear Security Administration. LLNL provides critical expertise to strengthen U.S. security through development and application of world-class science and technology that: Ensures the safety, reliability, and performance of the U.S. nuclear weapons stockpile; Promotes international nuclear safety and nonproliferation; Reduces global danger from weapons of mass destruction; Supports U.S. leadership in science and technology. Essential to the execution and continued advancement of these mission areas are responsive infrastructure capabilities. This report showcases each LLNL capability area and describes the mission, science, and technology efforts enabled by LLNL infrastructure, as well as future infrastructure plans.

  15. Digital photography provides a fast, reliable, and noninvasive method to estimate anthocyanin pigment concentration in reproductive and vegetative plant tissues.

    Science.gov (United States)

    Del Valle, José C; Gallardo-López, Antonio; Buide, Mª Luisa; Whittall, Justen B; Narbona, Eduardo

    2018-03-01

    Anthocyanin pigments have become a model trait for evolutionary ecology, as they often provide adaptive benefits for plants. Anthocyanins have been traditionally quantified biochemically or, more recently, using spectral reflectance. However, both methods require destructive sampling and can be labor intensive and challenging with small samples. Recent advances in digital photography and image processing make it the method of choice for measuring color in the wild. Here, we use digital images as a quick, noninvasive method to estimate relative anthocyanin concentrations in species exhibiting color variation. Using a consumer-level digital camera and a free image processing toolbox, we extracted RGB values from digital images to generate color indices. We tested petals, stems, pedicels, and calyces of six species, which contain different types of anthocyanin pigments and exhibit different pigmentation patterns. Color indices were assessed by their correlation to biochemically determined anthocyanin concentrations. For comparison, we also calculated color indices from spectral reflectance and tested the correlation with anthocyanin concentration. Indices perform differently depending on the nature of the color variation. For both digital images and spectral reflectance, the most accurate estimates of anthocyanin concentration emerge from the anthocyanin content-chroma ratio, anthocyanin content-chroma basic, and strength of green indices. Color indices derived from both digital images and spectral reflectance strongly correlate with biochemically determined anthocyanin concentration; however, the estimates from digital images performed better than spectral reflectance in terms of r² and normalized root-mean-square error. This was particularly noticeable in a species with striped petals, but in the case of striped calyces, both methods showed a comparable relationship with anthocyanin concentration. Using digital images brings new opportunities to accurately quantify the …

  16. Teaching Confirmatory Factor Analysis to Non-Statisticians: A Case Study for Estimating Composite Reliability of Psychometric Instruments

    Science.gov (United States)

    Gajewski, Byron J.; Jiang, Yu; Yeh, Hung-Wen; Engelman, Kimberly; Teel, Cynthia; Choi, Won S.; Greiner, K. Allen; Daley, Christine Makosky

    2013-01-01

    Texts and software that we are currently using for teaching multivariate analysis to non-statisticians fall short in their delivery of confirmatory factor analysis (CFA). The purpose of this paper is to provide educators with a complement to these resources that includes CFA and its computation. We focus on how to use CFA to estimate the "composite reliability" of a psychometric instrument. This paper provides guidance, via a case study, for introducing the non-statistician to CFA. As a complement to our instruction on the more traditional SPSS, we successfully piloted the software R for estimating CFA with nine non-statisticians. This approach can be used with healthcare graduate students taking a multivariate course, as well as modified for community stakeholders of our Center for American Indian Community Health (e.g., community advisory boards, summer interns, and research team members). The placement of CFA at the end of the class is strategic and gives us an opportunity to do some innovative teaching: (1) build ideas for understanding the case study using previous course work (such as ANOVA); (2) incorporate multi-dimensional scaling (which students have already learned) into the selection of a factor structure (a new concept); (3) use interactive data from the students (active learning); (4) review matrix algebra and its importance to psychometric evaluation; (5) show students how to do the calculation on their own; and (6) give students access to an actual recent research project. PMID:24772373
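
    The composite reliability the case study targets is computed from standardized loadings λ_i and error variances θ_i as ω = (Σλ)² / ((Σλ)² + Σθ). A small sketch, independent of any particular CFA package (the loadings are invented):

        def composite_reliability(loadings, error_variances=None):
            # omega = (sum lambda)^2 / ((sum lambda)^2 + sum theta);
            # with standardized loadings, theta_i defaults to 1 - lambda_i^2
            if error_variances is None:
                error_variances = [1.0 - l ** 2 for l in loadings]
            num = sum(loadings) ** 2
            return num / (num + sum(error_variances))

        print(composite_reliability([0.78, 0.71, 0.83, 0.66]))  # invented loadings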

  17. The transverse diameter of the chest on routine radiographs reliably estimates gestational age and weight in premature infants.

    Science.gov (United States)

    Dietz, Kelly R; Zhang, Lei; Seidel, Frank G

    2015-08-01

    Prior to digital radiography it was possible for a radiologist to easily estimate the size of a patient on an analog film. Because variable magnification may be applied at the time of processing an image, it is now more difficult to visually estimate an infant's size on the monitor. Since gestational age and weight significantly impact the differential diagnosis of neonatal diseases and determine the expected size of kidneys or the appearance of the brain by MRI or US, this information is useful to a pediatric radiologist. Although this information may be present in the electronic medical record, it is frequently not readily available to the pediatric radiologist at the time of image interpretation. The purpose was to determine whether the gestational age and weight of a premature infant correlate with the transverse chest diameter (rib to rib) on admission chest radiographs. This retrospective study was approved by the institutional review board, which waived informed consent. The maximum transverse chest diameter, outer rib to outer rib, was measured on admission portable chest radiographs of 464 patients admitted to the neonatal intensive care unit (NICU) during the 2010 calendar year. Regression analysis was used to investigate the association between chest diameter and gestational age/birth weight. A quadratic term of chest diameter was used in the regression model. Chest diameter was statistically significantly associated with both gestational age and birth weight. A premature infant's gestational age and weight can therefore be estimated by comparing the transverse chest diameter on a digital chest radiograph with the tables and graphs in our study.
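
    A sketch of the model form described in this record (regression of gestational age on chest diameter with a quadratic term); the data pairs are invented, not the study's measurements.

    ```python
    # Quadratic regression of gestational age on transverse chest diameter,
    # mirroring the model form described (quadratic term of chest diameter).
    # The data pairs are hypothetical, not the study's measurements.
    import numpy as np

    chest_cm = np.array([5.2, 6.1, 6.8, 7.5, 8.3, 9.0, 9.6])   # hypothetical
    ga_weeks = np.array([25, 27, 29, 31, 33, 35, 37])           # hypothetical

    coeffs = np.polyfit(chest_cm, ga_weeks, deg=2)  # [a, b, c] for a*x^2+b*x+c
    predict = np.poly1d(coeffs)
    print(f"predicted GA for a 7.0 cm chest: {predict(7.0):.1f} weeks")
    ```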

  18. Reliable and Damage-Free Estimation of Resistivity of ZnO Thin Films for Photovoltaic Applications Using Photoluminescence Technique

    Directory of Open Access Journals (Sweden)

    N. Poornima

    2013-01-01

    Full Text Available This work projects photoluminescence (PL) as an alternative technique to estimate the order of resistivity of zinc oxide (ZnO) thin films. ZnO thin films, deposited using chemical spray pyrolysis (CSP) by varying deposition parameters such as the solvent, spray rate, and pH of the precursor, have been used for this study. Variation in the deposition conditions has a tremendous impact on the luminescence properties as well as the resistivity. Two emissions could be recorded for all samples: the near-band-edge emission (NBE) at 380 nm and the deep-level emission (DLE) at ~500 nm, which are competing in nature. It is observed that the ratio of intensities of DLE to NBE (I_DLE/I_NBE) can be reduced by controlling oxygen incorporation in the sample. I–V measurements indicate that restricting oxygen incorporation reduces resistivity considerably. The variation of I_DLE/I_NBE and resistivity for samples prepared under different deposition conditions is similar in nature, and I_DLE/I_NBE was always lower than the resistivity by an order of magnitude for all samples. Thus, from PL measurements alone, the order of resistivity of the samples can be estimated.

  19. Ultrasound estimates of muscle quality in older adults: reliability and comparison of Photoshop and ImageJ for the grayscale analysis of muscle echogenicity

    Directory of Open Access Journals (Sweden)

    Michael O. Harris-Love

    2016-02-01

    Full Text Available Background. Quantitative diagnostic ultrasound imaging has been proposed as a method of estimating muscle quality using measures of echogenicity. The Rectangular Marquee Tool (RMT) and the Free Hand Tool (FHT) are two types of editing features used in Photoshop and ImageJ for determining a region of interest (ROI) within an ultrasound image. The primary objective of this study is to determine the intrarater and interrater reliability of Photoshop and ImageJ for the estimate of muscle tissue echogenicity in older adults via grayscale histogram analysis. The secondary objective is to compare the mean grayscale values obtained using both the RMT and FHT methods across both image analysis platforms. Methods. This cross-sectional observational study features 18 community-dwelling men (age = 61.5 ± 2.32 years). Longitudinal views of the rectus femoris were captured using B-mode ultrasound. The ROI for each scan was selected by 2 examiners using the RMT and FHT methods from each software program. Their reliability is assessed using intraclass correlation coefficients (ICCs) and the standard error of the measurement (SEM). Measurement agreement for these values is depicted using Bland-Altman plots. A paired t-test is used to determine mean differences in echogenicity expressed as grayscale values using the RMT and FHT methods to select the post-image acquisition ROI. The degree of association among ROI selection methods and image analysis platforms is analyzed using the coefficient of determination (R²). Results. The raters demonstrated excellent intrarater and interrater reliability using the RMT and FHT methods across both platforms (lower bound 95% CI ICC = .97–.99, p < .001). Mean differences between the echogenicity estimates obtained with the RMT and FHT methods were .87 grayscale levels (95% CI [.54–1.21], p < .0001) using data obtained with both programs. The SEM for Photoshop was .97 and 1.05 grayscale levels when using the RMT and FHT ROI selection methods, respectively.
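
    The SEM reported in such studies is conventionally related to the reliability coefficient by SEM = SD·√(1 − ICC); a minimal sketch with invented numbers, since the record's raw data are not available.

    ```python
    # Standard error of measurement from a reliability coefficient:
    # SEM = SD * sqrt(1 - ICC). Values below are hypothetical placeholders.
    import math

    sd_grayscale = 9.8   # hypothetical between-subject SD of echogenicity
    icc = 0.98           # reliability coefficient (record reports ICC >= .97)

    sem = sd_grayscale * math.sqrt(1 - icc)
    print(f"SEM: {sem:.2f} grayscale levels")
    ```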

  20. Estimation technique of corrective effects for forecasting of reliability of the designed and operated objects of the generating systems

    Science.gov (United States)

    Truhanov, V. N.; Sultanov, M. M.

    2017-11-01

    In the present article, statistical material on the failures and malfunctions affecting the operability of heat-power installations has been analyzed. A mathematical model of the change in the output characteristics of the turbine as a function of the number of failures revealed in service is presented. The model is based on methods of mathematical statistics, probability theory, and matrix calculus. Its novelty is that it allows the change of the output characteristic to be predicted over time, with the controlling influences represented in explicit form. As the desired dynamics of the output characteristic (the reliability function), the Weibull distribution is adopted, since it is universal: at various parameter values it reduces to other types of distributions (for example, exponential, normal, etc.). It should be noted that the choice of the desired control law allows the necessary control parameters to be determined using the accumulated change of the output characteristic in general; the output characteristic can be changed both through the rate of change of the control parameters and through the acceleration of their change. The article describes in detail a technique for estimating the pseudo-inverse matrix by the method of least squares using standard Microsoft Excel functions. A technique for finding the control actions under constraints on both the output characteristic and the control parameters is also considered, and the order and sequence of finding the control parameters are stated. A concrete example of finding the control actions in the course of long-term operation of turbines is shown.
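
    As a rough illustration of two ingredients named in this record (the Weibull reliability law and least-squares estimation), the sketch below fits Weibull parameters to hypothetical reliability data via the standard linearization; the record itself works with Microsoft Excel functions and a pseudo-inverse matrix, which are not reproduced here.

    ```python
    # Weibull reliability R(t) = exp(-(t/eta)^beta) and a least-squares
    # parameter fit via the usual linearization:
    # ln(-ln R) = beta*ln t - beta*ln eta. Data are hypothetical.
    import numpy as np

    t = np.array([2000.0, 4000.0, 8000.0, 16000.0])   # operating hours
    R = np.array([0.95, 0.88, 0.74, 0.50])            # observed reliability

    y = np.log(-np.log(R))
    x = np.log(t)
    beta, intercept = np.polyfit(x, y, deg=1)
    eta = np.exp(-intercept / beta)
    print(f"beta ~ {beta:.2f}, eta ~ {eta:.0f} h")
    ```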

  1. Reliability of a self-report Italian version of the AUDIT-C questionnaire, used to estimate alcohol consumption by pregnant women in an obstetric setting.

    Science.gov (United States)

    Bazzo, Stefania; Battistella, Giuseppe; Riscica, Patrizia; Moino, Giuliana; Dal Pozzo, Giuseppe; Bottarel, Mery; Geromel, Mariasole; Czerwinsky, Loredana

    2015-01-01

    Alcohol consumption during pregnancy can result in a range of harmful effects on the developing foetus and newborn, called Fetal Alcohol Spectrum Disorders (FASD). The identification of pregnant women who use alcohol makes it possible to provide information, support and treatment for women and the surveillance of their children. The AUDIT-C (the shortened consumption version of the Alcohol Use Disorders Identification Test) is used for investigating risky drinking in different populations, and has also been applied to estimate alcohol use and risky drinking in antenatal clinics. The aim of the study was to investigate the reliability of a self-report Italian version of the AUDIT-C questionnaire to detect alcohol consumption during pregnancy, regardless of its use as a screening tool. The questionnaire was filled in by two independent consecutive series of pregnant women at the 38th gestation week visit in the two birth locations of the Local Health Authority of Treviso (Italy), during the years 2010 and 2011 (n=220 and n=239). Reliability analysis was performed using internal consistency, item-total score correlations, and inter-item correlations. The "discriminatory power" of the test was also evaluated. Overall, about one third of women recalled alcohol consumption at least once during the current pregnancy. The questionnaire had an internal consistency of 0.565 for the group of the year 2010, of 0.516 for the year 2011, and of 0.542 for the overall group. The highest item-total correlation coefficient was 0.687 and the highest inter-item correlation coefficient was 0.675. As for the discriminatory power of the questionnaire, the highest Ferguson's delta coefficient was 0.623. These findings suggest that the Italian self-report version of the AUDIT-C possesses unsatisfactory reliability to estimate alcohol consumption during pregnancy when used as a self-report questionnaire in an obstetric setting.
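
    A minimal sketch of the internal-consistency computation behind figures such as the 0.565 reported here (Cronbach's alpha over the three AUDIT-C items); the response matrix is invented.

    ```python
    # Cronbach's alpha for a k-item scale:
    # alpha = k/(k-1) * (1 - sum(item variances)/variance(total score)).
    # Responses below are hypothetical AUDIT-C style item scores.
    import numpy as np

    scores = np.array([
        [1, 0, 0],
        [2, 1, 0],
        [0, 0, 0],
        [3, 2, 1],
        [1, 1, 0],
    ])  # rows: respondents, columns: the three AUDIT-C items

    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    alpha = k / (k - 1) * (1 - item_vars / total_var)
    print(f"Cronbach's alpha: {alpha:.3f}")
    ```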

  2. The reliability and accuracy of estimating heart-rates from RGB video recorded on a consumer grade camera

    Science.gov (United States)

    Eaton, Adam; Vincely, Vinoin; Lloyd, Paige; Hugenberg, Kurt; Vishwanath, Karthik

    2017-03-01

    Video Photoplethysmography (VPPG) is a numerical technique that processes standard RGB video of exposed human skin to extract the heart rate (HR) from the skin areas. Being a non-contact technique, VPPG has the potential to provide estimates of a subject's heart rate, respiratory rate, and even heart-rate variability, with potential applications ranging from infant monitors to remote healthcare and psychological experiments, given the sensor-free nature of the technique. Though several previous studies have reported successful correlations between HR obtained using VPPG algorithms and HR measured using the gold-standard electrocardiograph, others have reported that these correlations are dependent on controlling for the duration of the video data analyzed, subject motion, and ambient lighting. Here, we investigate the ability of two commonly used VPPG algorithms to extract human heart rates under three different laboratory conditions. We compare the VPPG HR values extracted across these three sets of experiments to the gold-standard values acquired by using an electrocardiogram or a commercially available pulse oximeter. The two VPPG algorithms were applied with and without KLT facial feature tracking and detection algorithms from the Computer Vision MATLAB® toolbox. Results indicate that VPPG-based numerical approaches have the ability to provide robust estimates of subject HR values and are relatively insensitive to the devices used to record the video data. However, they are highly sensitive to conditions of video acquisition, including subject motion, the location, size and averaging techniques applied to regions-of-interest, as well as the number of video frames used for data processing.
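
    A minimal sketch of the core VPPG idea, assuming a pre-extracted mean green-channel trace from the skin ROI: restrict the spectrum to plausible cardiac frequencies and take the dominant peak as the HR estimate. The synthetic trace and band limits are assumptions, not the two algorithms the study evaluated.

    ```python
    # Heart-rate estimation from a mean green-channel ROI trace:
    # detrend, restrict the spectrum to a plausible cardiac band (0.7-4 Hz),
    # and report the dominant frequency as beats per minute.
    # The trace below is synthetic (72 bpm plus noise), for illustration only.
    import numpy as np

    fs = 30.0                             # camera frame rate (Hz)
    t = np.arange(0, 20, 1 / fs)          # 20 s of video
    trace = 0.02 * np.sin(2 * np.pi * 1.2 * t) + 0.01 * np.random.randn(t.size)

    trace = trace - trace.mean()
    spectrum = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(trace.size, d=1 / fs)

    band = (freqs >= 0.7) & (freqs <= 4.0)        # 42-240 bpm
    hr_bpm = 60.0 * freqs[band][np.argmax(spectrum[band])]
    print(f"estimated HR: {hr_bpm:.0f} bpm")
    ```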

  3. Design and validation of new genotypic tools for easy and reliable estimation of HIV tropism before using CCR5 antagonists.

    Science.gov (United States)

    Poveda, Eva; Seclén, Eduardo; González, María del Mar; García, Federico; Chueca, Natalia; Aguilera, Antonio; Rodríguez, Jose Javier; González-Lahoz, Juan; Soriano, Vincent

    2009-05-01

    Genotypic tools may allow easier and less expensive estimation of HIV tropism before prescription of CCR5 antagonists compared with the Trofile assay (Monogram Biosciences, South San Francisco, CA, USA). Paired genotypic and Trofile results were compared in plasma samples derived from the maraviroc expanded access programme (EAP) in Europe. A new genotypic approach was built to improve the sensitivity to detect X4 variants based on an optimization of the webPSSM algorithm. Then, the new tool was validated in specimens from patients included in the ALLEGRO trial, a multicentre study conducted in Spain to assess the prevalence of R5 variants in treatment-experienced HIV patients. A total of 266 specimens from the maraviroc EAP were tested. Overall geno/pheno concordance was above 72%. A high specificity was generally seen for the detection of X4 variants using genotypic tools (ranging from 58% to 95%), while sensitivity was low (ranging from 31% to 76%). The PSSM score was then optimized to enhance the sensitivity to detect X4 variants changing the original threshold for R5 categorization. The new PSSM algorithms, PSSM(X4R5-8) and PSSM(SINSI-6.4), considered as X4 all V3 scoring values above -8 or -6.4, respectively, increasing the sensitivity to detect X4 variants up to 80%. The new algorithms were then validated in 148 specimens derived from patients included in the ALLEGRO trial. The sensitivity/specificity to detect X4 variants was 93%/69% for PSSM(X4R5-8) and 93%/70% for PSSM(SINSI-6.4). PSSM(X4R5-8) and PSSM(SINSI-6.4) may confidently assist therapeutic decisions for using CCR5 antagonists in HIV patients, providing an easier and rapid estimation of tropism in clinical samples.

  4. Standard error of measurement of five health utility indexes across the range of health for use in estimating reliability and responsiveness

    Science.gov (United States)

    Palta, Mari; Chen, Han-Yang; Kaplan, Robert M.; Feeny, David; Cherepanov, Dasha; Fryback, Dennis

    2011-01-01

    Background: Standard errors of measurement (SEMs) of health-related quality of life (HRQoL) indexes are not well characterized. SEM is needed to estimate responsiveness statistics, and provides guidance on using indexes at the individual and group level. SEM is also a component of reliability. Purpose: To estimate the SEM of five HRQoL indexes. Design: The National Health Measurement Study (NHMS) was a population-based telephone survey. The Clinical Outcomes and Measurement of Health Study (COMHS) provided repeated measures 1 and 6 months post cataract surgery. Subjects: 3844 randomly selected adults from the non-institutionalized population 35 to 89 years old in the contiguous United States, and 265 cataract patients. Measurements: The SF-6D (from the SF-36v2™), QWB-SA, EQ-5D, HUI2 and HUI3 were included. An item-response theory (IRT) approach captured joint variation in indexes into a composite construct of health (theta). We estimated: (1) the test-retest standard deviation (SEM-TR) from COMHS, (2) the structural standard deviation (SEM-S) around the composite construct from NHMS, and (3) corresponding reliability coefficients. Results: SEM-TR was 0.068 (SF-6D), 0.087 (QWB-SA), 0.093 (EQ-5D), 0.100 (HUI2) and 0.134 (HUI3), while SEM-S was 0.071, 0.094, 0.084, 0.074 and 0.117, respectively. These translate into reliability coefficients for SF-6D: 0.66 (COMHS) and 0.71 (NHMS); for QWB: 0.59 and 0.64; for EQ-5D: 0.61 and 0.70; for HUI2: 0.64 and 0.80; and for HUI3: 0.75 and 0.77, respectively. The SEM varied considerably across levels of health, especially for HUI2, HUI3 and EQ-5D, and was strongly influenced by ceiling effects. Limitations: Repeated measures were five months apart, and estimates of theta contain measurement error. Conclusions: The two types of SEM are similar and substantial for all the indexes, and vary across the range of health. PMID:20935280

  5. Main aspects of estimating the working capability under prolonged effect of ionizing radiation; Osnovnye aspekty otsenki rabotosposobnosti izdelij pri dlitel`nom vozdejstvii ioniziruyushchikh izluchenij

    Energy Technology Data Exchange (ETDEWEB)

    Kulikov, I V

    1994-12-31

    Methodological recommendations are given concerning investigation of the working capability of products, for evaluating radiation resistance, reliability, and physical ageing under thermal and current loads. The applicability of the additive defect-accumulation principle, the kinetics of radiation-defect annealing, and the expected change of product resistance under different values of radiation intensity are described.

  6. A fast-reliable methodology to estimate the concentration of rutile or anatase phases of TiO2

    Directory of Open Access Journals (Sweden)

    A. R. Zanatta

    2017-07-01

    Full Text Available Titanium dioxide (TiO2) is a low-cost, chemically inert material that has become the basis of many modern applications, ranging from cosmetics to photovoltaics. TiO2 exists in three different crystal phases − Rutile, Anatase and, less commonly, Brookite − and, in most cases, the presence or relative amount of these phases is essential in deciding the final application of the TiO2 and its related efficiency. Traditionally, X-ray diffraction has been chosen to study TiO2, providing both phase identification and the Rutile-to-Anatase ratio. Similar information can be obtained from Raman scattering spectroscopy which, additionally, is versatile and involves rather simple instrumentation. Motivated by these aspects, this work considered various TiO2 Rutile+Anatase powder mixtures and their corresponding Raman spectra. Essentially, the method described here is based upon the fact that the Rutile and Anatase crystal phases have distinctive phonon features, and therefore the composition of the TiO2 mixtures can be readily assessed from their Raman spectra. The experimental results clearly demonstrate the suitability of Raman spectroscopy for estimating the concentration of Rutile or Anatase in TiO2 and are expected to influence the study of TiO2-related thin films, interfaces, systems with reduced dimensions, and devices such as photocatalytic devices and solar cells.
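
    The recipe implied by this record reduces to comparing one characteristic Raman band per phase; a hedged sketch in which the band assignments are standard literature values but the calibration constant is a made-up placeholder that would have to be fixed against mixtures of known composition, as the authors do.

    ```python
    # Estimating the anatase weight fraction of a rutile+anatase mixture from
    # the integrated intensities of one characteristic Raman band per phase:
    #     W_anatase = I_A / (I_A + k * I_R)
    # k is an empirical calibration constant obtained from mixtures of known
    # composition; its value here is a made-up placeholder.
    def anatase_fraction(i_anatase, i_rutile, k=1.0):
        """Weight fraction of anatase from integrated band intensities."""
        return i_anatase / (i_anatase + k * i_rutile)

    # Hypothetical integrated intensities (arbitrary units) of the anatase
    # ~144 cm^-1 band and the rutile ~447 cm^-1 band:
    print(f"anatase fraction: {anatase_fraction(820.0, 410.0, k=1.3):.2f}")
    ```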

  7. Space Logistics: Launch Capabilities

    Science.gov (United States)

    Furnas, Randall B.

    1989-01-01

    The current maximum launch capability for the United States is shown. The predicted Earth-to-orbit requirements for the United States are presented. Contrasting the two indicates the strong national need for a major increase in Earth-to-orbit lift capability. Approximate weights for planned payloads are shown. NASA is studying the following options to meet the need for a new heavy-lift capability by the mid-to-late 1990s: (1) Shuttle-C for the near term (including growth versions); and (2) the Advanced Launch System (ALS) for the long term. The current baseline two-engine Shuttle-C has a 15 x 82 ft payload bay and an expected lift capability of 82,000 lb to Low Earth Orbit. Several options are being considered which have expanded-diameter payload bays. A three-engine Shuttle-C with an expected lift of 145,000 lb to LEO is being evaluated as well. The Advanced Launch System (ALS) is a potential joint development between the Air Force and NASA. This program is focused toward long-term launch requirements, specifically beyond the year 2000. The basic approach is to develop a family of vehicles with the same high reliability as the Shuttle system, yet offering a much greater lift capability at a greatly reduced cost (per pound of payload). The ALS unmanned family of vehicles will provide a low-end lift capability equivalent to Titan IV, and a high-end lift capability greater than the Soviet Energia if requirements for such a high-end vehicle are defined. In conclusion, the planning of the next generation space telescope should not be constrained to the current launch vehicles. New vehicle designs will be driven by the needs of anticipated heavy users.

  8. Using Length of Stay to Control for Unobserved Heterogeneity When Estimating Treatment Effect on Hospital Costs with Observational Data: Issues of Reliability, Robustness, and Usefulness.

    Science.gov (United States)

    May, Peter; Garrido, Melissa M; Cassel, J Brian; Morrison, R Sean; Normand, Charles

    2016-10-01

    To evaluate the sensitivity of treatment effect estimates when length of stay (LOS) is used to control for unobserved heterogeneity when estimating treatment effect on cost of hospital admission with observational data. We used data from a prospective cohort study on the impact of palliative care consultation teams (PCCTs) on direct cost of hospital care. Adult patients with an advanced cancer diagnosis admitted to five large medical and cancer centers in the United States between 2007 and 2011 were eligible for this study. Costs were modeled using generalized linear models with a gamma distribution and a log link. We compared variability in estimates of PCCT impact on hospitalization costs when LOS was used as a covariate, as a sample parameter, and as an outcome denominator. We used propensity scores to account for patient characteristics associated with both PCCT use and total direct hospitalization costs. We analyzed data from hospital cost databases, medical records, and questionnaires. Our propensity score weighted sample included 969 patients who were discharged alive. In analyses of hospitalization costs, treatment effect estimates are highly sensitive to methods that control for LOS, complicating interpretation. Both the magnitude and significance of results varied widely with the method of controlling for LOS. When we incorporated intervention timing into our analyses, results were robust to LOS-controls. Treatment effect estimates using LOS-controls are not only suboptimal in terms of reliability (given concerns over endogeneity and bias) and usefulness (given the need to validate the cost-effectiveness of an intervention using overall resource use for a sample defined at baseline) but also in terms of robustness (results depend on the approach taken, and there is little evidence to guide this choice). To derive results that minimize endogeneity concerns and maximize external validity, investigators should match and analyze treatment and comparison arms
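
    The cost model described here (a generalized linear model with a gamma distribution and a log link) can be sketched with the statsmodels package as below; the covariates and synthetic data are assumptions for illustration, not the study's variables.

    ```python
    # Gamma GLM with a log link for right-skewed hospitalization costs,
    # matching the model family described in the record. Data are synthetic;
    # 'treated' stands in for PCCT exposure and 'age' for a case-mix covariate.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200
    treated = rng.integers(0, 2, n)
    age = rng.normal(65, 10, n)
    mu = np.exp(8.5 - 0.3 * treated + 0.01 * age)        # true mean costs
    costs = rng.gamma(shape=2.0, scale=mu / 2.0)          # gamma-distributed

    X = sm.add_constant(np.column_stack([treated, age]))
    model = sm.GLM(costs, X,
                   family=sm.families.Gamma(link=sm.families.links.Log()))
    result = model.fit()
    print(result.params)  # exp(coef on 'treated') ~ multiplicative cost effect
    ```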

  9. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book is about reliability engineering. It describes the definition and importance of reliability; the development of reliability engineering; the failure rate and failure probability density function and their types; constant failure rate (CFR) and the exponential distribution; increasing failure rate (IFR) and the normal and Weibull distributions; maintainability and availability; reliability testing and reliability estimation for the exponential, normal and Weibull distribution types; reliability sampling tests; system reliability; design for reliability; and functional failure analysis by FTA.

  10. Disability as deprivation of capabilities: Estimation using a large-scale survey in Morocco and Tunisia and an instrumental variable approach.

    Science.gov (United States)

    Trani, Jean-Francois; Bakhshi, Parul; Brown, Derek; Lopez, Dominique; Gall, Fiona

    2018-05-25

    The capability approach pioneered by Amartya Sen and Martha Nussbaum offers a new paradigm to examine disability, poverty and their complex associations. Disability is hence defined as a situation in which a person with an impairment faces various forms of restrictions in functionings and capabilities. Additionally, poverty is not the mere absence of income but a lack of ability to achieve essential functionings; disability is consequently the poverty of capabilities of persons with impairment. It is the lack of opportunities in a given context and agency that leads to persons with disabilities being poorer than other social groups. Consequently, poverty of people with disabilities comprises complex processes of social exclusion and disempowerment. Despite growing evidence that persons with disabilities face higher levels of poverty, the literature from low- and middle-income countries that analyzes the causal link between disability and poverty remains limited. Drawing on data from a large case-control field survey carried out between December 24th, 2013 and February 16th, 2014 in Tunisia and between November 4th, 2013 and June 12th, 2014 in Morocco, we examined the effect of impairment on various basic capabilities, health-related quality of life and multidimensional poverty (indicators of poor wellbeing) in Morocco and Tunisia. To demonstrate a causal link between impairment and deprivation of capabilities, we used instrumental variable regression analyses. In both countries, we found lower access to jobs for persons with impairment. Health-related quality of life was also lower for this group, who also faced a higher risk of multidimensional poverty. There was no significant direct effect of impairment on access to school and acquiring literacy in both countries, or on access to health care and expenses in Tunisia, while in Morocco having an impairment reduced access to healthcare facilities and increased out-of-pocket expenditures. These results suggest that

  11. Neural network fusion capabilities for efficient implementation of tracking algorithms

    Science.gov (United States)

    Sundareshan, Malur K.; Amoozegar, Farid

    1997-03-01

    The ability to efficiently fuse information of different forms to facilitate intelligent decision making is one of the major capabilities of trained multilayer neural networks that is now being recognized. While development of innovative adaptive control algorithms for nonlinear dynamical plants that attempt to exploit these capabilities seems to be more popular, a corresponding development of nonlinear estimation algorithms using these approaches, particularly for application in target surveillance and guidance operations, has not received similar attention. We describe the capabilities and functionality of neural network algorithms for data fusion and implementation of tracking filters. To discuss details and to serve as a vehicle for quantitative performance evaluations, the illustrative case of estimating the position and velocity of surveillance targets is considered. Efficient target-tracking algorithms that can utilize data from a host of sensing modalities and are capable of reliably tracking even uncooperative targets executing fast and complex maneuvers are of interest in a number of applications. The primary motivation for employing neural networks in these applications comes from the efficiency with which more features extracted from different sensor measurements can be utilized as inputs for estimating target maneuvers. A system architecture that efficiently integrates the fusion capabilities of a trained multilayer neural net with the tracking performance of a Kalman filter is described. The innovation lies in the way the fusion of multisensor data is accomplished to facilitate improved estimation without increasing the computational complexity of the dynamical state estimator itself.
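
    For reference, the estimator half of the described architecture (a Kalman filter tracking position and velocity) can be sketched as follows; the neural-network fusion front end is not reproduced, and all noise settings are illustrative.

    ```python
    # Constant-velocity Kalman filter for target position/velocity estimation,
    # the dynamical state estimator referred to above. One spatial dimension,
    # noisy position measurements; noise levels are illustrative.
    import numpy as np

    dt = 1.0
    F = np.array([[1, dt], [0, 1]])          # state transition (pos, vel)
    H = np.array([[1.0, 0.0]])               # we only measure position
    Q = 0.01 * np.eye(2)                     # process noise (assumed)
    R = np.array([[4.0]])                    # measurement noise (assumed)

    x = np.array([[0.0], [0.0]])             # initial state estimate
    P = np.eye(2) * 10.0                     # initial covariance

    rng = np.random.default_rng(1)
    true_pos = np.cumsum(np.full(30, 2.0))   # target moving at 2 units/step
    for z in true_pos + rng.normal(0, 2.0, 30):
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P

    print(f"final estimate: pos={x[0,0]:.1f}, vel={x[1,0]:.2f}")
    ```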

  12. Ultrasound estimates of muscle quality in older adults: reliability and comparison of Photoshop and ImageJ for the grayscale analysis of muscle echogenicity.

    Science.gov (United States)

    Harris-Love, Michael O; Seamon, Bryant A; Teixeira, Carla; Ismail, Catheeja

    2016-01-01

    Background. Quantitative diagnostic ultrasound imaging has been proposed as a method of estimating muscle quality using measures of echogenicity. The Rectangular Marquee Tool (RMT) and the Free Hand Tool (FHT) are two types of editing features used in Photoshop and ImageJ for determining a region of interest (ROI) within an ultrasound image. The primary objective of this study is to determine the intrarater and interrater reliability of Photoshop and ImageJ for the estimate of muscle tissue echogenicity in older adults via grayscale histogram analysis. The secondary objective is to compare the mean grayscale values obtained using both the RMT and FHT methods across both image analysis platforms. Methods. This cross-sectional observational study features 18 community-dwelling men (age = 61.5 ± 2.32 years). Longitudinal views of the rectus femoris were captured using B-mode ultrasound. The ROI for each scan was selected by 2 examiners using the RMT and FHT methods from each software program. Their reliability is assessed using intraclass correlation coefficients (ICCs) and the standard error of the measurement (SEM). Measurement agreement for these values is depicted using Bland-Altman plots. A paired t-test is used to determine mean differences in echogenicity expressed as grayscale values using the RMT and FHT methods to select the post-image acquisition ROI. The degree of association among ROI selection methods and image analysis platforms is analyzed using the coefficient of determination (R²). Results. The raters demonstrated excellent intrarater and interrater reliability using the RMT and FHT methods across both platforms (lower bound 95% CI ICC = .97–.99, p < .001). Mean differences between the echogenicity estimates obtained with the RMT and FHT methods were .87 grayscale levels (95% CI [.54–1.21], p < .0001) using data obtained with both programs. The SEM for Photoshop was .97 and 1.05 grayscale levels when using the RMT and FHT ROI selection methods, respectively. Comparatively, the SEM values were .72 and .81 grayscale levels, respectively, when using the RMT and FHT ROI selection methods in ImageJ. Uniform coefficients of determination (R² = .96) were found across ROI selection methods and image analysis platforms.

  13. Sensor Alerting Capability

    Science.gov (United States)

    Henriksson, Jakob; Bermudez, Luis; Satapathy, Goutam

    2013-04-01

    There is a large amount of sensor data generated today by various sensors, from in-situ buoys to mobile underwater gliders. Providing sensor data to users through standardized services, language and data models is the promise of OGC's Sensor Web Enablement (SWE) initiative. As the amount of data grows, it is becoming difficult for data providers, planners and managers to ensure the reliability of data and services and to monitor critical data changes. Intelligent Automation Inc. (IAI) is developing a net-centric alerting capability to address these issues. The capability is built on Sensor Observation Services (SOSs), which are used to collect and monitor sensor data. The alerts can be configured at the service level and at the sensor data level. For example, it can alert on irregular data-delivery events or on a geo-temporal statistic of sensor data crossing a preset threshold. The capability provides multiple delivery mechanisms and protocols, including traditional techniques such as email and RSS. With this capability, decision makers can monitor their assets and data streams, correct failures, or be alerted about coming phenomena.

  14. Reliable Function Approximation and Estimation

    Science.gov (United States)

    2016-08-16

    This project extends compressed sensing results to a wide class of infinite-dimensional problems. We discuss four key findings arising from the project, as related to uncertainty quantification, image processing, and matrix completion, and four key application domains for the methods developed in the project.

  15. Uncertainties and reliability theories for reactor safety

    International Nuclear Information System (INIS)

    Veneziano, D.

    1975-01-01

    What makes the safety problem of nuclear reactors particularly challenging is the demand for high levels of reliability and the limitation of statistical information. The latter is an unfortunate circumstance, which forces deductive theories of reliability to use models and parameter values with weak factual support. The uncertainty about probabilistic models and parameters which are inferred from limited statistical evidence can be quantified and incorporated rationally into inductive theories of reliability. In such theories, the starting point is the information actually available, as opposed to an estimated probabilistic model. But, while the necessity of introducing inductive uncertainty into reliability theories has been recognized by many authors, no satisfactory inductive theory is presently available. The paper presents: a classification of uncertainties and of reliability models for reactor safety; a general methodology to include these uncertainties into reliability analysis; a discussion about the relative advantages and the limitations of various reliability theories (specifically, of inductive and deductive, parametric and nonparametric, second-moment and full-distribution theories). For example, it is shown that second-moment theories, which were originally suggested to cope with the scarcity of data, and which have been proposed recently for the safety analysis of secondary containment vessels, are the least capable of incorporating statistical uncertainty. The focus is on reliability models for external threats (seismic accelerations and tornadoes). As an application example, the effect of statistical uncertainty on seismic risk is studied using parametric full-distribution models
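
    To make the contrast concrete, the simplest second-moment calculation mentioned above looks as follows; the load and resistance moments are invented for illustration.

    ```python
    # First-order second-moment (FOSM) reliability index for the margin
    # M = R - S (resistance minus load), assuming independent R and S:
    #     beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2)
    # Under a normal approximation, P(failure) = Phi(-beta).
    # The moments below are hypothetical.
    from math import sqrt, erf

    mu_r, sigma_r = 10.0, 1.5    # resistance moments (assumed)
    mu_s, sigma_s = 6.0, 1.0     # load moments (assumed)

    beta = (mu_r - mu_s) / sqrt(sigma_r**2 + sigma_s**2)
    p_fail = 0.5 * (1 + erf(-beta / sqrt(2)))   # standard normal CDF at -beta
    print(f"beta = {beta:.2f}, P(failure) ~ {p_fail:.2e}")
    ```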

  16. Secondary dentine as a sole parameter for age estimation: Comparison and reliability of qualitative and quantitative methods among North Western adult Indians

    Directory of Open Access Journals (Sweden)

    Jasbir Arora

    2016-06-01

    Full Text Available The indestructible nature of teeth against most environmental abuses makes them useful in disaster victim identification (DVI). The present study has been undertaken to examine the reliability of Gustafson's qualitative method and Kedici's quantitative method of measuring secondary dentine for age estimation among North Western adult Indians. 196 (M = 85; F = 111) single-rooted teeth were collected from the Department of Oral Health Sciences, PGIMER, Chandigarh. Ground sections were prepared and the amount of secondary dentine formed was scored qualitatively according to Gustafson's 0–3 scoring system (method 1) and quantitatively following Kedici's micrometric measurement method (method 2). Out of 196 teeth, 180 samples (M = 80; F = 100) were found to be suitable for measuring secondary dentine following Kedici's method. The absolute mean error of age was calculated for both methodologies. Results clearly showed that in the pooled data, method 1 gave an error of ±10.4 years whereas method 2 exhibited an error of approximately ±13 years. A statistically significant difference was noted in the absolute mean error of age between the two methods of measuring secondary dentine for age estimation. Further, it was also revealed that teeth extracted for periodontal reasons severely decreased the accuracy of Kedici's method; however, the disease had no effect when estimating age by Gustafson's method. No significant gender differences were noted in the absolute mean error of age by either method, which suggests that there is no need to separate data on the basis of gender.

  17. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompass Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.
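
    As one concrete instance of the Bayesian estimation techniques covered, a gamma-Poisson conjugate update for an event rate (in the spirit of the gas-pipeline leak-rate example) can be sketched as follows; the prior and data values are hypothetical.

    ```python
    # Bayesian estimation of an event rate (e.g., leaks per pipeline-year)
    # with the gamma-Poisson conjugate pair: prior Gamma(a, b) plus x events
    # in T exposure years gives posterior Gamma(a + x, b + T).
    # Prior and data values are hypothetical.
    a, b = 0.5, 10.0        # prior shape and rate (assumed weakly informative)
    x, T = 3, 120.0         # observed: 3 events in 120 pipeline-years

    a_post, b_post = a + x, b + T
    mean_rate = a_post / b_post
    print(f"posterior mean rate: {mean_rate:.4f} events/year")
    ```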

  18. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth models.

  19. A GIS-based assessment of the suitability of SCIAMACHY satellite sensor measurements for estimating reliable CO concentrations in a low-latitude climate.

    Science.gov (United States)

    Fagbeja, Mofoluso A; Hill, Jennifer L; Chatterton, Tim J; Longhurst, James W S

    2015-02-01

    An assessment of the reliability of the Scanning Imaging Absorption Spectrometer for Atmospheric Cartography (SCIAMACHY) satellite sensor measurements to interpolate tropospheric concentrations of carbon monoxide, considering the low-latitude climate of the Niger Delta region in Nigeria, was conducted. Monthly SCIAMACHY carbon monoxide (CO) column measurements from January 2003 to December 2005 were interpolated using the ordinary kriging technique. The spatio-temporal variations observed in the reliability were based on proximity to the Atlantic Ocean, seasonal variations in the intensities of rainfall and relative humidity, the presence of dust particles from the Sahara desert, industrialization in Southwest Nigeria, and biomass burning during the dry season in Northern Nigeria. Spatial reliabilities of 74 and 42 % are observed for the inland and coastal areas, respectively. Temporally, average reliabilities of 61 and 55 % occur during the dry and wet seasons, respectively. Reliability in the inland and coastal areas was 72 and 38 % during the wet season, and 75 and 46 % during the dry season, respectively. Based on the results, the WFM-DOAS SCIAMACHY CO data product used for this study is therefore relevant to the assessment of CO concentrations in low-latitude developing countries that cannot afford monitoring infrastructure due to its high cost. Although the SCIAMACHY sensor is no longer available, it provided cost-effective, reliable and accessible data that could support air quality assessment in developing countries.
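
    The interpolation step described (ordinary kriging of monthly CO columns) might be sketched with the third-party pykrige package as below; the package choice, coordinates and column values are assumptions for illustration, not the study's data or tooling.

    ```python
    # Ordinary kriging of sparse column measurements onto a regular grid,
    # as done for the monthly SCIAMACHY CO fields. Uses the third-party
    # 'pykrige' package (pip install pykrige); data are invented placeholders.
    import numpy as np
    from pykrige.ok import OrdinaryKriging

    lon = np.array([5.1, 6.3, 7.0, 5.8, 6.7])      # hypothetical obs. lons
    lat = np.array([4.5, 4.9, 5.6, 6.0, 5.2])      # hypothetical obs. lats
    co = np.array([1.9e18, 2.1e18, 1.7e18, 2.4e18, 2.0e18])  # CO columns

    grid_lon = np.linspace(5.0, 7.0, 21)
    grid_lat = np.linspace(4.5, 6.0, 16)

    ok = OrdinaryKriging(lon, lat, co, variogram_model="spherical")
    z_interp, kriging_var = ok.execute("grid", grid_lon, grid_lat)
    print(z_interp.shape)  # (16, 21): interpolated CO field with variance map
    ```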

  20. Rights, goals, and capabilities

    NARCIS (Netherlands)

    van Hees, M.V.B.P.M

    This article analyses the relationship between rights and capabilities in order to get a better grasp of the kind of consequentialism that the capability theory represents. Capability rights have been defined as rights that have a capability as their object (rights to capabilities). Such a

  1. The rating reliability calculator

    Directory of Open Access Journals (Sweden)

    Solomon David J

    2004-04-01

    Full Text Available Abstract Background: Rating scales form an important means of gathering evaluation data. Since important decisions are often based on these evaluations, determining the reliability of rating data can be critical. Most commonly used methods of estimating reliability require a complete set of ratings, i.e. every subject being rated must be rated by each judge. Over fifty years ago Ebel described an algorithm for estimating the reliability of ratings based on incomplete data. While his article has been widely cited over the years, software based on the algorithm is not readily available. This paper describes an easy-to-use Web-based utility for estimating the reliability of ratings based on incomplete data using Ebel's algorithm. Methods: The program is available for public use on our server and the source code is freely available under the GNU General Public License. The utility is written in PHP, a common open source embedded scripting language. The rating data can be entered in a convenient format on the user's personal computer, which the program will upload to the server for calculating the reliability and other statistics describing the ratings. Results: When the program is run it displays the reliability, the number of subjects rated, the harmonic mean number of judges rating each subject, and the mean and standard deviation of the averaged ratings per subject. The program also displays the mean, standard deviation and number of ratings for each subject rated. Additionally, the program will estimate the reliability of an average of a number of ratings for each subject via the Spearman-Brown prophecy formula. Conclusion: This simple web-based program provides a convenient means of estimating the reliability of rating data without the need to conduct special studies in order to provide complete rating data. I would welcome other researchers revising and enhancing the program.
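
    The Spearman-Brown step mentioned in the results is a one-line formula; a minimal sketch with a hypothetical single-rating reliability:

    ```python
    # Spearman-Brown prophecy formula: reliability of the mean of k ratings
    # given the reliability r of a single rating:
    #     r_k = k * r / (1 + (k - 1) * r)
    # The single-rating reliability below is a hypothetical input.
    def spearman_brown(r_single, k):
        return k * r_single / (1 + (k - 1) * r_single)

    print(f"3-rating reliability: {spearman_brown(0.55, 3):.3f}")
    ```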

  2. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. Here, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed, and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment; the nature of human error; classification of errors in man-machine systems; practical aspects; human reliability modelling in complex situations; quantification and examination of human reliability; judgement-based approaches; holistic techniques; and decision analytic approaches. (UK)

  3. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very complex systems.

  4. Mobile Test Capabilities

    Data.gov (United States)

    Federal Laboratory Consortium — The Electrical Power Mobile Test capabilities are utilized to conduct electrical power quality testing on aircraft and helicopters. This capability allows that the...

  5. Multidisciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines are investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  6. Capabilities for Strategic Adaptation

    DEFF Research Database (Denmark)

    Distel, Andreas Philipp

    This dissertation explores capabilities that enable firms to strategically adapt to environmental changes and preserve competitiveness over time – often referred to as dynamic capabilities. While dynamic capabilities are a popular research domain, too little is known about what these capabilities...

  7. Mobile systems capability plan

    International Nuclear Information System (INIS)

    1996-09-01

    This plan was prepared to initiate contracting for and deployment of mobile system services. 102,000 cubic meters of retrievable, contact-handled TRU waste are stored at many sites around the country. Also, an estimated 38,000 cubic meters of TRU waste will be generated in the course of waste inventory workoff and continuing DOE operations. All the defense TRU waste is destined for disposal in WIPP near Carlsbad, NM. To ship TRU waste there, sites must first certify that the waste meets WIPP waste acceptance criteria. The waste must be characterized and, if not acceptable, subjected to additional processing, including repackaging. Most sites plan to use existing fixed facilities or open new ones between FY1997 and FY2006 to perform these functions; small-quantity sites lack this capability. An alternative to fixed facilities is the use of mobile systems mounted in trailers or on skids and transported to sites. Mobile systems will be used for all characterization and certification at small sites; large sites can also use them. The Carlsbad Area Office plans to pursue a strategy of privatization of mobile system services, since this offers a number of advantages. To indicate the possible magnitude of the costs of deploying mobile systems, preliminary estimates of equipment, maintenance, and operating costs over a 10-year period were prepared, and options for purchase, lease, and privatization through fixed-price contracts were considered

  8. Reliability Engineering

    CERN Document Server

    Lazzaroni, Massimo

    2012-01-01

    This book gives a practical guide for designers and users in the Information and Communication Technology context. In particular, in the first section, definitions of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3; the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing laboratory tests, puts in evidence the reliability concept from the experimental point of view. In the ICT context, the failure rate for a given system can be

  9. Dynamic capabilities, Marketing Capability and Organizational Performance

    Directory of Open Access Journals (Sweden)

    Adriana Roseli Wünsch Takahashi

    2017-01-01

    Full Text Available The goal of the study is to investigate the influence of dynamic capabilities on organizational performance, and the role of marketing capability as a mediator in this relationship, in the context of private HEIs in Brazil. As a research method, we carried out a survey of 316 HEIs, and data analysis was operationalized with the technique of structural equation modeling. The results indicate that dynamic capabilities influence organizational performance only when mediated by marketing capability. Marketing capability has an important role in the survival, growth and renewal of educational service offerings for HEIs in the private sector, and consequently in organizational performance. It is also demonstrated that the mediated relationship is more intense for HEIs with up to 3,000 students; other organizational profile variables, such as the number of courses, legal constitution, type of institution and type of education, do not significantly alter the results.

  10. Estimating the Confidence Interval of Composite Reliability of a Multidimensional Test With the Delta Method

    Institute of Scientific and Technical Information of China (English)

    Ye, Baojuan; Wen, Zhonglin

    2012-01-01

    Reliability is very important in evaluating the quality of a test. Based on confirmatory factor analysis, composite reliability is a good index to estimate the test reliability for general applications. As is well known, a point estimate contains limited information about a population parameter and cannot indicate how far it may be from the population parameter. The confidence interval of the parameter can provide more information. In evaluating the quality of a test, the confidence interval of composite reliability has received attention in recent years. There are three approaches to estimating the confidence interval of composite reliability of a unidimensional test: the Bootstrap method, the Delta method, and the direct use of the standard error from a software output (e.g., LISREL). The Bootstrap method provides empirical results for the standard error and is the most credible method, but it needs data simulation techniques and its computation process is rather complex. The Delta method computes the standard error of composite reliability by approximate calculation; it is simpler than the Bootstrap method. The LISREL software can directly report the standard error, and it is the easiest of the three methods. A simulation study found that the interval estimates obtained by the Delta method and the Bootstrap method were almost identical, whereas the results obtained by LISREL and by the Bootstrap method were substantially different (Ye & Wen, 2011). The Delta method is recommended when the confidence interval of composite reliability of a unidimensional test is estimated, because the Delta method is simpler than the Bootstrap method. There has been little research on how to compute the confidence interval of composite reliability of a multidimensional test. We deduced a formula, using the Delta method, for computing the standard error of composite reliability of a multidimensional test. Based on the standard error, the confidence interval of the composite reliability can then be obtained.
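
    For comparison with the Delta method discussed in this record, a percentile-bootstrap interval can be sketched as below. For brevity the bootstrapped statistic is Cronbach's alpha rather than CFA-based composite reliability (which would require refitting the CFA model in every resample); the response data are simulated.

    ```python
    # Percentile-bootstrap confidence interval for a reliability coefficient.
    # The statistic here is Cronbach's alpha as a simplified stand-in for
    # composite reliability; the item response matrix is simulated.
    import numpy as np

    def cronbach_alpha(data):
        k = data.shape[1]
        return k / (k - 1) * (1 - data.var(axis=0, ddof=1).sum()
                              / data.sum(axis=1).var(ddof=1))

    rng = np.random.default_rng(42)
    items = rng.multivariate_normal(np.zeros(4), 0.5 + 0.5 * np.eye(4), size=300)

    boot = [cronbach_alpha(items[rng.integers(0, 300, 300)]) for _ in range(2000)]
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"alpha = {cronbach_alpha(items):.3f}, 95% CI = [{lo:.3f}, {hi:.3f}]")
    ```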

  11. Reliability calculations

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1986-03-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very complex systems. In order to increase the applicability of the programs variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for implementation of importance sampling are suggested. (author)
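
    A toy illustration of the variance-reduction idea (importance sampling) referred to above, for a Monte Carlo failure-probability estimate; the limit state and the sampling shift are invented.

    ```python
    # Monte Carlo failure probability with importance sampling: sample from a
    # proposal density shifted toward the failure region and reweight by the
    # likelihood ratio. Limit state g(x) = 4 - x with x ~ N(0,1), so the true
    # failure probability is Phi(-4) ~ 3.17e-5; the shift of +4 is illustrative.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(7)
    n = 10_000
    shift = 4.0

    x = rng.normal(loc=shift, scale=1.0, size=n)          # proposal samples
    weights = norm.pdf(x) / norm.pdf(x, loc=shift)        # likelihood ratio
    p_fail = np.mean((x > 4.0) * weights)
    print(f"estimated P(failure): {p_fail:.2e} (exact ~ {norm.cdf(-4):.2e})")
    ```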

  12. Parameter Estimation of a Reliability Model of Demand-Caused and Standby-Related Failures of Safety Components Exposed to Degradation by Demand Stress and Ageing That Undergo Imperfect Maintenance

    Directory of Open Access Journals (Sweden)

    S. Martorell

    2017-01-01

    Full Text Available One can find many reliability, availability, and maintainability (RAM models proposed in the literature. However, such models become more complex day after day, as there is an attempt to capture equipment performance in a more realistic way, such as, explicitly addressing the effect of component ageing and degradation, surveillance activities, and corrective and preventive maintenance policies. Then, there is a need to fit the best model to real data by estimating the model parameters using an appropriate tool. This problem is not easy to solve in some cases since the number of parameters is large and the available data is scarce. This paper considers two main failure models commonly adopted to represent the probability of failure on demand (PFD of safety equipment: (1 by demand-caused and (2 standby-related failures. It proposes a maximum likelihood estimation (MLE approach for parameter estimation of a reliability model of demand-caused and standby-related failures of safety components exposed to degradation by demand stress and ageing that undergo imperfect maintenance. The case study considers real failure, test, and maintenance data for a typical motor-operated valve in a nuclear power plant. The results of the parameters estimation and the adoption of the best model are discussed.

  13. Systems reliability/structural reliability

    International Nuclear Information System (INIS)

    Green, A.E.

    1980-01-01

    The question of reliability technology using quantified techniques is considered for systems and structures. Systems reliability analysis has progressed to a viable and proven methodology, whereas this has yet to be fully achieved for large scale structures. Structural loading variations over the lifetime of the plant are considered to be more difficult to analyse than those for systems, even though a relatively crude model may be a necessary starting point. Various reliability characteristics and environmental conditions which enter this problem are considered. The rare event situation is briefly mentioned together with aspects of proof testing and normal and upset loading conditions. (orig.)

  14. Assessment of the reliability of ultrasonic inspection methods

    International Nuclear Information System (INIS)

    Haines, N.F.; Langston, D.B.; Green, A.J.; Wilson, R.

    1982-01-01

    The reliability of NDT techniques has remained an open question for many years. A reliable technique may be defined as one that, when rigorously applied by a number of inspection teams, consistently finds and then correctly sizes all defects of concern. In this paper we report an assessment of the reliability of defect detection by manual ultrasonic methods applied to the inspection of thick-section pressure vessel weldments. Initially we consider the available data relating to the inherent physical capabilities of ultrasonic techniques to detect cracks in weldments, and then, independently, we assess the likely variability in team-to-team performance when several teams are asked to follow the same specified test procedure. The two aspects of 'capability' and 'variability' are brought together to provide quantitative estimates of the overall reliability of ultrasonic inspection of thick-section pressure vessel weldments based on currently existing data. The final section of the paper considers current research programmes on reliability and presents a view on how these will help to further improve NDT reliability. (author)

  15. Human reliability

    International Nuclear Information System (INIS)

    Bubb, H.

    1992-01-01

    This book resulted from the activity of Task Force 4.2, 'Human Reliability'. This group was established on February 27th, 1986, at the plenary meeting of the Technical Reliability Committee of VDI, within the framework of the joint committee of VDI on industrial systems technology (GIS). It is composed of representatives of industry, research institutes, technical control boards and universities, whose job it is to study how man fits into the technical side of the world of work and to optimize this interaction. In a total of 17 sessions, information from the branch of ergonomics dealing with human reliability in the use of technical systems at work was exchanged, and different methods for its evaluation were examined and analyzed. The outcome of this work was systematized and compiled in this book. (orig.) [de

  16. On the reliability of a simple method for scoring phenotypes to estimate heritability: A case study with pupal color in Heliconius erato phyllis, Fabricius 1775 (Lepidoptera, Nymphalidae)

    Directory of Open Access Journals (Sweden)

    Adriano Andrejew Ferreira

    2009-01-01

    In this paper, two methods for assessing the degree of melanization of pupal exuviae from the butterfly Heliconius erato phyllis, Fabricius 1775 (Lepidoptera, Nymphalidae, Heliconiini) are compared. In the first method, which was qualitative, the exuviae were classified by scoring the degree of melanization, whereas in the second method, which was quantitative, the exuviae were classified by optical density followed by analysis with appropriate software. The heritability (h²) of the degree of melanization was estimated by regression and analysis of variance. The estimates of h² were similar with both methods, indicating that the qualitative method could be particularly suitable for field work. The low heritability estimates may have resulted from the small sample size (n = 7-18 broods, including the parents) or from the allocation-priority hypothesis, in which pupal color would be a lower-priority trait compared to morphological traits and adequate larval development.
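
    For readers unfamiliar with the regression route to h², the sketch below shows the standard offspring-on-midparent regression, whose slope directly estimates narrow-sense heritability; the brood scores are invented for illustration and are not the study's data.

    ```python
    import numpy as np

    # Midparent melanization scores and mean offspring scores per brood (synthetic)
    midparent = np.array([3.1, 2.4, 4.0, 3.6, 2.9, 3.3, 4.2])
    offspring = np.array([3.0, 2.6, 3.7, 3.4, 2.8, 3.2, 3.9])

    # The slope of the offspring-on-midparent regression estimates h^2 directly
    slope, intercept = np.polyfit(midparent, offspring, 1)
    print(f"h^2 (midparent regression) = {slope:.2f}")
    ```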

  17. Production capability and supply

    International Nuclear Information System (INIS)

    Klemenic, J.

    1977-01-01

    The strong market for uranium of recent years is about to usher in a new era in domestic uranium production. The spot market price of uranium has remained relatively stable at a little over $40/lb for more than 18 months. Many of the recent contracts for delivery in the early 1980s are calling for prices in the range of $40 to $65 per lb in year-of-delivery dollars. Low-grade, high-cost projects, such as uranium recovery from mill tailings and the reopening of "mined-out" ore bodies, have already been initiated. New underground mines to produce at greater depths, and new surface mines to recover lower grade ores, are being developed or seriously planned. In keeping with this movement to recover uranium from low-grade ore and other high cost materials, the Grand Junction Office has examined, for the first time, the production capability of the domestic industry assuming a $30/lb (or less) "forward cost" resource base. As in the past, keep in mind that the market price needed to stimulate full production of a given resource base may be significantly higher than the estimated forward cost of producing that resource. Results of the $30/lb study are presented

  18. Stochastic Modeling of Long-Term and Extreme Value Estimation of Wind and Sea Conditions for Probabilistic Reliability Assessments of Wave Energy Devices

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2014-01-01

    Wave energy power plants are expected to become one of the major future contributors to sustainable electricity production. Optimal design of wave energy power plants is associated with modeling of physical, statistical, measurement and model uncertainties. This paper presents stochastic models...... for the significant wave height, the mean zero-crossing wave period and the wind speed for long-term and extreme estimations. The long-term estimation focuses on annual statistical distributions, the inter-annual variation of distribution parameters and the statistical uncertainty due to limited amount of data...

  19. Fatigue damage estimation in non-linear systems using a combination of Monte Carlo simulation and the First Order Reliability Method

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher

    2015-01-01

    For non-linear systems the estimation of fatigue damage under stochastic loadings can be rather time-consuming. Usually Monte Carlo simulation (MCS) is applied, but the coefficient-of-variation (COV) can be large if only a small set of simulations can be done due to otherwise excessive CPU time...

  20. 78 FR 58492 - Generator Verification Reliability Standards

    Science.gov (United States)

    2013-09-24

    ... power capability that is available for planning models and bulk electric system reliability assessments... of generator equipment needed to support Bulk-Power System reliability and enhance coordination of... support Bulk-Power System reliability and will ensure that accurate data is verified and made available...

  1. Reliability of Wireless Sensor Networks

    Science.gov (United States)

    Dâmaso, Antônio; Rosa, Nelson; Maciel, Paulo

    2014-01-01

    Wireless Sensor Networks (WSNs) consist of hundreds or thousands of sensor nodes with limited processing, storage, and battery capabilities. There are several strategies to reduce the power consumption of WSN nodes (thereby increasing the network lifetime) and to increase the reliability of the network (thereby improving the WSN Quality of Service). However, there is an inherent conflict between power consumption and reliability: an increase in reliability usually leads to an increase in power consumption. For example, routing algorithms can send the same packet through different paths (multipath strategy), which is important for reliability but significantly increases the WSN power consumption. In this context, this paper proposes a model for evaluating the reliability of WSNs that considers the battery level as a key factor. Moreover, this model is based on the routing algorithms used by WSNs. In order to evaluate the proposed model, three scenarios were considered to show the impact of power consumption on the reliability of WSNs. PMID:25157553
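
    A minimal sketch of the series-parallel arithmetic behind such a model is given below, under the assumption (not taken from the paper) that a node's effective reliability scales linearly with its remaining battery fraction: a path succeeds only if every hop succeeds, and a multipath delivery succeeds if at least one path does.

    ```python
    from math import prod

    def node_rel(base_rel, battery_frac):
        # Assumption for illustration: reliability scales linearly with battery level
        return base_rel * battery_frac

    def path_rel(nodes):
        # Series combination: every hop on the path must succeed
        return prod(node_rel(r, b) for r, b in nodes)

    def multipath_rel(paths):
        # Parallel combination: the packet arrives if at least one path succeeds
        return 1.0 - prod(1.0 - path_rel(p) for p in paths)

    path_a = [(0.99, 0.9), (0.98, 0.7), (0.99, 0.8)]   # (base reliability, battery fraction)
    path_b = [(0.97, 1.0), (0.99, 0.6)]
    print(f"single path: {path_rel(path_a):.3f}, multipath: {multipath_rel([path_a, path_b]):.3f}")
    ```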

  2. An Evaluation Method of Equipment Reliability Configuration Management

    Science.gov (United States)

    Wang, Wei; Feng, Weijia; Zhang, Wei; Li, Yuan

    2018-01-01

    At present, many equipment development companies are aware of the great significance of reliability in equipment development. However, due to the lack of an effective management evaluation method, it is very difficult for an equipment development company to manage its own reliability work. The evaluation method for equipment reliability configuration management determines the reliability management capabilities of an equipment development company. Reliability is not only designed in, but must also be managed into being. This paper evaluates reliability management capabilities with a reliability configuration capability maturity model (RCM-CMM) evaluation method.

  3. Building Service Provider Capabilities

    DEFF Research Database (Denmark)

    Brandl, Kristin; Jaura, Manya; Ørberg Jensen, Peter D.

    2015-01-01

    In this paper we study whether and how the interaction between clients and the service providers contributes to the development of capabilities in service provider firms. In situations where such a contribution occurs, we analyze how different types of activities in the production process of the services, such as sequential or reciprocal task activities, influence the development of different types of capabilities. We study five cases of offshore-outsourced knowledge-intensive business services that are distinguished according to their reciprocal or sequential task activities in their production process. We find that clients influence the development of human capital capabilities and management capabilities in reciprocally produced services, while in sequentially produced services clients influence the development of organizational capital capabilities and management capital capabilities.

  4. Redefining reliability

    International Nuclear Information System (INIS)

    Paulson, S.L.

    1995-01-01

    Want to buy some reliability? The question would have been unthinkable in some markets served by the natural gas business even a few years ago, but in the new gas marketplace, industrial, commercial and even some residential customers have the opportunity to choose from among an array of options about the kind of natural gas service they need--and are willing to pay for. The complexities of this brave new world of restructuring and competition have sent the industry scrambling to find ways to educate and inform its customers about the increased responsibility they will have in determining the level of gas reliability they choose. This article discusses the new options and the new responsibilities of customers, the need for continuous education, and MidAmerican Energy Company's experiment in direct marketing of natural gas

  5. Field Programmable Gate Array Reliability Analysis Guidelines for Launch Vehicle Reliability Block Diagrams

    Science.gov (United States)

    Al Hassan, Mohammad; Britton, Paul; Hatfield, Glen Spencer; Novack, Steven D.

    2017-01-01

    Field programmable gate array (FPGA) integrated circuits (ICs) are among the key electronic components in today's sophisticated launch and space vehicle complex avionic systems, largely due to their superb reprogrammable and reconfigurable capabilities combined with relatively low non-recurring engineering (NRE) costs and a short design cycle. Consequently, FPGAs are prevalent ICs in communication protocols and control signal commands. This paper identifies reliability concerns and high-level guidelines to estimate FPGA total failure rates in a launch vehicle application. The paper discusses hardware, hardware description language, and radiation-induced failures. The hardware contribution of the approach accounts for physical failures of the IC. The hardware description language portion discusses the high-level FPGA programming languages and software/code reliability growth. The radiation portion discusses FPGA susceptibility to space environment radiation.
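
    The paper's roll-up is not reproduced here, but a common way to combine such contributions is an additive failure-rate budget, sketched below; the FIT values are placeholders, not figures from the paper.

    ```python
    import math

    # Hedged roll-up: additive failure rates in FIT (failures per 1e9 device-hours);
    # the individual contributions below are placeholder values, not data from the paper.
    lambda_hw  = 20.0   # physical IC failures (e.g., from a vendor reliability report)
    lambda_hdl = 5.0    # residual design/HDL faults (e.g., from a reliability growth model)
    lambda_rad = 12.0   # radiation-induced failures after mitigation (e.g., from beam tests)

    lambda_total = lambda_hw + lambda_hdl + lambda_rad           # total rate in FIT
    mission_hours = 2.0                                          # short ascent phase (assumed)
    reliability = math.exp(-lambda_total * 1e-9 * mission_hours) # exponential survival model
    print(f"total = {lambda_total:.0f} FIT, R(mission) = {reliability:.9f}")
    ```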

  6. On the Reliability of Optimization Results for Trigeneration Systems in Buildings, in the Presence of Price Uncertainties and Erroneous Load Estimation

    Directory of Open Access Journals (Sweden)

    Antonio Piacentino

    2016-12-01

    Cogeneration and trigeneration plants are widely recognized as promising technologies for increasing energy efficiency in buildings. However, their overall potential is scarcely exploited, due to the difficulties in achieving economic viability and the risk of investment related to uncertainties in future energy loads and prices. Several stochastic optimization models have been proposed in the literature to account for uncertainties, but these instruments share a common reliance on user-defined probability functions for each stochastic parameter. Since such functions are hard to predict, this paper proposes an analysis of the influence of erroneous estimation of the uncertain energy loads and prices on the optimal plant design and operation. With reference to a hotel building, a number of realistic scenarios are developed, exploring the most frequent errors occurring in the estimation of energy loads and prices. Then, profit-oriented optimizations are performed for the examined scenarios by means of a deterministic mixed integer linear programming algorithm. From a comparison of the results, it emerges that: (i) the plant profitability is prevalently influenced by the average "spark spread" (i.e., the ratio between electricity and fuel prices) and, secondarily, by the shape of the daily price profiles; (ii) the "optimal sizes" of the main components are scarcely influenced by the daily load profiles, while they are more strictly related to the average "power to heat" and "power to cooling" ratios of the building.

  7. Measures of differences in reliability

    International Nuclear Information System (INIS)

    Doksum, K.A.

    1975-01-01

    Measures of differences in reliability of two systems are considered in the scale model, location-scale model, and a nonparametric model. In each model, estimates and confidence intervals are given and some of their properties discussed

  8. Development of reliable pavement models.

    Science.gov (United States)

    2011-05-01

    The current report proposes a framework for estimating the reliability of a given pavement structure as analyzed by : the Mechanistic-Empirical Pavement Design Guide (MEPDG). The methodology proposes using a previously fit : response surface, in plac...

  9. Towards a national cybersecurity capability development model

    CSIR Research Space (South Africa)

    Jacobs, Pierre C

    2017-06-01

    to be broken down into its components, a model serves as a blueprint to ensure that those building the capability consider all components, allows for cost estimation and facilitates the evaluation of trade-offs. One national cybersecurity capability...

  10. Capability Handbook- offline metrology

    DEFF Research Database (Denmark)

    Islam, Aminul; Marhöfer, David Maximilian; Tosello, Guido

    This offline metrological capability handbook has been made in relation to HiMicro Task 3.3. The purpose of this document is to assess the metrological capability of the HiMicro partners and to gather information on all available metrological instruments in one single document. It provides...

  11. Dynamic Capabilities and Performance

    DEFF Research Database (Denmark)

    Wilden, Ralf; Gudergan, Siegfried P.; Nielsen, Bo Bernhard

    2013-01-01

    are contingent on the competitive intensity faced by firms. Our findings demonstrate the performance effects of internal alignment between organizational structure and dynamic capabilities, as well as the external fit of dynamic capabilities with competitive intensity. We outline the advantages of PLS...

  12. Developing Alliance Capabilities

    DEFF Research Database (Denmark)

    Heimeriks, Koen H.; Duysters, Geert; Vanhaverbeke, Wim

    This paper assesses the differential performance effects of learning mechanisms on the development of alliance capabilities. Prior research has suggested that different capability levels could be identified in which specific intra-firm learning mechanisms are used to enhance a firm's alliance...

  13. Telematics Options and Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Hodge, Cabell [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-05

    This presentation describes the data tracking and analytical capabilities of telematics devices. Federal fleet managers can use the systems to keep their drivers safe, maintain a fuel efficient fleet, ease their reporting burden, and save money. The presentation includes an example of how much these capabilities can save fleets.

  14. SU-E-T-598: Parametric Equation for Quick and Reliable Estimate of Stray Neutron Doses in Proton Therapy and Application for Intracranial Tumor Treatments

    Energy Technology Data Exchange (ETDEWEB)

    Bonfrate, A; Farah, J; Sayah, R; Clairand, I [Institut de Radioprotection et de Surete Nucleaire (IRSN), Fontenay-aux-roses (France); De Marzi, L; Delacroix, S [Institut Curie Centre de Protontherapie d Orsay (CPO), Orsay (France); Herault, J [Centre Antoine Lacassagne (CAL) Cyclotron biomedical, Nice (France); Lee, C [National Cancer Institute, Rockville, MD (United States); Bolch, W [Univ Florida, Gainesville, FL (United States)

    2015-06-15

    Purpose: Development of a parametric equation suitable for daily use in routine clinical practice to provide estimates of stray neutron doses in proton therapy. Methods: Monte Carlo (MC) calculations using the UF-NCI 1-year-old phantom were performed to determine the variation of stray neutron doses as a function of irradiation parameters for intracranial treatments. This was done by individually changing the proton beam energy, modulation width, collimator aperture and thickness, compensator thickness and the air gap size, while their impact on neutron doses was captured in a single equation. The variation of neutron doses with distance from the target volume was also included. A first step consisted in establishing the fitting coefficients using 221 learning data points, which were neutron absorbed doses obtained from MC simulations, while a second step consisted in validating the final equation. Results: The variation of stray neutron doses with irradiation parameters was fitted with linear, polynomial, and similar models, while a power-law model was used to fit the variation of stray neutron doses with distance from the target volume. The parametric equation fitted the MC simulations well when establishing the fitting coefficients, as the discrepancies in the estimated neutron absorbed doses were within 10%. The discrepancy can reach ∼25% for the bladder, the farthest organ from the target volume. Finally, the validation showed results in compliance with the MC calculations, since the discrepancies were also within 10% for head-and-neck and thoracic organs, while they could again reach ∼25% for pelvic organs. Conclusion: The parametric equation shows promising results and will be validated for other target sites as well as other facilities, moving towards a universal method.
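
    The distance dependence can be reproduced in miniature with a least-squares power-law fit, as sketched below; the distances and doses are synthetic stand-ins for the 221 learning data points, not values from the abstract.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Synthetic "learning data": distance from target volume (cm) vs neutron dose (mGy/Gy)
    d = np.array([10., 20., 30., 40., 60., 80.])
    dose = np.array([2.1, 0.9, 0.52, 0.33, 0.18, 0.11])

    def power_law(d, a, b):
        return a * d ** (-b)

    # Fit the power-law model and report the worst relative discrepancy of the fit
    (a_hat, b_hat), _ = curve_fit(power_law, d, dose, p0=[10.0, 1.0])
    resid = 100 * (power_law(d, a_hat, b_hat) - dose) / dose
    print(f"a = {a_hat:.3g}, b = {b_hat:.3g}, max |discrepancy| = {np.abs(resid).max():.1f}%")
    ```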

  15. Impact of inservice inspection on the reliability of nuclear piping

    International Nuclear Information System (INIS)

    Woo, H.H.

    1983-12-01

    The reliability of nuclear piping is a function of piping quality as fabricated, service loadings and environments, plus programs of continuing inspection during operation. This report presents the results of a study of the impact of inservice inspection (ISI) programs on the reliability of specific nuclear piping systems that have actually failed in service. Two major factors are considered in the ISI programs: one is the capability of detecting flaws; the other is the frequency of performing ISI. A probabilistic fracture mechanics model is used to estimate the reliability of two nuclear piping lines over the plant life as functions of the ISI programs. Examples chosen for the study are the PWR feedwater steam generator nozzle cracking incident and the BWR recirculation reactor vessel nozzle safe-end cracking incident

  16. Combining Cystatin C and Creatinine Yields a Reliable Glomerular Filtration Rate Estimation in Older Adults in Contrast to β-Trace Protein and β2-Microglobulin.

    Science.gov (United States)

    Werner, Karin; Pihlsgård, Mats; Elmståhl, Sölve; Legrand, Helen; Nyman, Ulf; Christensson, Anders

    2017-01-01

    The glomerular filtration rate (GFR) is the most important measure of kidney function and chronic kidney disease (CKD). This study aims to validate commonly used equations for estimated GFR (eGFR) based on creatinine (cr), cystatin C (cys), β-trace protein (BTP), and β2-microglobulin (B2M) in older adults. We conducted a validation study with 126 participants aged between 72 and 98 with a mean measured GFR (mGFR) by iohexol clearance of 54 mL/min/1.73 m². The eGFR equations (CKD Epidemiology Collaboration [CKD-EPI], Berlin Initiative Study [BIS], Full Age Spectrum [FAS], Modification of Diet in Renal Disease [MDRD]cr, Caucasian-Asian-Pediatric-Adult [CAPA]cys, Lund-Malmö Revised [LM-REV]cr, and MEAN-LM-CAPAcr-cys) were assessed in terms of bias (median difference: eGFR-mGFR), precision (interquartile range of the differences), and accuracy (P30: percentage of estimates within ±30% of mGFR). The equations were compared to a benchmark equation, CKD-EPIcr-cys. All cystatin C-based equations underestimated the GFR compared to mGFR, whereas bias was mixed for the equations based only on creatinine. Accuracy was highest for CKD-EPIcr-cys (98%) and lowest for MDRD (82%). Below an mGFR of 45 mL/min/1.73 m², only equations incorporating cystatin C reached a P30 accuracy >90%. CKD-EPIcr-cys was not significantly more accurate than the other cystatin C-based equations. In contrast, CKD-EPIcr-cys was significantly more accurate than all creatinine-based equations except LM-REVcr. This study confirms that it is reasonable to use equations incorporating cystatin C and creatinine in older patients across a wide spectrum of GFR. However, the results call into question the use of creatinine alone below an mGFR of 45 mL/min/1.73 m². B2M and BTP do not demonstrate additional value in eGFR determination in older adults. © 2017 S. Karger AG, Basel.
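
    The three validation metrics are simple to compute; the sketch below evaluates them for a hypothetical equation against a synthetic cohort loosely matched to the study's mean mGFR (all numbers invented).

    ```python
    import numpy as np

    def egfr_metrics(egfr, mgfr):
        diff = egfr - mgfr
        bias = np.median(diff)                                         # median difference
        precision = np.percentile(diff, 75) - np.percentile(diff, 25)  # IQR of differences
        p30 = np.mean(np.abs(diff) <= 0.30 * mgfr) * 100               # % within +/-30% of mGFR
        return bias, precision, p30

    rng = np.random.default_rng(1)
    mgfr = rng.normal(54, 15, 126)            # mimic the cohort's mean mGFR of 54
    egfr = mgfr + rng.normal(-2, 6, 126)      # hypothetical equation with slight underestimation
    print("bias=%.1f  IQR=%.1f  P30=%.0f%%" % egfr_metrics(egfr, mgfr))
    ```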

  17. Direct Estimation of Power Distribution in Reactors for Nuclear Thermal Space Propulsion

    Science.gov (United States)

    Aldemir, Tunc; Miller, Don W.; Burghelea, Andrei

    2004-02-01

    A recently proposed constant temperature power sensor (CTPS) has the capability to directly measure the local power deposition rate in nuclear reactor cores proposed for space thermal propulsion. Such a capability reduces the uncertainties in the estimated power peaking factors and hence increases the reliability of the nuclear engine. The CTPS operation is sensitive to the changes in the local thermal conditions. A procedure is described for the automatic on-line calibration of the sensor through estimation of changes in thermal conditions.

  18. FMEF/experimental capabilities

    International Nuclear Information System (INIS)

    Burgess, C.A.; Dronen, V.R.

    1981-01-01

    The Fuels and Materials Examination Facility (FMEF), under construction at the Hanford site north of Richland, Washington, will be one of the most modern facilities offering irradiated fuels and materials examination capabilities and fuel fabrication development technologies. Scheduled for completion in 1984, the FMEF will provide examination capability for fuel assemblies, fuel pins and test pins irradiated in the FFTF. Various functions of the FMEF are described, with emphasis on experimental data-gathering capabilities in the facility's Nondestructive and Destructive examination cell complex

  19. KSC Technical Capabilities Website

    Science.gov (United States)

    Nufer, Brian; Bursian, Henry; Brown, Laurette L.

    2010-01-01

    This document comprises the website pages that review the technical capabilities that the Kennedy Space Center (KSC) has for partnership opportunities. The purpose of this information is to make prospective customers aware of the capabilities and provide an opportunity to form relationships with the experts at KSC. The technical capabilities fall into these areas: (1) Ground Operations and Processing Services, (2) Design and Analysis Solutions, (3) Command and Control Systems / Services, (4) Materials and Processes, (5) Research and Technology Development and (6) Laboratories, Shops and Test Facilities.

  20. Determination of the influence of dispersion pattern of pesticide-resistant individuals on the reliability of resistance estimates using different sampling plans.

    Science.gov (United States)

    Shah, R; Worner, S P; Chapman, R B

    2012-10-01

    Pesticide resistance monitoring includes resistance detection and subsequent documentation/measurement. Resistance detection would require at least one (≥1) resistant individual(s) to be present in a sample to initiate management strategies. Resistance documentation, on the other hand, would attempt to get an estimate of the entire population (≥90%) of the resistant individuals. A computer simulation model was used to compare the efficiency of simple random and systematic sampling plans to detect resistant individuals and to document their frequencies when the resistant individuals were randomly or patchily distributed. A patchy dispersion pattern of resistant individuals influenced the sampling efficiency of systematic sampling plans while the efficiency of random sampling was independent of such patchiness. When resistant individuals were randomly distributed, sample sizes required to detect at least one resistant individual (resistance detection) with a probability of 0.95 were 300 (1%) and 50 (10% and 20%); whereas, when resistant individuals were patchily distributed, using systematic sampling, sample sizes required for such detection were 6000 (1%), 600 (10%) and 300 (20%). Sample sizes of 900 and 400 would be required to detect ≥90% of resistant individuals (resistance documentation) with a probability of 0.95 when resistant individuals were randomly dispersed and present at a frequency of 10% and 20%, respectively; whereas, when resistant individuals were patchily distributed, using systematic sampling, a sample size of 3000 and 1500, respectively, was necessary. A simple random sampling plan is, therefore, recommended for insecticide resistance detection and subsequent documentation.
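
    For the random-dispersion detection case there is a standard closed form under simple random sampling, n ≥ ln(1 − 0.95)/ln(1 − f), sketched below; note that the paper's sizes come from simulation (and also cover patchiness and documentation, not just detection), so they differ somewhat from this idealized formula.

    ```python
    from math import ceil, log

    def n_detect(freq, prob=0.95):
        """Smallest simple-random sample giving >= prob chance of >= 1 resistant individual."""
        return ceil(log(1 - prob) / log(1 - freq))

    for f in (0.01, 0.10, 0.20):
        print(f"resistance frequency {f:.0%}: n >= {n_detect(f)}")
    ```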

  1. Resources, constraints and capabilities

    NARCIS (Netherlands)

    Dhondt, S.; Oeij, P.R.A.; Schröder, A.

    2018-01-01

    Human and financial resources as well as organisational capabilities are needed to overcome the manifold constraints social innovators are facing. To unlock the potential of social innovation for the whole society new (social) innovation friendly environments and new governance structures

  2. a Capability approach

    African Journals Online (AJOL)

    efforts towards gender equality in education as a means of achieving social justice. ... should mean that a lot of capability approach-oriented commentators are ... processes, their forms of exercising power, and their rules, unwritten cultures, ...

  3. Engineering Capabilities and Partnerships

    Science.gov (United States)

    Poulos, Steve

    2010-01-01

    This slide presentation reviews the engineering capabilities at Johnson Space Center. The presentation also reviews the partnerships that have resulted in successfully designed and developed projects involving commercial and educational institutions.

  4. Calculating system reliability with SRFYDO

    Energy Technology Data Exchange (ETDEWEB)

    Morzinski, Jerome [Los Alamos National Laboratory; Anderson - Cook, Christine M [Los Alamos National Laboratory; Klamann, Richard M [Los Alamos National Laboratory

    2010-01-01

    SRFYDO is a process for estimating reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.
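
    SRFYDO's internals are not given in this record, but the flavor of a Bayesian series-system roll-up can be sketched as follows; the component names, test counts, and flat Beta(1,1) priors are all illustrative assumptions, not SRFYDO itself.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Component test data: (successes, trials); Beta(1,1) prior on each component reliability
    components = {"igniter": (48, 50), "valve": (116, 120), "controller": (29, 30)}

    # Posterior draws per component, then series-system reliability = product across components
    draws = {name: rng.beta(1 + s, 1 + n - s, 10000) for name, (s, n) in components.items()}
    system = np.prod(np.column_stack(list(draws.values())), axis=1)

    lo, med, hi = np.percentile(system, [5, 50, 95])
    print(f"system reliability: median {med:.3f}, 90% interval ({lo:.3f}, {hi:.3f})")
    ```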

  5. Reliability engineering

    International Nuclear Information System (INIS)

    Nieuwhof, G.W.E.

    1976-01-01

    Failure of systems is undesirable, but also inevitable. The consequences of failure can be reduced if the failure mode can be anticipated and repair procedures planned in advance. The fault tree analysis is one method of identifying the most probable failure modes and determining system failure rates. From these rates, repair frequency can be estimated. (author)

  6. Brandishing Cyberattack Capabilities

    Science.gov (United States)

    2013-01-01

    Advertising cyberwar capabilities may be helpful. It may back up a deterrence strategy. It might dissuade other states from conventional mischief or...to enable the attack. Many of the instruments of the attack remain with the target system, nestled in its log files, or even in the malware itself...debatable. Even if demonstrated, what worked yesterday may not work today. But difficult does not mean impossible. Advertising cyberwar capabilities

  7. CASL Dakota Capabilities Summary

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Brian M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Simmons, Chris [Univ. of Texas, Austin, TX (United States); Williams, Brian J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-10

    The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification. These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models. Dakota is the verification and validation (V&V) / uncertainty quantification (UQ) software delivery vehicle for CASL, allowing analysts across focus areas to apply these capabilities to myriad nuclear engineering analyses.

  8. BURAR: Detection and signal processing capabilities

    International Nuclear Information System (INIS)

    Ghica, Daniela; Radulian, Mircea; Popa, Mihaela

    2004-01-01

    Since July 2002, a new seismic monitoring station, the Bucovina Seismic Array (BURAR), has been operating in the northern part of Romania, in a joint effort of the Air Force Technical Applications Center, USA, and the National Institute for Earth Physics (NIEP), Romania. The array consists of 10 seismic sensors (9 short-period and one broadband) located in boreholes and distributed over a 5 x 5 km area. At present, the seismic data are continuously recorded by BURAR and transmitted in real time to the Romanian National Data Centre (ROM N DC) in Bucharest and to the National Data Center of the USA, in Florida. The statistical analysis of the seismic information gathered at ROM N DC by BURAR in the August 2002 - December 2003 time interval points out a much better efficiency of the BURAR system in detecting teleseismic events and local events that occurred in the N-NE part of Romanian territory, in comparison with the actual Romanian Telemetered Network. Furthermore, the BURAR monitoring system has proven to be an important source of reliable data for NIEP's efforts in elaborating the seismic bulletins. The signal processing capability of the system provides useful information for improving the location of local seismic events, using the array beamforming facility. This method significantly increases the signal-to-noise ratio of the seismic signal by summing up the coherent signals from the array components. In this way, possible source nucleation phases can be detected. At the same time, using the slowness and backazimuth estimations from f-k analysis, locations for the seismic events can be determined based only on the information recorded by the BURAR array, acting like a single-station recording system. Additionally, f-k analysis techniques are useful in estimating local site effects and interpreting the local geological structure. (authors)
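
    The beamforming step can be illustrated with a toy delay-and-sum stack; this is a generic sketch, not the BURAR processing chain, and the sampling rate, delays, and noise below are synthetic. For N sensors with uncorrelated noise, the coherent stack improves the signal-to-noise ratio by roughly sqrt(N).

    ```python
    import numpy as np

    def delay_and_sum(traces, delays_s, fs):
        """Align each sensor trace by its plane-wave delay (seconds) and stack."""
        beam = np.zeros_like(traces[0])
        for tr, d in zip(traces, delays_s):
            beam += np.roll(tr, -int(round(d * fs)))   # integer-sample shift for simplicity
        return beam / len(traces)

    fs = 40.0                                    # Hz, a typical short-period sampling rate (assumed)
    t = np.arange(0, 10, 1 / fs)
    signal = np.sin(2 * np.pi * 2 * t) * np.exp(-((t - 5) ** 2))   # synthetic wavelet
    rng = np.random.default_rng(3)
    delays = rng.uniform(0, 0.3, 9)              # plane-wave delays across 9 sensors (synthetic)
    traces = [np.roll(signal, int(round(d * fs))) + rng.normal(0, 1, t.size) for d in delays]

    beam = delay_and_sum(traces, delays, fs)
    # Residual noise on the beam should be about 1/sqrt(9) of a single trace's unit noise
    print("single-trace noise std ~1.0, beam noise std ~", round(np.std(beam - signal), 2))
    ```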

  9. MHTGR inherent heat transfer capability

    International Nuclear Information System (INIS)

    Berkoe, J.M.

    1992-01-01

    This paper reports on the Commercial Modular High Temperature Gas-Cooled Reactor (MHTGR) which achieves improved reactor safety performance and reliability by utilizing a completely passive natural convection cooling system called the RCCS to remove decay heat in the event that all active cooling systems fail to operate. For the highly improbable condition that the RCCS were to become non-functional following a reactor depressurization event, the plant would be forced to rely upon its inherent thermo-physical characteristics to reject decay heat to the surrounding earth and ambient environment. A computational heat transfer model was created to simulate such a scenario. Plant component temperature histories were computed over a period of 20 days into the event. The results clearly demonstrate the capability of the MHTGR to maintain core integrity and provide substantial lead time for taking corrective measures

  10. An Introduction To Reliability

    International Nuclear Information System (INIS)

    Park, Kyoung Su

    1993-08-01

    This book introduces reliability, covering the definition of reliability, reliability requirements, the system life cycle and reliability, and reliability and failure rate (including reliability characteristics, chance failures, failure rates that change over time, failure modes, and replacement). It also treats reliability in engineering design, reliability testing under failure-rate assumptions, the plotting of reliability data, prediction of system reliability, system maintenance, and failure topics such as failure relays and system safety analysis.

  11. Final Report for the Virtual Reliability Realization System LDRD

    Energy Technology Data Exchange (ETDEWEB)

    DELLIN, THEODORE A.; HENDERSON, CHRISTOPHER L.; O' TOOLE, EDWARD J.

    2000-12-01

    Current approaches to reliability are not adequate to keep pace with the need for faster, better and cheaper products and systems. This is especially true in high-consequence-of-failure applications. The original proposal for the LDRD was to look at this challenge and see if there was a new paradigm that could make reliability predictions, along with a quantitative estimate of the risk in that prediction, in a way that was faster, better and cheaper. Such an approach would be based on the underlying science models that are the backbone of reliability predictions. The new paradigm would be implemented in two software tools: the Virtual Reliability Realization System (VRRS) and the Reliability Expert System (REX). The three-year LDRD was funded at a reduced level for the first year ($120K vs. $250K) and not renewed. Because of the reduced funding, we concentrated on the initial development of the expert system. We developed an interactive semiconductor calculation tool needed for reliability analyses. We were also able to generate a basic functional system using Microsoft Site Server Commerce Edition and Microsoft SQL Server. The base system has the capability to store Office documents from multiple authors, and has the ability to track and charge for usage. The full outline of the knowledge model has been incorporated, as well as examples of various types of content.

  12. Proposed Reliability/Cost Model

    Science.gov (United States)

    Delionback, L. M.

    1982-01-01

    New technique estimates cost of improvement in reliability for complex system. Model format/approach is dependent upon use of subsystem cost-estimating relationships (CER's) in devising cost-effective policy. Proposed methodology should have application in broad range of engineering management decisions.

  13. Reliability of Circumplex Axes

    Directory of Open Access Journals (Sweden)

    Micha Strack

    2013-06-01

    We present a confirmatory factor analysis (CFA) procedure for computing the reliability of circumplex axes. The tau-equivalent CFA variance decomposition model estimates five variance components: general factor, axes, scale-specificity, block-specificity, and item-specificity. Only the axes variance component is used for reliability estimation. We apply the model to six circumplex types and 13 instruments assessing interpersonal and motivational constructs—Interpersonal Adjective List (IAL), Interpersonal Adjective Scales (revised; IAS-R), Inventory of Interpersonal Problems (IIP), Impact Messages Inventory (IMI), Circumplex Scales of Interpersonal Values (CSIV), Support Action Scale Circumplex (SAS-C), Interaction Problems With Animals (IPI-A), Team Role Circle (TRC), Competing Values Leadership Instrument (CV-LI), Love Styles, Organizational Culture Assessment Instrument (OCAI), Customer Orientation Circle (COC), and System for Multi-Level Observation of Groups (behavioral adjectives; SYMLOG)—in 17 German-speaking samples (29 subsamples, grouped by self-report, other report, and metaperception assessments). The general factor accounted for a proportion ranging from 1% to 48% of the item variance, the axes component for 2% to 30%, and scale-specificity for 1% to 28%, respectively. Reliability estimates varied considerably, from .13 to .92. An application of the Nunnally and Bernstein formula proposed by Markey, Markey, and Tinsley overestimated axes reliabilities in cases of large scale-specificities but otherwise works effectively. Contemporary circumplex evaluations such as Tracey's RANDALL are sensitive to the ratio of the axes and scale-specificity components. In contrast, the proposed model isolates both components.
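
    Under the stated decomposition, one simple reading of an axes reliability is the share of total item variance attributable to the axes component; the component values below are invented for illustration and are not the paper's estimates.

    ```python
    # Hedged sketch: reliability as a variance ratio (components assumed already estimated)
    var = {"general": 0.20, "axes": 0.25, "scale": 0.15, "block": 0.10, "item": 0.30}

    total = sum(var.values())
    axes_reliability = var["axes"] / total   # only the axes component counts as "signal"
    print(f"axes reliability = {axes_reliability:.2f}")
    ```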

  14. A study of operational and testing reliability in software reliability analysis

    International Nuclear Information System (INIS)

    Yang, B.; Xie, M.

    2000-01-01

    Software reliability is an important aspect of any complex equipment today. Software reliability is usually estimated based on reliability models such as nonhomogeneous Poisson process (NHPP) models. Software systems improve during the testing phase, while they normally do not change in the operational phase. Depending on whether the reliability is to be predicted for the testing phase or the operational phase, different measures should be used. In this paper, two different reliability concepts, namely the operational reliability and the testing reliability, are clarified and studied in detail. These concepts have been mixed up or even misused in some of the existing literature. Using different reliability concepts leads to different reliability values, which in turn leads to different reliability-based decisions. The difference between the estimated reliabilities is studied and the effect on the optimal release time is investigated
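
    The distinction can be made concrete with a Goel-Okumoto NHPP (an assumed model form; the parameters below are invented): during continued testing the failure intensity keeps decaying as faults are removed, whereas in operation it stays frozen at its release value.

    ```python
    import numpy as np

    a, b = 120.0, 0.02        # Goel-Okumoto parameters (assumed fitted from test data)
    t = 100.0                 # hours of testing completed so far
    x = 10.0                  # prediction horizon (hours)

    m = lambda s: a * (1 - np.exp(-b * s))                   # expected cumulative failures
    testing_rel = np.exp(-(m(t + x) - m(t)))                 # software keeps improving in use
    operational_rel = np.exp(-a * b * np.exp(-b * t) * x)    # failure rate frozen at release
    print(f"testing R = {testing_rel:.3f}, operational R = {operational_rel:.3f}")
    ```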

  15. Technological Capability's Predictor Variables

    Directory of Open Access Journals (Sweden)

    Fernanda Maciel Reichert

    2011-03-01

    The aim of this study was to identify the factors that influence the configuration of the technological capability of companies in sectors with medium-low technological intensity. To achieve this goal, a survey was carried out. Based on the framework developed by Lall (1992), which classifies firms into basic, intermediate and advanced levels of technological capability, it was found that the predominant technological capability is intermediate, with 83.7% of respondent companies (plastics companies in Brazil). It is believed that the main contribution of this study is the finding that the dependent variable named "Technological Capability" can be explained at a rate of 65% by six variables: development of new processes; selection of the best equipment supplier; sales of internally developed new technology to third parties; design and manufacture of equipment; study of work methods and inventory control; and improvement of product quality.

  16. Capabilities for innovation

    DEFF Research Database (Denmark)

    Nielsen, Peter; Nielsen, Rene Nesgaard; Bamberger, Simon Grandjean

    2012-01-01

    Technological developments combined with increasing levels of competition related to the ongoing globalization imply that firms find themselves in dynamic, changing environments that call for dynamic capabilities. This challenges the internal human and organizational resources of firms in general...... is a survey that collected information from 601 firms belonging to the private urban sector in Denmark. The survey was carried out in late 2010. Keywords: dynamic capabilities/innovation/globalization/employee/employer cooperation/Nordic model Acknowledgment: The GOPA study was financed by grant 20080053113

  17. Human push capability.

    Science.gov (United States)

    Barnett, Ralph L; Liber, Theodore

    2006-02-22

    Use of unassisted human push capability arises from time to time in the areas of crowd and animal control, the security of locked doors, the integrity of railings, the removal of tree stumps and entrenched vehicles, the maneuvering of furniture, and athletic pursuits such as US football or wrestling. Depending on the scenario, human push capability involves strength, weight, weight distribution, push angle, footwear/floor friction, and the friction between the upper body and the pushed object. Simple models are used to establish the relationships among these factors.

  18. The Capability Approach

    OpenAIRE

    Robeyns, Ingrid

    2011-01-01

    In its most general description, the capability approach is a flexible and multi-purpose normative framework, rather than a precise theory of well-being, freedom or justice. At its core are two normative claims: first, the claim that the freedom to achieve well-being is of primary moral importance, and second, that freedom to achieve well-being is to be understood in terms of people's capabilities, that is, their real opportunities to do and be what they have reason to value. Thi...

  19. Sandia QIS Capabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Muller, Richard P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-07-01

    Sandia National Laboratories has developed a broad set of capabilities in quantum information science (QIS), including elements of quantum computing, quantum communications, and quantum sensing. The Sandia QIS program is built atop unique DOE investments at the laboratories, including the MESA microelectronics fabrication facility, the Center for Integrated Nanotechnologies (CINT) facilities (joint with LANL), the Ion Beam Laboratory, and ASC High Performance Computing (HPC) facilities. Sandia has invested $75 M of LDRD funding over 12 years to develop unique, differentiating capabilities that leverage these DOE infrastructure investments.

  20. Structural reliability methods: Code development status

    Science.gov (United States)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  1. ISOPHOT - Capabilities and performance

    DEFF Research Database (Denmark)

    Lemke, D.; Klaas, U.; Abolins, J.

    1996-01-01

    ISOPHOT covers the largest wavelength range on ISO, from 2.5 to 240 μm. Its scientific capabilities include multi-filter and multi-aperture photometry, polarimetry, imaging and spectrophotometry. All modes can optionally include a focal plane chopper. The backbone of the photometric calibration...

  2. Capabilities for Intercultural Dialogue

    Science.gov (United States)

    Crosbie, Veronica

    2014-01-01

    The capabilities approach offers a valuable analytical lens for exploring the challenge and complexity of intercultural dialogue in contemporary settings. The central tenets of the approach, developed by Amartya Sen and Martha Nussbaum, involve a set of humanistic goals including the recognition that development is a process whereby people's…

  3. Capabilities and Special Needs

    DEFF Research Database (Denmark)

    Kjeldsen, Christian Christrup

    into international consideration in relation to the implementation of the UN convention on the rights of persons with disabilities. As for the theoretical basis, the research makes use of the sociological open-ended and relational concepts of Pierre Bourdieu and the normative yardstick of the Capability Approach...

  4. Metrology Measurement Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Glen E. Gronniger

    2007-10-02

    This document contains descriptions of Federal Manufacturing & Technologies (FM&T) Metrology capabilities, traceability flow charts, and the measurement uncertainty of each measurement capability. Metrology provides NIST-traceable precision measurements or equipment calibration for a wide variety of parameters, ranges, and state-of-the-art uncertainties. Metrology laboratories conform to the requirements of the Department of Energy Development and Production Manual Chapter 13.2, ANSI/ISO/IEC 17025:2005, and ANSI/NCSL Z540-1. FM&T Metrology laboratories are accredited by NVLAP for the parameters, ranges, and uncertainties listed in the specific scope of accreditation under NVLAP Lab code 200108-0. See the Internet at http://ts.nist.gov/Standards/scopes/2001080.pdf. These parameters are summarized. The Honeywell Federal Manufacturing & Technologies (FM&T) Metrology Department has developed measurement technology and calibration capability in four major fields of measurement: (1) Mechanical; (2) Environmental, Gas, Liquid; (3) Electrical (DC, AC, RF/Microwave); and (4) Optical and Radiation. Metrology Engineering provides the expertise to develop measurement capabilities for virtually any type of measurement in the fields listed above. A strong audit function has been developed to provide a means to evaluate the calibration programs of our suppliers and internal calibration organizations. Evaluation includes measurement audits and technical surveys.

  6. Capitalizing on capabilities.

    Science.gov (United States)

    Ulrich, Dave; Smallwood, Norm

    2004-06-01

    By making the most of organizational capabilities--employees' collective skills and fields of expertise--you can dramatically improve your company's market value. Although there is no magic list of proficiencies that every organization needs in order to succeed, the authors identify 11 intangible assets that well-managed companies tend to have: talent, speed, shared mind-set and coherent brand identity, accountability, collaboration, learning, leadership, customer connectivity, strategic unity, innovation, and efficiency. Such companies typically excel in only three of these capabilities while maintaining industry parity in the other areas. Organizations that fall below the norm in any of the 11 are likely candidates for dysfunction and competitive disadvantage. So you can determine how your company fares in these categories (or others, if the generic list doesn't suit your needs), the authors explain how to conduct a "capabilities audit," describing in particular the experiences and findings of two companies that recently performed such audits. In addition to highlighting which intangible assets are most important given the organization's history and strategy, this exercise will gauge how well your company delivers on its capabilities and will guide you in developing an action plan for improvement. A capabilities audit can work for an entire organization, a business unit, or a region--indeed, for any part of a company that has a strategy to generate financial or customer-related results. It enables executives to assess overall company strengths and weaknesses, senior leaders to define strategy, midlevel managers to execute strategy, and frontline leaders to achieve tactical results. In short, it helps turn intangible assets into concrete strengths.

  7. Improving Power Converter Reliability

    DEFF Research Database (Denmark)

    Ghimire, Pramod; de Vega, Angel Ruiz; Beczkowski, Szymon

    2014-01-01

    The real-time junction temperature monitoring of a high-power insulated-gate bipolar transistor (IGBT) module is important to increase the overall reliability of power converters for industrial applications. This article proposes a new method to measure the on-state collector-emitter voltage of a high-power IGBT module during converter operation, which may play a vital role in improving the reliability of the power converters. The measured voltage is used to estimate the module average junction temperature of the high and low-voltage side of a half-bridge IGBT separately in every fundamental...... is measured in a wind power converter at a low fundamental frequency. To illustrate more, the test method as well as the performance of the measurement circuit are also presented. This measurement is also useful to indicate failure mechanisms such as bond wire lift-off and solder layer degradation...

  8. As reliable as the sun

    Science.gov (United States)

    Leijtens, J. A. P.

    2017-11-01

    Fortunately there is almost nothing as reliable as the sun, which can consequently be utilized as a very reliable source of spacecraft power. In order to harvest this power, the solar panels have to be pointed towards the sun as accurately and reliably as possible. To this end, sunsensors are available on almost every satellite to support vital sun-pointing capability throughout the mission, even in the deployment and safe mode phases of the satellite's life. Given the criticality of the application, one would expect that after more than 50 years of sun sensor utilisation, such sensors would be fully matured and optimised. In actual fact though, the majority of sunsensors employed are still coarse sunsensors, which have a proven extreme reliability but present major issues regarding albedo sensitivity and pointing accuracy.

  9. Reliability and Maintainability (RAM) Training

    Science.gov (United States)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Packard, Michael H. (Editor)

    2000-01-01

    The theme of this manual is failure physics: the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low-cost reliable products. In a broader sense the manual should do more. It should underscore the urgent need for mature attitudes toward reliability. Five of the chapters were originally presented as a classroom course to over 1000 Martin Marietta engineers and technicians. Another four chapters and three appendixes have been added. We begin with a view of reliability from the years 1940 to 2000. Chapter 2 starts the training material with a review of mathematics and a description of what elements contribute to product failures. The remaining chapters elucidate basic reliability theory and the disciplines that allow us to control and eliminate failures.

  10. NDE reliability and probability of detection (POD) evolution and paradigm shift

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Surendra [NDE Engineering, Materials and Process Engineering, Honeywell Aerospace, Phoenix, AZ 85034 (United States)

    2014-02-18

    The subject of NDE reliability and POD has gone through multiple phases since its humble beginning in the late 1960s. This was followed by several programs, including the important one nicknamed “Have Cracks – Will Travel”, or in short “Have Cracks”, by Lockheed Georgia Company for the US Air Force during 1974–1978. This and other studies ultimately led to a series of developments in the field of reliability and POD, starting from the introduction of fracture mechanics and Damage Tolerant Design (DTD), to the statistical framework of Berens and Hovey in 1981 for POD estimation, to MIL-HDBK-1823 (1999) and 1823A (2009). During the last decade, various groups and researchers have further studied reliability and POD using Model Assisted POD (MAPOD), Simulation Assisted POD (SAPOD), and Bayesian statistics. Each of these developments had one objective, i.e., improving the accuracy of life prediction in components, which to a large extent depends on the reliability and capability of NDE methods. It is therefore essential to have reliable detection and sizing of large flaws in components. Currently, POD is used for studying the reliability and capability of NDE methods, though POD data offer no absolute truth regarding NDE reliability, i.e., system capability, effects of flaw morphology, and quantifying the human factors. Furthermore, reliability and POD have often been treated as synonymous, but POD is not NDE reliability. POD is a subset of reliability, which consists of six phases: 1) sample selection using DOE, 2) NDE equipment setup and calibration, 3) System Measurement Evaluation (SME) including Gage Repeatability and Reproducibility (Gage R and R) and Analysis Of Variance (ANOVA), 4) NDE system capability and electronic and physical saturation, 5) acquiring and fitting data to a model, and data analysis, and 6) POD estimation. This paper provides an overview of all major POD milestones for the last several decades and discusses the rationale for using
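
    As a concrete instance of phases 5 and 6, the sketch below fits the standard log-probit hit/miss POD model (the form described in MIL-HDBK-1823A) by maximum likelihood to invented inspection data and extracts a90, the flaw size detected 90% of the time; the data and starting values are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    # Hit/miss data: flaw sizes (mm) and detection outcomes (1 = hit), synthetic
    a = np.array([0.5, 0.8, 1.0, 1.2, 1.5, 2.0, 2.5, 3.0, 4.0, 5.0])
    hit = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

    def nll(theta):
        mu, sigma = theta
        p = np.clip(norm.cdf((np.log(a) - mu) / sigma), 1e-9, 1 - 1e-9)  # POD(a), log-probit
        return -np.sum(hit * np.log(p) + (1 - hit) * np.log(1 - p))

    res = minimize(nll, x0=[np.log(1.5), 0.5], method="Nelder-Mead")
    mu, sigma = res.x
    a90 = np.exp(mu + sigma * norm.ppf(0.90))   # flaw size with 90% probability of detection
    print(f"a90 = {a90:.2f} mm")
    ```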

  11. Prediction of software operational reliability using testing environment factors

    International Nuclear Information System (INIS)

    Jung, Hoan Sung; Seong, Poong Hyun

    1995-01-01

    A number of software reliability models have been developed to estimate and to predict software reliability. However, there are no established standard models to quantify software reliability. Most models estimate the quality of software in reliability figures such as remaining faults, failure rate, or mean time to next failure at the testing phase, and they consider them ultimate indicators of software reliability. Experience shows that there is a large gap between predicted reliability during development and reliability measured during operation, which means that predicted reliability, or so-called test reliability, is not operational reliability. Customers prefer operational reliability to test reliability. In this study, we propose a method that predicts operational reliability rather than test reliability by introducing the testing environment factor that quantifies the changes in environments

  12. Atmospheric Release Advisory Capability

    International Nuclear Information System (INIS)

    Dickerson, M.H.; Gudiksen, P.H.; Sullivan, T.J.

    1983-02-01

    The Atmospheric Release Advisory Capability (ARAC) project is a Department of Energy (DOE) sponsored real-time emergency response service available for use by both federal and state agencies in case of a potential or actual atmospheric release of nuclear material. The project, initiated in 1972, is currently evolving from the research and development phase to full operation. Plans are underway to expand the existing capability to continuous operation by 1984 and to establish a National ARAC Center (NARAC) by 1988. This report describes the ARAC system, its utilization during the past two years, and plans for its expansion during the next five to six years. An integral part of this expansion is due to a very important and crucial effort sponsored by the Defense Nuclear Agency to extend the ARAC service to approximately 45 Department of Defense (DOD) sites throughout the continental US over the next three years

  13. Group Capability Model

    Science.gov (United States)

    Olejarski, Michael; Appleton, Amy; Deltorchio, Stephen

    2009-01-01

    The Group Capability Model (GCM) is a software tool that allows an organization, from first-line management to senior executive, to monitor and track the health (capability) of various groups in performing their contractual obligations. GCM calculates a Group Capability Index (GCI) by comparing actual head counts, certifications, and/or skills within a group. The model can also be used to simulate the effects of employee usage, training, and attrition on the GCI. A universal tool and common method was required due to the high risk of losing skills necessary to complete the Space Shuttle Program and meet the needs of the Constellation Program. During this transition from one space vehicle to another, the uncertainty among the critical skilled workforce is high and attrition has the potential to be unmanageable. GCM allows managers to establish requirements for their group in the form of head counts, certification requirements, or skills requirements. GCM then calculates a Group Capability Index (GCI), where a score of 1 indicates that the group is at the appropriate level; anything less than 1 indicates a potential for improvement. This shows the health of a group, both currently and over time. GCM accepts as input head count, certification needs, critical needs, competency needs, and competency critical needs. In addition, team members are categorized by years of experience, percentage of contribution, ex-members and their skills, availability, function, and in-work requirements. Outputs are several reports, including actual vs. required head count, actual vs. required certificates, GCI change over time (by month), and more. The program stores historical data for summary and historical reporting, which is done via an Excel spreadsheet that is color-coded to show health statistics at a glance. GCM has provided the Shuttle Ground Processing team with a quantifiable, repeatable approach to assessing and managing the skills in their organization. They now have a common
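
    The record does not give the GCI formula; below is a minimal sketch of one plausible scoring rule (per-category ratios of actual to required, capped at 1 and averaged), with category names and numbers invented for illustration.

    ```python
    def gci(actual, required):
        # Ratio of actual to required per category, capped at 1, then averaged
        ratios = [min(actual[k] / required[k], 1.0) for k in required]
        return sum(ratios) / len(ratios)

    required = {"headcount": 12, "certifications": 10, "critical_skills": 6}
    actual   = {"headcount": 11, "certifications": 8,  "critical_skills": 6}
    print(f"GCI = {gci(actual, required):.2f}")   # < 1 flags potential for improvement
    ```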

  14. Expeditionary Rubber Removal Capability

    Science.gov (United States)

    2006-12-31

    [Abstract not available. The extracted text consists of table-of-contents and specification fragments: a recommendation that a pressure sensor or caster wheels be incorporated into the modified spray unit or a system with equivalent capabilities; section headings (Discussion, Conclusions, Recommendations, Appendix A: Detailed List of Equipment and Modifications, Appendix B: List of Sources); and equipment specifications: weight 4820 lb (no attachments), top speed 18 mph, optional high-flow hydraulics at 26 gpm, all-wheel steering.]

  15. Atmospheric release advisory capability

    International Nuclear Information System (INIS)

    Sullivan, T.J.

    1981-01-01

    The ARAC system (Atmospheric Release Advisory Capability) is described. The system is a collection of people, computers, computer models, topographic data and meteorological input data that together permit a calculation, in a quasi-predictive sense, of where effluent from an accident will migrate through the atmosphere, where it will be deposited on the ground, and what instantaneous and integrated dose an exposed individual would receive.

  16. BURAR: Detection and signal processing capabilities

    International Nuclear Information System (INIS)

    Ghica, Daniela; Radulian, Mircea; Popa, Mihaela

    2004-01-01

    Since July 2002, a new seismic monitoring station, the Bucovina Seismic Array (BURAR), has been installed in the northern part of Romania through a joint effort of the Air Force Technical Applications Center, USA, and the National Institute for Earth Physics (NIEP), Romania. The array consists of 10 seismic sensors (9 short-period and one broadband) located in boreholes and distributed over a 5 x 5 km² area. At present, the seismic data are continuously recorded by BURAR and transmitted in real time to the Romanian National Data Centre (ROMNDC) in Bucharest and to the National Data Center of the USA in Florida. The statistical analysis of the seismic information gathered at ROMNDC by BURAR in the August 2002 - December 2003 time interval points to a much better efficiency of the BURAR system in detecting teleseismic events and local events that occurred in the N-NE part of Romanian territory, in comparison with the existing Romanian Telemetered Network. Furthermore, the BURAR monitoring system has proven to be an important source of reliable data for NIEP's efforts in issuing seismic bulletins. The signal processing capability of the system provides useful information for improving the location of local seismic events using the array beamforming procedure. This method significantly increases the signal-to-noise ratio by summing the coherent signals from the array components. In this way, possible source nucleation phases can be detected. At the same time, using the slowness and back azimuth estimates from f-k analysis, locations for seismic events can be established based only on the information recorded by the BURAR array, acting as a single-station recording system. (authors)
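
    The beamforming step described here can be sketched with a delay-and-sum stack: each trace is time-shifted according to the plane-wave slowness and the shifted traces are averaged, so coherent energy adds while noise partially cancels. This is a generic illustration, not the BURAR processing chain; the geometry, slowness, and data are synthetic.

```python
# Generic delay-and-sum beamforming sketch (not the BURAR chain): shift
# each trace by the plane-wave delay s.r and stack, so coherent signal
# adds while incoherent noise partly cancels (SNR gain ~ sqrt(nsta)).
import numpy as np

def beamform(traces, positions, slowness, dt):
    """Align traces for a given slowness vector and average them."""
    beam = np.zeros(traces.shape[1])
    for k in range(traces.shape[0]):
        shift = int(round(positions[k] @ slowness / dt))  # delay in samples
        beam += np.roll(traces[k], -shift)
    return beam / traces.shape[0]

# Synthetic test: 4 stations, one coherent pulse plus noise.
rng = np.random.default_rng(0)
dt, npts = 0.01, 2000
positions = np.array([[0.0, 0.0], [1000, 0], [0, 1000], [1000, 1000]])  # metres
slowness = np.array([2.0e-4, 1.0e-4])   # s/m; from f-k analysis in practice
pulse = np.exp(-0.5 * ((np.arange(npts) - 1000) * dt / 0.1) ** 2)
traces = np.empty((4, npts))
for k in range(4):
    shift = int(round(positions[k] @ slowness / dt))
    traces[k] = np.roll(pulse, shift) + 0.5 * rng.standard_normal(npts)

beam = beamform(traces, positions, slowness, dt)  # cleaner pulse than any trace
```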

  17. The evolution of alliance capabilities

    NARCIS (Netherlands)

    Heimeriks, K.H.; Duysters, G.M.; Vanhaverbeke, W.P.M.

    2004-01-01

    This paper assesses the effectiveness and differential performance effects of learning mechanisms on the evolution of alliance capabilities. Relying on the concept of capability lifecycles, prior research has suggested that different capability levels could be identified in which different

  18. Building Server Capabilities

    DEFF Research Database (Denmark)

    Adeyemi, Oluseyi

    2013-01-01

    Many western companies have moved part of their operations to China in order to take advantage of cheap resources and/or to gain access to a high potential market. Depending on motive, offshore facilities usually start either as “sales-only” of products exported by headquarters or “production-only”, exporting parts and components back to headquarters for sales in the home country. In the course of time, the role of offshore subsidiaries in a company’s operations network tends to change and, with that, the capabilities of the subsidiaries. Focusing on Danish subsidiaries in China, the objective...

  20. Histogram score contributes for reliability of DNA content estimatives in Brachiaria spp

    Directory of Open Access Journals (Sweden)

    Ana Luiza de Oliveira Timbó

    2012-12-01

    Full Text Available Flow cytometry allows the DNA content of a large number of plants to be estimated quickly. However, inadequate protocols can compromise the reliability of these estimates, leading to variations in the DNA content values reported for the same species. The objective of this study was to propose an efficient protocol for estimating the DNA content of Brachiaria spp. genotypes with different ploidy levels using flow cytometry. We evaluated four genotypes (B. ruziziensis, diploid and artificially tetraploidized; a tetraploid B. brizantha; and a natural triploid hybrid), three buffer solutions (MgSO4, Galbraith and Tris-HCl) and three species as internal reference standards (Raphanus sativus, Solanum lycopersicum and Pisum sativum). The variables measured were: histogram score (1-5), coefficient of variation and estimated DNA content. The best combination for the analysis of Brachiaria spp. DNA content was the MgSO4 buffer with R. sativus as the internal reference standard. Genome sizes expressed in picograms of DNA are presented for all genotypes, and the importance of the histogram score for the reliability of DNA content analyses is discussed.

  1. Laboratory microfusion capability study

    International Nuclear Information System (INIS)

    1993-05-01

    The purpose of this study is to elucidate the issues involved in developing a Laboratory Microfusion Capability (LMC), which is the major objective of the Inertial Confinement Fusion (ICF) program within the purview of the Department of Energy's Defense Programs. The study was initiated to support a number of DOE management needs: to provide insight for the evolution of the ICF program; to afford guidance to the ICF laboratories in planning their research and development programs; to inform Congress and others of the details and implications of the LMC; to identify criteria for selection of a concept for the Laboratory Microfusion Facility and to develop a coordinated plan for the realization of an LMC. As originally proposed, the LMC study was divided into two phases. The first phase identifies the purpose and potential utility of the LMC, the regime of its performance parameters, driver-independent design issues and requirements, its development goals and requirements, and associated technical, management, staffing, environmental, and other developmental and operational issues. The second phase addresses driver-dependent issues such as specific design, range of performance capabilities, and cost. The study includes four driver options: the neodymium-glass solid state laser, the krypton fluoride excimer gas laser, the light-ion accelerator, and the heavy-ion induction linear accelerator. The results of the Phase II study are described in the present report.

  2. Binding free energy predictions of farnesoid X receptor (FXR) agonists using a linear interaction energy (LIE) approach with reliability estimation: application to the D3R Grand Challenge 2

    Science.gov (United States)

    Rifai, Eko Aditya; van Dijk, Marc; Vermeulen, Nico P. E.; Geerke, Daan P.

    2018-01-01

    Computational protein binding affinity prediction can play an important role in drug research but performing efficient and accurate binding free energy calculations is still challenging. In the context of phase 2 of the Drug Design Data Resource (D3R) Grand Challenge 2 we used our automated eTOX ALLIES approach to apply the (iterative) linear interaction energy (LIE) method and we evaluated its performance in predicting binding affinities for farnesoid X receptor (FXR) agonists. Efficiency was obtained by our pre-calibrated LIE models and molecular dynamics (MD) simulations at the nanosecond scale, while predictive accuracy was obtained for a small subset of compounds. Using our recently introduced reliability estimation metrics, we could classify predictions with higher confidence by featuring an applicability domain (AD) analysis in combination with protein-ligand interaction profiling. The outcomes of and agreement between our AD and interaction-profile analyses to distinguish and rationalize the performance of our predictions highlighted the relevance of sufficiently exploring protein-ligand interactions during training and it demonstrated the possibility to quantitatively and efficiently evaluate if this is achieved by using simulation data only.

  3. OPSAID improvements and capabilities report.

    Energy Technology Data Exchange (ETDEWEB)

    Halbgewachs, Ronald D.; Chavez, Adrian R.

    2011-08-01

    Process Control System (PCS) and Industrial Control System (ICS) security is critical to our national security. But there are a number of technological, economic, and educational impediments to PCS owners implementing effective security on their systems. Sandia National Laboratories has performed the research and development of the OPSAID (Open PCS Security Architecture for Interoperable Design), a project sponsored by the US Department of Energy Office of Electricity Delivery and Energy Reliability (DOE/OE), to address this issue. OPSAID is an open-source architecture for PCS/ICS security that provides a design basis for vendors to build add-on security devices for legacy systems, while providing a path forward for the development of inherently-secure PCS elements in the future. Using standardized hardware, a proof-of-concept prototype system was also developed. This report describes the improvements and capabilities that have been added to OPSAID since an initial report was released. Testing and validation of this architecture has been conducted in another project, Lemnos Interoperable Security Project, sponsored by DOE/OE and managed by the National Energy Technology Laboratory (NETL).

  4. Test Reliability at the Individual Level

    Science.gov (United States)

    Hu, Yueqin; Nesselroade, John R.; Erbacher, Monica K.; Boker, Steven M.; Burt, S. Alexandra; Keel, Pamela K.; Neale, Michael C.; Sisk, Cheryl L.; Klump, Kelly

    2016-01-01

    Reliability has a long history as one of the key psychometric properties of a test. However, a given test might not measure people equally reliably. Test scores from some individuals may have considerably greater error than others. This study proposed two approaches using intraindividual variation to estimate test reliability for each person. A simulation study suggested that the parallel tests approach and the structural equation modeling approach recovered the simulated reliability coefficients. Then in an empirical study, where forty-five females were measured daily on the Positive and Negative Affect Schedule (PANAS) for 45 consecutive days, separate estimates of reliability were generated for each person. Results showed that reliability estimates of the PANAS varied substantially from person to person. The methods provided in this article apply to tests measuring changeable attributes and require repeated measures across time on each individual. This article also provides a set of parallel forms of PANAS. PMID:28936107
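
    The parallel-tests approach can be sketched for one person as follows: score two parallel half-forms on each measurement day, correlate the half-scores across days, and step the correlation up with the Spearman-Brown formula. The half-split and data below are hypothetical, not the PANAS forms from the study.

```python
# Parallel-tests sketch for person-specific reliability: correlate two
# parallel half-form scores across repeated days for one person, then
# apply the Spearman-Brown step-up. Data and the split are hypothetical.
import numpy as np

def person_reliability(half_a, half_b):
    r = np.corrcoef(half_a, half_b)[0, 1]  # parallel-forms correlation
    return 2 * r / (1 + r)                 # Spearman-Brown for the full form

rng = np.random.default_rng(1)
true_affect = rng.normal(0, 1, size=45)          # latent daily state, 45 days
half_a = true_affect + rng.normal(0, 0.5, 45)    # half-form score with error
half_b = true_affect + rng.normal(0, 0.5, 45)
print(f"estimated reliability = {person_reliability(half_a, half_b):.2f}")
```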

  5. Reliable computation from contextual correlations

    Science.gov (United States)

    Oestereich, André L.; Galvão, Ernesto F.

    2017-12-01

    An operational approach to the study of computation based on correlations considers black boxes with one-bit inputs and outputs, controlled by a limited classical computer capable only of performing sums modulo-two. In this setting, it was shown that noncontextual correlations do not provide any extra computational power, while contextual correlations were found to be necessary for the deterministic evaluation of nonlinear Boolean functions. Here we investigate the requirements for reliable computation in this setting; that is, the evaluation of any Boolean function with success probability bounded away from 1 /2 . We show that bipartite CHSH quantum correlations suffice for reliable computation. We also prove that an arbitrarily small violation of a multipartite Greenberger-Horne-Zeilinger noncontextuality inequality also suffices for reliable computation.

  6. Development of Nuclear Analysis Capabilities for DOE Waste Management Activities

    International Nuclear Information System (INIS)

    Parks, C.V.; Rearden, B.T.; Broadhead, B.L.; Petrie, L.M.; DeHart, M.D.; Hopper, C.M.

    2000-01-01

    The objective of this project is to develop and demonstrate prototypical analysis capabilities that can be used by nuclear safety analysis practitioners to: (1) demonstrate a more thorough understanding of the underlying physics phenomena that can lead to improved reliability and defensibility of safety evaluations; and (2) optimize operations related to the handling, storage, transportation, and disposal of fissile material and DOE spent fuel. To address these problems, this project has been investigating the implementation of sensitivity and uncertainty methods within existing Monte Carlo codes used for criticality safety analyses. It is also investigating the use of a new deterministic code that allows for specification of arbitrary grids to accurately model geometric details required in a criticality safety analysis. This capability can facilitate improved estimations of the required subcritical margin and potentially enable the use of a broader range of experiments in the validation process. The new arbitrary grid radiation transport code will also enable detailed geometric modeling valuable for improved accuracy in application to a myriad of other problems related to waste characterization. Application to these problems will also be explored.

  7. Menstrual cycle length: a surrogate measure of reproductive health capable of improving the accuracy of biochemical/sonographical ovarian reserve test in estimating the reproductive chances of women referred to ART.

    Science.gov (United States)

    Gizzo, Salvatore; Andrisani, Alessandra; Noventa, Marco; Quaranta, Michela; Esposito, Federica; Armanini, Decio; Gangemi, Michele; Nardelli, Giovanni B; Litta, Pietro; D'Antona, Donato; Ambrosini, Guido

    2015-04-10

    The aim of the study was to investigate whether menstrual cycle length may be considered a surrogate measure of reproductive health, improving the accuracy of biochemical/sonographic ovarian reserve tests in estimating the reproductive chances of women referred to ART. A retrospective observational study was conducted in Padua's public tertiary-level Centre. It included a total of 455 normo-ovulatory infertile women scheduled for their first fresh non-donor IVF/ICSI treatment. The mean menstrual cycle length (MCL) during the preceding 6 months was calculated by physicians from information in our electronic database (the first day of each menstrual cycle was collected monthly from each patient by telephone). We evaluated the relations between MCL, ovarian response to the stimulation protocol, oocyte fertilization ratio, ovarian sensitivity index (OSI) and pregnancy rate in different cohorts of patients according to age class and estimated ovarian reserve. In women younger than 35 years, MCL over 31 days may be associated with an increased risk of OHSS and with a good OSI. In women older than 35 years, and particularly older than 40 years, MCL shortening may be considered a marker of ovarian aging and may be associated with poor ovarian response, low OSI and a reduced fertilization rate. When the AMH serum value is lower than 1.1 ng/ml in patients older than 40 years, MCL may help clinicians discriminate real from expected poor responders. In the pool of normoresponders, MCL was not correlated with pregnancy rate, while a positive association was found with patients' age. An MCL diary is more predictive than chronological age in estimating ovarian biological age and response to COH, and it is more predictive than AMH in discriminating expected from real poor responders. In women older than 35 years, MCL shortening may be considered a marker of ovarian aging, while chronological age remains the most accurate parameter in predicting pregnancy.

  8. Reliability Analysis of Wind Turbines

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard

    2008-01-01

    In order to minimise the total expected life-cycle costs of a wind turbine it is important to estimate the reliability level for all components in the wind turbine. This paper deals with reliability analysis for the tower and blades of onshore wind turbines placed in a wind farm. The limit states considered are, in the ultimate limit state (ULS), extreme conditions in the standstill position and extreme conditions during operation. For wind turbines, where the magnitude of the loads is influenced by the control system, the ultimate limit state can occur in both cases. In the fatigue limit state (FLS) the reliability level for a wind turbine placed in a wind farm is considered, and wake effects from neighbouring wind turbines are taken into account. An illustrative example with calculation of the reliability for mudline bending of the tower is considered. In the example the design is determined according...

  9. Fatigue Reliability under Random Loads

    DEFF Research Database (Denmark)

    Talreja, R.

    1979-01-01

    We consider the problem of estimating the probability of survival (non-failure) and the probability of safe operation (strength greater than a limiting value) of structures subjected to random loads. These probabilities are formulated in terms of the probability distributions of the loads...... propagation stage. The consequences of this behaviour on the fatigue reliability are discussed....
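
    The survival probability referred to here is the classical stress-strength form; in our notation (not necessarily the paper's), with random strength S and load L:

```latex
% Stress-strength survival probability (standard textbook form, our notation):
R \;=\; P(S > L) \;=\; \int_{0}^{\infty} F_L(s)\, f_S(s)\, \mathrm{d}s ,
\qquad
R \;=\; \Phi\!\left( \frac{\mu_S - \mu_L}{\sqrt{\sigma_S^{2} + \sigma_L^{2}}} \right)
\quad \text{for independent normal } S,\ L .
```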

  10. Aircraft Capability Management

    Science.gov (United States)

    Mumaw, Randy; Feary, Mike

    2018-01-01

    This presentation presents an overview of work performed at NASA Ames Research Center in 2017. The work concerns the analysis of current aircraft system management displays, and the initial development of an interface for providing information about aircraft system status. The new interface proposes a shift away from current aircraft system alerting interfaces that report the status of physical components, and towards displaying the implications of degradations on mission capability. The proposed interface describes these component failures in terms of operational consequences of aircraft system degradations. The research activity was an effort to examine the utility of different representations of complex systems and operating environments to support real-time decision making of off-nominal situations. A specific focus was to develop representations that provide better integrated information to allow pilots to more easily reason about the operational consequences of the off-nominal situations. The work is also seen as a pathway to autonomy, as information is integrated and understood in a form that automated responses could be developed for the off-nominal situations in the future.

  11. LHC Capabilities for Quarkonia

    CERN Document Server

    Petrushanko, Sergey

    2008-01-01

    The measurement of the charmonium and bottomonium resonances in nucleus-nucleus collisions provides crucial information on high-density QCD matter. First, the suppression of quarkonia production is generally agreed to be one of the most direct probes of quark-gluon plasma formation. The observation of anomalous J/$\psi$ suppression at the CERN-SPS and at RHIC is well established but the clarification of some important remaining questions requires equivalent studies of the $\Upsilon$ family, only possible at the LHC energies. Second, the production of heavy-quarks proceeds mainly via gluon-gluon fusion processes and, as such, is sensitive to saturation of the gluon density at low-x in the nucleus. Measured departures from the expected vacuum quarkonia cross-sections in Pb+Pb collisions at the LHC will thus provide valuable information not only on the thermodynamical state of the produced partonic medium, but also on the initial-state modifications of the nuclear parton distribution functions. The capabilities ...

  12. Strength capability while kneeling.

    Science.gov (United States)

    Haslegrave, C M; Tracy, M F; Corlett, E N

    1997-12-01

    Work sometimes has to be carried out kneeling, particularly where jobs are performed in confined spaces as is common for miners, aircraft baggage handlers and maintenance workers. In order to assess the risks in performing forceful tasks under such conditions, data is needed on strength capabilities of kneeling subjects. A study was undertaken to measure isometric strength in single-handed exertions for male subjects and to investigate the effects on this of task layout factors (direction of force exertion, reach distance, height of the workpiece and orientation relative to the subject's sagittal plane). The data has been tabulated to show the degree to which strength may be reduced in different situations and analysis of the task factors showed their influence to be complex with direction of exertion and reach distance having the greatest effect. The results also suggest that exertions are weaker when subjects are kneeling on two knees than when kneeling on one knee, although this needs to be confirmed by direct experimental comparison.

  13. Large Sample Confidence Intervals for Item Response Theory Reliability Coefficients

    Science.gov (United States)

    Andersson, Björn; Xin, Tao

    2018-01-01

    In applications of item response theory (IRT), an estimate of the reliability of the ability estimates or sum scores is often reported. However, analytical expressions for the standard errors of the estimators of the reliability coefficients are not available in the literature and therefore the variability associated with the estimated reliability…

  14. Dynamic reliability and risk assessment of the accident localization system of the Ignalina NPP RBMK-1500 reactor

    International Nuclear Information System (INIS)

    Kopustinskas, V.; Augutis, J.; Rimkevicius, S.

    2005-01-01

    The paper presents reliability and risk analysis of the RBMK-1500 reactor accident localization system (ALS) (confinement), which prevents radioactive releases to the environment. Reliability of the system was estimated and compared by two methods: the conventional fault tree method and an innovative dynamic reliability model, based on stochastic differential equations. Frequency of radioactive release through the ALS was also estimated. The results of the study indicate that conventional fault tree modeling techniques in this case apply a high degree of conservatism in the system reliability estimates. One of the purposes of the ALS reliability study was to demonstrate the advantages of dynamic reliability analysis over the conventional fault/event tree methods. A Markovian framework to deal with the dynamic aspects of system behavior is presented. Although not analyzed in detail, the framework is also capable of accounting for non-constant component failure rates. Computational methods are proposed to solve the stochastic differential equations, including an analytical solution, which is possible only for relatively small and simple systems. Other numerical methods, such as Monte Carlo and numerical schemes for differential equations, are analyzed and compared. The study is finalized with concluding remarks regarding both the studied system reliability and the computational methods used.
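
    The contrast between a static estimate and a Markovian dynamic model can be illustrated on the smallest possible example, a two-state repairable component: the Kolmogorov equation has a closed-form availability, which a crude Monte Carlo reproduces. The rates and mission time below are invented, not ALS data.

```python
# Smallest dynamic-reliability example: a two-state repairable component
# as a Markov process with failure rate lam and repair rate mu. The
# Kolmogorov equation gives A(t) in closed form; a crude Monte Carlo
# over alternating exponential up/down times checks it.
import numpy as np

lam, mu, T = 1e-3, 1e-1, 5000.0   # per-hour rates, mission time in hours

def availability(t):
    s = lam + mu
    return mu / s + (lam / s) * np.exp(-s * t)

rng = np.random.default_rng(2)
n, up_at_T = 20000, 0
for _ in range(n):
    t, up = 0.0, True
    while t < T:
        t += rng.exponential(1 / lam if up else 1 / mu)
        if t < T:
            up = not up                 # state changes before mission end
    up_at_T += up                       # state held at time T
print(f"analytic A(T) = {availability(T):.4f}, Monte Carlo = {up_at_T / n:.4f}")
```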

  15. AMSAA Reliability Growth Guide

    National Research Council Canada - National Science Library

    Broemm, William

    2000-01-01

    ... has developed reliability growth methodology for all phases of the process, from planning to tracking to projection. The report presents this methodology and associated reliability growth concepts.

  16. Emergency diesel generator reliability program

    International Nuclear Information System (INIS)

    Serkiz, A.W.

    1989-01-01

    The need for an emergency diesel generator (EDG) reliability program has been established by 10 CFR Part 50, Section 50.63, Loss of All Alternating Current Power, which requires that utilities assess their station blackout duration and recovery capability. EDGs are the principal emergency ac power sources for coping with a station blackout. Regulatory Guide 1.155, Station Blackout, identifies a need for (1) an EDG reliability equal to or greater than 0.95, and (2) an EDG reliability program to monitor and maintain the required levels. The resolution of Generic Safety Issue (GSI) B-56 embodies the identification of a suitable EDG reliability program structure, revision of pertinent regulatory guides and Tech Specs, and development of an Inspection Module. Resolution of B-56 is coupled to the resolution of Unresolved Safety Issue (USI) A-44, Station Blackout, which resulted in the station blackout rule, 10 CFR 50.63 and Regulatory Guide 1.155, Station Blackout. This paper discusses the principal elements of an EDG reliability program developed for resolving GSI B-56 and related matters

  17. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
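
    A minimal sketch of the single-loop idea, under assumed distributions: the epistemically uncertain distribution parameter and the aleatory variables are sampled together in one Monte Carlo pass, so no nested loop is needed.

```python
# Single-loop Monte Carlo sketch mixing aleatory and epistemic inputs:
# the mean of the capacity distribution is itself uncertain (sparse
# data), so each sample draws the parameter and the variable together.
# All distributions and numbers are illustrative, not the paper's.
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

mu_R = rng.normal(10.0, 0.5, n)        # epistemic: uncertain mean capacity
R = rng.normal(mu_R, 1.0)              # aleatory: capacity given its mean
S = rng.lognormal(mean=1.8, sigma=0.2, size=n)   # aleatory demand

pf = np.mean(R - S < 0.0)              # limit state g = R - S, fail if g < 0
print(f"failure probability = {pf:.4f}")
```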

  18. A reliability simulation language for reliability analysis

    International Nuclear Information System (INIS)

    Deans, N.D.; Miller, A.J.; Mann, D.P.

    1986-01-01

    The results of work being undertaken to develop a Reliability Description Language (RDL) which will enable reliability analysts to describe complex reliability problems in a simple, clear and unambiguous way are described. Component and system features can be stated in a formal manner and subsequently used, along with control statements to form a structured program. The program can be compiled and executed on a general-purpose computer or special-purpose simulator. (DG)

  19. Solid State Lighting Reliability Components to Systems

    CERN Document Server

    Fan, XJ

    2013-01-01

    Solid State Lighting Reliability: Components to Systems begins with an explanation of the major benefits of solid state lighting (SSL) when compared to conventional lighting systems including but not limited to long useful lifetimes of 50,000 (or more) hours and high efficacy. When designing effective devices that take advantage of SSL capabilities the reliability of internal components (optics, drive electronics, controls, thermal design) take on critical importance. As such a detailed discussion of reliability from performance at the device level to sub components is included as well as the integrated systems of SSL modules, lamps and luminaires including various failure modes, reliability testing and reliability performance. This book also: Covers the essential reliability theories and practices for current and future development of Solid State Lighting components and systems Provides a systematic overview for not only the state-of-the-art, but also future roadmap and perspectives of Solid State Lighting r...

  20. Sandia Laboratories technical capabilities. Auxiliary capabilities: environmental health information science

    International Nuclear Information System (INIS)

    1975-09-01

    Sandia Laboratories is an engineering laboratory in which research, development, testing, and evaluation capabilities are integrated by program management for the generation of advanced designs. In fulfilling its primary responsibility to ERDA, Sandia Laboratories has acquired extensive research and development capabilities. The purpose of this series of documents is to catalog the many technical capabilities of the Laboratories. After the listing of capabilities, supporting information is provided in the form of highlights, which show applications. This document deals with auxiliary capabilities, in particular, environmental health and information science. (11 figures, 1 table) (RWR)

  1. Probabilistic risk assessment for a loss of coolant accident in McMaster Nuclear Reactor and application of reliability physics model for modeling human reliability

    Science.gov (United States)

    Ha, Taesung

    A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequences identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors is significantly different from that in power reactors, a time-oriented HRA model (reliability physics model) was applied for the human error probability (HEP) estimation of the core relocation. This model is based on two competing random variables: phenomenological time and performance time. Response surface and direct Monte Carlo simulation with Latin Hypercube sampling were applied for estimating the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned by statistical goodness-of-fit tests. The human error probability (HEP) for the core relocation was estimated from these two competing quantities: phenomenological time and operators' performance time. The sensitivity of each probability distribution in the human reliability estimation was investigated. In order to quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected due to its capability of incorporating uncertainties in the model itself and in its parameters. The HEP from the current time-oriented model was compared with that from the ASEP approach. Both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability and its potential
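
    The reliability-physics estimate itself reduces to the probability that performance time exceeds phenomenological time, which a plain Monte Carlo approximates directly. The lognormal parameters below are assumptions for illustration, not the dissertation's fitted distributions.

```python
# Monte Carlo sketch of the reliability-physics HEP: failure occurs when
# the operators' performance time exceeds the phenomenological time, so
# HEP = P(T_perf > T_phen). Lognormal parameters are assumed here.
import numpy as np

rng = np.random.default_rng(4)
n = 500_000

t_phen = rng.lognormal(np.log(60.0), 0.3, n)  # time available (minutes)
t_perf = rng.lognormal(np.log(30.0), 0.5, n)  # time operators need (minutes)

hep = np.mean(t_perf > t_phen)
print(f"HEP = {hep:.4f}")
```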

  2. Sandia Laboratories technical capabilities: testing

    International Nuclear Information System (INIS)

    Lundergan, C.D.

    1975-12-01

    The testing capabilities at Sandia Laboratories are characterized. Selected applications of these capabilities are presented to illustrate the extent to which they can be applied in research and development programs

  3. Sandia Laboratories technical capabilities: electronics

    International Nuclear Information System (INIS)

    Lundergan, C.D.

    1975-12-01

    This report characterizes the electronics capabilities at Sandia Laboratories. Selected applications of these capabilities are presented to illustrate the extent to which they can be applied in research and development programs

  4. Structural Capability of an Organization toward Innovation Capability

    DEFF Research Database (Denmark)

    Nielsen, Susanne Balslev; Momeni, Mostafa

    2016-01-01

    Scholars in the field of strategic management have developed two major approaches to the attainment of competitive advantage: one based on environmental opportunities, and another based on the internal capabilities of an organization. Some investigations in the last two decades have indicated that advantages relying on the internal capabilities of organizations may determine their competitive position better than environmental opportunities do. The characteristics of firms show that one of the internal capabilities that leads organizations to the strongest competitive advantage is the innovation capability. The innovation capability is associated with other organizational capabilities, and many organizations have focused on the need to identify innovation capabilities. This research focuses on recognition of the structural aspect

  5. Are Validity and Reliability "Relevant" in Qualitative Evaluation Research?

    Science.gov (United States)

    Goodwin, Laura D.; Goodwin, William L.

    1984-01-01

    The views of prominent qualitative methodologists on the appropriateness of validity and reliability estimation for the measurement strategies employed in qualitative evaluations are summarized. A case is made for the relevance of validity and reliability estimation. Definitions of validity and reliability for qualitative measurement are presented…

  6. Linear and evolutionary polynomial regression models to forecast coastal dynamics: Comparison and reliability assessment

    Science.gov (United States)

    Bruno, Delia Evelina; Barca, Emanuele; Goncalves, Rodrigo Mikosz; de Araujo Queiroz, Heithor Alexandre; Berardi, Luigi; Passarella, Giuseppe

    2018-01-01

    In this paper, the Evolutionary Polynomial Regression data modelling strategy has been applied to study small scale, short-term coastal morphodynamics, given its capability for treating a wide database of known information, non-linearly. Simple linear and multilinear regression models were also applied to achieve a balance between the computational load and reliability of estimations of the three models. In fact, even though it is easy to imagine that the more complex the model, the more the prediction improves, sometimes a "slight" worsening of estimations can be accepted in exchange for the time saved in data organization and computational load. The models' outcomes were validated through a detailed statistical, error analysis, which revealed a slightly better estimation of the polynomial model with respect to the multilinear model, as expected. On the other hand, even though the data organization was identical for the two models, the multilinear one required a simpler simulation setting and a faster run time. Finally, the most reliable evolutionary polynomial regression model was used in order to make some conjecture about the uncertainty increase with the extension of extrapolation time of the estimation. The overlapping rate between the confidence band of the mean of the known coast position and the prediction band of the estimated position can be a good index of the weakness in producing reliable estimations when the extrapolation time increases too much. The proposed models and tests have been applied to a coastal sector located nearby Torre Colimena in the Apulia region, south Italy.

  7. The Capability to Hold Property

    NARCIS (Netherlands)

    Claassen, Rutger

    2015-01-01

    This paper discusses the question of whether a capability theory of justice (such as that of Martha Nussbaum) should accept a basic “capability to hold property.” Answering this question is vital for bridging the gap between abstract capability theories of justice and their institutional

  8. Atmospheric Release Advisory Capability (ARAC)

    International Nuclear Information System (INIS)

    Dickerson, M.H.

    1975-01-01

    The chief purpose of the ARAC data acquisition program is to provide site officials, who are responsible for ensuring maximum health protection for endangered site personnel and the public, with estimates of the effects of atmospheric releases of hazardous material as rapidly and accurately as possible. ARAC is in the initial stages of implementation and is therefore susceptible to change before it reaches its final form. However, the concept of ARAC is fully developed and was successfully demonstrated during a feasibility study conducted in June 1974 as a joint effort between the Savannah River Laboratory (SRL) and Lawrence Livermore Laboratory (LLL). Additional tests between SRL and LLL are scheduled for December 1975. While the immediate goal is the application of ARAC to assist a limited number of ERDA sites, the system is designed with sufficient flexibility to permit expanding the service to a large number of sites. Success in ARAC application should provide nuclear facilities with a means to better handle the urgent questions concerning the potential accidental hazards from atmospheric releases, in addition to providing the sites with a capability to assess the effects of their normal operations

  9. Capability-based computer systems

    CERN Document Server

    Levy, Henry M

    2014-01-01

    Capability-Based Computer Systems focuses on computer programs and their capabilities. The text first elaborates capability- and object-based system concepts, including capability-based systems, object-based approach, and summary. The book then describes early descriptor architectures and explains the Burroughs B5000, Rice University Computer, and Basic Language Machine. The text also focuses on early capability architectures. Dennis and Van Horn's Supervisor; CAL-TSS System; MIT PDP-1 Timesharing System; and Chicago Magic Number Machine are discussed. The book then describes Plessey System 25

  10. Lifetime Reliability Assessment of Concrete Slab Bridges

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    A procedure for lifetime assessment of the reliability of short concrete slab bridges is presented in the paper. Corrosion of the reinforcement is the deterioration mechanism used for estimating the reliability profiles for such bridges. The importance of using sensitivity measures is stressed. ... Finally, the procedure is illustrated on 6 existing UK bridges.

  11. Reliability of PWR type nuclear power plants

    International Nuclear Information System (INIS)

    Ribeiro, A.A.T.; Muniz, A.A.

    1978-12-01

    Results of the analysis of factors influencing the reliability of international nuclear power plants of the PWR type are presented. The reliability factor is estimated, and the probability that it takes values lower than a specified value is discussed. (Author) [pt]

  12. Scale Reliability Evaluation with Heterogeneous Populations

    Science.gov (United States)

    Raykov, Tenko; Marcoulides, George A.

    2015-01-01

    A latent variable modeling approach for scale reliability evaluation in heterogeneous populations is discussed. The method can be used for point and interval estimation of reliability of multicomponent measuring instruments in populations representing mixtures of an unknown number of latent classes or subpopulations. The procedure is helpful also…

  13. Reliability tasks from prediction to field use

    International Nuclear Information System (INIS)

    Guyot, Christian.

    1975-01-01

    This tutorial paper is part of a series intended to raise awareness of reliability problems. Reliability, a probabilistic concept, is an important parameter of availability. Reliability prediction is an estimation process for evaluating design progress. It is only by the application of a reliability program that reliability objectives can be attained through the different stages of work: conception, fabrication, and field use. The user is mainly interested in operational reliability. Indications are given on the support and the treatment of data in the case of electronic equipment at the C.E.A. Reliability engineering requires a special state of mind which must be formed and developed in a company in the same way as may be done, for example, for safety [fr]

  14. Structural reliability of atomic power plant

    International Nuclear Information System (INIS)

    Klemin, A.I.; Polyakov, E.F.

    1980-01-01

    In 1978 the first specialized technical manual ''Technique of Calculating the Structural Reliability of an Atomic Power Plant and Its Systems in the Design Stage'' was developed. The present article contains information about the main characteristics and capabilities of the manual. The manual gives recommendations concerning the calculations of the reliability of such specific systems as the reactor control and safety system, the system of instrumentation and automatic control, and safety systems. 2 refs

  15. Reliability of nuclear power plants and equipment

    International Nuclear Information System (INIS)

    1985-01-01

    The standard sets out the general principles, a list of reliability indexes and requirements for their selection. Reliability indexes of nuclear power plants include simple indexes of fail-safe operation, life and maintainability, and of storage capability. All terms and notions are explained, and methods of evaluating the indexes are briefly listed: statistical, and calculational-experimental. The dates when the standard comes into force in the individual CMEA countries are given. (M.D.)

  16. Transforming organizational capabilities in strategizing

    DEFF Research Database (Denmark)

    Jørgensen, Claus; Friis, Ole Uhrskov; Koch, Christian

    2014-01-01

    Offshored and networked enterprises are becoming an important if not leading organizational form, and this development seriously challenges their organizational capabilities. More specifically, over the last years, SMEs have commenced entering these kinds of arrangements. As the organizational capabilities of SMEs are limited at the outset, even more emphasis is needed on the issues of developing relevant organizational capabilities. This paper aims at investigating how capabilities evolve during an offshoring process of more than 5 years in two Danish SMEs, i.e. not only short- but long-term evolvements within the companies. We develop our framework for understanding organizational capabilities by drawing on dynamic capability, relational capability and strategy-as-practice concepts, appreciating the performative aspects of developing new routines. Our two cases are taken from one author’s Ph...

  17. Reliability Modeling of Wind Turbines

    DEFF Research Database (Denmark)

    Kostandyan, Erik

    Cost reductions for offshore wind turbines are a substantial requirement in order to make offshore wind energy more competitive compared to other energy supply methods. During the 20-25 years of wind turbines' useful life, Operation & Maintenance costs are typically estimated to be a quarter to one third of the total cost of energy. Reduction of Operation & Maintenance costs will result in significant cost savings and result in cheaper electricity production. Operation & Maintenance processes mainly involve actions related to replacements or repair. Identifying the right times when ... for Operation & Maintenance planning. Concentrating efforts on development of such models, this research is focused on reliability modeling of Wind Turbine critical subsystems (especially the power converter system). For reliability assessment of these components, structural reliability methods are applied

  18. Reliability data banks

    International Nuclear Information System (INIS)

    Cannon, A.G.; Bendell, A.

    1991-01-01

    Following an introductory chapter on reliability - what it is, why it is needed, and how it is achieved and measured - the principles of reliability data bases and analysis methodologies are the subject of the next two chapters. Achievements due to the development of data banks are mentioned for different industries in the next chapter. FACTS, a comprehensive information system for industrial safety and reliability data collection in process plants, is covered next. CREDO, the Central Reliability Data Organization, is described in the next chapter and is indexed separately, as is the chapter on DANTE, the fabrication reliability data analysis system. Reliability data banks at Electricite de France and the IAEA's experience in compiling a generic component reliability data base are also separately indexed. The European Reliability Data System, ERDS, and the development of a large data bank come next. The last three chapters look at 'Reliability data banks - friend, foe or a waste of time?' and future developments. (UK)

  19. Suncor maintenance and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Little, S. [Suncor Energy, Calgary, AB (Canada)

    2006-07-01

    Fleet maintenance and reliability at Suncor Energy was discussed in this presentation, with reference to Suncor Energy's primary and support equipment fleets. This paper also discussed Suncor Energy's maintenance and reliability standard involving people, processes and technology. An organizational maturity chart that graphed organizational learning against organizational performance was illustrated. The presentation also reviewed the maintenance and reliability framework; maintenance reliability model; the process overview of the maintenance and reliability standard; a process flow chart of maintenance strategies and programs; and an asset reliability improvement process flow chart. An example of an improvement initiative was included, with reference to a shovel reliability review; a dipper trip reliability investigation; bucket related failures by type and frequency; root cause analysis of the reliability process; and additional actions taken. Last, the presentation provided a graph of the results of the improvement initiative and presented the key lessons learned. tabs., figs.

  20. Impact of Personnel Capabilities on Organizational Innovation Capability

    DEFF Research Database (Denmark)

    Nielsen, Susanne Balslev; Momeni, Mostafa

    2016-01-01

    This research focuses on defining the personnel aspect of innovation capability, and proposes a conceptual model based on the scientific articles in the academic literature on organisations' innovation capability. The paper includes an expert-based validation in three rounds of the Delphi method. For a better appreciation of the relationships dominating the factors of the model, a questionnaire was distributed to Iranian companies in the food industry. The research proposes a direct relationship between Innovation Capability and Personnel Capability...

  1. The Accelerator Reliability Forum

    CERN Document Server

    Lüdeke, Andreas; Giachino, R

    2014-01-01

    A high reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with high reliability. In order to optimize the overall reliability of an accelerator one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution describes the forum and advertises its usage in the community.

  2. Measuring reliability under epistemic uncertainty: Review on non-probabilistic reliability metrics

    Directory of Open Access Journals (Sweden)

    Kang Rui

    2016-06-01

    Full Text Available In this paper, a systematic review of non-probabilistic reliability metrics is conducted to assist the selection of appropriate reliability metrics to model the influence of epistemic uncertainty. Five frequently used non-probabilistic reliability metrics are critically reviewed, i.e., evidence-theory-based reliability metrics, interval-analysis-based reliability metrics, fuzzy-interval-analysis-based reliability metrics, possibility-theory-based reliability metrics (posbist reliability) and uncertainty-theory-based reliability metrics (belief reliability). It is pointed out that a qualified reliability metric that is able to consider the effect of epistemic uncertainty needs to (1) compensate for the conservatism in the estimations of the component-level reliability metrics caused by epistemic uncertainty, and (2) satisfy the duality axiom, otherwise it might lead to paradoxical and confusing results in engineering applications. The five commonly used non-probabilistic reliability metrics are compared in terms of these two properties, and the comparison can serve as a basis for the selection of the appropriate reliability metrics.
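
    The duality axiom invoked here can be stated compactly (our notation, consistent with standard usage rather than quoted from the paper):

```latex
% Duality axiom for a reliability metric M over an event A (our notation):
M\{A\} + M\{A^{c}\} \;=\; 1 .
% Probability and the uncertainty-theory belief reliability satisfy this;
% a possibility measure only guarantees Pos(A) + Pos(A^c) >= 1.
```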

  3. Prediction of safety critical software operational reliability from test reliability using testing environment factors

    International Nuclear Information System (INIS)

    Jung, Hoan Sung; Seong, Poong Hyun

    1999-01-01

    Predicting the reliability of safety-critical software has been a critical issue in the nuclear engineering area. For many years, research has focused on the quantification of software reliability, and many models have been developed to quantify it. Most software reliability models estimate reliability from the failure data collected during testing, assuming that the test environments represent the operational profile well. Users' interest, however, is in the operational reliability rather than the test reliability. Experience shows that operational reliability is higher than test reliability. Under the assumption that the difference in reliability results from the change of environment from testing to operation, testing environment factors comprising an aging factor and a coverage factor are developed in this paper and used to predict the ultimate operational reliability from the failure data of the testing phase. This is done by incorporating test environments applied beyond the operational profile into the testing environment factors. The application results show that the proposed method can estimate the operational reliability accurately. (Author). 14 refs., 1 tab., 1 fig
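
    Schematically, and only as one reading of the abstract (the paper's exact definitions are not reproduced here), the mapping from test to operational reliability can be written as:

```latex
% Schematic only: the aging and coverage factors are named in the abstract,
% but the multiplicative form below is our illustrative assumption.
\lambda_{\mathrm{op}} \;=\; f_{\mathrm{aging}}\, f_{\mathrm{coverage}}\,
\lambda_{\mathrm{test}} ,
\qquad
R_{\mathrm{op}}(t) \;=\; e^{-\lambda_{\mathrm{op}} t} .
```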

  4. Probabilistic risk assessment course documentation. Volume 3. System reliability and analysis techniques, Session A - reliability

    International Nuclear Information System (INIS)

    Lofgren, E.V.

    1985-08-01

    This course in System Reliability and Analysis Techniques focuses on the quantitative estimation of reliability at the systems level. Various methods are reviewed, but the structure provided by the fault tree method is used as the basis for system reliability estimates. The principles of fault tree analysis are briefly reviewed. Contributors to system unreliability and unavailability are reviewed, models are given for quantitative evaluation, and the requirements for both generic and plant-specific data are discussed. Also covered are issues of quantifying component faults that relate to the systems context in which the components are embedded. All reliability terms are carefully defined. 44 figs., 22 tabs
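
    The core quantification step such a course builds on can be sketched in a few lines: with minimal cut sets and basic-event probabilities, the rare-event approximation sums the cut-set products. The cut sets and numbers below are invented for illustration.

```python
# Rare-event fault tree quantification sketch: the top-event probability
# is approximated by the sum over minimal cut sets of the product of
# basic-event probabilities.
from math import prod

basic = {"pump_a": 1e-3, "pump_b": 1e-3, "valve": 5e-4, "power": 1e-4}
cut_sets = [("pump_a", "pump_b"), ("valve",), ("power",)]  # minimal cut sets

q_top = sum(prod(basic[e] for e in cs) for cs in cut_sets)
print(f"top event probability = {q_top:.2e}")  # about 6.0e-04 here
```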

  5. People Capability Maturity Model. SM.

    Science.gov (United States)

    1995-09-01

    [Abstract not available. The extracted text consists of running headers, footers, and scattered fragments of the People Capability Maturity Model report (CMU/SEI-95-MM-02, Carnegie Mellon University Software Engineering Institute, September 1995), including notes that a tailored assessment consumes less time and resources than a traditional software process assessment and that benefits include improved reputation or customer loyalty.]

  6. Human Reliability Program Overview

    Energy Technology Data Exchange (ETDEWEB)

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  7. Power electronics reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.

  8. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy. Maintenance of software for long-term operating behavior. (HP) [de]

  9. Developing Reliable Life Support for Mars

    Science.gov (United States)

    Jones, Harry W.

    2017-01-01

    A human mission to Mars will require highly reliable life support systems. Mars life support systems may recycle water and oxygen using systems similar to those on the International Space Station (ISS). However, achieving sufficient reliability is less difficult for ISS than it will be for Mars. If an ISS system has a serious failure, it is possible to provide spare parts, or directly supply water or oxygen, or if necessary bring the crew back to Earth. Life support for Mars must be designed, tested, and improved as needed to achieve high demonstrated reliability. A quantitative reliability goal should be established and used to guide development. The designers should select reliable components and minimize interface and integration problems. In theory a system can achieve the component-limited reliability, but testing often reveals unexpected failures due to design mistakes or flawed components. Testing should extend long enough to detect any unexpected failure modes and to verify the expected reliability. Iterated redesign and retest may be required to achieve the reliability goal. If the reliability is less than required, it may be improved by providing spare components or redundant systems. The number of spares required to achieve a given reliability goal depends on the component failure rate. If the failure rate is underestimated, the number of spares will be insufficient and the system may fail. If the design is likely to have undiscovered design or component problems, it is advisable to use dissimilar redundancy, even though this multiplies the design and development cost. In the ideal case, a human tended closed system operational test should be conducted to gain confidence in operations, maintenance, and repair. The difficulty in achieving high reliability in unproven complex systems may require the use of simpler, more mature, intrinsically higher reliability systems. The limitations of budget, schedule, and technology may suggest accepting lower and
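
    The spares argument can be made concrete with a standard Poisson model (our example, not the paper's numbers): if unit failures arrive at a constant rate, the chance that k spares cover a mission is the Poisson CDF at k, and an underestimated rate makes the computed coverage look better than it really is.

```python
# Poisson spares sketch: with a constant failure rate, the probability
# that k spares cover a mission is the Poisson CDF at k. A too-low rate
# estimate inflates the apparent coverage and hides risk.
from math import exp, factorial

def p_covered(k_spares, failure_rate, mission_hours):
    lam = failure_rate * mission_hours   # expected number of failures
    return sum(lam**i * exp(-lam) / factorial(i) for i in range(k_spares + 1))

rate, hours = 1e-4, 2.5 * 8760           # assumed rate and ~2.5-year mission
for k in range(6):
    print(k, f"{p_covered(k, rate, hours):.4f}")
```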

  10. TFTR CAMAC power supplies reliability

    International Nuclear Information System (INIS)

    Camp, R.A.; Bergin, W.

    1989-01-01

    Since the expected life of the Tokamak Fusion Test Reactor (TFTR) has been extended into the early 1990's, the issues of equipment wear-out, when to refurbish/replace, and the costs associated with these decisions, must be faced. The management of the maintenance of the TFTR Central Instrumentation, Control and Data Acquisition System (CICADA) power supplies within the CAMAC network is a case study of a set of systems to monitor repairable systems reliability, costs, and results of action. The CAMAC network is composed of approximately 500 racks, each with its own power supply. By using a simple reliability estimator on a coarse time interval, in conjunction with determining the root cause of individual failures, a cost effective repair and maintenance program has been realized. This paper describes the estimator, some of the specific causes for recurring failures and their correction, and the subsequent effects on the reliability estimator. By extension of this program the authors can assess the continued viability of CAMAC power supplies into the future, predicting wear-out and developing cost effective refurbishment/replacement policies. 4 refs., 3 figs., 1 tab
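
    The paper does not give the estimator's exact form; a minimal sketch of one plausible coarse-interval estimator (fleet operating hours per interval divided by observed failures; all numbers hypothetical) is:

    ```python
    def mtbf_per_interval(failure_counts, units_in_service, interval_hours):
        """Crude MTBF estimate per interval for a fleet of repairable
        units: total fleet hours divided by failures observed."""
        fleet_hours = units_in_service * interval_hours
        return [fleet_hours / n if n else float("inf")
                for n in failure_counts]

    # Hypothetical quarterly failure counts for ~500 rack supplies:
    print(mtbf_per_interval([12, 9, 15, 7], 500, 2190))
    ```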

  11. Reliable Design Versus Trust

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests the FPGA's internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges for verifying a reliable design versus a trusted design?

  12. Pocket Handbook on Reliability

    Science.gov (United States)

    1975-09-01

    exponential distributions, Weibull distribution, estimating reliability, confidence intervals, reliability growth, OC curves, Bayesian analysis. ...introduction for those not familiar with reliability and a good refresher for those who are currently working in the area. LEWIS NERI, CHIEF...includes one or both of the following objectives: a) prediction of the current system reliability, b) projection on the system reliability for some future

  13. Reliability analysis of stiff versus flexible piping

    International Nuclear Information System (INIS)

    Lu, S.C.

    1985-01-01

    The overall objective of this research project is to develop a technical basis for flexible piping designs which will improve piping reliability and minimize the use of pipe supports, snubbers, and pipe whip restraints. The current study was conducted to establish the necessary groundwork based on piping reliability analysis. A confirmatory piping reliability assessment indicated that removing rigid supports and snubbers tends either to improve the piping reliability or to affect it very little. The authors then investigated two changes to be implemented in Regulatory Guide (RG) 1.61 and RG 1.122 aimed at more flexible piping design. They concluded that these changes substantially reduce calculated piping responses and allow piping redesigns with a significant reduction in the number of supports and snubbers without violating ASME code requirements. Furthermore, the more flexible piping redesigns are capable of exhibiting reliability levels equal to or higher than the original stiffer design. An investigation of the malfunction of pipe whip restraints confirmed that the malfunction introduced higher thermal stresses and tended to reduce the overall piping reliability. Finally, support and component reliabilities were evaluated based on available fragility data. Results indicated that the support reliability usually exhibits a moderate decrease as the piping flexibility increases. Most on-line pumps and valves showed an insignificant reduction in reliability for a more flexible piping design

  14. Principles of Bridge Reliability

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, Andrzej S.

    The paper gives a brief introduction to the basic principles of structural reliability theory and its application to bridge engineering. Fundamental concepts like failure probability and reliability index are introduced. Ultimate as well as serviceability limit states for bridges are formulated......, and as an example the reliability profile and a sensitivity analysis for a corroded reinforced concrete bridge are shown....

  15. Reliability in engineering '87

    International Nuclear Information System (INIS)

    Tuma, M.

    1987-01-01

    The participants heard 51 papers dealing with the reliability of engineering products. Two of the papers were incorporated in INIS, namely ''Reliability comparison of two designs of low pressure regeneration of the 1000 MW unit at the Temelin nuclear power plant'' and ''Use of probability analysis of reliability in designing nuclear power facilities.''(J.B.)

  16. Neurophysiological Estimates of Human Performance Capabilities in Aerospace Systems

    Science.gov (United States)

    1976-11-30

    decreased when the shared activity was within the same hemisphere. We again recall that in our previous dyslexia experiment (10), good readers had...higher coherence across the hemispheres during reading tasks, whereas the dyslexic poor readers had higher within-hemisphere coherences while...modification of sequential tasks not possible by intervention of the experimenter. Nevertheless, we have not formed a system suited to our needs in a

  17. South African uranium resource and production capability estimates

    International Nuclear Information System (INIS)

    Camisani-Calzolari, F.A.G.M.; Toens, P.D.

    1980-09-01

    South Africa, along with Canada and the United States, submitted forecasts of uranium capacities and capabilities to the year 2025 for the 1979 'Red Book' edition. This report deals with the methodologies used in arriving at the South African forecasts. As the future production trends of the South African uranium producers cannot be confidently defined, chiefly because uranium is extracted as a by-product of the gold mining industry and is thus highly sensitive to market fluctuations for both uranium and gold, the Evaluation Group of the Atomic Energy Board has carried out numerous forecast exercises using current and historical norms and assuming various degrees of 'adverse', 'normal' and 'most favourable' conditions. The two exercises, which were submitted for the 'Red Book', are shown in the Appendices. This paper has been prepared for presentation to the Working Group on Methodologies for Forecasting Uranium Availability of the NEA/IAEA Steering Group on Uranium Resources [af

  18. Neurophysiological Estimates of Human Performance Capabilities in Aerospace Systems

    Science.gov (United States)

    1975-01-27

    plasma membranes suggests that the extracellular material is anchored to cell membranes by bonding interactions between protein and lipid moieties of...library of "callable" routines to perform functions commonly needed in conducting experiments. The monitor is modular in design and can be expanded or

  19. Trade-offs in size, quantity and reliability of generalized nuclear power plants: a preliminary assessment

    International Nuclear Information System (INIS)

    Hill, D.

    1985-04-01

    An approximate method is used to estimate the effects of system reliability on optimal nuclear plant size, taking into account also scale factors and manufacturing learning curve slopes. The method is used to estimate the additional effective capability gained by adding units of different sizes to an existing electrical system. The number of additional units proves to be sensitive to forced outage rate, estimated here from trends in US light-water reactors from 1971 to 1980. The relative cost of added units ranging in size from 200 to 800 MW is determined as a function of the parameters: scale factor and learning curve slope. The results generally corroborate the trends found in an earlier study in which the effect of reliability on required installed capacity was not explicitly considered. Optimal plant size decreases with weaker scale effects and stronger learning curve effects. Reliability considerations further reduce the optimal plant size, but the relative reduction is apparently not as great with steeper learning curves. This is a plausible finding inasmuch as the reduction in numbers of additional units due to reliability considerations will affect cost most where the learning curve is steepest. 9 refs., 4 figs., 3 tabs
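
    The interplay of economies of scale and manufacturing learning that the study varies parametrically can be illustrated with the usual power-law forms; the parameter values below are hypothetical and reliability effects are ignored.

    ```python
    import math

    def relative_unit_cost(size_mw, unit_index, ref_size_mw=800.0,
                           scale_factor=0.6, learning_slope=0.9):
        """Cost of the n-th unit of a given size, relative to one
        reference-size unit: cost ~ (size/ref)^scale_factor, and each
        doubling of cumulative units multiplies cost by the slope."""
        scale = (size_mw / ref_size_mw) ** scale_factor
        learning = learning_slope ** math.log2(unit_index)
        return scale * learning

    # Many small units versus few large ones for the same total capacity:
    for size in (200, 400, 800):
        n = 1600 // size
        total = sum(relative_unit_cost(size, k) for k in range(1, n + 1))
        print(f"{n} x {size} MW -> relative cost {total:.2f}")
    ```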

  20. Reliability of reactor materials

    International Nuclear Information System (INIS)

    Toerroenen, K.; Aho-Mantila, I.

    1986-05-01

    This report is the final technical report of the fracture mechanics part of the Reliability of Reactor Materials Programme, which was carried out at the Technical Research Centre of Finland (VTT) through the years 1981 to 1983. Research and development work was carried out in five major areas, viz. statistical treatment and modelling of cleavage fracture, crack arrest, ductile fracture, instrumented impact testing as well as comparison of numerical and experimental elastic-plastic fracture mechanics. In the area of cleavage fracture the critical variables affecting the fracture of steels are considered in the frame of a statistical model, the so-called WST model. Comparison of fracture toughness values predicted by the model and corresponding experimental values shows excellent agreement for a variety of microstructures. Different possibilities for using the model are discussed. The development work in the area of crack arrest testing concentrated on the crack starter properties, test arrangement and computer control. A computerized elastic-plastic fracture testing method with a variety of test specimen geometries in a large temperature range was developed to a routine stage. Ductile fracture characteristics of reactor pressure vessel steel A533B and comparable weld material are given. The features of a new, patented instrumented impact tester are described. Experimental and theoretical comparisons between the new and conventional testers indicated clearly the improvements achieved with the new tester. A comparison of numerical and experimental elastic-plastic fracture mechanics capabilities at VTT was carried out. The comparison consisted of two-dimensional linear elastic as well as elastic-plastic finite element analysis of four specimen geometries and equivalent experimental tests. (author)

  1. Technological Dynamics and Social Capability

    DEFF Research Database (Denmark)

    Fagerberg, Jan; Feldman, Maryann; Srholec, Martin

    2014-01-01

    for the sample as a whole between 1998 and 2008. The results indicate that social capabilities, such as well-developed public knowledge infrastructure, an egalitarian distribution of income, a participatory democracy and prevalence of public safety condition the growth of technological capabilities. Possible...

  2. A business analytics capability framework

    Directory of Open Access Journals (Sweden)

    Ranko Cosic

    2015-09-01

    Full Text Available Business analytics (BA) capabilities can potentially provide value and lead to better organisational performance. This paper develops a holistic, theoretically-grounded and practically relevant business analytics capability framework (BACF) that specifies, defines and ranks the capabilities that constitute an organisational BA initiative. The BACF was developed in two phases. First, an a priori conceptual framework was developed based on the Resource-Based View theory of the firm and a thematic content analysis of the BA literature. Second, the conceptual framework was further developed and refined using a three-round Delphi study involving 16 BA experts. Changes from the Delphi study resulted in a refined and confirmed framework including detailed capability definitions, together with a ranking of the capabilities based on importance. The BACF will help academic researchers and industry practitioners to better understand the capabilities that constitute an organisational BA initiative and their relative importance. In future work, the capabilities in the BACF will be operationalised to measure their as-is status, thus enabling organisations to identify key areas of strength and weakness and prioritise future capability improvement efforts.

  3. Identifying 21st Century Capabilities

    Science.gov (United States)

    Stevens, Robert

    2012-01-01

    What are the capabilities necessary to meet 21st century challenges? Much of the literature on 21st century skills focuses on skills necessary to meet those challenges associated with future work in a globalised world. The result is a limited characterisation of those capabilities necessary to address 21st century social, health and particularly…

  4. Reliability evaluation of the Savannah River reactor leak detection system

    International Nuclear Information System (INIS)

    Daugherty, W.L.; Sindelar, R.L.; Wallace, I.T.

    1991-01-01

    The Savannah River Reactors have been in operation since the mid-1950's. The primary degradation mode for the primary coolant loop piping is intergranular stress corrosion cracking. The leak-before-break (LBB) capability of the primary system piping has been demonstrated as part of an overall structural integrity evaluation. One element of the LBB analyses is a reliability evaluation of the leak detection system. The most sensitive element of the leak detection system is the airborne tritium monitors. The presence of small amounts of tritium in the heavy water coolant provides the basis for a very sensitive system of leak detection. The reliability of the tritium monitors to properly identify a crack leaking at a rate of either 50 or 300 lb/day (0.004 or 0.023 gpm, respectively) has been characterized. These leak rates correspond to action points for which specific operator actions are required. High reliability has been demonstrated using standard fault tree techniques. The probability of not detecting a leak within an assumed mission time of 24 hours is estimated to be approximately 5 × 10⁻⁵ per demand. This result is obtained for both leak rates considered. The methodology and assumptions used to obtain this result are described in this paper. 3 refs., 1 fig., 1 tab
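
    The order of magnitude quoted above can be reproduced with a toy parallel-redundancy calculation; the per-monitor miss probability and monitor count here are hypothetical, not the paper's actual fault tree.

    ```python
    def p_miss_all(per_monitor_miss, n_monitors):
        """Probability that every independent monitor misses the leak
        on a given demand (simple parallel redundancy)."""
        return per_monitor_miss ** n_monitors

    # Three hypothetical independent tritium monitors, each missing a
    # leak with probability 0.037 per demand:
    print(f"{p_miss_all(0.037, 3):.1e}")   # ~5e-5 per demand
    ```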

  5. Capabilities of a remote work vehicle

    International Nuclear Information System (INIS)

    Whittaker, W.L.; Champeny, L.

    1987-01-01

    The remote work vehicle (RWV) is a mobile work system for recovery operations in radiological environments. A teleoperated, electrohydraulically powered system, the RWV features omnidirectional locomotion, a telescoping boom with a seven meter reach, a master/slave manipulator, ten cameras, a tether for sustained power, and an offboard console where three operators control vehicle functions. (The RWV is more fully described elsewhere, see bibliography; capability is emphasized here.) Capabilities of the base vehicle and specialized tooling allow the RWV to perform accident recovery tasks, including demolishing concrete and steel structures, decontaminating and sealing surfaces, removing water and sediment from flooded areas, emplacing shields, packaging and transporting materials, and performing general inspections. Aspirations for reliability have made the RWV an order of magnitude more complex than its predecessor recovery robots, and ambitions for task performance have made it two orders of magnitude more capable. In addition to nuclear recovery work, the RWV is a viable candidate for other remote work applications, including nuclear facility maintenance and decommissioning

  6. Reliability analysis applied to structural tests

    Science.gov (United States)

    Diamond, P.; Payne, A. O.

    1972-01-01

    The application of reliability theory to predict, from structural fatigue test data, the risk of failure of a structure under service conditions because its load-carrying capability is progressively reduced by the extension of a fatigue crack, is considered. The procedure is applicable to both safe-life and fail-safe structures and, for a prescribed safety level, it will enable an inspection procedure to be planned or, if inspection is not feasible, it will evaluate the life to replacement. The theory has been further developed to cope with the case of structures with initial cracks, such as can occur in modern high-strength materials which are susceptible to the formation of small flaws during the production process. The method has been applied to a structure of high-strength steel and the results are compared with those obtained by the current life estimation procedures. This has shown that the conventional methods can be unconservative in certain cases, depending on the characteristics of the structure and the design operating conditions. The suitability of the probabilistic approach to the interpretation of the results from full-scale fatigue testing of aircraft structures is discussed and the assumptions involved are examined.

  7. Delivering high performance BWR fuel reliably

    International Nuclear Information System (INIS)

    Schardt, J.F.

    1998-01-01

    Utilities are under intense pressure to reduce their production costs in order to compete in the increasingly deregulated marketplace. They need fuel that can deliver high performance to meet demanding operating strategies. GE's latest BWR fuel design, GE14, provides that high performance capability. GE's product introduction process assures that this performance will be delivered reliably, with little risk to the utility. (author)

  8. Reliability Based Optimization of Structural Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    1987-01-01

    The optimization problem to design structural systems such that the reliability is satisfactory during the whole lifetime of the structure is considered in this paper. Some of the quantities modelling the loads and the strength of the structure are modelled as random variables. The reliability...... is estimated using first-order reliability methods (FORM). The design problem is formulated as the optimization problem to minimize a given cost function such that the reliability of the single elements satisfies given requirements or such that the systems reliability satisfies a given requirement....... For these optimization problems it is described how a sensitivity analysis can be performed. Next, new optimization procedures to solve the optimization problems are presented. Two of these procedures solve the system reliability based optimization problem sequentially using quasi-analytical derivatives. Finally...
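
    As a minimal illustration of FORM (not the paper's formulation), for a linear limit state g = R - S with independent normal resistance R and load S, the reliability index has the closed form beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2); all parameter values below are hypothetical.

    ```python
    from math import erf, sqrt

    def beta_linear(mu_r, sigma_r, mu_s, sigma_s):
        """FORM reliability index for g = R - S with independent normal
        R and S (exact in this linear Gaussian case)."""
        return (mu_r - mu_s) / sqrt(sigma_r**2 + sigma_s**2)

    def failure_probability(beta):
        """Standard normal tail: P_f = Phi(-beta)."""
        return 0.5 * (1.0 - erf(beta / sqrt(2.0)))

    beta = beta_linear(mu_r=300.0, sigma_r=30.0, mu_s=200.0, sigma_s=25.0)
    print(f"beta = {beta:.2f}, P_f = {failure_probability(beta):.2e}")
    ```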

  9. Developing Collaborative Product Development Capabilities

    DEFF Research Database (Denmark)

    Mahnke, Volker; Tran, Yen

    2012-01-01

    innovation strategies’. Our analyses suggest that developing such collaboration capabilities benefits from the search for complementary practices, the combination of learning styles, and the development of weak and strong ties. Results also underscore the crucial importance of co-evolution of multi......Collaborative product development capabilities support a company’s product innovation activities. In the context of the fast fashion sector, this paper examines the development of the product development capabilities (PDC) that align product development capabilities in a dual innovation context......, one, slow paced, where the firm is well established and the other, fast paced, which represents a new competitive arena in which the company competes. To understand the process associated with collaborative capability development, we studied three Scandinavian fashion companies pursuing ‘dual...

  10. Marketing Capability in Strategy Research

    DEFF Research Database (Denmark)

    Ritter, Thomas; Distel, Andreas Philipp

    Following the call for a demand-side perspective of strategic management (e.g., Priem et al., 2012), a firm’s marketing capability, i.e. its ability to interact with down-stream stakeholders, becomes a pivotal element in explaining a firm’s competitiveness. While marketing capability is recognized...... in the strategic management literature as an important driver of firm performance, our review of 86 articles reveals a lack of a generally accepted definition of marketing capability, a lack of a common conceptualization as well as differences in the measurement of marketing capability. In order to build a common...... ground for advancing marketing capability research and thus supporting the demand-side perspective in strategic management, we develop an integrative framework to explain the differences and propose a research agenda for developing the field....

  11. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  12. Human factor reliability program

    International Nuclear Information System (INIS)

    Knoblochova, L.

    2017-01-01

    The human factor reliability program was introduced at the Slovenske elektrarne, a.s. (SE) nuclear power plants as one of the components of the Excellent Performance initiatives in 2011. The initiative's goal was to increase the reliability of both people and facilities, in response to 3 major areas of improvement - need for improvement of the results, troubleshooting support, supporting the achievement of the company's goals. In practice, the human factor reliability program includes: - tools to prevent human error; - managerial observation and coaching; - human factor analysis; - quick information about events involving the human factor; - human reliability timeline and performance indicators; - basic, periodic and extraordinary training in human factor reliability. (authors)

  13. Assessing sufficient capability: A new approach to economic evaluation.

    Science.gov (United States)

    Mitchell, Paul Mark; Roberts, Tracy E; Barton, Pelham M; Coast, Joanna

    2015-08-01

    Amartya Sen's capability approach has been discussed widely in the health economics discipline. Although measures have been developed to assess capability in economic evaluation, there has been much less attention paid to the decision rules that might be applied alongside. Here, new methods, drawing on the multidimensional poverty and health economics literature, are developed for conducting economic evaluation within the capability approach and focusing on an objective of achieving "sufficient capability". This objective more closely reflects the concern with equity that pervades the capability approach and the method has the advantage of retaining the longitudinal aspect of estimating outcome that is associated with quality-adjusted life years (QALYs), whilst also drawing on notions of shortfall associated with assessments of poverty. Economic evaluation from this perspective is illustrated in an osteoarthritis patient group undergoing joint replacement, with capability wellbeing assessed using ICECAP-O. Recommendations for taking the sufficient capability approach forward are provided. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. A Closed-Form Technique for the Reliability and Risk Assessment of Wind Turbine Systems

    Directory of Open Access Journals (Sweden)

    Leonardo Dueñas-Osorio

    2012-06-01

    Full Text Available This paper proposes a closed-form method to evaluate wind turbine system reliability and associated failure consequences. Monte Carlo simulation, a widely used approach for system reliability assessment, usually requires large numbers of computational experiments, while existing analytical methods are limited to simple system event configurations with a focus on average values of reliability metrics. By analyzing a wind turbine system and its components in a combinatorial yet computationally efficient form, the proposed approach provides an entire probability distribution of system failure that contains all possible configurations of component failure and survival events. The approach is also capable of handling unique component attributes such as downtime and repair cost needed for risk estimations, and enables sensitivity analysis for quantifying the criticality of individual components to wind turbine system reliability. Applications of the technique are illustrated by assessing the reliability of a 12-subassembly turbine system. In addition, component downtimes and repair costs are embedded in the formulation to compute expected annual wind turbine unavailability and repair cost probabilities, as well as component importance metrics useful for maintenance planning and research prioritization. Furthermore, this paper introduces a recursive solution to the closed-form method and applies it to a 45-component turbine system. The proposed approach proves to be computationally efficient and yields vital reliability information that could be readily used by wind farm stakeholders for decision making and risk management.
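
    The combinatorial idea, building the entire distribution of a consequence measure from all component failure/survival configurations, can be conveyed with a toy three-component example (probabilities and downtimes hypothetical). The paper's recursive solution avoids this brute-force enumeration, which grows exponentially with the number of components.

    ```python
    from itertools import product

    # Hypothetical subassemblies: (name, annual failure prob., downtime days)
    components = [("gearbox", 0.10, 20),
                  ("generator", 0.05, 10),
                  ("pitch system", 0.15, 3)]

    distribution = {}   # total downtime -> probability
    for states in product((0, 1), repeat=len(components)):
        prob, downtime = 1.0, 0
        for failed, (_, p, d) in zip(states, components):
            prob *= p if failed else 1.0 - p
            downtime += d if failed else 0
        distribution[downtime] = distribution.get(downtime, 0.0) + prob

    for days in sorted(distribution):
        print(f"{days:3d} days/yr: p = {distribution[days]:.4f}")
    ```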

  15. Comparing performances of Clements, Box-Cox, Johnson methods with Weibull distributions for assessing process capability

    Directory of Open Access Journals (Sweden)

    Ozlem Senvar

    2016-08-01

    Full Text Available Purpose: This study examines Clements’ Approach (CA, Box-Cox transformation (BCT, and Johnson transformation (JT methods for process capability assessments through Weibull-distributed data with different parameters to figure out the effects of the tail behaviours on process capability and compares their estimation performances in terms of accuracy and precision. Design/methodology/approach: Usage of process performance index (PPI Ppu is handled for process capability analysis (PCA because the comparison issues are performed through generating Weibull data without subgroups. Box plots, descriptive statistics, the root-mean-square deviation (RMSD, which is used as a measure of error, and a radar chart are utilized all together for evaluating the performances of the methods. In addition, the bias of the estimated values is important as the efficiency measured by the mean square error. In this regard, Relative Bias (RB and the Relative Root Mean Square Error (RRMSE are also considered. Findings: The results reveal that the performance of a method is dependent on its capability to fit the tail behavior of the Weibull distribution and on targeted values of the PPIs. It is observed that the effect of tail behavior is more significant when the process is more capable. Research limitations/implications: Some other methods such as Weighted Variance method, which also give good results, were also conducted. However, we later realized that it would be confusing in terms of comparison issues between the methods for consistent interpretations. Practical implications: Weibull distribution covers a wide class of non-normal processes due to its capability to yield a variety of distinct curves based on its parameters. Weibull distributions are known to have significantly different tail behaviors, which greatly affects the process capability. In quality and reliability applications, they are widely used for the analyses of failure data in order to understand how
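
    A sketch of the BCT route for the upper-specification index, Ppu = (USL - mean) / (3 * sigma) computed on the transformed data, using SciPy's Box-Cox fit on hypothetical Weibull-distributed measurements:

    ```python
    import numpy as np
    from scipy import special, stats

    rng = np.random.default_rng(1)
    data = rng.weibull(1.5, size=500) * 10.0   # hypothetical skewed data (> 0)
    usl = 30.0                                 # hypothetical upper spec limit

    transformed, lam = stats.boxcox(data)      # fit lambda and transform data
    usl_t = special.boxcox(usl, lam)           # transform the spec limit too

    ppu = (usl_t - transformed.mean()) / (3.0 * transformed.std(ddof=1))
    print(f"lambda = {lam:.3f}, Ppu = {ppu:.3f}")
    ```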

  16. System reliability analysis with natural language and expert's subjectivity

    International Nuclear Information System (INIS)

    Onisawa, T.

    1996-01-01

    This paper introduces natural language expressions and expert subjectivity to system reliability analysis. To this end, this paper defines a subjective measure of reliability and presents a method of system reliability analysis using the measure. The subjective measure of reliability corresponds to natural language expressions of reliability estimation, which is represented by a fuzzy set defined on [0,1]. The presented method deals with the dependence among subsystems and employs parametrized operations of subjective measures of reliability which can reflect the expert's subjectivity towards the analyzed system. The analysis results are also expressed by linguistic terms. Finally this paper gives an example of the system reliability analysis by the presented method

  17. Classifier Fusion With Contextual Reliability Evaluation.

    Science.gov (United States)

    Liu, Zhunga; Pan, Quan; Dezert, Jean; Han, Jun-Wei; He, You

    2018-05-01

    Classifier fusion is an efficient strategy to improve the classification performance for the complex pattern recognition problem. In practice, the multiple classifiers to combine can have different reliabilities and the proper reliability evaluation plays an important role in the fusion process for getting the best classification performance. We propose a new method for classifier fusion with contextual reliability evaluation (CF-CRE) based on inner reliability and relative reliability concepts. The inner reliability, represented by a matrix, characterizes the probability of the object belonging to one class when it is classified to another class. The elements of this matrix are estimated from the K-nearest neighbors of the object. A cautious discounting rule is developed under the belief functions framework to revise the classification result according to the inner reliability. The relative reliability is evaluated based on a new incompatibility measure which allows the level of conflict between the classifiers to be reduced by applying the classical evidence discounting rule to each classifier before their combination. The inner reliability and relative reliability capture different aspects of the classification reliability. The discounted classification results are combined with Dempster-Shafer's rule for the final class decision making support. The performance of CF-CRE has been evaluated and compared with those of the main classical fusion methods using real data sets. The experimental results show that CF-CRE can produce substantially higher accuracy than other fusion methods in general. Moreover, CF-CRE is robust to changes in the number of nearest neighbors chosen for estimating the reliability matrix, which is appealing for applications.

  18. Capabilities and Incapabilities of the Capabilities Approach to Health Justice.

    Science.gov (United States)

    Selgelid, Michael J

    2016-01-01

    The first part of this article critiques Sridhar Venkatapuram's conception of health as a capability. It argues that Venkatapuram relies on the problematic concept of dignity, implies that those who are unhealthy lack lives worthy of dignity (which seems politically incorrect), sets a low bar for health, appeals to metaphysically problematic thresholds, fails to draw clear connections between appealed-to capabilities and health, and downplays the importance/relevance of health functioning. It concludes by questioning whether justice entitlements should pertain to the capability for health versus health achievements, challenging Venkatapuram's claims about the strength of health entitlements, and demonstrating that the capabilities approach is unnecessary to address social determinants of health. © 2016 John Wiley & Sons Ltd.

  19. Revisiting the Fundamentals and Capabilities of the Stack Compression Test

    DEFF Research Database (Denmark)

    Alves, L.M.; Nielsen, Chris Valentin; Martin, P.A.F.

    2011-01-01

    performance by comparing the flow curves obtained from its utilisation with those determined by means of compressive testing carried out on solid cylinder specimens of the same material. Results show that mechanical testing of materials by means of the stack compression test is capable of meeting...... the increasing demand of accurate and reliable flow curves for sheet metals....

  20. Development of the ETOC: a European test of olfactory capabilities

    NARCIS (Netherlands)

    Thomas-Danguin, T.; Rouby, C.; Sicard, G.; Vigouroux, M.; Farget, V.; Johanson, A.; Bengtzon, A.; Hall, G.; Ormel, W.; Graaf, de C.; Rousseau, F.; Dumont, J.P.

    2003-01-01

    A number of smell tests designed to evaluate human olfactory capabilities have been published, but none have been validated cross-culturally. The aim of this study was therefore to develop a reliable and quick olfactory test that could be used to evaluate efficiently the olfactory abilities of a

  1. Software engineering practices for control system reliability

    International Nuclear Information System (INIS)

    S. K. Schaffner; K. S White

    1999-01-01

    This paper will discuss software engineering practices used to improve control system reliability. The authors begin with a brief discussion of the Software Engineering Institute's Capability Maturity Model (CMM), which is a framework for evaluating and improving key practices used to enhance software development and maintenance capabilities. The software engineering processes developed and used by the Controls Group at the Thomas Jefferson National Accelerator Facility (Jefferson Lab), using the Experimental Physics and Industrial Control System (EPICS) for accelerator control, are described. Examples are given of how their procedures have been used to minimize control system downtime and improve reliability. While their examples are primarily drawn from their experience with EPICS, these practices are equally applicable to any control system. Specific issues addressed include resource allocation, developing reliable software lifecycle processes and risk management

  2. Reliability analysis based on the losses from failures.

    Science.gov (United States)

    Todinov, M T

    2006-04-01

    The conventional reliability analysis is based on the premise that increasing the reliability of a system will decrease the losses from failures. On the basis of counterexamples, it is demonstrated that this is valid only if all failures are associated with the same losses. In the case of failures associated with different losses, a system with larger reliability is not necessarily characterized by smaller losses from failures. Consequently, a theoretical framework and models are proposed for a reliability analysis linking reliability and the losses from failures. Equations related to the distributions of the potential losses from failure have been derived. It is argued that the classical risk equation only estimates the average value of the potential losses from failure and does not provide insight into the variability associated with the potential losses. Equations have also been derived for determining the potential and the expected losses from failures for nonrepairable and repairable systems with components arranged in series, with arbitrary life distributions. The equations are also valid for systems/components with multiple mutually exclusive failure modes. The expected loss given failure is a linear combination of the expected losses from failure associated with the separate failure modes, scaled by the conditional probabilities with which the failure modes initiate failure. On this basis, an efficient method for simplifying complex reliability block diagrams has been developed. Branches of components arranged in series whose failures are mutually exclusive can be reduced to single components with equivalent hazard rate, downtime, and expected costs associated with intervention and repair. A model for estimating the expected losses from early-life failures has also been developed. For a specified time interval, the expected losses from early-life failures are a sum of the products of the expected number of failures in the specified time intervals covering the
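
    The linear-combination statement translates directly into code; the failure-mode probabilities and conditional losses below are hypothetical.

    ```python
    # Conditional probability that each mode initiates failure, and the
    # expected loss given failure by that mode (hypothetical values).
    modes = {"seal leak":       (0.60,  4_000.0),
             "bearing seizure": (0.30, 25_000.0),
             "shaft fracture":  (0.10, 90_000.0)}

    expected_loss_given_failure = sum(p * loss for p, loss in modes.values())
    print(expected_loss_given_failure)   # 2400 + 7500 + 9000 = 18900.0
    ```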

  3. Computable general equilibrium model fiscal year 2014 capability development report

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, Brian Keith [Los Alamos National Laboratory]; Boero, Riccardo [Los Alamos National Laboratory]

    2016-05-11

    This report provides an overview of the development of the NISAC CGE economic modeling capability since 2012. This capability enhances NISAC's economic modeling and analysis capabilities to answer a broader set of questions than was possible with the previous economic analysis capability. In particular, CGE modeling captures how the different sectors of the economy, for example, households, businesses, government, etc., interact to allocate resources in an economy, and this approach captures these interactions when it is used to estimate the economic impacts of the kinds of events NISAC often analyzes.

  4. Indigenous Technological Innovation : Capability and ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Indigenous Technological Innovation : Capability and Competitiveness in China's ... IDRC and key partners will showcase critical work on adaptation and ...

  5. Reliability and safety engineering

    CERN Document Server

    Verma, Ajit Kumar; Karanki, Durga Rao

    2016-01-01

    Reliability and safety are core issues that must be addressed throughout the life cycle of engineering systems. Reliability and Safety Engineering presents an overview of the basic concepts, together with simple and practical illustrations. The authors present reliability terminology in various engineering fields, viz.,electronics engineering, software engineering, mechanical engineering, structural engineering and power systems engineering. The book describes the latest applications in the area of probabilistic safety assessment, such as technical specification optimization, risk monitoring and risk informed in-service inspection. Reliability and safety studies must, inevitably, deal with uncertainty, so the book includes uncertainty propagation methods: Monte Carlo simulation, fuzzy arithmetic, Dempster-Shafer theory and probability bounds. Reliability and Safety Engineering also highlights advances in system reliability and safety assessment including dynamic system modeling and uncertainty management. Cas...

  6. Improved DFIG Capability during Asymmetrical Grid Faults

    DEFF Research Database (Denmark)

    Zhou, Dao; Blaabjerg, Frede

    2015-01-01

    In the wind power application, different asymmetrical types of the grid fault can be categorized after the Y/d transformer, and the positive and negative components of a single-phase fault, phase-to-phase fault, and two-phase fault can be summarized. Due to the newly introduced negative and even...... the natural component of the Doubly-Fed Induction Generator (DFIG) stator flux during the fault period, their effects on the rotor voltage can be investigated. It is concluded that the phase-to-phase fault has the worst scenario due to its highest introduction of the negative stator flux. Afterwards......, the capability of a 2 MW DFIG to ride through asymmetrical grid faults can be estimated for the existing design of the power electronics converter. Finally, a control scheme aimed at improving the DFIG capability is proposed and the simulation results validate its feasibility....

  7. Earth Science Capability Demonstration Project

    Science.gov (United States)

    Cobleigh, Brent

    2006-01-01

    A viewgraph presentation reviewing the Earth Science Capability Demonstration Project is shown. The contents include: 1) ESCD Project; 2) Available Flight Assets; 3) Ikhana Procurement; 4) GCS Layout; 5) Baseline Predator B Architecture; 6) Ikhana Architecture; 7) UAV Capability Assessment; 8) The Big Picture; 9) NASA/NOAA UAV Demo (5/05 to 9/05); 10) NASA/USFS Western States Fire Mission (8/06); and 11) Suborbital Telepresence.

  8. Human reliability analysis

    International Nuclear Information System (INIS)

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory, drawing upon reliability analysis, psychology, human factors engineering, and statistics, and integrating elements of these fields within a systems framework. The book provides a history of human reliability analysis and includes examples of the application of the systems approach

  9. Reliability of electronic systems

    International Nuclear Information System (INIS)

    Roca, Jose L.

    2001-01-01

    Reliability techniques were developed in response to the needs of the various engineering disciplines; nevertheless, more than a few would argue that much work was done on reliability before the word itself was used in its current sense. The military, space and nuclear industries were the first to become involved in the topic, yet this small great revolution in improving the reliability figures of products did not remain confined to those environments but spread to industry as a whole. Mass production, characteristic of modern industry, led four decades ago to a drop in the reliability of its products, on the one hand because of the mass production itself and, on the other, because of recently introduced and not yet stabilized industrial techniques. Industry had to change in response to these two new requirements, creating products of medium complexity and assuring a reliability appropriate to production costs and controls. Reliability became an integral part of the manufactured product. With this philosophy, the book describes reliability techniques applied to electronic systems and provides a coherent and rigorous framework for these diverse activities, supplying a unifying scientific basis for the entire subject. It consists of eight chapters plus numerous statistical tables and an extensive annotated bibliography. The chapters cover the following topics: 1- Introduction to Reliability; 2- Basic Mathematical Concepts; 3- Catastrophic Failure Models; 4- Parametric Failure Models; 5- Systems Reliability; 6- Reliability in Design and Project; 7- Reliability Tests; 8- Software Reliability. The book is in Spanish and has a potentially diverse audience, serving as a textbook for courses from academic to industrial. (author)

  10. Operational safety reliability research

    International Nuclear Information System (INIS)

    Hall, R.E.; Boccio, J.L.

    1986-01-01

    Operating reactor events such as the TMI accident and the Salem automatic-trip failures raised the concern that during a plant's operating lifetime the reliability of systems could degrade from the design level that was considered in the licensing process. To address this concern, NRC is sponsoring the Operational Safety Reliability Research project. The objectives of this project are to identify the essential tasks of a reliability program and to evaluate the effectiveness and attributes of such a reliability program applicable to maintaining an acceptable level of safety during the operating lifetime of the plant

  11. Circuit design for reliability

    CERN Document Server

    Cao, Yu; Wirth, Gilson

    2015-01-01

    This book presents physical understanding, modeling and simulation, on-chip characterization, layout solutions, and design techniques that are effective to enhance the reliability of various circuit units.  The authors provide readers with techniques for state of the art and future technologies, ranging from technology modeling, fault detection and analysis, circuit hardening, and reliability management. Provides comprehensive review on various reliability mechanisms at sub-45nm nodes; Describes practical modeling and characterization techniques for reliability; Includes thorough presentation of robust design techniques for major VLSI design units; Promotes physical understanding with first-principle simulations.

  12. Physician capability to electronically exchange clinical information, 2011.

    Science.gov (United States)

    Patel, Vaishali; Swain, Matthew J; King, Jennifer; Furukawa, Michael F

    2013-10-01

    To provide national estimates of physician capability to electronically share clinical information with other providers and to describe variation in exchange capability across states and electronic health record (EHR) vendors using the 2011 National Ambulatory Medical Care Survey Electronic Medical Record Supplement. Survey of a nationally representative sample of nonfederal office-based physicians who provide direct patient care. The survey was administered by mail with telephone follow-up and had a 61% weighted response rate. The overall sample consisted of 4326 respondents. We calculated estimates of electronic exchange capability at the national and state levels, and applied multivariate analyses to examine the association between the capability to exchange different types of clinical information and physician and practice characteristics. In 2011, 55% of physicians had computerized capability to send prescriptions electronically; 67% had the capability to view lab results electronically; 42% were able to incorporate lab results into their EHR; 35% were able to send lab orders electronically; and, 31% exchanged patient clinical summaries with other providers. The strongest predictor of exchange capability is adoption of an EHR. However, substantial variation exists across geography and EHR vendors in exchange capability, especially electronic exchange of clinical summaries. In 2011, a majority of office-based physicians could exchange lab and medication data, and approximately one-third could exchange clinical summaries with patients or other providers. EHRs serve as a key mechanism by which physicians can exchange clinical data, though physicians' capability to exchange varies by vendor and by state.

  13. Prediction of software operational reliability using testing environment factors

    International Nuclear Information System (INIS)

    Jung, Hoan Sung; Seong, Poong Hyun

    1995-01-01

    For many years, research has focused on the quantification of software reliability, and many models have been developed to quantify it. Most software reliability models estimate the reliability from the failure data collected during testing, assuming that the test environment well represents the operational profile. Experience shows, however, that the operational reliability is higher than the test reliability, and the user's interest is in the operational reliability rather than the test reliability. With the assumption that the difference in reliability results from the change of environment, testing environment factors comprising the aging factor and the coverage factor are defined in this study to predict the ultimate operational reliability from the failure data. This is done by incorporating test environments applied beyond the operational profile into the testing environment factors. The application results are close to the actual data

  14. Bayesian approach in the power electric systems study of reliability ...

    African Journals Online (AJOL)

    Keywords: Reliability - Power System - Bayes Theorem - Weibull Model - Probability. ... ensure a series of estimated parameter (failure rate, mean time to failure, function .... only on random variable r.v. describing the operating conditions: ..... Multivariate performance reliability prediction in real-time, Reliability Engineering.

  15. Reliability-based design of wind turbine blades

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard

    2011-01-01

    Reliability-based design of wind turbine blades requires identification of the important failure modes/limit states along with stochastic models for the uncertainties and methods for estimating the reliability. In the present paper it is described how reliability-based design can be applied to wi...

  16. MCNP Perturbation Capability for Monte Carlo Criticality Calculations

    International Nuclear Information System (INIS)

    Hendricks, J.S.; Carter, L.L.; McKinney, G.W.

    1999-01-01

    The differential operator perturbation capability in MCNP4B has been extended to automatically calculate perturbation estimates for the track length estimate of k_eff in MCNP4B. The additional corrections required in certain cases for MCNP4B are no longer needed. Calculating the effect of small design changes on the criticality of nuclear systems with MCNP is now straightforward

  17. Accelerator and Electrodynamics Capability Review

    International Nuclear Information System (INIS)

    Jones, Kevin W.

    2010-01-01

    Los Alamos National Laboratory (LANL) uses capability reviews to assess the science, technology and engineering (STE) quality and institutional integration and to advise Laboratory Management on the current and future health of the STE. Capability reviews address the STE integration that LANL uses to meet mission requirements. The Capability Review Committees serve a dual role of providing assessment of the Laboratory's technical contributions and integration towards its missions and providing advice to Laboratory Management. The assessments and advice are documented in reports prepared by the Capability Review Committees that are delivered to the Director and to the Principal Associate Director for Science, Technology and Engineering (PADSTE). Laboratory Management will use this report for STE assessment and planning. LANL has defined fifteen STE capabilities. Electrodynamics and Accelerators is one of the seven STE capabilities that LANL Management (Director, PADSTE, technical Associate Directors) has identified for review in Fiscal Year (FY) 2010. Accelerators and electrodynamics at LANL comprise a blend of large-scale facilities and innovative small-scale research with a growing focus on national security applications. This review is organized into five topical areas: (1) Free Electron Lasers; (2) Linear Accelerator Science and Technology; (3) Advanced Electromagnetics; (4) Next Generation Accelerator Concepts; and (5) National Security Accelerator Applications. The focus is on innovative technology with an emphasis on applications relevant to Laboratory mission. The role of Laboratory Directed Research and Development (LDRD) in support of accelerators/electrodynamics will be discussed. The review provides an opportunity for interaction with early career staff. Program sponsors and customers will provide their input on the value of the accelerator and electrodynamics capability to the Laboratory mission.

  19. Evaluation of structural reliability using simulation methods

    Directory of Open Access Journals (Sweden)

    Baballëku Markel

    2015-01-01

    Full Text Available Eurocode describes the 'index of reliability' as a measure of structural reliability, related to the 'probability of failure'. This paper is focused on the assessment of this index for a reinforced concrete bridge pier. It is rare to explicitly use reliability concepts for the design of structures, but the problems of structural engineering are better understood through them. Some of the main methods for the estimation of the probability of failure are exact analytical integration, numerical integration, approximate analytical methods and simulation methods. Monte Carlo Simulation is used in this paper because it offers a very good tool for the estimation of probability in multivariate functions. Complicated probability and statistics problems are solved through computer-aided simulations of a large number of tests. The procedures of structural reliability assessment for the bridge pier and the comparison with the partial factor method of the Eurocodes have been demonstrated in this paper.
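
    A minimal Monte Carlo sketch for a generic limit state g = R - S, estimating the probability of failure and the corresponding reliability index (distributions and parameters hypothetical, not those of the bridge pier):

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(42)
    n = 1_000_000

    resistance = rng.lognormal(mean=6.0, sigma=0.10, size=n)  # hypothetical R
    load = rng.normal(loc=250.0, scale=40.0, size=n)          # hypothetical S

    p_f = np.mean(resistance - load < 0.0)   # estimated failure probability
    beta = -norm.ppf(p_f)                    # index of reliability
    print(f"P_f = {p_f:.2e}, beta = {beta:.2f}")
    ```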

  20. MOV reliability evaluation and periodic verification scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Bunte, B.D.

    1996-12-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long term reliability of gate or globe motor operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOV's design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety-related MOVs.
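
    The margin-versus-uncertainty comparison can be sketched by treating the margin as approximately normal, a common simplification; the thrust-margin numbers below are hypothetical.

    ```python
    from math import erf, sqrt

    def mov_reliability(nominal_margin, margin_std):
        """P(actual margin > 0) when margin ~ N(nominal, std^2)."""
        z = nominal_margin / margin_std
        return 0.5 * (1.0 + erf(z / sqrt(2.0)))

    # Hypothetical MOV: 25% nominal thrust margin, 10% one-sigma uncertainty.
    print(f"estimated reliability: {mov_reliability(25.0, 10.0):.4f}")
    ```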