WorldWideScience

Sample records for reliability estimation methods

  1. A Method of Nuclear Software Reliability Estimation

    International Nuclear Information System (INIS)

    Park, Gee Yong; Eom, Heung Seop; Cheon, Se Woo; Jang, Seung Cheol

    2011-01-01

    A method for estimating software reliability for nuclear safety software is proposed. This method is based on the software reliability growth model (SRGM), where the behavior of software failure is assumed to follow the non-homogeneous Poisson process. Several modeling schemes are presented in order to estimate and predict more precisely the number of software defects based on only a few software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating the software test cases into the model. It is identified that this method is capable of accurately estimating the remaining number of on-demand-type software defects that directly affect the safety trip functions. The software reliability can be estimated from a model equation, and one method of obtaining the software reliability is proposed.
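
    As a concrete illustration of the SRGM idea (not the authors' exact model), the sketch below fits the classical Goel-Okumoto NHPP model to a handful of failure times by maximum likelihood and then evaluates the reliability over a future operating interval; the failure times and all names in the code are hypothetical.

      import numpy as np
      from scipy.optimize import minimize

      def neg_log_likelihood(params, failure_times, t_end):
          """Negative log-likelihood of the Goel-Okumoto NHPP model.

          Mean value function: m(t) = a * (1 - exp(-b*t)); intensity: a * b * exp(-b*t).
          """
          a, b = params
          if a <= 0 or b <= 0:
              return np.inf
          log_intensity = np.log(a * b) - b * failure_times
          expected_by_t_end = a * (1.0 - np.exp(-b * t_end))
          return -(np.sum(log_intensity) - expected_by_t_end)

      # Hypothetical cumulative failure times (hours) observed during testing up to t_end.
      failure_times = np.array([12.0, 40.0, 75.0, 130.0, 210.0, 320.0, 480.0])
      t_end = 600.0

      fit = minimize(neg_log_likelihood, x0=[10.0, 0.005],
                     args=(failure_times, t_end), method="Nelder-Mead")
      a_hat, b_hat = fit.x
      m = lambda t: a_hat * (1.0 - np.exp(-b_hat * t))

      # Reliability over a future interval x: R(x | t_end) = exp(-[m(t_end + x) - m(t_end)])
      x = 100.0
      print(f"a = {a_hat:.2f}, b = {b_hat:.5f}, remaining defects ~ {a_hat - m(t_end):.2f}, "
            f"R({x:g} h) ~ {np.exp(-(m(t_end + x) - m(t_end))):.3f}")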

  2. Reliability of Estimation Pile Load Capacity Methods

    Directory of Open Access Journals (Sweden)

    Yudhi Lastiasih

    2014-04-01

    Full Text Available For none of the numerous previous methods for predicting pile capacity is it known how accurate they are when compared with the actual ultimate capacity of piles tested to failure. The authors of the present paper have conducted such an analysis, based on 130 data sets of field loading tests. Out of these 130 data sets, only 44 could be analysed, of which 15 were conducted until the piles actually reached failure. The pile prediction methods used were: Brinch Hansen's method (1963), Chin's method (1970), Decourt's Extrapolation Method (1999), Mazurkiewicz's method (1972), Van der Veen's method (1953), and the Quadratic Hyperbolic Method proposed by Lastiasih et al. (2012). It was found that all the above methods were sufficiently reliable when applied to data from pile loading tests loaded to failure. However, when applied to data from pile loading tests that did not reach failure, the methods that yielded lower values for the correction factor N are recommended. Finally, the empirical method of Reese and O'Neill (1988) was found to be reliable enough to be used to estimate the Qult of a pile foundation based on soil data only.
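
    Among the cited extrapolation techniques, Chin's (1970) method is perhaps the simplest to reproduce: it assumes a hyperbolic load-settlement curve, so that settlement/load plotted against settlement is a straight line whose inverse slope approximates Qult. The sketch below is a generic illustration with hypothetical load-test readings, not the authors' implementation.

      import numpy as np

      def chin_ultimate_capacity(settlement_mm, load_kN):
          """Chin's (1970) hyperbolic extrapolation of pile ultimate capacity.

          Assumes settlement/load plotted against settlement is linear:
              s / Q = C1 * s + C2   ->   Q_ult ~ 1 / C1
          """
          s = np.asarray(settlement_mm, dtype=float)
          q = np.asarray(load_kN, dtype=float)
          y = s / q                                  # settlement / load
          C1, C2 = np.polyfit(s, y, 1)               # straight-line fit
          return 1.0 / C1

      # Hypothetical static load test readings (load in kN, settlement in mm).
      load = [500, 1000, 1500, 2000, 2400, 2700]
      settlement = [1.2, 3.0, 5.6, 9.5, 14.8, 21.0]
      print(f"Estimated Q_ult ~ {chin_ultimate_capacity(settlement, load):.0f} kN")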

  3. Software Estimation: Developing an Accurate, Reliable Method

    Science.gov (United States)

    2011-08-01

    based and size-based estimates is able to accurately plan, launch, and execute on schedule. Bob Sinclair, NAWCWD; Chris Rickets, NAWCWD; Brad Hodgins... Office by Carnegie Mellon University. PSP(SM) and TSP(SM) are service marks of Carnegie Mellon University. 1. Rickets, Chris A, "A TSP Software Maintenance...Life Cycle", CrossTalk, March, 2005. 2. Koch, Alan S, "TSP Can Be the Building Blocks for CMMI", CrossTalk, March, 2005. 3. Hodgins, Brad, Rickets

  4. A SOFTWARE RELIABILITY ESTIMATION METHOD TO NUCLEAR SAFETY SOFTWARE

    Directory of Open Access Journals (Sweden)

    GEE-YONG PARK

    2014-02-01

    Full Text Available A method for estimating software reliability for nuclear safety software is proposed in this paper. This method is based on the software reliability growth model (SRGM), where the behavior of software failure is assumed to follow a non-homogeneous Poisson process. Two types of modeling schemes based on a particular underlying method are proposed in order to more precisely estimate and predict the number of software defects based on very rare software failure data. The Bayesian statistical inference is employed to estimate the model parameters by incorporating software test cases as a covariate into the model. It was identified that these models are capable of reasonably estimating the remaining number of software defects which directly affect the reactor trip functions. The software reliability might be estimated from these modeling equations, and one approach of obtaining the software reliability value is proposed in this paper.

  5. Investigation of MLE in nonparametric estimation methods of reliability function

    International Nuclear Information System (INIS)

    Ahn, Kwang Won; Kim, Yoon Ik; Chung, Chang Hyun; Kim, Kil Yoo

    2001-01-01

    There have been many attempts to estimate a reliability function. In the ESReDA 20th seminar, a new nonparametric method was proposed. The major point of that paper is how to use censored data efficiently. Generally, there are three kinds of approaches to estimating a reliability function in a nonparametric way, i.e., the Reduced Sample Method, the Actuarial Method and the Product-Limit (PL) Method. The above three methods have some limitations, so we suggest an advanced method that reflects censored information more efficiently. In many instances there will be a unique maximum likelihood estimator (MLE) of an unknown parameter, and often it may be obtained by differentiation. It is well known that the three methods generally used to estimate a reliability function in a nonparametric way have maximum likelihood estimators that exist uniquely. So, the MLE of the new method is derived in this study. The procedure to calculate the MLE is similar to that of the PL estimator. The difference between the two is that in the new method the mass (or weight) of each observation influences the others, whereas in the PL estimator it does not.
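
    For context, the Product-Limit estimator mentioned above is the Kaplan-Meier estimator. The sketch below is a minimal, generic implementation for right-censored failure data (with invented values), intended only to show how censored observations leave the risk set without reducing the survival estimate; it does not include the weighting scheme of the new method.

      import numpy as np

      def product_limit(times, event):
          """Product-Limit (Kaplan-Meier) estimate of the reliability function R(t).

          times : observed times (failure or right-censoring)
          event : 1 if the observation is a failure, 0 if right-censored
          """
          times = np.asarray(times, float)
          event = np.asarray(event, int)
          order = np.lexsort((-event, times))   # sort by time; failures before censorings at ties
          times, event = times[order], event[order]

          n_at_risk = len(times)
          R = 1.0
          curve = []
          for t, d in zip(times, event):
              if d == 1:                         # failure: survival estimate drops
                  R *= (n_at_risk - 1) / n_at_risk
                  curve.append((t, R))
              n_at_risk -= 1                     # failures and censorings both leave the risk set
          return curve

      # Hypothetical data: event=0 marks units still working when observation stopped.
      times = [5, 8, 8, 12, 16, 16, 20, 25]
      event = [1, 1, 0, 1, 0, 1, 1, 0]
      for t, r in product_limit(times, event):
          print(f"R({t:g}) = {r:.3f}")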

  6. A generic method for estimating system reliability using Bayesian networks

    International Nuclear Information System (INIS)

    Doguc, Ozge; Ramirez-Marquez, Jose Emmanuel

    2009-01-01

    This study presents a holistic method for constructing a Bayesian network (BN) model for estimating system reliability. BN is a probabilistic approach that is used to model and predict the behavior of a system based on observed stochastic events. The BN model is a directed acyclic graph (DAG) where the nodes represent system components and arcs represent relationships among them. Although recent studies on using BN for estimating system reliability have been proposed, they are based on the assumption that a pre-built BN has been designed to represent the system. In these studies, the task of building the BN is typically left to a group of specialists who are BN and domain experts. The BN experts should learn about the domain before building the BN, which is generally very time consuming and may lead to incorrect deductions. As there are no existing studies to eliminate the need for a human expert in the process of system reliability estimation, this paper introduces a method that uses historical data about the system to be modeled as a BN and provides efficient techniques for automated construction of the BN model, and hence estimation of the system reliability. In this respect K2, a data mining algorithm, is used for finding associations between system components, and thus building the BN model. This algorithm uses a heuristic to provide efficient and accurate results while searching for associations. Moreover, no human intervention is necessary during the process of BN construction and reliability estimation. The paper provides a step-by-step illustration of the method and evaluation of the approach with literature case examples
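
    To make the BN-based reliability idea concrete, the sketch below evaluates the reliability of a tiny hypothetical three-node network by summing over the joint distribution; the probabilities are invented for illustration, and the structure-learning step (the K2 algorithm described in the paper) is not shown.

      from itertools import product

      # Hypothetical 3-node BN: components A and B are root nodes, system S depends on both.
      # P(A=up), P(B=up), and the CPT P(S=up | A, B) are assumed numbers for illustration.
      p_A = 0.95
      p_B = 0.90
      p_S_given = {            # P(system up | A state, B state), 1 = up, 0 = down
          (1, 1): 0.99,
          (1, 0): 0.70,
          (0, 1): 0.60,
          (0, 0): 0.05,
      }

      # System reliability = marginal P(S=up), obtained by summing over all joint states.
      reliability = 0.0
      for a, b in product([0, 1], repeat=2):
          p_joint = (p_A if a else 1 - p_A) * (p_B if b else 1 - p_B)
          reliability += p_joint * p_S_given[(a, b)]

      print(f"P(system up) = {reliability:.4f}")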

  7. A generic method for estimating system reliability using Bayesian networks

    Energy Technology Data Exchange (ETDEWEB)

    Doguc, Ozge [Stevens Institute of Technology, Hoboken, NJ 07030 (United States); Ramirez-Marquez, Jose Emmanuel [Stevens Institute of Technology, Hoboken, NJ 07030 (United States)], E-mail: jmarquez@stevens.edu

    2009-02-15

    This study presents a holistic method for constructing a Bayesian network (BN) model for estimating system reliability. BN is a probabilistic approach that is used to model and predict the behavior of a system based on observed stochastic events. The BN model is a directed acyclic graph (DAG) where the nodes represent system components and arcs represent relationships among them. Although recent studies on using BN for estimating system reliability have been proposed, they are based on the assumption that a pre-built BN has been designed to represent the system. In these studies, the task of building the BN is typically left to a group of specialists who are BN and domain experts. The BN experts should learn about the domain before building the BN, which is generally very time consuming and may lead to incorrect deductions. As there are no existing studies to eliminate the need for a human expert in the process of system reliability estimation, this paper introduces a method that uses historical data about the system to be modeled as a BN and provides efficient techniques for automated construction of the BN model, and hence estimation of the system reliability. In this respect K2, a data mining algorithm, is used for finding associations between system components, and thus building the BN model. This algorithm uses a heuristic to provide efficient and accurate results while searching for associations. Moreover, no human intervention is necessary during the process of BN construction and reliability estimation. The paper provides a step-by-step illustration of the method and evaluation of the approach with literature case examples.

  8. Structural Reliability Using Probability Density Estimation Methods Within NESSUS

    Science.gov (United States)

    Chamis, Christos C. (Technical Monitor); Godines, Cody Ric

    2003-01-01

    A reliability analysis studies a mathematical model of a physical system taking into account uncertainties of design variables, and common results are estimations of a response density, which also implies estimations of its parameters. Some common density parameters include the mean value, the standard deviation, and specific percentile(s) of the response, which are measures of central tendency, variation, and probability regions, respectively. Reliability analyses are important since the results can lead to different designs by calculating the probability of observing safe responses in each of the proposed designs. All of this is done at the expense of added computational time as compared to a single deterministic analysis which will result in one value of the response out of many that make up the density of the response. Sampling methods, such as Monte Carlo (MC) and Latin hypercube sampling (LHS), can be used to perform reliability analyses and can compute nonlinear response density parameters even if the response is dependent on many random variables. Hence, both methods are very robust; however, they are computationally expensive to use in the estimation of the response density parameters. Both methods are 2 of 13 stochastic methods that are contained within the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) program. NESSUS is a probabilistic finite element analysis (FEA) program that was developed through funding from NASA Glenn Research Center (GRC). It has the additional capability of being linked to other analysis programs; therefore, probabilistic fluid dynamics, fracture mechanics, and heat transfer are only a few of what is possible with this software. The LHS method is the newest addition to the stochastic methods within NESSUS. Part of this work was to enhance NESSUS with the LHS method. The new LHS module is complete, has been successfully integrated with NESSUS, and been used to study four different test cases that have been
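
    To illustrate the LHS idea in isolation (independently of NESSUS), the sketch below stratifies each input distribution into equal-probability bins, draws one value per bin, shuffles the bins across dimensions, and propagates the sample through an assumed response function to estimate density parameters; the distributions and the response function are invented.

      import numpy as np
      from scipy import stats

      def latin_hypercube(n_samples, dists, rng):
          """Draw an LHS sample: one stratum per sample in each dimension, randomly permuted."""
          d = len(dists)
          u = (rng.random((n_samples, d)) + np.arange(n_samples)[:, None]) / n_samples
          for j in range(d):
              u[:, j] = rng.permutation(u[:, j])        # decouple strata across dimensions
          return np.column_stack([dist.ppf(u[:, j]) for j, dist in enumerate(dists)])

      rng = np.random.default_rng(0)
      dists = [stats.norm(10.0, 1.0), stats.lognorm(s=0.2, scale=5.0)]   # hypothetical inputs
      response = lambda x: x[:, 0] ** 2 / x[:, 1]                        # hypothetical model

      x_lhs = latin_hypercube(200, dists, rng)
      y = response(x_lhs)
      print(f"mean = {y.mean():.3f}, std = {y.std(ddof=1):.3f}, "
            f"95th percentile = {np.percentile(y, 95):.3f}")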

  9. Reliability Estimation of the Pultrusion Process Using the First-Order Reliability Method (FORM)

    DEFF Research Database (Denmark)

    Baran, Ismet; Tutum, Cem Celal; Hattel, Jesper Henri

    2013-01-01

    In the present study the reliability estimation of the pultrusion process of a flat plate is analyzed by using the first order reliability method (FORM). The implementation of the numerical process model is validated by comparing the deterministic temperature and cure degree profiles...... with corresponding analyses in the literature. The centerline degree of cure at the exit (CDOCE) being less than a critical value and the maximum composite temperature (Tmax) during the process being greater than a critical temperature are selected as the limit state functions (LSFs) for the FORM. The cumulative...
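
    For readers unfamiliar with FORM, the sketch below runs the standard Hasofer-Lind/Rackwitz-Fiessler iteration on an assumed limit-state function expressed directly in standard normal space; it is a generic illustration only, not the pultrusion model (CDOCE or Tmax limit states) analyzed in the paper.

      import numpy as np
      from scipy.stats import norm

      def form_hlrf(g, grad_g, u0, tol=1e-6, max_iter=50):
          """First-order reliability method via the Hasofer-Lind/Rackwitz-Fiessler iteration.

          g      : limit-state function in standard normal space (failure when g <= 0)
          grad_g : gradient of g
          Returns the reliability index beta and the design point u*.
          """
          u = np.asarray(u0, float)
          for _ in range(max_iter):
              grad = grad_g(u)
              u_new = ((grad @ u - g(u)) / (grad @ grad)) * grad
              if np.linalg.norm(u_new - u) < tol:
                  u = u_new
                  break
              u = u_new
          return np.linalg.norm(u), u

      # Hypothetical limit state in standard normal space: g(u) = 3 - u1 - 0.5 * u2^2
      g = lambda u: 3.0 - u[0] - 0.5 * u[1] ** 2
      grad_g = lambda u: np.array([-1.0, -u[1]])

      beta, u_star = form_hlrf(g, grad_g, u0=[0.1, 0.1])
      print(f"beta = {beta:.3f}, Pf ~ {norm.cdf(-beta):.4e}, u* = {u_star}")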

  10. Evaluation and reliability of bone histological age estimation methods

    African Journals Online (AJOL)

    Human age estimation at death plays a vital role in forensic anthropology and bioarchaeology. Researchers used morphological and histological methods to estimate human age from their skeletal remains. This paper discussed different histological methods that used human long bones and ribs to determine age ...

  11. Reliability analysis based on a novel density estimation method for structures with correlations

    Directory of Open Access Journals (Sweden)

    Baoyu LI

    2017-06-01

    Full Text Available Estimating the Probability Density Function (PDF) of the performance function is a direct way to perform structural reliability analysis, and the failure probability can be easily obtained by integration over the failure domain. However, efficiently estimating the PDF is still an urgent problem to be solved. The existing fractional-moment-based maximum entropy approach has provided a very advanced method for PDF estimation, but its main shortcoming is that it limits the application of the reliability analysis method to structures with independent inputs. In fact, structures with correlated inputs are common in engineering; thus this paper improves the maximum entropy method and applies the Unscented Transformation (UT) technique to compute the fractional moments of the performance function for structures with correlations, which is a very efficient moment estimation method for models with any inputs. The proposed method can precisely estimate the probability distributions of performance functions for structures with correlations. Besides, the number of function evaluations of the proposed method in reliability analysis, which is determined by UT, is very small. Several examples are employed to illustrate the accuracy and advantages of the proposed method.
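
    A minimal sketch of the unscented transformation step is shown below: sigma points are built from the mean vector and (correlated) covariance matrix of the inputs, propagated through a performance function, and used to approximate fractional moments E[Y^alpha]. The sigma-point scaling (kappa = 3 - n), the input statistics, and the performance function are all assumptions for illustration, not the paper's example.

      import numpy as np

      def unscented_moments(func, mean, cov, alphas):
          """Estimate fractional moments E[Y**alpha] of Y = func(X) with the unscented transformation.

          Uses the classical 2n+1 sigma-point scheme with kappa = 3 - n, so that the
          points are mean +/- columns of the Cholesky factor of (n + kappa) * cov.
          """
          mean = np.asarray(mean, float)
          cov = np.asarray(cov, float)
          n = len(mean)
          kappa = 3.0 - n
          L = np.linalg.cholesky((n + kappa) * cov)     # matrix square root (columns = offsets)

          points = [mean] + [mean + L[:, i] for i in range(n)] + [mean - L[:, i] for i in range(n)]
          weights = np.array([kappa / (n + kappa)] + [0.5 / (n + kappa)] * (2 * n))

          y = np.array([func(p) for p in points])
          return {a: float(np.sum(weights * y ** a)) for a in alphas}

      # Hypothetical correlated inputs and performance function.
      mean = np.array([2.0, 3.0])
      cov = np.array([[0.04, 0.02],
                      [0.02, 0.09]])                    # off-diagonal term encodes correlation
      g = lambda x: x[0] * x[1]                         # performance function Y = X1 * X2

      print(unscented_moments(g, mean, cov, alphas=[0.5, 1.0, 1.5]))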

  12. Reliability of Semiautomated Computational Methods for Estimating Tibiofemoral Contact Stress in the Multicenter Osteoarthritis Study

    Directory of Open Access Journals (Sweden)

    Donald D. Anderson

    2012-01-01

    Full Text Available Recent findings suggest that contact stress is a potent predictor of subsequent symptomatic osteoarthritis development in the knee. However, much larger numbers of knees (likely on the order of hundreds, if not thousands) need to be reliably analyzed to achieve the statistical power necessary to clarify this relationship. This study assessed the reliability of new semiautomated computational methods for estimating contact stress in knees from large population-based cohorts. Ten knees of subjects from the Multicenter Osteoarthritis Study were included. Bone surfaces were manually segmented from sequential 1.0 Tesla magnetic resonance imaging slices by three individuals on two nonconsecutive days. Four individuals then registered the resulting bone surfaces to corresponding bone edges on weight-bearing radiographs, using a semi-automated algorithm. Discrete element analysis methods were used to estimate contact stress distributions for each knee. Segmentation and registration reliabilities (day-to-day and interrater) for peak and mean medial and lateral tibiofemoral contact stress were assessed with Shrout-Fleiss intraclass correlation coefficients (ICCs). The segmentation and registration steps of the modeling approach were found to have excellent day-to-day (ICC 0.93–0.99) and good inter-rater reliability (0.84–0.97). This approach for estimating compartment-specific tibiofemoral contact stress appears to be sufficiently reliable for use in large population-based cohorts.
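
    The Shrout-Fleiss ICC used here can be computed from a targets-by-raters score matrix via two-way ANOVA mean squares. The sketch below implements ICC(2,1) (two-way random effects, absolute agreement, single measurement) on invented ratings; it illustrates the statistic only and is not tied to the study's data.

      import numpy as np

      def icc_2_1(ratings):
          """Shrout-Fleiss ICC(2,1): two-way random effects, absolute agreement, single rater.

          ratings : (n_targets, k_raters) array of scores.
          """
          Y = np.asarray(ratings, float)
          n, k = Y.shape
          grand = Y.mean()
          row_means = Y.mean(axis=1)
          col_means = Y.mean(axis=0)

          ss_rows = k * np.sum((row_means - grand) ** 2)          # between-target
          ss_cols = n * np.sum((col_means - grand) ** 2)          # between-rater
          ss_err = np.sum((Y - grand) ** 2) - ss_rows - ss_cols   # residual

          ms_rows = ss_rows / (n - 1)
          ms_cols = ss_cols / (k - 1)
          ms_err = ss_err / ((n - 1) * (k - 1))

          return (ms_rows - ms_err) / (
              ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

      # Hypothetical peak contact stress (MPa) for 5 knees scored by 3 raters.
      ratings = [[4.1, 4.3, 4.0],
                 [5.6, 5.8, 5.5],
                 [3.2, 3.1, 3.4],
                 [6.0, 6.2, 6.1],
                 [4.8, 4.7, 4.9]]
      print(f"ICC(2,1) = {icc_2_1(ratings):.3f}")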

  13. An automated method for estimating reliability of grid systems using Bayesian networks

    International Nuclear Information System (INIS)

    Doguc, Ozge; Ramirez-Marquez, Jose Emmanuel

    2012-01-01

    Grid computing has become relevant due to its applications to large-scale resource sharing, wide-area information transfer, and multi-institutional collaboration. In general, in grid computing a service requests the use of a set of resources, available in a grid, to complete certain tasks. Although analysis tools and techniques for these types of systems have been studied, grid reliability analysis is generally computation-intensive due to the complexity of the system. Moreover, conventional reliability models have some common assumptions that cannot be applied to grid systems. Therefore, new analytical methods are needed for effective and accurate assessment of grid reliability. This study presents a new method for estimating grid service reliability which, unlike previous studies, does not require prior knowledge about the grid system structure. Moreover, the proposed method does not rely on any assumptions about the link and node failure rates. The approach uses a data-mining algorithm, K2, to discover the grid system structure from raw historical system data and to find minimum resource spanning trees (MRSTs) within the grid, and then uses Bayesian networks (BN) to model the MRSTs and estimate grid service reliability.

  14. Methods for estimating the reliability of the RBMK fuel assemblies and elements

    International Nuclear Information System (INIS)

    Klemin, A.I.; Sitkarev, A.G.

    1985-01-01

    Applied non-parametric methods for calculating point and interval estimates of the basic set of reliability factors for the RBMK fuel assemblies and elements are described. The reliability factors considered are the average lifetime at a preset operating time up to unloading due to fuel burnout, as well as the average lifetime under transient reactor operation and under the steady-state fuel reloading mode of reactor operation. The formulae obtained are included in the special standardized engineering documentation.

  15. A rapid reliability estimation method for directed acyclic lifeline networks with statistically dependent components

    International Nuclear Information System (INIS)

    Kang, Won-Hee; Kliese, Alyce

    2014-01-01

    Lifeline networks, such as transportation, water supply, sewers, telecommunications, and electrical and gas networks, are essential elements for the economic and societal functions of urban areas, but their components are highly susceptible to natural or man-made hazards. In this context, it is essential to provide effective pre-disaster hazard mitigation strategies and prompt post-disaster risk management efforts based on rapid system reliability assessment. This paper proposes a rapid reliability estimation method for node-pair connectivity analysis of lifeline networks especially when the network components are statistically correlated. Recursive procedures are proposed to compound all network nodes until they become a single super node representing the connectivity between the origin and destination nodes. The proposed method is applied to numerical network examples and benchmark interconnected power and water networks in Memphis, Shelby County. The connectivity analysis results show the proposed method's reasonable accuracy and remarkable efficiency as compared to the Monte Carlo simulations

  16. Uncertainty analysis methods for estimation of reliability of passive system of VHTR

    International Nuclear Information System (INIS)

    Han, S.J.

    2012-01-01

    An estimation of the reliability of passive systems for the probabilistic safety assessment (PSA) of a very high temperature reactor (VHTR) is under development in Korea. The essential approach of this estimation is to measure the uncertainty of the system performance under a specific accident condition. The uncertainty propagation approach based on the simulation of phenomenological models (computer codes) is adopted as a typical method to estimate the uncertainty for this purpose. This presentation introduced the uncertainty propagation and discussed the related issues, focusing on the propagation object and its surrogates. To achieve a sufficient level of depth of the uncertainty results, the applicability of the propagation should be carefully reviewed. For an example study, the Latin hypercube sampling (LHS) method, as a direct propagation approach, was tested for a specific accident sequence of the VHTR. The reactor cavity cooling system (RCCS) developed by KAERI was considered for this example study. This is an air-cooled passive system that has no active components for its operation. The accident sequence is a low pressure conduction cooling (LPCC) accident that is considered as a design basis accident for the safety design of the VHTR. This sequence is due to a large failure of the pressure boundary of the reactor system, such as a guillotine break of the coolant pipe lines. The presentation discussed the insights obtained (benefits and weaknesses) for applying an estimation of the reliability of the passive system.

  17. Between-day reliability of a method for non-invasive estimation of muscle composition.

    Science.gov (United States)

    Simunič, Boštjan

    2012-08-01

    Tensiomyography is a method for valid and non-invasive estimation of skeletal muscle fibre type composition. The validity of selected temporal tensiomyographic measures has been well established recently; there is, however, no evidence regarding the method's between-day reliability. Therefore it is the aim of this paper to establish the between-day repeatability of tensiomyographic measures in three skeletal muscles. For three consecutive days, 10 healthy male volunteers (mean±SD: age 24.6 ± 3.0 years; height 177.9 ± 3.9 cm; weight 72.4 ± 5.2 kg) were examined in a supine position. Four temporal measures (delay, contraction, sustain, and half-relaxation time) and maximal amplitude were extracted from the displacement-time tensiomyogram. A reliability analysis was performed with calculations of bias, random error, coefficient of variation (CV), standard error of measurement, and intra-class correlation coefficient (ICC) with a 95% confidence interval. An analysis of ICC demonstrated excellent agreement (ICCs were over 0.94 for 14 out of 15 tested parameters). However, lower CV was observed in half-relaxation time, presumably because of the specifics of the parameter definition itself. These data indicate that for the three muscles tested, tensiomyographic measurements were reproducible across consecutive test days. Furthermore, we identified the most likely origin of the lowest reliability, detected in half-relaxation time. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompass Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.

  19. On the Reliability of Source Time Functions Estimated Using Empirical Green's Function Methods

    Science.gov (United States)

    Gallegos, A. C.; Xie, J.; Suarez Salas, L.

    2017-12-01

    The Empirical Green's Function (EGF) method (Hartzell, 1978) has been widely used to extract source time functions (STFs). In this method, seismograms generated by collocated events with different magnitudes are deconvolved. Under a fundamental assumption that the STF of the small event is a delta function, the deconvolved Relative Source Time Function (RSTF) yields the large event's STF. While this assumption can be empirically justified by examination of differences in event size and frequency content of the seismograms, there can be a lack of rigorous justification of the assumption. In practice, a small event might have a finite duration when the RSTF is retrieved and interpreted as the large event STF with a bias. In this study, we rigorously analyze this bias using synthetic waveforms generated by convolving a realistic Green's function waveform with pairs of finite-duration triangular or parabolic STFs. The RSTFs are found using a time-domain based matrix deconvolution. We find when the STFs of smaller events are finite, the RSTFs are a series of narrow non-physical spikes. Interpreting these RSTFs as a series of high-frequency source radiations would be very misleading. The only reliable and unambiguous information we can retrieve from these RSTFs is the difference in durations and the moment ratio of the two STFs. We can apply a Tikhonov smoothing to obtain a single-pulse RSTF, but its duration is dependent on the choice of weighting, which may be subjective. We then test the Multi-Channel Deconvolution (MCD) method (Plourde & Bostock, 2017) which assumes that both STFs have finite durations to be solved for. A concern about the MCD method is that the number of unknown parameters is larger, which would tend to make the problem rank-deficient. Because the kernel matrix is dependent on the STFs to be solved for under a positivity constraint, we can only estimate the rank-deficiency with a semi-empirical approach. Based on the results so far, we find that the

  20. Simple and Reliable Method to Estimate the Fingertip Static Coefficient of Friction in Precision Grip.

    Science.gov (United States)

    Barrea, Allan; Bulens, David Cordova; Lefevre, Philippe; Thonnard, Jean-Louis

    2016-01-01

    The static coefficient of friction (µ_static) plays an important role in dexterous object manipulation. Minimal normal force (i.e., grip force) needed to avoid dropping an object is determined by the tangential force at the fingertip-object contact and the frictional properties of the skin-object contact. Although frequently assumed to be constant for all levels of normal force (NF, the force normal to the contact), µ_static actually varies nonlinearly with NF and increases at low NF levels. No method is currently available to measure the relationship between µ_static and NF easily. Therefore, we propose a new method allowing the simple and reliable measurement of the fingertip µ_static at different NF levels, as well as an algorithm for determining µ_static from measured forces and torques. Our method is based on active, back-and-forth movements of a subject's finger on the surface of a fixed six-axis force and torque sensor. µ_static is computed as the ratio of the tangential to the normal force at slip onset. A negative power law captures the relationship between µ_static and NF. Our method allows the continuous estimation of µ_static as a function of NF during dexterous manipulation, based on the relationship between µ_static and NF measured before manipulation.
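
    The negative power law mentioned in the abstract can be fitted with a one-line log-log regression. The sketch below uses invented slip-onset measurements purely to illustrate the fit; the coefficients reported by the authors are not reproduced here.

      import numpy as np

      def fit_power_law(normal_force, mu_static):
          """Fit mu_static = a * NF**b by linear regression in log-log space."""
          log_nf = np.log(np.asarray(normal_force, float))
          log_mu = np.log(np.asarray(mu_static, float))
          b, log_a = np.polyfit(log_nf, log_mu, 1)
          return np.exp(log_a), b

      # Hypothetical slip-onset measurements: normal force (N) and tangential/normal ratio.
      normal_force = [0.5, 1.0, 2.0, 4.0, 8.0]
      mu_static    = [1.35, 1.10, 0.92, 0.78, 0.66]
      a, b = fit_power_law(normal_force, mu_static)
      print(f"mu_static ~ {a:.2f} * NF^{b:.2f}")      # b < 0: friction decreases with load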

  1. Reliable methods for computer simulation error control and a posteriori estimates

    CERN Document Server

    Neittaanmäki, P

    2004-01-01

    Recent decades have seen a very rapid success in developing numerical methods based on explicit control over approximation errors. It may be said that nowadays a new direction is forming in numerical analysis, the main goal of which is to develop methods of reliable computations. In general, a reliable numerical method must solve two basic problems: (a) generate a sequence of approximations that converges to a solution and (b) verify the accuracy of these approximations. A computer code for such a method must consist of two respective blocks: solver and checker. In this book, we are chie

  2. Dependent systems reliability estimation by structural reliability approach

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2014-01-01

    Estimation of system reliability by classical system reliability methods generally assumes that the components are statistically independent, thus limiting its applicability in many practical situations. A method is proposed for estimation of the system reliability with dependent components, where...... the leading failure mechanism(s) is described by physics of failure model(s). The proposed method is based on structural reliability techniques and accounts for both statistical and failure effect correlations. It is assumed that failure of any component is due to increasing damage (fatigue phenomena...... identification. Application of the proposed method can be found in many real world systems....

  3. Reliability of different mark-recapture methods for population size estimation tested against reference population sizes constructed from field data.

    Directory of Open Access Journals (Sweden)

    Annegret Grimm

    Full Text Available Reliable estimates of population size are fundamental in many ecological studies and biodiversity conservation. Selecting appropriate methods to estimate abundance is often very difficult, especially if data are scarce. Most studies concerning the reliability of different estimators used simulation data based on assumptions about capture variability that do not necessarily reflect conditions in natural populations. Here, we used data from an intensively studied closed population of the arboreal gecko Gehyra variegata to construct reference population sizes for assessing twelve different population size estimators in terms of bias, precision, accuracy, and their 95%-confidence intervals. Two of the reference populations reflect natural biological entities, whereas the other reference populations reflect artificial subsets of the population. Since individual heterogeneity was assumed, we tested modifications of the Lincoln-Petersen estimator, a set of models in programs MARK and CARE-2, and a truncated geometric distribution. Ranking of methods was similar across criteria. Models accounting for individual heterogeneity performed best in all assessment criteria. For populations from heterogeneous habitats without obvious covariates explaining individual heterogeneity, we recommend using the moment estimator or the interpolated jackknife estimator (both implemented in CAPTURE/MARK). If data for capture frequencies are substantial, we recommend the sample coverage or the estimating equation (both models implemented in CARE-2). Depending on the distribution of catchabilities, our proposed multiple Lincoln-Petersen and a truncated geometric distribution obtained comparably good results. The former usually resulted in a minimum population size and the latter can be recommended when there is a long tail of low capture probabilities. Models with covariates and mixture models performed poorly. Our approach identified suitable methods and extended options to
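
    The Lincoln-Petersen family of estimators the study builds on needs only two capture sessions. The sketch below shows Chapman's bias-corrected version with invented counts; it is a generic illustration, not the authors' multiple Lincoln-Petersen modification.

      def chapman_estimate(marked_first, caught_second, recaptured):
          """Chapman's bias-corrected Lincoln-Petersen estimator of population size.

          N_hat = (M + 1)(C + 1) / (R + 1) - 1
          where M animals are marked in the first session, C are caught in the second,
          and R of those are recaptures.
          """
          M, C, R = marked_first, caught_second, recaptured
          return (M + 1) * (C + 1) / (R + 1) - 1

      # Hypothetical two-session capture data for a closed gecko population.
      print(f"N ~ {chapman_estimate(marked_first=60, caught_second=55, recaptured=20):.1f}")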

  4. Efficient Estimation of Extreme Non-linear Roll Motions using the First-order Reliability Method (FORM)

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher

    2007-01-01

    In on-board decision support systems efficient procedures are needed for real-time estimation of the maximum ship responses to be expected within the next few hours, given on-line information on the sea state and user defined ranges of possible headings and speeds. For linear responses standard...... frequency domain methods can be applied. To non-linear responses like the roll motion, standard methods like direct time domain simulations are not feasible due to the required computational time. However, the statistical distribution of non-linear ship responses can be estimated very accurately using...... the first-order reliability method (FORM), well-known from structural reliability problems. To illustrate the proposed procedure, the roll motion is modelled by a simplified non-linear procedure taking into account non-linear hydrodynamic damping, time-varying restoring and wave excitation moments...

  5. Advanced RESTART method for the estimation of the probability of failure of highly reliable hybrid dynamic systems

    International Nuclear Information System (INIS)

    Turati, Pietro; Pedroni, Nicola; Zio, Enrico

    2016-01-01

    The efficient estimation of system reliability characteristics is of paramount importance for many engineering applications. Real world system reliability modeling calls for the capability of treating systems that are: i) dynamic, ii) complex, iii) hybrid and iv) highly reliable. Advanced Monte Carlo (MC) methods offer a feasible way to solve these types of problems despite the potentially high computational costs. In this paper, the REpetitive Simulation Trials After Reaching Thresholds (RESTART) method is employed, extending it to hybrid systems for the first time (to the authors’ knowledge). The estimation accuracy and precision of RESTART highly depend on the choice of the Importance Function (IF) indicating how close the system is to failure: in this respect, proper IFs are here originally proposed to improve the performance of RESTART for the analysis of hybrid systems. The resulting overall simulation approach is applied to estimate the probability of failure of the control system of a liquid hold-up tank and of a pump-valve subsystem subject to degradation induced by fatigue. The results are compared to those obtained by standard MC simulation and by RESTART with classical IFs available in the literature. The comparison shows the improvement in the performance obtained by our approach. - Highlights: • We consider the issue of estimating small failure probabilities in dynamic systems. • We employ the RESTART method to estimate the failure probabilities. • New Importance Functions (IFs) are introduced to increase the method performance. • We adopt two dynamic, hybrid, highly reliable systems as case studies. • A comparison with literature IFs proves the effectiveness of the new IFs.

  6. Application of fuzzy-MOORA method: Ranking of components for reliability estimation of component-based software systems

    Directory of Open Access Journals (Sweden)

    Zeeshan Ali Siddiqui

    2016-01-01

    Full Text Available Component-based software system (CBSS) development is an emerging discipline that promises to take software development into a new era. As hardware systems are presently being constructed from kits of parts, software systems may also be assembled from components. It is more reliable to reuse software than to create it anew. It is the glue code and the reliability of the individual components that contribute to the reliability of the overall system. Every component contributes to overall system reliability according to the number of times it is used; some components are of critical usage. This usage frequency decides the weight of each component, and according to these weights each component contributes to the overall reliability of the system. Therefore, a ranking of components may be obtained by analyzing their reliability impacts on the overall application. In this paper, we propose the application of fuzzy multi-objective optimization on the basis of ratio analysis (Fuzzy-MOORA). The method helps us find the most suitable alternative, i.e. software component, from a set of available feasible alternatives. It is an accurate and easy-to-understand tool for solving multi-criteria decision making problems that have imprecise and vague evaluation data. By the use of ratio analysis, the proposed method determines the most suitable alternative among all possible alternatives, and dimensionless measurement realizes the job of ranking components for estimating CBSS reliability in a non-subjective way. Finally, three case studies are shown to illustrate the use of the proposed technique.
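
    For orientation, the crisp (non-fuzzy) MOORA ratio-analysis step works as sketched below: vector-normalize the decision matrix, apply weights with a positive sign for benefit criteria and a negative sign for cost criteria, and rank alternatives by the resulting scores. The component scores, weights, and criteria here are invented, and the fuzzy extension used in the paper is not implemented.

      import numpy as np

      def moora_rank(matrix, weights, beneficial):
          """Crisp MOORA ratio analysis: normalize, weight, sum benefit minus cost criteria.

          matrix     : (alternatives x criteria) performance scores
          weights    : criterion weights
          beneficial : True for benefit criteria, False for cost criteria
          """
          X = np.asarray(matrix, float)
          norm = X / np.sqrt((X ** 2).sum(axis=0))            # vector normalization per criterion
          signed = np.where(beneficial, 1.0, -1.0) * np.asarray(weights, float)
          scores = norm @ signed
          ranking = np.argsort(-scores)                        # best (highest score) first
          return scores, ranking

      # Hypothetical components scored on usage frequency, complexity (cost), and test coverage.
      matrix = [[120, 30, 0.85],
                [ 80, 10, 0.95],
                [200, 55, 0.70]]
      weights = [0.5, 0.2, 0.3]
      beneficial = [True, False, True]
      scores, ranking = moora_rank(matrix, weights, beneficial)
      print("scores:", np.round(scores, 3), "ranking (best first):", ranking)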

  7. Estimation of Bridge Reliability Distributions

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    In this paper it is shown how the so-called reliability distributions can be estimated using crude Monte Carlo simulation. The main purpose is to demonstrate the methodology. Therefore, very exact data concerning reliability and deterioration are not needed. However, it is intended in the paper to ...

  8. A fast and reliable method for simultaneous waveform, amplitude and latency estimation of single-trial EEG/MEG data.

    Directory of Open Access Journals (Sweden)

    Wouter D Weeda

    Full Text Available The amplitude and latency of single-trial EEG/MEG signals may provide valuable information concerning human brain functioning. In this article we propose a new method to reliably estimate single-trial amplitude and latency of EEG/MEG signals. The advantages of the method are fourfold. First, no a-priori specified template function is required. Second, the method allows for multiple signals that may vary independently in amplitude and/or latency. Third, the method is less sensitive to noise as it models data with a parsimonious set of basis functions. Finally, the method is very fast since it is based on an iterative linear least squares algorithm. A simulation study shows that the method yields reliable estimates under different levels of latency variation and signal-to-noise ratios. Furthermore, it shows that the existence of multiple signals can be correctly determined. An application to empirical data from a choice reaction time study indicates that the method describes these data accurately.

  9. Expanding Reliability Generalization Methods with KR-21 Estimates: An RG Study of the Coopersmith Self-Esteem Inventory.

    Science.gov (United States)

    Lane, Ginny G.; White, Amy E.; Henson, Robin K.

    2002-01-01

    Conducted a reliability generalization study on the Coopersmith Self-Esteem Inventory (CSEI; S. Coopersmith, 1967) to examine the variability of reliability estimates across studies and to identify study characteristics that may predict this variability. Results show that reliability for CSEI scores can vary considerably, especially at the…
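
    KR-21, the estimate this study adds to the reliability generalization toolbox, needs only the number of items, the mean, and the variance of total scores on dichotomously scored items. The sketch below shows the formula with invented scores.

      import numpy as np

      def kr21(total_scores, n_items):
          """Kuder-Richardson formula 21 from total test scores on dichotomous items.

          KR-21 = k/(k-1) * (1 - M*(k - M) / (k * Var))
          """
          x = np.asarray(total_scores, float)
          k = n_items
          m = x.mean()
          var = x.var(ddof=1)
          return (k / (k - 1)) * (1 - m * (k - m) / (k * var))

      # Hypothetical total scores of 10 examinees on a 25-item dichotomous scale.
      scores = [18, 22, 15, 20, 24, 17, 19, 21, 14, 23]
      print(f"KR-21 = {kr21(scores, n_items=25):.3f}")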

  10. Structural Reliability Methods

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Madsen, H. O.

    The structural reliability methods quantitatively treat the uncertainty of predicting the behaviour and properties of a structure given the uncertain properties of its geometry, materials, and the actions it is supposed to withstand. This book addresses the probabilistic methods for evaluation...... of structural reliability, including the theoretical basis for these methods. Partial safety factor codes under current practice are briefly introduced and discussed. A probabilistic code format for obtaining a formal reliability evaluation system that catches the most essential features of the nature...... of the uncertainties and their interplay is then developed, step-by-step. The concepts presented are illustrated by numerous examples throughout the text.

  11. An Efficient and Reliable Statistical Method for Estimating Functional Connectivity in Large Scale Brain Networks Using Partial Correlation.

    Science.gov (United States)

    Wang, Yikai; Kang, Jian; Kemmer, Phebe B; Guo, Ying

    2016-01-01

    Currently, network-oriented analysis of fMRI data has become an important tool for understanding brain organization and brain networks. Among the range of network modeling methods, partial correlation has shown great promise in accurately detecting true brain network connections. However, the application of partial correlation in investigating brain connectivity, especially in large-scale brain networks, has been limited so far due to the technical challenges in its estimation. In this paper, we propose an efficient and reliable statistical method for estimating partial correlation in large-scale brain network modeling. Our method derives partial correlation based on the precision matrix estimated via Constrained L1-minimization Approach (CLIME), which is a recently developed statistical method that is more efficient and demonstrates better performance than the existing methods. To help select an appropriate tuning parameter for sparsity control in the network estimation, we propose a new Dens-based selection method that provides a more informative and flexible tool to allow the users to select the tuning parameter based on the desired sparsity level. Another appealing feature of the Dens-based method is that it is much faster than the existing methods, which provides an important advantage in neuroimaging applications. Simulation studies show that the Dens-based method demonstrates comparable or better performance with respect to the existing methods in network estimation. We applied the proposed partial correlation method to investigate resting state functional connectivity using rs-fMRI data from the Philadelphia Neurodevelopmental Cohort (PNC) study. Our results show that partial correlation analysis removed considerable between-module marginal connections identified by full correlation analysis, suggesting these connections were likely caused by global effects or common connection to other nodes. Based on partial correlation, we find that the most significant
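
    The relationship between the precision matrix and partial correlation that underlies this approach is simple to state: the partial correlation between nodes i and j given all other nodes is -P_ij / sqrt(P_ii * P_jj), where P is the precision (inverse covariance) matrix. The sketch below illustrates this with a plain matrix inversion on random data; the CLIME estimator and the Dens-based tuning described in the paper are not implemented.

      import numpy as np

      def partial_correlation(precision):
          """Partial correlation matrix from a precision (inverse covariance) matrix.

          rho_ij|rest = -P_ij / sqrt(P_ii * P_jj)
          """
          P = np.asarray(precision, float)
          d = np.sqrt(np.diag(P))
          pcorr = -P / np.outer(d, d)
          np.fill_diagonal(pcorr, 1.0)
          return pcorr

      # Hypothetical demo: estimate the precision matrix by direct inversion of the
      # sample covariance (CLIME would instead solve a constrained L1 problem).
      rng = np.random.default_rng(1)
      data = rng.standard_normal((200, 5))            # 200 time points, 5 nodes
      precision = np.linalg.inv(np.cov(data, rowvar=False))
      print(np.round(partial_correlation(precision), 2))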

  12. Estimation of gingival crevicular blood glucose level for the screening of diabetes mellitus: A simple yet reliable method.

    Science.gov (United States)

    Parihar, Sarita; Tripathi, Richik; Parihar, Ajit Vikram; Samadi, Fahad M; Chandra, Akhilesh; Bhavsar, Neeta

    2016-01-01

    This study was designed to assess the reliability of blood glucose level estimation in gingival crevicular blood (GCB) for screening diabetes mellitus. 70 patients were included in the study. A randomized, double-blind clinical trial was performed. Among these, 39 patients were diabetic (including 4 patients who were diagnosed during the study) and the remaining 31 patients were non-diabetic. GCB obtained during routine periodontal examination was analyzed with a glucometer to determine the blood glucose level. The same patients underwent finger stick blood (FSB) glucose level estimation with a glucometer and venous blood (VB) glucose level estimation with a standardized laboratory method as per American Diabetes Association Guidelines. 1 All three blood glucose levels were compared. Periodontal parameters were also recorded, including gingival index (GI) and probing pocket depth (PPD). A strong positive correlation (r) was observed between glucose levels of GCB with FSB and VB, with values of 0.986 and 0.972 in the diabetic group and 0.820 and 0.721 in the non-diabetic group. As well, the mean values of GI and PPD were higher in the diabetic group than in the non-diabetic group, with a statistically significant difference (p  blood glucose level as the values were closest to glucose levels estimated by VB. The technique is safe, easy to perform and non-invasive to the patient and can increase the frequency of diagnosing diabetes during routine periodontal therapy.

  13. Methods of Estimation the Reliability and Increasing the Informativeness of the Laboratory Results (Analysis of the Laboratory Case of Measurement the Indicators of Thyroid Function)

    OpenAIRE

    N A Kovyazina; N A Alhutova; N N Zybina; N M Kalinina

    2014-01-01

    The goal of the study was to demonstrate the multilevel laboratory quality management system and point at the methods of estimating the reliability and increasing the amount of information content of the laboratory results (on the example of the laboratory case). Results. The article examines the stages of laboratory quality management which has helped to estimate the reliability of the results of determining Free T3, Free T4 and TSH. The measurement results are presented by the expanded unce...

  14. Mission Reliability Estimation for Repairable Robot Teams

    Science.gov (United States)

    Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen

    2010-01-01

    A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares. This suggests that the

  15. Reliability Estimation Based Upon Test Plan Results

    National Research Council Canada - National Science Library

    Read, Robert

    1997-01-01

    The report contains a brief summary of aspects of the Maximus reliability point and interval estimation technique as it has been applied to the reliability of a device whose surveillance tests contain...

  16. Methods of Estimation the Reliability and Increasing the Informativeness of the Laboratory Results (Analysis of the Laboratory Case of Measurement the Indicators of Thyroid Function

    Directory of Open Access Journals (Sweden)

    N A Kovyazina

    2014-06-01

    Full Text Available The goal of the study was to demonstrate the multilevel laboratory quality management system and point at the methods of estimating the reliability and increasing the amount of information content of the laboratory results (on the example of the laboratory case). Results. The article examines the stages of laboratory quality management which have helped to estimate the reliability of the results of determining Free T3, Free T4 and TSH. The measurement results are presented by the expanded uncertainty and the evaluation of the dynamics. Conclusion. Compliance with mandatory measures of the laboratory quality management system enables laboratories to obtain reliable results and calculate the parameters that are able to increase the amount of information content of laboratory tests in clinical decision making.

  17. Digital photography provides a fast, reliable, and noninvasive method to estimate anthocyanin pigment concentration in reproductive and vegetative plant tissues.

    Science.gov (United States)

    Del Valle, José C; Gallardo-López, Antonio; Buide, Mª Luisa; Whittall, Justen B; Narbona, Eduardo

    2018-03-01

    Anthocyanin pigments have become a model trait for evolutionary ecology as they often provide adaptive benefits for plants. Anthocyanins have been traditionally quantified biochemically or more recently using spectral reflectance. However, both methods require destructive sampling and can be labor intensive and challenging with small samples. Recent advances in digital photography and image processing make it the method of choice for measuring color in the wild. Here, we use digital images as a quick, noninvasive method to estimate relative anthocyanin concentrations in species exhibiting color variation. Using a consumer-level digital camera and a free image processing toolbox, we extracted RGB values from digital images to generate color indices. We tested petals, stems, pedicels, and calyces of six species, which contain different types of anthocyanin pigments and exhibit different pigmentation patterns. Color indices were assessed by their correlation to biochemically determined anthocyanin concentrations. For comparison, we also calculated color indices from spectral reflectance and tested the correlation with anthocyanin concentration. Indices perform differently depending on the nature of the color variation. For both digital images and spectral reflectance, the most accurate estimates of anthocyanin concentration emerge from anthocyanin content-chroma ratio, anthocyanin content-chroma basic, and strength of green indices. Color indices derived from both digital images and spectral reflectance strongly correlate with biochemically determined anthocyanin concentration; however, the estimates from digital images performed better than spectral reflectance in terms of r² and normalized root-mean-square error. This was particularly noticeable in a species with striped petals, but in the case of striped calyces, both methods showed a comparable relationship with anthocyanin concentration. Using digital images brings new opportunities to accurately quantify the

  18. Reliability Estimation for Digital Instrument/Control System

    International Nuclear Information System (INIS)

    Yang, Yaguang; Sydnor, Russell

    2011-01-01

    Digital instrumentation and controls (DI and C) systems are widely adopted in various industries because of their flexibility and ability to implement various functions that can be used to automatically monitor, analyze, and control complicated systems. It is anticipated that the DI and C will replace the traditional analog instrumentation and controls (AI and C) systems in all future nuclear reactor designs. There is an increasing interest in reliability and risk analyses for safety critical DI and C systems in regulatory organizations, such as the United States Nuclear Regulatory Commission. Developing reliability models and reliability estimation methods for digital reactor control and protection systems will involve every part of the DI and C system, such as sensors, signal conditioning and processing components, transmission lines and digital communication systems, D/A and A/D converters, computer system, signal processing software, control and protection software, power supply system, and actuators. Some of these components are hardware, such as sensors and actuators; their failure mechanisms are well understood, and the traditional reliability model and estimation methods can be directly applied. But many of these components are firmware which has software embedded in the hardware, and software needs special consideration because its failure mechanism is unique, and the reliability estimation method for a software system will be different from the ones used for hardware systems. In this paper, we will propose a reliability estimation method for the entire DI and C system reliability using a recently developed software reliability estimation method and a traditional hardware reliability estimation method.

  19. Reliability Estimation for Digital Instrument/Control System

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yaguang; Sydnor, Russell [U.S. Nuclear Regulatory Commission, Washington, D.C. (United States)

    2011-08-15

    Digital instrumentation and controls (DI and C) systems are widely adopted in various industries because of their flexibility and ability to implement various functions that can be used to automatically monitor, analyze, and control complicated systems. It is anticipated that the DI and C will replace the traditional analog instrumentation and controls (AI and C) systems in all future nuclear reactor designs. There is an increasing interest in reliability and risk analyses for safety critical DI and C systems in regulatory organizations, such as the United States Nuclear Regulatory Commission. Developing reliability models and reliability estimation methods for digital reactor control and protection systems will involve every part of the DI and C system, such as sensors, signal conditioning and processing components, transmission lines and digital communication systems, D/A and A/D converters, computer system, signal processing software, control and protection software, power supply system, and actuators. Some of these components are hardware, such as sensors and actuators; their failure mechanisms are well understood, and the traditional reliability model and estimation methods can be directly applied. But many of these components are firmware which has software embedded in the hardware, and software needs special consideration because its failure mechanism is unique, and the reliability estimation method for a software system will be different from the ones used for hardware systems. In this paper, we will propose a reliability estimation method for the entire DI and C system reliability using a recently developed software reliability estimation method and a traditional hardware reliability estimation method.

  20. Reliable Function Approximation and Estimation

    Science.gov (United States)

    2016-08-16

    compressed sensing results to a wide class of infinite-dimensional problems. We discuss four key findings arising from this project, as related to uncertainty quantification, image processing, matrix... We discuss four key application domains for the methods developed in this project

  1. Reliability Estimates for Undergraduate Grade Point Average

    Science.gov (United States)

    Westrick, Paul A.

    2017-01-01

    Undergraduate grade point average (GPA) is a commonly employed measure in educational research, serving as a criterion or as a predictor depending on the research question. Over the decades, researchers have used a variety of reliability coefficients to estimate the reliability of undergraduate GPA, which suggests that there has been no consensus…

  2. Reliabilities of genomic estimated breeding values in Danish Jersey

    DEFF Research Database (Denmark)

    Thomasen, Jørn Rind; Guldbrandtsen, Bernt; Su, Guosheng

    2012-01-01

    In order to optimize the use of genomic selection in breeding plans, it is essential to have reliable estimates of the genomic breeding values. This study investigated reliabilities of direct genomic values (DGVs) in the Jersey population estimated by three different methods. The validation methods...... were (i) fivefold cross-validation and (ii) validation on the most recent 3 years of bulls. The reliability of DGV was assessed using squared correlations between DGV and deregressed proofs (DRPs). In the recent 3-year validation model, estimated reliabilities were also used to assess the reliabilities...... of DGV. The data set consisted of 1003 Danish Jersey bulls with conventional estimated breeding values (EBVs) for 14 different traits included in the Nordic selection index. The bulls were genotyped for Single-nucleotide polymorphism (SNP) markers using the Illumina 54 K chip. A Bayesian method was used...

  3. A Latent Class Approach to Estimating Test-Score Reliability

    Science.gov (United States)

    van der Ark, L. Andries; van der Palm, Daniel W.; Sijtsma, Klaas

    2011-01-01

    This study presents a general framework for single-administration reliability methods, such as Cronbach's alpha, Guttman's lambda-2, and method MS. This general framework was used to derive a new approach to estimating test-score reliability by means of the unrestricted latent class model. This new approach is the latent class reliability…
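
    Two of the single-administration coefficients named here, Cronbach's alpha and Guttman's lambda-2, can be computed directly from an examinee-by-item score matrix, as in the sketch below; the scores are invented, and the latent class reliability coefficient itself is not implemented.

      import numpy as np

      def cronbach_alpha(items):
          """Cronbach's alpha from an (examinees x items) score matrix."""
          X = np.asarray(items, float)
          k = X.shape[1]
          item_var = X.var(axis=0, ddof=1).sum()
          total_var = X.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_var / total_var)

      def guttman_lambda2(items):
          """Guttman's lambda-2 from the same score matrix."""
          X = np.asarray(items, float)
          k = X.shape[1]
          C = np.cov(X, rowvar=False)
          off = C - np.diag(np.diag(C))                       # off-diagonal covariances
          return (off.sum() + np.sqrt(k / (k - 1) * (off ** 2).sum())) / C.sum()

      # Hypothetical 6 examinees x 4 items.
      X = [[3, 4, 3, 5],
           [2, 2, 3, 2],
           [4, 5, 4, 4],
           [1, 2, 1, 2],
           [5, 4, 5, 5],
           [3, 3, 2, 3]]
      print(f"alpha = {cronbach_alpha(X):.3f}, lambda2 = {guttman_lambda2(X):.3f}")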

  4. Evaluation of structural reliability using simulation methods

    Directory of Open Access Journals (Sweden)

    Baballëku Markel

    2015-01-01

    Full Text Available Eurocode describes the 'index of reliability' as a measure of structural reliability, related to the 'probability of failure'. This paper is focused on the assessment of this index for a reinforced concrete bridge pier. It is rare to explicitly use reliability concepts for design of structures, but the problems of structural engineering are better known through them. Some of the main methods for the estimation of the probability of failure are the exact analytical integration, numerical integration, approximate analytical methods and simulation methods. Monte Carlo Simulation is used in this paper, because it offers a very good tool for the estimation of probability in multivariate functions. Complicated probability and statistics problems are solved through computer aided simulations of a large number of tests. The procedures of structural reliability assessment for the bridge pier and the comparison with the partial factor method of the Eurocodes have been demonstrated in this paper.
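
    The Monte Carlo estimate of the failure probability and the corresponding reliability index can be sketched in a few lines. The limit state below (a lognormal resistance minus a normal load effect) and all distribution parameters are assumptions chosen for illustration, not the bridge-pier model analysed in the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)
n = 1_000_000

# Illustrative limit state g = R - S (resistance minus load effect); the
# distributions below are assumptions, not the bridge-pier model of the paper.
R = rng.lognormal(mean=np.log(30.0), sigma=0.10, size=n)   # resistance
S = rng.normal(loc=20.0, scale=3.0, size=n)                # load effect

failures = np.count_nonzero(R - S <= 0.0)
pf = failures / n                                          # probability of failure
beta = -norm.ppf(pf)                                       # reliability index

print(f"P_f  ~= {pf:.2e}")
print(f"beta ~= {beta:.2f}")
```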

  5. Reliability estimation of semi-Markov systems: a case study

    International Nuclear Information System (INIS)

    Ouhbi, Brahim; Limnios, Nikolaos

    1997-01-01

    In this article, we are concerned with the estimation of the reliability and the availability of a turbo-generator rotor using a set of data observed in a real engineering situation, provided by Electricité de France (EDF). The rotor is modeled by a semi-Markov process, which is used to estimate the rotor's reliability and availability. To do this, we present a method for estimating the semi-Markov kernel from censored data.

  6. Secondary dentine as a sole parameter for age estimation: Comparison and reliability of qualitative and quantitative methods among North Western adult Indians

    Directory of Open Access Journals (Sweden)

    Jasbir Arora

    2016-06-01

    Full Text Available The indestructible nature of teeth against most environmental abuses makes them useful in disaster victim identification (DVI). The present study has been undertaken to examine the reliability of Gustafson's qualitative method and Kedici's quantitative method of measuring secondary dentine for age estimation among North Western adult Indians. 196 (M = 85; F = 111) single-rooted teeth were collected from the Department of Oral Health Sciences, PGIMER, Chandigarh. Ground sections were prepared and the amount of secondary dentine formed was scored qualitatively according to Gustafson's 0–3 scoring system (method 1) and quantitatively following Kedici's micrometric measurement method (method 2). Out of 196 teeth, 180 samples (M = 80; F = 100) were found to be suitable for measuring secondary dentine following Kedici's method. Absolute mean error of age was calculated by both methodologies. Results clearly showed that in pooled data, method 1 gave an error of ±10.4 years whereas method 2 exhibited an error of approximately ±13 years. A statistically significant difference was noted in the absolute mean error of age between the two methods of measuring secondary dentine for age estimation. Further, it was also revealed that teeth extracted for periodontal reasons severely decreased the accuracy of Kedici's method; however, the disease had no effect when estimating age by Gustafson's method. No significant gender differences were noted in the absolute mean error of age by either method, which suggests that there is no need to separate data on the basis of gender.

  7. Methodology for uranium resource estimates and reliability

    International Nuclear Information System (INIS)

    Blanchfield, D.M.

    1980-01-01

    The NURE uranium assessment method has evolved from a small group of geologists estimating resources on a few lease blocks, to a national survey involving an interdisciplinary system consisting of the following: (1) geology and geologic analogs; (2) engineering and cost modeling; (3) mathematics and probability theory, psychology, and the elicitation of subjective judgments; and (4) computerized calculations, computer graphics, and data base management. The evolution has been spurred primarily by two objectives: (1) quantification of uncertainty, and (2) elimination of simplifying assumptions. This has resulted in a tremendous data-gathering effort and the involvement of hundreds of technical experts, many in uranium geology, but many from other fields as well. The rationality of the methods is still largely based on the concept of an analog and the observation that the results are reasonable. The reliability, or repeatability, of the assessments is reasonably guaranteed by the series of peer and superior technical reviews which has been formalized under the current methodology. The optimism or pessimism of individual geologists who make the initial assessments is tempered by the review process, resulting in a series of assessments which are a consistent, unbiased reflection of the facts. Despite the many improvements over past methods, several objectives for future development remain, primarily to reduce subjectivity in utilizing factual information in the estimation of endowment, and to improve the recognition of cost uncertainties in the assessment of economic potential. The 1980 NURE assessment methodology will undoubtedly be improved, but the reader is reminded that resource estimates are and always will be a forecast for the future.

  8. Adaptive Response Surface Techniques in Reliability Estimation

    DEFF Research Database (Denmark)

    Enevoldsen, I.; Faber, M. H.; Sørensen, John Dalsgaard

    1993-01-01

    Problems in connection with estimation of the reliability of a component modelled by a limit state function including noise or first-order discontinuities are considered. A gradient-free adaptive response surface algorithm is developed. The algorithm applies second-order polynomial surfaces...

  9. Estimation of the effective heating systems radius as a method of the reliability improving and energy efficiency

    Science.gov (United States)

    Akhmetova, I. G.; Chichirova, N. D.

    2017-11-01

    When conducting an energy survey of a heat supply enterprise operating several boilers located not far from each other, it is advisable to assess the heat supply efficiency of each individual boiler and the possibility of reducing energy consumption across the whole enterprise by switching consumers to a more efficient source and closing inefficient boilers. It is necessary to consider the temporal dynamics of connecting prospective loads and changes in market conditions. To solve this problem, the radius of effective heat supply from a thermal energy source can be calculated. The disadvantage of existing methods is their high complexity and the need to collect large amounts of source data and perform a significant amount of computation. When conducting an energy survey of a heat supply enterprise operating a large number of thermal energy sources, a rapid assessment of the effective heating radius is required. Taking into account the specifics and objectives of an energy survey, the method for calculating the effective heat supply radius used during an energy audit should be based on data that the heat supply organization makes openly available and should minimize effort, while its results should match those obtained by other methods. To determine the efficiency radius of the Kazan heat supply system, the shares of costs for generation and transmission of thermal energy and the capital investment needed to connect new consumers were determined. The results were compared with the values obtained with previously known methods. The suggested express method makes it possible to determine the effective radius of centralized heat supply from heat sources when conducting energy audits with minimum effort and the required accuracy.

  10. Unrecorded Alcohol Consumption: Quantitative Methods of Estimation

    OpenAIRE

    Razvodovsky, Y. E.

    2010-01-01

    unrecorded alcohol; methods of estimation. In this paper we focus on methods for estimating the level of unrecorded alcohol consumption. Present methods of estimation allow only an approximate estimate of the level of unrecorded alcohol consumption. Taking into consideration the extreme importance of such data, further investigation is necessary to improve the reliability of methods for estimating unrecorded alcohol consumption.

  11. Estimation of some stochastic models used in reliability engineering

    International Nuclear Information System (INIS)

    Huovinen, T.

    1989-04-01

    The work aims to study the estimation of some stochastic models used in reliability engineering. In reliability engineering, continuous probability distributions have been used as models for the lifetime of technical components. We consider here the following distributions: exponential, 2-mixture exponential, conditional exponential, Weibull, lognormal and gamma. The maximum likelihood method is used to estimate the distributions from observed data, which may be either complete or censored. We consider models based on homogeneous Poisson processes, such as the gamma-Poisson and lognormal-Poisson models, for the analysis of failure intensity. We also study a beta-binomial model for the analysis of failure probability. The parameters of these three models are estimated by the method of matching moments and, in the case of the gamma-Poisson and beta-binomial models, also by the maximum likelihood method. A great deal of the mathematical and statistical problems that arise in reliability engineering can be solved by utilizing point processes. Here we consider the statistical analysis of non-homogeneous Poisson processes to describe the failure phenomena of a set of components with a Weibull intensity function. We use the method of maximum likelihood to estimate the parameters of the Weibull model. A common cause failure can seriously reduce the reliability of a system. We consider a binomial failure rate (BFR) model as an application of marked point processes for modelling common cause failures in a system. The parameters of the binomial failure rate model are estimated with the maximum likelihood method.
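
    As a small illustration of maximum likelihood estimation from censored lifetest data, the sketch below uses the exponential model, for which the MLE of the failure rate is the number of observed failures divided by the total time on test; the failure and censoring times are invented.

```python
import numpy as np

# Illustrative lifetest data (hours): observed failure times plus right-censored
# survivors; the numbers are invented for demonstration.
failure_times = np.array([120.0, 340.0, 410.0, 675.0, 980.0])
censoring_times = np.array([1000.0, 1000.0, 1000.0])  # still working at test end

# MLE for the exponential model with right censoring:
# lambda_hat = (number of observed failures) / (total time on test).
total_time_on_test = failure_times.sum() + censoring_times.sum()
lam_hat = len(failure_times) / total_time_on_test

t = 500.0
print(f"lambda_hat = {lam_hat:.2e} per hour")
print(f"Estimated reliability at t={t} h: {np.exp(-lam_hat * t):.3f}")
```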

  12. MEASUREMENT: ACCOUNTING FOR RELIABILITY IN PERFORMANCE ESTIMATES.

    Science.gov (United States)

    Waterman, Brian; Sutter, Robert; Burroughs, Thomas; Dunagan, W Claiborne

    2014-01-01

    When evaluating physician performance measures, physician leaders are faced with the quandary of determining whether departures from expected physician performance measurements represent a true signal or random error. This uncertainty impedes the physician leader's ability and confidence to take appropriate performance improvement actions based on physician performance measurements. Incorporating reliability adjustment into physician performance measurement is a valuable way of reducing the impact of random error in the measurements, such as those caused by small sample sizes. Consequently, the physician executive has more confidence that the results represent true performance and is positioned to make better physician performance improvement decisions. Applying reliability adjustment to physician-level performance data is relatively new. As others have noted previously, it's important to keep in mind that reliability adjustment adds significant complexity to the production, interpretation and utilization of results. Furthermore, the methods explored in this case study only scratch the surface of the range of available Bayesian methods that can be used for reliability adjustment; further study is needed to test and compare these methods in practice and to examine important extensions for handling specialty-specific concerns (e.g., average case volumes, which have been shown to be important in cardiac surgery outcomes). Moreover, it's important to note that the provider group average as a basis for shrinkage is one of several possible choices that could be employed in practice and deserves further exploration in future research. With these caveats, our results demonstrate that incorporating reliability adjustment into physician performance measurements is feasible and can notably reduce the incidence of "real" signals relative to what one would expect to see using more traditional approaches. A physician leader who is interested in catalyzing performance improvement
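
    A minimal sketch of the reliability-adjustment idea is shown below: each physician's observed rate is shrunk toward the provider-group average in proportion to its estimated reliability (signal variance over signal-plus-noise variance). The case counts, event counts, and the between-physician variance are invented placeholders; a real application would estimate that variance with a hierarchical (Bayesian) model as discussed above.

```python
import numpy as np

# Illustrative physician-level complication data; the numbers are invented.
cases = np.array([30, 250, 45, 120, 600])          # case volume per physician
events = np.array([3, 20, 2, 15, 42])              # observed complications
observed_rate = events / cases
group_mean = events.sum() / cases.sum()            # provider-group average

# Within-physician (sampling) variance of a rate and an assumed
# between-physician variance (in practice estimated from the data).
within_var = group_mean * (1 - group_mean) / cases
between_var = 0.0004                               # assumed value for illustration

# Reliability = signal / (signal + noise); low-reliability rates are pulled
# toward the group mean before physicians are compared.
reliability = between_var / (between_var + within_var)
adjusted_rate = reliability * observed_rate + (1 - reliability) * group_mean

for n, obs, rel, adj in zip(cases, observed_rate, reliability, adjusted_rate):
    print(f"n={n:4d}  observed={obs:.3f}  reliability={rel:.2f}  adjusted={adj:.3f}")
```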

  13. An integrated approach to estimate storage reliability with initial failures based on E-Bayesian estimates

    International Nuclear Information System (INIS)

    Zhang, Yongjin; Zhao, Ming; Zhang, Shitao; Wang, Jiamei; Zhang, Yanjun

    2017-01-01

    Storage reliability, which measures the ability of products in a dormant state to keep their required functions, is studied in this paper. For certain types of products, storage reliability may not be 100% at the beginning of storage; unlike operational reliability, there may be initial failures that are normally neglected in storage reliability models. In this paper, a new integrated technique is proposed to estimate and predict the storage reliability of products with possible initial failures: a non-parametric measure based on E-Bayesian estimates of current failure probabilities is combined with a parametric measure based on the exponential reliability function. The non-parametric method is used to estimate the number of failed products and the reliability at each testing time, and the parametric method is used to estimate the initial reliability and the failure rate of the stored product. The proposed method takes into consideration that reliability test data of storage products, including items unexamined before and during the storage process, are available for providing more accurate estimates of both the initial failure probability and the storage failure probability. When storage reliability prediction, the main concern in this field, is to be made, the non-parametric estimates of failure numbers can be used in the parametric models for the failure process in storage. For the case of exponential models, the assessment and prediction method for storage reliability is presented in this paper. Finally, a numerical example is given to illustrate the method, and a detailed comparison between the proposed and traditional methods, examining the rationality of the assessment and prediction of storage reliability, is investigated. The results should be useful for planning a storage environment, decision-making concerning the maximum length of storage, and identifying production quality.

  15. Estimation of structural reliability under combined loads

    International Nuclear Information System (INIS)

    Shinozuka, M.; Kako, T.; Hwang, H.; Brown, P.; Reich, M.

    1983-01-01

    For the overall safety evaluation of seismic category I structures subjected to various load combinations, a quantitative measure of the structural reliability in terms of a limit state probability can be conveniently used. For this purpose, the reliability analysis method for dynamic loads, which has recently been developed by the authors, was combined with the existing standard reliability analysis procedure for static and quasi-static loads. The significant parameters that enter into the analysis are: the rate at which each load (dead load, accidental internal pressure, earthquake, etc.) will occur, its duration and intensity. All these parameters are basically random variables for most of the loads to be considered. For dynamic loads, the overall intensity is usually characterized not only by their dynamic components but also by their static components. The structure considered in the present paper is a reinforced concrete containment structure subjected to various static and dynamic loads such as dead loads, accidental pressure, earthquake acceleration, etc. Computations are performed to evaluate the limit state probabilities under each load combination separately and also under all possible combinations of such loads. Indeed, depending on the limit state condition to be specified, these limit state probabilities can indicate which particular load combination provides the dominant contribution to the overall limit state probability. On the other hand, some of the load combinations contribute very little to the overall limit state probability. These observations provide insight into the complex problem of which load combinations must be considered for design, for which limit states and at what level of limit state probabilities. (orig.)

  16. On the reliability of a simple method for scoring phenotypes to estimate heritability: A case study with pupal color in Heliconius erato phyllis, Fabricius 1775 (Lepidoptera, Nymphalidae)

    Directory of Open Access Journals (Sweden)

    Adriano Andrejew Ferreira

    2009-01-01

    Full Text Available In this paper, two methods for assessing the degree of melanization of pupal exuviae from the butterfly Heliconius erato phyllis, Fabricius 1775 (Lepidoptera, Nymphalidae, Heliconiini) are compared. In the first method, which was qualitative, the exuviae were classified by scoring the degree of melanization, whereas in the second method, which was quantitative, the exuviae were classified by optical density followed by analysis with appropriate software. The heritability (h2) of the degree of melanization was estimated by regression and analysis of variance. The estimates of h2 were similar with both methods, indicating that the qualitative method could be particularly suitable for field work. The low estimates obtained for heritability may have resulted from the small sample size (n = 7-18 broods, including the parents) or from the allocation-priority hypothesis, in which pupal color would be a lower-priority trait compared to morphological traits and adequate larval development.
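
    The regression estimate of heritability mentioned above can be sketched as follows: with offspring means regressed on midparent values, the slope estimates the narrow-sense heritability. The simulated scores below are purely illustrative and do not reproduce the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)
true_h2 = 0.4
n_broods = 15

# Simulated melanization scores: offspring track the midparent value in proportion
# to heritability, plus environmental noise (all values invented).
midparent = rng.normal(loc=5.0, scale=1.0, size=n_broods)
offspring_mean = 5.0 + true_h2 * (midparent - 5.0) + rng.normal(scale=0.4, size=n_broods)

# Offspring-on-midparent regression: the slope estimates narrow-sense heritability.
slope, intercept = np.polyfit(midparent, offspring_mean, 1)
print(f"estimated h2 ~= {slope:.2f} (true value used in simulation: {true_h2})")
```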

  17. Modelling and estimating degradation processes with application in structural reliability

    International Nuclear Information System (INIS)

    Chiquet, J.

    2007-06-01

    The characteristic level of degradation of a given structure is modeled through a stochastic process called the degradation process. The random evolution of the degradation process is governed by a differential system with a Markovian environment. We set up the associated reliability framework by considering the structure to have failed once the degradation process reaches a critical threshold. A closed-form solution of the reliability function is obtained thanks to Markov renewal theory. Then, we build an estimation methodology for the parameters of the stochastic processes involved. The estimation methods and the theoretical results, as well as the associated numerical algorithms, are validated on simulated data sets. Our method is applied to the modelling of a real degradation mechanism, known as crack growth, for which an experimental data set is considered. (authors)

  18. Lower bounds to the reliabilities of factor score estimators

    NARCIS (Netherlands)

    Hessen, D.J.

    2017-01-01

    Under the general common factor model, the reliabilities of factor score estimators might be of more interest than the reliability of the total score (the unweighted sum of item scores). In this paper, lower bounds to the reliabilities of Thurstone’s factor score estimators, Bartlett’s factor score

  19. Application of subset simulation in reliability estimation of underground pipelines

    International Nuclear Information System (INIS)

    Tee, Kong Fah; Khan, Lutfor Rahman; Li, Hongshuang

    2014-01-01

    This paper presents a computational framework for implementing an advanced Monte Carlo simulation method, called Subset Simulation (SS), for time-dependent reliability prediction of underground flexible pipelines. SS can provide better resolution for the low failure probability levels of rare failure events which are commonly encountered in pipeline engineering applications. Random samples of the statistical variables are generated efficiently and used for computing the probabilistic reliability model. It gains its efficiency by expressing a small probability event as a product of a sequence of intermediate events with larger conditional probabilities. The efficiency of SS has been demonstrated by numerical studies, and attention in this work is devoted to scrutinising the robustness of the SS application in pipe reliability assessment compared with the direct Monte Carlo simulation (MCS) method. The reliability of a buried flexible steel pipe with time-dependent failure modes, namely corrosion-induced deflection, buckling, wall thrust and bending stress, has been assessed in this study. The analysis indicates that corrosion-induced excessive deflection is the most critical failure event whereas buckling is the least susceptible during the whole service life of the pipe. The study also shows that SS is a robust method to estimate the reliability of buried pipelines and is more efficient than MCS, especially in small failure probability prediction.
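
    A compact sketch of Subset Simulation in standard normal space is given below: each level keeps the p0 fraction of samples with the smallest limit-state values and regenerates the population with component-wise Metropolis chains conditioned on the current intermediate event. The two-variable limit-state function is a simple stand-in with a known answer (exact P_f = Φ(−3) ≈ 1.35e−3), not the time-dependent pipe limit states of the paper.

```python
import numpy as np

def g(u):
    """Illustrative limit-state function in standard normal space (failure: g <= 0);
    a stand-in for the pipe limit states, with exact P_f = Phi(-3)."""
    return 3.0 - (u[..., 0] + u[..., 1]) / np.sqrt(2.0)

def subset_simulation(g, dim=2, n=2000, p0=0.1, seed=0, max_levels=20):
    rng = np.random.default_rng(seed)
    u = rng.standard_normal((n, dim))
    gu = g(u)
    pf, n_seed, chain_len = 1.0, int(n * p0), int(round(1.0 / p0))
    for _ in range(max_levels):
        order = np.argsort(gu)
        b = gu[order[n_seed - 1]]                    # intermediate threshold
        if b <= 0.0:                                 # last level: count true failures
            return pf * np.mean(gu <= 0.0)
        pf *= p0
        seeds, g_seeds = u[order[:n_seed]], gu[order[:n_seed]]
        new_u, new_g = [], []
        for s, gs in zip(seeds, g_seeds):
            cur, g_cur = s.copy(), gs
            for _ in range(chain_len):               # modified Metropolis chain
                cand = cur.copy()
                for j in range(dim):                 # component-wise proposal
                    prop = cur[j] + rng.uniform(-1.0, 1.0)
                    if rng.random() < np.exp(0.5 * (cur[j] ** 2 - prop ** 2)):
                        cand[j] = prop
                g_cand = g(cand[None, :])[0]
                if g_cand <= b:                      # stay inside the intermediate event
                    cur, g_cur = cand, g_cand
                new_u.append(cur.copy())
                new_g.append(g_cur)
        u, gu = np.asarray(new_u), np.asarray(new_g)
    return pf * np.mean(gu <= 0.0)

print(f"Subset simulation: P_f ~= {subset_simulation(g):.2e} (exact: 1.35e-03)")
```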

  20. Computer Model to Estimate Reliability Engineering for Air Conditioning Systems

    International Nuclear Information System (INIS)

    Afrah Al-Bossly, A.; El-Berry, A.; El-Berry, A.

    2012-01-01

    Reliability engineering is used to predict the performance and optimize the design and maintenance of air conditioning systems. Air conditioning systems are exposed to a number of failures. The failures of an air conditioner, such as failure to turn on, loss of cooling capacity, reduced output temperatures, loss of cool air supply and loss of air flow entirely, can be due to a variety of problems with one or more components of an air conditioner or air conditioning system. Forecasting system failure rates is very important for maintenance. This paper focuses on the reliability of air conditioning systems using statistical distributions that are commonly applied in reliability settings: the standard (2-parameter) Weibull and Gamma distributions. After the distribution parameters had been estimated, reliability estimations and predictions were used for evaluations. To evaluate good operating condition in a building, the reliability of the air conditioning system that supplies conditioned air to the company's several departments was assessed. This air conditioning system is divided into two parts, namely the main chilled-water system and the ten air handling systems that serve the ten departments. In a chilled-water system the air conditioner cools water down to 40-45 degree F (4-7 degree C). The chilled water is distributed throughout the building in a piping system and connected to air-conditioning cooling units wherever needed. Data analysis was done with the support of computer-aided reliability software; the Weibull and Gamma distributions indicated that the reliability of the systems equals 86.012% and 77.7%, respectively. A comparison between the two important families of distribution functions, namely the Weibull and Gamma families, was studied, and it was found that the Weibull method performed better for decision making.
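
    Fitting a two-parameter Weibull model to failure data and reading off the reliability at a given operating time can be sketched as below; the times-to-failure are invented and the use of scipy is an implementation choice, not the software referred to in the abstract.

```python
import numpy as np
from scipy.stats import weibull_min

# Illustrative times-to-failure (hours) for an air-handling unit; values are invented.
ttf = np.array([820., 1150., 1480., 1760., 2100., 2340., 2900., 3350., 4100., 5200.])

# Fit a 2-parameter Weibull distribution (location fixed at zero).
shape, loc, scale = weibull_min.fit(ttf, floc=0.0)

t = 1000.0  # evaluate reliability at 1000 operating hours
reliability = weibull_min.sf(t, shape, loc=loc, scale=scale)
print(f"shape (beta) = {shape:.2f}, scale (eta) = {scale:.0f} h")
print(f"R({t:.0f} h) = {reliability:.3f}")
```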

  1. Lower Bounds to the Reliabilities of Factor Score Estimators.

    Science.gov (United States)

    Hessen, David J

    2016-10-06

    Under the general common factor model, the reliabilities of factor score estimators might be of more interest than the reliability of the total score (the unweighted sum of item scores). In this paper, lower bounds to the reliabilities of Thurstone's factor score estimators, Bartlett's factor score estimators, and McDonald's factor score estimators are derived and conditions are given under which these lower bounds are equal. The relative performance of the derived lower bounds is studied using classic example data sets. The results show that estimates of the lower bounds to the reliabilities of Thurstone's factor score estimators are greater than or equal to the estimates of the lower bounds to the reliabilities of Bartlett's and McDonald's factor score estimators.

  2. Electrical estimating methods

    CERN Document Server

    Del Pico, Wayne J

    2014-01-01

    Simplify the estimating process with the latest data, materials, and practices Electrical Estimating Methods, Fourth Edition is a comprehensive guide to estimating electrical costs, with data provided by leading construction database RS Means. The book covers the materials and processes encountered by the modern contractor, and provides all the information professionals need to make the most precise estimate. The fourth edition has been updated to reflect the changing materials, techniques, and practices in the field, and provides the most recent Means cost data available. The complexity of el

  3. Application of reliability methods in Ontario Hydro

    International Nuclear Information System (INIS)

    Jeppesen, R.; Ravishankar, T.J.

    1985-01-01

    Ontario Hydro have established a reliability program in support of its substantial nuclear program. Application of the reliability program to achieve both production and safety goals is described. The value of such a reliability program is evident in the record of Ontario Hydro's operating nuclear stations. The factors which have contributed to the success of the reliability program are identified as line management's commitment to reliability; selective and judicious application of reliability methods; establishing performance goals and monitoring the in-service performance; and collection, distribution, review and utilization of performance information to facilitate cost-effective achievement of goals and improvements. (orig.)

  4. Estimating the Confidence Interval of Composite Reliability of a Multidimensional Test With the Delta Method

    Institute of Scientific and Technical Information of China (English)

    Ye, Baojuan; Wen, Zhonglin

    2012-01-01

    Reliability is very important in evaluating the quality of a test. Based on confirmatory factor analysis, composite reliability is a good index to estimate the test reliability for general applications. As is well known, a point estimate contains limited information about a population parameter and cannot indicate how far it can be from the population parameter. The confidence interval of the parameter can provide more information. In evaluating the quality of a test, the confidence interval of composite reliability has received attention in recent years. There are three approaches to estimating the confidence interval of composite reliability of a unidimensional test: the Bootstrap method, the Delta method, and the direct use of the standard error from a software output (e.g., LISREL). The Bootstrap method provides empirical results for the standard error, and is the most credible method. But it needs data simulation techniques, and its computation process is rather complex. The Delta method computes the standard error of composite reliability by approximate calculation. It is simpler than the Bootstrap method. The LISREL software can directly prompt the standard error, and it is the easiest among the three methods. By simulation study, it had been found that the interval estimates obtained by the Delta method and the Bootstrap method were almost identical, whereas the results obtained by LISREL and by the Bootstrap method were substantially different (Ye & Wen, 2011). The Delta method is recommended when the confidence interval of composite reliability of a unidimensional test is estimated, because the Delta method is simpler than the Bootstrap method. There was little research about how to compute the confidence interval of composite reliability of a multidimensional test. We deduced a formula by using the Delta method for computing the standard error of composite reliability of a multidimensional test. Based on the standard error, the
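
    For the unidimensional case discussed above, a sketch of the Delta-method interval is given below: composite reliability is (Σλ)² / ((Σλ)² + Σθ), its gradient with respect to the loadings and error variances is formed analytically, and the standard error follows from the covariance matrix of the parameter estimates. The loadings, error variances, and the (diagonal) covariance matrix are invented placeholders; in practice the full covariance matrix would come from the SEM output, and the multidimensional extension derived in the paper is not reproduced here.

```python
import numpy as np
from scipy.stats import norm

def composite_reliability(lam, theta):
    """Composite (omega) reliability for a unidimensional congeneric factor model:
    (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    L, T = np.sum(lam), np.sum(theta)
    return L ** 2 / (L ** 2 + T)

# Illustrative CFA estimates (standardised loadings and error variances) with an
# assumed diagonal covariance matrix of the estimates.
lam = np.array([0.75, 0.70, 0.65, 0.80])
theta = 1.0 - lam ** 2
se = np.concatenate([np.full(4, 0.04), np.full(4, 0.05)])
acov = np.diag(se ** 2)

# Delta method: gradient of omega with respect to (lam, theta).
L, T = lam.sum(), theta.sum()
d_lam = np.full(4, 2 * L * T / (L ** 2 + T) ** 2)
d_theta = np.full(4, -L ** 2 / (L ** 2 + T) ** 2)
grad = np.concatenate([d_lam, d_theta])

omega = composite_reliability(lam, theta)
se_omega = np.sqrt(grad @ acov @ grad)
z = norm.ppf(0.975)
print(f"omega = {omega:.3f}, 95% CI = [{omega - z*se_omega:.3f}, {omega + z*se_omega:.3f}]")
```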

  5. Structural reliability methods: Code development status

    Science.gov (United States)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state of the art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module NESSUS/FEM is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module NESSUS/FPI estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.

  6. Generating human reliability estimates using expert judgment. Volume 2. Appendices

    International Nuclear Information System (INIS)

    Comer, M.K.; Seaver, D.A.; Stillwell, W.G.; Gaddy, C.D.

    1984-11-01

    The US Nuclear Regulatory Commission is conducting a research program to determine the practicality, acceptability, and usefulness of several different methods for obtaining human reliability data and estimates that can be used in nuclear power plant probabilistic risk assessments (PRA). One method, investigated as part of this overall research program, uses expert judgment to generate human error probability (HEP) estimates and associated uncertainty bounds. The project described in this document evaluated two techniques for using expert judgment: paired comparisons and direct numerical estimation. Volume 2 provides detailed procedures for using the techniques, detailed descriptions of the analyses performed to evaluate the techniques, and HEP estimates generated as part of this project. The results of the evaluation indicate that techniques using expert judgment should be given strong consideration for use in developing HEP estimates. Judgments were shown to be consistent and to provide HEP estimates with a good degree of convergent validity. Of the two techniques tested, direct numerical estimation appears to be preferable in terms of ease of application and quality of results
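
    One common convention for combining direct numerical estimates, namely aggregating on a log scale and deriving uncertainty bounds from the spread of judgments, is sketched below. The expert values are invented, and this is only a hedged illustration of the general idea, not the elicitation and aggregation procedure evaluated in the report.

```python
import numpy as np

# Illustrative direct numerical estimates of one human error probability (HEP)
# from six experts; the values are invented.
expert_heps = np.array([3e-3, 1e-2, 5e-3, 2e-3, 8e-3, 4e-3])

log_heps = np.log10(expert_heps)
hep_agg = 10 ** log_heps.mean()                 # geometric mean as the point estimate
sigma = log_heps.std(ddof=1)                    # spread of judgments on the log scale

# Approximate 5th/95th percentile uncertainty bounds assuming a lognormal spread.
lower = 10 ** (log_heps.mean() - 1.645 * sigma)
upper = 10 ** (log_heps.mean() + 1.645 * sigma)
print(f"aggregated HEP = {hep_agg:.1e}  (5th-95th bounds: {lower:.1e} - {upper:.1e})")
```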

  7. Influences on and Limitations of Classical Test Theory Reliability Estimates.

    Science.gov (United States)

    Arnold, Margery E.

    It is incorrect to say "the test is reliable" because reliability is a function not only of the test itself, but of many factors. The present paper explains how different factors affect classical reliability estimates such as test-retest, interrater, internal consistency, and equivalent forms coefficients. Furthermore, the limits of classical test…

  8. Statistical estimation Monte Carlo for unreliability evaluation of highly reliable system

    International Nuclear Information System (INIS)

    Xiao Gang; Su Guanghui; Jia Dounan; Li Tianduo

    2000-01-01

    Based on analog Monte Carlo simulation, statistical estimation Monte Carlo methods for the unreliability evaluation of highly reliable systems are constructed, including a direct statistical estimation Monte Carlo method and a weighted statistical estimation Monte Carlo method. The basal element is given, and the statistical estimation Monte Carlo estimators are derived. The direct Monte Carlo simulation method, the bounding-sampling method, the forced transitions Monte Carlo method, direct statistical estimation Monte Carlo and weighted statistical estimation Monte Carlo are used to evaluate the unreliability of the same system. By comparison, the weighted statistical estimation Monte Carlo estimator has the smallest variance and the highest calculation efficiency.

  9. Estimation of the Reliability of Plastic Slabs

    DEFF Research Database (Denmark)

    Pirzada, G. B. : Ph.D.

    In this thesis, work related to fundamental conditions has been extended to non-fundamental or the general case of probabilistic analysis. Finally, using the β-unzipping technique a door has been opened to system reliability analysis of plastic slabs. An attempt has been made in this thesis...... to give a probabilistic treatment of plastic slabs which is parallel to the deterministic and systematic treatment of plastic slabs by Nielsen (3). The fundamental reason is that in Nielsen (3) the treatment is based on a deterministic modelling of the basic material properties for the reinforced...

  10. Neglect Of Parameter Estimation Uncertainty Can Significantly Overestimate Structural Reliability

    Directory of Open Access Journals (Sweden)

    Rózsás Árpád

    2015-12-01

    Full Text Available Parameter estimation uncertainty is often neglected in reliability studies, i.e. point estimates of distribution parameters are used for representative fractiles, and in probabilistic models. A numerical example examines the effect of this uncertainty on structural reliability using Bayesian statistics. The study reveals that the neglect of parameter estimation uncertainty might lead to an order of magnitude underestimation of failure probability.
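
    The effect described above can be reproduced with a toy example: a small sample is used either by plugging point estimates into a normal model or by using the posterior-predictive (Student-t) distribution that carries the parameter estimation uncertainty under a noninformative prior. The resistance value, sample size, and distributions are assumptions for illustration, not the paper's numerical example.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Small sample of an annual maximum load; the true distribution is only partly known.
n = 20
sample = rng.normal(loc=100.0, scale=15.0, size=n)
xbar, s = sample.mean(), sample.std(ddof=1)

r = 170.0  # deterministic resistance (assumed)

# (a) Point-estimate approach: plug the estimated parameters into a normal model.
pf_point = stats.norm.sf(r, loc=xbar, scale=s)

# (b) Accounting for parameter estimation uncertainty: the posterior-predictive
#     distribution under a noninformative prior is a scaled Student-t.
pf_pred = stats.t.sf((r - xbar) / (s * np.sqrt(1.0 + 1.0 / n)), df=n - 1)

print(f"P_f with point estimates       : {pf_point:.2e}")
print(f"P_f with parameter uncertainty : {pf_pred:.2e}")
```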

  11. Basics of Bayesian reliability estimation from attribute test data

    International Nuclear Information System (INIS)

    Martz, H.F. Jr.; Waller, R.A.

    1975-10-01

    The basic notions of Bayesian reliability estimation from attribute lifetest data are presented in an introductory and expository manner. Both Bayesian point and interval estimates of the probability of surviving the lifetest, the reliability, are discussed. The necessary formulas are simply stated, and examples are given to illustrate their use. In particular, a binomial model in conjunction with a beta prior model is considered. Particular attention is given to the procedure for selecting an appropriate prior model in practice. Empirical Bayes point and interval estimates of reliability are discussed and examples are given. 7 figures, 2 tables
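
    A minimal sketch of the binomial/beta machinery is shown below: with a Beta(a0, b0) prior and s survivors out of n units on an attribute lifetest, the posterior for the reliability is Beta(a0 + s, b0 + n − s), from which point estimates and credible bounds follow. The prior and the test outcome are invented for illustration.

```python
from scipy import stats

# Attribute (pass/fail) lifetest: n units tested, s survive; beta prior on reliability.
n, s = 30, 28
a0, b0 = 1.0, 1.0          # uniform Beta(1,1) prior (an assumption for illustration)

a_post, b_post = a0 + s, b0 + (n - s)
posterior = stats.beta(a_post, b_post)

point = posterior.mean()                     # Bayesian point estimate of reliability
lower = posterior.ppf(0.05)                  # one-sided 95% lower credible bound
print(f"posterior mean reliability = {point:.3f}")
print(f"95% lower credible bound   = {lower:.3f}")
```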

  12. A novel reliability evaluation method for large engineering systems

    Directory of Open Access Journals (Sweden)

    Reda Farag

    2016-06-01

    Full Text Available A novel reliability evaluation method for large nonlinear engineering systems excited by dynamic loading applied in the time domain is presented. For this class of problems, the performance functions are expected to be functions of time and implicit in nature. The available first- or second-order reliability methods (FORM/SORM) find it challenging to estimate the reliability of such systems. Because of its inefficiency, the classical Monte Carlo simulation (MCS) method also cannot be used for large nonlinear dynamic systems. In the proposed approach, only tens instead of hundreds or thousands of deterministic evaluations at intelligently selected points are used to extract the reliability information. A hybrid approach, consisting of the stochastic finite element method (SFEM) developed by the author and his research team using FORM, the response surface method (RSM), an interpolation scheme, and advanced factorial schemes, is proposed. The method is clarified with the help of several numerical examples.

  13. IRT-Estimated Reliability for Tests Containing Mixed Item Formats

    Science.gov (United States)

    Shu, Lianghua; Schwarz, Richard D.

    2014-01-01

    As a global measure of precision, item response theory (IRT) estimated reliability is derived for four coefficients (Cronbach's α, Feldt-Raju, stratified α, and marginal reliability). Models with different underlying assumptions concerning test-part similarity are discussed. A detailed computational example is presented for the targeted…

  14. Processes and Procedures for Estimating Score Reliability and Precision

    Science.gov (United States)

    Bardhoshi, Gerta; Erford, Bradley T.

    2017-01-01

    Precision is a key facet of test development, with score reliability determined primarily according to the types of error one wants to approximate and demonstrate. This article identifies and discusses several primary forms of reliability estimation: internal consistency (i.e., split-half, KR-20, α), test-retest, alternate forms, interscorer, and…

  15. On estimation of reliability for pipe lines of heat power plants under cyclic loading

    International Nuclear Information System (INIS)

    Verezemskij, V.G.

    1986-01-01

    One of the possible methods to obtain a quantitative estimate of the reliability of pipe lines of welded heat power plants under cyclic loading, due to heating-cooling and due to vibration, is considered. The reliability estimate is carried out for a common case of loading by simultaneous cycles with different amplitudes and loading asymmetry. It is shown that scatter in the number of cycles to failure for the weld metal may perceptibly decrease the reliability of the welded pipe line.

  16. User's guide to the Reliability Estimation System Testbed (REST)

    Science.gov (United States)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.

  17. Statistical Bayesian method for reliability evaluation based on ADT data

    Science.gov (United States)

    Lu, Dawei; Wang, Lizhi; Sun, Yusheng; Wang, Xiaohong

    2018-05-01

    Accelerated degradation testing (ADT) is frequently conducted in the laboratory to predict products' reliability under normal operating conditions. Two kinds of methods, degradation path models and stochastic process models, are utilized to analyze degradation data, and the latter is the more popular. However, some limitations, such as an imprecise solution process and imprecise estimation of the degradation ratio, still exist, which may affect the accuracy of the acceleration model and the extrapolated value. Moreover, the usual solution to this problem, the Bayesian method, loses key information when unifying the degradation data. In this paper, a new data processing and parameter inference method based on the Bayesian method is proposed to handle degradation data and solve the problems above. First, a Wiener process and an acceleration model are chosen; second, the initial values of the degradation model and the parameters of the prior and posterior distributions under each level are calculated with updating and iteration of the estimated values; third, the lifetime and reliability values are estimated on the basis of the estimated parameters; finally, a case study is provided to demonstrate the validity of the proposed method. The results illustrate that the proposed method is quite effective and accurate in estimating the lifetime and reliability of a product.
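
    A sketch of the Wiener-process part of such an analysis is given below: drift and diffusion are estimated by maximum likelihood from degradation increments, and the reliability at time t is the survival function of the inverse Gaussian first-passage-time distribution to the failure threshold. The simulated increments, threshold, and parameter values are assumptions for illustration; the Bayesian updating across stress levels proposed in the paper is not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Simulated degradation increments at one stress level: X(t) = drift*t + sigma*W(t).
# True parameter values and the failure threshold D are assumptions for illustration.
true_drift, true_sigma, D = 0.5, 0.8, 150.0
dt = np.full(200, 1.0)                                    # measurement interval (hours)
dx = rng.normal(true_drift * dt, true_sigma * np.sqrt(dt))

# Maximum likelihood estimates for the Wiener degradation model.
drift_hat = dx.sum() / dt.sum()
sigma2_hat = np.mean((dx - drift_hat * dt) ** 2 / dt)

# The first-passage time to the threshold D is inverse Gaussian distributed;
# reliability R(t) is its survival function.
scale = D ** 2 / sigma2_hat                               # shape parameter lambda
mu_param = (D / drift_hat) / scale
t = 250.0
print(f"drift_hat = {drift_hat:.3f}, sigma_hat = {np.sqrt(sigma2_hat):.3f}")
print(f"R({t:.0f} h) = {stats.invgauss.sf(t, mu_param, scale=scale):.3f}")
```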

  18. Fatigue damage estimation in non-linear systems using a combination of Monte Carlo simulation and the First Order Reliability Method

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher

    2015-01-01

    For non-linear systems the estimation of fatigue damage under stochastic loadings can be rather time-consuming. Usually Monte Carlo simulation (MCS) is applied, but the coefficient-of-variation (COV) can be large if only a small set of simulations can be done due to otherwise excessive CPU time...

  19. Assessment of reliability of Greulich and Pyle (gp) method for ...

    African Journals Online (AJOL)

    Background: Greulich and Pyle standards are the most widely used age estimation standards all over the world. The applicability of the Greulich and Pyle standards to populations which differ from their reference population is often questioned. This study aimed to assess the reliability of Greulich and Pyle (GP) method for ...

  20. An overview of reliability methods in mechanical and structural design

    Science.gov (United States)

    Wirsching, P. H.; Ortiz, K.; Lee, S. J.

    1987-01-01

    An evaluation is made of modern methods of fast probability integration and Monte Carlo treatment for the assessment of structural systems' and components' reliability. Fast probability integration methods are noted to be more efficient than Monte Carlo ones. This is judged to be an important consideration when several point probability estimates must be made in order to construct a distribution function. An example illustrating the relative efficiency of the various methods is included.
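
    A fast-probability-integration example in the FORM spirit is sketched below using the Hasofer-Lind / Rackwitz-Fiessler iteration on a simple two-variable test function (for which the reliability index is 2.5); the numerical gradient and the test function are illustrative choices, not the example used in the cited overview.

```python
import numpy as np
from scipy.stats import norm

def g(u):
    """Illustrative nonlinear limit state in standard normal space (failure: g <= 0)."""
    return 0.1 * (u[0] - u[1]) ** 2 - (u[0] + u[1]) / np.sqrt(2.0) + 2.5

def grad(f, u, h=1e-6):
    """Central-difference gradient."""
    return np.array([(f(u + h * e) - f(u - h * e)) / (2 * h) for e in np.eye(len(u))])

# Hasofer-Lind / Rackwitz-Fiessler iteration for the design point.
u = np.zeros(2)
for _ in range(50):
    gu, dg = g(u), grad(g, u)
    u_new = (dg @ u - gu) / (dg @ dg) * dg
    if np.linalg.norm(u_new - u) < 1e-8:
        u = u_new
        break
    u = u_new

beta = np.linalg.norm(u)
print(f"beta ~= {beta:.3f},  FORM estimate P_f ~= {norm.cdf(-beta):.2e}")
```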

  1. An efficient reliable method to estimate the vaporization enthalpy of pure substances according to the normal boiling temperature and critical properties

    Directory of Open Access Journals (Sweden)

    Babak Mehmandoust

    2014-03-01

    Full Text Available The heat of vaporization of a pure substance at its normal boiling temperature is a very important property in many chemical processes. In this work, a new empirical method was developed to predict the vaporization enthalpy of pure substances. This equation is a function of normal boiling temperature, critical temperature, and critical pressure. The presented model is simple to use and provides an improvement over the existing equations for 452 pure substances in a wide boiling range. The results showed that the proposed correlation is more accurate than the literature methods for pure substances in a wide boiling range (20.3–722 K).
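
    The abstract does not reproduce the coefficients of the proposed correlation, so the sketch below instead evaluates the classical Riedel correlation, which uses the same three inputs (normal boiling temperature, critical temperature, and critical pressure); the property values for the two example fluids are approximate literature figures.

```python
import math

R_GAS = 8.314  # J/(mol K)

def riedel_dhvap(tb_k: float, tc_k: float, pc_bar: float) -> float:
    """Enthalpy of vaporization at the normal boiling point (J/mol) from the
    classical Riedel correlation (not the new correlation of the paper)."""
    tbr = tb_k / tc_k
    return 1.093 * R_GAS * tc_k * tbr * (math.log(pc_bar) - 1.013) / (0.930 - tbr)

# Water: Tb = 373.15 K, Tc = 647.1 K, Pc = 220.6 bar (experimental dHvap ~ 40.7 kJ/mol)
print(f"water : {riedel_dhvap(373.15, 647.1, 220.6) / 1000:.1f} kJ/mol")
# n-hexane: Tb = 341.9 K, Tc = 507.6 K, Pc = 30.25 bar (experimental ~ 28.9 kJ/mol)
print(f"hexane: {riedel_dhvap(341.9, 507.6, 30.25) / 1000:.1f} kJ/mol")
```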

  2. An efficient reliable method to estimate the vaporization enthalpy of pure substances according to the normal boiling temperature and critical properties.

    Science.gov (United States)

    Mehmandoust, Babak; Sanjari, Ehsan; Vatani, Mostafa

    2014-03-01

    The heat of vaporization of a pure substance at its normal boiling temperature is a very important property in many chemical processes. In this work, a new empirical method was developed to predict the vaporization enthalpy of pure substances. This equation is a function of normal boiling temperature, critical temperature, and critical pressure. The presented model is simple to use and provides an improvement over the existing equations for 452 pure substances in a wide boiling range. The results showed that the proposed correlation is more accurate than the literature methods for pure substances in a wide boiling range (20.3-722 K).

  3. DEVELOPMENT OF ESTIMATION METHODS OF ORGANIZATIONAL AND TECHNOLOGICAL RELIABILITY LEVEL OF BUILDINGS AND STRUCTURES IN PROJECTS OF BIOSPHERE-SUPPORTING CONSTRUCTION

    Directory of Open Access Journals (Sweden)

    CHERNYSHEV D. О.

    2017-03-01

    Full Text Available Summary. The article is devoted to the search for advanced analytical tools and methodical-algorithmic techniques for organizational-technological and stochastic evaluation and for overcoming risks and threats during the implementation of biosphere-supporting construction projects. The expediency of applying the theory and methods of wavelet analysis in the study of non-stationary stochastic oscillations of complex spatial structures is substantiated by the need for more accurate prediction of their dynamic behavior and identification of the structures' characteristics in the frequency-time space.

  4. An efficient reliable method to estimate the vaporization enthalpy of pure substances according to the normal boiling temperature and critical properties

    OpenAIRE

    Mehmandoust, Babak; Sanjari, Ehsan; Vatani, Mostafa

    2014-01-01

    The heat of vaporization of a pure substance at its normal boiling temperature is a very important property in many chemical processes. In this work, a new empirical method was developed to predict vaporization enthalpy of pure substances. This equation is a function of normal boiling temperature, critical temperature, and critical pressure. The presented model is simple to use and provides an improvement over the existing equations for 452 pure substances in wide boiling range. The results s...

  5. Case Study: Zutphen : Estimates of levee system reliability

    NARCIS (Netherlands)

    Roscoe, K.; Kothuis, Baukje; Kok, Matthijs

    2017-01-01

    Estimates of levee system reliability can conflict with experience and intuition. For example, a very high failure probability may be computed while no evidence of failure has been observed, or a very low failure probability when signs of failure have been detected.

  6. Reliability and risk analysis methods research plan

    International Nuclear Information System (INIS)

    1984-10-01

    This document presents a plan for reliability and risk analysis methods research to be performed mainly by the Reactor Risk Branch (RRB), Division of Risk Analysis and Operations (DRAO), Office of Nuclear Regulatory Research. It includes those activities of other DRAO branches which are very closely related to those of the RRB. Related or interfacing programs of other divisions, offices and organizations are merely indicated. The primary use of this document is envisioned as an NRC working document, covering about a 3-year period, to foster better coordination in reliability and risk analysis methods development between the offices of Nuclear Regulatory Research and Nuclear Reactor Regulation. It will also serve as an information source for contractors and others to more clearly understand the objectives, needs, programmatic activities and interfaces together with the overall logical structure of the program

  7. Sample size planning for composite reliability coefficients: accuracy in parameter estimation via narrow confidence intervals.

    Science.gov (United States)

    Terry, Leann; Kelley, Ken

    2012-11-01

    Composite measures play an important role in psychology and related disciplines. Composite measures almost always have error. Correspondingly, it is important to understand the reliability of the scores from any particular composite measure. However, the point estimates of the reliability of composite measures are fallible and thus all such point estimates should be accompanied by a confidence interval. When confidence intervals are wide, there is much uncertainty in the population value of the reliability coefficient. Given the importance of reporting confidence intervals for estimates of reliability, coupled with the undesirability of wide confidence intervals, we develop methods that allow researchers to plan sample size in order to obtain narrow confidence intervals for population reliability coefficients. We first discuss composite reliability coefficients and then provide a discussion on confidence interval formation for the corresponding population value. Using the accuracy in parameter estimation approach, we develop two methods to obtain accurate estimates of reliability by planning sample size. The first method provides a way to plan sample size so that the expected confidence interval width for the population reliability coefficient is sufficiently narrow. The second method ensures that the confidence interval width will be sufficiently narrow with some desired degree of assurance (e.g., 99% assurance that the 95% confidence interval for the population reliability coefficient will be less than W units wide). The effectiveness of our methods was verified with Monte Carlo simulation studies. We demonstrate how to easily implement the methods with easy-to-use and freely available software. ©2011 The British Psychological Society.

  8. Reliability Estimation for Single-unit Ceramic Crown Restorations

    Science.gov (United States)

    Lekesiz, H.

    2014-01-01

    The objective of this study was to evaluate the potential of a survival prediction method for the assessment of ceramic dental restorations. For this purpose, fast-fracture and fatigue reliabilities for 2 bilayer (metal ceramic alloy core veneered with fluorapatite leucite glass-ceramic, d.Sign/d.Sign-67, by Ivoclar; glass-infiltrated alumina core veneered with feldspathic porcelain, VM7/In-Ceram Alumina, by Vita) and 3 monolithic (leucite-reinforced glass-ceramic, Empress, and ProCAD, by Ivoclar; lithium-disilicate glass-ceramic, Empress 2, by Ivoclar) single posterior crown restorations were predicted, and fatigue predictions were compared with the long-term clinical data presented in the literature. Both perfectly bonded and completely debonded cases were analyzed for evaluation of the influence of the adhesive/restoration bonding quality on estimations. Material constants and stress distributions required for predictions were calculated from biaxial tests and finite element analysis, respectively. Based on the predictions, In-Ceram Alumina presents the best fast-fracture resistance, and ProCAD presents a comparable resistance for perfect bonding; however, ProCAD shows a significant reduction of resistance in case of complete debonding. Nevertheless, it is still better than Empress and comparable with Empress 2. In-Ceram Alumina and d.Sign have the highest long-term reliability, with almost 100% survivability even after 10 years. When compared with clinical failure rates reported in the literature, predictions show a promising match with clinical data, and this indicates the soundness of the settings used in the proposed predictions. PMID:25048249

  9. Reliability of Bluetooth Technology for Travel Time Estimation

    DEFF Research Database (Denmark)

    Araghi, Bahar Namaki; Olesen, Jonas Hammershøj; Krishnan, Rajesh

    2015-01-01

    . However, their corresponding impacts on accuracy and reliability of estimated travel time have not been evaluated. In this study, a controlled field experiment is conducted to collect both Bluetooth and GPS data for 1000 trips to be used as the basis for evaluation. Data obtained by GPS logger is used...... to calculate actual travel time, referred to as ground truth, and to geo-code the Bluetooth detection events. In this setting, reliability is defined as the percentage of devices captured per trip during the experiment. It is found that, on average, Bluetooth-enabled devices will be detected 80% of the time......-range antennae detect Bluetooth-enabled devices in a closer location to the sensor, thus providing a more accurate travel time estimate. However, the smaller the size of the detection zone, the lower the penetration rate, which could itself influence the accuracy of estimates. Therefore, there has to be a trade...

  10. 1/f noise as a reliability estimation for solar panels

    Science.gov (United States)

    Alabedra, R.; Orsal, B.

    The purpose of this work is a study of the 1/f noise from a forward-biased dark solar cell as a nondestructive reliability estimator for solar panels. It is shown that one cell with a given defect can be detected in a solar panel by low-frequency noise measurements in darkness. One real solar panel of 5 cells in parallel and 5 cells in series is tested by this method. The cells, for space application, are n(+)p monocrystalline silicon junctions with an area of 8 sq cm and a base resistivity of 10 ohm-cm. In the first part of this paper the I-V, Rd=f(I) characteristics of one cell or of a panel are shown not to be modified when a small defect is introduced by a mechanical constraint. In the second part, the theoretical results on the 1/f noise in a p-n junction under forward bias are recalled. It is shown that the noise of the cell with a defect is about 10 to 15 times higher than that of a good cell. If one good cell is replaced by a cell with a defect in the 5 x 5 panel, this leads to an increase of about 30 percent in the noise level of the panel.

  11. A method of predicting the reliability of CDM coil insulation

    International Nuclear Information System (INIS)

    Kytasty, A.; Ogle, C.; Arrendale, H.

    1992-01-01

    This paper presents a method of predicting the reliability of the Collider Dipole Magnet (CDM) coil insulation design. The method proposes a probabilistic treatment of electrical test data, stress analysis, material properties variability and loading uncertainties to give the reliability estimate. The approach taken to predict reliability of design related failure modes of the CDM is to form analytical models of the various possible failure modes and their related mechanisms or causes, and then statistically assess the contributions of the various contributing variables. The probability of the failure mode occurring is interpreted as the number of times one would expect certain extreme situations to combine and randomly occur. One of the more complex failure modes of the CDM will be used to illustrate this methodology
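
    The idea of estimating how often extreme situations combine can be illustrated with a simple stress-strength interference sketch: for independent normal dielectric strength and applied stress, the failure probability is the probability that the safety margin is negative. The normal models and the numbers below are assumptions, not CDM insulation test data.

```python
import numpy as np
from scipy import stats

# Illustrative dielectric strength and applied electrical stress (kV/mm); the
# normal models and numbers are assumptions, not the CDM insulation data.
mu_strength, sd_strength = 40.0, 4.0
mu_stress, sd_stress = 22.0, 3.0

# For independent normal strength R and stress S, P(failure) = P(R - S < 0).
margin_mean = mu_strength - mu_stress
margin_sd = np.hypot(sd_strength, sd_stress)
pf = stats.norm.cdf(0.0, loc=margin_mean, scale=margin_sd)
print(f"failure probability per applied stress ~= {pf:.2e}")
```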

  12. Automation of reliability evaluation procedures through CARE - The computer-aided reliability estimation program.

    Science.gov (United States)

    Mathur, F. P.

    1972-01-01

    Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation) which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. These equations, under program control, are then interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The mathematical model is then supplied with ground instances of its variables and is then evaluated to generate values for the reliability-theoretic functions applied to the model.
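
    The kind of basic redundancy equations such a repository holds can be illustrated as below with the standard series, parallel, and triple-modular-redundancy (majority-voting) expressions; the module reliability value is arbitrary and the functions are not CARE's actual model library.

```python
import math

def series(reliabilities):
    """All units must work."""
    return math.prod(reliabilities)

def parallel(reliabilities):
    """The assembly works if at least one unit works."""
    return 1.0 - math.prod(1.0 - r for r in reliabilities)

def tmr(r, r_voter=1.0):
    """Triple modular redundancy with majority voting: at least 2 of 3 modules work."""
    return r_voter * (3 * r ** 2 - 2 * r ** 3)

r = 0.95
print(f"single module   : {r:.4f}")
print(f"TMR             : {tmr(r):.4f}")
print(f"2-unit parallel : {parallel([r, r]):.4f}")
```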

  13. Review of Quantitative Software Reliability Methods

    Energy Technology Data Exchange (ETDEWEB)

    Chu, T.L.; Yue, M.; Martinez-Guridi, M.; Lehner, J.

    2010-09-17

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process for digital systems rests on deterministic engineering criteria. In its 1995 probabilistic risk assessment (PRA) policy statement, the Commission encouraged the use of PRA technology in all regulatory matters to the extent supported by the state-of-the-art in PRA methods and data. Although many activities have been completed in the area of risk-informed regulation, the risk-informed analysis process for digital systems has not yet been satisfactorily developed. Since digital instrumentation and control (I&C) systems are expected to play an increasingly important role in nuclear power plant (NPP) safety, the NRC established a digital system research plan that defines a coherent set of research programs to support its regulatory needs. One of the research programs included in the NRC's digital system research plan addresses risk assessment methods and data for digital systems. Digital I&C systems have some unique characteristics, such as using software, and may have different failure causes and/or modes than analog I&C systems; hence, their incorporation into NPP PRAs entails special challenges. The objective of the NRC's digital system risk research is to identify and develop methods, analytical tools, and regulatory guidance for (1) including models of digital systems into NPP PRAs, and (2) using information on the risks of digital systems to support the NRC's risk-informed licensing and oversight activities. For several years, Brookhaven National Laboratory (BNL) has worked on NRC projects to investigate methods and tools for the probabilistic modeling of digital systems, as documented mainly in NUREG/CR-6962 and NUREG/CR-6997. However, the scope of this research principally focused on hardware failures, with limited reviews of software failure experience and software reliability methods. NRC also sponsored research at the Ohio State University investigating the modeling of

  14. Assessment of the reliability of ultrasonic inspection methods

    International Nuclear Information System (INIS)

    Haines, N.F.; Langston, D.B.; Green, A.J.; Wilson, R.

    1982-01-01

    The reliability of NDT techniques has remained an open question for many years. A reliable technique may be defined as one that, when rigorously applied by a number of inspection teams, consistently finds and then correctly sizes all defects of concern. In this paper we report an assessment of the reliability of defect detection by manual ultrasonic methods applied to the inspection of thick section pressure vessel weldments. Initially we consider the available data relating to the inherent physical capabilities of ultrasonic techniques to detect cracks in weldments and then, independently, we assess the likely variability in team-to-team performance when several teams are asked to follow the same specified test procedure. The two aspects of 'capability' and 'variability' are brought together to provide quantitative estimates of the overall reliability of ultrasonic inspection of thick section pressure vessel weldments based on currently existing data. The final section of the paper considers current research programmes on reliability and presents a view on how these will help to further improve NDT reliability. (author)

  15. Reliability estimation system: its application to the nuclear geophysical sampling of ore deposits

    International Nuclear Information System (INIS)

    Khaykovich, I.M.; Savosin, S.I.

    1992-01-01

    The reliability estimation system accepted in the Soviet Union for sampling data in nuclear geophysics is based on unique requirements in metrology and methodology. It involves estimating characteristic errors in calibration, as well as errors in measurement and interpretation. This paper describes the methods of estimating the levels of systematic and random errors at each stage of the problem. The data of nuclear geophysics sampling are considered to be reliable if there are no statistically significant systematic differences between ore intervals determined by this method and by geological control, or by other methods of sampling whose reliability has been verified, and if the difference between the random errors is statistically insignificant. The system allows one to obtain information on the parameters of ore intervals with a guaranteed random error and without systematic errors. (Author)

  16. A note on reliability estimation of functionally diverse systems

    International Nuclear Information System (INIS)

    Littlewood, B.; Popov, P.; Strigini, L.

    1999-01-01

    It has been argued that functional diversity might be a plausible means of claiming independence of failures between two versions of a system. We present a model of functional diversity, in the spirit of earlier models of diversity such as those of Eckhardt and Lee, and Hughes. In terms of the model, we show that the claims for independence between functionally diverse systems seem rather unrealistic. Instead, it seems likely that functionally diverse systems will exhibit positively correlated failures, and thus will be less reliable than an assumption of independence would suggest. The result does not, of course, suggest that functional diversity is not worthwhile; instead, it places upon the evaluator of such a system the onus to estimate the degree of dependence so as to evaluate the reliability of the system

  17. Probabilistic confidence for decisions based on uncertain reliability estimates

    Science.gov (United States)

    Reid, Stuart G.

    2013-05-01

    Reliability assessments are commonly carried out to provide a rational basis for risk-informed decisions concerning the design or maintenance of engineering systems and structures. However, calculated reliabilities and associated probabilities of failure often have significant uncertainties associated with the possible estimation errors relative to the 'true' failure probabilities. For uncertain probabilities of failure, a measure of 'probabilistic confidence' has been proposed to reflect the concern that uncertainty about the true probability of failure could result in a system or structure that is unsafe and could subsequently fail. The paper describes how the concept of probabilistic confidence can be applied to evaluate and appropriately limit the probabilities of failure attributable to particular uncertainties such as design errors that may critically affect the dependability of risk-acceptance decisions. This approach is illustrated with regard to the dependability of structural design processes based on prototype testing with uncertainties attributable to sampling variability.

  18. "A Comparison of Consensus, Consistency, and Measurement Approaches to Estimating Interrater Reliability"

    OpenAIRE

    Steven E. Stemler

    2004-01-01

    This article argues that the general practice of describing interrater reliability as a single, unified concept is at best imprecise, and at worst potentially misleading. Rather than representing a single concept, different statistical methods for computing interrater reliability can be more accurately classified into one of three categories based upon the underlying goals of analysis. The three general categories introduced and described in this paper are: 1) consensus estimates, 2) cons...

  19. Bayesian and Classical Estimation of Stress-Strength Reliability for Inverse Weibull Lifetime Models

    Directory of Open Access Journals (Sweden)

    Qixuan Bi

    2017-06-01

    Full Text Available In this paper, we consider the problem of estimating stress-strength reliability for inverse Weibull lifetime models having the same shape parameters but different scale parameters. We obtain the maximum likelihood estimator and its asymptotic distribution. Since the classical estimator does not have an explicit form, we propose an approximate maximum likelihood estimator. The asymptotic confidence interval and two bootstrap intervals are obtained. Using the Gibbs sampling technique, the Bayesian estimator and the corresponding credible interval are obtained. The Metropolis-Hastings algorithm is used to generate random variates. Monte Carlo simulations are conducted to compare the proposed methods. Analysis of a real dataset is performed.
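
    A minimal sketch of the quantity being estimated: with stress and strength both following inverse Weibull laws that share a shape parameter, the reliability R = P(strength > stress) has a simple closed form, which a crude Monte Carlo run can reproduce. The parameterization F(x) = exp(-(alpha/x)^beta) and all parameter values below are assumptions for illustration, not the paper's estimators.

      import numpy as np

      def rvs_inverse_weibull(alpha, beta, size, rng):
          """Draw from an inverse Weibull with CDF F(x) = exp(-(alpha / x)**beta)."""
          u = rng.uniform(size=size)
          return alpha * (-np.log(u)) ** (-1.0 / beta)

      rng = np.random.default_rng(0)
      beta, a_strength, a_stress = 2.5, 3.0, 2.0   # common shape, different scales (assumed)
      n = 200_000

      x = rvs_inverse_weibull(a_strength, beta, n, rng)   # strength
      y = rvs_inverse_weibull(a_stress, beta, n, rng)     # stress
      r_mc = np.mean(x > y)

      # closed form available when the shape parameters coincide
      r_exact = a_strength**beta / (a_strength**beta + a_stress**beta)
      print(f"Monte Carlo R = {r_mc:.4f},  closed form R = {r_exact:.4f}")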

  20. ESTIMATING RELIABILITY OF DISTURBANCES IN SATELLITE TIME SERIES DATA BASED ON STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Z.-G. Zhou

    2016-06-01

    Full Text Available Normally, the status of land cover is inherently dynamic and changes continuously on a temporal scale. However, disturbances or abnormal changes of land cover — caused by events such as forest fire, flood, deforestation, and plant diseases — occur worldwide at unknown times and locations. Timely detection and characterization of these disturbances is of importance for land cover monitoring. Recently, many time-series-analysis methods have been developed for near real-time or online disturbance detection using satellite image time series. However, most of the present methods label the detection results only with “Change/No change”, while few methods focus on estimating the reliability (or confidence level) of the detected disturbances in image time series. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series, through analysis of the full temporal information contained in the time series data. The method consists of three main steps. (1) Segmenting and modelling of historical time series data based on Breaks for Additive Seasonal and Trend (BFAST). (2) Forecasting and detecting disturbances in new time series data. (3) Estimating the reliability of each detected disturbance using statistical analysis based on Confidence Intervals (CI) and Confidence Levels (CL). The method was validated by estimating the reliability of disturbance regions caused by a recent severe flood that occurred around the border of Russia and China. Results demonstrate that the method can estimate the reliability of disturbances detected in satellite image time series with an estimation error of less than 5% and an overall accuracy of up to 90%.
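
    The confidence-level idea of step (3) can be sketched with a much simpler stand-in for BFAST (which is an R package): fit a linear trend plus one seasonal harmonic to the historical series by least squares, forecast the new observation, and convert its standardized deviation into a two-sided confidence level under a normal-error assumption. The model, the 23-observations-per-year cadence and the data below are illustrative assumptions, not the paper's implementation.

      import numpy as np
      from math import erf, sqrt

      def fit_trend_season(t, y, period=23.0):
          """Least-squares fit of intercept + linear trend + one seasonal harmonic."""
          X = np.column_stack([np.ones_like(t), t,
                               np.sin(2 * np.pi * t / period), np.cos(2 * np.pi * t / period)])
          coef, *_ = np.linalg.lstsq(X, y, rcond=None)
          sigma = (y - X @ coef).std(ddof=X.shape[1])
          return coef, sigma

      def disturbance_confidence(t_new, y_new, coef, sigma, period=23.0):
          """Two-sided confidence level that y_new departs from the forecast."""
          x = np.array([1.0, t_new,
                        np.sin(2 * np.pi * t_new / period), np.cos(2 * np.pi * t_new / period)])
          z = (y_new - x @ coef) / sigma
          return erf(abs(z) / sqrt(2.0))

      rng = np.random.default_rng(1)
      t_hist = np.arange(92, dtype=float)               # ~4 years of 23 observations per year
      y_hist = (0.6 + 0.001 * t_hist + 0.15 * np.sin(2 * np.pi * t_hist / 23.0)
                + rng.normal(0.0, 0.03, t_hist.size))   # NDVI-like synthetic history
      coef, sigma = fit_trend_season(t_hist, y_hist)
      cl = disturbance_confidence(92.0, 0.25, coef, sigma)   # sharp drop, e.g. flooding
      print(f"confidence level of the detected disturbance: {cl:.3f}")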

  1. An Evaluation Method of Equipment Reliability Configuration Management

    Science.gov (United States)

    Wang, Wei; Feng, Weijia; Zhang, Wei; Li, Yuan

    2018-01-01

    At present, many equipment development companies are aware of the great significance of reliability in equipment development. However, due to the lack of an effective management evaluation method, it is very difficult for an equipment development company to manage its own reliability work. The evaluation method for equipment reliability configuration management is intended to determine the reliability management capabilities of an equipment development company. Reliability is achieved not only through design but also through management. This paper evaluates reliability management capabilities using the reliability configuration capability maturity model (RCM-CMM) evaluation method.

  2. Validation of Land Cover Products Using Reliability Evaluation Methods

    Directory of Open Access Journals (Sweden)

    Wenzhong Shi

    2015-06-01

    Full Text Available Validation of land cover products is a fundamental task prior to data applications. Current validation schemes and methods are, however, suited only for assessing classification accuracy and disregard the reliability of land cover products. The reliability evaluation of land cover products should be undertaken to provide reliable land cover information. In addition, the lack of high-quality reference data often constrains validation and affects the reliability results of land cover products. This study proposes a validation schema to evaluate the reliability of land cover products, including two methods, namely, result reliability evaluation and process reliability evaluation. Result reliability evaluation computes the reliability of land cover products using seven reliability indicators. Process reliability evaluation analyzes the reliability propagation in the data production process to obtain the reliability of land cover products. Fuzzy fault tree analysis is introduced and improved in the reliability analysis of a data production process. Research results show that the proposed reliability evaluation scheme is reasonable and can be applied to validate land cover products. Through the analysis of the seven indicators of result reliability evaluation, more information on land cover can be obtained for strategic decision-making and planning, compared with traditional accuracy assessment methods. Process reliability evaluation without the need for reference data can facilitate the validation and reflect the change trends of reliabilities to some extent.

  3. Integrated Reliability Estimation of a Nuclear Maintenance Robot including a Software

    Energy Technology Data Exchange (ETDEWEB)

    Eom, Heung Seop; Kim, Jae Hee; Jeong, Kyung Min [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2011-10-15

    Conventional reliability estimation techniques such as Fault Tree Analysis (FTA), Reliability Block Diagram (RBD), Markov Model, and Event Tree Analysis (ETA) have been used widely and approved in some industries. However, there are limitations when we use them for complicated robot systems that include software, such as intelligent reactor inspection robots. Therefore, an expert's judgment plays an important role in estimating the reliability of a complicated system in practice, because experts can deal with diverse evidence related to the reliability and then perform an inference based on it. The method proposed in this paper combines qualitative and quantitative evidence and performs an inference like experts do. Furthermore, thanks to Bayesian Nets (BNs), it does the work in a formal and quantitative way, unlike human experts.

  4. Availability and Reliability of FSO Links Estimated from Visibility

    Directory of Open Access Journals (Sweden)

    M. Tatarko

    2012-06-01

    Full Text Available This paper is focused on estimating the availability and reliability of FSO systems. FSO stands for Free Space Optics, a system that allows optical transmission between two fixed points; it can be regarded as a last-mile communication system. It is an optical communication system in which the propagation medium is air. This last-mile solution does not require expensive optical fiber, and establishing a connection is very simple. However, there are some drawbacks that adversely affect the quality of service and the availability of the link. A number of atmospheric phenomena such as scattering, absorption and turbulence cause large variations in received optical power and laser beam attenuation. The influence of absorption and turbulence can be significantly reduced by an appropriate design of the FSO link, but visibility has the main influence on the quality of the optical transmission channel. Thus, in a typical continental area where rain, snow or fog occurs, it is important to know their values. This article describes a device for measuring weather conditions and provides information about the estimation of availability and reliability of FSO links in Slovakia.

  5. How Many Sleep Diary Entries Are Needed to Reliably Estimate Adolescent Sleep?

    Science.gov (United States)

    Arora, Teresa; Gradisar, Michael; Taheri, Shahrad; Carskadon, Mary A.

    2017-01-01

    Abstract Study Objectives: To investigate (1) how many nights of sleep diary entries are required for reliable estimates of five sleep-related outcomes (bedtime, wake time, sleep onset latency [SOL], sleep duration, and wake after sleep onset [WASO]) and (2) the test–retest reliability of sleep diary estimates of school night sleep across 12 weeks. Methods: Data were drawn from four adolescent samples (Australia [n = 385], Qatar [n = 245], United Kingdom [n = 770], and United States [n = 366]), who provided 1766 eligible sleep diary weeks for reliability analyses. We performed reliability analyses for each cohort using complete data (7 days), one to five school nights, and one to two weekend nights. We also performed test–retest reliability analyses on 12-week sleep diary data available from a subgroup of 55 US adolescents. Results: Intraclass correlation coefficients for bedtime, SOL, and sleep duration indicated good-to-excellent reliability from five weekday nights of sleep diary entries across all adolescent cohorts. Four school nights was sufficient for wake times in the Australian and UK samples, but not the US or Qatari samples. Only Australian adolescents showed good reliability for two weekend nights of bedtime reports; estimates of SOL were adequate for UK adolescents based on two weekend nights. WASO was not reliably estimated using 1 week of sleep diaries. We observed excellent test–retest reliability across 12 weeks of sleep diary data in a subsample of US adolescents. Conclusion: We recommend at least five weekday nights of sleep diary entries to be made when studying adolescent bedtimes, SOL, and sleep duration. Adolescent sleep patterns were stable across 12 consecutive school weeks. PMID:28199718
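
    The reliability statistic behind such analyses is the intraclass correlation coefficient. A minimal sketch, using a one-way random-effects ICC on simulated diary data (the study's own estimators and models may differ), shows both the single-night ICC and the ICC for the mean of k nights; the simulated between- and within-person spreads are assumptions.

      import numpy as np

      def icc_oneway(x):
          """One-way random-effects ICC for an (n_subjects, k_measures) array.

          Returns ICC for a single measurement and for the mean of the k measurements.
          """
          n, k = x.shape
          grand = x.mean()
          subject_means = x.mean(axis=1)
          msb = k * np.sum((subject_means - grand) ** 2) / (n - 1)          # between subjects
          msw = np.sum((x - subject_means[:, None]) ** 2) / (n * (k - 1))   # within subjects
          icc_single = (msb - msw) / (msb + (k - 1) * msw)
          icc_average = (msb - msw) / msb
          return icc_single, icc_average

      rng = np.random.default_rng(2)
      n_adolescents, n_nights = 300, 5
      true_duration = rng.normal(7.5, 0.8, n_adolescents)                   # hours, between-person
      diary = true_duration[:, None] + rng.normal(0.0, 0.9, (n_adolescents, n_nights))
      single, average = icc_oneway(diary)
      print(f"ICC single night = {single:.2f}, ICC mean of {n_nights} nights = {average:.2f}")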

  6. A Data-Driven Reliability Estimation Approach for Phased-Mission Systems

    Directory of Open Access Journals (Sweden)

    Hua-Feng He

    2014-01-01

    Full Text Available We attempt to address the issues associated with reliability estimation for phased-mission systems (PMS) and present a novel data-driven approach to achieve reliability estimation for PMS using the condition monitoring information and degradation data of such systems under a dynamic operating scenario. In this sense, this paper differs from the existing methods, which consider only the static scenario without using real-time information and aim to estimate the reliability for a population rather than for an individual. In the presented approach, to establish a linkage between the historical data and the real-time information of an individual PMS, we adopt a stochastic filtering model to model the phase duration and obtain an updated estimate of the mission time by Bayesian law at each phase. Meanwhile, the lifetime of the PMS is estimated from degradation data, which are modeled by an adaptive Brownian motion. As such, the mission reliability can be obtained in real time through the estimated distribution of the mission time in conjunction with the estimated lifetime distribution. We demonstrate the usefulness of the developed approach via a numerical example.

  7. Reliability estimation for multiunit nuclear and fossil-fired industrial energy systems

    International Nuclear Information System (INIS)

    Sullivan, W.G.; Wilson, J.V.; Klepper, O.H.

    1977-01-01

    As petroleum-based fuels grow increasingly scarce and costly, nuclear energy may become an important alternative source of industrial energy. Initial applications would most likely include a mix of fossil-fired and nuclear sources of process energy. A means for determining the overall reliability of these mixed systems is a fundamental aspect of demonstrating their feasibility to potential industrial users. Reliability data from nuclear and fossil-fired plants are presented, and several methods of applying these data for calculating the reliability of reasonably complex industrial energy supply systems are given. Reliability estimates made under a number of simplifying assumptions indicate that multiple nuclear units or a combination of nuclear and fossil-fired plants could provide adequate reliability to meet industrial requirements for continuity of service

  8. Advancing methods for reliably assessing motivational interviewing fidelity using the motivational interviewing skills code.

    Science.gov (United States)

    Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W; Imel, Zac E; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C

    2015-02-01

    The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance-level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Perceptual and Acoustic Reliability Estimates for the Speech Disorders Classification System (SDCS)

    Science.gov (United States)

    Shriberg, Lawrence D.; Fourakis, Marios; Hall, Sheryl D.; Karlsson, Heather B.; Lohmeier, Heather L.; McSweeny, Jane L.; Potter, Nancy L.; Scheer-Cohen, Alison R.; Strand, Edythe A.; Tilkens, Christie M.; Wilson, David L.

    2010-01-01

    A companion paper describes three extensions to a classification system for paediatric speech sound disorders termed the Speech Disorders Classification System (SDCS). The SDCS uses perceptual and acoustic data reduction methods to obtain information on a speaker's speech, prosody, and voice. The present paper provides reliability estimates for…

  10. Safeprops: A Software for Fast and Reliable Estimation of Safety and Environmental Properties for Organic Compounds

    DEFF Research Database (Denmark)

    Jones, Mark Nicholas; Frutiger, Jerome; Abildskov, Jens

    We present a new software tool called SAFEPROPS which is able to estimate major safety-related and environmental properties for organic compounds. SAFEPROPS provides accurate, reliable and fast predictions using the Marrero-Gani group contribution (MG-GC) method. It is implemented using Python...... as the main programming language, while the necessary parameters together with their correlation matrix are obtained from a SQLite database which has been populated using off-line parameter and error estimation routines (Eq. 3-8)....

  11. Method matters: Understanding diagnostic reliability in DSM-IV and DSM-5.

    Science.gov (United States)

    Chmielewski, Michael; Clark, Lee Anna; Bagby, R Michael; Watson, David

    2015-08-01

    Diagnostic reliability is essential for the science and practice of psychology, in part because reliability is necessary for validity. Recently, the DSM-5 field trials documented lower diagnostic reliability than past field trials and the general research literature, resulting in substantial criticism of the DSM-5 diagnostic criteria. Rather than indicating specific problems with DSM-5, however, the field trials may have revealed long-standing diagnostic issues that have been hidden due to a reliance on audio/video recordings for estimating reliability. We estimated the reliability of DSM-IV diagnoses using both the standard audio-recording method and the test-retest method used in the DSM-5 field trials, in which different clinicians conduct separate interviews. Psychiatric patients (N = 339) were diagnosed using the SCID-I/P; 218 were diagnosed a second time by an independent interviewer. Diagnostic reliability using the audio-recording method (N = 49) was "good" to "excellent" (M κ = .80) and comparable to the DSM-IV field trials estimates. Reliability using the test-retest method (N = 218) was "poor" to "fair" (M κ = .47) and similar to DSM-5 field-trials' estimates. Despite low test-retest diagnostic reliability, self-reported symptoms were highly stable. Moreover, there was no association between change in self-report and change in diagnostic status. These results demonstrate the influence of method on estimates of diagnostic reliability. (c) 2015 APA, all rights reserved).
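
    The kappa values quoted above are chance-corrected agreement statistics. A minimal sketch of plain Cohen's kappa for two raters over the same cases is given below; the field trials' designs and estimators are more elaborate, and the diagnoses in the example are hypothetical.

      from collections import Counter

      def cohens_kappa(rater_a, rater_b):
          """Chance-corrected agreement between two raters who judged the same cases."""
          n = len(rater_a)
          observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
          counts_a, counts_b = Counter(rater_a), Counter(rater_b)
          expected = sum((counts_a[c] / n) * (counts_b[c] / n)
                         for c in set(counts_a) | set(counts_b))
          return (observed - expected) / (1.0 - expected)

      # hypothetical diagnoses (1 = disorder present, 0 = absent) from two independent interviews
      first_interview  = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 0]
      second_interview = [1, 0, 0, 0, 1, 0, 1, 1, 1, 0, 1, 0]
      print(f"kappa = {cohens_kappa(first_interview, second_interview):.2f}")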

  12. Reliability of fish size estimates obtained from multibeam imaging sonar

    Science.gov (United States)

    Hightower, Joseph E.; Magowan, Kevin J.; Brown, Lori M.; Fox, Dewayne A.

    2013-01-01

    Multibeam imaging sonars have considerable potential for use in fisheries surveys because the video-like images are easy to interpret, and they contain information about fish size, shape, and swimming behavior, as well as characteristics of occupied habitats. We examined images obtained using a dual-frequency identification sonar (DIDSON) multibeam sonar for Atlantic sturgeon Acipenser oxyrinchus oxyrinchus, striped bass Morone saxatilis, white perch M. americana, and channel catfish Ictalurus punctatus of known size (20–141 cm) to determine the reliability of length estimates. For ranges up to 11 m, percent measurement error (sonar estimate – total length)/total length × 100 varied by species but was not related to the fish's range or aspect angle (orientation relative to the sonar beam). Least-square mean percent error was significantly different from 0.0 for Atlantic sturgeon (x̄  =  −8.34, SE  =  2.39) and white perch (x̄  = 14.48, SE  =  3.99) but not striped bass (x̄  =  3.71, SE  =  2.58) or channel catfish (x̄  = 3.97, SE  =  5.16). Underestimating lengths of Atlantic sturgeon may be due to difficulty in detecting the snout or the longer dorsal lobe of the heterocercal tail. White perch was the smallest species tested, and it had the largest percent measurement errors (both positive and negative) and the lowest percentage of images classified as good or acceptable. Automated length estimates for the four species using Echoview software varied with position in the view-field. Estimates tended to be low at more extreme azimuthal angles (fish's angle off-axis within the view-field), but mean and maximum estimates were highly correlated with total length. Software estimates also were biased by fish images partially outside the view-field and when acoustic crosstalk occurred (when a fish perpendicular to the sonar and at relatively close range is detected in the side lobes of adjacent beams). These sources of

  13. Boundary methods for mode estimation

    Science.gov (United States)

    Pierson, William E., Jr.; Ulug, Batuhan; Ahalt, Stanley C.

    1999-08-01

    This paper investigates the use of Boundary Methods (BMs), a collection of tools used for distribution analysis, as a method for estimating the number of modes associated with a given data set. Model order information of this type is required by several pattern recognition applications. The BM technique provides a novel approach to this parameter estimation problem and is comparable in terms of both accuracy and computation to other popular mode estimation techniques currently found in the literature and automatic target recognition applications. This paper explains the methodology used in the BM approach to mode estimation. It also quickly reviews other common mode estimation techniques and describes the empirical investigation used to explore the relationship of the BM technique to other mode estimation techniques. Specifically, the accuracy and computational efficiency of the BM technique are compared quantitatively to a mixture of Gaussians (MOG) approach and a k-means approach to model order estimation. The stopping criterion for the MOG and k-means techniques is the Akaike Information Criterion (AIC).
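
    The mixture-of-Gaussians baseline with an AIC stopping rule, which the paper compares against, can be sketched with scikit-learn (assumed available): fit mixtures of increasing order and keep the order that minimizes the AIC. The synthetic three-mode data are an assumption for illustration; the BM technique itself is described in the paper.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(3)
      data = np.concatenate([rng.normal(-4.0, 0.7, 300),      # synthetic data with three modes
                             rng.normal(0.0, 1.0, 400),
                             rng.normal(5.0, 0.8, 300)]).reshape(-1, 1)

      aics = {}
      for k in range(1, 7):
          gm = GaussianMixture(n_components=k, n_init=3, random_state=0).fit(data)
          aics[k] = gm.aic(data)

      best_k = min(aics, key=aics.get)
      print("AIC per model order:", {k: round(v, 1) for k, v in aics.items()})
      print("estimated number of modes:", best_k)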

  14. Heuristic introduction to estimation methods

    International Nuclear Information System (INIS)

    Feeley, J.J.; Griffith, J.M.

    1982-08-01

    The methods and concepts of optimal estimation and control have been very successfully applied in the aerospace industry during the past 20 years. Although similarities exist between the problems (control, modeling, measurements) in the aerospace and nuclear power industries, the methods and concepts have found only scant acceptance in the nuclear industry. Differences in technical language seem to be a major reason for the slow transfer of estimation and control methods to the nuclear industry. Therefore, this report was written to present certain important and useful concepts with a minimum of specialized language. By employing a simple example throughout the report, the importance of several information and uncertainty sources is stressed and optimal ways of using or allowing for these sources are presented. This report discusses optimal estimation problems. A future report will discuss optimal control problems

  15. Feasibility and reliability of digital imaging for estimating food selection and consumption from students' packed lunches.

    Science.gov (United States)

    Taylor, Jennifer C; Sutter, Carolyn; Ontai, Lenna L; Nishina, Adrienne; Zidenberg-Cherr, Sheri

    2018-01-01

    Although increasing attention is placed on the quality of foods in children's packed lunches, few studies have examined the capacity of observational methods to reliably determine both what is selected and consumed from these lunches. The objective of this project was to assess the feasibility and inter-rater reliability of digital imaging for determining selection and consumption from students' packed lunches, by adapting approaches previously applied to school lunches. Study 1 assessed feasibility and reliability of data collection among a sample of packed lunches (n = 155), while Study 2 further examined reliability in a larger sample of packed (n = 386) as well as school (n = 583) lunches. Based on the results from Study 1, it was feasible to collect and code most items in packed lunch images; missing data were most commonly attributed to packaging that limited visibility of contents. Across both studies, there was satisfactory reliability for determining food types selected, quantities selected, and quantities consumed in the eight food categories examined (weighted kappa coefficients 0.68-0.97 for packed lunches, 0.74-0.97 for school lunches), with lowest reliability for estimating condiments and meats/meat alternatives in packed lunches. In extending methods predominately applied to school lunches, these findings demonstrate the capacity of digital imaging for the objective estimation of selection and consumption from both school and packed lunches. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine

    DEFF Research Database (Denmark)

    Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon

    2013-01-01

    as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment and in the conduct of clinical studies. Several reliability studies are conducted in western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine...

  17. Assessment of the Maximal Split-Half Coefficient to Estimate Reliability

    Science.gov (United States)

    Thompson, Barry L.; Green, Samuel B.; Yang, Yanyun

    2010-01-01

    The maximal split-half coefficient is computed by calculating all possible split-half reliability estimates for a scale and then choosing the maximal value as the reliability estimate. Osburn compared the maximal split-half coefficient with 10 other internal consistency estimates of reliability and concluded that it yielded the most consistently…

  18. Estimated Value of Service Reliability for Electric Utility Customers in the United States

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, M.J.; Mercurio, Matthew; Schellenberg, Josh

    2009-06-01

    Information on the value of reliable electricity service can be used to assess the economic efficiency of investments in generation, transmission and distribution systems, to strategically target investments to customer segments that receive the most benefit from system improvements, and to numerically quantify the risk associated with different operating, planning and investment strategies. This paper summarizes research designed to provide estimates of the value of service reliability for electricity customers in the US. These estimates were obtained by analyzing the results from 28 customer value of service reliability studies conducted by 10 major US electric utilities over the 16 year period from 1989 to 2005. Because these studies used nearly identical interruption cost estimation or willingness-to-pay/accept methods it was possible to integrate their results into a single meta-database describing the value of electric service reliability observed in all of them. Once the datasets from the various studies were combined, a two-part regression model was used to estimate customer damage functions that can be generally applied to calculate customer interruption costs per event by season, time of day, day of week, and geographical regions within the US for industrial, commercial, and residential customers. Estimated interruption costs for different types of customers and of different duration are provided. Finally, additional research and development designed to expand the usefulness of this powerful database and analysis are suggested.

  19. Reliability of stellar inclination estimated from asteroseismology: analytical criteria, mock simulations and Kepler data analysis

    Science.gov (United States)

    Kamiaka, Shoya; Benomar, Othman; Suto, Yasushi

    2018-05-01

    Advances in asteroseismology of solar-like stars now provide a unique method to estimate the stellar inclination i⋆. This makes it possible to evaluate the spin-orbit angle of transiting planetary systems, in a complementary fashion to the Rossiter-McLaughlin effect, a well-established method to estimate the projected spin-orbit angle λ. Although the asteroseismic method has been broadly applied to the Kepler data, its reliability has yet to be assessed intensively. In this work, we evaluate the accuracy of i⋆ from asteroseismology of solar-like stars using 3000 simulated power spectra. We find that the low signal-to-noise ratio of the power spectra induces a systematic under-estimate (over-estimate) bias for stars with high (low) inclinations. We derive analytical criteria for a reliable asteroseismic estimate, which indicate that reliable measurements are possible in the range of 20° ≲ i⋆ ≲ 80° only for stars with a high signal-to-noise ratio. We also analyse and measure the stellar inclination of 94 Kepler main-sequence solar-like stars, among which 33 are planetary hosts. According to our reliability criteria, a third of them (9 with planets, 22 without) have accurate stellar inclinations. Comparison of our asteroseismic estimate of v sin i⋆ against spectroscopic measurements indicates that the latter suffers from a large uncertainty, possibly due to the modeling of macro-turbulence, especially for stars with projected rotation speed v sin i⋆ ≲ 5 km/s. This reinforces earlier claims, and the stellar inclination estimated from the combination of measurements from spectroscopy and photometric variation for slowly rotating stars needs to be interpreted with caution.

  20. A Comparison of the Approaches of Generalizability Theory and Item Response Theory in Estimating the Reliability of Test Scores for Testlet-Composed Tests

    Science.gov (United States)

    Lee, Guemin; Park, In-Yong

    2012-01-01

    Previous assessments of the reliability of test scores for testlet-composed tests have indicated that item-based estimation methods overestimate reliability. This study was designed to address issues related to the extent to which item-based estimation methods overestimate the reliability of test scores composed of testlets and to compare several…

  1. Order statistics & inference estimation methods

    CERN Document Server

    Balakrishnan, N

    1991-01-01

    The literature on order statistics and inference is quite extensive and covers a large number of fields, but most of it is dispersed throughout numerous publications. This volume is the consolidation of the most important results and places an emphasis on estimation. Both theoretical and computational procedures are presented to meet the needs of researchers, professionals, and students. The methods of estimation discussed are well illustrated with numerous practical examples from both the physical and life sciences, including sociology, psychology, and electrical and chemical engineering. A co

  2. Methods for estimating the semivariogram

    DEFF Research Database (Denmark)

    Lophaven, Søren Nymand; Carstensen, Niels Jacob; Rootzen, Helle

    2002-01-01

    . In the existing literature various methods for modelling the semivariogram have been proposed, while only a few studies have been made on comparing different approaches. In this paper we compare eight approaches for modelling the semivariogram, i.e. six approaches based on least squares estimation...... maximum likelihood performed better than the least squares approaches. We also applied maximum likelihood and least squares estimation to a real dataset, containing measurements of salinity at 71 sampling stations in the Kattegat basin. This showed that the calculation of spatial predictions...
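
    A minimal sketch of the starting point for such comparisons: the method-of-moments (empirical) semivariogram binned by pair distance, followed by an ordinary least-squares fit of an exponential model. This is only one of the simpler estimation variants; the synthetic field and parameter choices below are illustrative assumptions.

      import numpy as np
      from scipy.optimize import curve_fit

      def empirical_semivariogram(coords, values, n_bins=12):
          """Method-of-moments semivariogram estimate binned by pair distance."""
          dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
          halfsq = 0.5 * (values[:, None] - values[None, :]) ** 2
          iu = np.triu_indices(len(values), k=1)
          d, g = dists[iu], halfsq[iu]
          edges = np.linspace(0.0, d.max(), n_bins + 1)
          centers, means = [], []
          for lo, hi in zip(edges[:-1], edges[1:]):
              mask = (d >= lo) & (d < hi)
              if mask.any():
                  centers.append(d[mask].mean())
                  means.append(g[mask].mean())
          return np.array(centers), np.array(means)

      def exponential_model(h, nugget, partial_sill, corr_range):
          return nugget + partial_sill * (1.0 - np.exp(-h / corr_range))

      rng = np.random.default_rng(4)
      coords = rng.uniform(0.0, 100.0, (150, 2))                        # synthetic sampling stations
      values = np.sin(coords[:, 0] / 20.0) + rng.normal(0.0, 0.2, 150)  # spatially structured field
      h, g = empirical_semivariogram(coords, values)
      p, _ = curve_fit(exponential_model, h, g, p0=[0.05, g.max(), 20.0], maxfev=10000)
      print(f"nugget = {p[0]:.3f}, partial sill = {p[1]:.3f}, range = {p[2]:.1f}")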

  3. Empirical Study of Travel Time Estimation and Reliability

    OpenAIRE

    Li, Ruimin; Chai, Huajun; Tang, Jin

    2013-01-01

    This paper explores the travel time distribution of different types of urban roads, the link and path average travel time, and variance estimation methods by analyzing the large-scale travel time dataset detected from automatic number plate readers installed throughout Beijing. The results show that the best-fitting travel time distribution for different road links in 15 min time intervals differs for different traffic congestion levels. The average travel time for all links on all days can b...

  4. Numerical Model based Reliability Estimation of Selective Laser Melting Process

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2014-01-01

    Selective laser melting is developing into a standard manufacturing technology with applications in various sectors. However, the process is still far from being at par with conventional processes such as welding and casting, the primary reason of which is the unreliability of the process. While...... of the selective laser melting process. A validated 3D finite-volume alternating-direction-implicit numerical technique is used to model the selective laser melting process, and is calibrated against results from single track formation experiments. Correlation coefficients are determined for process input...... parameters such as laser power, speed, beam profile, etc. Subsequently, uncertainties in the processing parameters are utilized to predict a range for the various outputs, using a Monte Carlo method based uncertainty analysis methodology, and the reliability of the process is established....
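
    The Monte Carlo step described at the end can be sketched independently of the finite-volume model: sample the uncertain processing parameters, push them through a response function (here a crude line-energy proxy for melt depth standing in for the calibrated thermal model), and report an output range plus the probability of meeting a requirement. All distributions, the proxy model and the 40 µm layer-thickness criterion are assumptions for illustration.

      import numpy as np

      rng = np.random.default_rng(5)
      n = 100_000

      # assumed uncertainties on the processing parameters (illustrative values)
      power = rng.normal(200.0, 5.0, n)          # laser power, W
      speed = rng.normal(800.0, 30.0, n)         # scan speed, mm/s
      absorptivity = rng.uniform(0.35, 0.45, n)  # effective absorptivity

      def melt_depth(power, speed, absorptivity):
          """Placeholder response surface standing in for the calibrated process model."""
          line_energy = absorptivity * power / speed   # J/mm proxy
          return 450.0 * line_energy                   # micrometres, toy scaling factor

      depth = melt_depth(power, speed, absorptivity)
      layer_thickness = 40.0                           # micrometres, assumed requirement
      reliability = np.mean(depth >= layer_thickness)

      print(f"melt depth: {depth.mean():.1f} +/- {depth.std():.1f} um "
            f"(5th-95th percentile: {np.percentile(depth, 5):.1f}-{np.percentile(depth, 95):.1f})")
      print(f"P(melt depth >= layer thickness) ~ {reliability:.4f}")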

  5. Research on Control Method Based on Real-Time Operational Reliability Evaluation for Space Manipulator

    Directory of Open Access Journals (Sweden)

    Yifan Wang

    2014-05-01

    Full Text Available A control method based on real-time operational reliability evaluation for a space manipulator is presented for improving the success rate of the manipulator during the execution of a task. In this paper, a method for quantitative analysis of operational reliability is given for the case in which the manipulator is executing a specified task; a control model which can control the quantitative operational reliability is then built. First, the control process is described by using a state space equation. Second, process parameters are estimated in real time using the Bayesian method. Third, the expression for the system's real-time operational reliability is deduced based on the state space equation and the process parameters estimated using the Bayesian method. Finally, a control variable regulation strategy which considers the cost of control is given based on the Theory of Statistical Process Control. It is shown via simulations that this method effectively improves the operational reliability of the space manipulator control system.

  6. Extrapolation Method for System Reliability Assessment

    DEFF Research Database (Denmark)

    Qin, Jianjun; Nishijima, Kazuyoshi; Faber, Michael Havbro

    2012-01-01

    of integrals with scaled domains. The performance of this class of approximation depends on the approach applied for the scaling and the functional form utilized for the extrapolation. A scheme for this task is derived here taking basis in the theory of asymptotic solutions to multinormal probability integrals......The present paper presents a new scheme for probability integral solution for system reliability analysis, which takes basis in the approaches by Naess et al. (2009) and Bucher (2009). The idea is to evaluate the probability integral by extrapolation, based on a sequence of MC approximations...... that the proposed scheme is efficient and adds to generality for this class of approximations for probability integrals....

  7. Comparison of Model Reliabilities from Single-Step and Bivariate Blending Methods

    DEFF Research Database (Denmark)

    Taskinen, Matti; Mäntysaari, Esa; Lidauer, Martin

    2013-01-01

    Model based reliabilities in genetic evaluation are compared between three methods: animal model BLUP, single-step BLUP, and bivariate blending after genomic BLUP. The original bivariate blending is revised in this work to better account animal models. The study data is extracted from...... be calculated. Model reliabilities by the single-step and the bivariate blending methods were higher than by animal model due to genomic information. Compared to the single-step method, the bivariate blending method reliability estimates were, in general, lower. Computationally bivariate blending method was......, on the other hand, lighter than the single-step method....

  8. Threshold Estimation of Generalized Pareto Distribution Based on Akaike Information Criterion for Accurate Reliability Analysis

    International Nuclear Information System (INIS)

    Kang, Seunghoon; Lim, Woochul; Cho, Su-gil; Park, Sanghyun; Lee, Tae Hee; Lee, Minuk; Choi, Jong-su; Hong, Sup

    2015-01-01

    In order to perform estimations with high reliability, it is necessary to deal with the tail part of the cumulative distribution function (CDF) in greater detail compared to an overall CDF. The use of a generalized Pareto distribution (GPD) to model the tail part of a CDF is receiving more research attention with the goal of performing estimations with high reliability. Current studies on GPDs focus on ways to determine the appropriate number of sample points and their parameters. However, even if a proper estimation is made, it can be inaccurate as a result of an incorrect threshold value. Therefore, in this paper, a GPD based on the Akaike information criterion (AIC) is proposed to improve the accuracy of the tail model. The proposed method determines an accurate threshold value using the AIC with the overall samples before estimating the GPD over the threshold. To validate the accuracy of the method, its reliability is compared with that obtained using a general GPD model with an empirical CDF
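
    The paper's exact criterion applies the AIC to the overall sample before fitting the tail; the sketch below shows a simpler, related scan (an assumption on my part, not the authors' algorithm): for each candidate threshold, fit a GPD to the exceedances with SciPy, compute the AIC of that tail fit, normalise it per exceedance so that sample sizes stay comparable, and keep the threshold with the smallest value. The lognormal toy sample is illustrative.

      import numpy as np
      from scipy.stats import genpareto

      def gpd_aic(exceedances):
          """Fit a GPD (location fixed at 0) to threshold exceedances and return its AIC."""
          shape, loc, scale = genpareto.fit(exceedances, floc=0)
          loglik = genpareto.logpdf(exceedances, shape, loc=loc, scale=scale).sum()
          return 2 * 2 - 2 * loglik, (shape, scale)      # two free parameters: shape and scale

      rng = np.random.default_rng(6)
      sample = rng.lognormal(mean=0.0, sigma=0.6, size=5000)   # toy response sample

      candidates = np.quantile(sample, np.linspace(0.80, 0.97, 18))
      results = []
      for u in candidates:
          exceedances = sample[sample > u] - u
          aic, params = gpd_aic(exceedances)
          results.append((aic / len(exceedances), u, params))  # crude per-exceedance normalisation

      _, u_best, (shape_best, scale_best) = min(results)
      print(f"selected threshold = {u_best:.3f}, shape = {shape_best:.3f}, scale = {scale_best:.3f}")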

  9. Threshold Estimation of Generalized Pareto Distribution Based on Akaike Information Criterion for Accurate Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seunghoon; Lim, Woochul; Cho, Su-gil; Park, Sanghyun; Lee, Tae Hee [Hanyang University, Seoul (Korea, Republic of); Lee, Minuk; Choi, Jong-su; Hong, Sup [Korea Research Insitute of Ships and Ocean Engineering, Daejeon (Korea, Republic of)

    2015-02-15

    In order to perform estimations with high reliability, it is necessary to deal with the tail part of the cumulative distribution function (CDF) in greater detail compared to an overall CDF. The use of a generalized Pareto distribution (GPD) to model the tail part of a CDF is receiving more research attention with the goal of performing estimations with high reliability. Current studies on GPDs focus on ways to determine the appropriate number of sample points and their parameters. However, even if a proper estimation is made, it can be inaccurate as a result of an incorrect threshold value. Therefore, in this paper, a GPD based on the Akaike information criterion (AIC) is proposed to improve the accuracy of the tail model. The proposed method determines an accurate threshold value using the AIC with the overall samples before estimating the GPD over the threshold. To validate the accuracy of the method, its reliability is compared with that obtained using a general GPD model with an empirical CDF.

  10. Monte Carlo methods for the reliability analysis of Markov systems

    International Nuclear Information System (INIS)

    Buslik, A.J.

    1985-01-01

    This paper presents Monte Carlo methods for the reliability analysis of Markov systems. Markov models are useful in treating dependencies between components. The present paper shows how the adjoint Monte Carlo method for the continuous time Markov process can be derived from the method for the discrete-time Markov process by a limiting process. The straightforward extensions to the treatment of mean unavailability (over a time interval) are given. System unavailabilities can also be estimated; this is done by making the system failed states absorbing, and not permitting repair from them. A forward Monte Carlo method is presented in which the weighting functions are related to the adjoint function. In particular, if the exact adjoint function is known then weighting factors can be constructed such that the exact answer can be obtained with a single Monte Carlo trial. Of course, if the exact adjoint function is known, there is no need to perform the Monte Carlo calculation. However, the formulation is useful since it gives insight into choices of the weight factors which will reduce the variance of the estimator
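
    A minimal forward (analog) Monte Carlo sketch of the kind of calculation being weighted: simulate histories of a two-component repairable Markov system and estimate the mean unavailability over a mission, which can be checked against the steady-state value for independent components. Rates, mission time and the 1-out-of-2 success criterion are illustrative assumptions; the adjoint and weighted schemes discussed in the paper build on the same simulation skeleton.

      import random

      FAILURE_RATE = 1e-3    # per hour, each component (assumed)
      REPAIR_RATE = 1e-1     # per hour, each component (assumed)
      MISSION = 10_000.0     # hours
      N_HISTORIES = 20_000

      def simulate_downtime(rng):
          """One history of a 1-out-of-2 system; the system is down when both components are failed."""
          t, down = 0.0, 0.0
          up = [True, True]
          while t < MISSION:
              rates = [FAILURE_RATE if s else REPAIR_RATE for s in up]
              total = sum(rates)
              dt = min(rng.expovariate(total), MISSION - t)
              if not any(up):
                  down += dt
              t += dt
              if t >= MISSION:
                  break
              # choose which component changes state, proportionally to its rate
              i = 0 if rng.random() < rates[0] / total else 1
              up[i] = not up[i]
          return down

      rng = random.Random(7)
      mean_unavailability = sum(simulate_downtime(rng) for _ in range(N_HISTORIES)) / (N_HISTORIES * MISSION)
      print(f"estimated mean unavailability ~ {mean_unavailability:.2e}")
      # steady-state check for independent components: (FAILURE_RATE / (FAILURE_RATE + REPAIR_RATE))**2 ~ 9.8e-5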

  11. Reliability of non-destructive testing methods

    International Nuclear Information System (INIS)

    Broekhoven, M.J.G.

    1988-01-01

    This contribution regards the results of an evaluation of the reliability of radiography (X-rays and gamma-rays) and of manual and mechanized/automated ultrasonic examination according to generally accepted codes/rules, with respect to detection, characterization and sizing/localization of defects. The evaluation is based on the results of examinations, by a number of teams, of 30 test plates of 30 and 50 mm thickness, containing V-, U-, X- and K-shaped welds, each containing several types of imperfections (211 in total) typical of steel arc fusion welding, such as porosity, inclusions, lack of fusion or penetration, and cracks. In addition, some results are presented that were obtained from research on advanced UT techniques, viz. the time-of-flight-diffraction and flaw-tip deflection techniques. (author)

  12. Reliability of non-destructive testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Broekhoven, M J.G. [Ministry of Social Affairs, (Netherlands)

    1988-12-31

    This contribution regards the results of an evaluation of the reliability of radiography (X-rays and gamma-rays) and of manual and mechanized/automated ultrasonic examination according to generally accepted codes/rules, with respect to detection, characterization and sizing/localization of defects. The evaluation is based on the results of examinations, by a number of teams, of 30 test plates of 30 and 50 mm thickness, containing V-, U-, X- and K-shaped welds, each containing several types of imperfections (211 in total) typical of steel arc fusion welding, such as porosity, inclusions, lack of fusion or penetration, and cracks. In addition, some results are presented that were obtained from research on advanced UT techniques, viz. the time-of-flight-diffraction and flaw-tip deflection techniques. (author). 4 refs.

  13. Reliability: How much is it worth? Beyond its estimation or prediction, the (net) present value of reliability

    International Nuclear Information System (INIS)

    Saleh, J.H.; Marais, K.

    2006-01-01

    In this article, we link an engineering concept, reliability, to a financial and managerial concept, net present value, by exploring the impact of a system's reliability on its revenue generation capability. The framework here developed for non-repairable systems quantitatively captures the value of reliability from a financial standpoint. We show that traditional present value calculations of engineering systems do not account for system reliability, thus over-estimate a system's worth and can therefore lead to flawed investment decisions. It is therefore important to involve reliability engineers upfront before investment decisions are made in technical systems. In addition, the analyses here developed help designers identify the optimal level of reliability that maximizes a system's net present value-the financial value reliability provides to the system minus the cost to achieve this level of reliability. Although we recognize that there are numerous considerations driving the specification of an engineering system's reliability, we contend that the financial analysis of reliability here developed should be made available to decision-makers to support in part, or at least be factored into, the system reliability specification
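
    A minimal sketch of the idea for a non-repairable system with an exponential lifetime: yearly revenue is weighted by the survival probability R(t) and discounted, an acquisition cost growing with the reliability level is subtracted, and the candidate design with the largest expected net present value is kept. The revenue, discount rate and cost-of-reliability model are illustrative assumptions, not the paper's formulation.

      import numpy as np

      def npv_of_reliability(mtbf_years, horizon_years=15, revenue_per_year=10.0,
                             discount_rate=0.08, cost_per_mtbf_year=2.0):
          """Expected NPV of a non-repairable system with exponential lifetime (toy model)."""
          t = np.arange(1, horizon_years + 1)
          survival = np.exp(-t / mtbf_years)                    # R(t): probability still operating
          discounted_revenue = np.sum(revenue_per_year * survival / (1.0 + discount_rate) ** t)
          acquisition_cost = cost_per_mtbf_year * mtbf_years    # assumed cost-of-reliability model
          return discounted_revenue - acquisition_cost

      candidate_mtbfs = np.arange(2, 41)                        # candidate design reliability levels
      values = [npv_of_reliability(m) for m in candidate_mtbfs]
      best = candidate_mtbfs[int(np.argmax(values))]
      print(f"value-maximising design MTBF ~ {best} years, expected NPV = {max(values):.1f} (arbitrary units)")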

  14. A reliability evaluation method for NPP safety DCS application software

    International Nuclear Information System (INIS)

    Li Yunjian; Zhang Lei; Liu Yuan

    2014-01-01

    In the field of nuclear power plant (NPP) digital I&C applications, reliability evaluation of safety DCS application software is a key obstacle to be removed. In order to quantitatively evaluate the reliability of NPP safety DCS application software, this paper proposes a reliability evaluation method based on the V&V defect density characteristics of every stage of the software development life cycle, by which the operating reliability level of the software can be predicted before its delivery; this helps to improve the reliability of NPP safety-important software. (authors)

  15. Empiric reliability of diagnostic and prognostic estimations of physical standards of children, going in for sports.

    Directory of Open Access Journals (Sweden)

    Zaporozhanov V.A.

    2012-12-01

    Full Text Available In the conditions of sporting-pedagogical practice, the objective estimation of the potential abilities of trainees already at the initial stages of long-term preparation is regarded as one of the issues of the day. The corresponding quantitative information allows the preparation of trainees to be individualized according to the requirements of the guided process. Research purpose: to show, logically and metrically, the expedience of a metrical method for calculating the reliability of the results of control measurements used for diagnostics of psychophysical fitness and for prognosis of the growth of trainees' proficiency in the selected sport. Material and methods: the results of control measurements on four indexes of psychophysical preparedness, and experts' estimations of fitness, were analysed for 24 children at a gymnastics school. The results of the initial and final inspections of the gymnasts on the same control tests were processed by the methods of mathematical statistics. Metrical estimates of the reliability of the measurements (stability, consistency, and informativeness of the control information) were computed for current diagnostics and prognosis of the sporting potential of those inspected. Results: the expedience of using metrical operations to calculate a complex estimation of the psychophysical state of trainees for these purposes is metrologically grounded. Conclusions: the research results confirm the expedience of calculating a complex estimation of the psychophysical features of trainees for diagnostics of fitness in the selected sport and for prognosis of proficiency at subsequent stages of preparation.

  16. Reliability studies of diagnostic methods in Indian traditional Ayurveda medicine: An overview

    Science.gov (United States)

    Kurande, Vrinda Hitendra; Waagepetersen, Rasmus; Toft, Egon; Prasad, Ramjee

    2013-01-01

    Recently, a need to develop supportive new scientific evidence for contemporary Ayurveda has emerged. One of the research objectives is an assessment of the reliability of diagnoses and treatment. Reliability is a quantitative measure of consistency. It is a crucial issue in classification (such as prakriti classification), method development (pulse diagnosis), quality assurance for diagnosis and treatment and in the conduct of clinical studies. Several reliability studies are conducted in western medicine. The investigation of the reliability of traditional Chinese, Japanese and Sasang medicine diagnoses is in the formative stage. However, reliability studies in Ayurveda are in the preliminary stage. In this paper, examples are provided to illustrate relevant concepts of reliability studies of diagnostic methods and their implication in practice, education, and training. An introduction to reliability estimates and different study designs and statistical analysis is given for future studies in Ayurveda. PMID:23930037

  17. Validity and reliability of central blood pressure estimated by upper arm oscillometric cuff pressure.

    Science.gov (United States)

    Climie, Rachel E D; Schultz, Martin G; Nikolic, Sonja B; Ahuja, Kiran D K; Fell, James W; Sharman, James E

    2012-04-01

    Noninvasive central blood pressure (BP) independently predicts mortality, but current methods are operator-dependent, requiring skill to obtain quality recordings. The aims of this study were, first, to determine the validity of an automatic, upper arm oscillometric cuff method for estimating central BP (O(CBP)) by comparison with the noninvasive reference standard of radial tonometry (T(CBP)), and second, to determine the intratest and intertest reliability of O(CBP). To assess validity, central BP was estimated by O(CBP) (Pulsecor R6.5B monitor) and compared with T(CBP) (SphygmoCor) in 47 participants free from cardiovascular disease (aged 57 ± 9 years) in supine, seated, and standing positions. Brachial mean arterial pressure (MAP) and diastolic BP (DBP) from the O(CBP) device were used for calibration in both devices. Duplicate measures were recorded in each position on the same day to assess intratest reliability, and participants returned within 10 ± 7 days for repeat measurements to assess intertest reliability. There was a strong intraclass correlation (ICC = 0.987) and a small mean difference (1.2 ± 2.2 mm Hg) for central systolic BP (SBP) determined by O(CBP) compared with T(CBP). Ninety-six percent of all comparisons (n = 495 acceptable recordings) were within 5 mm Hg. With respect to reliability, there were strong correlations but higher limits of agreement for the intratest (ICC = 0.975, mean difference 0.6 ± 4.5 mm Hg) and intertest (ICC = 0.895, mean difference 4.3 ± 8.0 mm Hg) comparisons. Estimation of central SBP using cuff oscillometry is comparable to radial tonometry and has good reproducibility. As a noninvasive, relatively operator-independent method, O(CBP) may be as useful as T(CBP) for estimating central BP in clinical practice.

  18. Reliability Estimation of Aero-engine Based on Mixed Weibull Distribution Model

    Science.gov (United States)

    Yuan, Zhongda; Deng, Junxiang; Wang, Dawei

    2018-02-01

    An aero-engine is a complex mechanical-electronic system, and in the reliability analysis of such systems the Weibull distribution model plays an irreplaceable role. Until now, only the two-parameter and three-parameter Weibull distribution models have been widely used. Due to the diversity of engine failure modes, a single Weibull distribution model carries a large error. By contrast, a variety of engine failure modes can be taken into account with a mixed Weibull distribution model, so it is a good statistical analysis model. In addition to the concept of a dynamic weight coefficient, a three-parameter correlation coefficient optimization method is applied to enhance the Weibull distribution model in order to make the reliability estimation result more accurate; thus the precision of the mixed distribution reliability model is improved greatly. All of this is advantageous for popularizing the Weibull distribution model in engineering applications.
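
    The underlying reliability function is simply a weighted sum of Weibull survival terms. A minimal sketch with two assumed failure-mode populations (an early-failure mode and a wear-out mode) is shown below; the weights, shapes and characteristic lives are illustrative, and fitting them to data, as the paper does, is a separate estimation problem.

      import numpy as np

      def mixed_weibull_reliability(t, weights, shapes, scales):
          """R(t) = sum_i w_i * exp(-(t / eta_i)**beta_i) for a finite Weibull mixture."""
          t = np.asarray(t, dtype=float)
          r = np.zeros_like(t)
          for w, beta, eta in zip(weights, shapes, scales):
              r += w * np.exp(-((t / eta) ** beta))
          return r

      # two hypothetical failure-mode populations
      weights = [0.3, 0.7]
      shapes = [0.8, 3.5]            # beta < 1: early failures; beta > 1: wear-out
      scales = [1500.0, 9000.0]      # characteristic lives in operating hours

      times = np.array([500.0, 2000.0, 6000.0, 9000.0])
      for t, r in zip(times, mixed_weibull_reliability(times, weights, shapes, scales)):
          print(f"R({t:7.0f} h) = {r:.3f}")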

  19. ARA and ARI imperfect repair models: Estimation, goodness-of-fit and reliability prediction

    International Nuclear Information System (INIS)

    Toledo, Maria Luíza Guerra de; Freitas, Marta A.; Colosimo, Enrico A.; Gilardoni, Gustavo L.

    2015-01-01

    An appropriate maintenance policy is essential to reduce expenses and risks related to equipment failures. A fundamental aspect to be considered when specifying such policies is to be able to predict the reliability of the systems under study, based on a well fitted model. In this paper, the classes of models Arithmetic Reduction of Age and Arithmetic Reduction of Intensity are explored. Likelihood functions for such models are derived, and a graphical method is proposed for model selection. A real data set involving failures in trucks used by a Brazilian mining company is analyzed considering models with different memories. Parameters, namely, shape and scale for the Power Law Process, and the efficiency of repair were estimated for the best fitted model. Estimation of model parameters allowed us to derive reliability estimators to predict the behavior of the failure process. These results are valuable information for the mining company and can be used to support decision making regarding preventive maintenance policy. - Highlights: • Likelihood functions for imperfect repair models are derived. • A goodness-of-fit technique is proposed as a tool for model selection. • Failures in trucks owned by a Brazilian mining company are modeled. • Estimation allowed deriving reliability predictors to forecast the future failure process of the trucks

  20. An integrated model for reliability estimation of digital nuclear protection system based on fault tree and software control flow methodologies

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Seong, Poong Hyun

    2000-01-01

    In the nuclear industry, the difficulty of proving the reliabilities of digital systems prohibits the widespread use of digital systems in various nuclear applications such as plant protection systems. Even though there exist a few models which are used to estimate the reliabilities of digital systems, we develop a new integrated model which is more realistic than the existing models. We divide the process of estimating the reliability of a digital system into two phases, a high-level phase and a low-level phase, and the boundary of the two phases is the reliabilities of the subsystems. We apply the software control flow method to the low-level phase and fault tree analysis to the high-level phase. The application of the model to the Dynamic Safety System (DSS) shows that the estimated reliability of the system is quite reasonable and realistic.
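
    Once the low-level phase has produced subsystem failure probabilities (for instance a software figure from a control-flow or software reliability analysis), the high-level phase combines them through the fault tree. A minimal sketch with independent basic events and a two-channel redundancy is given below; the tree structure and all numbers are illustrative assumptions, not the DSS model.

      def p_or(*failure_probs):
          """OR gate: the output fails if any input fails (independent inputs assumed)."""
          p_ok = 1.0
          for p in failure_probs:
              p_ok *= (1.0 - p)
          return 1.0 - p_ok

      def p_and(*failure_probs):
          """AND gate: the output fails only if every input fails (e.g. redundant channels)."""
          p = 1.0
          for q in failure_probs:
              p *= q
          return p

      # hypothetical per-demand failure probabilities from the low-level analyses
      p_sensor = 1e-4
      p_software = 5e-5      # e.g. from a software reliability / control-flow analysis
      p_actuator = 2e-4

      p_channel = p_or(p_sensor, p_software, p_actuator)   # one protection channel
      p_system = p_and(p_channel, p_channel)               # two redundant, independent channels

      print(f"channel failure probability = {p_channel:.2e}")
      print(f"system failure probability  = {p_system:.2e}")
      print(f"system reliability          = {1.0 - p_system:.10f}")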

  1. An integrated model for reliability estimation of digital nuclear protection system based on fault tree and software control flow methodologies

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Seong, Poong Hyun

    2000-01-01

    In the nuclear industry, the difficulty of proving the reliability of digital systems prohibits their widespread use in various nuclear applications such as the plant protection system. Even though a few models exist for estimating the reliability of digital systems, we develop a new integrated model which is more realistic than the existing ones. We divide the process of estimating the reliability of a digital system into two phases, a high-level phase and a low-level phase, and the boundary between the two phases is the reliabilities of the subsystems. We apply the software control flow method to the low-level phase and fault tree analysis to the high-level phase. The application of the model to the dynamic safety system (DSS) shows that the estimated reliability of the system is quite reasonable and realistic. (author)

  2. Estimation of reliability of an interleaving PFC boost converter

    Directory of Open Access Journals (Sweden)

    Gulam Amer Sandepudi

    2010-01-01

    Full Text Available Reliability plays an important role in power supplies. For other electronic equipment, a certain failure mode, at least for a part of the total system, can often be tolerated without serious (critical) effects. However, for a power supply no such condition can be accepted, since very high demands on its reliability must be achieved. At higher power levels, the continuous conduction mode (CCM) boost converter is the preferred topology for implementing a front end with PFC. As a result, significant efforts have been made to improve the performance of the boost converter. This paper is one such effort, aimed at improving the performance of the converter from the reliability point of view. In this paper, an interleaving boost power factor correction converter with a single switch is simulated in continuous conduction mode (CCM), discontinuous conduction mode (DCM) and critical conduction mode (CRM) under different output power ratings. The results for the converter are examined from the reliability point of view.

  3. Mathematical Methods in Survival Analysis, Reliability and Quality of Life

    CERN Document Server

    Huber, Catherine; Mesbah, Mounir

    2008-01-01

    Reliability and survival analysis are important applications of stochastic mathematics (probability, statistics and stochastic processes) that are usually covered separately in spite of the similarity of the involved mathematical theory. This title aims to redress this situation: it includes 21 chapters divided into four parts: Survival analysis, Reliability, Quality of life, and Related topics. Many of these chapters were presented at the European Seminar on Mathematical Methods for Survival Analysis, Reliability and Quality of Life in 2006.

  4. Improvement of human reliability analysis method for PRA

    International Nuclear Information System (INIS)

    Tanji, Junichi; Fujimoto, Haruo

    2013-09-01

    It is required to refine the human reliability analysis (HRA) method by, for example, incorporating consideration of the operator's cognitive process into the evaluation of diagnosis errors and decision-making errors, as part of the development and improvement of methods used in probabilistic risk assessments (PRAs). JNES has developed an HRA method based on ATHENA which is suitable for handling the structured relationship among diagnosis errors, decision-making errors and the operator's cognitive process. This report summarizes the outcomes of the improvement of the HRA method, in which enhancements were made to evaluate how degraded plant conditions affect the operator's cognitive process and to evaluate human error probabilities (HEPs) corresponding to the contents of operator tasks. In addition, this report describes the results of case studies on representative accident sequences carried out to investigate the applicability of the HRA method developed. HEPs for the same accident sequences are also estimated using the THERP method, the most widely used HRA method, and the results obtained with the two methods are compared to highlight their differences and the issues to be solved. Important conclusions obtained are as follows: (1) Improvement of the HRA method using an operator cognitive action model. Factors to be considered in the evaluation of human errors were clarified, degraded plant safety conditions were incorporated into the HRA, and HEPs affected by the contents of operator tasks were investigated in order to improve the HRA method, which can integrate an operator cognitive action model into the ATHENA method. In addition, the detailed procedure of the improved method was delineated in the form of a flowchart. (2) Case studies and comparison with the results evaluated by the THERP method. Four operator actions modeled in the PRAs of representative BWR5 and 4-loop PWR plants were selected and evaluated as case studies. These cases were also evaluated using

  5. An accurate and efficient reliability-based design optimization using the second order reliability method and improved stability transformation method

    Science.gov (United States)

    Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo

    2018-05-01

    The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it shows inaccuracy in calculating the failure probability for highly nonlinear performance functions. Thus, the second order reliability method is required to evaluate the reliability accurately. However, its application to RBDO is quite challenging owing to the expensive computational cost incurred by the repeated reliability evaluations and Hessian calculations of the probabilistic constraints. In this article, a new improved stability transformation method is proposed to search for the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared with existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for RBDO of engineering structures.
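
    For illustration only, the sketch below shows a symmetric rank-one (SR1) update of a Hessian approximation from a step s and the corresponding gradient change y; the numerical values are hypothetical, and the improved stability transformation method itself is not reproduced.

        import numpy as np

        def sr1_update(B, s, y, tol=1e-8):
            """Symmetric rank-one update: B_new = B + (r r^T)/(r^T s), with r = y - B s.
            The update is skipped when the denominator is too small (a standard safeguard)."""
            r = y - B @ s
            denom = r @ s
            if abs(denom) < tol * np.linalg.norm(r) * np.linalg.norm(s):
                return B              # skip the update to avoid numerical blow-up
            return B + np.outer(r, r) / denom

        # Hypothetical data: current Hessian approximation, search step and gradient change.
        B = np.eye(2)
        s = np.array([0.1, -0.05])    # change in the most-probable-point search variables
        y = np.array([0.25, -0.02])   # change in the gradient of the performance function
        print(sr1_update(B, s, y))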

  6. Bayesian estimation methods in metrology

    International Nuclear Information System (INIS)

    Cox, M.G.; Forbes, A.B.; Harris, P.M.

    2004-01-01

    In metrology -- the science of measurement -- a measurement result must be accompanied by a statement of its associated uncertainty. The degree of validity of a measurement result is determined by the validity of the uncertainty statement. In recognition of the importance of uncertainty evaluation, the International Standardization Organization in 1995 published the Guide to the Expression of Uncertainty in Measurement and the Guide has been widely adopted. The validity of uncertainty statements is tested in interlaboratory comparisons in which an artefact is measured by a number of laboratories and their measurement results compared. Since the introduction of the Mutual Recognition Arrangement, key comparisons are being undertaken to determine the degree of equivalence of laboratories for particular measurement tasks. In this paper, we discuss the possible development of the Guide to reflect Bayesian approaches and the evaluation of key comparison data using Bayesian estimation methods

  7. Advances in methods and applications of reliability and safety analysis

    International Nuclear Information System (INIS)

    Fieandt, J.; Hossi, H.; Laakso, K.; Lyytikaeinen, A.; Niemelae, I.; Pulkkinen, U.; Pulli, T.

    1986-01-01

    The know-how of VTT in reliability and safety design and analysis techniques has been established over several years of analyzing the reliability of the Finnish nuclear power plants Loviisa and Olkiluoto. This experience has later been applied and developed for use in the process industry, the conventional power industry, automation and electronics. VTT develops and transfers methods and tools for reliability and safety analysis to the private and public sectors. The technology transfer takes place in joint development projects with potential users. Several computer-aided methods, such as RELVEC for reliability modelling and analysis, have been developed. The tools developed are today used by major Finnish companies in the fields of automation, nuclear power, shipbuilding and electronics. Development of computer-aided and other methods needed in the analysis of operating experience, reliability or safety is continuing in a number of research and development projects

  8. Model uncertainty and multimodel inference in reliability estimation within a longitudinal framework.

    Science.gov (United States)

    Alonso, Ariel; Laenen, Annouschka

    2013-05-01

    Laenen, Alonso, and Molenberghs (2007) and Laenen, Alonso, Molenberghs, and Vangeneugden (2009) proposed a method to assess the reliability of rating scales in a longitudinal context. The methodology is based on hierarchical linear models, and reliability coefficients are derived from the corresponding covariance matrices. However, finding a good parsimonious model to describe complex longitudinal data is a challenging task. Frequently, several models fit the data equally well, raising the problem of model selection uncertainty. When model uncertainty is high one may resort to model averaging, where inferences are based not on one but on an entire set of models. We explored the use of different model building strategies, including model averaging, in reliability estimation. We found that the approach introduced by Laenen et al. (2007, 2009) combined with some of these strategies may yield meaningful results in the presence of high model selection uncertainty and when all models are misspecified, in so far as some of them manage to capture the most salient features of the data. Nonetheless, when all models omit prominent regularities in the data, misleading results may be obtained. The main ideas are further illustrated on a case study in which the reliability of the Hamilton Anxiety Rating Scale is estimated. Importantly, the ambit of model selection uncertainty and model averaging transcends the specific setting studied in the paper and may be of interest in other areas of psychometrics. © 2012 The British Psychological Society.

  9. Reliability estimation of safety-critical software-based systems using Bayesian networks

    International Nuclear Information System (INIS)

    Helminen, A.

    2001-06-01

    Due to the nature of software faults and the way they cause system failures new methods are needed for the safety and reliability evaluation of software-based safety-critical automation systems in nuclear power plants. In the research project 'Programmable automation system safety integrity assessment (PASSI)', belonging to the Finnish Nuclear Safety Research Programme (FINNUS, 1999-2002), various safety assessment methods and tools for software based systems are developed and evaluated. The project is financed together by the Radiation and Nuclear Safety Authority (STUK), the Ministry of Trade and Industry (KTM) and the Technical Research Centre of Finland (VTT). In this report the applicability of Bayesian networks to the reliability estimation of software-based systems is studied. The applicability is evaluated by building Bayesian network models for the systems of interest and performing simulations for these models. In the simulations hypothetical evidence is used for defining the parameter relations and for determining the ability to compensate disparate evidence in the models. Based on the experiences from modelling and simulations we are able to conclude that Bayesian networks provide a good method for the reliability estimation of software-based systems. (orig.)
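
    As a toy illustration of combining disparate evidence about software reliability in a Bayesian-network style (not the models built in the PASSI project), the sketch below updates a discrete prior over a per-demand failure probability with two hypothetical, assumed likelihoods.

        # Toy discrete Bayesian update: prior over the per-demand failure probability of a
        # software-based system, updated with two independent (hypothetical) evidence sources.
        prior = {1e-3: 0.2, 1e-4: 0.5, 1e-5: 0.3}          # P(failure probability class)

        # Assumed likelihoods P(evidence | class): "1000 test demands with no failure" and a
        # qualitative assessment of the development process (values are illustrative only).
        likelihood_tests = {p: (1.0 - p) ** 1000 for p in prior}
        likelihood_process = {1e-3: 0.3, 1e-4: 0.6, 1e-5: 0.8}

        posterior = {p: prior[p] * likelihood_tests[p] * likelihood_process[p] for p in prior}
        norm = sum(posterior.values())
        posterior = {p: v / norm for p, v in posterior.items()}

        mean_pfd = sum(p * w for p, w in posterior.items())  # posterior mean failure probability
        print(posterior, mean_pfd)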

  10. Estimating reliability of degraded system based on the probability density evolution with multi-parameter

    Directory of Open Access Journals (Sweden)

    Jiang Ge

    2017-01-01

    Full Text Available System degradation is usually caused by the degradation of multiple parameters. The assessment of system reliability by the universal generating function is of low accuracy when compared with Monte Carlo simulation, and the probability density function of the system output performance cannot be obtained. Therefore, a reliability assessment method based on probability density evolution with multiple parameters is presented for complex degraded systems. First, the system output function is formulated according to the transitive relation between the component parameters and the system output performance. Then, the probability density evolution equation is established based on the probability conservation principle and the system output function. Furthermore, the probability distribution characteristics of the system output performance are obtained by solving the differential equation. Finally, the reliability of the degraded system is estimated. This method does not require discretizing the performance parameters and can establish a continuous probability density function of the system output performance with high computational efficiency and low cost. A numerical example shows that this method is applicable to evaluating the reliability of multi-parameter degraded systems.
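
    The Monte Carlo baseline that the density evolution method is compared with above can be sketched as follows, using purely hypothetical parameter distributions and an assumed output function; solving the probability density evolution equation itself is not shown.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        # Hypothetical degraded component parameters at a given time (e.g. a resistance and a gain).
        x1 = rng.normal(loc=10.0, scale=0.8, size=n)
        x2 = rng.normal(loc=5.0, scale=0.5, size=n)

        # Assumed system output function linking component parameters to system performance.
        g = x1 * x2 / (x1 + x2)

        threshold = 3.0                       # minimum acceptable performance (assumed)
        reliability = np.mean(g > threshold)  # P(output performance exceeds the threshold)
        print(reliability)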

  11. Reliability estimation for check valves and other components

    International Nuclear Information System (INIS)

    McElhaney, K.L.; Staunton, R.H.

    1996-01-01

    For years the nuclear industry has depended upon component operational reliability information compiled from reliability handbooks and other generic sources as well as private databases generated by recognized experts both within and outside the nuclear industry. Regrettably, these technical bases lacked the benefit of large-scale operational data and comprehensive data verification, and did not take into account the parameters and combinations of parameters that affect the determination of failure rates. This paper briefly examines the historic use of generic component reliability data, its sources, and its limitations. The concept of using a single failure rate for a particular component type is also examined. Particular emphasis is placed on check valves due to the information available on those components. The Appendix presents some of the results of the extensive analyses done by Oak Ridge National Laboratory (ORNL) on check valve performance

  12. Generating human reliability estimates using expert judgment. Volume 1. Main report

    International Nuclear Information System (INIS)

    Comer, M.K.; Seaver, D.A.; Stillwell, W.G.; Gaddy, C.D.

    1984-11-01

    The US Nuclear Regulatory Commission is conducting a research program to determine the practicality, acceptability, and usefulness of several different methods for obtaining human reliability data and estimates that can be used in nuclear power plant probabilistic risk assessment (PRA). One method, investigated as part of this overall research program, uses expert judgment to generate human error probability (HEP) estimates and associated uncertainty bounds. The project described in this document evaluated two techniques for using expert judgment: paired comparisons and direct numerical estimation. Volume 1 of this report provides a brief overview of the background of the project, the procedure for using psychological scaling techniques to generate HEP estimates, and conclusions from the evaluation of the techniques. Results of the evaluation indicate that techniques using expert judgment should be given strong consideration for use in developing HEP estimates. In addition, HEP estimates for 35 tasks related to boiling water reactors (BWRs) were obtained as part of the evaluation. These HEP estimates are also included in the report

  13. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    Energy Technology Data Exchange (ETDEWEB)

    Ronald Laurids Boring

    2010-11-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  14. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    International Nuclear Information System (INIS)

    Boring, Ronald Laurids

    2010-01-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  15. A Survey of Software Reliability Modeling and Estimation

    Science.gov (United States)

    1983-09-01

    Models considered include the Jelinski-Moranda model, the Geometric model, and Musa's model. A Monte Carlo study of the behavior of the least-squares... Sukert, Alan and Goel, Amrit, "A Guidebook for Software Reliability Assessment," 1980.

  16. A method to assign failure rates for piping reliability assessments

    International Nuclear Information System (INIS)

    Gamble, R.M.; Tagart, S.W. Jr.

    1991-01-01

    This paper reports on a simplified method that has been developed to assign failure rates that can be used in reliability and risk studies of piping. The method can be applied on a line-by-line basis by identifying line and location specific attributes that can lead to piping unreliability from in-service degradation mechanisms and random events. A survey of service experience for nuclear piping reliability also was performed. The data from this survey provides a basis for identifying in-service failure attributes and assigning failure rates for risk and reliability studies

  17. Reliability analysis of neutron transport simulation using Monte Carlo method

    International Nuclear Information System (INIS)

    Souza, Bismarck A. de; Borges, Jose C.

    1995-01-01

    This work presents a statistical and reliability analysis covering data obtained by computer simulation of neutron transport process, using the Monte Carlo method. A general description of the method and its applications is presented. Several simulations, corresponding to slowing down and shielding problems have been accomplished. The influence of the physical dimensions of the materials and of the sample size on the reliability level of results was investigated. The objective was to optimize the sample size, in order to obtain reliable results, optimizing computation time. (author). 5 refs, 8 figs
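
    A minimal sketch of the sample-size effect discussed above, using a deliberately simplified shielding-type calculation (exponential attenuation with a hypothetical mean free path) rather than a full neutron transport simulation: the statistical standard error of the Monte Carlo estimate shrinks roughly as 1/sqrt(N).

        import numpy as np

        rng = np.random.default_rng(42)
        mfp, thickness = 2.0, 10.0            # hypothetical mean free path and slab thickness (cm)

        for n in (10**3, 10**4, 10**5, 10**6):
            # Crude estimate of the uncollided transmission fraction: the path to the first
            # collision is sampled from an exponential distribution; a history is counted as
            # "transmitted" if that path exceeds the slab thickness.
            paths = rng.exponential(mfp, size=n)
            p_hat = np.mean(paths > thickness)
            std_err = np.sqrt(p_hat * (1.0 - p_hat) / n)
            print(f"N={n:>8d}  estimate={p_hat:.5f}  standard error={std_err:.5f}")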

  18. Modifying nodal pricing method considering market participants optimality and reliability

    Directory of Open Access Journals (Sweden)

    A. R. Soofiabadi

    2015-06-01

    Full Text Available This paper develops a method for nodal pricing and a market clearing mechanism considering the reliability of the system. The effects of component reliability on electricity price, market participants' profit and system social welfare are considered. This paper considers reliability both for evaluating market participants' optimality and for a fair pricing and market clearing mechanism. To achieve fair pricing, the nodal price has been obtained through a two-stage optimization problem, and to achieve a fair market clearing mechanism, comprehensive criteria have been introduced for the optimality evaluation of market participants. The social welfare of the system and the system efficiency are increased under the proposed modified nodal pricing method.

  19. Engineer’s estimate reliability and statistical characteristics of bids

    Directory of Open Access Journals (Sweden)

    Fariborz M. Tehrani

    2016-12-01

    Full Text Available The objective of this report is to provide a methodology for examining bids and evaluating the performance of engineer's estimates in capturing the true cost of projects. This study reviews the cost development for transportation projects in addition to two sources of uncertainty in a cost estimate, namely modeling errors and inherent variability. The sample projects are highway maintenance projects with a similar scope of work, size, and schedule. Statistical analysis of engineering estimates and bids examines the adaptability of statistical models for the sample projects. Further, the variation of engineering cost estimates from inception to implementation is presented and discussed for selected projects. Moreover, the applicability of extreme value theory is assessed for the available data. The results indicate that the performance of the engineer's estimate is best evaluated based on the trimmed average of bids, excluding discordant bids.
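
    For the trimmed average of bids mentioned above, a small sketch with made-up bid values and an arbitrary trimming proportion is:

        import numpy as np
        from scipy import stats

        bids = np.array([1.92, 2.05, 2.10, 2.16, 2.25, 3.40])   # hypothetical bids, $ millions
        engineer_estimate = 2.30                                 # hypothetical engineer's estimate

        trimmed = stats.trim_mean(bids, proportiontocut=0.2)     # drop 20% of bids from each tail
        ratio = engineer_estimate / trimmed
        print(f"trimmed mean of bids = {trimmed:.3f}, estimate/trimmed-mean ratio = {ratio:.2f}")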

  20. Reliability estimates for selected sensors in fusion applications

    International Nuclear Information System (INIS)

    Cadwallader, L.C.

    1996-09-01

    This report presents the results of a study to define several types of sensors in use, the qualitative reliability (failure modes) and quantitative reliability (average failure rates) for these types of process sensors. Temperature, pressure, flow, and level sensors are discussed for water coolant and for cryogenic coolants. The failure rates that have been found are useful for risk assessment and safety analysis. Repair times and calibration intervals are also given when found in the literature. All of these values can also be useful to plant operators and maintenance personnel. Designers may be able to make use of these data when planning systems. The final chapter in this report discusses failure rates for several types of personnel safety sensors, including ionizing radiation monitors, toxic and combustible gas detectors, humidity sensors, and magnetic field sensors. These data could be useful to industrial hygienists and other safety professionals when designing or auditing for personnel safety

  1. Sensitivity of Reliability Estimates in Partially Damaged RC Structures subject to Earthquakes, using Reduced Hysteretic Models

    DEFF Research Database (Denmark)

    Iwankiewicz, R.; Nielsen, Søren R. K.; Skjærbæk, P. S.

    The subject of the paper is the investigation of the sensitivity of structural reliability estimation by a reduced hysteretic model for a reinforced concrete frame under an earthquake excitation.

  2. Reliability and precision of pellet-group counts for estimating landscape-level deer density

    Science.gov (United States)

    David S. deCalesta

    2013-01-01

    This study provides hitherto unavailable methodology for reliably and precisely estimating deer density within forested landscapes, enabling quantitative rather than qualitative deer management. Reliability and precision of the deer pellet-group technique were evaluated in 1 small and 2 large forested landscapes. Density estimates, adjusted to reflect deer harvest and...

  3. Alternative Estimates of the Reliability of College Grade Point Averages. Professional File. Article 130, Spring 2013

    Science.gov (United States)

    Saupe, Joe L.; Eimers, Mardy T.

    2013-01-01

    The purpose of this paper is to explore differences in the reliabilities of cumulative college grade point averages (GPAs), estimated for unweighted and weighted, one-semester, 1-year, 2-year, and 4-year GPAs. Using cumulative GPAs for a freshman class at a major university, we estimate internal consistency (coefficient alpha) reliabilities for…

  4. Reliability assessment using Bayesian networks. Case study on quantitative reliability estimation of a software-based motor protection relay

    International Nuclear Information System (INIS)

    Helminen, A.; Pulkkinen, U.

    2003-06-01

    In this report a quantitative reliability assessment of motor protection relay SPAM 150 C has been carried out. The assessment focuses to the methodological analysis of the quantitative reliability assessment using the software-based motor protection relay as a case study. The assessment method is based on Bayesian networks and tries to take the full advantage of the previous work done in a project called Programmable Automation System Safety Integrity assessment (PASSI). From the results and experiences achieved during the work it is justified to claim that the assessment method presented in the work enables a flexible use of qualitative and quantitative elements of reliability related evidence in a single reliability assessment. At the same time the assessment method is a concurrent way of reasoning one's beliefs and references about the reliability of the system. Full advantage of the assessment method is taken when using the method as a way to cultivate the information related to the reliability of software-based systems. The method can also be used as a communicational instrument in a licensing process of software-based systems. (orig.)

  5. Evaluation of Information Requirements of Reliability Methods in Engineering Design

    DEFF Research Database (Denmark)

    Marini, Vinicius Kaster; Restrepo-Giraldo, John Dairo; Ahmed-Kristensen, Saeema

    2010-01-01

    This paper aims to characterize the information needed to perform methods for robustness and reliability, and verify their applicability to early design stages. Several methods were evaluated on their support to synthesis in engineering design. Of those methods, FMEA, FTA and HAZOP were selected...

  6. Approximate estimation of system reliability via fault trees

    International Nuclear Information System (INIS)

    Dutuit, Y.; Rauzy, A.

    2005-01-01

    In this article, we show how fault tree analysis, carried out by means of binary decision diagrams (BDD), is able to approximate reliability of systems made of independent repairable components with a good accuracy and a good efficiency. We consider four algorithms: the Murchland lower bound, the Barlow-Proschan lower bound, the Vesely full approximation and the Vesely asymptotic approximation. For each of these algorithms, we consider an implementation based on the classical minimal cut sets/rare events approach and another one relying on the BDD technology. We present numerical results obtained with both approaches on various examples
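
    A minimal sketch of the classical minimal cut sets / rare-event route mentioned above (not the BDD implementation), with hypothetical failure and repair rates for independent repairable components; the asymptotic component unavailability q = lambda/(lambda + mu) is combined over the cut sets.

        import numpy as np

        # Hypothetical component failure rates (per hour) and repair rates (per hour).
        lam = {"A": 1e-4, "B": 2e-4, "C": 5e-5}
        mu  = {"A": 1e-1, "B": 5e-2, "C": 1e-1}

        # Asymptotic (steady-state) unavailability of an independent repairable component.
        q = {c: lam[c] / (lam[c] + mu[c]) for c in lam}

        # Hypothetical minimal cut sets of the fault tree.
        cut_sets = [{"A", "B"}, {"A", "C"}, {"B", "C"}]

        # Rare-event approximation: system unavailability ~ sum over cut sets of the product of
        # component unavailabilities; the min-cut upper bound is computed for comparison.
        q_sys_rare = sum(np.prod([q[c] for c in cs]) for cs in cut_sets)
        q_sys_mcub = 1.0 - np.prod([1.0 - np.prod([q[c] for c in cs]) for cs in cut_sets])
        print(q_sys_rare, q_sys_mcub)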

  7. Lifetime Reliability Estimate and Extreme Permanent Deformations of Randomly Excited Elasto-Plastic Structures

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1983-01-01

    A method is presented for lifetime reliability estimates of randomly excited yielding systems, assuming the structure to be safe when the plastic deformations are confined below certain limits. The accumulated plastic deformations during any single significant loading history are considered, and the plastic deformation accumulated over several loadings can be modelled as a filtered Poisson process. Using the Markov property of this quantity, the considered first-passage problem as well as the related extreme distribution problems are then solved numerically, and the results are compared to simulation studies.

  8. Structural hybrid reliability index and its convergent solving method based on random–fuzzy–interval reliability model

    OpenAIRE

    Hai An; Ling Zhou; Hui Sun

    2016-01-01

    Aiming to resolve the problems of a variety of uncertainty variables that coexist in the engineering structure reliability analysis, a new hybrid reliability index to evaluate structural hybrid reliability, based on the random–fuzzy–interval model, is proposed in this article. The convergent solving method is also presented. First, the truncated probability reliability model, the fuzzy random reliability model, and the non-probabilistic interval reliability model are introduced. Then, the new...

  9. Consumptive use of upland rice as estimated by different methods

    International Nuclear Information System (INIS)

    Chhabda, P.R.; Varade, S.B.

    1985-01-01

    The consumptive use of upland rice (Oryza sativa Linn.) grown during the wet season (kharif), as estimated by the modified Penman, radiation, pan-evaporation and Hargreaves methods, showed variation from the consumptive use computed by the gravimetric method. The variability increased with an increase in the irrigation interval, and decreased with an increase in the level of N applied. The average variability was least for the pan-evaporation method, which could reliably be used for estimating the water requirement of upland rice if percolation losses are considered

  10. The psychophysiological assessment method for pilot's professional reliability.

    Science.gov (United States)

    Zhang, L M; Yu, L S; Wang, K N; Jing, B S; Fang, C

    1997-05-01

    Previous research has shown that a pilot's professional reliability depends on two relative factors: the pilot's functional state and the demands of task workload. The Psychophysiological Reserve Capacity (PRC) is defined as a pilot's ability to accomplish additive tasks without reducing the performance of the primary task (flight task). We hypothesized that the PRC was a mirror of the pilot's functional state. The purpose of this study was to probe the psychophysiological method for evaluating a pilot's professional reliability on a simulator. The PRC Comprehensive Evaluating System (PRCCES) which was used in the experiment included four subsystems: a) quantitative evaluation system for pilot's performance on simulator; b) secondary task display and quantitative estimating system; c) multiphysiological data monitoring and statistical system; and d) comprehensive evaluation system for pilot PRC. Two studies were performed. In study one, 63 healthy and 13 hospitalized pilots participated. Each pilot performed a double 180 degrees circuit flight program with and without secondary task (three digit operation). The operator performance, score of secondary task and cost of physiological effort were measured and compared by PRCCES in the two conditions. Then, each pilot's flight skill in training was subjectively scored by instructor pilot ratings. In study two, 7 healthy pilots volunteered to take part in the experiment on the effects of sleep deprivation on pilot's PRC. Each participant had PRC tested pre- and post-8 h sleep deprivation. The results show that the PRC values of a healthy pilot was positively correlated with abilities of flexibility, operating and correcting deviation, attention distribution, and accuracy of instrument flight in the air (r = 0.27-0.40, p < 0.05), and negatively correlated with emotional anxiety in flight (r = -0.40, p < 0.05). The values of PRC in healthy pilots (0.61 +/- 0.17) were significantly higher than that of hospitalized pilots

  11. Estimation of the reliability function for two-parameter exponentiated Rayleigh or Burr type X distribution

    Directory of Open Access Journals (Sweden)

    Anupam Pathak

    2014-11-01

    Full Text Available Abstract: Problem Statement: The two-parameter exponentiated Rayleigh distribution has been widely used, especially in the modelling of lifetime event data. It provides a statistical model which has a wide variety of applications in many areas, and its main advantage is its suitability for lifetime events compared with other distributions. The uniformly minimum variance unbiased and maximum likelihood estimation methods are ways to estimate the parameters of the distribution. In this study we explore and compare the performance of the uniformly minimum variance unbiased and maximum likelihood estimators of the reliability functions R(t)=P(X>t) and P=P(X>Y) for the two-parameter exponentiated Rayleigh distribution. Approach: A new technique for obtaining these parametric functions is introduced, in which the major role is played by the powers of the parameter(s), and the functional forms of the parametric functions to be estimated are not needed. We explore the performance of these estimators numerically under varying conditions. Through a simulation study, a comparison is made of the performance of these estimators with respect to bias, mean square error (MSE), 95% confidence length and the corresponding coverage percentage. Conclusion: Based on the results of the simulation study, the UMVUEs of R(t) and P for the two-parameter exponentiated Rayleigh distribution were found to be superior to the MLEs of R(t) and P.
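
    A minimal sketch of maximum likelihood estimation of R(t) for the two-parameter exponentiated Rayleigh (Burr type X) distribution, assuming the parameterization F(x) = (1 - exp(-(lambda*x)^2))^alpha; the data are simulated with hypothetical parameter values, and the UMVUE construction described in the paper is not reproduced.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(1)
        alpha_true, lam_true = 2.0, 0.5       # hypothetical "true" parameters

        # Simulate by inverse transform of F(x) = (1 - exp(-(lam*x)**2))**alpha.
        u = rng.uniform(size=500)
        x = np.sqrt(-np.log(1.0 - u ** (1.0 / alpha_true))) / lam_true

        def neg_log_lik(theta):
            a, lam = np.exp(theta)            # log-scale parameters keep a, lam positive
            z = np.exp(-(lam * x) ** 2)
            return -np.sum(np.log(2 * a * lam**2 * x) - (lam * x) ** 2 + (a - 1) * np.log1p(-z))

        res = minimize(neg_log_lik, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
        a_hat, lam_hat = np.exp(res.x)

        t = 2.0
        R_t = 1.0 - (1.0 - np.exp(-(lam_hat * t) ** 2)) ** a_hat   # MLE of R(t) = P(X > t)
        print(a_hat, lam_hat, R_t)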

  12. Sequential optimization and reliability assessment method for metal forming processes

    International Nuclear Information System (INIS)

    Sahai, Atul; Schramm, Uwe; Buranathiti, Thaweepat; Chen Wei; Cao Jian; Xia, Cedric Z.

    2004-01-01

    Uncertainty is inevitable in any design process. The uncertainty could be due to variations in the geometry of the part or in material properties, or due to a lack of knowledge about the phenomena being modeled. Deterministic design optimization does not take uncertainty into account, and worst-case assumptions lead to vastly over-conservative designs. Probabilistic design, such as reliability-based design and robust design, offers tools for making robust and reliable decisions in the presence of uncertainty in the design process. Probabilistic design optimization often involves a double-loop procedure of optimization and iterative probabilistic assessment, which results in a high computational demand. This demand can be reduced by replacing computationally intensive simulation models with less costly surrogate models and by employing the Sequential Optimization and Reliability Assessment (SORA) method. The SORA method uses a single-loop strategy with a series of cycles of deterministic optimization and reliability assessment; the deterministic optimization and the reliability assessment are decoupled in each cycle. This leads to a quick improvement of the design from one cycle to the next and an increase in computational efficiency. This paper demonstrates the effectiveness of the SORA method when applied to designing a sheet metal flanging process. Surrogate models are used as less costly approximations to the computationally expensive finite element simulations

  13. Estimating the reliability of eyewitness identifications from police lineups.

    Science.gov (United States)

    Wixted, John T; Mickes, Laura; Dunn, John C; Clark, Steven E; Wells, William

    2016-01-12

    Laboratory-based mock crime studies have often been interpreted to mean that (i) eyewitness confidence in an identification made from a lineup is a weak indicator of accuracy and (ii) sequential lineups are diagnostically superior to traditional simultaneous lineups. Largely as a result, juries are increasingly encouraged to disregard eyewitness confidence, and up to 30% of law enforcement agencies in the United States have adopted the sequential procedure. We conducted a field study of actual eyewitnesses who were assigned to simultaneous or sequential photo lineups in the Houston Police Department over a 1-y period. Identifications were made using a three-point confidence scale, and a signal detection model was used to analyze and interpret the results. Our findings suggest that (i) confidence in an eyewitness identification from a fair lineup is a highly reliable indicator of accuracy and (ii) if there is any difference in diagnostic accuracy between the two lineup formats, it likely favors the simultaneous procedure.

  14. Statistical models and methods for reliability and survival analysis

    CERN Document Server

    Couallier, Vincent; Huber-Carol, Catherine; Mesbah, Mounir; Huber -Carol, Catherine; Limnios, Nikolaos; Gerville-Reache, Leo

    2013-01-01

    Statistical Models and Methods for Reliability and Survival Analysis brings together contributions by specialists in statistical theory as they discuss their applications providing up-to-date developments in methods used in survival analysis, statistical goodness of fit, stochastic processes for system reliability, amongst others. Many of these are related to the work of Professor M. Nikulin in statistics over the past 30 years. The authors gather together various contributions with a broad array of techniques and results, divided into three parts - Statistical Models and Methods, Statistical

  15. Estimation methods for special nuclear materials holdup

    International Nuclear Information System (INIS)

    Pillay, K.K.S.; Picard, R.R.

    1984-01-01

    The potential value of statistical models for the estimation of residual inventories of special nuclear materials was examined using holdup data from processing facilities and through controlled experiments. Although the measurement of hidden inventories of special nuclear materials in large facilities is a challenging task, reliable estimates of these inventories can be developed through a combination of good measurements and the use of statistical models. 7 references, 5 figures

  16. Development on methods for evaluating structure reliability of piping components

    International Nuclear Information System (INIS)

    Schimpfke, T.; Grebner, H.; Peschke, J.; Sievers, J.

    2003-01-01

    In the frame of the German reactor safety research program of the Federal Ministry of Economics and Labour, GRS has started to develop an analysis code named PROST (PRObabilistic STructure analysis) for estimating the leak and break probabilities of piping systems in nuclear power plants. The development is based on the experience gained with applications of the publicly available US code PRAISE 3.10 (Piping Reliability Analysis Including Seismic Events), which was supplemented by additional features regarding the statistical evaluation and the crack orientation. PROST is designed to be more flexible to changes and supplementations. Up to now it can be used for calculating fatigue problems. The paper describes the main capabilities and theoretical background of the present PROST development and presents a parametric study on the influence of changing the method of stress intensity factor and limit load calculation, and of the statistical evaluation options, on the leak probability of an exemplary pipe with a postulated axial crack distribution. Furthermore, the resulting leak probability of an exemplary pipe with a postulated circumferential crack distribution is compared with the results of the modified PRAISE computer program. The intention of this investigation is to show trends; therefore the resulting absolute values for the probabilities should not be considered realistic evaluations. (author)

  17. Level III Reliability methods feasible for complex structures

    NARCIS (Netherlands)

    Waarts, P.H.; Boer, A. de

    2001-01-01

    The paper describes the comparison between three types of reliability methods: a code-type level I method as used by a designer, a full level I method and a level III method. Two cases that are typical of civil engineering practice, a cable-stayed bridge subjected to traffic load and the installation of a soil retaining sheet

  18. Developing a reliable signal wire attachment method for rail.

    Science.gov (United States)

    2014-11-01

    The goal of this project was to develop a better attachment method for rail signal wires to improve the reliability of signaling : systems. EWI conducted basic research into the failure mode of current attachment methods and developed and tested a ne...

  19. Recent advances in computational structural reliability analysis methods

    Science.gov (United States)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-10-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, the problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.

  20. Using graph models for evaluating the reliability of in-core monitoring systems by the method of imitating simulation

    International Nuclear Information System (INIS)

    Golovanov, M.N.; Zyuzin, N.N.; Levin, G.L.; Chesnokov, A.N.

    1987-01-01

    An approach for estimating the reliability factors of complex redundant systems at early stages of development using the method of imitating simulation is considered. Different types of models and their merits and drawbacks are given. Features of in-core monitoring systems and the advisability of applying graph models and elements of graph theory for estimating the reliability of such systems are shown. The results of an investigation of the reliability factors of the reactor monitoring, control and core local protection subsystem are shown

  1. Metrological and reliable characteristics of transducers: estimation techniques

    International Nuclear Information System (INIS)

    Volkov, V.A.; Ryzhakov, V.V.

    1993-01-01

    Methods and techniques are presented for finding estimates of the dispersions of metering method errors due to different factors, of the non-linearity of transformation functions and of their hysteresis, as well as estimates of the total operating time of long-term-use metering means. A program for computer processing of statistical data is given. 65 refs.

  2. Reliably detectable flaw size for NDE methods that use calibration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-04-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. In this paper, POD analysis is applied to an NDE method, such as eddy current testing, where calibration is used. NDE calibration standards have artificial flaws of known size, such as electro-discharge machined (EDM) notches and flat-bottom hole (FBH) reflectors, which are used to set the instrument sensitivity for the detection of real flaws. Real flaws such as cracks and crack-like flaws are the flaws that these NDE methods are intended to detect. A reliably detectable crack size is required for the safe-life analysis of fracture-critical parts. Therefore, it is important to correlate signal responses from real flaws with signal responses from the artificial flaws used in the calibration process in order to determine the reliably detectable flaw size.
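
    For illustration only (not the mh1823 implementation), the sketch below fits a simple a-hat-versus-a POD model on hypothetical data: a linear regression of log signal on log flaw size, an assumed decision threshold from calibration, and the resulting POD curve and a90 flaw size.

        import numpy as np
        from scipy import stats

        # Hypothetical flaw sizes (mm) and measured signal amplitudes (instrument counts).
        a = np.array([0.2, 0.3, 0.4, 0.6, 0.8, 1.0, 1.5, 2.0])
        ahat = np.array([18, 30, 35, 62, 70, 95, 160, 210])
        threshold = 50.0                      # decision threshold set during calibration (assumed)

        # a-hat vs a model: ln(ahat) = b0 + b1*ln(a) + eps, eps ~ N(0, sigma^2).
        b1, b0, _, _, _ = stats.linregress(np.log(a), np.log(ahat))
        resid = np.log(ahat) - (b0 + b1 * np.log(a))
        sigma = np.std(resid, ddof=2)

        # POD(a) = Phi((ln a - mu)/s), with mu = (ln(threshold) - b0)/b1 and s = sigma/b1.
        mu, s = (np.log(threshold) - b0) / b1, sigma / b1
        pod = lambda size: stats.norm.cdf((np.log(size) - mu) / s)
        a90 = np.exp(mu + s * stats.norm.ppf(0.90))    # size detected with 90% probability
        print(pod(0.5), a90)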

  3. Information-theoretic methods for estimating of complicated probability distributions

    CERN Document Server

    Zong, Zhi

    2006-01-01

    Mixing various disciplines frequently produces something profound and far-reaching; cybernetics is an often-quoted example. The mix of information theory, statistics and computing technology proves to be very useful, and has led to the recent development of information-theory-based methods for estimating complicated probability distributions. Estimating the probability distribution of a random variable is the fundamental task for quite a few fields besides statistics, such as reliability, probabilistic risk analysis (PSA), machine learning, pattern recognition, image processing, neur

  4. Structural Reliability Methods for Wind Power Converter System Component Reliability Assessment

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Sørensen, John Dalsgaard

    2012-01-01

    Wind power converter systems are essential subsystems in both off-shore and on-shore wind turbines. The converter is the main interface between the generator and the grid connection. This system is affected by numerous stresses, where the main contributors might be defined as vibration and temperature loadings. The temperature variations induce time-varying stresses and thereby fatigue loads. A probabilistic model is used to model fatigue failure for an electrical component in the power converter system. This model is based on linear damage accumulation and physics-of-failure approaches, where a failure criterion is defined by the threshold model. The attention is focused on crack propagation in solder joints of electrical components due to the temperature loadings. Structural reliability approaches are used to incorporate model, physical and statistical uncertainties. Reliability estimation by means of structural...

  5. HUMAN RELIABILITY ANALYSIS USING THE COGNITIVE RELIABILITY AND ERROR ANALYSIS METHOD (CREAM) APPROACH

    Directory of Open Access Journals (Sweden)

    Zahirah Alifia Maulida

    2015-01-01

    Full Text Available Workplace accidents in the grinding and welding areas have ranked highest over the last five years at PT. X. These accidents are caused by human error, which arises from the influence of the physical and non-physical working environment. This study uses scenarios to predict and reduce the likelihood of human error using the CREAM (Cognitive Reliability and Error Analysis Method) approach. CREAM is a human reliability analysis method used to obtain the Cognitive Failure Probability (CFP), which can be determined in two ways, the basic method and the extended method. The basic method yields only a general failure probability, whereas the extended method yields a CFP for each task. The results show that the factors influencing the occurrence of errors in grinding and welding work are the adequacy of the organisation, the adequacy of the Man-Machine Interface (MMI) and operational support, the availability of procedures and planning, and the adequacy of training and experience. The cognitive aspect with the highest error value in grinding work is planning, with a CFP of 0.3, and in welding work it is the cognitive aspect of execution, with a CFP of 0.18. To reduce the cognitive error values in grinding and welding work, the recommendations given are to provide routine training, more detailed work instructions and equipment familiarisation. Keywords: CREAM (cognitive reliability and error analysis method), HRA (human reliability analysis), cognitive error. Abstract: The accidents in the grinding and welding sectors were the highest cases over the last five years at PT. X, and they were caused by human error. Human error occurs due to the influence of the working environment, both physical and non-physical. This study implements a scenario-based approach called CREAM (Cognitive Reliability and Error Analysis Method). CREAM is one of human

  6. Updated Value of Service Reliability Estimates for Electric Utility Customers in the United States

    Energy Technology Data Exchange (ETDEWEB)

    Sullivan, Michael [Nexant Inc., Burlington, MA (United States); Schellenberg, Josh [Nexant Inc., Burlington, MA (United States); Blundell, Marshall [Nexant Inc., Burlington, MA (United States)

    2015-01-01

    This report updates the 2009 meta-analysis that provides estimates of the value of service reliability for electricity customers in the United States (U.S.). The meta-dataset now includes 34 different datasets from surveys fielded by 10 different utility companies between 1989 and 2012. Because these studies used nearly identical interruption cost estimation or willingness-to-pay/accept methods, it was possible to integrate their results into a single meta-dataset describing the value of electric service reliability observed in all of them. Once the datasets from the various studies were combined, a two-part regression model was used to estimate customer damage functions that can be generally applied to calculate customer interruption costs per event by season, time of day, day of week, and geographical regions within the U.S. for industrial, commercial, and residential customers. This report focuses on the backwards stepwise selection process that was used to develop the final revised model for all customer classes. Across customer classes, the revised customer interruption cost model has improved significantly because it incorporates more data and does not include the many extraneous variables that were in the original specification from the 2009 meta-analysis. The backwards stepwise selection process led to a more parsimonious model that only included key variables, while still achieving comparable out-of-sample predictive performance. In turn, users of interruption cost estimation tools such as the Interruption Cost Estimate (ICE) Calculator will have less customer characteristics information to provide and the associated inputs page will be far less cumbersome. The upcoming new version of the ICE Calculator is anticipated to be released in 2015.
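
    A schematic of the two-part modelling idea (with entirely synthetic survey data, not the meta-dataset described above): one part models whether a customer reports any interruption cost at all, the other models the magnitude of non-zero costs on the log scale, and the two are combined into an expected cost per event. Feature names and coefficients are purely illustrative.

        import numpy as np
        from sklearn.linear_model import LinearRegression, LogisticRegression

        rng = np.random.default_rng(0)
        n = 2000

        # Synthetic survey data: outage duration (hours) and customer size (annual MWh).
        X = np.column_stack([rng.uniform(0.5, 8, n), rng.lognormal(3, 1, n)])
        p_nonzero = 1 / (1 + np.exp(-(-1.0 + 0.4 * X[:, 0])))          # assumed data-generating model
        nonzero = rng.uniform(size=n) < p_nonzero
        cost = np.where(nonzero, np.exp(1.0 + 0.3 * X[:, 0] + 0.5 * np.log(X[:, 1])
                                        + rng.normal(0, 0.4, n)), 0.0)

        # Part 1: probability of reporting a non-zero interruption cost.
        clf = LogisticRegression(max_iter=1000).fit(X, nonzero)
        # Part 2: magnitude of the cost, modelled on the log scale for non-zero observations.
        reg = LinearRegression().fit(X[nonzero], np.log(cost[nonzero]))
        sigma2 = np.var(np.log(cost[nonzero]) - reg.predict(X[nonzero]))

        x_new = np.array([[4.0, 50.0]])                                # 4-hour outage, small customer
        expected_cost = clf.predict_proba(x_new)[0, 1] * np.exp(reg.predict(x_new)[0] + sigma2 / 2)
        print(expected_cost)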

  7. Estimating the Optimal Capacity for Reservoir Dam based on Reliability Level for Meeting Demands

    Directory of Open Access Journals (Sweden)

    Mehrdad Taghian

    2017-02-01

    Full Text Available Introduction: One of the practical and classic problems in water resource studies is the estimation of the optimal reservoir capacity to satisfy demands. However, fully supplying demands in all periods requires a very high dam to cover severe drought conditions. That means a major part of the reservoir capacity and costs is usable only for a short period of the reservoir lifetime, which would be unjustified in an economic analysis. Thus, in the proposed method and model, fully meeting demand is only possible for a percentage of the time in the statistical period, according to a reliability constraint. In the general methods, although this concept apparently seems simple, it is necessary to add binary variables for meeting or not meeting demands to the linear programming model structure. Thus, with many binary variables, solving the problem becomes time consuming and difficult. Another way to solve the problem is the application of the yield model. This model includes some simpler assumptions, and it is difficult to consider the details of the water resource system with it. The application of evolutionary algorithms to problems that have many constraints is also very complicated. Therefore, this study pursues another solution. Materials and Methods: In this study, for the development and improvement of the usual methods, instead of mixed integer linear programming (MILP) and the above methods, a simulation model including flow network linear programming is used, coupled with an interface code in Matlab to compute the reliability based on the output file of the simulation model. The Acres Reservoir Simulation Program (ARSP) has been utilized as the simulation model. A major advantage of the ARSP is its inherent flexibility in defining the operating policies through a penalty structure specified by the user. The ARSP utilizes network flow optimization techniques to handle a subset of general linear programming (LP) problems for individual time intervals
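
    A highly simplified sketch of the idea of choosing the smallest capacity that meets a time-based reliability target, using a monthly mass-balance simulation with synthetic inflows and a constant demand; the flow-network linear programming, penalty structure and ARSP model described above are not reproduced.

        import numpy as np

        rng = np.random.default_rng(3)
        inflow = rng.gamma(shape=2.0, scale=30.0, size=600)   # synthetic monthly inflows (MCM)
        demand, target_reliability = 55.0, 0.90               # MCM/month, fraction of months fully met

        def time_reliability(capacity):
            storage, met = capacity, 0
            for q in inflow:
                storage = min(storage + q, capacity)          # fill; spill anything above capacity
                release = min(demand, storage)                # supply as much of the demand as possible
                storage -= release
                met += release >= demand
            return met / len(inflow)

        for cap in np.arange(50.0, 2000.0, 10.0):             # search for the smallest adequate capacity
            if time_reliability(cap) >= target_reliability:
                print(f"smallest capacity meeting the target: about {cap:.0f} MCM")
                break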

  8. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.

  9. Dose estimation by biological methods

    International Nuclear Information System (INIS)

    Guerrero C, C.; David C, L.; Serment G, J.; Brena V, M.

    1997-01-01

    The human being is exposed to strong artificial radiation sources mainly in two ways: the first concerns occupationally exposed personnel (POE) and the second concerns persons requiring radiological treatment. A third, less common, way is through accidents. In all these conditions it is very important to estimate the absorbed dose. Classical biological dosimetry is based on dicentric analysis. The present work is part of research on the process to validate the fluorescence in situ hybridization (FISH) technique, which allows the analysis of chromosome aberrations. (Author)

  10. Estimating the Parameters of Software Reliability Growth Models Using the Grey Wolf Optimization Algorithm

    OpenAIRE

    Alaa F. Sheta; Amal Abdel-Raouf

    2016-01-01

    In this age of technology, building quality software is essential to competing in the business market. One of the major principles required for any quality and business software product for value fulfillment is reliability. Estimating software reliability early during the software development life cycle saves time and money as it prevents spending larger sums fixing a defective software product after deployment. The Software Reliability Growth Model (SRGM) can be used to predict the number of...
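
    As a simple illustration of SRGM parameter estimation (using ordinary least squares rather than the Grey Wolf Optimization algorithm of the paper), the sketch below fits the Goel-Okumoto mean value function m(t) = a(1 - exp(-b t)) to hypothetical cumulative failure counts and derives the expected remaining defects and a short-horizon reliability.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical test data: week number and cumulative number of failures observed.
        t = np.arange(1, 13)
        cum_failures = np.array([12, 21, 29, 35, 40, 44, 47, 50, 52, 53, 55, 56])

        def goel_okumoto(t, a, b):
            """Mean value function of the Goel-Okumoto NHPP SRGM: m(t) = a*(1 - exp(-b*t))."""
            return a * (1.0 - np.exp(-b * t))

        (a_hat, b_hat), _ = curve_fit(goel_okumoto, t, cum_failures, p0=(60.0, 0.1))
        remaining = a_hat - goel_okumoto(t[-1], a_hat, b_hat)   # expected remaining defects
        rel_one_week = np.exp(-(goel_okumoto(t[-1] + 1, a_hat, b_hat)
                                - goel_okumoto(t[-1], a_hat, b_hat)))  # P(no failure in the next week)
        print(a_hat, b_hat, remaining, rel_one_week)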

  11. Reliability methods in nuclear power plant ageing management

    International Nuclear Information System (INIS)

    Simola, K.

    1999-01-01

    The aim of nuclear power plant ageing management is to maintain an adequate safety level throughout the lifetime of the plant. In ageing studies, the reliability of components, systems and structures is evaluated taking into account the possible time-dependent degradation. The phases of ageing analyses are generally the identification of critical components, identification and evaluation of ageing effects, and development of mitigation methods. This thesis focuses on the use of reliability methods and analyses of plant- specific operating experience in nuclear power plant ageing studies. The presented applications and method development have been related to nuclear power plants, but many of the approaches can also be applied outside the nuclear industry. The thesis consists of a summary and seven publications. The summary provides an overview of ageing management and discusses the role of reliability methods in ageing analyses. In the publications, practical applications and method development are described in more detail. The application areas at component and system level are motor-operated valves and protection automation systems, for which experience-based ageing analyses have been demonstrated. Furthermore, Bayesian ageing models for repairable components have been developed, and the management of ageing by improving maintenance practices is discussed. Recommendations for improvement of plant information management in order to facilitate ageing analyses are also given. The evaluation and mitigation of ageing effects on structural components is addressed by promoting the use of probabilistic modelling of crack growth, and developing models for evaluation of the reliability of inspection results. (orig.)

  12. Reliability methods in nuclear power plant ageing management

    Energy Technology Data Exchange (ETDEWEB)

    Simola, K. [VTT Automation, Espoo (Finland). Industrial Automation

    1999-07-01

    The aim of nuclear power plant ageing management is to maintain an adequate safety level throughout the lifetime of the plant. In ageing studies, the reliability of components, systems and structures is evaluated taking into account the possible time-dependent degradation. The phases of ageing analyses are generally the identification of critical components, identification and evaluation of ageing effects, and development of mitigation methods. This thesis focuses on the use of reliability methods and analyses of plant-specific operating experience in nuclear power plant ageing studies. The presented applications and method development have been related to nuclear power plants, but many of the approaches can also be applied outside the nuclear industry. The thesis consists of a summary and seven publications. The summary provides an overview of ageing management and discusses the role of reliability methods in ageing analyses. In the publications, practical applications and method development are described in more detail. The application areas at component and system level are motor-operated valves and protection automation systems, for which experience-based ageing analyses have been demonstrated. Furthermore, Bayesian ageing models for repairable components have been developed, and the management of ageing by improving maintenance practices is discussed. Recommendations for improvement of plant information management in order to facilitate ageing analyses are also given. The evaluation and mitigation of ageing effects on structural components is addressed by promoting the use of probabilistic modelling of crack growth, and developing models for evaluation of the reliability of inspection results. (orig.)

  13. Selected Methods for Increasing the Reliability of Electronic Security Systems

    Directory of Open Access Journals (Sweden)

    Paś Jacek

    2015-11-01

    Full Text Available The article presents issues related to different methods for increasing the reliability of electronic security systems (ESS), using a fire alarm system (SSP) as an example. In the descriptive sense, the reliability of the SSP is its capacity to perform a preset function (e.g. fire protection of an airport, a port, or a logistics base) over a given time and under given, e.g. environmental, conditions, despite possible failures of a specific subset of the system's elements. A review of the available ESS-SSP literature shows no studies on methods for increasing reliability (several works address similar topics, but with respect to burglary and robbery, i.e. intrusion). The analysis is based on the set of all paths in the system that determine the suitability of the SSP for the fire-event scenario and on the devices that are critical for security.

  14. Nuclear reactor component populations, reliability data bases, and their relationship to failure rate estimation and uncertainty analysis

    International Nuclear Information System (INIS)

    Martz, H.F.; Beckman, R.J.

    1981-12-01

    Probabilistic risk analyses are used to assess the risks inherent in the operation of existing and proposed nuclear power reactors. In performing such risk analyses the failure rates of various components which are used in a variety of reactor systems must be estimated. These failure rate estimates serve as input to fault trees and event trees used in the analyses. Component failure rate estimation is often based on relevant field failure data from different reliability data sources such as LERs, NPRDS, and the In-Plant Data Program. Various statistical data analysis and estimation methods have been proposed over the years to provide the required estimates of the component failure rates. This report discusses the basis and extent to which statistical methods can be used to obtain component failure rate estimates. The report is expository in nature and focuses on the general philosophical basis for such statistical methods. Various terms and concepts are defined and illustrated by means of numerous simple examples
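
    The report discusses statistical estimation of failure rates only in general terms; the sketch below shows one common scheme of that kind, a conjugate gamma prior for a Poisson failure process updated with pooled field experience. The prior parameters, failure count and exposure hours are hypothetical, not taken from the report.

    ```python
    # Sketch: Bayesian point estimate of a component failure rate from field data,
    # using a conjugate gamma prior (all numbers are hypothetical).
    from scipy.stats import gamma

    alpha0, beta0 = 0.5, 100.0      # prior shape and rate (Jeffreys-style prior)
    failures, hours = 3, 25_000.0   # pooled field experience

    alpha_post = alpha0 + failures
    beta_post = beta0 + hours       # posterior rate parameter (1/scale)

    mean_rate = alpha_post / beta_post
    lower, upper = gamma.ppf([0.05, 0.95], a=alpha_post, scale=1.0 / beta_post)
    print(f"posterior mean failure rate: {mean_rate:.2e} per hour")
    print(f"90% credible interval: ({lower:.2e}, {upper:.2e}) per hour")
    ```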

  15. Fault-tolerant embedded system design and optimization considering reliability estimation uncertainty

    International Nuclear Information System (INIS)

    Wattanapongskorn, Naruemon; Coit, David W.

    2007-01-01

    In this paper, we model embedded system design and optimization, considering component redundancy and uncertainty in the component reliability estimates. The systems being studied consist of software embedded in associated hardware components. Very often, component reliability values are not known exactly. Therefore, for reliability analysis studies and system optimization, it is meaningful to consider component reliability estimates as random variables with associated estimation uncertainty. In this new research, the system design process is formulated as a multiple-objective optimization problem to maximize an estimate of system reliability, and also, to minimize the variance of the reliability estimate. The two objectives are combined by penalizing the variance for prospective solutions. The two most common fault-tolerant embedded system architectures, N-Version Programming and Recovery Block, are considered as strategies to improve system reliability by providing system redundancy. Four distinct models are presented to demonstrate the proposed optimization techniques with or without redundancy. For many design problems, multiple functionally equivalent software versions have failure correlation even if they have been independently developed. The failure correlation may result from faults in the software specification, faults from a voting algorithm, and/or related faults from any two software versions. Our approach considers this correlation in formulating practical optimization models. Genetic algorithms with a dynamic penalty function are applied in solving this optimization problem, and reasonable and interesting results are obtained and discussed
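
    The sketch below illustrates only the general idea of the formulation, combining a system reliability estimate and the variance of that estimate into one penalized objective over candidate redundancy choices. It is not the paper's model: the series architecture, candidate data, budget and penalty weight are invented for the example, and exhaustive enumeration replaces the genetic algorithm.

    ```python
    # Sketch: penalize designs whose reliability estimate is uncertain by combining
    # the estimate and its variance into one score (illustrative numbers only).
    import itertools

    # candidate component versions: (mean reliability estimate, variance of estimate, cost)
    candidates = [(0.95, 0.0004, 10), (0.97, 0.0009, 14), (0.99, 0.0025, 20)]
    BUDGET, PENALTY = 45, 50.0

    best = None
    for design in itertools.product(range(len(candidates)), repeat=3):  # 3 subsystems in series
        rel, var, cost = 1.0, 0.0, 0
        for i in design:
            r, v, c = candidates[i]
            rel *= r
            var += v          # crude accumulation of estimation variance
            cost += c
        if cost > BUDGET:
            continue
        score = rel - PENALTY * var   # penalized objective
        if best is None or score > best[0]:
            best = (score, design, rel, var, cost)

    print("best feasible design (score, versions, reliability, variance, cost):")
    print(best)
    ```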

  16. Human reliability analysis methods for probabilistic safety assessment

    International Nuclear Information System (INIS)

    Pyy, P.

    2000-11-01

    Human reliability analysis (HRA) of a probabilistic safety assessment (PSA) includes identifying human actions from safety point of view, modelling the most important of them in PSA models, and assessing their probabilities. As manifested by many incidents and studies, human actions may have both positive and negative effect on safety and economy. Human reliability analysis is one of the areas of probabilistic safety assessment (PSA) that has direct applications outside the nuclear industry. The thesis focuses upon developments in human reliability analysis methods and data. The aim is to support PSA by extending the applicability of HRA. The thesis consists of six publications and a summary. The summary includes general considerations and a discussion about human actions in the nuclear power plant (NPP) environment. A condensed discussion about the results of the attached publications is then given, including new development in methods and data. At the end of the summary part, the contribution of the publications to good practice in HRA is presented. In the publications, studies based on the collection of data on maintenance-related failures, simulator runs and expert judgement are presented in order to extend the human reliability analysis database. Furthermore, methodological frameworks are presented to perform a comprehensive HRA, including shutdown conditions, to study reliability of decision making, and to study the effects of wrong human actions. In the last publication, an interdisciplinary approach to analysing human decision making is presented. The publications also include practical applications of the presented methodological frameworks. (orig.)

  17. A method of bias correction for maximal reliability with dichotomous measures.

    Science.gov (United States)

    Penev, Spiridon; Raykov, Tenko

    2010-02-01

    This paper is concerned with the reliability of weighted combinations of a given set of dichotomous measures. Maximal reliability for such measures has been discussed in the past, but the pertinent estimator exhibits a considerable bias and mean squared error for moderate sample sizes. We examine this bias, propose a procedure for bias correction, and develop a more accurate asymptotic confidence interval for the resulting estimator. In most empirically relevant cases, the bias correction and mean squared error correction can be performed simultaneously. We propose an approximate (asymptotic) confidence interval for the maximal reliability coefficient, discuss the implementation of this estimator, and investigate the mean squared error of the associated asymptotic approximation. We illustrate the proposed methods using a numerical example.

  18. A reliable method for the stability analysis of structures ...

    African Journals Online (AJOL)

    The detection of structural configurations with singular tangent stiffness matrix is essential because they can be unstable. The secondary paths, especially in unstable buckling, can play the most important role in the loss of stability and collapse of the structure. A new method for reliable detection and accurate computation of ...

  19. Methods to compute reliabilities for genomic predictions of feed intake

    Science.gov (United States)

    For new traits without historical reference data, cross-validation is often the preferred method to validate reliability (REL). Time truncation is less useful because few animals gain substantial REL after the truncation point. Accurate cross-validation requires separating genomic gain from pedigree...

  20. Planning of operation & maintenance using risk and reliability based methods

    DEFF Research Database (Denmark)

    Florian, Mihai; Sørensen, John Dalsgaard

    2015-01-01

    Operation and maintenance (OM) of offshore wind turbines contributes with a substantial part of the total levelized cost of energy (LCOE). The objective of this paper is to present an application of risk- and reliability-based methods for planning of OM. The theoretical basis is presented...

  1. Improving reliability of state estimation programming and computing suite based on analyzing a fault tree

    Directory of Open Access Journals (Sweden)

    Kolosok Irina

    2017-01-01

    Full Text Available Reliable information on the current state parameters obtained as a result of processing the measurements from systems of the SCADA and WAMS data acquisition and processing through methods of state estimation (SE is a condition that enables to successfully manage an energy power system (EPS. SCADA and WAMS systems themselves, as any technical systems, are subject to failures and faults that lead to distortion and loss of information. The SE procedure enables to find erroneous measurements, therefore, it is a barrier for the distorted information to penetrate into control problems. At the same time, the programming and computing suite (PCS implementing the SE functions may itself provide a wrong decision due to imperfection of the software algorithms and errors. In this study, we propose to use a fault tree to analyze consequences of failures and faults in SCADA and WAMS and in the very SE procedure. Based on the analysis of the obtained measurement information and on the SE results, we determine the state estimation PCS fault tolerance level featuring its reliability.

  2. Evaluating the reliability of multi-body mechanisms: A method considering the uncertainties of dynamic performance

    International Nuclear Information System (INIS)

    Wu, Jianing; Yan, Shaoze; Zuo, Ming J.

    2016-01-01

    Mechanism reliability is defined as the ability of a certain mechanism to maintain output accuracy under specified conditions. Mechanism reliability is generally assessed by the classical direct probability method (DPM) derived from the first order second moment (FOSM) method. The DPM relies strongly on the analytical form of the dynamic solution so it is not applicable to multi-body mechanisms that have only numerical solutions. In this paper, an indirect probability model (IPM) is proposed for mechanism reliability evaluation of multi-body mechanisms. IPM combines the dynamic equation, degradation function and Kaplan–Meier estimator to evaluate mechanism reliability comprehensively. Furthermore, to reduce the amount of computation in practical applications, the IPM is simplified into the indirect probability step model (IPSM). A case study of a crank–slider mechanism with clearance is investigated. Results show that relative errors between the theoretical and experimental results of mechanism reliability are less than 5%, demonstrating the effectiveness of the proposed method.
    Highlights:
    • An indirect probability model (IPM) is proposed for mechanism reliability evaluation.
    • The dynamic equation, degradation function and Kaplan–Meier estimator are used.
    • Then the simplified form of indirect probability model is proposed.
    • The experimental results agree well with the predicted results.
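
    Of the three ingredients named for the IPM, the Kaplan-Meier estimator is the generic one; a minimal stand-alone implementation is sketched below with hypothetical failure and censoring times. The dynamic equation and degradation function of the actual model are not reproduced.

    ```python
    # Sketch: a plain Kaplan-Meier survival estimator (hypothetical data).
    def kaplan_meier(times, observed):
        """times: event/censoring times; observed: True if failure, False if censored."""
        data = sorted(zip(times, observed))
        at_risk = len(data)
        survival, curve = 1.0, []
        i = 0
        while i < len(data):
            t = data[i][0]
            deaths = sum(1 for tt, ev in data if tt == t and ev)
            censored = sum(1 for tt, ev in data if tt == t and not ev)
            if deaths:
                survival *= (1.0 - deaths / at_risk)
                curve.append((t, survival))
            at_risk -= deaths + censored
            i += deaths + censored
        return curve

    times    = [5, 8, 8, 12, 15, 17, 21, 21]
    observed = [True, True, False, True, False, True, True, False]
    for t, s in kaplan_meier(times, observed):
        print(f"t = {t:>2}: S(t) = {s:.3f}")
    ```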

  3. Girsanov's transformation based variance reduced Monte Carlo simulation schemes for reliability estimation in nonlinear stochastic dynamics

    Science.gov (United States)

    Kanjilal, Oindrila; Manohar, C. S.

    2017-07-01

    The study considers the problem of simulation based time variant reliability analysis of nonlinear randomly excited dynamical systems. Attention is focused on importance sampling strategies based on the application of Girsanov's transformation method. Controls which minimize the distance function, as in the first order reliability method (FORM), are shown to minimize a bound on the sampling variance of the estimator for the probability of failure. Two schemes based on the application of calculus of variations for selecting control signals are proposed: the first obtains the control force as the solution of a two-point nonlinear boundary value problem, and, the second explores the application of the Volterra series in characterizing the controls. The relative merits of these schemes, vis-à-vis the method based on ideas from the FORM, are discussed. Illustrative examples, involving archetypal single degree of freedom (dof) nonlinear oscillators, and a multi-degree of freedom nonlinear dynamical system, are presented. The credentials of the proposed procedures are established by comparing the solutions with pertinent results from direct Monte Carlo simulations.
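
    The variance-reduction idea behind the Girsanov-based schemes can be illustrated on a static toy problem: shift the sampling density toward the failure domain and reweight by the likelihood ratio. The sketch below does this for the probability that a standard normal variable exceeds a threshold; it does not reproduce the authors' controlled stochastic dynamics or their choice of control forces.

    ```python
    # Sketch: importance sampling for a rare failure probability by shifting the
    # sampling density toward the failure region (static toy problem).
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(1)
    beta = 4.0                       # failure when a standard normal load exceeds beta
    n = 10_000

    # crude Monte Carlo: almost never hits the failure domain at this sample size
    crude = np.mean(rng.standard_normal(n) > beta)

    # importance sampling: draw from N(beta, 1) and reweight by the density ratio
    z = rng.normal(loc=beta, scale=1.0, size=n)
    weights = np.exp(-beta * z + 0.5 * beta**2)      # phi(z) / phi(z - beta)
    is_est = np.mean((z > beta) * weights)

    print(f"exact   : {norm.sf(beta):.3e}")
    print(f"crude MC: {crude:.3e}")
    print(f"IS      : {is_est:.3e}")
    ```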

  4. Issues in benchmarking human reliability analysis methods: A literature review

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Hendrickson, Stacey M.L.; Forester, John A.; Tran, Tuan Q.; Lois, Erasmia

    2010-01-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessments (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study comparing and evaluating HRA methods in assessing operator performance in simulator experiments is currently underway. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  5. Issues in benchmarking human reliability analysis methods : a literature review.

    Energy Technology Data Exchange (ETDEWEB)

    Lois, Erasmia (US Nuclear Regulatory Commission); Forester, John Alan; Tran, Tuan Q. (Idaho National Laboratory, Idaho Falls, ID); Hendrickson, Stacey M. Langfitt; Boring, Ronald L. (Idaho National Laboratory, Idaho Falls, ID)

    2008-04-01

    There is a diversity of human reliability analysis (HRA) methods available for use in assessing human performance within probabilistic risk assessment (PRA). Due to the significant differences in the methods, including the scope, approach, and underlying models, there is a need for an empirical comparison investigating the validity and reliability of the methods. To accomplish this empirical comparison, a benchmarking study is currently underway that compares HRA methods with each other and against operator performance in simulator studies. In order to account for as many effects as possible in the construction of this benchmarking study, a literature review was conducted, reviewing past benchmarking studies in the areas of psychology and risk assessment. A number of lessons learned through these studies are presented in order to aid in the design of future HRA benchmarking endeavors.

  6. Survey of industry methods for producing highly reliable software

    International Nuclear Information System (INIS)

    Lawrence, J.D.; Persons, W.L.

    1994-11-01

    The Nuclear Reactor Regulation Office of the US Nuclear Regulatory Commission is charged with assessing the safety of new instrument and control designs for nuclear power plants which may use computer-based reactor protection systems. Lawrence Livermore National Laboratory has evaluated the latest techniques in software reliability for measurement, estimation, error detection, and prediction that can be used during the software life cycle as a means of risk assessment for reactor protection systems. One aspect of this task has been a survey of the software industry to collect information to help identify the design factors used to improve the reliability and safety of software. The intent was to discover what practices really work in industry and what design factors are used by industry to achieve highly reliable software. The results of the survey are documented in this report. Three companies participated in the survey: Computer Sciences Corporation, International Business Machines (Federal Systems Company), and TRW. Discussions were also held with NASA Software Engineering Lab/University of Maryland/CSC, and the AIAA Software Reliability Project

  7. Reliability/Cost Evaluation on Power System connected with Wind Power for the Reserve Estimation

    DEFF Research Database (Denmark)

    Lee, Go-Eun; Cha, Seung-Tae; Shin, Je-Seok

    2012-01-01

    Wind power is ideally a renewable energy source with no fuel cost, but it risks reducing the reliability of the whole system because of the uncertainty of its output. If the reserve of the system is increased, the reliability of the system may be improved; however, the cost would also increase. Therefore...... the reserve needs to be estimated considering the trade-off between reliability and economic aspects. This paper suggests a methodology to estimate the appropriate reserve when wind power is connected to the power system. As a case study, when wind power is connected to the power system of Korea, the effects...

  8. An Investment Level Decision Method to Secure Long-term Reliability

    Science.gov (United States)

    Bamba, Satoshi; Yabe, Kuniaki; Seki, Tomomichi; Shibaya, Tetsuji

    The slowdown in power demand growth and in facility replacement causes aging and lower reliability of power facilities, and the aging will be followed by a rapid increase in repair and replacement when many facilities reach their lifetime in the future. This paper describes a method to estimate future repair and replacement costs by applying a life-cycle cost model and renewal theory to historical data. It also describes a method to decide the optimum investment plan, which replaces facilities in order of cost-effectiveness by setting a replacement priority formula, and the minimum investment level needed to maintain reliability. Estimation examples applied to substation facilities show that a reasonable and leveled future cash-out can maintain reliability by lowering the percentage of replacements caused by fatal failures.

  9. An adaptive neuro fuzzy model for estimating the reliability of component-based software systems

    Directory of Open Access Journals (Sweden)

    Kirti Tyagi

    2014-01-01

    Full Text Available Although many algorithms and techniques have been developed for estimating the reliability of component-based software systems (CBSSs), much more research is needed. Accurate estimation of the reliability of a CBSS is difficult because it depends on two factors: component reliability and glue code reliability. Moreover, reliability is a real-world phenomenon with many associated real-time problems. Soft computing techniques can help to solve problems whose solutions are uncertain or unpredictable. A number of soft computing approaches for estimating CBSS reliability have been proposed. These techniques learn from the past and capture existing patterns in data. The two basic elements of soft computing are neural networks and fuzzy logic. In this paper, we propose a model for estimating CBSS reliability, known as an adaptive neuro fuzzy inference system (ANFIS), that is based on these two basic elements of soft computing, and we compare its performance with that of a plain FIS (fuzzy inference system) for different data sets.

  10. Reliable Dual Tensor Model Estimation in Single and Crossing Fibers Based on Jeffreys Prior

    Science.gov (United States)

    Yang, Jianfei; Poot, Dirk H. J.; Caan, Matthan W. A.; Su, Tanja; Majoie, Charles B. L. M.; van Vliet, Lucas J.; Vos, Frans M.

    2016-01-01

    Purpose: This paper presents and studies a framework for reliable modeling of diffusion MRI using a data-acquisition adaptive prior. Methods: Automated relevance determination estimates the mean of the posterior distribution of a rank-2 dual tensor model exploiting Jeffreys prior (JARD). This data-acquisition prior is based on the Fisher information matrix and enables the assessment whether two tensors are mandatory to describe the data. The method is compared to Maximum Likelihood Estimation (MLE) of the dual tensor model and to FSL’s ball-and-stick approach. Results: Monte Carlo experiments demonstrated that JARD’s volume fractions correlated well with the ground truth for single and crossing fiber configurations. In single fiber configurations JARD automatically reduced the volume fraction of one compartment to (almost) zero. The variance in fractional anisotropy (FA) of the main tensor component was thereby reduced compared to MLE. JARD and MLE gave a comparable outcome in data simulating crossing fibers. On brain data, JARD yielded a smaller spread in FA along the corpus callosum compared to MLE. Tract-based spatial statistics demonstrated a higher sensitivity in detecting age-related white matter atrophy using JARD compared to both MLE and the ball-and-stick approach. Conclusions: The proposed framework offers accurate and precise estimation of diffusion properties in single and dual fiber regions. PMID:27760166

  11. Reliance on and Reliability of the Engineer’s Estimate in Heavy Civil Projects

    Directory of Open Access Journals (Sweden)

    George Okere

    2017-06-01

    Full Text Available To the contractor, the engineer’s estimate is the target number to aim for, and the basis for a contractor to evaluate the accuracy of their estimate. To the owner, the engineer’s estimate is the basis for funding, evaluation of bids, and for predicting project costs. As such the engineer’s estimate is the benchmark. This research sought to investigate the reliance on, and the reliability of the engineer’s estimate in heavy civil cost estimate. The research objective was to characterize the engineer’s estimate and allow owners and contractors re-evaluate or affirm their reliance on the engineer’s estimate. A literature review was conducted to understand the reliance on the engineer’s estimate, and secondary data from Washington State Department of Transportation was used to investigate the reliability of the engineer’s estimate. The findings show the need for practitioners to re-evaluate their reliance on the engineer’s estimate. The empirical data showed that, within various contexts, the engineer’s estimate fell outside the expected accuracy range of the low bids or the cost to complete projects. The study recommends direct tracking of costs by project owners while projects are under construction, the use of a second estimate to improve the accuracy of their estimates, and use of the cost estimating practices found in highly reputable construction companies.

  12. Survey of methods used to asses human reliability in the human factors reliability benchmark exercise

    International Nuclear Information System (INIS)

    Poucet, A.

    1988-01-01

    The Joint Research Centre of the European Commission has organised a Human Factors Reliability Benchmark Exercise (HF-RBE) with the aim to assess the state-of-the-art in human reliability modelling and assessment. Fifteen teams from eleven countries, representing industry, utilities, licensing organisations and research institutes, participate in the HF-RBE, which is organised around two study cases: (1) analysis of routine functional test and maintenance procedures, with the aim to assess the probability of test-induced failures, the probability of failures to remain unrevealed, and the potential to initiate transients because of errors performed in the test; and (2) analysis of human actions during an operational transient, with the aim to assess the probability that the operators will correctly diagnose the malfunctions and take proper corrective action. The paper briefly reports how the HF-RBE was structured and gives an overview of the methods that have been used for predicting human reliability in both study cases. The experience in applying these methods is discussed and the results obtained are compared. (author)

  13. Reliance on and Reliability of the Engineer’s Estimate in Heavy Civil Projects

    OpenAIRE

    Okere, George

    2017-01-01

    To the contractor, the engineer’s estimate is the target number to aim for, and the basis for a contractor to evaluate the accuracy of their estimate. To the owner, the engineer’s estimate is the basis for funding, evaluation of bids, and for predicting project costs. As such the engineer’s estimate is the benchmark. This research sought to investigate the reliance on, and the reliability of the engineer’s estimate in heavy civil cost estimate. The research objective was to characterize the e...

  14. Reliability and discriminatory power of methods for dental plaque quantification

    Directory of Open Access Journals (Sweden)

    Daniela Prócida Raggio

    2010-04-01

    Full Text Available OBJECTIVE: This in situ study evaluated the discriminatory power and reliability of methods of dental plaque quantification and the relationship between visual indices (VI) and fluorescence camera (FC) to detect plaque. MATERIAL AND METHODS: Six volunteers used palatal appliances with six bovine enamel blocks presenting different stages of plaque accumulation. The presence of plaque with and without disclosing was assessed using VI. Images were obtained with FC and digital camera in both conditions. The area covered by plaque was assessed. Examinations were done by two independent examiners. Data were analyzed by Kruskal-Wallis and Kappa tests to compare different conditions of samples and to assess the inter-examiner reproducibility. RESULTS: Some methods presented adequate reproducibility. The Turesky index and the assessment of area covered by disclosed plaque in the FC images presented the highest discriminatory powers. CONCLUSION: The Turesky index and images with FC with disclosing present good reliability and discriminatory power in quantifying dental plaque.

  15. An Energy-Based Limit State Function for Estimation of Structural Reliability in Shock Environments

    Directory of Open Access Journals (Sweden)

    Michael A. Guthrie

    2013-01-01

    Full Text Available A limit state function is developed for the estimation of structural reliability in shock environments. This limit state function uses peak modal strain energies to characterize environmental severity and modal strain energies at failure to characterize the structural capacity. The Hasofer-Lind reliability index is briefly reviewed and its computation for the energy-based limit state function is discussed. Applications to two-degree-of-freedom mass-spring systems and to a simple finite element model are considered. For these examples, computation of the reliability index requires little effort beyond a modal analysis, but still accounts for relevant uncertainties in both the structure and environment. For both examples, the reliability index is observed to agree well with the results of Monte Carlo analysis. In situations where fast, qualitative comparison of several candidate designs is required, the reliability index based on the proposed limit state function provides an attractive metric which can be used to compare and control reliability.
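
    If the energy-based limit state g = (modal strain energy at failure) − (peak modal strain energy) is approximated with two independent normal variables, the Hasofer-Lind index reduces to the familiar closed form sketched below. The capacity and demand statistics are illustrative, not taken from the paper.

    ```python
    # Sketch: Hasofer-Lind reliability index for a linear limit state
    # g = capacity - demand with independent normal variables (illustrative numbers).
    from math import sqrt
    from scipy.stats import norm

    mu_cap, sigma_cap = 120.0, 15.0   # capacity: modal strain energy at failure
    mu_dem, sigma_dem = 70.0, 20.0    # demand: peak modal strain energy in the shock

    beta = (mu_cap - mu_dem) / sqrt(sigma_cap**2 + sigma_dem**2)
    pf = norm.cdf(-beta)
    print(f"reliability index beta = {beta:.2f}, failure probability = {pf:.2e}")
    ```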

  16. Root biomass in cereals, catch crops and weeds can be reliably estimated without considering aboveground biomass

    DEFF Research Database (Denmark)

    Hu, Teng; Sørensen, Peter; Wahlström, Ellen Margrethe

    2018-01-01

    and management factors may affect this allometric relationship making such estimates uncertain and biased. Therefore, we aimed to explore how root biomass for typical cereal crops, catch crops and weeds could most reliably be estimated. Published and unpublished data on aboveground and root biomass (corrected...

  17. Lifetime prediction and reliability estimation methodology for Stirling-type pulse tube refrigerators by gaseous contamination accelerated degradation testing

    Science.gov (United States)

    Wan, Fubin; Tan, Yuanyuan; Jiang, Zhenhua; Chen, Xun; Wu, Yinong; Zhao, Peng

    2017-12-01

    Lifetime and reliability are the two performance parameters of premium importance for modern space Stirling-type pulse tube refrigerators (SPTRs), which are required to operate in excess of 10 years. Demonstration of these parameters provides a significant challenge. This paper proposes a lifetime prediction and reliability estimation method that utilizes accelerated degradation testing (ADT) for SPTRs related to gaseous contamination failure. The method was experimentally validated via three groups of gaseous contamination ADT. First, the performance degradation model based on mechanism of contamination failure and material outgassing characteristics of SPTRs was established. Next, a preliminary test was performed to determine whether the mechanism of contamination failure of the SPTRs during ADT is consistent with normal life testing. Subsequently, the experimental program of ADT was designed for SPTRs. Then, three groups of gaseous contamination ADT were performed at elevated ambient temperatures of 40 °C, 50 °C, and 60 °C, respectively and the estimated lifetimes of the SPTRs under normal condition were obtained through acceleration model (Arrhenius model). The results show good fitting of the degradation model with the experimental data. Finally, we obtained the reliability estimation of SPTRs through using the Weibull distribution. The proposed novel methodology enables us to take less than one year time to estimate the reliability of the SPTRs designed for more than 10 years.
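
    A minimal sketch of the extrapolation step named in the record, using the Arrhenius acceleration factor between an elevated test temperature and the normal operating temperature. The activation energy, operating temperature and test lives below are assumptions for illustration, not the paper's data.

    ```python
    # Sketch: extrapolating life from accelerated tests with the Arrhenius model
    # (activation energy and observed lives are hypothetical).
    from math import exp

    K_B = 8.617e-5            # Boltzmann constant, eV/K
    EA = 0.7                  # assumed activation energy, eV
    T_USE = 20.0 + 273.15     # assumed normal operating temperature, K

    def acceleration_factor(t_stress_c):
        t_stress = t_stress_c + 273.15
        return exp(EA / K_B * (1.0 / T_USE - 1.0 / t_stress))

    # hypothetical mean lives observed at the three elevated ADT temperatures (hours)
    for temp_c, life_h in [(40, 9_000.0), (50, 4_500.0), (60, 2_400.0)]:
        af = acceleration_factor(temp_c)
        print(f"{temp_c} degC: AF = {af:5.1f}, extrapolated life at 20 degC = {af * life_h:,.0f} h")
    ```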

  18. Parameter estimation of component reliability models in PSA model of Krsko NPP

    International Nuclear Information System (INIS)

    Jordan Cizelj, R.; Vrbanic, I.

    2001-01-01

    In the paper, the uncertainty analysis of component reliability models for independent failures is shown. The present approach to parameter estimation of component reliability models in NPP Krsko is presented. Mathematical approaches for different types of uncertainty analyses are introduced and used in accordance with predefined requirements. Results of the uncertainty analyses are shown in an example for time-related components. The most appropriate uncertainty analysis proved to be Bayesian estimation with numerical estimation of the posterior, which can be approximated with a suitable probability distribution, in this paper a lognormal distribution. (author)

  19. Reliability of third molar development for age estimation in Gujarati population: A comparative study.

    Science.gov (United States)

    Gandhi, Neha; Jain, Sandeep; Kumar, Manish; Rupakar, Pratik; Choyal, Kanaram; Prajapati, Seema

    2015-01-01

    Age assessment may be a crucial step in postmortem profiling leading to confirmative identification. In children, Demirjian's method, based on eight developmental stages, was developed to determine maturity scores as a function of age and polynomial functions to determine age as a function of score. The aim of this study was to evaluate the reliability of age estimation from the developmental stages of the third molar on orthopantomograms, using Demirjian's eight-teeth method with both the French maturity scores and the Indian-specific formula. Dental panoramic tomograms from 30 subjects each of known chronological age and sex were collected and evaluated according to Demirjian's criteria. Age calculations were performed using Demirjian's formula and the Indian formula. Statistical analysis used the Chi-square test and the ANOVA test, and the P values obtained were statistically significant. There was an average underestimation of age with both the Indian and Demirjian's formulas. The mean absolute error was lower using the Indian formula, hence it can be applied for age estimation in the present Gujarati population. Also, females achieved dental maturity ahead of males, thus completion of dental development is attained earlier in females. Greater accuracy can be obtained if population-specific formulas considering ethnic and environmental variation are derived by performing regression analysis.

  20. Limitations in simulator time-based human reliability analysis methods

    International Nuclear Information System (INIS)

    Wreathall, J.

    1989-01-01

    Developments in human reliability analysis (HRA) methods have evolved slowly. Current methods are little changed from those of almost a decade ago, particularly in the use of time-reliability relationships. While these methods were suitable as an interim step, the time (and the need) has come to specify the next evolution of HRA methods. As with any performance-oriented data source, power plant simulator data have no direct connection to HRA models. Errors reported in data are normal deficiencies observed in human performance; failures are events modeled in probabilistic risk assessments (PRAs). Not all errors cause failures; not all failures are caused by errors. Second, the times at which actions are taken provide no measure of the likelihood of failures to act correctly within an accident scenario. Inferences can be made about human reliability, but they must be made with great care. Specific limitations are discussed. Simulator performance data are useful in providing qualitative evidence of the variety of error types and their potential influences on operating systems. More work is required to combine recent developments in the psychology of error with the qualitative data collected at simulators. Until data become openly available, however, such an advance will not be practical

  1. Reliability-based design optimization via high order response surface method

    International Nuclear Information System (INIS)

    Li, Hong Shuang

    2013-01-01

    To reduce the computational effort of reliability-based design optimization (RBDO), the response surface method (RSM) has been widely used to evaluate reliability constraints. We propose an efficient methodology for solving RBDO problems based on an improved high order response surface method (HORSM) that takes advantage of an efficient sampling method, Hermite polynomials and uncertainty contribution concept to construct a high order response surface function with cross terms for reliability analysis. The sampling method generates supporting points from Gauss-Hermite quadrature points, which can be used to approximate response surface function without cross terms, to identify the highest order of each random variable and to determine the significant variables connected with point estimate method. The cross terms between two significant random variables are added to the response surface function to improve the approximation accuracy. Integrating the nested strategy, the improved HORSM is explored in solving RBDO problems. Additionally, a sampling based reliability sensitivity analysis method is employed to reduce the computational effort further when design variables are distributional parameters of input random variables. The proposed methodology is applied on two test problems to validate its accuracy and efficiency. The proposed methodology is more efficient than first order reliability method based RBDO and Monte Carlo simulation based RBDO, and enables the use of RBDO as a practical design tool.

  2. On the q-Weibull distribution for reliability applications: An adaptive hybrid artificial bee colony algorithm for parameter estimation

    International Nuclear Information System (INIS)

    Xu, Meng; Droguett, Enrique López; Lins, Isis Didier; Chagas Moura, Márcio das

    2017-01-01

    The q-Weibull model is based on the Tsallis non-extensive entropy and is able to model various behaviors of the hazard rate function, including bathtub curves, by using a single set of parameters. Despite its flexibility, the q-Weibull has not been widely used in reliability applications, partly because of its complicated parameter estimation. In this work, the parameters of the q-Weibull are estimated by the maximum likelihood (ML) method. Due to the intricate system of nonlinear equations, derivative-based optimization methods may fail to converge. Thus, the heuristic optimization method of artificial bee colony (ABC) is used instead. To deal with the slow convergence of ABC, an adaptive hybrid ABC (AHABC) algorithm is proposed that dynamically combines the Nelder-Mead simplex search method with ABC for the ML estimation of the q-Weibull parameters. Interval estimates for the q-Weibull parameters, including confidence intervals based on ML asymptotic theory and on bootstrap methods, are also developed. The AHABC is validated via numerical experiments involving the q-Weibull ML for reliability applications, and results show that it produces faster and more accurate convergence when compared to ABC and similar approaches. The estimation procedure is applied to real reliability failure data characterized by a bathtub-shaped hazard rate.
    Highlights:
    • Development of an Adaptive Hybrid ABC (AHABC) algorithm for the q-Weibull distribution.
    • AHABC combines the local Nelder-Mead simplex method with ABC to enhance local search.
    • AHABC efficiently finds the optimal solution for the q-Weibull ML problem.
    • AHABC outperforms ABC and self-adaptive hybrid ABC in accuracy and convergence speed.
    • Useful model for reliability data with non-monotonic hazard rate.
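
    For orientation, the q-Weibull reliability and hazard functions that the estimation targets are sketched below in their standard literature form; the parameter values are illustrative only, and the paper's AHABC estimation procedure is not reproduced.

    ```python
    # Sketch: q-Weibull reliability and hazard functions (standard literature forms);
    # parameter values are illustrative, not estimates from the paper.
    import numpy as np

    def reliability(t, q, beta, eta):
        # R(t) = [1 - (1-q)(t/eta)^beta]^((2-q)/(1-q)) on its support, else 0
        base = 1.0 - (1.0 - q) * (t / eta) ** beta
        return np.where(base > 0.0, base ** ((2.0 - q) / (1.0 - q)), 0.0)

    def hazard(t, q, beta, eta):
        base = 1.0 - (1.0 - q) * (t / eta) ** beta
        return np.where(base > 0.0,
                        (2.0 - q) * (beta / eta) * (t / eta) ** (beta - 1.0) / base,
                        np.inf)

    q, beta, eta = 0.5, 0.8, 1000.0     # with q < 1 and beta < 1 the hazard is bathtub-shaped
    t = np.array([10.0, 100.0, 500.0, 1000.0, 1500.0])
    print("R(t):", np.round(reliability(t, q, beta, eta), 4))
    print("h(t):", np.round(hazard(t, q, beta, eta), 6))
    ```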

  3. Reliability of using nondestructive tests to estimate compressive strength of building stones and bricks

    Directory of Open Access Journals (Sweden)

    Ali Abd Elhakam Aliabdo

    2012-09-01

    Full Text Available This study aims to investigate the relationships between Schmidt hardness rebound number (RN) and ultrasonic pulse velocity (UPV) versus compressive strength (fc) of stones and bricks. Four types of rock (marble, pink limestone, white limestone and basalt) and two types of brick (burned bricks and lime-sand bricks) were studied. Linear and non-linear models were proposed. High correlations were found between RN and UPV versus compressive strength. Validation of the proposed models was assessed using other specimens of each material. Linear models for each material showed better correlations than non-linear models. A general model between RN and compressive strength of the tested stones and bricks showed a high correlation, with a regression coefficient R2 value of 0.94. Estimation of compressive strength for the studied stones and bricks using their rebound number and ultrasonic pulse velocity in a combined method was generally more reliable than using the rebound number or ultrasonic pulse velocity alone.
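
    A combined model of the kind the study reports can be illustrated as an ordinary least-squares fit of fc on both RN and UPV; the data points below are hypothetical and the fitted coefficients are not those of the paper.

    ```python
    # Sketch: combined linear model fc = a + b*RN + c*UPV (hypothetical data points).
    import numpy as np

    rn  = np.array([30.0, 34.0, 38.0, 42.0, 46.0, 50.0])    # Schmidt rebound number
    upv = np.array([3.1, 3.4, 3.8, 4.1, 4.5, 4.8])          # ultrasonic pulse velocity, km/s
    fc  = np.array([18.0, 24.0, 31.0, 37.0, 45.0, 52.0])    # compressive strength, MPa

    X = np.column_stack([np.ones_like(rn), rn, upv])
    coef, *_ = np.linalg.lstsq(X, fc, rcond=None)
    pred = X @ coef
    r2 = 1.0 - np.sum((fc - pred) ** 2) / np.sum((fc - fc.mean()) ** 2)
    print("intercept, RN coefficient, UPV coefficient:", np.round(coef, 3))
    print(f"R^2 of the combined model: {r2:.3f}")
    ```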

  4. Reliability Estimation with Uncertainties Consideration for High Power IGBTs in 2.3 MW Wind Turbine Converter System

    DEFF Research Database (Denmark)

    Kostandyan, Erik; Ma, Ke

    2012-01-01

    This paper investigates the lifetime of high power IGBTs (insulated gate bipolar transistors) used in large wind turbine applications. Since the IGBTs are critical components in a wind turbine power converter, it is of great importance to assess their reliability in the design phase of the turbine..... Minimum, maximum and average junction temperature profiles for the grid-side IGBTs are estimated at each wind speed input value. The selected failure mechanism is crack propagation in the solder joint under the silicon die. Based on junction temperature profiles and a physics-of-failure model......, the probabilistic and deterministic damage models are presented with estimated fatigue lives. Reliability levels were assessed by means of the First Order Reliability Method, taking into account uncertainties....

  5. An exact method for solving logical loops in reliability analysis

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2009-01-01

    This paper presents an exact method for solving logical loops in reliability analysis. Systems that include logical loops are usually described by simultaneous Boolean equations. First, a basic rule for solving simultaneous Boolean equations is presented. Next, the analysis procedure for a three-component system with external supports is shown. Third, more detailed discussions are given on the establishment of the logical loop relation. Finally, two typical structures which include more than one logical loop are taken up. Their analysis results and corresponding GO-FLOW charts are given. The proposed analytical method is applicable to loop structures that can be described by simultaneous Boolean equations, and it is very useful in evaluating the reliability of complex engineering systems.

  6. Method-related estimates of sperm vitality.

    Science.gov (United States)

    Cooper, Trevor G; Hellenkemper, Barbara

    2009-01-01

    Comparison of methods that estimate viability of human spermatozoa by monitoring head membrane permeability revealed that wet preparations (whether using positive or negative phase-contrast microscopy) generated significantly higher percentages of nonviable cells than did air-dried eosin-nigrosin smears. Only with the latter method did the sum of motile (presumed live) and stained (presumed dead) preparations never exceed 100%, making this the method of choice for sperm viability estimates.

  7. Matrix-based system reliability method and applications to bridge networks

    International Nuclear Information System (INIS)

    Kang, W.-H.; Song Junho; Gardoni, Paolo

    2008-01-01

    Using a matrix-based system reliability (MSR) method, one can estimate the probabilities of complex system events by simple matrix calculations. Unlike existing system reliability methods whose complexity depends highly on that of the system event, the MSR method describes any general system event in a simple matrix form and therefore provides a more convenient way of handling the system event and estimating its probability. Even in the case where one has incomplete information on the component probabilities and/or the statistical dependence thereof, the matrix-based framework enables us to estimate the narrowest bounds on the system failure probability by linear programming. This paper presents the MSR method and applies it to a transportation network consisting of bridge structures. The seismic failure probabilities of bridges are estimated by use of the predictive fragility curves developed by a Bayesian methodology based on experimental data and existing deterministic models of the seismic capacity and demand. Using the MSR method, the probability of disconnection between each city/county and a critical facility is estimated. The probability mass function of the number of failed bridges is computed as well. In order to quantify the relative importance of bridges, the MSR method is used to compute the conditional probabilities of bridge failures given that there is at least one city disconnected from the critical facility. The bounds on the probability of disconnection are also obtained for cases with incomplete information
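
    The core identity of the MSR method, P(system event) = c^T p over mutually exclusive and collectively exhaustive component-state vectors, can be illustrated on a small network; the three-component example below is invented and much simpler than the paper's bridge network.

    ```python
    # Sketch: P(system event) = c^T p over the 2^n mutually exclusive component states
    # (a hypothetical 3-component network, not the paper's bridge example).
    import itertools
    import numpy as np

    comp_fail_prob = [0.1, 0.2, 0.05]          # hypothetical component failure probabilities

    # the system fails if component 1 fails, or both components 2 and 3 fail
    def system_fails(state):                   # state[i] = 1 means component i failed
        return state[0] == 1 or (state[1] == 1 and state[2] == 1)

    states = list(itertools.product([0, 1], repeat=3))
    p = np.array([np.prod([q if s else 1 - q for s, q in zip(state, comp_fail_prob)])
                  for state in states])        # probabilities of the MECE basic events
    c = np.array([1.0 if system_fails(state) else 0.0 for state in states])

    print(f"sum of p (should be 1): {p.sum():.3f}")
    print(f"P(system failure) = c^T p = {c @ p:.4f}")
    ```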

  8. Estimating Between-Person and Within-Person Subscore Reliability with Profile Analysis.

    Science.gov (United States)

    Bulut, Okan; Davison, Mark L; Rodriguez, Michael C

    2017-01-01

    Subscores are of increasing interest in educational and psychological testing due to their diagnostic function for evaluating examinees' strengths and weaknesses within particular domains of knowledge. Previous studies about the utility of subscores have mostly focused on the overall reliability of individual subscores and ignored the fact that subscores should be distinct and have added value over the total score. This study introduces a profile reliability approach that partitions the overall subscore reliability into within-person and between-person subscore reliability. The estimation of between-person reliability and within-person reliability coefficients is demonstrated using subscores from number-correct scoring, unidimensional and multidimensional item response theory scoring, and augmented scoring approaches via a simulation study and a real data study. The effects of various testing conditions, such as subtest length, correlations among subscores, and the number of subtests, are examined. Results indicate that there is a substantial trade-off between within-person and between-person reliability of subscores. Profile reliability coefficients can be useful in determining the extent to which subscores provide distinct and reliable information under various testing conditions.

  9. Reliability Study Regarding the Use of Histogram Similarity Methods for Damage Detection

    Directory of Open Access Journals (Sweden)

    Nicoleta Gillich

    2013-01-01

    Full Text Available The paper analyses the reliability of three dissimilarity estimators for comparing histograms, as support for a frequency-based damage detection method able to identify structural changes in beam-like structures. First, a brief presentation of the authors' damage detection method is made, with focus on damage localization. It consists in comparing a histogram derived from measurement results with a large series of histograms, namely the damage location indexes for all locations along the beam, obtained by calculation. We tested dissimilarity estimators such as the Minkowski-form distances, the Kullback-Leibler divergence and the histogram intersection, and found the Minkowski distance to be the method providing the best results. It was tested for numerous locations, using real measurement results and results artificially degraded by noise, proving its reliability.
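
    The three dissimilarity estimators named in the study are easy to state directly; the sketch below applies them to two small normalized histograms. The vectors are toy data, not the damage location index histograms of the paper.

    ```python
    # Sketch: the three histogram dissimilarity measures compared in the study
    # (toy histograms, not the paper's damage location indexes).
    import numpy as np

    def minkowski(h1, h2, p=2):
        return np.sum(np.abs(h1 - h2) ** p) ** (1.0 / p)

    def kl_divergence(h1, h2, eps=1e-12):
        h1, h2 = h1 + eps, h2 + eps                 # avoid log(0)
        return np.sum(h1 * np.log(h1 / h2))

    def histogram_intersection(h1, h2):
        return np.sum(np.minimum(h1, h2))           # 1.0 means identical histograms

    measured  = np.array([0.05, 0.10, 0.30, 0.35, 0.15, 0.05])
    candidate = np.array([0.04, 0.12, 0.28, 0.36, 0.14, 0.06])

    print(f"Minkowski (p=2)        : {minkowski(measured, candidate):.4f}")
    print(f"Kullback-Leibler       : {kl_divergence(measured, candidate):.4f}")
    print(f"Histogram intersection : {histogram_intersection(measured, candidate):.4f}")
    ```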

  10. Reliability analysis of idealized tunnel support system using probability-based methods with case studies

    Science.gov (United States)

    Gharouni-Nik, Morteza; Naeimi, Meysam; Ahadi, Sodayf; Alimoradi, Zahra

    2014-06-01

    different design scenarios. Finally, the reliability index values are obtained for the entire support structure in different design scenarios. The results of the work demonstrate that the proposed reliability evaluation method for tunnel support systems is effective not only for investigating the reliability of individual elements in the structure, but also for building an overall estimate of the reliability performance of the entire tunnel structure.

  11. COMPOSITE METHOD OF RELIABILITY RESEARCH FOR HIERARCHICAL MULTILAYER ROUTING SYSTEMS

    Directory of Open Access Journals (Sweden)

    R. B. Tregubov

    2016-09-01

    Full Text Available The paper deals with the idea of a research method for hierarchical multilayer routing systems. The method is a composition of methods from graph theory, reliability theory, probability theory, etc. These methods are applied to the solution of different specific analysis and optimization tasks and are systemically connected and coordinated with each other through a uniform set-theoretic representation of the object of research. Hierarchical multilayer routing systems are considered as infrastructure facilities (gas and oil pipelines, automobile and railway networks, systems of power supply and communication) that distribute material resources, energy or information using hierarchically nested routing functions. For illustration, the theoretical constructions are considered on the example of determining the probability of the up state of a specific infocommunication system. The author shows the possibility of constructively combining the graph representation of the structure of the object of research with a logical-probabilistic method for analysing its reliability indices, through a uniform set-theoretic representation of its elements and of the processes proceeding in them.

  12. A method of estimating log weights.

    Science.gov (United States)

    Charles N. Mann; Hilton H. Lysons

    1972-01-01

    This paper presents a practical method of estimating the weights of logs before they are yarded. Knowledge of log weights is required to achieve optimum loading of modern yarding equipment. Truckloads of logs are weighed and measured to obtain a local density index (pounds per cubic foot) for a species of logs. The density index is then used to estimate the weights of...
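
    A minimal numeric illustration of the density-index idea, assuming hypothetical truckload weights and volumes: derive pounds per cubic foot from the weighed loads, then apply it to a measured log volume.

    ```python
    # Sketch: local density index from weighed truckloads, then an estimated log weight
    # (all numbers are hypothetical).
    truckloads = [(42_500.0, 610.0), (39_800.0, 585.0), (44_100.0, 640.0)]  # (lb, ft^3)

    density_index = sum(w for w, _ in truckloads) / sum(v for _, v in truckloads)
    print(f"local density index: {density_index:.1f} lb/ft^3")

    # estimated weight of a log from its measured volume
    log_volume_ft3 = 55.0
    print(f"estimated log weight: {density_index * log_volume_ft3:,.0f} lb")
    ```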

  13. Simple method for quick estimation of aquifer hydrogeological parameters

    Science.gov (United States)

    Ma, C.; Li, Y. Y.

    2017-08-01

    Development of simple and accurate methods to determine aquifer hydrogeological parameters is of importance for groundwater resources assessment and management. To address the problem of estimating aquifer parameters from unsteady pumping-test data, a fitting function for the Theis well function was proposed using a fitting optimization method, and a simple linear regression equation was then established. The aquifer parameters can be obtained by solving for the coefficients of the regression equation. The application of the proposed method was illustrated using two published data sets. Error statistics and analysis of the pumping drawdown showed that the method proposed in this paper yields quick and accurate estimates of the aquifer parameters. The proposed method can reliably identify the aquifer parameters from long-distance observed drawdowns and early drawdowns. It is hoped that the proposed method will be helpful for practicing hydrogeologists and hydrologists.
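
    The paper's own fitting function is not reproduced here; as a stand-in for the same style of linearized estimation, the sketch below applies the classical Cooper-Jacob straight-line simplification of the Theis solution to hypothetical drawdown data.

    ```python
    # Sketch: Cooper-Jacob straight-line estimation of transmissivity and storativity
    # (hypothetical pumping-test data; a stand-in for the paper's own fitting function).
    import numpy as np

    Q = 0.01          # pumping rate, m^3/s
    r = 30.0          # distance to the observation well, m
    t = np.array([600.0, 1200.0, 3600.0, 7200.0, 14400.0])     # elapsed time, s
    s = np.array([0.20, 0.27, 0.38, 0.45, 0.52])               # drawdown, m

    # Cooper-Jacob: s = (2.3*Q / (4*pi*T)) * log10(2.25*T*t / (r^2*S))
    slope, intercept = np.polyfit(np.log10(t), s, 1)
    T = 2.3 * Q / (4.0 * np.pi * slope)                         # transmissivity, m^2/s
    S = 2.25 * T * 10 ** (-intercept / slope) / r**2            # storativity

    print(f"transmissivity T ~ {T:.2e} m^2/s, storativity S ~ {S:.2e}")
    ```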

  14. Stochastic models and reliability parameter estimation applicable to nuclear power plant safety

    International Nuclear Information System (INIS)

    Mitra, S.P.

    1979-01-01

    A set of stochastic models and related estimation schemes for reliability parameters is developed. The models are applicable for evaluating reliability of nuclear power plant systems. Reliability information is extracted from model parameters which are estimated from the type and nature of failure data that is generally available or could be compiled in nuclear power plants. Principally, two aspects of nuclear power plant reliability have been investigated: (1) The statistical treatment of inplant component and system failure data; (2) The analysis and evaluation of common mode failures. The model inputs are failure data which have been classified as either the time type of failure data or the demand type of failure data. Failures of components and systems in nuclear power plants are, in general, rare events. This gives rise to sparse failure data. Estimation schemes for treating sparse data, whenever necessary, have been considered. The following five problems have been studied: 1) Distribution of sparse failure rate component data. 2) Failure rate inference and reliability prediction from time type of failure data. 3) Analyses of demand type of failure data. 4) Common mode failure model applicable to time type of failure data. 5) Estimation of common mode failures from 'near-miss' demand type of failure data

  15. The relationship between cost estimates reliability and BIM adoption: SEM analysis

    Science.gov (United States)

    Ismail, N. A. A.; Idris, N. H.; Ramli, H.; Rooshdi, R. R. Raja Muhammad; Sahamir, S. R.

    2018-02-01

    This paper presents the usage of the Structural Equation Modelling (SEM) approach in analysing the effects of Building Information Modelling (BIM) technology adoption on improving the reliability of cost estimates. Based on the questionnaire survey results, SEM analysis using the SPSS-AMOS application examined the relationships between BIM-improved information and cost estimates reliability factors, leading to BIM technology adoption. Six hypotheses were established prior to the SEM analysis, employing two types of SEM models, namely the Confirmatory Factor Analysis (CFA) model and the full structural model. The SEM models were then validated through the assessment of their uni-dimensionality, validity, reliability, and fitness index, in line with the hypotheses tested. The final SEM model fit measures are: P-value=0.000, RMSEA=0.079, TLI=0.956>0.90, NFI=0.935>0.90 and ChiSq/df=2.259, indicating that the overall index values achieved the required level of model fitness. The model supports all the hypotheses evaluated, confirming that all relationships among the constructs are positive and significant. Ultimately, the analysis verified that most of the respondents foresee a better understanding of project input information through BIM visualization, its reliable database and coordinated data, in developing more reliable cost estimates. They also expect BIM adoption to accelerate their cost estimating tasks.

  16. Reliability estimation of structures under stochastic loading—A case study on nuclear piping

    International Nuclear Information System (INIS)

    Hari Prasad, M.; Rami Reddy, G.; Dubey, P.N.; Srividya, A.; Verma, A.K.

    2013-01-01

    Highlights:
    ► Structures are generally subjected to different types of loadings.
    ► One such type of loading is a random sequence, which has been treated as stochastic fatigue loading.
    ► In this methodology both stress amplitude and number of cycles to failure have been considered as random variables.
    ► The methodology has been demonstrated with a case study on nuclear piping.
    ► The failure probability of piping has been estimated as a function of time.
    Abstract: Generally, structures are subjected to different types of loadings throughout their life time. These loads can be either discrete or continuous in nature, and they can be either stationary or non-stationary processes. This means that structural reliability analysis considers not only random variables but also random variables which are functions of time, referred to as stochastic processes. A stochastic process can be viewed as a family of random variables. When a structure is subjected to a random loading, the failure probability can be estimated based on the stresses developed in the structure and the failure criteria. In practice, structures are designed with a higher factor of safety to take care of such random loads. In such cases the structure will fail only when the random loads are cyclic in nature. In traditional reliability analysis, the variation in the load is treated as a random variable, and the concept of extreme value theory is used to account for the number of occurrences of the loading. But with this method one neglects the damage accumulation that takes place from one loading to the next. Hence, in this paper, a new way of dealing with these types of problems is discussed using the concept of stochastic fatigue loading. The random loading has been considered as earthquake loading. The methodology has been demonstrated with a case study on nuclear power plant piping.

  17. A generic method for assignment of reliability scores applied to solvent accessibility predictions

    Directory of Open Access Journals (Sweden)

    Nielsen Morten

    2009-07-01

    Full Text Available Abstract Background Estimation of the reliability of specific real value predictions is nontrivial and the efficacy of this is often questionable. It is important to know if you can trust a given prediction and therefore the best methods associate a prediction with a reliability score or index. For discrete qualitative predictions, the reliability is conventionally estimated as the difference between output scores of selected classes. Such an approach is not feasible for methods that predict a biological feature as a single real value rather than a classification. As a solution to this challenge, we have implemented a method that predicts the relative surface accessibility of an amino acid and simultaneously predicts the reliability for each prediction, in the form of a Z-score. Results An ensemble of artificial neural networks has been trained on a set of experimentally solved protein structures to predict the relative exposure of the amino acids. The method assigns a reliability score to each surface accessibility prediction as an inherent part of the training process. This is in contrast to the most commonly used procedures where reliabilities are obtained by post-processing the output. Conclusion The performance of the neural networks was evaluated on a commonly used set of sequences known as the CB513 set. An overall Pearson's correlation coefficient of 0.72 was obtained, which is comparable to the performance of the currently best publicly available method, Real-SPINE. Both methods associate a reliability score with the individual predictions. However, our implementation of reliability scores in the form of a Z-score is shown to be the more informative measure for discriminating good predictions from bad ones in the entire range from completely buried to fully exposed amino acids. This is evident when comparing the Pearson's correlation coefficient for the upper 20% of predictions sorted according to reliability. For this subset, values of 0
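
    In the paper the reliability score is produced by the networks themselves as part of training. As a stand-in illustration of the general idea, the hedged sketch below derives a per-residue Z-score from the disagreement of a hypothetical ensemble of real-value predictors: low spread maps to a high reliability score.

      import numpy as np

      def ensemble_prediction_with_zscore(member_outputs):
          """Combine an ensemble of real-valued predictions and attach a
          per-prediction reliability Z-score (larger = more trustworthy).

          member_outputs: (n_members, n_residues) array of each network's
          predicted relative surface accessibility (hypothetical input).
          """
          mean_pred = member_outputs.mean(axis=0)   # ensemble prediction
          spread = member_outputs.std(axis=0)       # disagreement per residue
          # Express the spread as a Z-score over all residues; the sign is
          # flipped so that low spread gives a high (positive) score.
          z = (spread.mean() - spread) / (spread.std() + 1e-12)
          return mean_pred, z

      # Toy example: 5 hypothetical ensemble members, 8 residues
      rng = np.random.default_rng(1)
      outputs = np.clip(rng.normal(0.4, 0.15, size=(5, 8)), 0.0, 1.0)
      pred, zscore = ensemble_prediction_with_zscore(outputs)
      print(np.round(pred, 2))
      print(np.round(zscore, 2))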

  18. Nonparametric methods for volatility density estimation

    NARCIS (Netherlands)

    Es, van Bert; Spreij, P.J.C.; Zanten, van J.H.

    2009-01-01

    Stochastic volatility modelling of financial processes has become increasingly popular. The proposed models usually contain a stationary volatility process. We will motivate and review several nonparametric methods for estimation of the density of the volatility process. Both models based on

  19. Assessment of Electronic Circuits Reliability Using Boolean Truth Table Modeling Method

    International Nuclear Information System (INIS)

    EI-Shanshoury, A.I.

    2011-01-01

    This paper explores the use of the Boolean Truth Table modeling Method (BTTM) in the analysis of qualitative data. It is widely used in certain fields, especially in electrical and electronic engineering. Our work focuses on the evaluation of power supply circuit reliability using BTTM, which involves systematic attempts to falsify and identify hypotheses on the basis of truth tables constructed from qualitative data. Reliability parameters, such as the system failure rates for the power supply case study, are estimated. All possible state combinations (operating and failed states) of the major components in the circuit were listed and their effects on the overall system were studied.
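
    The state-enumeration idea can be sketched in a few lines. The circuit below is hypothetical (a rectifier feeding two redundant regulators) and the component reliabilities are assumed values; the point is only to show how every row of the truth table contributes its probability to the system result.

      from itertools import product

      # Hypothetical per-component reliabilities of a small power-supply circuit
      components = {"rectifier": 0.98, "regulator_A": 0.95, "regulator_B": 0.95}

      def system_works(state):
          # Assumed success logic: rectifier must work, and at least one regulator
          return state["rectifier"] and (state["regulator_A"] or state["regulator_B"])

      names = list(components)
      system_reliability = 0.0
      for bits in product([True, False], repeat=len(names)):
          state = dict(zip(names, bits))
          # Probability of this exact row of the truth table
          p = 1.0
          for name, up in state.items():
              p *= components[name] if up else 1.0 - components[name]
          if system_works(state):
              system_reliability += p

      print(f"System reliability from truth-table enumeration: {system_reliability:.5f}")
      # Expected: 0.98 * (1 - 0.05**2) = 0.97755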

  20. Spectrum estimation method based on marginal spectrum

    International Nuclear Information System (INIS)

    Cai Jianhua; Hu Weiwen; Wang Xianchun

    2011-01-01

    The FFT method cannot meet the basic requirements of power spectrum estimation for non-stationary and short signals. A new spectrum estimation method based on the marginal spectrum from the Hilbert-Huang transform (HHT) was proposed. The procedure for obtaining the marginal spectrum in the HHT method was given and the linear property of the marginal spectrum was demonstrated. Compared with the FFT method, the physical meaning and the frequency resolution of the marginal spectrum were further analyzed. Then the Hilbert spectrum estimation algorithm was discussed in detail, and the simulation results were given at last. The theory and simulations show that, for short and non-stationary signals, the frequency resolution and estimation precision of the HHT method are better than those of the FFT method. (authors)
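
    A hedged sketch of the marginal-spectrum idea is shown below. The empirical mode decomposition step is skipped: the test signal is treated as if it were already a single intrinsic mode function, the chirp and the bin width are assumed for illustration, and only the Hilbert part of the HHT is reproduced.

      import numpy as np
      from scipy.signal import hilbert

      # Hypothetical short, non-stationary test signal (a linear chirp, 20->80 Hz);
      # a full HHT would first decompose the signal into IMFs by EMD.
      fs = 1000.0
      t = np.arange(0, 1.0, 1.0 / fs)
      x = np.sin(2 * np.pi * (20 * t + 30 * t**2))

      analytic = hilbert(x)
      amplitude = np.abs(analytic)                   # instantaneous amplitude a(t)
      phase = np.unwrap(np.angle(analytic))
      inst_freq = np.diff(phase) / (2 * np.pi) * fs  # instantaneous frequency f(t)

      # Marginal spectrum: accumulate amplitude over time into frequency bins
      bins = np.arange(0.0, 120.0, 2.0)              # 2 Hz bins (assumed)
      marginal, _ = np.histogram(inst_freq, bins=bins, weights=amplitude[:-1])
      marginal /= fs                                 # approximate the time integral

      centers = 0.5 * (bins[:-1] + bins[1:])
      mean_freq = (centers * marginal).sum() / marginal.sum()
      print(f"amplitude-weighted mean frequency = {mean_freq:.1f} Hz (sweep 20-80 Hz)")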

  1. Applicability of simplified human reliability analysis methods for severe accidents

    Energy Technology Data Exchange (ETDEWEB)

    Boring, R.; St Germain, S. [Idaho National Lab., Idaho Falls, Idaho (United States); Banaseanu, G.; Chatri, H.; Akl, Y. [Canadian Nuclear Safety Commission, Ottawa, Ontario (Canada)

    2016-03-15

    Most contemporary human reliability analysis (HRA) methods were created to analyse design-basis accidents at nuclear power plants. As part of a comprehensive expansion of risk assessments at many plants internationally, HRAs will begin considering severe accident scenarios. Severe accidents, while extremely rare, constitute high consequence events that significantly challenge successful operations and recovery. Challenges during severe accidents include degraded and hazardous operating conditions at the plant, the shift in control from the main control room to the technical support center, the unavailability of plant instrumentation, and the need to use different types of operating procedures. Such shifts in operations may also test key assumptions in existing HRA methods. This paper discusses key differences between design basis and severe accidents, reviews efforts to date to create customized HRA methods suitable for severe accidents, and recommends practices for adapting existing HRA methods that are already being used for HRAs at the plants. (author)

  2. A simple reliability block diagram method for safety integrity verification

    International Nuclear Information System (INIS)

    Guo Haitao; Yang Xianhui

    2007-01-01

    IEC 61508 requires safety integrity verification for safety-related systems as a necessary procedure in the safety life cycle. PFDavg must be calculated to verify the safety integrity level (SIL). Since IEC 61508-6 does not give detailed explanations of the definitions and PFDavg calculations for its examples, it is difficult for common reliability or safety engineers to understand when they use the standard as guidance in practice. A method using reliability block diagrams is investigated in this study in order to provide a clear and feasible way of calculating PFDavg and to help those who take IEC 61508-6 as their guidance. The method first finds the mean down times (MDTs) of both the channel and the voted group, and then PFDavg. The calculated results for various voted groups are compared with those in IEC 61508 part 6 and Ref. [Zhang T, Long W, Sato Y. Availability of systems with self-diagnostic components-applying Markov model to IEC 61508-6. Reliab Eng System Saf 2003;80(2):133-41]. An interesting outcome can be seen from the comparison. Furthermore, although differences in the MDT of voted groups exist between IEC 61508-6 and this paper, the PFDavg values of the voted groups are comparatively close. With its detailed description, the RBD method presented can be applied to quantitative SIL verification, showing similarity to the method in IEC 61508-6.
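
    For orientation, the widely quoted simplified low-demand formulas give a feel for the quantities the paper computes via channel and voted-group MDTs. The sketch below is not the paper's derivation: common-cause failures, diagnostic coverage and repair times are ignored, and the failure rate and proof-test interval are assumed.

      # Simplified low-demand PFDavg formulas (no common cause, no diagnostics).
      lambda_du = 2.0e-6   # dangerous undetected failure rate, 1/h (assumed)
      ti = 8760.0          # proof-test interval, h (assumed: one year)

      pfd_1oo1 = lambda_du * ti / 2.0
      pfd_1oo2 = (lambda_du * ti) ** 2 / 3.0
      pfd_2oo3 = (lambda_du * ti) ** 2

      for name, pfd in [("1oo1", pfd_1oo1), ("1oo2", pfd_1oo2), ("2oo3", pfd_2oo3)]:
          print(f"{name}: PFDavg = {pfd:.2e}")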

  3. Reliability of single aliquot regenerative protocol (SAR) for dose estimation in quartz at different burial temperatures: A simulation study

    International Nuclear Information System (INIS)

    Koul, D.K.; Pagonis, V.; Patil, P.

    2016-01-01

    The single aliquot regenerative protocol (SAR) is a well-established technique for estimating naturally acquired radiation doses in quartz. This simulation work examines the reliability of the SAR protocol for samples which experienced different ambient temperatures in nature, in the range of −10 to 40 °C. The contribution of the various experimental variables used in SAR protocols to the accuracy and precision of the method is simulated for different ambient temperatures. Specifically, the effects of paleo-dose, test dose, pre-heating temperature and cut-heat temperature on the accuracy of equivalent dose (ED) estimation are simulated using random combinations of the concentrations of traps and centers in a previously published comprehensive quartz model. The findings suggest that the ambient temperature has a significant bearing on the reliability of natural dose estimation using the SAR protocol, especially for ambient temperatures above 0 °C. The main source of these inaccuracies seems to be thermal sensitization of the quartz samples caused by the well-known thermal transfer of holes between luminescence centers in quartz. The simulations suggest that most of this inaccuracy in the dose estimation can be removed by delivering the laboratory doses in pulses (pulsed irradiation procedures). - Highlights: • Ambient temperatures affect the reliability of SAR. • The protocol overestimates the dose with increasing burial temperature and burial time. • Elevated-temperature irradiation does not correct for these overestimations. • Inaccuracies in dose estimation can be removed by incorporating pulsed irradiation procedures.

  4. Reliability analysis of road network for estimation of public evacuation time around NPPs

    Energy Technology Data Exchange (ETDEWEB)

    Bang, Sun-Young; Lee, Gab-Bock; Chung, Yang-Geun [Korea Electric Power Research Institute, Daejeon (Korea, Republic of)

    2007-07-01

    The strongest protective measure in radiation emergency preparedness is the evacuation of the public when a great deal of radioactivity is released to the environment. After the Three Mile Island (TMI) nuclear power plant meltdown in the United States and the Chernobyl nuclear power plant disaster in the U.S.S.R., many advanced countries, including the United States and Japan, have continued research on the estimation of public evacuation time as one of the emergency countermeasure technologies. In South Korea as well, the 'Framework Act on Civil Defense: Radioactive Disaster Preparedness Plan' was established in 1983, and nuclear power plants set up radiation emergency plans and have regularly carried out radiation emergency preparedness trainings. Nonetheless, there is still a need to improve the technology for estimating public evacuation time by executing precise analysis of traffic flow, in order to prepare practical and efficient ways to protect the public. In this research, road networks for the Wolsong and Kori NPPs were constructed with the CORSIM code and a reliability analysis of these road networks was performed.

  5. Application of system reliability analytical method, GO-FLOW

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Fukuto, Junji; Mitomo, Nobuo; Miyazaki, Keiko; Matsukura, Hiroshi; Kobayashi, Michiyuki

    1999-01-01

    The Ship Research Institute has been conducting a development study of the GO-FLOW method, adding various advanced functionalities to this system reliability analysis method, which occupies a central place in PSA (Probabilistic Safety Assessment). The work aimed to upgrade the functionality of the GO-FLOW method, to develop an analytical capability integrating dynamic behavior analysis, physical behavior and probable subject transfer, and to prepare a function for extracting the main accident sequences. In fiscal year 1997, an analytical function was developed for the dynamic event-tree analysis system by adding dependencies between headings. With the accident-sequence simulation analysis function, the main accident sequences of the MRX improved marine propulsion reactor could be covered completely. In addition, input data for the analysis can be prepared with a function that an analysis operator can easily set up. (G.K.)

  6. Solution-verified reliability analysis and design of bistable MEMS using error estimation and adaptivity.

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, Michael Scott; Subia, Samuel Ramirez; Neckels, David; Hopkins, Matthew Morgan; Notz, Patrick K.; Adams, Brian M.; Carnes, Brian; Wittwer, Jonathan W.; Bichon, Barron J.; Copps, Kevin D.

    2006-10-01

    This report documents the results for an FY06 ASC Algorithms Level 2 milestone combining error estimation and adaptivity, uncertainty quantification, and probabilistic design capabilities applied to the analysis and design of bistable MEMS. Through the use of error estimation and adaptive mesh refinement, solution verification can be performed in an automated and parameter-adaptive manner. The resulting uncertainty analysis and probabilistic design studies are shown to be more accurate, efficient, reliable, and convenient.

  7. Evaluation of non cyanide methods for hemoglobin estimation

    Directory of Open Access Journals (Sweden)

    Vinaya B Shah

    2011-01-01

    Full Text Available Background: The hemoglobin-cyanide (HiCN) method for measuring hemoglobin is used extensively worldwide; its advantages are the ready availability of a stable and internationally accepted reference standard calibrator. However, its use may create a problem, as the waste disposal of large volumes of reagent containing cyanide constitutes a potential toxic hazard. Aims and Objective: As an alternative to Drabkin's method of Hb estimation, we attempted to estimate hemoglobin by other non-cyanide methods: alkaline hematin detergent (AHD-575) using Triton X-100 as lyser, and the alkaline-borax method using quaternary ammonium detergents as lyser. Materials and Methods: The hemoglobin (Hb) results on 200 samples of varying Hb concentrations obtained by these two cyanide-free methods were compared with the cyanmethemoglobin method on a light-emitting-diode (LED) based colorimeter. Hemoglobin was also estimated in one hundred blood donors and 25 blood samples of infants and compared by these methods. The statistical analysis used was Pearson's correlation coefficient. Results: The response of the non-cyanide methods is linear for serially diluted blood samples over the Hb concentration range from 3 g/dl to 20 g/dl. The non-cyanide methods have a precision of ±0.25 g/dl (coefficient of variation = 2.34%) and are suitable for use with fixed-wavelength colorimeters at 530 nm and 580 nm. Correlation of these two methods was excellent (r=0.98). The evaluation has shown them to be as reliable and reproducible as HiCN for measuring hemoglobin at all concentrations. The reagents used in the non-cyanide methods are non-biohazardous, did not affect the reliability of the determinations, and cost less than the HiCN method. Conclusions: Thus, non-cyanide methods of Hb estimation offer the possibility of safe and quality Hb estimation and should prove useful for routine laboratory use. Non-cyanide methods are easily incorporated in hemoglobinometers

  8. A dynamic discretization method for reliability inference in Dynamic Bayesian Networks

    International Nuclear Information System (INIS)

    Zhu, Jiandao; Collette, Matthew

    2015-01-01

    The material and modeling parameters that drive structural reliability analysis for marine structures are subject to a significant uncertainty. This is especially true when time-dependent degradation mechanisms such as structural fatigue cracking are considered. Through inspection and monitoring, information such as crack location and size can be obtained to improve these parameters and the corresponding reliability estimates. Dynamic Bayesian Networks (DBNs) are a powerful and flexible tool to model dynamic system behavior and update reliability and uncertainty analysis with life cycle data for problems such as fatigue cracking. However, a central challenge in using DBNs is the need to discretize certain types of continuous random variables to perform network inference while still accurately tracking low-probability failure events. Most existing discretization methods focus on getting the overall shape of the distribution correct, with less emphasis on the tail region. Therefore, a novel scheme is presented specifically to estimate the likelihood of low-probability failure events. The scheme is an iterative algorithm which dynamically partitions the discretization intervals at each iteration. Through applications to two stochastic crack-growth example problems, the algorithm is shown to be robust and accurate. Comparisons are presented between the proposed approach and existing methods for the discretization problem. - Highlights: • A dynamic discretization method is developed for low-probability events in DBNs. • The method is compared to existing approaches on two crack growth problems. • The method is shown to improve on existing methods for low-probability events
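
    The reason tail resolution matters can be shown with a toy discretization. The sketch below is not the authors' iterative partitioning algorithm: it simply compares a coarse uniform discretization of a hypothetical crack-size distribution with one refined near the failure threshold, using interval midpoints to decide which bins count as failure.

      import numpy as np
      from scipy import stats

      dist = stats.lognorm(s=0.4, scale=2.0)   # assumed crack-size distribution (mm)
      critical = 6.0                           # assumed critical crack size (mm)

      def discretized_pf(edges):
          """Failure mass when the node is discretized on `edges` and each
          interval is represented by its midpoint."""
          probs = np.diff(dist.cdf(edges))
          mids = 0.5 * (edges[:-1] + edges[1:])
          return probs[mids > critical].sum()

      uniform = np.linspace(0.0, 10.0, 5)                      # 4 wide intervals
      refined = np.concatenate([np.array([0.0, 2.5, 5.0]),     # coarse in the body
                                np.arange(5.25, 10.25, 0.25)]) # fine near the tail

      print(f"exact P(failure)            = {dist.sf(critical):.3e}")
      print(f"uniform discretization      = {discretized_pf(uniform):.3e}")
      print(f"tail-refined discretization = {discretized_pf(refined):.3e}")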

  9. Human decomposition and the reliability of a 'Universal' model for post mortem interval estimations.

    Science.gov (United States)

    Cockle, Diane L; Bell, Lynne S

    2015-08-01

    Human decomposition is a complex biological process driven by an array of variables which are not clearly understood. The medico-legal community has long been searching for a reliable method to establish the post-mortem interval (PMI) for those whose deaths have either been hidden or gone unnoticed. To date, attempts to develop a PMI estimation method based on the state of the body either at the scene or at autopsy have been unsuccessful. One recent study has proposed that two simple formulae, based on the level of decomposition, humidity and temperature, could be used to accurately calculate the PMI for bodies outside, on or under the surface worldwide. This study attempted to validate 'Formula I' [1] (for bodies on the surface) using 42 Canadian cases with known PMIs. The results indicated that for bodies exposed to warm temperatures, Formula I consistently overestimated the known PMI by a large and inconsistent margin, while for bodies exposed to cold and freezing temperatures (less than 4°C) the PMI was dramatically underestimated. The ability of 'Formula II' to estimate the PMI for buried bodies was also examined using a set of 22 known Canadian burial cases. As the cases used in this study are retrospective, some of the data needed for Formula II were not available. The 4.6 value used in Formula II to represent the standard ratio by which burial decelerates the rate of decomposition was examined. The average time taken to achieve each stage of decomposition both on and under the surface was compared for the 118 known cases. It was found that the rate of decomposition was not consistent throughout all stages of decomposition. The rates of autolysis above and below the ground were equivalent, with the buried cases staying in a state of putrefaction for a prolonged period of time. It is suggested that differences in temperature extremes and humidity levels between geographic regions may make it impractical to apply formulas developed in

  10. Methods for risk estimation in nuclear energy

    Energy Technology Data Exchange (ETDEWEB)

    Gauvenet, A [CEA, 75 - Paris (France)

    1979-01-01

    The author presents methods for estimating the different risks related to nuclear energy: immediate or delayed risks, individual or collective risks, risks of accidents and long-term risks. These methods have reached a high level of elaboration, and their application to other industrial or human problems is currently under way, especially in English-speaking countries.

  11. Estimating the Reliability of Aggregated and Within-Person Centered Scores in Ecological Momentary Assessment

    Science.gov (United States)

    Huang, Po-Hsien; Weng, Li-Jen

    2012-01-01

    A procedure for estimating the reliability of test scores in the context of ecological momentary assessment (EMA) was proposed to take into account the characteristics of EMA measures. Two commonly used test scores in EMA were considered: the aggregated score (AGGS) and the within-person centered score (WPCS). Conceptually, AGGS and WPCS represent…

  12. Nonparametric Estimation of Interval Reliability for Discrete-Time Semi-Markov Systems

    DEFF Research Database (Denmark)

    Georgiadis, Stylianos; Limnios, Nikolaos

    2016-01-01

    In this article, we consider a repairable discrete-time semi-Markov system with finite state space. The measure of the interval reliability is given as the probability of the system being operational over a given finite-length time interval. A nonparametric estimator is proposed for the interval...

  13. Uncertainty in reliability estimation : when do we know everything we know?

    NARCIS (Netherlands)

    Houben, M.J.H.A.; Sonnemans, P.J.M.; Newby, M.J.; Bris, R.; Guedes Soares, C.; Martorell, S.

    2009-01-01

    In this paper we demonstrate the use of an adapted Grounded Theory approach, through interviews and their analysis, to determine explicit uncertainty (known unknowns) for reliability estimation in the early phases of product development. We have applied the adapted Grounded Theory approach in a case

  14. Estimating reliability coefficients with heterogeneous item weightings using Stata: A factor based approach

    NARCIS (Netherlands)

    Boermans, M.A.; Kattenberg, M.A.C.

    2011-01-01

    We show how to estimate a Cronbach's alpha reliability coefficient in Stata after running a principal component or factor analysis. Alpha evaluates to what extent items measure the same underlying content when the items are combined into a scale or used for a latent variable. Stata allows for testing
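
    The record concerns Stata; purely as a language-neutral illustration of what the coefficient measures, the hedged sketch below computes Cronbach's alpha from a small, made-up item-score matrix.

      import numpy as np

      def cronbach_alpha(items):
          """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1)        # variance of each item
          total_var = items.sum(axis=1).var(ddof=1)    # variance of the scale score
          return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

      # Toy data: 6 hypothetical respondents answering 4 items on a 1-5 scale
      scores = np.array([
          [4, 5, 4, 4],
          [2, 2, 3, 2],
          [5, 4, 5, 5],
          [3, 3, 3, 4],
          [1, 2, 1, 2],
          [4, 4, 5, 4],
      ])
      print(f"Cronbach's alpha = {cronbach_alpha(scores):.3f}")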

  15. Can a sample of Landsat sensor scenes reliably estimate the global extent of tropical deforestation?

    Science.gov (United States)

    R. L. Czaplewski

    2003-01-01

    Tucker and Townshend (2000) conclude that wall-to-wall coverage is needed to avoid gross errors in estimations of deforestation rates because tropical deforestation is concentrated along roads and rivers. They specifically question the reliability of the 10% sample of Landsat sensor scenes used in the global remote sensing survey conducted by the Food and...

  16. Integration of external estimated breeding values and associated reliabilities using correlations among traits and effects

    NARCIS (Netherlands)

    Vandenplas, J.; Colinet, F.G.; Glorieux, G.; Bertozzi, C.; Gengler, N.

    2015-01-01

    Based on a Bayesian view of linear mixed models, several studies showed the possibilities to integrate estimated breeding values (EBV) and associated reliabilities (REL) provided by genetic evaluations performed outside a given evaluation system into this genetic evaluation. Hereafter, the term

  17. A Group Contribution Method for Estimating Cetane and Octane Numbers

    Energy Technology Data Exchange (ETDEWEB)

    Kubic, William Louis [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Process Modeling and Analysis Group

    2016-07-28

    Much of the research on advanced biofuels is devoted to the study of novel chemical pathways for converting nonfood biomass into liquid fuels that can be blended with existing transportation fuels. Many compounds under consideration are not found in existing fuel supplies. Often, the physical properties needed to assess the viability of a potential biofuel are not available. The only reliable information available may be the molecular structure. Group contribution methods for estimating physical properties from molecular structure have been used for more than 60 years. The most common application is the estimation of thermodynamic properties. More recently, group contribution methods have been developed for estimating rate-dependent properties, including cetane and octane numbers. Often, published group contribution methods are limited in terms of the types of functional groups and the range of applicability. In this study, a new, broadly applicable group contribution method based on an artificial neural network was developed to estimate the cetane number, research octane number, and motor octane number of hydrocarbons and oxygenated hydrocarbons. The new method is more accurate over a greater range of molecular weights and structural complexity than existing group contribution methods for estimating cetane and octane numbers.
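
    For readers unfamiliar with the technique, a classical group contribution estimate is just an additive model over structural fragments. The sketch below is a linear toy of that idea, not the report's neural-network method; the group values, intercept and the n-heptane decomposition are hypothetical and not fitted to any real cetane data.

      # Minimal sketch of a classical (linear) group-contribution estimate.
      group_contributions = {   # hypothetical contribution per occurrence
          "CH3": 6.0,
          "CH2": 8.5,
          "CH":  2.0,
          "OH": -4.0,
      }
      base_value = 5.0          # hypothetical intercept

      def estimate_cetane(groups):
          """groups: dict mapping group name -> count in the molecule."""
          return base_value + sum(group_contributions[g] * n for g, n in groups.items())

      # n-heptane decomposed (hypothetically) as 2 CH3 + 5 CH2 groups
      print(f"Estimated cetane number: {estimate_cetane({'CH3': 2, 'CH2': 5}):.1f}")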

  18. Structural hybrid reliability index and its convergent solving method based on random–fuzzy–interval reliability model

    Directory of Open Access Journals (Sweden)

    Hai An

    2016-08-01

    Full Text Available Aiming to resolve the problem that a variety of uncertain variables coexist in engineering structural reliability analysis, a new hybrid reliability index to evaluate structural hybrid reliability, based on the random–fuzzy–interval model, is proposed in this article, together with a convergent method for solving it. First, the truncated probability reliability model, the fuzzy random reliability model, and the non-probabilistic interval reliability model are introduced. Then, the new hybrid reliability index definition is presented based on the random–fuzzy–interval model. Furthermore, the calculation flowchart of the hybrid reliability index is presented, and the index is solved using the modified limit-step-length iterative algorithm, which ensures convergence. The validity of the convergent algorithm for the hybrid reliability model is verified through calculation examples from the literature. In the end, a numerical example demonstrates that the hybrid reliability index is applicable for the wear reliability assessment of mechanisms, where truncated random variables, fuzzy random variables, and interval variables coexist. The demonstration also shows the good convergence of the iterative algorithm proposed in this article.

  19. Bayesian Inference Methods for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand

    2013-01-01

    This thesis deals with sparse Bayesian learning (SBL) with application to radio channel estimation. As opposed to the classical approach for sparse signal representation, we focus on the problem of inferring complex signals. Our investigations within SBL constitute the basis for the development of Bayesian inference algorithms for sparse channel estimation. Sparse inference methods aim at finding the sparse representation of a signal given in some overcomplete dictionary of basis vectors. Within this context, one of our main contributions to the field of SBL is a hierarchical representation... analysis of the complex prior representation, where we show that the ability to induce sparse estimates of a given prior heavily depends on the inference method used and, interestingly, whether real or complex variables are inferred. We also show that the Bayesian estimators derived from the proposed

  20. Estimations of parameters in Pareto reliability model in the presence of masked data

    International Nuclear Information System (INIS)

    Sarhan, Ammar M.

    2003-01-01

    Estimation of the parameters of the individual lifetime distributions of the components of a series system is considered in this paper, based on masked system life-test data. We consider a series system of two independent components, each of which has a Pareto-distributed lifetime. The maximum likelihood and Bayes estimators for the parameters and for the values of the reliability of the system's components at a specific time are obtained. Symmetrical triangular prior distributions are assumed for the unknown parameters in obtaining their Bayes estimators. Large simulation studies are carried out in order to: (i) explain how one can utilize the theoretical results obtained; (ii) compare the maximum likelihood and Bayes estimates of the underlying parameters; and (iii) study the influence of the masking level and the sample size on the accuracy of the estimates obtained

  1. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  2. Comparison of methods for estimating premorbid intelligence

    OpenAIRE

    Bright, Peter; van der Linde, Ian

    2018-01-01

    To evaluate impact of neurological injury on cognitive performance it is typically necessary to derive a baseline (or ‘premorbid’) estimate of a patient’s general cognitive ability prior to the onset of impairment. In this paper, we consider a range of common methods for producing this estimate, including those based on current best performance, embedded ‘hold/no hold’ tests, demographic information, and word reading ability. Ninety-two neurologically healthy adult participants were assessed ...

  3. METHODICAL APPROACH TO AN ESTIMATION OF PROFESSIONALISM OF AN EMPLOYEE

    Directory of Open Access Journals (Sweden)

    Татьяна Александровна Коркина

    2013-08-01

    Full Text Available Analysis of definitions of «professionalism», reflecting the different viewpoints of scientists and practitioners, has shown that it is interpreted as a specific property of people who effectively and reliably carry out labour activity in a variety of conditions. The article presents a methodical approach to estimating an employee's professionalism, both from the standpoint of the external manifestations of the reliability and effectiveness of the work and from the standpoint of the personal characteristics of the employee that determine the results of his work. This approach includes the assessment of the employee's level of qualification and motivation for each key job function, as well as of the final results of its implementation against the criteria of efficiency and reliability. The proposed methodological approach to estimating an employee's professionalism makes it possible to identify «bottlenecks» in the structure of his labour functions and to define directions for developing the professional qualities of the worker needed to ensure the required level of reliability and efficiency of the results. DOI: http://dx.doi.org/10.12731/2218-7405-2013-6-11

  4. Estimation of the human error probabilities in the human reliability analysis

    International Nuclear Information System (INIS)

    Liu Haibin; He Xuhong; Tong Jiejuan; Shen Shifei

    2006-01-01

    Human error data are an important issue in human reliability analysis (HRA). Bayesian parameter estimation, which can combine multiple sources of information, such as NPP historical data and expert judgment, to modify the human error data, can yield human error data that reflect the real situation of the NPP more truly. Using a numerical computation program developed by the authors, this paper presents some typical examples to illustrate the process of Bayesian parameter estimation in HRA and discusses the effect of different modification data on the resulting estimates. (authors)
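
    A conjugate Beta-Binomial update is the textbook version of the idea described here; it is shown below only as a hedged sketch, not the authors' program. The prior parameters stand in for expert judgment and the error/demand counts for plant operating records; all numbers are hypothetical.

      from scipy import stats

      a0, b0 = 0.5, 99.5          # Beta prior, mean 0.005 (assumed expert judgment)
      errors, demands = 2, 150    # assumed plant records: 2 errors in 150 demands

      a_post, b_post = a0 + errors, b0 + demands - errors
      posterior = stats.beta(a_post, b_post)

      print(f"prior mean HEP        = {a0 / (a0 + b0):.4f}")
      print(f"posterior mean HEP    = {posterior.mean():.4f}")
      print(f"90% credible interval = "
            f"({posterior.ppf(0.05):.4f}, {posterior.ppf(0.95):.4f})")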

  5. On estimation of reliability of a nuclear power plant with tokamak reactor

    International Nuclear Information System (INIS)

    Klemin, A.I.; Smetannikov, V.P.; Shiverskij, E.A.

    1982-01-01

    The results of an analysis of INTOR plant reliability are presented. The first stage of the analysis consists in calculating the INTOR plant structural reliability factors (its 15 main systems have been considered). For each system the failure flow parameter W (1/h) and the operational readiness K_r have been determined, and for the plant as a whole, in addition to these factors, the technological utilization coefficient K_TU and the mean time between failures T_o. The second stage of the reliability analysis consists in investigating methods of improving the plant's reliability factors relative to those calculated at the first stage. It is shown that the reliability of the whole plant is determined to the greatest extent by the reliability of the power supply system. Next in extent of influence on INTOR plant reliability is the cryogenic system. Calculations of the INTOR plant reliability factors have given the following values: W = 4.5x10^-3 1/h, T_o = 152 h, K_r = 0.71, K_TU = 0.4

  6. A fast approximation method for reliability analysis of cold-standby systems

    International Nuclear Information System (INIS)

    Wang, Chaonan; Xing, Liudong; Amari, Suprasad V.

    2012-01-01

    Analyzing reliability of large cold-standby systems has been a complicated and time-consuming task, especially for systems with components having non-exponential time-to-failure distributions. In this paper, an approximation model, which is based on the central limit theorem, is presented for the reliability analysis of binary cold-standby systems. The proposed model can estimate the reliability of large cold-standby systems with binary-state components having arbitrary time-to-failure distributions in an efficient and easy way. The accuracy and efficiency of the proposed method are illustrated using several different types of distributions for both 1-out-of-n and k-out-of-n cold-standby systems.
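
    A hedged sketch of the central-limit-theorem idea for a 1-out-of-n cold-standby system (perfect switching assumed) is given below: the system lifetime is the sum of the component lifetimes, so its distribution is approximately normal. The Weibull parameters, component count and mission time are assumed values, and a Monte Carlo run is included only as a reference.

      import numpy as np
      from scipy import stats

      n = 20                                          # 1 active unit + 19 cold spares
      comp = stats.weibull_min(c=1.8, scale=1000.0)   # non-exponential lifetime (h), assumed
      t = 15000.0                                     # mission time of interest (h), assumed

      # CLT approximation: system life ~ Normal(n*mu, n*var)
      mu, var = comp.mean(), comp.var()
      R_clt = stats.norm.sf(t, loc=n * mu, scale=np.sqrt(n * var))

      # Monte Carlo reference
      rng = np.random.default_rng(0)
      lifetimes = comp.rvs(size=(100_000, n), random_state=rng).sum(axis=1)
      R_mc = (lifetimes > t).mean()

      print(f"CLT approximation : R({t:.0f} h) = {R_clt:.4f}")
      print(f"Monte Carlo       : R({t:.0f} h) = {R_mc:.4f}")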

  7. Structural reliability calculation method based on the dual neural network and direct integration method.

    Science.gov (United States)

    Li, Haibin; He, Yun; Nie, Xiaobo

    2018-01-01

    Structural reliability analysis under uncertainty receives wide attention from engineers and scholars because it reflects the structural characteristics and the actual loading situation. The direct integration method, which starts from the definition of reliability, is easy to understand, but there are still mathematical difficulties in the calculation of the multiple integrals involved. Therefore, a dual neural network method is proposed in this paper for calculating these multiple integrals. The dual neural network consists of two neural networks: network A is used to learn the integrand function, and network B is used to simulate the original function. According to the derivative relationship between the network output and the network input, neural network B is derived from neural network A. On this basis, normalization of the performance function is employed in the proposed method to overcome the difficulty of multiple integration and to improve the accuracy of the reliability calculations. Comparisons between the proposed method and the Monte Carlo simulation method, the Hasofer-Lind method, and the mean-value first-order second-moment method demonstrate that the proposed method is an efficient and accurate method for structural reliability problems.

  8. Confidence Estimation of Reliability Indices of the System with Elements Duplication and Recovery

    Directory of Open Access Journals (Sweden)

    I. V. Pavlov

    2017-01-01

    Full Text Available The article considers the problem of estimating confidence intervals for the main reliability indices, such as the availability rate, the mean time between failures, and the operative availability (in the stationary state), for a model of a system with duplication and independent recovery of elements. A solution is presented for a situation that often arises in practice, when the exact values of the reliability parameters of the elements are unknown and only reliability test data for the system or its individual parts (elements, subsystems) are available. It should be noted that problems of confidence estimation of the reliability indices of complex systems based on the test results of their individual elements are fairly common in engineering practice when designing and operating various engineering systems. The available papers consider this problem mainly for non-recoverable systems. A solution is described for the important particular case in which the system elements are duplicated by reserve elements and the elements that have failed in the course of system operation are recovered (regardless of the state of the other elements). An approximate solution of this problem is obtained for the case of high reliability or 'fast recovery' of elements, on the assumption that the average recovery time of the elements is small compared with the average time between failures.
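
    The stationary point estimates whose confidence limits the article studies can be written down directly for a duplicated pair under the fast-recovery assumption. The sketch below is not the article's interval construction; exponential failure and repair are assumed and the rates are hypothetical.

      lam = 1.0e-3   # element failure rate, 1/h (assumed)
      mu = 0.1       # element repair rate, 1/h (assumed): mean repair time 10 h

      a_elem = mu / (lam + mu)             # stationary availability of one element
      a_pair = 1.0 - (1.0 - a_elem) ** 2   # duplicated pair, independent recovery

      # Mean time to system failure for the pair (Markov result for two active
      # elements with repair); for mu >> lam it is approximately mu / (2*lam**2).
      mttf_pair = (3.0 * lam + mu) / (2.0 * lam**2)

      print(f"element availability = {a_elem:.6f}")
      print(f"pair availability    = {a_pair:.8f}")
      print(f"pair mean time between failures ~ {mttf_pair:.3e} h")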

  9. Collection of methods for reliability and safety engineering

    International Nuclear Information System (INIS)

    Fussell, J.B.; Rasmuson, D.M.; Wilson, J.R.; Burdick, G.R.; Zipperer, J.C.

    1976-04-01

    The document presented contains five reports each describing a method of reliability and safety engineering. Report I provides a conceptual framework for the study of component malfunctions during system evaluations. Report II provides methods for locating groups of critical component failures such that all the component failures in a given group can be caused to occur by the occurrence of a single separate event. These groups of component failures are called common cause candidates. Report III provides a method for acquiring and storing system-independent component failure logic information. The information stored is influenced by the concepts presented in Report I and also includes information useful in locating common cause candidates. Report IV puts forth methods for analyzing situations that involve systems which change character in a predetermined time sequence. These phased missions techniques are applicable to the hypothetical ''accident chains'' frequently analyzed for nuclear power plants. Report V presents a unified approach to cause-consequence analysis, a method of analysis useful during risk assessments. This approach, as developed by the Danish Atomic Energy Commission, is modified to reflect the format and symbology conventionally used for other types of analysis of nuclear reactor systems

  10. A Reliability-Oriented Design Method for Power Electronic Converters

    DEFF Research Database (Denmark)

    Wang, Huai; Zhou, Dao; Blaabjerg, Frede

    2013-01-01

    Reliability is a crucial performance indicator of power electronic systems in terms of availability, mission accomplishment and life cycle cost. A paradigm shift in the research on reliability of power electronics is going on from simple handbook based calculations (e.g. models in MIL-HDBK-217F h... and reliability prediction models are provided. A case study on a 2.3 MW wind power converter is discussed with emphasis on the reliability-critical component IGBT modules.

  11. A simple method to estimate interwell autocorrelation

    Energy Technology Data Exchange (ETDEWEB)

    Pizarro, J.O.S.; Lake, L.W. [Univ. of Texas, Austin, TX (United States)

    1997-08-01

    The estimation of autocorrelation in the lateral or interwell direction is important when performing reservoir characterization studies using stochastic modeling. This paper presents a new method to estimate the interwell autocorrelation based on parameters, such as the vertical range and the variance, that can be estimated with commonly available data. We used synthetic fields that were generated from stochastic simulations to provide data to construct the estimation charts. These charts relate the ratio of areal to vertical variance and the autocorrelation range (expressed variously) in two directions. Three different semivariogram models were considered: spherical, exponential and truncated fractal. The overall procedure is demonstrated using field data. We find that the approach gives the most self-consistent results when it is applied to previously identified facies. Moreover, the autocorrelation trends follow the depositional pattern of the reservoir, which gives confidence in the validity of the approach.

  12. Precision profiles and analytic reliability of radioimmunologic methods

    International Nuclear Information System (INIS)

    Yaneva, Z.; Popova, Yu.

    1991-01-01

    The aim of the present study is to investigate and compare some methods for the creation of 'precision profiles' (PP) and to clarify their possibilities for determining the analytical reliability of RIA. Only methods without complicated mathematical calculations have been used. The reproducibility in sera with concentrations of the measured hormone spanning the whole range of the calibration curve has been studied. The radioimmunoassay has been performed with a TSH-RIA set (ex East Germany), and comparative evaluations with commercial sets from HOECHST (Germany) and AMERSHAM (GB). Three methods for obtaining the relationship between concentration (IU/l) and reproducibility (CV, %) are used and a comparison is made of their corresponding profiles: the preliminary rough profile, the Rodbard PP and the Ekins PP. It is concluded that the creation of a precision profile is obligatory and that the method of its construction does not influence the course of the relationship. The PP allows the determination of the concentration range giving stable results, which improves the efficiency of the analytical work. 16 refs., 4 figs.

  13. Study on Performance Shaping Factors (PSFs) Quantification Method in Human Reliability Analysis (HRA)

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Jang, Inseok Jang; Seong, Poong Hyun; Park, Jinkyun; Kim, Jong Hyun

    2015-01-01

    The purpose of HRA implementation is 1) to achieve the human factors engineering (HFE) design goal of providing operator interfaces that will minimize personnel errors and 2) to conduct an integrated activity to support probabilistic risk assessment (PRA). For these purposes, various HRA methods have been developed, such as the technique for human error rate prediction (THERP), the simplified plant analysis risk human reliability assessment (SPAR-H), the cognitive reliability and error analysis method (CREAM) and so on. In performing HRA, the conditions that influence human performance are represented via several context factors called performance shaping factors (PSFs). PSFs are aspects of the human's individual characteristics, environment, organization, or task that specifically degrade or improve human performance, thus respectively increasing or decreasing the likelihood of human errors. Most HRA methods evaluate the weightings of PSFs by expert judgment, and explicit guidance for evaluating the weightings is not provided. It is widely known that the performance of the human operator is one of the critical factors determining the safe operation of NPPs. HRA methods have been developed to identify the possibility and mechanisms of human errors. In performing HRA methods, the effect of PSFs which may increase or decrease human error should be investigated. However, the effects of PSFs have so far been estimated by expert judgment. Accordingly, in order to estimate the effect of PSFs objectively, a quantitative framework to estimate PSFs by using PSF profiles is introduced in this paper

  14. Reliable Quantification of the Potential for Equations Based on Spot Urine Samples to Estimate Population Salt Intake

    DEFF Research Database (Denmark)

    Huang, Liping; Crino, Michelle; Wu, Jason Hy

    2016-01-01

    BACKGROUND: Methods based on spot urine samples (a single sample at one time-point) have been identified as a possible alternative approach to 24-hour urine samples for determining mean population salt intake. OBJECTIVE: The aim of this study is to identify a reliable method for estimating mean population salt intake from spot urine samples. This will be done by comparing the performance of existing equations against one another and against estimates derived from 24-hour urine samples. The effects of factors such as ethnicity, sex, age, body mass index, antihypertensive drug use, health status... ...to a standard format. Individual participant records will be compiled and a series of analyses will be completed to: (1) compare existing equations for estimating 24-hour salt intake from spot urine samples with 24-hour urine samples, and assess the degree of bias according to key demographic and clinical...

  15. Efficient Methods of Estimating Switchgrass Biomass Supplies

    Science.gov (United States)

    Switchgrass (Panicum virgatum L.) is being developed as a biofuel feedstock for the United States. Efficient and accurate methods to estimate switchgrass biomass feedstock supply within a production area will be required by biorefineries. Our main objective was to determine the effectiveness of in...

  16. Coalescent methods for estimating phylogenetic trees.

    Science.gov (United States)

    Liu, Liang; Yu, Lili; Kubatko, Laura; Pearl, Dennis K; Edwards, Scott V

    2009-10-01

    We review recent models to estimate phylogenetic trees under the multispecies coalescent. Although the distinction between gene trees and species trees has come to the fore of phylogenetics, only recently have methods been developed that explicitly estimate species trees. Of the several factors that can cause gene tree heterogeneity and discordance with the species tree, deep coalescence due to random genetic drift in branches of the species tree has been modeled most thoroughly. Bayesian approaches to estimating species trees utilize two likelihood functions, one of which has been widely used in traditional phylogenetics and involves the model of nucleotide substitution, and the second of which is less familiar to phylogeneticists and involves the probability distribution of gene trees given a species tree. Other recent parametric and nonparametric methods for estimating species trees involve parsimony criteria, summary statistics, supertree and consensus methods. Species tree approaches are an appropriate goal for systematics, appear to work well in some cases where concatenation can be misleading, and suggest that sampling many independent loci will be paramount. Such methods can also be challenging to implement because of the complexity of the models and computational time. In addition, further elaboration of the simplest of coalescent models will be required to incorporate commonly known issues such as deviation from the molecular clock, gene flow and other genetic forces.

  17. Gray bootstrap method for estimating frequency-varying random vibration signals with small samples

    Directory of Open Access Journals (Sweden)

    Wang Yanqing

    2014-04-01

    Full Text Available During environmental testing, the estimation of random vibration signals (RVS) is an important technique for airborne platform safety and reliability. However, the available methods, including the extreme value envelope method (EVEM), the statistical tolerances method (STM) and the improved statistical tolerance method (ISTM), require large samples and a typical probability distribution. Moreover, the frequency-varying characteristic of RVS is usually not taken into account. The gray bootstrap method (GBM) is proposed to solve the problem of estimating frequency-varying RVS with small samples. Firstly, the estimation indexes are obtained, including the estimated interval, the estimated uncertainty, the estimated value, the estimated error and the estimated reliability. In addition, GBM is applied to data from a single flight test of a certain aircraft. Finally, in order to evaluate the estimation performance, GBM is compared with the bootstrap method (BM) and the gray method (GM). The results show that GBM is superior for estimating dynamic signals with small samples, and the estimated reliability is shown to be 100% at the given confidence level.
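
    The resampling half of the approach can be sketched with a plain bootstrap interval for a single vibration statistic. The grey-model step that GBM uses to track the frequency-varying trend is not reproduced here, and the sample values are hypothetical.

      import numpy as np

      rng = np.random.default_rng(0)
      # Hypothetical small sample: 30 measured RMS accelerations (g) from one flight
      sample = rng.normal(2.5, 0.4, size=30)

      n_boot = 5000
      boot_rms = np.empty(n_boot)
      for b in range(n_boot):
          resample = rng.choice(sample, size=sample.size, replace=True)
          boot_rms[b] = np.sqrt(np.mean(resample**2))

      lo, hi = np.percentile(boot_rms, [2.5, 97.5])
      print(f"point estimate of RMS level : {np.sqrt(np.mean(sample**2)):.3f} g")
      print(f"95% bootstrap interval      : ({lo:.3f}, {hi:.3f}) g")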

  18. Risk-based methods for reliability investments in electric power distribution systems

    Energy Technology Data Exchange (ETDEWEB)

    Alvehag, Karin

    2011-07-01

    Society relies more and more on a continuous supply of electricity. However, while underinvestment in reliability leads to an unacceptable number of power interruptions, overinvestment results in costs to society that are too high. To give incentives for a socioeconomically optimal level of reliability, quality regulations have been adopted in many European countries. These quality regulations imply new financial risks for the distribution system operator (DSO), since poor reliability can reduce the allowed revenue for the DSO and compensation may have to be paid to affected customers. This thesis develops a method for evaluating the incentives for reliability investments implied by different quality regulation designs. The method can be used to investigate whether socioeconomically beneficial projects are also beneficial for a profit-maximizing DSO subject to a particular quality regulation design. To investigate which reinvestment projects are preferable for society and a DSO, risk-based methods are developed. With these methods, the probability of power interruptions and their consequences can be simulated. The consequences of interruptions for the DSO will to a large extent depend on the quality regulation. The consequences for the customers, and hence also society, will depend on factors such as the interruption duration and time of occurrence. The proposed risk-based methods consider extreme outage events in the risk assessments by incorporating the impact of severe weather, estimating the full probability distribution of the total reliability cost, and formulating a risk-averse strategy. Results from the case studies performed show that quality regulation design has a significant impact on reinvestment project profitability for a DSO. In order to adequately capture the financial risk that the DSO is exposed to, detailed risk-based methods, such as the ones developed in this thesis, are needed. Furthermore, when making investment decisions, a risk

  19. Estimating the reliability of glycemic index values and potential sources of methodological and biological variability.

    Science.gov (United States)

    Matthan, Nirupa R; Ausman, Lynne M; Meng, Huicui; Tighiouart, Hocine; Lichtenstein, Alice H

    2016-10-01

    The utility of glycemic index (GI) values for chronic disease risk management remains controversial. Although absolute GI value determinations for individual foods have been shown to vary significantly in individuals with diabetes, there is a dearth of data on the reliability of GI value determinations and potential sources of variability among healthy adults. We examined the intra- and inter-individual variability in glycemic response to a single food challenge and methodologic and biological factors that potentially mediate this response. The GI value for white bread was determined by using standardized methodology in 63 volunteers free from chronic disease and recruited to differ by sex, age (18-85 y), and body mass index [BMI (in kg/m²): 20-35]. Volunteers randomly underwent 3 sets of food challenges involving glucose (reference) and white bread (test food), both providing 50 g available carbohydrates. Serum glucose and insulin were monitored for 5 h postingestion, and GI values were calculated by using different area under the curve (AUC) methods. Biochemical variables were measured by using standard assays and body composition by dual-energy X-ray absorptiometry. The mean ± SD GI value for white bread was 62 ± 15 when calculated by using the recommended method. Mean intra- and interindividual CVs were 20% and 25%, respectively. Increasing sample size, replication of reference and test foods, and length of blood sampling, as well as AUC calculation method, did not improve the CVs. Among the biological factors assessed, insulin index and glycated hemoglobin values explained 15% and 16% of the variability in mean GI value for white bread, respectively. These data indicate that there is substantial variability in individual responses to GI value determinations, demonstrating that it is unlikely to be a good approach to guiding food choices. Additionally, even in healthy individuals, glycemic status significantly contributes to the variability in GI value

  20. Development of reliability centered maintenance methods and tools

    International Nuclear Information System (INIS)

    Jacquot, J.P.; Dubreuil-Chambardel, A.; Lannoy, A.; Monnier, B.

    1992-12-01

    This paper recalls the development of the RCM (Reliability Centered Maintenance) approach in the nuclear industry and describes the trial study implemented by EDF in the context of the OMF (RCM) Project. The approach developed is currently being applied to about thirty systems (Industrial Project). In parallel, R and D efforts are being maintained to improve the selectivity of the analysis methods. These methods use Probabilistic Safety Study models, thereby guaranteeing better selectivity in the identification of safety-critical elements and enhancing consistency between Maintenance and Safety studies. They also offer more detailed analysis of operating feedback, invoking for example Bayesian methods combining expert judgement and feedback data. Finally, they propose a functional and material representation of the plant. This dual representation describes both the functions assured by maintenance provisions and the material elements required for their implementation. In the final chapter, the targets of the future OMF workstation are summarized and its insertion in the EDF information system is briefly described. (authors). 5 figs., 2 tabs., 7 refs

  1. Reliability and validity of a brief method to assess nociceptive flexion reflex (NFR) threshold.

    Science.gov (United States)

    Rhudy, Jamie L; France, Christopher R

    2011-07-01

    The nociceptive flexion reflex (NFR) is a physiological tool to study spinal nociception. However, NFR assessment can take several minutes and expose participants to repeated suprathreshold stimulations. The 4 studies reported here assessed the reliability and validity of a brief method to assess NFR threshold that uses a single ascending series of stimulations (Peak 1 NFR), by comparing it to a well-validated method that uses 3 ascending/descending staircases of stimulations (Staircase NFR). Correlations between the NFR definitions were high, were on par with test-retest correlations of Staircase NFR, and were not affected by participant sex or chronic pain status. Results also indicated the test-retest reliabilities for the 2 definitions were similar. Using larger stimulus increments (4 mAs) to assess Peak 1 NFR tended to result in higher NFR threshold estimates than using the Staircase NFR definition, whereas smaller stimulus increments (2 mAs) tended to result in lower NFR threshold estimates than the Staircase NFR definition. Neither NFR definition was correlated with anxiety, pain catastrophizing, or anxiety sensitivity. In sum, a single ascending series of electrical stimulations results in a reliable and valid estimate of NFR threshold. However, caution may be warranted when comparing NFR thresholds across studies that differ in the ascending stimulus increments. This brief method to assess NFR threshold is reliable and valid; therefore, it should be useful to clinical pain researchers interested in quickly assessing inter- and intra-individual differences in spinal nociceptive processes. Copyright © 2011 American Pain Society. Published by Elsevier Inc. All rights reserved.

  2. Reliability of Lyapunov characteristic exponents computed by the two-particle method

    Science.gov (United States)

    Mei, Lijie; Huang, Li

    2018-03-01

    For highly complex problems, such as the post-Newtonian formulation of compact binaries, the two-particle method may be a better, or even the only, choice for computing the Lyapunov characteristic exponent (LCE). Compared with the variational method, this method avoids the complicated calculation of the variational equations. However, the two-particle method sometimes provides spurious estimates of LCEs. In this paper, we first analyze the equivalence of the definition of the LCE between the variational and two-particle methods for Hamiltonian systems. Then, we develop a criterion to determine the reliability of LCEs computed by the two-particle method by considering the magnitude of the initial tangent (or separation) vector ξ_0 (or δ_0), the renormalization time interval τ, the machine precision ε, and the global truncation error ε_T. The reliable Lyapunov characteristic indicators estimated by the two-particle method form a V-shaped region, which is restricted by δ_0, ε, and ε_T. Finally, the numerical experiments with the Hénon-Heiles system, the spinning compact binaries, and the post-Newtonian circular restricted three-body problem strongly support the theoretical results.
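
    The two-particle (shadow-trajectory) idea itself is easy to sketch: integrate a reference orbit and a slightly displaced orbit, and renormalize the separation at fixed intervals. The sketch below applies it to the Hénon-Heiles system named in the abstract; the initial condition, separation d0, renormalization interval tau and number of renormalizations are assumed values, not the paper's, and no reliability criterion is applied.

      import numpy as np
      from scipy.integrate import solve_ivp

      def henon_heiles(t, s):
          x, y, px, py = s
          return [px, py, -x - 2.0 * x * y, -y - x**2 + y**2]

      def largest_lce(state0, d0=1e-8, tau=1.0, n_renorm=1000):
          ref = np.array(state0, dtype=float)
          shadow = ref.copy()
          shadow[0] += d0                    # displaced "second particle"
          log_sum = 0.0
          for _ in range(n_renorm):
              for s in (ref, shadow):
                  sol = solve_ivp(henon_heiles, (0.0, tau), s, rtol=1e-10, atol=1e-12)
                  s[:] = sol.y[:, -1]
              d = np.linalg.norm(shadow - ref)
              log_sum += np.log(d / d0)
              shadow[:] = ref + (shadow - ref) * (d0 / d)   # renormalize separation
          return log_sum / (n_renorm * tau)

      # Assumed initial condition (energy ~ 0.1); the orbit choice is illustrative.
      print(f"estimated largest LCE = {largest_lce([0.0, -0.15, 0.42, 0.0]):.4f}")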

  3. An application of the fault tree analysis for the power system reliability estimation

    International Nuclear Information System (INIS)

    Volkanovski, A.; Cepin, M.; Mavko, B.

    2007-01-01

    The power system is a complex system with its main function to produce, transfer and provide consumers with electrical energy. Combinations of failures of components in the system can result in a failure of power delivery to certain load points and in some cases in a full blackout of power system. The power system reliability directly affects safe and reliable operation of nuclear power plants because the loss of offsite power is a significant contributor to the core damage frequency in probabilistic safety assessments of nuclear power plants. The method, which is based on the integration of the fault tree analysis with the analysis of the power flows in the power system, was developed and implemented for power system reliability assessment. The main contributors to the power system reliability are identified, both quantitatively and qualitatively. (author)

  4. A Simplified Procedure for Reliability Estimation of Underground Concrete Barriers against Normal Missile Impact

    Directory of Open Access Journals (Sweden)

    N. A. Siddiqui

    2011-06-01

    Full Text Available Underground concrete barriers are frequently used to protect strategic structures like nuclear power plants (NPP), deep under the soil, against any possible high-velocity missile impact. For a given range and type of missile (or projectile) it is of paramount importance to examine the reliability of underground concrete barriers under the expected uncertainties involved in the missile, concrete, and soil parameters. In this paper, a simple procedure for the reliability assessment of underground concrete barriers against normal missile impact is presented using the First Order Reliability Method (FORM). The presented procedure is illustrated by applying it to a concrete barrier that lies at a certain depth in the soil. Some parametric studies are also conducted to obtain the design values which make the barrier as reliable as desired.
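
    The sketch below shows the generic FORM machinery (the Hasofer-Lind/Rackwitz-Fiessler iteration) on a made-up two-variable perforation limit state; the penetration formula, means and standard deviations are placeholders, not the barrier model or data of the paper.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Hypothetical limit state: barrier thickness minus predicted penetration depth.
    # x = [t, v]: t = barrier thickness (m), v = missile impact velocity (m/s).
    mu = np.array([0.60, 300.0])      # means of t and v (assumed)
    sigma = np.array([0.05, 30.0])    # standard deviations (assumed)

    def g(x):
        t, v = x
        return t - 1.5e-3 * v ** 0.9  # illustrative model: g < 0 means perforation

    def grad_g(x, h=1e-6):
        """Central finite-difference gradient of g."""
        x = np.asarray(x, dtype=float)
        out = np.zeros_like(x)
        for i in range(len(x)):
            dx = np.zeros_like(x); dx[i] = h
            out[i] = (g(x + dx) - g(x - dx)) / (2 * h)
        return out

    def form_hlrf(n_iter=50, tol=1e-8):
        """FORM via the Hasofer-Lind/Rackwitz-Fiessler iteration, assuming
        independent normal variables mapped to standard normal space."""
        u = np.zeros_like(mu)                     # start at the mean point
        for _ in range(n_iter):
            x = mu + sigma * u                    # back to physical space
            gval = g(x)
            grad_u = grad_g(x) * sigma            # chain rule: dg/du = dg/dx * sigma
            u_new = ((grad_u @ u - gval) / (grad_u @ grad_u)) * grad_u
            if np.linalg.norm(u_new - u) < tol:
                u = u_new
                break
            u = u_new
        beta = np.linalg.norm(u)                  # reliability index
        return beta, norm.cdf(-beta)              # first-order failure probability

    beta, pf = form_hlrf()
    print(f"reliability index beta = {beta:.3f}, Pf ~ {pf:.2e}")
    ```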

  5. Reliability of piping system components. Framework for estimating failure parameters from service data

    International Nuclear Information System (INIS)

    Nyman, R.; Hegedus, D.; Tomic, B.; Lydell, B.

    1997-12-01

    This report summarizes results and insights from the final phase of an R and D project on piping reliability sponsored by the Swedish Nuclear Power Inspectorate (SKI). The technical scope includes the development of an analysis framework for estimating piping reliability parameters from service data. The R and D has produced a large database on the operating experience with piping systems in commercial nuclear power plants worldwide. It covers the period 1970 to the present. The scope of the work emphasized pipe failures (i.e., flaws/cracks, leaks and ruptures) in light water reactors (LWRs). Pipe failures are rare events. A data reduction format was developed to ensure that homogeneous data sets are prepared from scarce service data. This data reduction format distinguishes between reliability attributes and reliability influence factors. The quantitative results of the analysis of service data are in the form of conditional probabilities of pipe rupture given failures (flaws/cracks, leaks or ruptures) and frequencies of pipe failures. Finally, the R and D by SKI produced an analysis framework in support of practical applications of service data in PSA. This multi-purpose framework, termed 'PFCA' (Pipe Failure Cause and Attribute), defines minimum requirements on piping reliability analysis. The application of service data should reflect the requirements of an application. Together with raw data summaries, this analysis framework enables the development of a prior and a posterior pipe rupture probability distribution. The framework supports LOCA frequency estimation and steam line break frequency estimation, as well as the development of optimized in-service inspection strategies

  6. Reliability of piping system components. Framework for estimating failure parameters from service data

    Energy Technology Data Exchange (ETDEWEB)

    Nyman, R [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Hegedus, D; Tomic, B [ENCONET Consulting GesmbH, Vienna (Austria); Lydell, B [RSA Technologies, Vista, CA (United States)

    1997-12-01

    This report summarizes results and insights from the final phase of an R and D project on piping reliability sponsored by the Swedish Nuclear Power Inspectorate (SKI). The technical scope includes the development of an analysis framework for estimating piping reliability parameters from service data. The R and D has produced a large database on the operating experience with piping systems in commercial nuclear power plants worldwide. It covers the period 1970 to the present. The scope of the work emphasized pipe failures (i.e., flaws/cracks, leaks and ruptures) in light water reactors (LWRs). Pipe failures are rare events. A data reduction format was developed to ensure that homogeneous data sets are prepared from scarce service data. This data reduction format distinguishes between reliability attributes and reliability influence factors. The quantitative results of the analysis of service data are in the form of conditional probabilities of pipe rupture given failures (flaws/cracks, leaks or ruptures) and frequencies of pipe failures. Finally, the R and D by SKI produced an analysis framework in support of practical applications of service data in PSA. This multi-purpose framework, termed 'PFCA' (Pipe Failure Cause and Attribute), defines minimum requirements on piping reliability analysis. The application of service data should reflect the requirements of an application. Together with raw data summaries, this analysis framework enables the development of a prior and a posterior pipe rupture probability distribution. The framework supports LOCA frequency estimation and steam line break frequency estimation, as well as the development of optimized in-service inspection strategies. 63 refs, 30 tabs, 22 figs.
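
    A minimal sketch of the prior-to-posterior step that the framework described above supports, using standard conjugate updates on invented counts (a Beta prior for the conditional rupture probability and a Gamma prior for the failure frequency); the numbers are illustrative, not values from the SKI database.

    ```python
    from scipy import stats

    # Hypothetical service-data counts (for illustration only).
    failures = 42          # observed pipe failures (flaws/cracks/leaks/ruptures)
    ruptures = 1           # of which were ruptures
    reactor_years = 900.0  # observation period

    # Conditional probability of rupture given a failure: Beta prior -> Beta posterior.
    a0, b0 = 0.5, 49.5                       # weakly informative prior, mean 0.01 (assumed)
    p_rupture = stats.beta(a0 + ruptures, b0 + (failures - ruptures))
    print("P(rupture | failure): mean = %.4f, 95%% interval = (%.4f, %.4f)"
          % (p_rupture.mean(), *p_rupture.interval(0.95)))

    # Pipe failure frequency per reactor-year: Gamma prior -> Gamma posterior (Poisson data).
    alpha0, beta0 = 0.5, 10.0                # Jeffreys-like prior (assumed)
    freq = stats.gamma(a=alpha0 + failures, scale=1.0 / (beta0 + reactor_years))
    print("failure frequency [1/ry]: mean = %.3e" % freq.mean())

    # A point estimate of the rupture frequency combines the two.
    print("rupture frequency [1/ry]: ~%.3e" % (freq.mean() * p_rupture.mean()))
    ```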

  7. EVALUATION OF METHODS FOR ESTIMATING FATIGUE PROPERTIES APPLIED TO STAINLESS STEELS AND ALUMINUM ALLOYS

    Directory of Open Access Journals (Sweden)

    Taylor Mac Intyer Fonseca Junior

    2013-12-01

    Full Text Available This work evaluates seven methods for estimating fatigue properties applied to stainless steels and aluminum alloys. Experimental strain-life curves are compared to the estimations obtained by each method. After applying seven different estimation methods to 14 material conditions, it was found that fatigue life can be estimated with good accuracy only by the Bäumel-Seeger method for the martensitic stainless steel tempered between 300°C and 500°C. The differences between mechanical behavior during monotonic and cyclic loading are probably the reason for the absence of a reliable method for estimating fatigue behavior from monotonic properties for a group of materials.

  8. Results of a Demonstration Assessment of Passive System Reliability Utilizing the Reliability Method for Passive Systems (RMPS)

    Energy Technology Data Exchange (ETDEWEB)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia; Grelle, Austin

    2015-04-26

    Advanced small modular reactor designs include many advantageous design features such as passively driven safety systems that are arguably more reliable and cost effective relative to conventional active systems. Despite their attractiveness, a reliability assessment of passive systems can be difficult using conventional reliability methods due to the nature of passive systems. Simple deviations in boundary conditions can induce functional failures in a passive system, and intermediate or unexpected operating modes can also occur. As part of an ongoing project, Argonne National Laboratory is investigating various methodologies to address passive system reliability. The Reliability Method for Passive Systems (RMPS), a systematic approach for examining reliability, is one technique chosen for this analysis. This methodology is combined with the Risk-Informed Safety Margin Characterization (RISMC) approach to assess the reliability of a passive system and the impact of its associated uncertainties. For this demonstration problem, an integrated plant model of an advanced small modular pool-type sodium fast reactor with a passive reactor cavity cooling system is subjected to a station blackout using RELAP5-3D. This paper discusses important aspects of the reliability assessment, including deployment of the methodology, the uncertainty identification and quantification process, and identification of key risk metrics.

  9. Fusion rule estimation using vector space methods

    International Nuclear Information System (INIS)

    Rao, N.S.V.

    1997-01-01

    In a system of N sensors, sensor S_j, j = 1, 2, ..., N, outputs Y^(j) ∈ R according to an unknown probability distribution P(Y^(j)|X), corresponding to input X ∈ [0, 1]. A training n-sample (X_1, Y_1), (X_2, Y_2), ..., (X_n, Y_n) is given, where Y_i = (Y_i^(1), Y_i^(2), ..., Y_i^(N)) and Y_i^(j) is the output of S_j in response to input X_i. The problem is to estimate a fusion rule f : R^N → [0, 1], based on the sample, such that the expected square error is minimized over a family of functions F that constitutes a vector space. The function f* that minimizes the expected error cannot be computed since the underlying densities are unknown, and only an approximation f to f* is feasible. We estimate the sample size sufficient to ensure that f provides a close approximation to f* with a high probability. The advantages of vector space methods are two-fold: (a) the sample size estimate is a simple function of the dimensionality of F, and (b) the estimate f can be easily computed by well-known least squares methods in polynomial time. The results are applicable to the classical potential function methods and also to a recently proposed special class of sigmoidal feedforward neural networks
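
    A small sketch of the least-squares computation the abstract refers to, with a synthetic three-sensor setup; the sensor channels, noise levels and the affine function class F used here are all assumptions made for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic training data: N = 3 sensors observe X in [0, 1] through noisy channels.
    N, n = 3, 500
    X = rng.uniform(0.0, 1.0, n)
    gains = np.array([0.8, 1.2, 1.0])
    Y = X[:, None] * gains + rng.normal(0.0, 0.1, (n, N))   # Y[i, j] = output of sensor j

    # Vector space F: affine functions f(y) = w0 + w1*y1 + ... + wN*yN.
    # The least-squares estimate over F is an ordinary linear regression problem.
    A = np.hstack([np.ones((n, 1)), Y])
    w, *_ = np.linalg.lstsq(A, X, rcond=None)

    def fusion_rule(y):
        """Estimated fusion rule, clipped to [0, 1] as in the problem statement."""
        return float(np.clip(w[0] + y @ w[1:], 0.0, 1.0))

    # Empirical expected square error on fresh data gives a feel for how close f is to f*.
    X_test = rng.uniform(0.0, 1.0, 200)
    Y_test = X_test[:, None] * gains + rng.normal(0.0, 0.1, (200, N))
    err = np.mean([(fusion_rule(y) - x) ** 2 for y, x in zip(Y_test, X_test)])
    print("empirical expected square error:", err)
    ```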

  10. Validity and reliability of Nike + Fuelband for estimating physical activity energy expenditure.

    Science.gov (United States)

    Tucker, Wesley J; Bhammar, Dharini M; Sawyer, Brandon J; Buman, Matthew P; Gaesser, Glenn A

    2015-01-01

    The Nike + Fuelband is a commercially available, wrist-worn accelerometer used to track physical activity energy expenditure (PAEE) during exercise. However, validation studies assessing the accuracy of this device for estimating PAEE are lacking. Therefore, this study examined the validity and reliability of the Nike + Fuelband for estimating PAEE during physical activity in young adults. Secondarily, we compared PAEE estimation of the Nike + Fuelband with the previously validated SenseWear Armband (SWA). Twenty-four participants (n = 24) completed two, 60-min semi-structured routines consisting of sedentary/light-intensity, moderate-intensity, and vigorous-intensity physical activity. Participants wore a Nike + Fuelband and SWA, while oxygen uptake was measured continuously with an Oxycon Mobile (OM) metabolic measurement system (criterion). The Nike + Fuelband (ICC = 0.77) and SWA (ICC = 0.61) both demonstrated moderate to good validity. PAEE estimates provided by the Nike + Fuelband (246 ± 67 kcal) and SWA (238 ± 57 kcal) were not statistically different than OM (243 ± 67 kcal). Both devices also displayed similar mean absolute percent errors for PAEE estimates (Nike + Fuelband = 16 ± 13 %; SWA = 18 ± 18 %). Test-retest reliability for PAEE indicated good stability for Nike + Fuelband (ICC = 0.96) and SWA (ICC = 0.90). The Nike + Fuelband provided valid and reliable estimates of PAEE, that are similar to the previously validated SWA, during a routine that included approximately equal amounts of sedentary/light-, moderate- and vigorous-intensity physical activity.

  11. Reliability Estimation of Parameters of Helical Wind Turbine with Vertical Axis

    Directory of Open Access Journals (Sweden)

    Adela-Eliza Dumitrascu

    2015-01-01

    Full Text Available Due to the prolonged use of wind turbines they must be characterized by high reliability. This can be achieved through a rigorous design, appropriate simulation and testing, and proper construction. The reliability prediction and analysis of these systems will lead to identifying the critical components, increasing the operating time, minimizing failure rate, and minimizing maintenance costs. To estimate the produced energy by the wind turbine, an evaluation approach based on the Monte Carlo simulation model is developed which enables us to estimate the probability of minimum and maximum parameters. In our simulation process we used triangular distributions. The analysis of simulation results has been focused on the interpretation of the relative frequency histograms and cumulative distribution curve (ogive diagram, which indicates the probability of obtaining the daily or annual energy output depending on wind speed. The experimental researches consist in estimation of the reliability and unreliability functions and hazard rate of the helical vertical axis wind turbine designed and patented to climatic conditions for Romanian regions. Also, the variation of power produced for different wind speeds, the Weibull distribution of wind probability, and the power generated were determined. The analysis of experimental results indicates that this type of wind turbine is efficient at low wind speed.

  12. Reliability Estimation of Parameters of Helical Wind Turbine with Vertical Axis.

    Science.gov (United States)

    Dumitrascu, Adela-Eliza; Lepadatescu, Badea; Dumitrascu, Dorin-Ion; Nedelcu, Anisor; Ciobanu, Doina Valentina

    2015-01-01

    Due to the prolonged use of wind turbines they must be characterized by high reliability. This can be achieved through a rigorous design, appropriate simulation and testing, and proper construction. The reliability prediction and analysis of these systems will lead to identifying the critical components, increasing the operating time, minimizing failure rate, and minimizing maintenance costs. To estimate the produced energy by the wind turbine, an evaluation approach based on the Monte Carlo simulation model is developed which enables us to estimate the probability of minimum and maximum parameters. In our simulation process we used triangular distributions. The analysis of simulation results has been focused on the interpretation of the relative frequency histograms and cumulative distribution curve (ogive diagram), which indicates the probability of obtaining the daily or annual energy output depending on wind speed. The experimental researches consist in estimation of the reliability and unreliability functions and hazard rate of the helical vertical axis wind turbine designed and patented to climatic conditions for Romanian regions. Also, the variation of power produced for different wind speeds, the Weibull distribution of wind probability, and the power generated were determined. The analysis of experimental results indicates that this type of wind turbine is efficient at low wind speed.
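
    The evaluation approach described above reduces, in outline, to sampling the uncertain inputs from triangular distributions and building the relative frequency histogram and ogive of the simulated output. The sketch below uses invented parameter ranges and a generic power relation, not the patented turbine's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Illustrative sketch only: ranges and constants below are assumptions.
    n_sim = 100_000
    wind_speed = rng.triangular(left=2.0, mode=5.0, right=12.0, size=n_sim)   # m/s
    cp = rng.triangular(left=0.15, mode=0.25, right=0.35, size=n_sim)         # power coeff.
    rho, area, hours = 1.225, 4.0, 24.0          # air density, swept area, hours per day

    power_w = 0.5 * rho * area * cp * wind_speed ** 3       # instantaneous power [W]
    daily_energy_kwh = power_w * hours / 1000.0             # assumed constant over the day

    # Relative frequency histogram and ogive (cumulative curve), as in the paper's analysis.
    counts, edges = np.histogram(daily_energy_kwh, bins=30)
    rel_freq = counts / n_sim
    ogive = np.cumsum(rel_freq)

    p50, p90 = np.percentile(daily_energy_kwh, [50, 90])
    print(f"median daily output ~ {p50:.1f} kWh, 90th percentile ~ {p90:.1f} kWh")
    ```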

  13. Adaptive Methods for Permeability Estimation and Smart Well Management

    Energy Technology Data Exchange (ETDEWEB)

    Lien, Martha Oekland

    2005-04-01

    The main focus of this thesis is on adaptive regularization methods. We consider two different applications, the inverse problem of absolute permeability estimation and the optimal control problem of estimating smart well management. Reliable estimates of absolute permeability are crucial in order to develop a mathematical description of an oil reservoir. Due to the nature of most oil reservoirs, mainly indirect measurements are available. In this work, dynamic production data from wells are considered. More specifically, we have investigated the resolution power of pressure data for permeability estimation. The inversion of production data into permeability estimates constitutes a severely ill-posed problem. Hence, regularization techniques are required. In this work, deterministic regularization based on adaptive zonation is considered, i.e. a solution approach with adaptive multiscale estimation in conjunction with level set estimation is developed for coarse scale permeability estimation. A good mathematical reservoir model is a valuable tool for future production planning. Recent developments within well technology have given us smart wells, which yield increased flexibility in the reservoir management. In this work, we investigate the problem of finding the optimal smart well management by means of hierarchical regularization techniques based on multiscale parameterization and refinement indicators. The thesis is divided into two main parts, where Part I gives a theoretical background for a collection of research papers that have been written by the candidate in collaboration with others. These constitute the most important part of the thesis, and are presented in Part II. A brief outline of the thesis follows below. Numerical aspects concerning calculations of derivatives will also be discussed. Based on the introduction to regularization given in Chapter 2, methods for multiscale zonation, i.e. adaptive multiscale estimation and refinement

  14. The Reliability Estimation for the Open Function of Cabin Door Affected by the Imprecise Judgment Corresponding to Distribution Hypothesis

    Science.gov (United States)

    Yu, Z. P.; Yue, Z. F.; Liu, W.

    2018-05-01

    With the development of artificial intelligence, more and more reliability experts have noticed the role of subjective information in the reliability design of complex systems. Therefore, based on a certain number of experimental data and expert judgments, we have divided reliability estimation based on a distribution hypothesis into a cognition process and a reliability calculation. As an illustration of this modification, we take information fusion based on intuitionistic fuzzy belief functions as the diagnosis model of the cognition process, and complete the reliability estimation for the opening function of a cabin door affected by the imprecise judgment corresponding to the distribution hypothesis.

  15. Estimation of reliability on digital plant protection system in nuclear power plants using fault simulation with self-checking

    International Nuclear Information System (INIS)

    Lee, Jun Seok; Kim, Suk Joon; Seong, Poong Hyun

    2004-01-01

    Safety-critical digital systems in nuclear power plants require high design reliability. Reliable software design and accurate prediction methods for system reliability are therefore important problems. In reliability analysis, the error detection coverage of the system is one of the crucial factors; however, it is difficult to evaluate the error detection coverage of a digital instrumentation and control system in nuclear power plants due to the complexity of the system. To evaluate the error detection coverage with high efficiency and at low cost, simulation-based fault injection with self-checking is needed for digital instrumentation and control systems in nuclear power plants. The target system is the local coincidence logic in the digital plant protection system, and a simplified software model of this target system is used in this work. A C++-based hardware description of a microcomputer simulator system is used to evaluate the error detection coverage of the system. From the simulation results, it is possible to estimate the error detection coverage of the digital plant protection system in nuclear power plants using a simulation-based fault injection method with self-checking. (author)
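
    In outline, coverage estimation of this kind amounts to injecting many random faults into a model of the logic and counting how often the self-check flags a discrepancy. The sketch below does this for a toy 2-out-of-4 coincidence function with stuck-at faults; the fault model, vote logic and numbers are assumptions, not the paper's C++ simulator.

    ```python
    import math
    import random

    random.seed(1)

    def coincidence_2oo4(trips):
        """Simplified 2-out-of-4 local coincidence logic: trip if at least 2 channels trip."""
        return sum(trips) >= 2

    def output_with_fault(trips, channel, stuck_value):
        """Evaluate the logic with a stuck-at fault forced onto one channel input."""
        faulty = list(trips)
        faulty[channel] = stuck_value
        return coincidence_2oo4(faulty)

    def fault_detected(channel, stuck_value, n_vectors=20):
        """Self-checking modelled as comparison against a fault-free (golden) evaluation:
        the fault counts as detected if any test vector produces a mismatch."""
        for _ in range(n_vectors):
            vector = [random.random() < 0.5 for _ in range(4)]
            if output_with_fault(vector, channel, stuck_value) != coincidence_2oo4(vector):
                return True
        return False

    n_injections, detected = 5000, 0
    for _ in range(n_injections):
        channel = random.randrange(4)
        stuck_value = random.random() < 0.5
        detected += fault_detected(channel, stuck_value)

    coverage = detected / n_injections
    half_width = 1.96 * math.sqrt(coverage * (1.0 - coverage) / n_injections)
    print(f"estimated error detection coverage: {coverage:.3f} +/- {half_width:.3f}")
    ```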

  16. A MONTE-CARLO METHOD FOR ESTIMATING THE CORRELATION EXPONENT

    NARCIS (Netherlands)

    MIKOSCH, T; WANG, QA

    We propose a Monte Carlo method for estimating the correlation exponent of a stationary ergodic sequence. The estimator can be considered as a bootstrap version of the classical Hill estimator. A simulation study shows that the method yields reasonable estimates.
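
    A compact sketch of the idea (not the authors' implementation): sample interpoint distances at random, then apply a Hill-type estimator to the smallest ones, since the correlation integral behaves like C(r) ~ r^nu for small r. The Hénon map serves here as a convenient stationary test sequence.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def henon_series(n, a=1.4, b=0.3):
        """Generate points on the Henon attractor as a stationary test sequence."""
        pts = np.empty((n, 2))
        x, y = 0.1, 0.1
        for _ in range(1000):                      # discard the transient
            x, y = 1.0 - a * x * x + y, b * x
        for i in range(n):
            x, y = 1.0 - a * x * x + y, b * x
            pts[i] = (x, y)
        return pts

    def correlation_exponent(points, n_pairs=200_000, k=2000):
        """Monte Carlo / Hill-type estimate of the correlation exponent.

        Interpoint distances are sampled at random (the Monte Carlo part); a Hill-type
        estimator is then applied to the k smallest distances."""
        n = len(points)
        i = rng.integers(0, n, n_pairs)
        j = rng.integers(0, n, n_pairs)
        keep = i != j
        d = np.linalg.norm(points[i[keep]] - points[j[keep]], axis=1)
        d = np.sort(d)[:k + 1]                     # k smallest distances plus the threshold
        return k / np.sum(np.log(d[k] / d[:k]))

    pts = henon_series(20_000)
    # Values around 1.2 are commonly cited for the Henon attractor's correlation dimension.
    print("estimated correlation exponent:", correlation_exponent(pts))
    ```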

  17. Methods to estimate the genetic risk

    International Nuclear Information System (INIS)

    Ehling, U.H.

    1989-01-01

    The estimation of the radiation-induced genetic risk to human populations is based on the extrapolation of results from animal experiments. Radiation-induced mutations are stochastic events. The probability of the event depends on the dose; the degree of the damage does not. There are two main approaches to making genetic risk estimates. One of these, termed the direct method, expresses risk in terms of expected frequencies of genetic changes induced per unit dose. The other, referred to as the doubling dose method or the indirect method, expresses risk in relation to the observed incidence of genetic disorders now present in man. The advantage of the indirect method is that not only can Mendelian mutations be quantified, but also other types of genetic disorders. The disadvantages of the method are the uncertainties in determining the current incidence of genetic disorders in humans and, in addition, the estimation of the genetic component of congenital anomalies, anomalies expressed later, and constitutional and degenerative diseases. Using the direct method we estimated that 20-50 dominant radiation-induced mutations would be expected in 19 000 offspring born to parents exposed in Hiroshima and Nagasaki, but only a small proportion of these mutants would have been detected with the techniques used for the population study. These methods were used to predict the genetic damage from the fallout of the reactor accident at Chernobyl in the vicinity of Southern Germany. The lack of knowledge about the interaction of chemicals with ionizing radiation and the discrepancy between the high safety standards for radiation protection and the low level of knowledge for the toxicological evaluation of chemical mutagens will be emphasized. (author)

  18. Application of Fault Tree Analysis for Estimating Temperature Alarm Circuit Reliability

    International Nuclear Information System (INIS)

    El-Shanshoury, A.I.; El-Shanshoury, G.I.

    2011-01-01

    Fault Tree Analysis (FTA) is one of the most widely used methods in system reliability analysis. It is a graphical technique that provides a systematic description of the combinations of possible occurrences in a system which can result in an undesirable outcome. The present paper deals with the application of the FTA method to the analysis of a temperature alarm circuit. The critical failure of this circuit is failing to alarm when the temperature exceeds a certain limit. In order for the circuit to be safe, a detailed analysis of the faults causing circuit failure is performed by constructing a fault tree diagram (qualitative analysis). Calculations of quantitative circuit reliability parameters such as the Failure Rate (FR) and the Mean Time Between Failures (MTBF) are also carried out using the Relex 2009 computer program. Benefits of FTA include assessing system reliability or safety during operation, improving understanding of the system, and identifying root causes of equipment failures
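
    The quantitative step reduces, in its simplest form, to combining constant component failure rates for an OR-type top event and converting to MTBF; the sketch below uses invented rates, not the Relex 2009 results of the paper.

    ```python
    import math

    # Placeholder component failure rates (failures per hour); assumed for illustration.
    component_failure_rates = {
        "thermocouple": 2.0e-6,
        "comparator": 5.0e-7,
        "relay": 1.0e-6,
        "buzzer": 8.0e-7,
    }

    # For an OR gate (any single component failure defeats the alarm), the top-event
    # rate is approximately the sum of the component rates when rates are small.
    top_rate = sum(component_failure_rates.values())
    mtbf_hours = 1.0 / top_rate

    mission_time = 10_000.0                                     # hours
    unreliability = 1.0 - math.exp(-top_rate * mission_time)    # exponential model

    print(f"top-event failure rate = {top_rate:.2e} /h")
    print(f"MTBF = {mtbf_hours:,.0f} h")
    print(f"probability of failure over {mission_time:.0f} h = {unreliability:.3f}")
    ```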

  19. Radioisotope method potentialities in machine reliability and durability enhancement

    International Nuclear Information System (INIS)

    Postnikov, V.I.

    1975-01-01

    The development of a surface activation method is reviewed with regard to the wear of machine parts. Examples demonstrating the highly promising aspects and practical application of the method are cited. The use of high-sensitivity instruments and variation of the activation depth from 10 um to 0.5 mm makes it possible to perform investigations with a sensitivity of 0.05 um and to estimate the linear values of machine wear. Standard diagrams are presented for measuring the wear of different machine parts by means of surface activation. Investigations performed at several Soviet technological institutes yield a set of dependences which characterize the in-depth distribution of radioactive isotopes under different activation conditions for diverse metals and alloys and permit the study of the wear of any metal

  20. Accuracy and Reliability of the Klales et al. (2012) Morphoscopic Pelvic Sexing Method.

    Science.gov (United States)

    Lesciotto, Kate M; Doershuk, Lily J

    2018-01-01

    Klales et al. (2012) devised an ordinal scoring system for the morphoscopic pelvic traits described by Phenice (1969) and used for sex estimation of skeletal remains. The aim of this study was to test the accuracy and reliability of the Klales method using a large sample from the Hamann-Todd collection (n = 279). Two observers were blinded to sex, ancestry, and age and used the Klales et al. method to estimate the sex of each individual. Sex was correctly estimated for females with over 95% accuracy; however, the male allocation accuracy was approximately 50%. Weighted Cohen's kappa and intraclass correlation coefficient analysis for evaluating intra- and interobserver error showed moderate to substantial agreement for all traits. Although each trait can be reliably scored using the Klales method, low accuracy rates and high sex bias indicate better trait descriptions and visual guides are necessary to more accurately reflect the range of morphological variation. © 2017 American Academy of Forensic Sciences.

  1. ESTIMATION OF PARAMETERS AND RELIABILITY FUNCTION OF EXPONENTIATED EXPONENTIAL DISTRIBUTION: BAYESIAN APPROACH UNDER GENERAL ENTROPY LOSS FUNCTION

    Directory of Open Access Journals (Sweden)

    Sanjay Kumar Singh

    2011-06-01

    Full Text Available In this paper we propose Bayes estimators of the parameters of the exponentiated exponential distribution and its reliability function under the general entropy loss function for Type II censored samples. The proposed estimators are compared with the corresponding Bayes estimators obtained under the squared error loss function and with maximum likelihood estimators in terms of their simulated risks (average loss over the sample space).
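
    Given posterior draws of a parameter θ, the two Bayes estimators being compared have simple closed forms: the posterior mean under squared error loss, and [E(θ^-p)]^(-1/p) under the general entropy loss. The sketch below evaluates both from stand-in Gamma posterior draws; the actual posterior for the exponentiated exponential model under Type II censoring is derived in the paper and is not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def bayes_general_entropy(theta_samples, p):
        """Bayes estimator under the general entropy loss
        L(d, theta) ~ (d/theta)^p - p*ln(d/theta) - 1, namely [E(theta^-p)]^(-1/p)."""
        return np.mean(theta_samples ** (-p)) ** (-1.0 / p)

    def bayes_squared_error(theta_samples):
        """Bayes estimator under squared error loss is simply the posterior mean."""
        return np.mean(theta_samples)

    # Stand-in posterior draws for a shape parameter (a Gamma posterior is assumed here
    # purely for illustration).
    theta_post = rng.gamma(shape=25.0, scale=0.08, size=100_000)

    for p in (-1.0, 0.5, 1.0, 2.0):
        print(f"p = {p:+.1f}: GE estimate = {bayes_general_entropy(theta_post, p):.4f}")
    print(f"SEL estimate (posterior mean) = {bayes_squared_error(theta_post):.4f}")
    ```

    Note that p = -1 recovers the squared-error (posterior mean) estimator, which makes the comparison across loss functions easy to sanity-check.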

  2. Methods for qualification of highly reliable software - international procedure

    International Nuclear Information System (INIS)

    Kersken, M.

    1997-01-01

    Despite the advantages of computer-assisted safety technology, some uneasiness can still be observed with respect to these novel processes, resulting from the absence of a body of generally accepted and uncontentious qualification guides (regulatory provisions, standards) for the safety evaluation of the computer codes applied. Assurance of adequate protection of the population, operators or plant components is an essential aspect in this context, too - as it is in general with reliability and risk assessment of novel technology - so that, with appropriate legislation still missing, there currently is a licensing risk involved in the introduction of digital safety systems. Nevertheless, there is a degree of agreement within the international community and among utility operators about what standards and measures should be applied for the qualification of software of relevance to plant safety. The standard IEC 880 /IEC 86/ in particular, in its original version, or national documents based on this standard, are applied in all countries using or planning to install such systems. A novel supplement to this standard, document /IEC 96/, is in the process of finalization and defines the requirements to be met by modern methods of software engineering. (orig./DG) [de

  3. Uncertainty analysis of nonlinear systems employing the first-order reliability method

    International Nuclear Information System (INIS)

    Choi, Chan Kyu; Yoo, Hong Hee

    2012-01-01

    In most mechanical systems, properties of the system elements have uncertainties for several reasons. For example, the mass, the stiffness coefficient of a spring, the damping coefficient of a damper or friction coefficients have uncertain characteristics. The uncertain characteristics of the elements have a direct effect on the uncertainty of the system performance. It is very important to estimate the performance uncertainty since it is directly related to manufacturing yield and consumer satisfaction; for this reason, the performance uncertainty should be estimated accurately and considered in the system design. In this paper, performance measures are defined for nonlinear vibration systems and the performance measure uncertainties are estimated employing the first order reliability method (FORM). It was found that the FORM could provide good results in spite of the nonlinear characteristics of the system. By comparison with the results obtained by Monte Carlo simulation (MCS), the accuracy of the uncertainty analysis results obtained by the FORM is validated

  4. Network reliability analysis of complex systems using a non-simulation-based method

    International Nuclear Information System (INIS)

    Kim, Youngsuk; Kang, Won-Hee

    2013-01-01

    Civil infrastructures such as transportation, water supply, sewers, telecommunications, and electrical and gas networks often establish highly complex networks, due to their multiple source and distribution nodes, complex topology, and functional interdependence between network components. To understand the reliability of such complex network system under catastrophic events such as earthquakes and to provide proper emergency management actions under such situation, efficient and accurate reliability analysis methods are necessary. In this paper, a non-simulation-based network reliability analysis method is developed based on the Recursive Decomposition Algorithm (RDA) for risk assessment of generic networks whose operation is defined by the connections of multiple initial and terminal node pairs. The proposed method has two separate decomposition processes for two logical functions, intersection and union, and combinations of these processes are used for the decomposition of any general system event with multiple node pairs. The proposed method is illustrated through numerical network examples with a variety of system definitions, and is applied to a benchmark gas transmission pipe network in Memphis TN to estimate the seismic performance and functional degradation of the network under a set of earthquake scenarios.
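
    For a feel of the quantity being computed, the sketch below evaluates the two-terminal reliability of a toy five-edge network by brute-force state enumeration; the recursive decomposition algorithm of the paper obtains the same number without enumerating all 2^m component states, which is what makes it usable on networks the size of the Memphis benchmark. Topology and probabilities here are invented.

    ```python
    from itertools import product

    # Small example network: edges (components) with survival probabilities (assumed).
    edges = {("s", "a"): 0.95, ("s", "b"): 0.90, ("a", "b"): 0.85,
             ("a", "t"): 0.90, ("b", "t"): 0.95}

    def connected(up_edges, src, dst):
        """Depth-first search over the surviving (undirected) edges."""
        frontier, seen = [src], {src}
        while frontier:
            node = frontier.pop()
            for (u, v) in up_edges:
                for nxt in ((v,) if u == node else (u,) if v == node else ()):
                    if nxt not in seen:
                        seen.add(nxt)
                        frontier.append(nxt)
        return dst in seen

    def two_terminal_reliability(src="s", dst="t"):
        """Exact reliability by brute-force enumeration of all 2^m edge states."""
        edge_list = list(edges)
        total = 0.0
        for states in product([True, False], repeat=len(edge_list)):
            prob = 1.0
            for e, up in zip(edge_list, states):
                prob *= edges[e] if up else 1.0 - edges[e]
            if connected([e for e, up in zip(edge_list, states) if up], src, dst):
                total += prob
        return total

    print("P(source connected to terminal):", round(two_terminal_reliability(), 5))
    ```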

  5. A fracture mechanics and reliability based method to assess non-destructive testings for pressure vessels

    International Nuclear Information System (INIS)

    Kitagawa, Hideo; Hisada, Toshiaki

    1979-01-01

    Quantitative evaluation has not previously been made of the effects of carrying out preservice and in-service nondestructive tests, on which large expense and labor are spent, for securing the soundness, safety and maintainability of pressure vessels. In particular, the problems concerning the timing and interval of in-service inspections lack a reasonable, quantitative evaluation method. In this paper, these problems are treated with an analysis method developed on the basis of reliability technology and probability theory. The growth of surface cracks in pressure vessels was estimated using the results of previous studies. The effects of nondestructive inspection on the defects in pressure vessels were evaluated, and the influences of many factors, such as plate thickness, stress and the accuracy of inspection, on the effects of inspection were investigated, together with a method for evaluating inspections at unequal intervals. The reliability analysis taking in-service inspection into consideration, the evaluation of in-service inspection and other affecting factors through typical examples of analysis, and a review concerning the timing of inspection are described. The method of analyzing the reliability of pressure vessels, considering the growth of defects and preservice and in-service nondestructive tests, could thus be systematized so as to be practically usable. (Kako, I.)

  6. Safety and reliability analysis based on nonprobabilistic methods

    International Nuclear Information System (INIS)

    Kozin, I.O.; Petersen, K.E.

    1996-01-01

    Imprecise probabilities, developed during the last two decades, offer a considerably more general theory with many advantages which make it very promising for reliability and safety analysis. The objective of the paper is to argue that imprecise probabilities are a more appropriate tool for reliability and safety analysis, that they allow the behavior of nuclear industry objects to be modelled more comprehensively, and that they make it possible to solve some problems that remain unsolved in the framework of the conventional approach. Furthermore, some specific examples are given from which we can see the usefulness of the tool for solving some reliability tasks

  7. Availability estimation of repairable systems using reliability graph with general gates (RGGG)

    International Nuclear Information System (INIS)

    Goh, Gyoung Tae

    2009-02-01

    By performing risk analysis, we may obtain sufficient information about a system to redesign it and lower the probability of the occurrence of an accident or mitigate the ensuing consequences. The concept of reliability is widely used to express the risk of systems, but reliability applies to non-repairable systems, whereas nuclear power plant systems are repairable. With repairable systems, repairable components can improve the availability of a system because faults generated in components can be recovered. Hence, availability is the more appropriate concept for repairable systems. The reliability graph with general gates (RGGG) is one of the methods for system reliability analysis and is very intuitive compared with other methods, but it has not yet been applied to repairable systems. The objective of this study is to extend the RGGG in order to enable one to analyze repairable systems. Determining the probability table for each node is a critical step in calculating the system availability with the RGGG method; therefore, finding the proper algorithms and constructing probability tables for various situations are a major part of this study. The other part is an example of applying the RGGG method to a real system. We find the proper algorithms and probability tables for independent repairable systems, dependent series repairable systems, and k-out-of-m (K/M) redundant parallel repairable systems. The availability of a real system can be evaluated using these probability tables, and an example for a real system is shown in the latter part of this study. For the purpose of this analysis, the charging pumps subsystem of the chemical and volume control system (CVCS) was selected. The RGGG method extended for repairable systems has the same intuitiveness as the original RGGG method, and we can confirm that the availability analysis result from the repairable RGGG method is exact
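
    A minimal sketch of the quantities involved: steady-state availability of a repairable component from MTTF and MTTR, combined for series and k-out-of-m parallel arrangements. The structure and numbers are invented for illustration and are not the CVCS probability tables developed in the thesis.

    ```python
    from math import comb

    def availability(mttf_hours, mttr_hours):
        """Steady-state availability of a single repairable component."""
        return mttf_hours / (mttf_hours + mttr_hours)

    def series(avails):
        """A series arrangement is available only if every component is available."""
        a = 1.0
        for x in avails:
            a *= x
        return a

    def k_out_of_m(a, k, m):
        """K/M redundant parallel arrangement of identical components with availability a."""
        return sum(comb(m, i) * a**i * (1 - a)**(m - i) for i in range(k, m + 1))

    # Illustrative numbers for a charging-pump-like subsystem (assumed, not CVCS data):
    pump_a = availability(mttf_hours=8000.0, mttr_hours=24.0)
    valve_a = availability(mttf_hours=50000.0, mttr_hours=8.0)

    # e.g. 2-out-of-3 pumps in parallel, in series with a common suction valve.
    system_a = series([k_out_of_m(pump_a, k=2, m=3), valve_a])
    print(f"pump availability = {pump_a:.5f}, system availability = {system_a:.5f}")
    ```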

  8. Review of methods for the integration of reliability and design engineering

    International Nuclear Information System (INIS)

    Reilly, J.T.

    1978-03-01

    A review of methods for the integration of reliability and design engineering was carried out to establish a reliability program philosophy, an initial set of methods, and procedures to be used by both the designer and reliability analyst. The report outlines a set of procedures which implements a philosophy that requires increased involvement by the designer in reliability analysis. Discussions of each method reviewed include examples of its application

  9. Control and estimation methods over communication networks

    CERN Document Server

    Mahmoud, Magdi S

    2014-01-01

    This book provides a rigorous framework in which to study problems in the analysis, stability and design of networked control systems. Four dominant sources of difficulty are considered: packet dropouts, communication bandwidth constraints, parametric uncertainty, and time delays. Past methods and results are reviewed from a contemporary perspective, present trends are examined, and future possibilities proposed. Emphasis is placed on robust and reliable design methods. New control strategies for improving the efficiency of sensor data processing and reducing associated time delay are presented. The coverage provided features: an overall assessment of recent and current fault-tolerant control algorithms; treatment of several issues arising at the junction of control and communications; key concepts followed by their proofs and efficient computational methods for their implementation; and simulation examples (including TrueTime simulations) to...

  10. Method of core thermodynamic reliability determination in pressurized water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ackermann, G.; Horche, W. (Ingenieurhochschule Zittau (German Democratic Republic). Sektion Kraftwerksanlagenbau und Energieumwandlung)

    1983-01-01

    A statistical model appropriate to determine the thermodynamic reliability and the power-limiting parameter of PWR cores is described for cases of accidental transients. The model is compared with the hot channel model hitherto applied.

  11. Method of core thermodynamic reliability determination in pressurized water reactors

    International Nuclear Information System (INIS)

    Ackermann, G.; Horche, W.

    1983-01-01

    A statistical model appropriate to determine the thermodynamic reliability and the power-limiting parameter of PWR cores is described for cases of accidental transients. The model is compared with the hot channel model hitherto applied. (author)

  12. Reliability estimation of an N-M-cold-standby redundancy system in a multicomponent stress-strength model with generalized half-logistic distribution

    Science.gov (United States)

    Liu, Yiming; Shi, Yimin; Bai, Xuchao; Zhan, Pei

    2018-01-01

    In this paper, we study the estimation of the reliability of a multicomponent system, named the N-M-cold-standby redundancy system, based on a progressive Type-II censored sample. In the system, there are N subsystems consisting of M statistically independently distributed strength components, and only one of these subsystems works under the impact of stresses at a time while the others remain as standbys. Whenever the working subsystem fails, one of the standbys takes its place. The system fails when all of the subsystems have failed. It is supposed that the underlying distributions of random strength and stress both belong to the generalized half-logistic distribution with different shape parameters. The reliability of the system is estimated using both classical and Bayesian statistical inference. The uniformly minimum variance unbiased estimator and the maximum likelihood estimator for the reliability of the system are derived. Under the squared error loss function, the exact expression of the Bayes estimator for the reliability of the system is developed using the Gauss hypergeometric function. The asymptotic confidence interval and corresponding coverage probabilities are derived based on both the Fisher and the observed information matrices. The approximate highest probability density credible interval is constructed using the Monte Carlo method. Monte Carlo simulations are performed to compare the performances of the proposed reliability estimators. A real data set is also analyzed for an illustration of the findings.
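
    A rough Monte Carlo sketch of the quantity being estimated, under explicit simplifying assumptions: an exponentiated form of the half-logistic CDF is assumed for strength and stress, each standby subsystem is assumed to face its own stress draw and to survive only if all M component strengths exceed it, and the system survives if at least one subsystem survives. None of this reproduces the paper's derivations; it only illustrates the reliability structure.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def gen_half_logistic(alpha, size):
        """Draw from an exponentiated (generalized) half-logistic distribution with
        CDF F(x) = [(1 - exp(-x)) / (1 + exp(-x))]**alpha, x > 0, via inversion.
        (One of several parameterizations in the literature; assumed here.)"""
        t = rng.uniform(size=size) ** (1.0 / alpha)
        return np.log((1.0 + t) / (1.0 - t))

    def reliability_mc(N, M, alpha_strength, alpha_stress, n_sim=200_000):
        """Monte Carlo estimate of system reliability for an N-M cold-standby layout
        under the simplifying assumptions stated above."""
        strengths = gen_half_logistic(alpha_strength, (n_sim, N, M))
        stresses = gen_half_logistic(alpha_stress, (n_sim, N, 1))
        subsystem_ok = (strengths > stresses).all(axis=2)   # all M components hold
        system_ok = subsystem_ok.any(axis=1)                # at least one subsystem survives
        return system_ok.mean()

    print("estimated reliability:",
          reliability_mc(N=3, M=2, alpha_strength=3.0, alpha_stress=1.0))
    ```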

  13. Characteristics and application study of AP1000 NPPs equipment reliability classification method

    International Nuclear Information System (INIS)

    Guan Gao

    2013-01-01

    The AP1000 nuclear power plant applies an integrated approach to establish equipment reliability classification, which includes the probabilistic risk assessment technique, maintenance rule administration, power production reliability classification and the functional equipment group bounding method, and eventually classifies equipment reliability into 4 levels. This classification process and its result are very different from classical RCM and streamlined RCM. This paper studies the characteristics of the AP1000 equipment reliability classification approach, considers that equipment reliability classification should effectively support maintenance strategy development and work process control, and recommends using a combined RCM method to establish the future equipment reliability program of AP1000 nuclear power plants. (authors)

  14. Online Reliable Peak Charge/Discharge Power Estimation of Series-Connected Lithium-Ion Battery Packs

    Directory of Open Access Journals (Sweden)

    Bo Jiang

    2017-03-01

    Full Text Available The accurate peak power estimation of a battery pack is essential to the power-train control of electric vehicles (EVs. It helps to evaluate the maximum charge and discharge capability of the battery system, and thus to optimally control the power-train system to meet the requirement of acceleration, gradient climbing and regenerative braking while achieving a high energy efficiency. A novel online peak power estimation method for series-connected lithium-ion battery packs is proposed, which considers the influence of cell difference on the peak power of the battery packs. A new parameter identification algorithm based on adaptive ratio vectors is designed to online identify the parameters of each individual cell in a series-connected battery pack. The ratio vectors reflecting cell difference are deduced strictly based on the analysis of battery characteristics. Based on the online parameter identification, the peak power estimation considering cell difference is further developed. Some validation experiments in different battery aging conditions and with different current profiles have been implemented to verify the proposed method. The results indicate that the ratio vector-based identification algorithm can achieve the same accuracy as the repetitive RLS (recursive least squares based identification while evidently reducing the computation cost, and the proposed peak power estimation method is more effective and reliable for series-connected battery packs due to the consideration of cell difference.
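
    The baseline that the proposed ratio-vector scheme is compared against is recursive least squares applied cell by cell. The sketch below shows plain RLS with a forgetting factor identifying a made-up first-order voltage model from synthetic data; the model structure, parameter values and noise levels are assumptions, and the adaptive ratio-vector algorithm itself is not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    class RecursiveLeastSquares:
        """Plain RLS with a forgetting factor (the repetitive baseline, not the paper's
        ratio-vector variant)."""
        def __init__(self, n_params, lam=0.99, p0=1e3):
            self.theta = np.zeros(n_params)
            self.P = np.eye(n_params) * p0
            self.lam = lam

        def update(self, phi, y):
            phi = np.asarray(phi, dtype=float)
            k = self.P @ phi / (self.lam + phi @ self.P @ phi)    # gain vector
            self.theta = self.theta + k * (y - phi @ self.theta)  # parameter update
            self.P = (self.P - np.outer(k, phi) @ self.P) / self.lam
            return self.theta

    # Synthetic "cell": terminal voltage modeled as v_k = a*v_{k-1} + b*i_k + c + noise.
    a_true, b_true, c_true = 0.95, -0.02, 0.18   # assumed, for illustration only
    v = 3.6
    rls = RecursiveLeastSquares(3)
    for k in range(2000):
        i_k = 2.0 * np.sin(0.01 * k) + rng.normal(0, 0.2)   # exciting current profile
        v_new = a_true * v + b_true * i_k + c_true + rng.normal(0, 1e-3)
        rls.update([v, i_k, 1.0], v_new)
        v = v_new

    print("identified [a, b, c]:", np.round(rls.theta, 4))
    ```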

  15. Calculation of the reliability of large complex systems by the relevant path method

    International Nuclear Information System (INIS)

    Richter, G.

    1975-03-01

    In this paper, analytical methods are presented and tested with which the probabilistic reliability data of technical systems can be determined for given fault trees and block diagrams and known reliability data of the components. (orig./AK) [de

  16. Reliability of a semi-quantitative method for dermal exposure assessment (DREAM)

    NARCIS (Netherlands)

    Wendel de Joode, B. van; Hemmen, J.J. van; Meijster, T.; Major, V.; London, L.; Kromhout, H.

    2005-01-01

    Valid and reliable semi-quantitative dermal exposure assessment methods for epidemiological research and for occupational hygiene practice, applicable for different chemical agents, are practically nonexistent. The aim of this study was to assess the reliability of a recently developed

  17. A Reliability Assessment Method for the VHTR Safety Systems

    International Nuclear Information System (INIS)

    Lee, Hyung Sok; Jae, Moo Sung; Kim, Yong Wan

    2011-01-01

    The passive safety system of the very high temperature reactor, which has attracted worldwide attention in recent decades, is a safety system introduced to improve the safety of next-generation nuclear power plant designs. Passive system functionality does not rely on an external source of energy, but on an intelligent use of natural phenomena, such as gravity, conduction and radiation, which are always present. Because of these features, it is difficult to evaluate passive safety with the risk analysis methodology developed for failures of the existing active systems. Therefore, a new reliability methodology has to be considered. In this study, a preliminary evaluation and conceptualization are attempted by applying the load and capacity concept from the reliability physics model, designing a new passive system analysis methodology, and applying it in a trial to a paper plant.
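
    The load-capacity idea mentioned above has a simple closed form when load and capacity are modelled as independent normal variables; the sketch below computes it for invented numbers standing in for a passive heat-removal function (the distributions and values are assumptions, not results of the trial application).

    ```python
    from math import sqrt
    from statistics import NormalDist

    def interference_reliability(mu_load, sd_load, mu_cap, sd_cap):
        """Load-capacity (stress-strength) reliability for independent normal load L and
        capacity C: R = P(C > L) = Phi((mu_C - mu_L) / sqrt(sd_C^2 + sd_L^2))."""
        beta = (mu_cap - mu_load) / sqrt(sd_cap**2 + sd_load**2)
        return NormalDist().cdf(beta), beta

    # Illustrative numbers: load = required heat removal rate, capacity = achievable
    # natural-circulation removal (both assumed, in arbitrary consistent units).
    reliability, beta = interference_reliability(mu_load=700.0, sd_load=60.0,
                                                 mu_cap=900.0, sd_cap=80.0)
    print(f"safety margin index beta = {beta:.2f}, reliability = {reliability:.5f}")
    ```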

  18. A dynamic particle filter-support vector regression method for reliability prediction

    International Nuclear Information System (INIS)

    Wei, Zhao; Tao, Tao; ZhuoShu, Ding; Zio, Enrico

    2013-01-01

    Support vector regression (SVR) has been applied to time series prediction and some works have demonstrated the feasibility of its use to forecast system reliability. For accuracy of reliability forecasting, the selection of SVR's parameters is important. The existing research works on SVR's parameters selection divide the example dataset into training and test subsets, and tune the parameters on the training data. However, these fixed parameters can lead to poor prediction capabilities if the data of the test subset differ significantly from those of training. Differently, the novel method proposed in this paper uses particle filtering to estimate the SVR model parameters according to the whole measurement sequence up to the last observation instance. By treating the SVR training model as the observation equation of a particle filter, our method allows updating the SVR model parameters dynamically when a new observation comes. Because of the adaptability of the parameters to dynamic data pattern, the new PF–SVR method has superior prediction performance over that of standard SVR. Four application results show that PF–SVR is more robust than SVR to the decrease of the number of training data and the change of initial SVR parameter values. Also, even if there are trends in the test data different from those in the training data, the method can capture the changes, correct the SVR parameters and obtain good predictions. -- Highlights: •A dynamic PF–SVR method is proposed to predict the system reliability. •The method can adjust the SVR parameters according to the change of data. •The method is robust to the size of training data and initial parameter values. •Some cases based on both artificial and real data are studied. •PF–SVR shows superior prediction performance over standard SVR

  19. Preliminary investigation on reliability of genomic estimated breeding values in the Danish and Swedish Holstein Population

    DEFF Research Database (Denmark)

    Su, G; Guldbrandtsen, B; Gregersen, V R

    2010-01-01

    This study investigated the reliability of genomic estimated breeding values (GEBV) in the Danish Holstein population. The data in the analysis included 3,330 bulls with both published conventional EBV and single nucleotide polymorphism (SNP) markers. After data editing, 38,134 SNP markers were available. In the analysis, all SNP were fitted simultaneously as random effects in a Bayesian variable selection model, which allows heterogeneous variances for different SNP markers. The response variables were the official EBV. Direct GEBV were calculated as the sum of individual SNP effects ... or no effects, and a single prior distribution common for all SNP. It was found that, in general, the model with a common prior distribution of scaling factors had better predictive ability than any mixture prior models. Therefore, a common prior model was used to estimate SNP effects and breeding values ...

  20. Reliability estimate of unconfined compressive strength of black cotton soil stabilized with cement and quarry dust

    Directory of Open Access Journals (Sweden)

    Dayo Oluwatoyin AKANBI

    2017-06-01

    Full Text Available Reliability estimates of unconfined compressive strength values from laboratory results for specimens of quarry dust and cement treated black cotton soil, compacted at British Standard Light (BSL) effort for use as road sub-base material, were developed by incorporating data obtained from unconfined compressive strength (UCS) laboratory tests into a predictive model. The data obtained were incorporated into a FORTRAN-based first-order reliability program to obtain reliability index values. Variable factors such as water content relative to optimum (WRO), hydraulic modulus (HM), quarry dust (QD), cement (C), tri-calcium silicate (C3S), di-calcium silicate (C2S), tri-calcium aluminate (C3A) and maximum dry density (MDD) produced an acceptable safety index value of 1.0, achieved at coefficient of variation (COV) ranges of 10-100%. Observed trends indicate that WRO, C3S, C2S and MDD are greatly influenced by the COV and therefore must be strictly controlled in QD/C treated black cotton soil for use as sub-base material in road pavements. Stochastically, British Standard Light (BSL) compaction can be used to model the 7-day unconfined compressive strength of compacted quarry dust/cement treated black cotton soil as a sub-base material for road pavement at all coefficients of variation (COV) in the range 10-100%, because the safety index values obtained are higher than the acceptable value of 1.0.

  1. Hybrid time-variant reliability estimation for active control structures under aleatory and epistemic uncertainties

    Science.gov (United States)

    Wang, Lei; Xiong, Chuang; Wang, Xiaojun; Li, Yunlong; Xu, Menghui

    2018-04-01

    Considering that multi-source uncertainties from inherent nature as well as the external environment are unavoidable and severely affect the controller performance, dynamic safety assessment with high confidence is of great significance for scientists and engineers. In view of this, the uncertainty quantification analysis and time-variant reliability estimation corresponding to the closed-loop control problems are conducted in this study under a mixture of random, interval, and convex uncertainties. By combining the state-space transformation and the natural set expansion, the boundary laws of controlled response histories are first confirmed with specific implementation of random items. For nonlinear cases, the collocation set methodology and the fourth-order Runge-Kutta algorithm are introduced as well. Inspired by the first-passage model in random process theory as well as by static probabilistic reliability ideas, a new definition of the hybrid time-variant reliability measurement is provided for the vibration control systems and the related solution details are further expounded. Two engineering examples are eventually presented to demonstrate the validity and applicability of the methodology developed.

  2. Regional inversion of CO2 ecosystem fluxes from atmospheric measurements. Reliability of the uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Broquet, G.; Chevallier, F.; Breon, F.M.; Yver, C.; Ciais, P.; Ramonet, M.; Schmidt, M. [Laboratoire des Sciences du Climat et de l' Environnement, CEA-CNRS-UVSQ, UMR8212, IPSL, Gif-sur-Yvette (France); Alemanno, M. [Servizio Meteorologico dell' Aeronautica Militare Italiana, Centro Aeronautica Militare di Montagna, Monte Cimone/Sestola (Italy); Apadula, F. [Research on Energy Systems, RSE, Environment and Sustainable Development Department, Milano (Italy); Hammer, S. [Universitaet Heidelberg, Institut fuer Umweltphysik, Heidelberg (Germany); Haszpra, L. [Hungarian Meteorological Service, Budapest (Hungary); Meinhardt, F. [Federal Environmental Agency, Kirchzarten (Germany); Necki, J. [AGH University of Science and Technology, Krakow (Poland); Piacentino, S. [ENEA, Laboratory for Earth Observations and Analyses, Palermo (Italy); Thompson, R.L. [Max Planck Institute for Biogeochemistry, Jena (Germany); Vermeulen, A.T. [Energy research Centre of the Netherlands ECN, EEE-EA, Petten (Netherlands)

    2013-07-01

    The Bayesian framework of CO2 flux inversions permits estimates of the retrieved flux uncertainties. Here, the reliability of these theoretical estimates is studied through a comparison against the misfits between the inverted fluxes and independent measurements of the CO2 Net Ecosystem Exchange (NEE) made by the eddy covariance technique at local (few hectares) scale. Regional inversions at 0.5° resolution are applied for the western European domain where approximately 50 eddy covariance sites are operated. These inversions are conducted for the period 2002-2007. They use a mesoscale atmospheric transport model, a prior estimate of the NEE from a terrestrial ecosystem model and rely on the variational assimilation of in situ continuous measurements of CO2 atmospheric mole fractions. Averaged over monthly periods and over the whole domain, the misfits are in good agreement with the theoretical uncertainties for prior and inverted NEE, and pass the chi-square test for the variance at the 30% and 5% significance levels respectively, despite the scale mismatch and the independence between the prior (respectively inverted) NEE and the flux measurements. The theoretical uncertainty reduction for the monthly NEE at the measurement sites is 53% while the inversion decreases the standard deviation of the misfits by 38%. These results build confidence in the NEE estimates at the European/monthly scales and in their theoretical uncertainty from the regional inverse modelling system. However, the uncertainties at the monthly (respectively annual) scale remain larger than the amplitude of the inter-annual variability of monthly (respectively annual) fluxes, so that this study does not engender confidence in the inter-annual variations. The uncertainties at the monthly scale are significantly smaller than the seasonal variations. The seasonal cycle of the inverted fluxes is thus reliable. In particular, the CO2 sink period over the European continent likely ends later than

  3. Bayesian Reliability Estimation for Deteriorating Systems with Limited Samples Using the Maximum Entropy Approach

    OpenAIRE

    Xiao, Ning-Cong; Li, Yan-Feng; Wang, Zhonglai; Peng, Weiwen; Huang, Hong-Zhong

    2013-01-01

    In this paper a combination of the maximum entropy method and Bayesian inference for the reliability assessment of deteriorating systems is proposed. Due to various uncertainties, scarce data and incomplete information, system parameters usually cannot be determined precisely. These uncertain parameters can be modeled by fuzzy set theory and Bayesian inference, which have been proved to be useful for deteriorating systems under small sample sizes. The maximum entropy approach can be used to cal...

  4. Comparison of different methods for estimation of potential evapotranspiration

    International Nuclear Information System (INIS)

    Nazeer, M.

    2010-01-01

    Evapotranspiration can be estimated with different available methods. The aim of this research study is to compare and evaluate the potential evapotranspiration measured from a Class A pan against the Hargreaves equation, the Penman equation, the Penman-Monteith equation, and the FAO56 Penman-Monteith equation. The evaporation rate recorded from the pan was greater than that given by the stated methods. For each evapotranspiration method, results were compared against mean monthly potential evapotranspiration (PET) from pan data according to FAO (ETo = Kpan × Epan), using daily measured data for twenty-five years (1984-2008). On the basis of statistical analysis, the differences between the pan data and the FAO56 Penman-Monteith method are not considered to be significant (R² = 0.98) at the 95% confidence and prediction intervals. All methods require accurate weather data for precise results; for the purpose of this study, the past twenty-five years of data were analyzed and used, including maximum and minimum air temperature, relative humidity, wind speed, sunshine duration and rainfall. Based on linear regression analysis results, the FAO56 Penman-Monteith method ranked first (R² = 0.98), followed by the Hargreaves method (R² = 0.96), the Penman-Monteith method (R² = 0.94) and the Penman method (R² = 0.93). Obviously, using the FAO56 Penman-Monteith method with precise climatic variables for ETo estimation is more reliable than the other alternative methods; Hargreaves is simpler, relies only on air temperature data, and can be used as an alternative to the FAO56 Penman-Monteith method if other climatic data are missing or unreliable. (author)
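
    Two of the estimators compared above have one-line forms: the pan-based value ETo = Kpan × Epan and the temperature-based Hargreaves equation. The sketch below implements both with placeholder inputs (the pan coefficient, temperatures and extraterrestrial radiation are assumed values, not station data from the study).

    ```python
    from math import sqrt

    def eto_from_pan(e_pan_mm, k_pan=0.7):
        """Pan-based reference evapotranspiration, ETo = Kpan * Epan; Kpan depends on
        pan siting and the humidity/wind class (0.7 is a typical assumed value)."""
        return k_pan * e_pan_mm

    def eto_hargreaves(t_mean, t_max, t_min, ra_mm_per_day):
        """Hargreaves (temperature-based) equation:
        ETo = 0.0023 * Ra * (Tmean + 17.8) * sqrt(Tmax - Tmin),
        with Ra expressed as equivalent evaporation (mm/day)."""
        return 0.0023 * ra_mm_per_day * (t_mean + 17.8) * sqrt(t_max - t_min)

    # Example day (placeholder values):
    print("pan-based ETo  [mm/day]:", round(eto_from_pan(e_pan_mm=8.0), 2))
    print("Hargreaves ETo [mm/day]:",
          round(eto_hargreaves(t_mean=28.0, t_max=35.0, t_min=21.0, ra_mm_per_day=15.0), 2))
    ```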

  5. Reliability-Based Shape Optimization using Stochastic Finite Element Methods

    DEFF Research Database (Denmark)

    Enevoldsen, Ib; Sørensen, John Dalsgaard; Sigurdsson, G.

    1991-01-01

    stochastic fields (e.g. loads and material parameters such as Young's modulus and the Poisson ratio). In this case stochastic finite element techniques combined with FORM analysis can be used to obtain measures of the reliability of the structural systems, see Der Kiureghian & Ke (6) and Liu & Der Kiureghian...

  6. RELIABILITY AND ACCURACY ASSESSMENT OF INVASIVE AND NON- INVASIVE SEISMIC METHODS FOR SITE CHARACTERIZATION: FEEDBACK FROM THE INTERPACIFIC PROJECT

    OpenAIRE

    Garofalo , F.; Foti , S.; Hollender , F.; Bard , P.-Y.; Cornou , C.; Cox , B.R.; Dechamp , A.; Ohrnberger , M.; Sicilia , D.; Vergniault , C.

    2017-01-01

    International audience; The InterPacific project (Intercomparison of methods for site parameter and velocity profile characterization) aims to assess the reliability of seismic site characterization methods (borehole and surface wave methods) used for estimating shear wave velocity (VS) profiles and other related parameters (e.g., VS30). Three sites, representative of different geological conditions relevant for the evaluation of seismic site response effects, have been selected: (1) a hard r...

  7. ImageJ: A Free, Easy, and Reliable Method to Measure Leg Ulcers Using Digital Pictures.

    Science.gov (United States)

    Aragón-Sánchez, Javier; Quintana-Marrero, Yurena; Aragón-Hernández, Cristina; Hernández-Herero, María José

    2017-12-01

    Wound measurement to document the healing course of chronic leg ulcers has an important role in the management of these patients. Digital cameras in smartphones are readily available and easy to use, and taking pictures of wounds is becoming a routine in specialized departments. Analyzing digital pictures with appropriate software provides clinicians a quick, clean, and easy-to-use tool for measuring wound area. A set of 25 digital pictures of plain foot and leg ulcers was the basis of this study. Photographs were taken placing a ruler next to the wound in parallel with the healthy skin with the iPhone 6S (Apple Inc, Cupertino, CA), which has a camera of 12 megapixels using the flash. The digital photographs were visualized with ImageJ 1.45s freeware (National Institutes of Health, Rockville, MD; http://imagej.net/ImageJ). Wound area measurement was carried out by 4 raters: head of the department, wound care nurse, physician, and medical student. We assessed intra- and interrater reliability using the intraclass correlation coefficient. To determine intraobserver reliability, 2 of the raters repeated the measurement of the set 1 week after the first reading. The interrater model displayed an intraclass correlation coefficient of 0.99 with 95% confidence interval of 0.999 to 1.000, showing excellent reliability. The intrarater model of both examiners showed excellent reliability. In conclusion, analyzing digital images of leg ulcers with ImageJ estimates wound area with excellent reliability. This method provides a free, rapid, and accurate way to measure wounds and could routinely be used to document wound healing in daily clinical practice.

  8. Numerical methods for reliability and safety assessment multiscale and multiphysics systems

    CERN Document Server

    Hami, Abdelkhalak

    2015-01-01

    This book offers unique insight on structural safety and reliability by combining computational methods that address multiphysics problems, involving multiple equations describing different physical phenomena, and multiscale problems, involving discrete sub-problems that together describe important aspects of a system at multiple scales. The book examines a range of engineering domains and problems using dynamic analysis, nonlinear methods, error estimation, finite element analysis, and other computational techniques. This book also: introduces novel numerical methods; illustrates new practical applications; examines recent engineering applications; presents up-to-date theoretical results; and offers perspective relevant to a wide audience, including teaching faculty/graduate students, researchers, and practicing engineers.

  9. Approximation of the Monte Carlo Sampling Method for Reliability Analysis of Structures

    Directory of Open Access Journals (Sweden)

    Mahdi Shadab Far

    2016-01-01

    Full Text Available Structural load types, on the one hand, and structural capacity to withstand these loads, on the other hand, are of a probabilistic nature as they cannot be calculated and presented in a fully deterministic way. As such, the past few decades have witnessed the development of numerous probabilistic approaches towards the analysis and design of structures. Among the conventional methods used to assess structural reliability, the Monte Carlo sampling method has proved to be very convenient and efficient. However, it does suffer from certain disadvantages, the biggest one being the requirement of a very large number of samples to handle small probabilities, leading to a high computational cost. In this paper, a simple algorithm was proposed to estimate low failure probabilities using a small number of samples in conjunction with the Monte Carlo method. This revised approach was then presented in a step-by-step flowchart, for the purpose of easy programming and implementation.
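
    To make the sample-size issue discussed above concrete, here is a minimal sketch of the plain (crude) Monte Carlo estimator of a failure probability together with its standard error; it illustrates the baseline that the paper's revised algorithm improves on, not the proposed algorithm itself, and the limit-state function and distributions are hypothetical.

```python
import numpy as np

def mc_failure_probability(g, sample_inputs, n_samples=100_000, seed=0):
    """Crude Monte Carlo estimate of P(g(X) <= 0), i.e. the failure probability.

    g             : limit-state function; failure is defined as g(x) <= 0
    sample_inputs : function returning an (n, d) array of random input samples
    """
    rng = np.random.default_rng(seed)
    x = sample_inputs(rng, n_samples)
    failures = g(x) <= 0.0
    pf = failures.mean()
    # Standard error of the estimator; it shrinks slowly relative to pf as pf -> 0,
    # which is why small probabilities need very many samples.
    se = np.sqrt(pf * (1.0 - pf) / n_samples)
    return pf, se

# Hypothetical example: resistance R ~ N(10, 1.5) vs load S ~ N(6, 1.0); failure when R - S <= 0
def sample(rng, n):
    r = rng.normal(10.0, 1.5, n)
    s = rng.normal(6.0, 1.0, n)
    return np.column_stack([r, s])

pf, se = mc_failure_probability(lambda x: x[:, 0] - x[:, 1], sample)
print(pf, se)
```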

  10. Method of estimation of scanning system quality

    Science.gov (United States)

    Larkin, Eugene; Kotov, Vladislav; Kotova, Natalya; Privalov, Alexander

    2018-04-01

    Estimation of scanner parameters is an important part of developing an electronic document management system. This paper suggests considering the scanner as a system that contains two main channels: a photoelectric conversion channel and a channel for measuring the spatial coordinates of objects. Although both channels consist of the same elements, the testing of their parameters should be executed separately. A special structure of the two-dimensional reference signal is offered for this purpose. In this structure, the fields for testing the various parameters of the scanner are spatially separated. The characteristics of the scanner are associated with the loss of information when a document is digitized. Methods to test grayscale transmitting ability, resolution and aberration level are offered.

  11. Estimating Ordinal Reliability for Likert-Type and Ordinal Item Response Data: A Conceptual, Empirical, and Practical Guide

    Science.gov (United States)

    Gadermann, Anne M.; Guhn, Martin; Zumbo, Bruno D.

    2012-01-01

    This paper provides a conceptual, empirical, and practical guide for estimating ordinal reliability coefficients for ordinal item response data (also referred to as Likert, Likert-type, ordered categorical, or rating scale item responses). Conventionally, reliability coefficients, such as Cronbach's alpha, are calculated using a Pearson…
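
    For context, the conventional (Pearson-based) Cronbach's alpha that the guide contrasts with ordinal alternatives can be computed as below; this sketch does not implement the ordinal coefficients themselves, which require polychoric correlations, and the toy responses are hypothetical.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Conventional (Pearson-based) Cronbach's alpha.

    item_scores : (n_respondents, n_items) array of item responses.
    """
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    item_variances = x.var(axis=0, ddof=1)
    total_variance = x.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_variances.sum() / total_variance)

# Toy 5-point Likert responses from 6 respondents on 4 items (illustrative only)
responses = [[4, 5, 4, 4],
             [3, 3, 2, 3],
             [5, 5, 4, 5],
             [2, 2, 3, 2],
             [4, 4, 4, 5],
             [3, 2, 3, 3]]
print(round(cronbach_alpha(responses), 3))
```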

  12. Usefulness of the Monte Carlo method in reliability calculations

    International Nuclear Information System (INIS)

    Lanore, J.M.; Kalli, H.

    1977-01-01

    Three examples of reliability Monte Carlo programs developed in the LEP (Laboratory for Radiation Shielding Studies in the Nuclear Research Center at Saclay) are presented. First, an uncertainty analysis is given for a simplified spray system; a Monte Carlo program PATREC-MC has been written to solve the problem with the system components given in the fault tree representation. The second program MONARC 2 has been written to solve the problem of complex systems reliability by the Monte Carlo simulation, here again the system (a residual heat removal system) is in the fault tree representation. Third, the Monte Carlo program MONARC was used instead of the Markov diagram to solve the simulation problem of an electric power supply including two nets and two stand-by diesels

  13. Reliability improvement methods for sapphire fiber temperature sensors

    Science.gov (United States)

    Schietinger, C.; Adams, B.

    1991-08-01

    Mechanical, optical, electrical, and software design improvements can be brought to bear in the enhancement of fiber-optic sapphire-fiber temperature measurement tool reliability in harsh environments. The optical fiber thermometry (OFT) equipment discussed is used in numerous process industries and generally involves a sapphire sensor, an optical transmission cable, and a microprocessor-based signal analyzer. OFT technology incorporating sensors for corrosive environments, hybrid sensors, and two-wavelength measurements, are discussed.

  14. Study on Feasibility of Applying Function Approximation Moment Method to Achieve Reliability-Based Design Optimization

    International Nuclear Information System (INIS)

    Huh, Jae Sung; Kwak, Byung Man

    2011-01-01

    Robust optimization or reliability-based design optimization are some of the methodologies that are employed to take into account the uncertainties of a system at the design stage. For applying such methodologies to solve industrial problems, accurate and efficient methods for estimating statistical moments and failure probability are required, and further, the results of sensitivity analysis, which is needed for searching direction during the optimization process, should also be accurate. The aim of this study is to employ the function approximation moment method into the sensitivity analysis formulation, which is expressed as an integral form, to verify the accuracy of the sensitivity results, and to solve a typical problem of reliability-based design optimization. These results are compared with those of other moment methods, and the feasibility of the function approximation moment method is verified. The sensitivity analysis formula with integral form is the efficient formulation for evaluating sensitivity because any additional function calculation is not needed provided the failure probability or statistical moments are calculated

  15. Girsanov's transformation based variance reduced Monte Carlo simulation schemes for reliability estimation in nonlinear stochastic dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Kanjilal, Oindrila, E-mail: oindrila@civil.iisc.ernet.in; Manohar, C.S., E-mail: manohar@civil.iisc.ernet.in

    2017-07-15

    The study considers the problem of simulation based time variant reliability analysis of nonlinear randomly excited dynamical systems. Attention is focused on importance sampling strategies based on the application of Girsanov's transformation method. Controls which minimize the distance function, as in the first order reliability method (FORM), are shown to minimize a bound on the sampling variance of the estimator for the probability of failure. Two schemes based on the application of calculus of variations for selecting control signals are proposed: the first obtains the control force as the solution of a two-point nonlinear boundary value problem, and, the second explores the application of the Volterra series in characterizing the controls. The relative merits of these schemes, vis-à-vis the method based on ideas from the FORM, are discussed. Illustrative examples, involving archetypal single degree of freedom (dof) nonlinear oscillators, and a multi-degree of freedom nonlinear dynamical system, are presented. The credentials of the proposed procedures are established by comparing the solutions with pertinent results from direct Monte Carlo simulations. - Highlights: • The distance minimizing control forces minimize a bound on the sampling variance. • Establishing Girsanov controls via solution of a two-point boundary value problem. • Girsanov controls via Volterra's series representation for the transfer functions.

  16. Reliability analysis of reactor systems by applying probability method; Analiza pouzdanosti reaktorskih sistema primenom metoda verovatnoce

    Energy Technology Data Exchange (ETDEWEB)

    Milivojevic, S [Institute of Nuclear Sciences Boris Kidric, Vinca, Beograd (Serbia and Montenegro)

    1974-12-15

    The probability method chosen for analysing reactor system reliability is considered realistic since it is based on verified experimental data. In fact this is a statistical method. The probability method developed takes into account the probability distribution of permitted levels of relevant parameters and their particular influence on the reliability of the system as a whole. The proposed method is rather general, and was used for the problem of thermal safety analysis of a reactor system. This analysis makes it possible to analyze basic properties of the system under different operating conditions, expressed in the form of probabilities; they show the reliability of the system as a whole as well as the reliability of each component.

  17. A practical approach for calculating reliable cost estimates from observational data: application to cost analyses in maternal and child health.

    Science.gov (United States)

    Salemi, Jason L; Comins, Meg M; Chandler, Kristen; Mogos, Mulubrhan F; Salihu, Hamisu M

    2013-08-01

    Comparative effectiveness research (CER) and cost-effectiveness analysis are valuable tools for informing health policy and clinical care decisions. Despite the increased availability of rich observational databases with economic measures, few researchers have the skills needed to conduct valid and reliable cost analyses for CER. The objectives of this paper are to (i) describe a practical approach for calculating cost estimates from hospital charges in discharge data using publicly available hospital cost reports, and (ii) assess the impact of using different methods for cost estimation in maternal and child health (MCH) studies by conducting economic analyses on gestational diabetes (GDM) and pre-pregnancy overweight/obesity. In Florida, we have constructed a clinically enhanced, longitudinal, encounter-level MCH database covering over 2.3 million infants (and their mothers) born alive from 1998 to 2009. Using this as a template, we describe a detailed methodology to use publicly available data to calculate hospital-wide and department-specific cost-to-charge ratios (CCRs), link them to the master database, and convert reported hospital charges to refined cost estimates. We then conduct an economic analysis as a case study on women by GDM and pre-pregnancy body mass index (BMI) status to compare the impact of using different methods on cost estimation. Over 60 % of inpatient charges for birth hospitalizations came from the nursery/labor/delivery units, which have very different cost-to-charge markups (CCR = 0.70) than the commonly substituted hospital average (CCR = 0.29). Using estimated mean, per-person maternal hospitalization costs for women with GDM as an example, unadjusted charges ($US14,696) grossly overestimated actual cost, compared with hospital-wide ($US3,498) and department-level ($US4,986) CCR adjustments. However, the refined cost estimation method, although more accurate, did not alter our conclusions that infant/maternal hospitalization costs
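
    A minimal sketch of the charge-to-cost conversion idea described above, using the hospital-wide and nursery/labor/delivery cost-to-charge ratios quoted in the abstract; the department names and charge amounts are hypothetical placeholders, not study data.

```python
def estimate_cost(charges_by_department, ccr_by_department, hospital_wide_ccr):
    """Convert reported hospital charges to cost estimates with cost-to-charge ratios (CCRs).

    charges_by_department : dict of department -> billed charges (USD)
    ccr_by_department     : dict of department -> department-specific CCR
    hospital_wide_ccr     : fallback ratio used when no department CCR is available
    """
    total = 0.0
    for dept, charge in charges_by_department.items():
        ccr = ccr_by_department.get(dept, hospital_wide_ccr)
        total += charge * ccr
    return total

# Illustrative charges for one hospitalization (hypothetical numbers, not study data)
charges = {"nursery/labor/delivery": 9000.0, "pharmacy": 2000.0, "laboratory": 1500.0}
dept_ccr = {"nursery/labor/delivery": 0.70}   # department-specific markup quoted in the abstract
print(estimate_cost(charges, dept_ccr, hospital_wide_ccr=0.29))
```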

  18. Reliability Evaluation of Bridges Based on Nonprobabilistic Response Surface Limit Method

    Directory of Open Access Journals (Sweden)

    Xuyong Chen

    2017-01-01

    Full Text Available Due to many uncertainties in nonprobabilistic reliability assessment of bridges, the limit state function is generally unknown. The traditional nonprobabilistic response surface method is a lengthy and oscillating iteration process and leads to difficulty in solving the nonprobabilistic reliability index. This article proposes a nonprobabilistic response surface limit method based on the interval model. The intention of this method is to solve the upper and lower limits of the nonprobabilistic reliability index and to narrow the range of the nonprobabilistic reliability index. If the range of the reliability index reduces to an acceptable accuracy, the solution will be considered convergent, and the nonprobabilistic reliability index will be obtained. The case study indicates that using the proposed method can avoid an oscillating iteration process, make the iteration process stable and convergent, reduce iteration steps significantly, and improve computational efficiency and precision significantly compared with the traditional nonprobabilistic response surface method. Finally, the nonprobabilistic reliability evaluation process for bridges is built by evaluating the reliability of one PC continuous rigid frame bridge with three spans using the proposed method, which appears to be simpler and more reliable when samples and parameters are lacking in the bridge nonprobabilistic reliability evaluation.

  19. Regulatory relevant and reliable methods and data for determining the environmental fate of manufactured nanomaterials

    DEFF Research Database (Denmark)

    Baun, Anders; Sayre, Phil; Steinhäuser, Klaus Günter

    2017-01-01

    The widespread use of manufactured nanomaterials (MN) increases the need for describing and predicting their environmental fate and behaviour. A number of recent reviews have addressed the scientific challenges in disclosing the governing processes for the environmental fate and behaviour of MNs; however, there has been less focus on the regulatory adequacy of the data available for MN. The aim of this paper is therefore to review data, testing protocols and guidance papers which describe the environmental fate and behaviour of MN with a focus on their regulatory reliability and relevance. Gaps do however exist in test methods for environmental fate, such as methods to estimate heteroagglomeration and the tendency for MNs to transform in the environment.

  20. Validity and Reliability of the Brazilian Version of the Rapid Estimate of Adult Literacy in Dentistry – BREALD-30

    Science.gov (United States)

    Junkes, Monica C.; Fraiz, Fabian C.; Sardenberg, Fernanda; Lee, Jessica Y.; Paiva, Saul M.; Ferreira, Fernanda M.

    2015-01-01

    Objective The aim of the present study was to translate, perform the cross-cultural adaptation of the Rapid Estimate of Adult Literacy in Dentistry to Brazilian-Portuguese language and test the reliability and validity of this version. Methods After translation and cross-cultural adaptation, interviews were conducted with 258 parents/caregivers of children in treatment at the pediatric dentistry clinics and health units in Curitiba, Brazil. To test the instrument's validity, the scores of Brazilian Rapid Estimate of Adult Literacy in Dentistry (BREALD-30) were compared based on occupation, monthly household income, educational attainment, general literacy, use of dental services and three dental outcomes. Results The BREALD-30 demonstrated good internal reliability. Cronbach’s alpha ranged from 0.88 to 0.89 when words were deleted individually. The analysis of test-retest reliability revealed excellent reproducibility (intraclass correlation coefficient = 0.983 and Kappa coefficient ranging from moderate to nearly perfect). In the bivariate analysis, BREALD-30 scores were significantly correlated with the level of general literacy (rs = 0.593) and income (rs = 0.327) and significantly associated with occupation, educational attainment, use of dental services, self-rated oral health and the respondent’s perception regarding his/her child's oral health. However, only the association between the BREALD-30 score and the respondent’s perception regarding his/her child's oral health remained significant in the multivariate analysis. Conclusion The BREALD-30 demonstrated satisfactory psychometric properties and is therefore applicable to adults in Brazil. PMID:26158724

  1. A Method for Estimating Surveillance Video Georeferences

    Directory of Open Access Journals (Sweden)

    Aleksandar Milosavljević

    2017-07-01

    Full Text Available The integration of a surveillance camera video with a three-dimensional (3D geographic information system (GIS requires the georeferencing of that video. Since a video consists of separate frames, each frame must be georeferenced. To georeference a video frame, we rely on the information about the camera view at the moment that the frame was captured. A camera view in 3D space is completely determined by the camera position, orientation, and field-of-view. Since the accurate measuring of these parameters can be extremely difficult, in this paper we propose a method for their estimation based on matching video frame coordinates of certain point features with their 3D geographic locations. To obtain these coordinates, we rely on high-resolution orthophotos and digital elevation models (DEM of the area of interest. Once an adequate number of points are matched, Levenberg–Marquardt iterative optimization is applied to find the most suitable video frame georeference, i.e., position and orientation of the camera.

  2. Assessment and Improving Methods of Reliability Indices in Bakhtar Regional Electricity Company

    Directory of Open Access Journals (Sweden)

    Saeed Shahrezaei

    2013-04-01

    Full Text Available Reliability of a system is the ability of the system to perform its expected duties in the future, i.e., the probability of desirable operation in carrying out predetermined duties. Failure data of power system elements are the main data for reliability assessment of the network. The goal of reliability assessment is to determine the relevant parameters from system history data. These parameters help to recognize weak points of the system. In other words, the goal of reliability assessment is to improve operation and to decrease failures and power outages. This paper assesses the reliability indices of Bakhtar Regional Electricity Company up to 1393 and the improving methods and their effects on the reliability indices in this network. DIgSILENT Power Factory software is employed for simulation. Simulation results show the positive effect of the improving methods on the reliability indices of Bakhtar Regional Electricity Company.

  3. Determinants of the reliability of ultrasound tomography sound speed estimates as a surrogate for volumetric breast density

    Energy Technology Data Exchange (ETDEWEB)

    Khodr, Zeina G.; Pfeiffer, Ruth M.; Gierach, Gretchen L., E-mail: GierachG@mail.nih.gov [Department of Health and Human Services, Division of Cancer Epidemiology and Genetics, National Cancer Institute, 9609 Medical Center Drive MSC 9774, Bethesda, Maryland 20892 (United States); Sak, Mark A.; Bey-Knight, Lisa [Karmanos Cancer Institute, Wayne State University, 4100 John R, Detroit, Michigan 48201 (United States); Duric, Nebojsa; Littrup, Peter [Karmanos Cancer Institute, Wayne State University, 4100 John R, Detroit, Michigan 48201 and Delphinus Medical Technologies, 46701 Commerce Center Drive, Plymouth, Michigan 48170 (United States); Ali, Haythem; Vallieres, Patricia [Henry Ford Health System, 2799 W Grand Boulevard, Detroit, Michigan 48202 (United States); Sherman, Mark E. [Division of Cancer Prevention, National Cancer Institute, Department of Health and Human Services, 9609 Medical Center Drive MSC 9774, Bethesda, Maryland 20892 (United States)

    2015-10-15

    Purpose: High breast density, as measured by mammography, is associated with increased breast cancer risk, but standard methods of assessment have limitations including 2D representation of breast tissue, distortion due to breast compression, and use of ionizing radiation. Ultrasound tomography (UST) is a novel imaging method that averts these limitations and uses sound speed measures rather than x-ray imaging to estimate breast density. The authors evaluated the reproducibility of measures of speed of sound and changes in this parameter using UST. Methods: One experienced and five newly trained raters measured sound speed in serial UST scans for 22 women (two scans per person) to assess inter-rater reliability. Intrarater reliability was assessed for four raters. A random effects model was used to calculate the percent variation in sound speed and change in sound speed attributable to subject, scan, rater, and repeat reads. The authors estimated the intraclass correlation coefficients (ICCs) for these measures based on data from the authors’ experienced rater. Results: Median (range) time between baseline and follow-up UST scans was five (1–13) months. Contributions of factors to sound speed variance were differences between subjects (86.0%), baseline versus follow-up scans (7.5%), inter-rater evaluations (1.1%), and intrarater reproducibility (∼0%). When evaluating change in sound speed between scans, 2.7% and ∼0% of variation were attributed to inter- and intrarater variation, respectively. For the experienced rater’s repeat reads, agreement for sound speed was excellent (ICC = 93.4%) and for change in sound speed substantial (ICC = 70.4%), indicating very good reproducibility of these measures. Conclusions: UST provided highly reproducible sound speed measurements, which reflect breast density, suggesting that UST has utility in sensitively assessing change in density.
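
    As an illustration of the reproducibility measure reported above, the following sketch computes a simple one-way intraclass correlation from repeated reads; the study itself used a fuller random effects model with scan and rater terms, and the sound-speed values below are hypothetical.

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random-effects intraclass correlation, ICC(1).

    ratings : (n_subjects, k_reads) array of repeated measurements per subject.
    """
    y = np.asarray(ratings, dtype=float)
    n, k = y.shape
    grand_mean = y.mean()
    subject_means = y.mean(axis=1)
    ms_between = k * ((subject_means - grand_mean) ** 2).sum() / (n - 1)
    ms_within = ((y - subject_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical repeat sound-speed reads (m/s) for five subjects, two reads each
reads = [[1465.2, 1466.0], [1510.4, 1509.8], [1478.1, 1477.5],
         [1532.9, 1533.4], [1490.0, 1491.2]]
print(round(icc_oneway(reads), 3))
```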

  4. Method for assessing reliability of a network considering probabilistic safety assessment

    International Nuclear Information System (INIS)

    Cepin, M.

    2005-01-01

    A method for assessment of reliability of the network is developed, which uses the features of the fault tree analysis. The method is developed in a way that the increase of the network under consideration does not require significant increase of the model. The method is applied to small examples of network consisting of a small number of nodes and a small number of their connections. The results give the network reliability. They identify equipment, which is to be carefully maintained in order that the network reliability is not reduced, and equipment, which is a candidate for redundancy, as this would improve network reliability significantly. (author)

  5. RELIABILITY ASSESSMENT OF ENTROPY METHOD FOR SYSTEM CONSISTED OF IDENTICAL EXPONENTIAL UNITS

    Institute of Scientific and Technical Information of China (English)

    Sun Youchao; Shi Jun

    2004-01-01

    The reliability assessment of unit-system near two levels is the most important content in the reliability multi-level synthesis of complex systems. Introducing the information theory into system reliability assessment, using the addible characteristic of information quantity and the principle of equivalence of information quantity, an entropy method of data information conversion is presented for the system consisted of identical exponential units. The basic conversion formulae of entropy method of unit test data are derived based on the principle of information quantity equivalence. The general models of entropy method synthesis assessment for system reliability approximate lower limits are established according to the fundamental principle of the unit reliability assessment. The applications of the entropy method are discussed by way of practical examples. Compared with the traditional methods, the entropy method is found to be valid and practicable and the assessment results are very satisfactory.

  6. Reliability of different sampling densities for estimating and mapping lichen diversity in biomonitoring studies

    International Nuclear Information System (INIS)

    Ferretti, M.; Brambilla, E.; Brunialti, G.; Fornasier, F.; Mazzali, C.; Giordani, P.; Nimis, P.L.

    2004-01-01

    Sampling requirements related to lichen biomonitoring include optimal sampling density for obtaining precise and unbiased estimates of population parameters and maps of known reliability. Two available datasets on a sub-national scale in Italy were used to determine a cost-effective sampling density to be adopted in medium-to-large-scale biomonitoring studies. As expected, the relative error in the mean Lichen Biodiversity (Italian acronym: BL) values and the error associated with the interpolation of BL values for (unmeasured) grid cells increased as the sampling density decreased. However, the increase in size of the error was not linear and even a considerable reduction (up to 50%) in the original sampling effort led to a far smaller increase in errors in the mean estimates (<6%) and in mapping (<18%) as compared with the original sampling densities. A reduction in the sampling effort can result in considerable savings of resources, which can then be used for a more detailed investigation of potentially problematic areas. It is, however, necessary to decide the acceptable level of precision at the design stage of the investigation, so as to select the proper sampling density. - An acceptable level of precision must be decided before determining a sampling design

  7. The Reliability, Impact, and Cost-Effectiveness of Value-Added Teacher Assessment Methods

    Science.gov (United States)

    Yeh, Stuart S.

    2012-01-01

    This article reviews evidence regarding the intertemporal reliability of teacher rankings based on value-added methods. Value-added methods exhibit low reliability, yet are broadly supported by prominent educational researchers and are increasingly being used to evaluate and fire teachers. The article then presents a cost-effectiveness analysis…

  8. Reliability testing of tendon disease using two different scanning methods in patients with rheumatoid arthritis

    DEFF Research Database (Denmark)

    Bruyn, George A W; Möller, Ingrid; Garrido, Jesus

    2012-01-01

    To assess the intra- and interobserver reliability of musculoskeletal ultrasonography (US) in detecting inflammatory and destructive tendon abnormalities in patients with RA using two different scanning methods.

  9. Model correction factor method for reliability problems involving integrals of non-Gaussian random fields

    DEFF Research Database (Denmark)

    Franchin, P.; Ditlevsen, Ove Dalager; Kiureghian, Armen Der

    2002-01-01

    The model correction factor method (MCFM) is used in conjunction with the first-order reliability method (FORM) to solve structural reliability problems involving integrals of non-Gaussian random fields. The approach replaces the limit-state function with an idealized one, in which the integrals ...

  10. Fast Reliability Assessing Method for Distribution Network with Distributed Renewable Energy Generation

    Science.gov (United States)

    Chen, Fan; Huang, Shaoxiong; Ding, Jinjin; Ding, Jinjin; Gao, Bo; Xie, Yuguang; Wang, Xiaoming

    2018-01-01

    This paper proposes a fast reliability assessment method for distribution grids with distributed renewable energy generation. First, the Weibull distribution and the Beta distribution are used to describe the probability distribution characteristics of wind speed and solar irradiance respectively, and the models of wind farm, solar park and local load are built for reliability assessment. Then, based on power system production cost simulation, probability discretization and linearized power flow, an optimal power flow problem with the objective of minimum conventional generation cost is solved. Thus a reliability assessment for the distribution grid is implemented quickly and accurately. The Loss Of Load Probability (LOLP) and Expected Energy Not Supplied (EENS) are selected as the reliability indices; a simulation of the IEEE RBTS BUS6 system in MATLAB indicates that the fast reliability assessment method calculates the reliability indices much faster, with accuracy ensured, when compared with the Monte Carlo method.
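
    The sketch below only illustrates how Weibull wind-speed and Beta irradiance models feed the LOLP and EENS indices named in the abstract; the capacities, power curves and load profile are invented placeholders, and the paper's production cost simulation and optimal power flow step are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(42)
n_hours = 8760  # one simulated year at hourly resolution

# Renewable resource models named in the abstract: Weibull wind speed, Beta solar irradiance
wind_speed = rng.weibull(2.0, n_hours) * 7.5    # shape k = 2, scale ~7.5 m/s (assumed)
irradiance = rng.beta(2.0, 2.0, n_hours)        # normalized 0-1 irradiance (assumed)

# Very simplified conversion to available power (MW); the curves are placeholders
wind_power = np.clip((wind_speed - 3.0) / (12.0 - 3.0), 0.0, 1.0) * 20.0   # 20 MW wind farm
solar_power = irradiance * 10.0                                            # 10 MW solar park
conventional = 30.0                                                        # firm capacity, MW

load = 40.0 + 10.0 * np.sin(2 * np.pi * np.arange(n_hours) / 24.0)         # toy load profile

shortfall = np.maximum(load - (wind_power + solar_power + conventional), 0.0)
lolp = (shortfall > 0).mean()   # Loss Of Load Probability
eens = shortfall.sum()          # Expected Energy Not Supplied (MWh over the year)
print(round(lolp, 4), round(eens, 1))
```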

  11. A comparative study on the HW reliability assessment methods for digital I and C equipment

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Hoan Sung; Sung, T. Y.; Eom, H. S.; Park, J. K.; Kang, H. G.; Lee, G. Y. [Korea Atomic Energy Research Institute, Taejeon (Korea); Kim, M. C. [Korea Advanced Institute of Science and Technology, Taejeon (Korea); Jun, S. T. [KHNP, Taejeon (Korea)

    2002-03-01

    It is necessary to predict or to evaluate the reliability of electronic equipment for the probabilistic safety analysis of digital instrumentation and control equipment. However, most databases for reliability prediction have no data for up-to-date equipment, and the failure modes are not classified. The prediction results for a specific component show different values depending on the method and database used, and for boards and systems each method likewise shows different values than the others. This study concerns the reliability prediction of the PDC system for Wolsong NPP1 as a piece of digital I and C equipment. Various reliability prediction methods and failure databases are used in calculating the reliability, in order to compare the sensitivity and accuracy of each model and database. Many considerations for the reliability assessment of digital systems are derived from the results of this study. 14 refs., 19 figs., 15 tabs. (Author)

  12. Bayesian Reliability Estimation for Deteriorating Systems with Limited Samples Using the Maximum Entropy Approach

    Directory of Open Access Journals (Sweden)

    Ning-Cong Xiao

    2013-12-01

    Full Text Available In this paper a combination of the maximum entropy method and Bayesian inference for reliability assessment of deteriorating systems is proposed. Due to various uncertainties, scarce data and incomplete information, system parameters usually cannot be determined precisely. These uncertain parameters can be modeled by fuzzy set theory and Bayesian inference, which have been proved to be useful for deteriorating systems under small sample sizes. The maximum entropy approach can be used to calculate the maximum entropy density function of the uncertain parameters more accurately, as it does not need any additional information or assumptions. Finally, two optimization models are presented which can be used to determine the lower and upper bounds of the system's probability of failure under vague environmental conditions. Two numerical examples are investigated to demonstrate the proposed method.

  13. Method of reliability allocation based on fault tree analysis and fuzzy math in nuclear power plants

    International Nuclear Information System (INIS)

    Chen Zhaobing; Deng Jian; Cao Xuewu

    2005-01-01

    Reliability allocation is a difficult multi-objective optimization problem. It can not only be applied to determine the reliability characteristics of reactor systems, subsystems and main components but can also be performed to improve the design, operation and maintenance of nuclear plants. Fuzzy mathematics, known as one of the powerful tools for fuzzy optimization, and fault tree analysis, deemed to be one of the effective methods of reliability analysis, are applied to the reliability allocation model in this paper so as to deal with, respectively, the fuzzy character of some factors and the choice of subsystems. Thus we develop a failure rate allocation model on the basis of fault tree analysis and fuzzy mathematics. For the reliability constraint factors, we choose the six important ones according to the practical needs of the reliability allocation. The subsystems are selected by top-level fault tree analysis so as to avoid allocating reliability to all the equipment and components, including unnecessary parts. During the allocation process, some factors can be calculated or measured quantitatively while others can only be assessed qualitatively by the expert rating method. We therefore adopt fuzzy decision and dualistic contrast to realize the reliability allocation with the help of fault tree analysis. Finally, the example of the emergency diesel generator's reliability allocation is used to illustrate the reliability allocation model and to show that the model is simple and applicable. (authors)

  14. Reliability of Nationwide Prevalence Estimates of Dementia: A Critical Appraisal Based on Brazilian Surveys.

    Directory of Open Access Journals (Sweden)

    Flávio Chaimowicz

    Full Text Available The nationwide dementia prevalence is usually calculated by applying the results of local surveys to countries' populations. To evaluate the reliability of such estimations in developing countries, we chose Brazil as an example. We carried out a systematic review of dementia surveys, ascertained their risk of bias, and present the best estimate of occurrence of dementia in Brazil. We carried out an electronic search of PubMed, Latin-American databases, and a Brazilian thesis database for surveys focusing on dementia prevalence in Brazil. The systematic review was registered at PROSPERO (CRD42014008815). Among the 35 studies found, 15 analyzed population-based random samples. However, most of them utilized inadequate criteria for diagnostics. Six studies without these limitations were further analyzed to assess the risk of selection, attrition, outcome and population bias as well as several statistical issues. All the studies presented moderate or high risk of bias in at least two domains due to the following features: high non-response, inaccurate cut-offs, and doubtful accuracy of the examiners. Two studies had limited external validity due to high rates of illiteracy or low income. The three studies with adequate generalizability and the lowest risk of bias presented a prevalence of dementia between 7.1% and 8.3% among subjects aged 65 years and older. However, after adjustment for accuracy of screening, the best available evidence points towards a figure between 15.2% and 16.3%. The risk of bias may strongly limit the generalizability of dementia prevalence estimates in developing countries. Extrapolations that have already been made for Brazil and Latin America were based on a prevalence that should have been adjusted for screening accuracy or not used at all due to severe bias. Similar evaluations regarding other developing countries are needed in order to verify the scope of these limitations.
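
    The abstract does not state which correction was applied; one standard way to adjust an apparent prevalence for screening sensitivity and specificity is the Rogan-Gladen estimator, sketched below with purely hypothetical accuracy figures.

```python
def rogan_gladen(apparent_prevalence, sensitivity, specificity):
    """Adjust an apparent (test-based) prevalence for imperfect screening accuracy.

    Standard Rogan-Gladen correction:
        true = (apparent + specificity - 1) / (sensitivity + specificity - 1)
    """
    return (apparent_prevalence + specificity - 1.0) / (sensitivity + specificity - 1.0)

# Purely illustrative numbers (not taken from the reviewed surveys)
print(round(rogan_gladen(apparent_prevalence=0.08, sensitivity=0.55, specificity=0.99), 3))
```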

  15. An interactive website for analytical method comparison and bias estimation.

    Science.gov (United States)

    Bahar, Burak; Tuncel, Ayse F; Holmes, Earle W; Holmes, Daniel T

    2017-12-01

    Regulatory standards mandate laboratories to perform studies to ensure accuracy and reliability of their test results. Method comparison and bias estimation are important components of these studies. We developed an interactive website for evaluating the relative performance of two analytical methods using R programming language tools. The website can be accessed at https://bahar.shinyapps.io/method_compare/. The site has an easy-to-use interface that allows both copy-pasting and manual entry of data. It also allows selection of a regression model and creation of regression and difference plots. Available regression models include Ordinary Least Squares, Weighted-Ordinary Least Squares, Deming, Weighted-Deming, Passing-Bablok and Passing-Bablok for large datasets. The server processes the data and generates downloadable reports in PDF or HTML format. Our website provides clinical laboratories a practical way to assess the relative performance of two analytical methods. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
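
    Deming regression is one of the models listed above; a minimal unweighted implementation under the equal error variance assumption (lambda = 1) is sketched below with hypothetical paired results, and it is not the website's own code.

```python
import numpy as np

def deming(x, y, lam=1.0):
    """Deming regression slope and intercept for method comparison.

    lam is the ratio of the error variances of y and x;
    lam = 1 assumes both methods are equally imprecise.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x.mean(), y.mean()
    sxx = ((x - xm) ** 2).sum()
    syy = ((y - ym) ** 2).sum()
    sxy = ((x - xm) * (y - ym)).sum()
    slope = (syy - lam * sxx + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy ** 2)) / (2 * sxy)
    intercept = ym - slope * xm
    return slope, intercept

# Hypothetical paired results from two analytical methods measuring the same samples
method_a = [1.0, 2.1, 3.0, 4.2, 5.1, 6.0, 7.2, 8.1]
method_b = [1.2, 2.0, 3.3, 4.1, 5.4, 6.2, 7.1, 8.5]
print(deming(method_a, method_b))
```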

  16. Calibration Methods for Reliability-Based Design Codes

    DEFF Research Database (Denmark)

    Gayton, N.; Mohamed, A.; Sørensen, John Dalsgaard

    2004-01-01

    The calibration methods are applied to define the optimal code format according to some target safety levels. The calibration procedure can be seen as a specific optimization process where the control variables are the partial factors of the code. Different methods are available in the literature...

  17. Rapid and Reliable HPLC Method for the Determination of Vitamin ...

    African Journals Online (AJOL)

    Purpose: To develop and validate an accurate, sensitive and reproducible high performance liquid chromatographic (HPLC) method for the quantitation of vitamin C in pharmaceutical samples. Method: The drug and the standard were eluted from Superspher RP-18 (250 mm x 4.6 mm, 10 µm particle size) at 20 °C.

  18. Methods for reliability evaluation of trust and reputation systems

    Science.gov (United States)

    Janiszewski, Marek B.

    2016-09-01

    Trust and reputation systems are a systematic approach to building security on the basis of observations of nodes' behaviour. Exchange of nodes' opinions about other nodes is very useful to indicate nodes which act selfishly or maliciously. The idea behind trust and reputation systems gains significance because of the fact that conventional security measures (based on cryptography) are often not sufficient. Trust and reputation systems can be used in various types of networks such as WSN, MANET, P2P and also in e-commerce applications. Trust and reputation systems bring not only benefits but could also be a threat themselves. Many attacks aimed at trust and reputation systems exist, but such attacks still have not gained enough attention from research teams. Moreover, the joint effects of many known attacks have been identified as a very interesting field of research. The lack of an acknowledged methodology for evaluating trust and reputation systems is a serious problem. This paper aims at presenting various approaches to the evaluation of such systems. This work also contains a description of a generalization of many trust and reputation systems which can be used to evaluate the reliability of such systems in the context of preventing various attacks.

  19. The MIRD method of estimating absorbed dose

    International Nuclear Information System (INIS)

    Weber, D.A.

    1991-01-01

    The estimate of absorbed radiation dose from internal emitters provides the information required to assess the radiation risk associated with the administration of radiopharmaceuticals for medical applications. The MIRD (Medical Internal Radiation Dose) system of dose calculation provides a systematic approach to combining the biologic distribution data and clearance data of radiopharmaceuticals and the physical properties of radionuclides to obtain dose estimates. This tutorial presents a review of the MIRD schema, the derivation of the equations used to calculate absorbed dose, and shows how the MIRD schema can be applied to estimate dose from radiopharmaceuticals used in nuclear medicine
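
    The core of the MIRD schema is the summation of cumulated activities weighted by S values; the sketch below shows that bookkeeping only, with placeholder numbers rather than tabulated S factors or real biokinetic data.

```python
def mird_dose(cumulated_activity_mbq_s, s_values_mgy_per_mbq_s):
    """Absorbed dose to a target region via the basic MIRD schema:
        D(target) = sum over source regions of A_tilde(source) * S(target <- source)

    cumulated_activity_mbq_s : dict of source region -> cumulated activity (MBq*s)
    s_values_mgy_per_mbq_s   : dict of source region -> S value (mGy per MBq*s)
                               for the chosen target region
    """
    return sum(cumulated_activity_mbq_s[src] * s_values_mgy_per_mbq_s[src]
               for src in cumulated_activity_mbq_s)

# Hypothetical two-source example (values are placeholders, not tabulated S factors)
a_tilde = {"liver": 5.0e6, "remainder": 2.0e7}
s_to_target = {"liver": 3.0e-6, "remainder": 4.0e-8}
print(mird_dose(a_tilde, s_to_target), "mGy")
```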

  20. Psychological methods of subjective risk estimates

    International Nuclear Information System (INIS)

    Zimolong, B.

    1980-01-01

    Reactions to situations involving risks can be divided into the following parts: perception of danger, subjective estimates of the risk, and risk taking with respect to action. Several investigations have compared subjective estimates of the risk with an objective measure of that risk. In general there was a mismatch between subjective and objective measures of risk; in particular, the objective risk involved in routine activities is most commonly underestimated. This implies, for accident prevention, that attempts must be made to induce accurate subjective risk estimates by technical and behavioural measures. (orig.) [de

  1. Assessing the impact of uncertainty on flood risk estimates with reliability analysis using 1-D and 2-D hydraulic models

    Directory of Open Access Journals (Sweden)

    L. Altarejos-García

    2012-07-01

    Full Text Available This paper addresses the use of reliability techniques such as Rosenblueth's Point-Estimate Method (PEM as a practical alternative to more precise Monte Carlo approaches to get estimates of the mean and variance of uncertain flood parameters water depth and velocity. These parameters define the flood severity, which is a concept used for decision-making in the context of flood risk assessment. The method proposed is particularly useful when the degree of complexity of the hydraulic models makes Monte Carlo inapplicable in terms of computing time, but when a measure of the variability of these parameters is still needed. The capacity of PEM, which is a special case of numerical quadrature based on orthogonal polynomials, to evaluate the first two moments of performance functions such as the water depth and velocity is demonstrated in the case of a single river reach using a 1-D HEC-RAS model. It is shown that in some cases, using a simple variable transformation, statistical distributions of both water depth and velocity approximate the lognormal. As this distribution is fully defined by its mean and variance, PEM can be used to define the full probability distribution function of these flood parameters and so allowing for probability estimations of flood severity. Then, an application of the method to the same river reach using a 2-D Shallow Water Equations (SWE model is performed. Flood maps of mean and standard deviation of water depth and velocity are obtained, and uncertainty in the extension of flooded areas with different severity levels is assessed. It is recognized, though, that whenever application of Monte Carlo method is practically feasible, it is a preferred approach.
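
    For readers unfamiliar with Rosenblueth's scheme, the sketch below shows the classic two-point estimate for uncorrelated, symmetric inputs: the performance function is evaluated at all mean +/- standard deviation combinations and the results are averaged. The performance function and parameter values here are placeholders, not the 1-D or 2-D hydraulic models of the study.

```python
import itertools
import numpy as np

def rosenblueth_pem(func, means, stds):
    """Rosenblueth's Point-Estimate Method (uncorrelated, symmetric case).

    Evaluates func at the 2**n combinations of (mean +/- std) and weights each
    point equally to approximate the mean and variance of the output.
    """
    means, stds = np.asarray(means, float), np.asarray(stds, float)
    n = len(means)
    values = []
    for signs in itertools.product((-1.0, 1.0), repeat=n):
        values.append(func(means + np.array(signs) * stds))
    values = np.array(values)
    mean = values.mean()
    var = (values ** 2).mean() - mean ** 2
    return mean, var

# Toy performance function: flow depth as a function of roughness and discharge (hypothetical)
def depth(x):
    n_manning, q = x
    return (n_manning * q) ** 0.6   # placeholder relation, for illustration only

m, v = rosenblueth_pem(depth, means=[0.035, 150.0], stds=[0.005, 30.0])
print(m, v)
```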

  2. A Generalized Autocovariance Least-Squares Method for Covariance Estimation

    DEFF Research Database (Denmark)

    Åkesson, Bernt Magnus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2007-01-01

    A generalization of the autocovariance least-squares method for estimating noise covariances is presented. The method can estimate mutually correlated system and sensor noise and can be used with both the predicting and the filtering form of the Kalman filter.

  3. Quantitative developments in the cognitive reliability and error analysis method (CREAM) for the assessment of human performance

    International Nuclear Information System (INIS)

    Marseguerra, Marzio; Zio, Enrico; Librizzi, Massimo

    2006-01-01

    The current 'second generation' approaches in human reliability analysis focus their attention on the contextual conditions under which a given action is performed rather than on the notion of inherent human error probabilities, as was done in the earlier 'first generation' techniques. Among the 'second generation' methods, this paper considers the Cognitive Reliability and Error Analysis Method (CREAM) and proposes some developments with respect to a systematic procedure for computing probabilities of action failure. The starting point for the quantification is a previously introduced fuzzy version of the CREAM paradigm which is here further extended to include uncertainty on the qualification of the conditions under which the action is performed and to account for the fact that the effects of the common performance conditions (CPCs) on performance reliability may not all be equal. By the proposed approach, the probability of action failure is estimated by rating the performance conditions in terms of their effect on the action

  4. PERFORMANCE ANALYSIS OF METHODS FOR ESTIMATING ...

    African Journals Online (AJOL)

    2014-12-31

    Dec 31, 2014 ... speed is the most significant parameter of the wind energy. ... wind-powered generators and applied to estimate potential power output at various ...... Wind and Solar Power Systems, U.S. Merchant Marine Academy Kings.

  5. Reliability research to nuclear power plant operators based on several methods

    International Nuclear Information System (INIS)

    Fang Xiang; Li Fu; Zhao Bingquan

    2009-01-01

    The paper utilizes many kinds of international reliability research methods and summarizes the reliability research on Chinese nuclear power plant operators over the past ten-plus years, based on the nuclear power plant simulator platform. The paper shows the necessity and feasibility of this research on nuclear power plant operators from many angles, including human cognitive reliability, fuzzy mathematics models and psychological research models, etc. Applying many kinds of research methods to the reliability research of nuclear power plant operators will benefit the safe operation of nuclear power plants. (authors)

  6. Comparison of Methods for Dependency Determination between Human Failure Events within Human Reliability Analysis

    International Nuclear Information System (INIS)

    Cepin, M.

    2008-01-01

    The human reliability analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features, which may decrease subjectivity of human reliability analysis. Human reliability methods are compared with focus on dependency comparison between Institute Jozef Stefan human reliability analysis (IJS-HRA) and standardized plant analysis risk human reliability analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by development of more detailed guidelines for human reliability analysis with many practical examples for all steps of the process of evaluation of human performance

  7. Comparison of methods for dependency determination between human failure events within human reliability analysis

    International Nuclear Information System (INIS)

    Cepis, M.

    2007-01-01

    The Human Reliability Analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features which may decrease the subjectivity of human reliability analysis. Human reliability methods are compared with a focus on dependency comparison between the Institute Jozef Stefan - Human Reliability Analysis (IJS-HRA) and the Standardized Plant Analysis Risk Human Reliability Analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by development of more detailed guidelines for human reliability analysis with many practical examples for all steps of the process of evaluation of human performance. (author)

  8. Statistical methods of estimating mining costs

    Science.gov (United States)

    Long, K.R.

    2011-01-01

    Until it was defunded in 1995, the U.S. Bureau of Mines maintained a Cost Estimating System (CES) for prefeasibility-type economic evaluations of mineral deposits and estimating costs at producing and non-producing mines. This system had a significant role in mineral resource assessments to estimate costs of developing and operating known mineral deposits and predicted undiscovered deposits. For legal reasons, the U.S. Geological Survey cannot update and maintain CES. Instead, statistical tools are under development to estimate mining costs from basic properties of mineral deposits such as tonnage, grade, mineralogy, depth, strip ratio, distance from infrastructure, rock strength, and work index. The first step was to reestimate "Taylor's Rule" which relates operating rate to available ore tonnage. The second step was to estimate statistical models of capital and operating costs for open pit porphyry copper mines with flotation concentrators. For a sample of 27 proposed porphyry copper projects, capital costs can be estimated from three variables: mineral processing rate, strip ratio, and distance from nearest railroad before mine construction began. Of all the variables tested, operating costs were found to be significantly correlated only with strip ratio.
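
    A relation of the Taylor's Rule type is usually (re)estimated as a power law by least squares in log-log space, as sketched below; the tonnage and operating-rate pairs are invented for illustration and are not USGS or CES data.

```python
import numpy as np

def fit_power_law(tonnage, capacity):
    """Fit capacity = a * tonnage**b by ordinary least squares in log-log space,
    the usual way a Taylor's-Rule-type relation is estimated."""
    x = np.log(np.asarray(tonnage, float))
    y = np.log(np.asarray(capacity, float))
    b, log_a = np.polyfit(x, y, 1)   # slope first, then intercept
    return np.exp(log_a), b

# Hypothetical deposit tonnages (t) and daily operating rates (t/day), for illustration only
tonnage = [1e6, 5e6, 2e7, 8e7, 3e8]
capacity = [9.0e2, 3.1e3, 8.7e3, 2.6e4, 7.0e4]
a, b = fit_power_law(tonnage, capacity)
print(a, b)   # the exponent b is often quoted near 0.75 for Taylor's original rule
```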

  9. Advances in Time Estimation Methods for Molecular Data.

    Science.gov (United States)

    Kumar, Sudhir; Hedges, S Blair

    2016-04-01

    Molecular dating has become central to placing a temporal dimension on the tree of life. Methods for estimating divergence times have been developed for over 50 years, beginning with the proposal of molecular clock in 1962. We categorize the chronological development of these methods into four generations based on the timing of their origin. In the first generation approaches (1960s-1980s), a strict molecular clock was assumed to date divergences. In the second generation approaches (1990s), the equality of evolutionary rates between species was first tested and then a strict molecular clock applied to estimate divergence times. The third generation approaches (since ∼2000) account for differences in evolutionary rates across the tree by using a statistical model, obviating the need to assume a clock or to test the equality of evolutionary rates among species. Bayesian methods in the third generation require a specific or uniform prior on the speciation-process and enable the inclusion of uncertainty in clock calibrations. The fourth generation approaches (since 2012) allow rates to vary from branch to branch, but do not need prior selection of a statistical model to describe the rate variation or the specification of speciation model. With high accuracy, comparable to Bayesian approaches, and speeds that are orders of magnitude faster, fourth generation methods are able to produce reliable timetrees of thousands of species using genome scale data. We found that early time estimates from second generation studies are similar to those of third and fourth generation studies, indicating that methodological advances have not fundamentally altered the timetree of life, but rather have facilitated time estimation by enabling the inclusion of more species. Nonetheless, we feel an urgent need for testing the accuracy and precision of third and fourth generation methods, including their robustness to misspecification of priors in the analysis of large phylogenies and data

  10. A fast method for calculating reliable event supports in tree reconciliations via Pareto optimality.

    Science.gov (United States)

    To, Thu-Hien; Jacox, Edwin; Ranwez, Vincent; Scornavacca, Celine

    2015-11-14

    Given a gene and a species tree, reconciliation methods attempt to retrieve the macro-evolutionary events that best explain the discrepancies between the two tree topologies. The DTL parsimonious approach searches for a most parsimonious reconciliation between a gene tree and a (dated) species tree, considering four possible macro-evolutionary events (speciation, duplication, transfer, and loss) with specific costs. Unfortunately, many events are erroneously predicted due to errors in the input trees, inappropriate input cost values or because of the existence of several equally parsimonious scenarios. It is thus crucial to provide a measure of the reliability for predicted events. It has been recently proposed that the reliability of an event can be estimated via its frequency in the set of most parsimonious reconciliations obtained using a variety of reasonable input cost vectors. To compute such a support, a straightforward but time-consuming approach is to generate the costs slightly departing from the original ones, independently compute the set of all most parsimonious reconciliations for each vector, and combine these sets a posteriori. Another proposed approach uses Pareto-optimality to partition cost values into regions which induce reconciliations with the same number of DTL events. The support of an event is then defined as its frequency in the set of regions. However, often, the number of regions is not large enough to provide reliable supports. We present here a method to compute efficiently event supports via a polynomial-sized graph, which can represent all reconciliations for several different costs. Moreover, two methods are proposed to take into account alternative input costs: either explicitly providing an input cost range or allowing a tolerance for the over cost of a reconciliation. Our methods are faster than the region based method, substantially faster than the sampling-costs approach, and have a higher event-prediction accuracy on

  11. A new lifetime estimation model for a quicker LED reliability prediction

    Science.gov (United States)

    Hamon, B. H.; Mendizabal, L.; Feuillet, G.; Gasse, A.; Bataillou, B.

    2014-09-01

    LED reliability and lifetime prediction is a key point for Solid State Lighting adoption. For this purpose, one hundred and fifty LEDs were aged for a reliability analysis. The LEDs were grouped into nine current-temperature stress conditions. The stress driving current was fixed between 350 mA and 1 A and the ambient temperature between 85°C and 120°C. Using integrating sphere and I(V) measurements, a cross study of the evolution of electrical and optical characteristics has been done. Results show two main failure mechanisms regarding lumen maintenance. The first one is the typically observed lumen depreciation and the second one is a much quicker depreciation related to an increase of the leakage and non-radiative currents. Models of the typical lumen depreciation and of the leakage resistance depreciation have been built using electrical and optical measurements taken during the aging tests. The combination of those models enables a new method toward quicker LED lifetime prediction. These two models have been used for lifetime predictions of the LEDs.
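    The typical lumen-depreciation behaviour referred to above is commonly approximated by an exponential decay (as in TM-21-style projections). The sketch below is a minimal illustration of that generic idea, not the authors' actual model; the sample data, the 70% threshold and the fitting choices are assumptions.

      import numpy as np

      # Hypothetical lumen-maintenance measurements during an aging test:
      # hours of stress and normalized luminous flux L(t)/L(0).
      t = np.array([0, 500, 1000, 2000, 3000, 4000, 5000], dtype=float)
      L = np.array([1.00, 0.985, 0.970, 0.945, 0.920, 0.895, 0.872])

      # Fit L(t) = B * exp(-alpha * t) by linear regression on log(L).
      slope, intercept = np.polyfit(t, np.log(L), 1)
      alpha, B = -slope, np.exp(intercept)

      # Project the L70 lifetime: time at which the flux falls to 70% of initial.
      L70 = np.log(B / 0.7) / alpha
      print(f"alpha = {alpha:.3e} /h, B = {B:.4f}, projected L70 = {L70:.0f} h")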

  12. [A reliability growth assessment method and its application in the development of equipment in space cabin].

    Science.gov (United States)

    Chen, J D; Sun, H L

    1999-04-01

    Objective. To assess and predict the reliability of equipment dynamically by making full use of the various test information generated during product development. Method. A new reliability growth assessment method based on the Army Materiel Systems Analysis Activity (AMSAA) model was developed. The method combines the AMSAA model with test data conversion technology. Result. The assessment and prediction results for a space-borne equipment conform to expectations. Conclusion. It is suggested that this method should be further researched and popularized.
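    For context, the AMSAA (Crow) model treats cumulative failures as a non-homogeneous Poisson process with intensity lambda*beta*t^(beta-1). A minimal sketch of the standard time-truncated maximum-likelihood estimates is given below; the failure times are invented for illustration and the test-data conversion step described in the abstract is not shown.

      import numpy as np

      # Hypothetical cumulative failure times (hours) in a test truncated at time T.
      failure_times = np.array([35.0, 110.0, 240.0, 480.0, 700.0, 1050.0])
      T = 1200.0                                          # total accumulated test time

      n = len(failure_times)
      beta_hat = n / np.sum(np.log(T / failure_times))    # shape (growth) parameter
      lam_hat = n / T**beta_hat                           # scale parameter

      # Instantaneous MTBF at the end of the test, a common reliability-growth metric.
      mtbf_now = 1.0 / (lam_hat * beta_hat * T**(beta_hat - 1.0))
      print(f"beta = {beta_hat:.2f}, lambda = {lam_hat:.4g}, current MTBF = {mtbf_now:.0f} h")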

  13. Increasing reliability of Gauss-Kronrod quadrature by Eratosthenes' sieve method

    Science.gov (United States)

    Adam, Gh.; Adam, S.

    2001-04-01

    The reliability of the local error estimates returned by the Gauss-Kronrod quadrature rules can be raised up to the theoretical 100% rate of success, under error estimate sharpening, provided a number of natural validating conditions are imposed. The self-validating scheme for the local error estimates, which is easy to implement and adds little supplementary computing effort, considerably strengthens the correctness of the decisions within automatic adaptive quadrature.

  14. Reliability of estimating the room volume from a single room impulse response

    OpenAIRE

    Kuster, M.

    2008-01-01

    The methods investigated for the room volume estimation are based on geometrical acoustics, eigenmode, and diffuse field models and no data other than the room impulse response are available. The measurements include several receiver positions in a total of 12 rooms of vastly different sizes and acoustic characteristics. The limitations in identifying the pivotal specular reflections of the geometrical acoustics model in measured room impulse responses are examined both theoretically and expe...

  15. Analysis methods for structure reliability of piping components

    International Nuclear Information System (INIS)

    Schimpfke, T.; Grebner, H.; Sievers, J.

    2004-01-01

    In the frame of the German reactor safety research program of the Federal Ministry of Economics and Labour (BMWA) GRS has started to develop an analysis code named PROST (PRObabilistic STructure analysis) for estimating the leak and break probabilities of piping systems in nuclear power plants. The long-term objective of this development is to provide failure probabilities of passive components for probabilistic safety analysis of nuclear power plants. Up to now the code can be used for calculating fatigue problems. The paper mentions the main capabilities and theoretical background of the present PROST development and presents some of the results of a benchmark analysis in the frame of the European project NURBIM (Nuclear Risk Based Inspection Methodologies for Passive Components). (orig.)

  16. Reliable method for fission source convergence of Monte Carlo criticality calculation with Wielandt's method

    International Nuclear Information System (INIS)

    Yamamoto, Toshihiro; Miyoshi, Yoshinori

    2004-01-01

    A new algorithm for implementing Wielandt's method, one of the acceleration techniques for deterministic source iteration methods, in Monte Carlo criticality calculations is developed, and the algorithm has been successfully implemented into the MCNP code. In this algorithm, some of the fission neutrons emitted during the random walk process are tracked within the current cycle, and thus the fission source distribution used in the next cycle spreads more widely. Applying this method intensifies the neutron interaction effect even in a loosely coupled array where conventional Monte Carlo criticality methods have difficulties, and a converged fission source distribution can be obtained with fewer cycles. Computing time spent per cycle, however, increases because of the tracking of fission neutrons within the current cycle, which eventually results in an increase of the total computing time up to convergence. In addition, statistical fluctuations of the fission source distribution within a cycle are worsened by applying Wielandt's method to Monte Carlo criticality calculations. However, since fission source convergence is attained with fewer source iterations, a reliable determination of convergence can easily be made even in a system with slow convergence. This acceleration method is expected to contribute to the prevention of incorrect Monte Carlo criticality calculations. (author)
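    The deterministic idea behind the acceleration can be illustrated on a small matrix analogue of the k-eigenvalue problem M*phi = (1/k) F*phi. The sketch below compares plain power (source) iteration with a Wielandt-shifted iteration; the matrices and the shift estimate k_e are invented for illustration and say nothing about the MCNP implementation described in the abstract.

      import numpy as np

      # Toy loss (M) and fission production (F) operators for M*phi = (1/k) F*phi.
      M = np.array([[ 2.0, -0.5,  0.0],
                    [-0.5,  2.0, -0.5],
                    [ 0.0, -0.5,  2.0]])
      F = np.array([[ 1.2,  0.3,  0.0],
                    [ 0.3,  1.2,  0.3],
                    [ 0.0,  0.3,  1.2]])

      def dominant_eig(A, iters):
          # Plain power iteration; returns the dominant eigenvalue estimate.
          phi = np.ones(A.shape[0])
          lam = 1.0
          for _ in range(iters):
              psi = A @ phi
              lam = np.linalg.norm(psi) / np.linalg.norm(phi)
              phi = psi / np.linalg.norm(psi)
          return lam

      # Conventional source iteration: k is the dominant eigenvalue of inv(M) @ F.
      k_plain = dominant_eig(np.linalg.solve(M, F), iters=50)

      # Wielandt shift: with an estimate k_e slightly above k, iterate on
      # inv(M - F/k_e) @ F, whose dominant eigenvalue mu = 1/(1/k - 1/k_e) is
      # much better separated from the rest, so fewer iterations are needed.
      k_e = 1.05 * k_plain
      mu = dominant_eig(np.linalg.solve(M - F / k_e, F), iters=10)
      k_wielandt = 1.0 / (1.0 / mu + 1.0 / k_e)
      print(k_plain, k_wielandt)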

  17. A New Method of Reliability Evaluation Based on Wavelet Information Entropy for Equipment Condition Identification

    International Nuclear Information System (INIS)

    He, Z J; Zhang, X L; Chen, X F

    2012-01-01

    To evaluate the reliability of condition identification for mechanical equipment, condition monitoring information must be analyzed. A new method of reliability evaluation based on wavelet information entropy extracted from vibration signals of mechanical equipment is proposed. The method is quite different from traditional reliability evaluation models, which depend on probabilistic analysis of large samples of data. The vibration signals of mechanical equipment were analyzed by means of the second generation wavelet packet (SGWP). We take the relative energy in each frequency band of the decomposed signal, i.e., its percentage of the whole signal energy, as a probability. A normalized information entropy (IE) is obtained based on the relative energies to describe the uncertainty of a system instead of a probability. The reliability degree is then obtained by transforming the normalized wavelet information entropy. The method was successfully applied to evaluate the assembly quality reliability of a dismountable disk-drum aero-engine. The reliability degree indicates the assembled quality satisfactorily.
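    As a sketch of the entropy-to-reliability mapping described above (not of the authors' SGWP implementation), suppose the relative energies of the decomposed frequency bands are already available from any wavelet packet transform. The band energies and the mapping of normalized entropy to a reliability degree used here are assumptions for illustration.

      import numpy as np

      def reliability_from_band_energies(band_energies):
          # Normalized wavelet information entropy -> illustrative reliability degree.
          e = np.asarray(band_energies, dtype=float)
          p = e / e.sum()                      # relative band energy used as a probability
          p = p[p > 0]
          h = -np.sum(p * np.log(p))           # information entropy
          h_norm = h / np.log(len(e))          # normalize by the maximum possible entropy
          return 1.0 - h_norm                  # assumed mapping: low entropy = high reliability

      # Hypothetical band energies for a healthy and a degraded machine condition.
      healthy  = [9.0, 0.4, 0.3, 0.1, 0.1, 0.05, 0.03, 0.02]   # energy concentrated
      degraded = [2.0, 1.8, 1.5, 1.2, 1.0, 0.9, 0.8, 0.8]      # energy spread out
      print(reliability_from_band_energies(healthy),
            reliability_from_band_energies(degraded))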

  18. Integrated Markov-neural reliability computation method: A case for multiple automated guided vehicle system

    International Nuclear Information System (INIS)

    Fazlollahtabar, Hamed; Saidi-Mehrabad, Mohammad; Balakrishnan, Jaydeep

    2015-01-01

    This paper proposes an integrated Markovian and back-propagation neural network approach to compute the reliability of a system. Because the states in which failures occur are essential for accurate reliability computation, a Markovian reliability assessment method is designed. Owing to the drawbacks of the Markovian model for steady-state reliability computations and of the neural network for the initial training pattern, an integration called Markov-neural is developed and evaluated. To show the efficiency of the proposed approach, comparative analyses are performed. For managerial implications, an application case for multiple automated guided vehicles (AGVs) in manufacturing networks is also conducted. - Highlights: • Integrated Markovian and back-propagation neural network approach to compute reliability. • Markovian based reliability assessment method. • Managerial implication shown in an application case for multiple automated guided vehicles (AGVs) in manufacturing networks

  19. Assessment of modern methods of human factor reliability analysis in PSA studies

    International Nuclear Information System (INIS)

    Holy, J.

    2001-12-01

    The report is structured as follows: Classical terms and objects (Probabilistic safety assessment as a framework for human reliability assessment; Human failure within the PSA model; Basic types of operator failure modelled in a PSA study and analyzed by HRA methods; Qualitative analysis of human reliability; Quantitative analysis of human reliability used; Process of analysis of nuclear reactor operator reliability in a PSA study); New terms and objects (Analysis of dependences; Errors of omission; Errors of commission; Error forcing context); and Overview and brief assessment of human reliability analysis (Basic characteristics of the methods; Assets and drawbacks of the use of each of HRA method; History and prospects of the use of the methods). (P.A.)

  20. System and method for traffic signal timing estimation

    KAUST Repository

    Dumazert, Julien; Claudel, Christian G.

    2015-01-01

    A method and system for estimating traffic signals. The method and system can include constructing trajectories of probe vehicles from GPS data emitted by the probe vehicles, estimating traffic signal cycles, combining the estimates, and computing the traffic signal timing by maximizing a scoring function based on the estimates. Estimating traffic signal cycles can be based on transition times of the probe vehicles starting after a traffic signal turns green.
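    A minimal sketch of the cycle-estimation idea (not the patented system itself): given green-start transition times inferred from probe-vehicle trajectories, score candidate cycle lengths by how tightly the transitions cluster modulo the cycle, and keep the best-scoring one. The data and the scoring function below are assumptions.

      import numpy as np

      # Hypothetical times (s) at which queued probe vehicles start moving,
      # i.e. shortly after the signal turns green.
      transitions = np.array([12.0, 98.0, 190.0, 280.0, 371.0, 458.0, 550.0])

      def cycle_score(transitions, cycle):
          # Higher when transition times cluster at one phase of the candidate cycle.
          phases = 2.0 * np.pi * (transitions % cycle) / cycle
          # Resultant length of the phase angles on the unit circle (1 = perfect clustering).
          return np.hypot(np.cos(phases).sum(), np.sin(phases).sum()) / len(transitions)

      candidates = np.arange(60.0, 121.0, 0.5)        # candidate cycle lengths (s)
      best = candidates[np.argmax([cycle_score(transitions, c) for c in candidates])]
      print(f"estimated cycle length ≈ {best:.1f} s")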

  2. Study of Fuze Structure and Reliability Design Based on the Direct Search Method

    Science.gov (United States)

    Lin, Zhang; Ning, Wang

    2017-03-01

    Redundant design is one of the important methods to improve the reliability of a system, but the design often involves the mutual coupling of multiple factors. In this study, the Direct Search Method is introduced into optimum redundancy configuration for design optimization, in which the reliability, cost, structural weight, and other factors can be taken into account simultaneously, and the redundancy allocation and reliability design of a critical aircraft system are computed. The results show that this method is convenient and workable, and that, with appropriate modifications, it is applicable to the redundancy configuration and optimization of various designs. The method therefore has good practical value.
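    The flavour of a direct-search redundancy allocation can be illustrated with a small coordinate search over integer redundancy levels; the component data, the cost and weight limits, and the search rule below are assumptions for illustration, not the paper's actual model.

      from math import prod

      # Hypothetical components of a series system: (reliability, cost, weight) per unit.
      comps = [(0.90, 4.0, 2.0), (0.85, 3.0, 1.5), (0.95, 6.0, 2.5)]
      COST_MAX, WEIGHT_MAX = 40.0, 18.0

      def sys_reliability(n):
          # Series system of parallel groups: R = prod(1 - (1 - r_i)^n_i).
          return prod(1.0 - (1.0 - r) ** ni for (r, _, _), ni in zip(comps, n))

      def feasible(n):
          return (sum(c * ni for (_, c, _), ni in zip(comps, n)) <= COST_MAX and
                  sum(w * ni for (_, _, w), ni in zip(comps, n)) <= WEIGHT_MAX)

      n = [1, 1, 1]                          # start with no redundancy
      improved = True
      while improved:                        # coordinate-wise direct search
          improved = False
          trials = [n[:i] + [n[i] + 1] + n[i+1:] for i in range(len(n))]
          trials = [t for t in trials if feasible(t)]
          if trials:
              best = max(trials, key=sys_reliability)
              if sys_reliability(best) > sys_reliability(n):
                  n, improved = best, True
      print(n, round(sys_reliability(n), 6))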

  3. Formulating informative, data-based priors for failure probability estimation in reliability analysis

    International Nuclear Information System (INIS)

    Guikema, Seth D.

    2007-01-01

    Priors play an important role in the use of Bayesian methods in risk analysis, and using all available information to formulate an informative prior can lead to more accurate posterior inferences. This paper examines the practical implications of using five different methods for formulating an informative prior for a failure probability based on past data. These methods are the method of moments, maximum likelihood (ML) estimation, maximum entropy estimation, starting from a non-informative 'pre-prior', and fitting a prior based on confidence/credible interval matching. The priors resulting from the use of these different methods are compared qualitatively, and the posteriors are compared quantitatively based on a number of different scenarios of observed data used to update the priors. The results show that the amount of information assumed in the prior makes a critical difference in the accuracy of the posterior inferences. For situations in which the data used to formulate the informative prior is an accurate reflection of the data that is later observed, the ML approach yields the minimum variance posterior. However, the maximum entropy approach is more robust to differences between the data used to formulate the prior and the observed data because it maximizes the uncertainty in the prior subject to the constraints imposed by the past data
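    As an illustration of one of the five approaches compared above (the method of moments), a conjugate Beta prior can be fitted to past failure-probability data and then updated with newly observed demands; the numbers are invented and the other four formulation methods are not shown.

      import numpy as np

      # Hypothetical failure probabilities estimated from past, similar components.
      past_p = np.array([0.02, 0.035, 0.05, 0.015, 0.04])
      m, v = past_p.mean(), past_p.var(ddof=1)

      # Method-of-moments fit of a Beta(a, b) prior to the past data.
      common = m * (1.0 - m) / v - 1.0
      a0, b0 = m * common, (1.0 - m) * common

      # Bayesian update with newly observed data: x failures in n demands.
      x, n = 2, 150
      a1, b1 = a0 + x, b0 + (n - x)
      print(f"prior mean = {a0/(a0+b0):.4f}, posterior mean = {a1/(a1+b1):.4f}")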

  4. Application of reliability analysis methods to the comparison of two safety circuits

    International Nuclear Information System (INIS)

    Signoret, J.-P.

    1975-01-01

    Two circuits of different design, both intended to perform the ''Low Pressure Safety Injection'' function in PWR reactors, are analyzed using reliability methods. The reliability analysis of these circuits allows the fault trees to be established and the failure probabilities to be derived. The dependence of these results on testing and maintenance practices is emphasized, as are the critical paths. The large number of results obtained may allow a well-informed choice, taking into account the reliability required for this type of circuit [fr

  5. The Language Teaching Methods Scale: Reliability and Validity Studies

    Science.gov (United States)

    Okmen, Burcu; Kilic, Abdurrahman

    2016-01-01

    The aim of this research is to develop a scale to determine the language teaching methods used by English teachers. The research sample consisted of 300 English teachers who taught at Duzce University and in primary schools, secondary schools and high schools in the Provincial Management of National Education in the city of Duzce in 2013-2014…

  6. A method to determine validity and reliability of activity sensors

    NARCIS (Netherlands)

    Boerema, Simone Theresa; Hermens, Hermanus J.

    2013-01-01

    METHOD Four sensors were securely fastened to a mechanical oscillator (Vibration Exciter, type 4809, Brüel & Kjær) and moved at various frequencies (6.67Hz; 13.45Hz; 19.88Hz) within the range of human physical activity. For each of the three sensor axes, the sensors were simultaneously moved for

  7. Reliability and Validity of the Research Methods Skills Assessment

    Science.gov (United States)

    Smith, Tamarah; Smith, Samantha

    2018-01-01

    The Research Methods Skills Assessment (RMSA) was created to measure psychology majors' statistics knowledge and skills. The American Psychological Association's Guidelines for the Undergraduate Major in Psychology (APA, 2007, 2013) served as a framework for development. Results from a Rasch analysis with data from n = 330 undergraduates showed…

  8. Development of a Method for Quantifying the Reliability of Nuclear Safety-Related Software

    International Nuclear Information System (INIS)

    Yi Zhang; Golay, Michael W.

    2003-01-01

    The work of our project is intended to help introduce digital technologies into nuclear power plant safety-related software applications. In our project we utilize a combination of modern software engineering methods: design process discipline and feedback, formal methods, automated computer-aided software engineering tools, automatic code generation, and extensive feasible structure flow path testing to improve software quality. The tactics include ensuring that the software structure is kept simple, permitting routine testing during design development, permitting extensive finished-product testing in the input data space of most likely service, and using test-based Bayesian updating to estimate the probability that a random software input will encounter an error upon execution. From the results obtained, the software reliability can be both improved and its value estimated. Hopefully our success in the project's work can aid the transition of the nuclear enterprise into the modern information world. In our work, we have been using the proprietary sample software, the digital Signal Validation Algorithm (SVA), provided by Westinghouse, and our work is being done in collaboration with them. The SVA software is used for selecting the plant instrumentation signal set which is to be used as the input to the digital Plant Protection System (PPS). This is the system that automatically decides whether to trip the reactor. In our work, we are using the 001 computer-aided software engineering (CASE) tool of Hamilton Technologies Inc. This tool is capable of stating the syntactic structure of a program, reflecting its state requirements, logical functions and data structure
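    The test-based Bayesian updating mentioned above can be illustrated in its simplest conjugate form: starting from a Beta prior over the per-demand failure probability, n randomly sampled test executions with zero observed failures tighten the posterior. The prior and the test count below are assumptions; the project's actual structure-based testing strategy is not reproduced here.

      from scipy.stats import beta

      a0, b0 = 1.0, 1.0            # assumed uniform Beta(1, 1) prior on failure-on-demand p
      n_tests, failures = 5000, 0  # hypothetical random tests drawn from the usage profile

      a1, b1 = a0 + failures, b0 + n_tests - failures
      post_mean = a1 / (a1 + b1)
      p95 = beta.ppf(0.95, a1, b1)     # 95% credible upper bound on p
      print(f"posterior mean p = {post_mean:.2e}, 95% upper bound = {p95:.2e}")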

  9. Reliability and validity of non-radiographic methods of thoracic kyphosis measurement: a systematic review.

    Science.gov (United States)

    Barrett, Eva; McCreesh, Karen; Lewis, Jeremy

    2014-02-01

    A wide array of instruments are available for non-invasive thoracic kyphosis measurement. Guidelines for selecting outcome measures for use in clinical and research practice recommend that properties such as validity and reliability are considered. This systematic review reports on the reliability and validity of non-invasive methods for measuring thoracic kyphosis. A systematic search of 11 electronic databases located studies assessing reliability and/or validity of non-invasive thoracic kyphosis measurement techniques. Two independent reviewers used a critical appraisal tool to assess the quality of retrieved studies. Data were extracted by the primary reviewer. The results were synthesized qualitatively using a level-of-evidence approach. 27 studies satisfied the eligibility criteria and were included in the review. The reliability, the validity, and both reliability and validity were investigated by sixteen, two and nine studies respectively. 17/27 studies were deemed to be of high quality. In total, 15 methods of thoracic kyphosis measurement were evaluated in the retrieved studies. All investigated methods showed high (ICC ≥ .7) to very high (ICC ≥ .9) levels of reliability. The validity of the methods ranged from low to very high. The strongest levels of evidence for reliability exist in support of the Debrunner kyphometer, Spinal Mouse and Flexicurve index, and for validity in support of the arcometer and Flexicurve index. Further reliability and validity studies are required to strengthen the level of evidence for the remaining methods of measurement. This should be addressed by future research. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. TESTING METHODS FOR MECHANICALLY IMPROVED SOILS: RELIABILITY AND VALIDITY

    Directory of Open Access Journals (Sweden)

    Ana Petkovšek

    2017-10-01

    Full Text Available A possibility of in-situ mechanical improvement for reducing the liquefaction potential of silty sands was investigated by using three different techniques: Vibratory Roller Compaction, Rapid Impact Compaction (RIC and Soil Mixing. Material properties at all test sites were investigated before and after improvement with the laboratory and the in situ tests (CPT, SDMT, DPSH B, static and dynamic load plate test, geohydraulic tests. Correlation between the results obtained by different test methods gave inconclusive answers.

  11. reliability reliability

    African Journals Online (AJOL)

    eobe


  12. Reliability and validity of the AutoCAD software method in lumbar lordosis measurement.

    Science.gov (United States)

    Letafatkar, Amir; Amirsasan, Ramin; Abdolvahabi, Zahra; Hadadnezhad, Malihe

    2011-12-01

    The aim of this study was to determine the reliability and validity of the AutoCAD software method in lumbar lordosis measurement. Fifty healthy volunteers with a mean age of 23 ± 1.80 years were enrolled. A lumbar lateral radiograph was taken of all participants, and the lordosis was measured according to the Cobb method. Afterward, the lumbar lordosis degree was measured via the AutoCAD software and flexible ruler methods. The current study was accomplished in 2 parts: intratester and intertester evaluations of reliability, as well as the validity of the flexible ruler and software methods. Based on the intraclass correlation coefficient, AutoCAD's reliability and validity in measuring lumbar lordosis were 0.984 and 0.962, respectively. AutoCAD was shown to be a reliable and valid method to measure lordosis. It is suggested that this method may replace those that are costly and involve health risks, such as radiography, in evaluating lumbar lordosis.

  13. Prevalence of family violence in adults and children : Estimates using the capture-recapture method

    NARCIS (Netherlands)

    Oosterlee, A.; Vink, R.M.; Smit, F.

    2009-01-01

    Background: Reliable prevalence estimates of family violence in adults and children are difficult to obtain. Most are based on surveys or registration counts, whose research designs and methods are often questionable, making the results difficult to compare. This article presents an alternative
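    For readers unfamiliar with the technique, the simplest two-source version (the Chapman-corrected Lincoln-Petersen estimator) is sketched below with invented registry counts; the study itself uses more elaborate multi-source capture-recapture models.

      # Hypothetical counts from two independent registries of family-violence cases.
      n1 = 420        # cases known to registry 1
      n2 = 310        # cases known to registry 2
      m = 60          # cases appearing in both registries

      # Chapman's nearly unbiased variant of the Lincoln-Petersen estimator.
      N_hat = (n1 + 1) * (n2 + 1) / (m + 1) - 1
      var_N = ((n1 + 1) * (n2 + 1) * (n1 - m) * (n2 - m)) / ((m + 1) ** 2 * (m + 2))
      print(f"estimated total population ≈ {N_hat:.0f} ± {var_N ** 0.5:.0f} (1 SE)")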

  14. Reliability Evaluation of Bridges Based on Nonprobabilistic Response Surface Limit Method

    OpenAIRE

    Chen, Xuyong; Chen, Qian; Bian, Xiaoya; Fan, Jianping

    2017-01-01

    Due to many uncertainties in the nonprobabilistic reliability assessment of bridges, the limit state function is generally unknown. The traditional nonprobabilistic response surface method involves a lengthy and oscillating iteration process and makes it difficult to solve for the nonprobabilistic reliability index. This article proposes a nonprobabilistic response surface limit method based on the interval model. The intention of this method is to solve the upper and lower limits of the nonprobabilistic ...

  15. Using operational data to estimate the reliable yields of water-supply wells

    Science.gov (United States)

    Misstear, Bruce D. R.; Beeson, Sarah

    The reliable yield of a water-supply well depends on many different factors, including the properties of the well and the aquifer; the capacities of the pumps, raw-water mains, and treatment works; the interference effects from other wells; and the constraints imposed by abstraction licences, water quality, and environmental issues. A relatively simple methodology for estimating reliable yields has been developed that takes into account all of these factors. The methodology is based mainly on an analysis of water-level and source-output data, where such data are available. Good operational data are especially important when dealing with wells in shallow, unconfined, fissure-flow aquifers, where actual well performance may vary considerably from that predicted using a more analytical approach. Key issues in the yield-assessment process are the identification of a deepest advisable pumping water level, and the collection of the appropriate well, aquifer, and operational data. Although developed for water-supply operators in the United Kingdom, this approach to estimating the reliable yields of water-supply wells using operational data should be applicable to a wide range of hydrogeological conditions elsewhere. Résumé The yield of a well used for drinking-water supply depends on various factors, among them the properties of the well and of the aquifer, the capacity of the pumps, the treatment of the raw water, interference effects from other wells, and the constraints imposed by abstraction licences, by water quality and by environmental conditions. A relatively simple methodology for estimating the yield that takes all these factors into account has been developed. This methodology is based mainly on an analysis of data on water levels and abstraction rates, where such data are available. Good operational data are particularly

  16. System reliability with correlated components: Accuracy of the Equivalent Planes method

    NARCIS (Netherlands)

    Roscoe, K.; Diermanse, F.; Vrouwenvelder, A.C.W.M.

    2015-01-01

    Computing system reliability when system components are correlated presents a challenge because it usually requires solving multi-fold integrals numerically, which is generally infeasible due to the computational cost. In Dutch flood defense reliability modeling, an efficient method for computing

  17. System reliability with correlated components : Accuracy of the Equivalent Planes method

    NARCIS (Netherlands)

    Roscoe, K.; Diermanse, F.; Vrouwenvelder, T.

    2015-01-01

    Computing system reliability when system components are correlated presents a challenge because it usually requires solving multi-fold integrals numerically, which is generally infeasible due to the computational cost. In Dutch flood defense reliability modeling, an efficient method for computing

  18. An Efficient Acoustic Density Estimation Method with Human Detectors Applied to Gibbons in Cambodia.

    Directory of Open Access Journals (Sweden)

    Darren Kidney

    Full Text Available Some animal species are hard to see but easy to hear. Standard visual methods for estimating population density for such species are often ineffective or inefficient, but methods based on passive acoustics show more promise. We develop spatially explicit capture-recapture (SECR) methods for territorial vocalising species, in which humans act as an acoustic detector array. We use SECR and estimated bearing data from a single-occasion acoustic survey of a gibbon population in northeastern Cambodia to estimate the density of calling groups. The properties of the estimator are assessed using a simulation study, in which a variety of survey designs are also investigated. We then present a new form of the SECR likelihood for multi-occasion data which accounts for the stochastic availability of animals. In the context of gibbon surveys this allows model-based estimation of the proportion of groups that produce territorial vocalisations on a given day, thereby enabling the density of groups, instead of the density of calling groups, to be estimated. We illustrate the performance of this new estimator by simulation. We show that it is possible to estimate density reliably from human acoustic detections of visually cryptic species using SECR methods. For gibbon surveys we also show that incorporating observers' estimates of bearings to detected groups substantially improves estimator performance. Using the new form of the SECR likelihood we demonstrate that estimates of availability, in addition to population density and detection function parameters, can be obtained from multi-occasion data, and that the detection function parameters are not confounded with the availability parameter. This acoustic SECR method provides a means of obtaining reliable density estimates for territorial vocalising species. It is also efficient in terms of data requirements since it only requires routine survey data. We anticipate that the low-tech field requirements will

  19. Automated migration analysis based on cell texture: method & reliability

    Directory of Open Access Journals (Sweden)

    Chittenden Thomas W

    2005-03-01

    Full Text Available Abstract Background In this paper, we present and validate a way to automatically measure the extent of cell migration based on automated examination of a series of digital photographs. It was designed specifically to identify the impact of Second Hand Smoke (SHS) on endothelial cell migration but has broader applications. The analysis has two stages: (1) preprocessing of image texture, and (2) migration analysis. Results The output is a graphic overlay that indicates the front lines of cell migration superimposed on each original image, with automated reporting of the distance traversed vs. time. Expert comparison of the automated output with manual placement of the leading edge shows complete equivalence of automated vs. manual leading edge definition for cell migration measurement. Conclusion Our method is indistinguishable from careful manual determinations of cell front lines, with the advantages of full automation, objectivity, and speed.

  20. Human factors estimation methods in nuclear power plant

    International Nuclear Information System (INIS)

    Takano, Kenichi; Yoshino, Kenji; Nagasaka, Akihiko; Ishii, Keichiro; Nakasa, Hiroyasu

    1985-01-01

    To improve the reliability of operational and maintenance work, workers must always maintain their performance at a high level, which leads to fewer mistaken judgements and operations. This paper involves the development and evaluation of a ''Multi-Purpose Physiological Information Measurement System'' to estimate human performance and condition quantitatively. The following items are mentioned: (1) The most suitable physiological information is selected to measure workers' performance in a nuclear power plant with non-disturbing, ambulatory, continuous, multi-channel measurement. (2) Relatively important physiological information (electrocardiogram, respirometric functions, electromyogram (EMG), and pulse rate) is measured with real-time monitoring functions. (3) The measurement conditions and analysis methods are optimized by using a noise-cut function and a D.C. drift-cutting method. (4) As an example, when different weights are loaded on the arm during stretch-bend motion and the EMG signal is measured and analysed by this system, the analysed EMG pulse rate and maximum amplitude are clearly related to the loaded weight. (author)

  1. Some improvements in the estimation of 137Cs in urine by the AMP-chlorostannate method

    International Nuclear Information System (INIS)

    Kalaiselvan, S.; Prasad, M.V.R.

    1988-01-01

    An accurate and reliable method was developed for the estimation of radiocesium in urine. Initially cesium is adsorbed on an ammonium phosphomolybdate (AMP) precipitate and separated by ion exchange from other contaminants. Cesium thus separated is estimated as cesium chlorostannate, Cs2SnCl6, from a 50 (v/v)% solution of concentrated HCl in ethyl alcohol. While the results are in good agreement with the values obtained by γ-spectrometry using a Marinelli beaker, the present method has a much lower detection limit. It is observed that the method has significant advantages over the methods available with respect to analysis time, accuracy and detection limits. (author) 10 refs.; 3 tabs.

  2. A Reliable Method for Rhythm Analysis during Cardiopulmonary Resuscitation

    Directory of Open Access Journals (Sweden)

    U. Ayala

    2014-01-01

    Full Text Available Interruptions in cardiopulmonary resuscitation (CPR) compromise defibrillation success. However, CPR must be interrupted to analyze the rhythm because although current methods for rhythm analysis during CPR have high sensitivity for shockable rhythms, the specificity for nonshockable rhythms is still too low. This paper introduces a new approach to rhythm analysis during CPR that combines two strategies: a state-of-the-art CPR artifact suppression filter and a shock advice algorithm (SAA) designed to optimally classify the filtered signal. Emphasis is on designing an algorithm with high specificity. The SAA includes a detector for low electrical activity rhythms to increase the specificity, and a shock/no-shock decision algorithm based on a support vector machine classifier using slope and frequency features. For this study, 1185 shockable and 6482 nonshockable 9-s segments corrupted by CPR artifacts were obtained from 247 patients suffering out-of-hospital cardiac arrest. The segments were split into a training and a test set. For the test set, the sensitivity and specificity for rhythm analysis during CPR were 91.0% and 96.6%, respectively. This new approach shows an important increase in specificity without compromising the sensitivity when compared to previous studies.

  3. Method of estimating the reactor power distribution

    International Nuclear Information System (INIS)

    Mitsuta, Toru; Fukuzaki, Takaharu; Doi, Kazuyori; Kiguchi, Takashi.

    1984-01-01

    Purpose: To improve the calculation accuracy of the power distribution and thereby improve the reliability of the power distribution monitor. Constitution: In detector-containing strings disposed within a reactor core, movable neutron flux monitors are provided in addition to the conventionally installed fixed-position neutron monitors. During periodic monitoring, a power distribution X1 is calculated from a physical reactor core model. Then, a higher-power position X2 is detected by position detectors, and the value X2 is sent to a neutron flux monitor driving device to displace the movable monitors to the higher-power position in each of the strings. After displacement, the value X1 is corrected by a correcting device using measured values from the movable and fixed monitors, and the corrected value is sent to a reactor core monitoring device. Upon failure of the fixed monitors, their position is sent to the monitor driving device and the movable monitors are displaced to that position for measurement. (Sekiya, K.)

  4. Bin mode estimation methods for Compton camera imaging

    International Nuclear Information System (INIS)

    Ikeda, S.; Odaka, H.; Uemura, M.; Takahashi, T.; Watanabe, S.; Takeda, S.

    2014-01-01

    We study the image reconstruction problem of a Compton camera which consists of semiconductor detectors. The image reconstruction is formulated as a statistical estimation problem. We employ a bin-mode estimation (BME) and extend an existing framework to a Compton camera with multiple scatterers and absorbers. Two estimation algorithms are proposed: an accelerated EM algorithm for the maximum likelihood estimation (MLE) and a modified EM algorithm for the maximum a posteriori (MAP) estimation. Numerical simulations demonstrate the potential of the proposed methods
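    For orientation, the classical ML-EM update for Poisson emission data, which the accelerated algorithm described above builds on, can be written in a few lines; the system matrix and counts below are random placeholders rather than an actual Compton-camera response.

      import numpy as np

      rng = np.random.default_rng(0)
      n_bins, n_pix = 200, 50
      A = rng.random((n_bins, n_pix))        # placeholder system (response) matrix
      x_true = rng.random(n_pix)
      y = rng.poisson(A @ x_true)            # measured counts per bin

      x = np.ones(n_pix)                     # flat initial image
      sens = A.sum(axis=0)                   # sensitivity (column sums)
      for _ in range(100):                   # ML-EM iterations
          x *= (A.T @ (y / (A @ x))) / sens
      print(np.corrcoef(x, x_true)[0, 1])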

  5. Empirical methods for estimating future climatic conditions

    International Nuclear Information System (INIS)

    Anon.

    1990-01-01

    Applying the empirical approach permits the derivation of estimates of the future climate that are nearly independent of conclusions based on theoretical (model) estimates. This creates an opportunity to compare these results with those derived from the model simulations of the forthcoming changes in climate, thus increasing confidence in areas of agreement and focusing research attention on areas of disagreements. The premise underlying this approach for predicting anthropogenic climate change is based on associating the conditions of the climatic optimums of the Holocene, Eemian, and Pliocene with corresponding stages of the projected increase of mean global surface air temperature. Provided that certain assumptions are fulfilled in matching the value of the increased mean temperature for a certain epoch with the model-projected change in global mean temperature in the future, the empirical approach suggests that relationships leading to the regional variations in air temperature and other meteorological elements could be deduced and interpreted based on use of empirical data describing climatic conditions for past warm epochs. Considerable care must be taken, of course, in making use of these spatial relationships, especially in accounting for possible large-scale differences that might, in some cases, result from different factors contributing to past climate changes than future changes and, in other cases, might result from the possible influences of changes in orography and geography on regional climatic conditions over time

  6. Statistically Efficient Methods for Pitch and DOA Estimation

    DEFF Research Database (Denmark)

    Jensen, Jesper Rindom; Christensen, Mads Græsbøll; Jensen, Søren Holdt

    2013-01-01

    Traditionally, direction-of-arrival (DOA) and pitch estimation of multichannel, periodic sources have been considered as two separate problems. Separate estimation may render the task of resolving sources with similar DOA or pitch impossible, and it may decrease the estimation accuracy. Therefore, it was recently considered to estimate the DOA and pitch jointly. In this paper, we propose two novel methods for DOA and pitch estimation. They both yield maximum-likelihood estimates in white Gaussian noise scenarios, where the SNR may be different across channels, as opposed to state-of-the-art methods...

  7. Integration of external estimated breeding values and associated reliabilities using correlations among traits and effects.

    Science.gov (United States)

    Vandenplas, J; Colinet, F G; Glorieux, G; Bertozzi, C; Gengler, N

    2015-12-01

    Based on a Bayesian view of linear mixed models, several studies showed the possibilities to integrate estimated breeding values (EBV) and associated reliabilities (REL) provided by genetic evaluations performed outside a given evaluation system into this genetic evaluation. Hereafter, the term "internal" refers to this given genetic evaluation system, and the term "external" refers to all other genetic evaluations performed outside the internal evaluation system. Bayesian approaches integrate external information (i.e., external EBV and associated REL) by altering both the mean and (co)variance of the prior distributions of the additive genetic effects based on the knowledge of this external information. Extensions of the Bayesian approaches to multivariate settings are interesting because external information expressed on other scales, measurement units, or trait definitions, or associated with different heritabilities and genetic parameters than the internal traits, could be integrated into a multivariate genetic evaluation without the need to convert external information to the internal traits. Therefore, the aim of this study was to test the integration of external EBV and associated REL, expressed on a 305-d basis and genetically correlated with a trait of interest, into a multivariate genetic evaluation using a random regression test-day model for the trait of interest. The approach we used was a multivariate Bayesian approach. Results showed that the integration of external information led to a genetic evaluation for the trait of interest for, at least, animals associated with external information, as accurate as a bivariate evaluation including all available phenotypic information. In conclusion, the multivariate Bayesian approaches have the potential to integrate external information correlated with the internal phenotypic traits, and potentially to the different random regressions, into a multivariate genetic evaluation. This allows the use of different

  8. portfolio optimization based on nonparametric estimation methods

    Directory of Open Access Journals (Sweden)

    mahsa ghandehari

    2017-03-01

    Full Text Available One of the major issues investors face in capital markets is decision making about selecting an appropriate stock exchange for investing and choosing an optimal portfolio. This process is done through assessment of risk and expected return. In the portfolio selection problem, if the assets' expected returns are normally distributed, variance and standard deviation are used as measures of risk. However, the expected returns on assets are not necessarily normal and sometimes differ dramatically from a normal distribution. This paper introduces conditional value at risk (CVaR) as a measure of risk in a nonparametric framework and, for a given expected return, derives the optimal portfolio; this method is then compared with the linear programming method. The data used in this study consist of the monthly returns of 15 companies selected from the top 50 companies in the Tehran Stock Exchange during the winter of 1392, considered over the period from April of 1388 to June of 1393. The results of this study show the superiority of the nonparametric method over the linear programming method, and the nonparametric method is much faster than the linear programming method.
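    To make the risk measure concrete, the historical (nonparametric) CVaR of a fixed portfolio can be computed directly from scenario returns, as sketched below; the returns and weights are invented, and the full optimization over weights (e.g., the Rockafellar-Uryasev linear program) is not shown.

      import numpy as np

      rng = np.random.default_rng(1)
      returns = rng.normal(0.01, 0.06, size=(60, 15))   # hypothetical monthly returns, 15 assets
      w = np.full(15, 1.0 / 15)                          # equally weighted portfolio

      losses = -(returns @ w)                            # scenario losses
      alpha = 0.95
      var = np.quantile(losses, alpha)                   # nonparametric Value-at-Risk
      cvar = losses[losses >= var].mean()                # CVaR: mean loss beyond VaR
      print(f"VaR(95%) = {var:.4f}, CVaR(95%) = {cvar:.4f}")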

  9. The hockey-stick method to estimate evening dim light melatonin onset (DLMO) in humans.

    Science.gov (United States)

    Danilenko, Konstantin V; Verevkin, Evgeniy G; Antyufeev, Viktor S; Wirz-Justice, Anna; Cajochen, Christian

    2014-04-01

    The onset of melatonin secretion in the evening is the most reliable and most widely used index of circadian timing in humans. Saliva (or plasma) is usually sampled every 0.5-1 hours under dim-light conditions in the evening 5-6 hours before usual bedtime to assess the dim-light melatonin onset (DLMO). For many years, attempts have been made to find a reliable objective determination of melatonin onset time either by fixed or dynamic threshold approaches. The here-developed hockey-stick algorithm, used as an interactive computer-based approach, fits the evening melatonin profile by a piecewise linear-parabolic function represented as a straight line switching to the branch of a parabola. The switch point is considered to reliably estimate melatonin rise time. We applied the hockey-stick method to 109 half-hourly melatonin profiles to assess the DLMOs and compared these estimates to visual ratings from three experts in the field. The DLMOs of 103 profiles were considered to be clearly quantifiable. The hockey-stick DLMO estimates were on average 4 minutes earlier than the experts' estimates, with a range of -27 to +13 minutes; in 47% of the cases the difference fell within ±5 minutes, in 98% within -20 to +13 minutes. The raters' and hockey-stick estimates showed poor accordance with DLMOs defined by threshold methods. Thus, the hockey-stick algorithm is a reliable objective method to estimate melatonin rise time, which does not depend on a threshold value and is free from errors arising from differences in subjective circadian phase estimates. The method is available as a computerized program that can be easily used in research settings and clinical practice either for salivary or plasma melatonin values.
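    A simplified version of the piecewise fit can be reproduced with a standard least-squares routine: a baseline line that switches to an added parabolic rise at the switch point, whose fitted location serves as the melatonin rise (DLMO) estimate. The sampling times, melatonin values, and starting guesses below are invented; the published algorithm is interactive and more elaborate.

      import numpy as np
      from scipy.optimize import curve_fit

      def hockey_stick(t, a, b, c, ts):
          # Line a + b*t that switches to an added parabolic branch at t = ts.
          return a + b * t + c * np.square(np.clip(t - ts, 0.0, None))

      # Hypothetical half-hourly salivary melatonin profile (pg/ml), hours after sampling start.
      t = np.arange(0.0, 6.5, 0.5)
      y = np.array([1.0, 1.2, 0.9, 1.1, 1.0, 1.3, 1.1, 1.6, 2.8, 4.9,
                    7.8, 11.5, 16.0])

      p0 = [1.0, 0.0, 1.0, 3.0]                     # initial guesses: a, b, c, ts
      (a, b, c, ts), _ = curve_fit(hockey_stick, t, y, p0=p0)
      print(f"estimated melatonin rise time ≈ {ts:.2f} h after start of sampling")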

  10. A flexible latent class approach to estimating test-score reliability

    NARCIS (Netherlands)

    van der Palm, D.W.; van der Ark, L.A.; Sijtsma, K.

    2014-01-01

    The latent class reliability coefficient (LCRC) is improved by using the divisive latent class model instead of the unrestricted latent class model. This results in the divisive latent class reliability coefficient (DLCRC), which unlike LCRC avoids making subjective decisions about the best solution

  11. Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method

    Science.gov (United States)

    Zhang, Xiangnan

    2018-03-01

    Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions, boundary conditions, etc. Reliability methods measure the structural safety condition and determine the optimal design parameter combination based on probabilistic theory. Reliability-based design optimization (RBDO) is the most commonly used approach to minimize the structural cost or other performance measures under uncertain variables; it combines reliability theory and optimization. However, it cannot handle various kinds of incomplete information. The Bayesian approach is utilized to incorporate this kind of incomplete information in its uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.

  12. Reliability Analysis Of Fire System On The Industry Facility By Use Fameca Method

    International Nuclear Information System (INIS)

    Sony T, D.T.; Situmorang, Johnny; Ismu W, Puradwi; Demon H; Mulyanto, Dwijo; Kusmono, Slamet; Santa, Sigit Asmara

    2000-01-01

    FAMECA is one of the analysis methods used to determine system reliability in industrial facilities. The analysis follows a procedure consisting of the identification of component functions and the determination of failure modes, severity levels, and the effects of their failure. The reliability value is determined from a combination of three elements: severity level, component failure value, and component criticality. A reliability analysis of an industrial fire protection system has been performed using the FAMECA method. The critical components identified are the pump, air release valve, check valve, manual test valve, isolation valve, control system, etc.

  13. Verification of practicability of quantitative reliability evaluation method (De-BDA) in nuclear power plants

    International Nuclear Information System (INIS)

    Takahashi, Kinshiro; Yukimachi, Takeo.

    1988-01-01

    A variety of methods have been applied to reliability analyses that include human factors, in order to enhance the safety and availability of nuclear power plants. De-BDA (Detailed Block Diagram Analysis) is one such method, developed with the objective of creating a more comprehensive and understandable tool for the quantitative analysis of reliability associated with plant operations. The practicability of this method has been verified by applying it to reliability analyses of various phases of plant operation, as well as to the evaluation of an enhanced man-machine interface in the central control room. (author)

  14. Estimation of subcriticality of TCA using 'indirect estimation method for calculation error'

    International Nuclear Information System (INIS)

    Naito, Yoshitaka; Yamamoto, Toshihiro; Arakawa, Takuya; Sakurai, Kiyoshi

    1996-01-01

    To estimate the subcriticality of the neutron multiplication factor in a fissile system, an 'Indirect Estimation Method for Calculation Error' is proposed. This method obtains the calculational error of the neutron multiplication factor by correlating measured values with the corresponding calculated ones. The method was applied to the source multiplication and pulsed neutron experiments conducted at TCA, and the calculation error of MCNP 4A was estimated. In the source multiplication method, the deviation of the measured neutron count rate distributions from the calculated ones estimates the accuracy of the calculated keff. In the pulsed neutron method, the calculation errors of the prompt neutron decay constants give the accuracy of the calculated keff. (author)

  15. A study in the reliability analysis method for nuclear power plant structures (I)

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Byung Hwan; Choi, Seong Cheol; Shin, Ho Sang; Yang, In Hwan; Kim, Yi Sung; Yu, Young; Kim, Se Hun [Seoul, Nationl Univ., Seoul (Korea, Republic of)

    1999-03-15

    Nuclear power plant structures may be exposed to aggressive environmental effects that may cause their strength and stiffness to decrease over their service life. Although the physics of these damage mechanisms is reasonably well understood and quantitative evaluation of their effects on time-dependent structural behavior is possible in some instances, such evaluations are generally very difficult and remain novel. The assessment of existing steel containments in nuclear power plants for continued service must provide quantitative evidence that they are able to withstand future extreme loads during a service period with an acceptable level of reliability. Rational methodologies for performing the reliability assessment can be developed from mechanistic models of structural deterioration, using time-dependent structural reliability analysis to take loading and strength uncertainties into account. The final goal of this study is to develop an analysis method for the reliability of containment structures. The cause and mechanism of corrosion are first clarified, and a reliability assessment method has been established. By introducing the equivalent normal distribution, a reliability analysis procedure that can determine the failure probabilities has been established. The influence of design variables on reliability and the relation between reliability and service life will be studied in the second year of the research.

  16. Comparison of different methods in estimating potential evapotranspiration at Muda Irrigation Scheme of Malaysia

    Directory of Open Access Journals (Sweden)

    Sobri Harun

    2012-04-01

    Full Text Available Evapotranspiration (ET) is a complex process in the hydrological cycle that influences the quantity of runoff and thus the irrigation water requirements. Numerous methods have been developed to estimate potential evapotranspiration (PET). Unfortunately, most of the reliable PET methods are parameter-rich models and are therefore not feasible for application in data-scarce regions. On the other hand, the accuracy and reliability of simple PET models vary widely according to regional climate conditions. The objective of the present study was to evaluate the performance of three temperature-based and three radiation-based simple ET methods in estimating historical ET and projecting future ET at the Muda Irrigation Scheme in Kedah, Malaysia. The performance was measured by comparing those methods with the parameter-intensive Penman-Monteith method. It was found that radiation-based methods gave better performance than temperature-based methods in estimating ET in the study area. Future ET simulated from projected climate data obtained through a statistical downscaling technique also showed that radiation-based methods project ET values closer to those projected by the Penman-Monteith method. It is expected that the study will guide the selection of suitable methods for estimating and projecting ET according to the availability of meteorological data.
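    To make the contrast concrete, one representative temperature-based formula (Hargreaves) and one radiation-based formula (Priestley-Taylor) are sketched below with textbook coefficient values; the input numbers are invented and this is not necessarily the exact set of six methods evaluated in the study.

      import math

      def pet_hargreaves(tmean, tmax, tmin, ra):
          # Temperature-based PET (mm/day); ra = extraterrestrial radiation in MJ/m2/day.
          ra_mm = ra / 2.45                                   # convert MJ/m2/day to mm/day
          return 0.0023 * ra_mm * (tmean + 17.8) * math.sqrt(tmax - tmin)

      def pet_priestley_taylor(tmean, rn, g=0.0):
          # Radiation-based PET (mm/day); rn, g = net radiation and soil heat flux (MJ/m2/day).
          es = 0.6108 * math.exp(17.27 * tmean / (tmean + 237.3))       # kPa
          delta = 4098.0 * es / (tmean + 237.3) ** 2                    # slope, kPa/°C
          gamma = 0.066                                                 # psychrometric constant, kPa/°C
          return 1.26 * delta / (delta + gamma) * (rn - g) / 2.45

      # Hypothetical humid-tropics day: Tmean 28°C, Tmax 33°C, Tmin 24°C, Ra 38, Rn 14 MJ/m2/day.
      print(pet_hargreaves(28.0, 33.0, 24.0, 38.0), pet_priestley_taylor(28.0, 14.0))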

  17. Testing a statistical method of global mean paleotemperature estimations in a long climate simulation

    Energy Technology Data Exchange (ETDEWEB)

    Zorita, E.; Gonzalez-Rouco, F. [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Hydrophysik

    2001-07-01

    Current statistical methods of reconstructing the climate of the last centuries are based on statistical models linking climate observations (temperature, sea-level pressure) and proxy-climate data (tree-ring chronologies, ice-core isotope concentrations, varved sediments, etc.). These models are calibrated in the instrumental period, and the longer time series of proxy data are then used to estimate the past evolution of the climate variables. Using such methods, the global mean temperature of the last 600 years has recently been estimated. In this work this method of reconstruction is tested using data from a very long simulation with a climate model. This testing allows the errors of the estimates to be quantified as a function of the number of proxy records and of the time scale at which the estimates are probably reliable. (orig.)

  18. Linear Interaction Energy Based Prediction of Cytochrome P450 1A2 Binding Affinities with Reliability Estimation.

    Directory of Open Access Journals (Sweden)

    Luigi Capoferri

    Full Text Available Prediction of human Cytochrome P450 (CYP) binding affinities of small ligands, i.e., substrates and inhibitors, represents an important task for predicting drug-drug interactions. A quantitative assessment of the ligand binding affinity towards different CYPs can provide an estimate of inhibitory activity or an indication of isoforms prone to interact with the substrate or inhibitors. However, the accuracy of global quantitative models for CYP substrate binding or inhibition based on traditional molecular descriptors can be limited, because of the lack of information on the structure and flexibility of the catalytic site of CYPs. Here we describe the application of a method that combines protein-ligand docking, Molecular Dynamics (MD) simulations and Linear Interaction Energy (LIE) theory to allow for quantitative CYP affinity prediction. Using this combined approach, a LIE model for human CYP 1A2 was developed and evaluated, based on a structurally diverse dataset for which the estimated experimental uncertainty was 3.3 kJ mol-1. For the computed CYP 1A2 binding affinities, the model showed a root mean square error (RMSE) of 4.1 kJ mol-1 and a standard error in prediction (SDEP) in cross-validation of 4.3 kJ mol-1. A novel approach that includes information on both structural ligand description and protein-ligand interaction was developed for estimating the reliability of predictions, and was able to identify compounds from an external test set with an SDEP for the predicted affinities of 4.6 kJ mol-1 (corresponding to 0.8 pKi units).
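    The LIE estimate itself is a simple linear combination of interaction-energy averages taken from the MD simulations; a sketch with generic coefficient values is shown below, where the alpha, beta, gamma values and the energy averages are placeholders rather than the calibrated CYP 1A2 model of the paper.

      import math

      def lie_binding_free_energy(dvdw, delec, alpha=0.18, beta=0.33, gamma=0.0):
          # dG_bind = alpha * d<E_vdw> + beta * d<E_elec> + gamma  (kJ/mol), where the
          # deltas are bound-minus-free differences of the MD-averaged van der Waals
          # and electrostatic ligand-surrounding interaction energies.
          return alpha * dvdw + beta * delec + gamma

      # Placeholder MD averages (kJ/mol) for a hypothetical ligand.
      dG = lie_binding_free_energy(dvdw=-95.0, delec=-20.0)
      RT = 8.314e-3 * 298.15                       # kJ/mol at 298 K
      Ki = math.exp(dG / RT)                       # dG = RT ln(Ki), Ki relative to 1 M standard state
      print(f"predicted dG = {dG:.1f} kJ/mol, pKi = {-math.log10(Ki):.2f}")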

  19. Reliability Assessment Method of Reactor Protection System Software by Using V and Vbased Bayesian Nets

    International Nuclear Information System (INIS)

    Eom, H. S.; Park, G. Y.; Kang, H. G.; Son, H. S.

    2010-07-01

    We developed a methodology that can be practically used in the quantitative reliability assessment of safety-critical software for a protection system of nuclear power plants. The basis of the proposed methodology is the V and V process used in the nuclear industry, which means that it is not affected by specific software development environments or by parameters that would otherwise be necessary for the reliability calculation. The modular and formal sub-BNs in the proposed methodology are useful tools for constituting the whole BN model for the reliability assessment of a target software. The proposed V and V based BN model estimates the defects in the software according to the performance of the V and V results and then calculates the reliability of the software. A case study was carried out to validate the proposed methodology. The target software is the RPS SW which was developed by the KNICS project

  20. Thermodynamic properties of organic compounds estimation methods, principles and practice

    CERN Document Server

    Janz, George J

    1967-01-01

    Thermodynamic Properties of Organic Compounds: Estimation Methods, Principles and Practice, Revised Edition focuses on the progression of practical methods in computing the thermodynamic characteristics of organic compounds. Divided into two parts with eight chapters, the book concentrates first on the methods of estimation. Topics presented are statistical and combined thermodynamic functions; free energy change and equilibrium conversions; and estimation of thermodynamic properties. The next discussions focus on the thermodynamic properties of simple polyatomic systems by statistical the

  1. Reliability of tumor volume estimation from MR images in patients with malignant glioma. Results from the American College of Radiology Imaging Network (ACRIN) 6662 Trial

    International Nuclear Information System (INIS)

    Ertl-Wagner, Birgit B.; Blume, Jeffrey D.; Herman, Benjamin; Peck, Donald; Udupa, Jayaram K.; Levering, Anthony; Schmalfuss, Ilona M.

    2009-01-01

    Reliable assessment of tumor growth in malignant glioma poses a common problem both clinically and when studying novel therapeutic agents. We aimed to evaluate two software systems in their ability to estimate volume change of tumor and/or edema on magnetic resonance (MR) images of malignant gliomas. Twenty patients with malignant glioma were included from different sites. Serial post-operative MR images were assessed with two software systems representative of the two fundamental segmentation methods, single-image fuzzy analysis (3DVIEWNIX-TV) and multi-spectral-image analysis (Eigentool), and with a manual method by 16 independent readers (eight MR-certified technologists, four neuroradiology fellows, four neuroradiologists). Enhancing tumor volume and tumor volume plus edema were assessed independently by each reader. Intraclass correlation coefficients (ICCs), variance components, and prediction intervals were estimated. There were no significant differences in the average tumor volume change over time between the software systems (p > 0.05). Both software systems were much more reliable and yielded smaller prediction intervals than manual measurements. No significant differences were observed between the volume changes determined by fellows/neuroradiologists or technologists. Semi-automated software systems are reliable tools to serve as outcome parameters in clinical studies and the basis for therapeutic decision-making for malignant gliomas, whereas manual measurements are less reliable and should not be the basis for clinical or research outcome studies. (orig.)

  2. A reliable method for ageing of whiting (Merlangius merlangus) for use in stock assessment and management

    DEFF Research Database (Denmark)

    Ross, Stine Dalmann; Hüssy, Karin

    2013-01-01

    Accurate age estimation is important for stock assessment and management. The importance of reliable ageing is emphasized by the impending analytical assessment of whiting (Merlangius merlangus) in the Baltic Sea. Whiting is a top predator in the western Baltic Sea, where it is fished commercially...

  3. Computational intelligence methods for the efficient reliability analysis of complex flood defence structures

    NARCIS (Netherlands)

    Kingston, Greer B.; Rajabali Nejad, Mohammadreza; Gouldby, Ben P.; van Gelder, Pieter H.A.J.M.

    2011-01-01

    With the continual rise of sea levels and deterioration of flood defence structures over time, it is no longer appropriate to define a design level of flood protection, but rather, it is necessary to estimate the reliability of flood defences under varying and uncertain conditions. For complex

  4. Theoretical basis, application, reliability, and sample size estimates of a Meridian Energy Analysis Device for Traditional Chinese Medicine Research

    Directory of Open Access Journals (Sweden)

    Ming-Yen Tsai

    Full Text Available OBJECTIVES: The Meridian Energy Analysis Device is currently a popular tool in the scientific research of meridian electrophysiology. In this field, it is generally believed that measuring the electrical conductivity of meridians provides information about the balance of bioenergy or Qi-blood in the body. METHODS AND RESULTS: This work draws on original articles in the PubMed database from 1956 to 2014 and on the author's clinical experience. In this short communication, we provide clinical examples of Meridian Energy Analysis Device application, especially in the field of traditional Chinese medicine, discuss the reliability of the measurements, and put the values obtained into context by considering items of considerable variability and by estimating sample size. CONCLUSION: The Meridian Energy Analysis Device is making a valuable contribution to the diagnosis of Qi-blood dysfunction. It can be assessed from short-term and long-term meridian bioenergy recordings. It is one of the few methods that allow outpatient traditional Chinese medicine diagnosis, monitoring of progress, assessment of therapeutic effect, and evaluation of patient prognosis. The holistic approaches underlying the practice of traditional Chinese medicine and new trends in modern medicine toward the use of objective instruments require in-depth knowledge of the mechanisms of meridian energy, and the Meridian Energy Analysis Device can feasibly be used for understanding and interpreting traditional Chinese medicine theory, especially in view of its expansion in Western countries.

  5. Prediction of the shape of inline wave force and free surface elevation using First Order Reliability Method (FORM)

    DEFF Research Database (Denmark)

    Ghadirian, Amin; Bredmose, Henrik; Schløer, Signe

    2017-01-01

    theory, that is, the most likely time history of inline force around a force peak of given value. The results of FORM and NewForce are linearly identical and show only minor deviations at second order. The FORM results are then compared to wave averaged measurements of the same criteria for crest height......In design of substructures for offshore wind turbines, the extreme wave loads which are of interest in Ultimate Limit States are often estimated by choosing extreme events from linear random sea states and replacing them by either stream function wave theory or the NewWave theory of a certain...... design wave height. As these wave theories suffer from limitations such as symmetry around the crest, other methods to estimate the wave loads are needed. In the present paper, the First Order Reliability Method, FORM, is used systematically to estimate the most likely extreme wave shapes. Two parameters

  6. System and method for correcting attitude estimation

    Science.gov (United States)

    Josselson, Robert H. (Inventor)

    2010-01-01

    A system includes an angular rate sensor disposed in a vehicle for providing angular rates of the vehicle, and an instrument disposed in the vehicle for providing line-of-sight control with respect to a line-of-sight reference. The instrument includes an integrator which is configured to integrate the angular rates of the vehicle to form non-compensated attitudes. Also included is a compensator coupled across the integrator, in a feed-forward loop, for receiving the angular rates of the vehicle and outputting compensated angular rates of the vehicle. A summer combines the non-compensated attitudes and the compensated angular rates of the vehicle to form estimated vehicle attitudes for controlling the instrument with respect to the line-of-sight reference. The compensator is configured to provide error compensation to the instrument free of any feedback loop that uses an error signal. The compensator may include a transfer function providing a fixed gain to the received angular rates of the vehicle. The compensator may, alternatively, include a transfer function providing a variable gain as a function of frequency to operate on the received angular rates of the vehicle.
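    As an illustrative aside, the described structure (integrator, fixed-gain feed-forward compensator, summer) can be sketched in a toy discrete-time form. The gain K, sample period, and rate signal below are hypothetical and not taken from the patent.

```python
import numpy as np

# Toy discrete-time version of the described structure: integrate the measured angular
# rate to get the non-compensated attitude, form a fixed-gain feed-forward term from the
# same rate, and sum the two. All numbers here are assumptions for illustration.
dt = 0.01                                    # sample period [s]
t = np.arange(0.0, 5.0, dt)
rate = 0.1 * np.sin(2 * np.pi * 0.5 * t)     # measured angular rate [rad/s], hypothetical

K = 0.02                                     # fixed feed-forward gain (assumed)
attitude_integrated = np.cumsum(rate) * dt   # integrator output (non-compensated attitude)
attitude_estimate = attitude_integrated + K * rate  # summer: add compensated rate term

print("final attitude estimate [rad]:", attitude_estimate[-1])
```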

  7. Human factors estimation method in nuclear power plants

    International Nuclear Information System (INIS)

    Takano, Kenichi; Yoshino, Kenji; Nagasaka, Akihiko

    1987-01-01

    There is a need to prevent human errors by operators in a control room in order to improve NPS reliability. In particular, timing errors or omission errors are often caused by an excess of mental workload. Therefore, in order to decrease such kinds of human errors, not only should the design of equipment and consoles take the proper level of mental workload into account, but excessive mental workload must also be reduced by training and other means. This paper presents techniques for measuring mental workload from physiological information and describes the relation between the error rate and mental workload on the basis of experiments with various modeled tasks. The following results were obtained. (1) TSF, the indicator of mental workload, is well correlated with the subsidiary-task reaction time. It is therefore possible to estimate the TSF from subsidiary tasks if the task is loaded instantaneously together with the main task. (2) The relation between the TSF and the GSR pulse rate has a correlation factor of 0.81, except in the case of a parallel processing task. Hence the mental workload can be evaluated by measuring the GSR pulse rate if the task is processed through a single channel; when GSR is used, however, the atmospheric conditions must be kept constant and the arousal level must be adequate. (3) Human error increases greatly when the TSF exceeds about 60 %; this value agrees closely with the tolerance limit of the TTS method. (author)

  8. Bearing Procurement Analysis Method by Total Cost of Ownership Analysis and Reliability Prediction

    Science.gov (United States)

    Trusaji, Wildan; Akbar, Muhammad; Sukoyo; Irianto, Dradjad

    2018-03-01

    In making a bearing procurement analysis, price and reliability must both be considered as decision criteria, since price determines the direct cost (acquisition cost) while the reliability of the bearing determines indirect costs such as maintenance cost. Although the indirect cost is hard to identify and measure, it contributes substantially to the overall cost that will be incurred. The indirect cost of reliability must therefore be considered when making a bearing procurement analysis. This paper describes a bearing evaluation method based on total cost of ownership analysis that takes price and maintenance cost as decision criteria. Furthermore, since failure data are scarce at the bearing evaluation phase, a reliability prediction method is used to predict bearing reliability from its dynamic load rating parameter. With this method, a bearing with a higher price but higher reliability is preferable for long-term planning, whereas for short-term planning the cheaper bearing with lower reliability is preferable. This contextuality can give rise to conflict between stakeholders. Thus, the planning horizon needs to be agreed by all stakeholders before making a procurement decision.
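    For orientation only, a prediction of bearing reliability from the dynamic load rating can be sketched with the standard basic rating life relation (L10 = (C/P)^3 million revolutions for ball bearings) and a Weibull survival curve anchored at R(L10) = 0.90. This is a generic textbook sketch, not the paper's procedure; the Weibull slope, loads, prices, and candidate bearings below are assumptions.

```python
import math

def l10_life_mrev(C_dynamic_kN: float, P_equiv_kN: float, exponent: float = 3.0) -> float:
    """Basic rating life in millions of revolutions (exponent 3 for ball bearings,
    10/3 for roller bearings)."""
    return (C_dynamic_kN / P_equiv_kN) ** exponent

def reliability_at_life(L_mrev: float, L10_mrev: float, weibull_slope: float = 1.5) -> float:
    """Weibull survival curve anchored so that R(L10) = 0.90; the slope is an assumption."""
    return math.exp(math.log(0.9) * (L_mrev / L10_mrev) ** weibull_slope)

# Hypothetical candidate bearings: (price, dynamic load rating C [kN]).
candidates = {"cheap": (50.0, 25.0), "premium": (90.0, 35.0)}
P = 5.0            # equivalent dynamic load [kN], assumed
L_required = 200   # required life [million revolutions], assumed

for name, (price, C) in candidates.items():
    L10 = l10_life_mrev(C, P)
    R = reliability_at_life(L_required, L10)
    print(f"{name}: price={price}, L10={L10:.0f} Mrev, R(required life)={R:.3f}")
```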

  9. Application of nuclear-geophysical methods to reserves estimation

    International Nuclear Information System (INIS)

    Bessonova, T.B.; Karpenko, I.A.

    1980-01-01

    On the basis of an analysis of reports dealing with calculations of mineral reserves, shortcomings in the use of nuclear-geophysical methods and in the assessment of the reliability of geophysical sampling are considered. To increase the efficiency of nuclear-geophysical investigations in the prospecting of ore deposits, it is advisable to introduce them widely in place of traditional geological sampling methods. For this purpose it is necessary to increase the sensitivity and accuracy of radioactivity logging methods and to provide for the determination of certain elements in ores by these methods

  10. Assessment of Advanced Life Support competence when combining different test methods--reliability and validity

    DEFF Research Database (Denmark)

    Ringsted, C; Lippert, F; Hesselfeldt, R

    2007-01-01

    Cardiac Arrest Simulation Test (CASTest) scenarios for the assessments according to guidelines 2005. AIMS: To analyse the reliability and validity of the individual sub-tests provided by ERC and to find a combination of MCQ and CASTest that provides a reliable and valid single effect measure of ALS...... that possessed high reliability, equality of test sets, and ability to discriminate between the two groups of supposedly different ALS competence. CONCLUSIONS: ERC sub-tests of ALS competence possess sufficient reliability and validity. A combined ALS score with equal weighting of one MCQ and one CASTest can...... competence. METHODS: Two groups of participants were included in this randomised, controlled experimental study: a group of newly graduated doctors, who had not taken the ALS course (N=17) and a group of students, who had passed the ALS course 9 months before the study (N=16). Reliability in terms of inter...

  11. Bayesian methods to estimate urban growth potential

    Science.gov (United States)

    Smith, Jordan W.; Smart, Lindsey S.; Dorning, Monica; Dupéy, Lauren Nicole; Méley, Andréanne; Meentemeyer, Ross K.

    2017-01-01

    Urban growth often influences the production of ecosystem services. The impacts of urbanization on landscapes can subsequently affect landowners’ perceptions, values and decisions regarding their land. Within land-use and land-change research, very few models of dynamic landscape-scale processes like urbanization incorporate empirically-grounded landowner decision-making processes. Very little attention has focused on the heterogeneous decision-making processes that aggregate to influence broader-scale patterns of urbanization. We examine the land-use tradeoffs faced by individual landowners in one of the United States’ most rapidly urbanizing regions − the urban area surrounding Charlotte, North Carolina. We focus on the land-use decisions of non-industrial private forest owners located across the region’s development gradient. A discrete choice experiment is used to determine the critical factors influencing individual forest owners’ intent to sell their undeveloped properties across a series of experimentally varied scenarios of urban growth. Data are analyzed using a hierarchical Bayesian approach. The estimates derived from the survey data are used to modify a spatially-explicit trend-based urban development potential model, derived from remotely-sensed imagery and observed changes in the region’s socioeconomic and infrastructural characteristics between 2000 and 2011. This modeling approach combines the theoretical underpinnings of behavioral economics with spatiotemporal data describing a region’s historical development patterns. By integrating empirical social preference data into spatially-explicit urban growth models, we begin to more realistically capture processes as well as patterns that drive the location, magnitude and rates of urban growth.

  12. Framework for estimating response time data to conduct a seismic human reliability analysis - its feasibility

    International Nuclear Information System (INIS)

    Park, Jinkyun; Kin, Yochan; Jung, Wondea; Jang, Seung Cheol

    2014-01-01

    This is because the PSA has been used for several decades as the representative tool to evaluate the safety of NPPs. To this end, it is essential to evaluate human error probabilities (HEPs) for the important tasks considered in the PSA framework (i.e., HFEs; human failure events), which can significantly affect the safety of NPPs. In addition, it should be emphasized that the provision of realistic human performance data is an important precondition for calculating HEPs under a seismic condition. Unfortunately, the HRA methods currently used for calculating HEPs under a seismic event do not properly consider the performance variation of human operators. For this reason, this paper suggests a systematic framework for estimating response time data, which are among the most critical inputs for calculating HEPs, with respect to seismic intensity. Although an extensive review of the existing literature is indispensable for identifying the response times of human operators who have to conduct a series of tasks prescribed in procedures on the basis of a few wrong indications, it is highly expected that response time data for seismic HRA can be properly secured by revisiting response time data collected from diverse situations that did not involve a seismic event.

  13. Reliability and reproducibility of several methods of arthroscopic assessment of femoral tunnel position during anterior cruciate ligament reconstruction.

    Science.gov (United States)

    Ilahi, Omer A; Mansfield, David J; Urrea, Luis H; Qadeer, Ali A

    2014-10-01

    To assess interobserver and intraobserver agreement of estimating anterior cruciate ligament (ACL) femoral tunnel positioning arthroscopically using circular and linear (noncircular) estimation methods and to determine whether overlay template visual aids improve agreement. Standardized intraoperative pictures of femoral tunnel pilot holes (taken with a 30° arthroscope through an anterolateral portal at 90° of knee flexion with horizontal being parallel to the tibial surface) in 27 patients undergoing single-bundle ACL reconstruction were presented to 3 fellowship-trained arthroscopists on 2 separate occasions. On both viewings, each surgeon estimated the femoral tunnel pilot hole location to the nearest half-hour mark using a whole clock face and half clock face, to the nearest 15° using a whole compass and half compass, in the top or bottom half of a linear quadrant, and in the top or bottom half of a linear trisector. Evaluations were performed first without and then with an overlay template of each estimation method. The average difference among reviewers was quite similar for all 4 circular methods with the use of visual aids. Without overlay template visual aids, pair-wise κ statistic values for interobserver agreement ranged from -0.14 to 0.56 for the whole clock face and from 0.16 to 0.42 for the half clock face. With overlay visual guides, interobserver agreement ranged from 0.29 to 0.63 for the whole clock face and from 0.17 to 0.66 for the half clock face. The quadrant method's interobserver agreement ranged from 0.22 to 0.60, and that of the trisection method ranged from 0.17 to 0.57. Neither linear estimation method's reliability uniformly improved with the use of overlay templates. Intraobserver agreement without overlay templates ranged from 0.17 to 0.49 for the whole clock face, 0.11 to 0.47 for the half clock face, 0.01 to 0.66 for the quadrant method, and 0.20 to 0.57 for the trisection method. Use of overlay templates did not uniformly

  14. Heterogeneous Data Fusion Method to Estimate Travel Time Distributions in Congested Road Networks

    OpenAIRE

    Chaoyang Shi; Bi Yu Chen; William H. K. Lam; Qingquan Li

    2017-01-01

    Travel times in congested urban road networks are highly stochastic. Provision of travel time distribution information, including both mean and variance, can be very useful for travelers to make reliable path choice decisions to ensure higher probability of on-time arrival. To this end, a heterogeneous data fusion method is proposed to estimate travel time distributions by fusing heterogeneous data from point and interval detectors. In the proposed method, link travel time distributions are f...

  15. Internal Dosimetry Intake Estimation using Bayesian Methods

    International Nuclear Information System (INIS)

    Miller, G.; Inkret, W.C.; Martz, H.F.

    1999-01-01

    New methods for the inverse problem of internal dosimetry are proposed based on evaluating expectations of the Bayesian posterior probability distribution of intake amounts, given bioassay measurements. These expectation integrals are normally of very high dimension and hence impractical to use. However, the expectations can be algebraically transformed into a sum of terms representing different numbers of intakes, with a Poisson distribution of the number of intakes. This sum often rapidly converges, when the average number of intakes for a population is small. A simplified algorithm using data unfolding is described (UF code). (author)
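    The transformation described in the abstract can be written schematically as follows; the notation here is assumed for illustration rather than taken from the paper.

```latex
% Posterior expectation of an intake quantity A given bioassay data D, written as a
% Poisson-weighted sum over the possible number of intakes n (schematic form):
\[
  \mathbb{E}[A \mid D] \;=\; \sum_{n=0}^{\infty} \Pr(N = n)\,\mathbb{E}[A \mid N = n,\, D],
  \qquad
  \Pr(N = n) \;=\; e^{-\lambda}\,\frac{\lambda^{n}}{n!},
\]
% where $\lambda$ is the expected number of intakes for the monitored population; when
% $\lambda$ is small, only the first few terms contribute and the sum converges rapidly.
```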

  16. Online Identification with Reliability Criterion and State of Charge Estimation Based on a Fuzzy Adaptive Extended Kalman Filter for Lithium-Ion Batteries

    Directory of Open Access Journals (Sweden)

    Zhongwei Deng

    2016-06-01

    Full Text Available In the field of state of charge (SOC) estimation, the Kalman filter has been widely used for many years, although its performance strongly depends on the accuracy of the battery model as well as the noise covariance. The Kalman gain determines the confidence coefficient of the battery model by adjusting the weight of open circuit voltage (OCV) correction, and has a strong correlation with the measurement noise covariance (R). In this paper, the online identification method is applied to acquire the real model parameters under different operation conditions. A criterion based on the OCV error is proposed to evaluate the reliability of online parameters. Besides, the equivalent circuit model produces an intrinsic model error which is dependent on the load current, and the property that a high battery current or a large current change induces a large model error can be observed. Based on the above prior knowledge, a fuzzy model is established to compensate the model error through updating R. Combining the positive strategy (i.e., online identification) and negative strategy (i.e., fuzzy model), a more reliable and robust SOC estimation algorithm is proposed. The experiment results verify the proposed reliability criterion and SOC estimation method under various conditions for LiFePO4 batteries.
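    To make the idea of inflating R with load current concrete, here is a minimal one-state EKF sketch with a simple Rint battery model, where the measurement-noise covariance grows with the current magnitude and the current change. This is a crude stand-in for the fuzzy adjustment, not the paper's algorithm; the capacity, resistance, OCV curve, and scaling factors are hypothetical.

```python
import numpy as np

Q_AH = 2.3 * 3600.0          # capacity [A*s], assumed
R0 = 0.01                    # ohmic resistance [ohm], assumed
dt = 1.0                     # sample time [s]

def ocv(soc):                # hypothetical linear OCV curve [V]
    return 3.2 + 0.8 * soc

def docv_dsoc(soc):
    return 0.8

def run_ekf(current, voltage, soc0=0.5):
    soc, P = soc0, 0.01
    Qproc = 1e-7             # process noise (assumed)
    prev_i, soc_hist = current[0], []
    for i, v in zip(current, voltage):
        # Predict: coulomb counting.
        soc = soc - i * dt / Q_AH
        P = P + Qproc
        # Heuristic adaptation: inflate R for high current or large current change.
        R = 1e-4 * (1.0 + 0.5 * abs(i) + 2.0 * abs(i - prev_i))
        prev_i = i
        # Update with the terminal-voltage measurement.
        H = docv_dsoc(soc)
        y = v - (ocv(soc) - i * R0)          # innovation
        K = P * H / (H * P * H + R)          # Kalman gain
        soc = soc + K * y
        P = (1.0 - K * H) * P
        soc_hist.append(soc)
    return np.array(soc_hist)

# Synthetic data: constant 1 A discharge from 80% SOC with voltage measurement noise.
n = 600
true_soc = 0.8 - np.arange(n) * 1.0 * dt / Q_AH
current = np.ones(n)
voltage = ocv(true_soc) - current * R0 + np.random.normal(0, 0.005, n)
est = run_ekf(current, voltage, soc0=0.6)
print("final SOC estimate:", est[-1], "true:", true_soc[-1])
```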

  17. Comparison of methods for estimating carbon in harvested wood products

    International Nuclear Information System (INIS)

    Claudia Dias, Ana; Louro, Margarida; Arroja, Luis; Capela, Isabel

    2009-01-01

    There is a great diversity of methods for estimating carbon storage in harvested wood products (HWP) and, therefore, it is extremely important to agree internationally on the methods to be used in national greenhouse gas inventories. This study compares three methods for estimating carbon accumulation in HWP: the method suggested by Winjum et al. (Winjum method), the tier 2 method proposed by the IPCC Good Practice Guidance for Land Use, Land-Use Change and Forestry (GPG LULUCF) (GPG tier 2 method) and a method consistent with GPG LULUCF tier 3 methods (GPG tier 3 method). Carbon accumulation in HWP was estimated for Portugal under three accounting approaches: stock-change, production and atmospheric-flow. The uncertainty in the estimates was also evaluated using Monte Carlo simulation. The estimates of carbon accumulation in HWP obtained with the Winjum method differed substantially from the estimates obtained with the other methods, because this method tends to overestimate carbon accumulation with the stock-change and the production approaches and tends to underestimate carbon accumulation with the atmospheric-flow approach. The estimates of carbon accumulation provided by the GPG methods were similar, but the GPG tier 3 method reported the lowest uncertainties. For the GPG methods, the atmospheric-flow approach produced the largest estimates of carbon accumulation, followed by the production approach and the stock-change approach, by this order. A sensitivity analysis showed that using the "best" available data on production and trade of HWP produces larger estimates of carbon accumulation than using data from the Food and Agriculture Organization. (author)

  18. New methods of testing nonlinear hypothesis using iterative NLLS estimator

    Science.gov (United States)

    Mahaboob, B.; Venkateswarlu, B.; Mokeshrayalu, G.; Balasiddamuni, P.

    2017-11-01

    This research paper discusses the method of testing nonlinear hypotheses using the iterative Nonlinear Least Squares (NLLS) estimator. Takeshi Amemiya [1] explained this method. However, in the present research paper, a modified Wald test statistic due to Engle, Robert [6] is proposed to test nonlinear hypotheses using the iterative NLLS estimator. An alternative method for testing nonlinear hypotheses using an iterative NLLS estimator based on nonlinear studentized residuals has been proposed. In this research article an innovative method of testing nonlinear hypotheses using the iterative restricted NLLS estimator is derived. Pesaran and Deaton [10] explained the methods of testing nonlinear hypotheses. This paper uses the asymptotic properties of the nonlinear least squares estimator proposed by Jenrich [8]. The main purpose of this paper is to provide very innovative methods of testing nonlinear hypotheses using the iterative NLLS estimator, the iterative NLLS estimator based on nonlinear studentized residuals, and the iterative restricted NLLS estimator. Eakambaram et al. [12] discussed least absolute deviation estimation versus nonlinear regression models with heteroscedastic errors and also studied the problem of heteroscedasticity with reference to nonlinear regression models with a suitable illustration. William Greene [13] examined the interaction effect in nonlinear models discussed by Ai and Norton [14] and suggested ways to examine the effects that do not involve statistical testing. Peter [15] provided guidelines for identifying composite hypotheses and addressing the probability of false rejection for multiple hypotheses.
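    For orientation, the general textbook form of a Wald test for a nonlinear restriction evaluated at an (iterative) NLLS estimate is given below; this is the standard form rather than the specific modified statistic referenced above, and the notation is assumed here.

```latex
% Wald statistic for H_0: h(theta) = 0, evaluated at the NLLS estimate \hat{\theta}:
\[
  W \;=\; h(\hat{\theta})^{\top}
  \left[\hat{H}\,\widehat{\operatorname{Var}}(\hat{\theta})\,\hat{H}^{\top}\right]^{-1}
  h(\hat{\theta}),
  \qquad
  \hat{H} \;=\; \left.\frac{\partial h(\theta)}{\partial \theta^{\top}}\right|_{\theta=\hat{\theta}},
\]
% which is asymptotically $\chi^{2}_{q}$ under $H_0$, with $q$ the number of restrictions.
```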

  19. Ultrasound estimates of muscle quality in older adults: reliability and comparison of Photoshop and ImageJ for the grayscale analysis of muscle echogenicity

    Directory of Open Access Journals (Sweden)

    Michael O. Harris-Love

    2016-02-01

    Full Text Available Background. Quantitative diagnostic ultrasound imaging has been proposed as a method of estimating muscle quality using measures of echogenicity. The Rectangular Marquee Tool (RMT) and the Free Hand Tool (FHT) are two types of editing features used in Photoshop and ImageJ for determining a region of interest (ROI) within an ultrasound image. The primary objective of this study is to determine the intrarater and interrater reliability of Photoshop and ImageJ for the estimate of muscle tissue echogenicity in older adults via grayscale histogram analysis. The secondary objective is to compare the mean grayscale values obtained using both the RMT and FHT methods across both image analysis platforms. Methods. This cross-sectional observational study features 18 community-dwelling men (age = 61.5 ± 2.32 years). Longitudinal views of the rectus femoris were captured using B-mode ultrasound. The ROI for each scan was selected by 2 examiners using the RMT and FHT methods from each software program. Their reliability is assessed using intraclass correlation coefficients (ICCs) and the standard error of the measurement (SEM). Measurement agreement for these values is depicted using Bland-Altman plots. A paired t-test is used to determine mean differences in echogenicity expressed as grayscale values using the RMT and FHT methods to select the post-image acquisition ROI. The degree of association among ROI selection methods and image analysis platforms is analyzed using the coefficient of determination (R2). Results. The raters demonstrated excellent intrarater and interrater reliability using the RMT and FHT methods across both platforms (lower bound 95% CI ICC = .97–.99, p < .001). Mean differences between the echogenicity estimates obtained with the RMT and FHT methods were .87 grayscale levels (95% CI [.54–1.21], p < .0001) using data obtained with both programs. The SEM for Photoshop was .97 and 1.05 grayscale levels when using the RMT and FHT ROI selection
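    As a minimal sketch of the grayscale-histogram step (not a reimplementation of the Photoshop or ImageJ workflows), the mean echogenicity of a rectangular ROI in an 8-bit image can be computed as follows; the image and ROI coordinates are synthetic placeholders.

```python
import numpy as np

# Stand-in 8-bit B-mode image and an assumed rectangular region of interest.
rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(480, 640), dtype=np.uint8)

y0, y1, x0, x1 = 200, 300, 250, 400   # assumed ROI coordinates
roi = image[y0:y1, x0:x1]

mean_grayscale = roi.mean()                       # echogenicity estimate (0-255)
hist, _ = np.histogram(roi, bins=256, range=(0, 256))  # grayscale histogram

print(f"mean grayscale level in ROI: {mean_grayscale:.1f}")
```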

  20. Reliable Viscosity Calculation from Equilibrium Molecular Dynamics Simulations: A Time Decomposition Method.

    Science.gov (United States)

    Zhang, Yong; Otani, Akihito; Maginn, Edward J

    2015-08-11

    Equilibrium molecular dynamics is often used in conjunction with a Green-Kubo integral of the pressure tensor autocorrelation function to compute the shear viscosity of fluids. This approach is computationally expensive and is subject to a large amount of variability because the plateau region of the Green-Kubo integral is difficult to identify unambiguously. Here, we propose a time decomposition approach for computing the shear viscosity using the Green-Kubo formalism. Instead of one long trajectory, multiple independent trajectories are run and the Green-Kubo relation is applied to each trajectory. The averaged running integral as a function of time is fit to a double-exponential function with a weighting function derived from the standard deviation of the running integrals. Such a weighting function minimizes the uncertainty of the estimated shear viscosity and provides an objective means of estimating the viscosity. While the formal Green-Kubo integral requires an integration to infinite time, we suggest an integration cutoff time tcut, which can be determined by the relative values of the running integral and the corresponding standard deviation. This approach for computing the shear viscosity can be easily automated and used in computational screening studies where human judgment and intervention in the data analysis are impractical. The method has been applied to the calculation of the shear viscosity of a relatively low-viscosity liquid, ethanol, and relatively high-viscosity ionic liquid, 1-n-butyl-3-methylimidazolium bis(trifluoromethane-sulfonyl)imide ([BMIM][Tf2N]), over a range of temperatures. These test cases show that the method is robust and yields reproducible and reliable shear viscosity values.
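    The averaging-and-weighted-fit step described above can be sketched as follows. The running integrals here are synthetic stand-ins rather than MD data, and the particular double-exponential form and weighting are assumptions modeled on the description in the abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

def double_exp(t, A, alpha, tau1, tau2):
    # Assumed double-exponential form for the running Green-Kubo integral.
    return A * (alpha * tau1 * (1 - np.exp(-t / tau1))
                + (1 - alpha) * tau2 * (1 - np.exp(-t / tau2)))

t = np.linspace(0.01, 50.0, 500)                       # time, arbitrary units
rng = np.random.default_rng(1)
true = double_exp(t, A=1.0, alpha=0.7, tau1=2.0, tau2=15.0)
runs = np.array([true + rng.normal(0, 0.02 * np.sqrt(t / t[-1]) + 0.005, t.size)
                 for _ in range(10)])                  # ten independent "trajectories"

mean_integral = runs.mean(axis=0)
std_integral = runs.std(axis=0, ddof=1)

# Weight the fit by the spread across trajectories, as in the time-decomposition idea.
popt, _ = curve_fit(double_exp, t, mean_integral,
                    p0=(1.0, 0.5, 1.0, 10.0),
                    sigma=std_integral, absolute_sigma=True, maxfev=20000)
A, alpha, tau1, tau2 = popt
plateau = A * (alpha * tau1 + (1 - alpha) * tau2)      # long-time limit of the fit
print("estimated long-time plateau (viscosity in these units):", plateau)
```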

  1. Development of a Method for Quantifying the Reliability of Nuclear Safety-Related Software

    Energy Technology Data Exchange (ETDEWEB)

    Yi Zhang; Michael W. Golay

    2003-10-01

    The work of our project is intended to help introduce digital technologies into nuclear power plant safety-related software applications. In our project we utilize a combination of modern software engineering methods: design process discipline and feedback, formal methods, automated computer-aided software engineering tools, automatic code generation, and extensive feasible structure flow path testing to improve software quality. The tactics include ensuring that the software structure is kept simple, permitting routine testing during design development, permitting extensive finished-product testing in the input data space of most likely service, and using test-based Bayesian updating to estimate the probability that a random software input will encounter an error upon execution. From the results obtained, the software reliability can be both improved and its value estimated. Hopefully our success in the project's work can aid the transition of the nuclear enterprise into the modern information world. In our work, we have been using the proprietary sample software, the digital Signal Validation Algorithm (SVA), provided by Westinghouse, and our work is being done in collaboration with them. The SVA software is used for selecting the plant instrumentation signal set which is to be used as the input to the digital Plant Protection System (PPS). This is the system that automatically decides whether to trip the reactor. In our work, we are using the 001 computer-assisted software engineering (CASE) tool of Hamilton Technologies Inc. This tool is capable of stating the syntactic structure of a program reflecting its state requirements, logical functions and data structure.
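    In its simplest conjugate form, test-based Bayesian updating of the per-input failure probability can be sketched with a Beta prior updated by test outcomes. This is only an illustration of the general idea, not the project's actual procedure; the prior parameters and test counts below are hypothetical.

```python
from scipy.stats import beta

# Beta prior on the probability p that a random input triggers a failure, updated with
# the outcome of n tests containing k observed failures (hypothetical numbers).
a0, b0 = 1.0, 1.0            # uniform prior on p (assumed)
n_tests, k_failures = 5000, 0

a_post, b_post = a0 + k_failures, b0 + n_tests - k_failures
posterior = beta(a_post, b_post)

print("posterior mean of p:", posterior.mean())
print("95% upper credible bound on p:", posterior.ppf(0.95))
```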

  2. The juvenile face as a suitable age indicator in child pornography cases: a pilot study on the reliability of automated and visual estimation approaches.

    Science.gov (United States)

    Ratnayake, M; Obertová, Z; Dose, M; Gabriel, P; Bröker, H M; Brauckmann, M; Barkus, A; Rizgeliene, R; Tutkuviene, J; Ritz-Timme, S; Marasciuolo, L; Gibelli, D; Cattaneo, C

    2014-09-01

    In cases of suspected child pornography, the age of the victim represents a crucial factor for legal prosecution. The conventional methods for age estimation provide unreliable age estimates, particularly if teenage victims are concerned. In this pilot study, the potential of age estimation for screening purposes is explored for juvenile faces. In addition to a visual approach, an automated procedure is introduced, which has the ability to rapidly scan through large numbers of suspicious image data in order to trace juvenile faces. Age estimations were performed by experts, non-experts and the Demonstrator of a developed software on frontal facial images of 50 females aged 10-19 years from Germany, Italy, and Lithuania. To test the accuracy, the mean absolute error (MAE) between the estimates and the real ages was calculated for each examiner and the Demonstrator. The Demonstrator achieved the lowest MAE (1.47 years) for the 50 test images. Decreased image quality had no significant impact on the performance and classification results. The experts delivered slightly less accurate MAE (1.63 years). Throughout the tested age range, both the manual and the automated approach led to reliable age estimates within the limits of natural biological variability. The visual analysis of the face produces reasonably accurate age estimates up to the age of 18 years, which is the legally relevant age threshold for victims in cases of pedo-pornography. This approach can be applied in conjunction with the conventional methods for a preliminary age estimation of juveniles depicted on images.

  3. Novel method for quantitative estimation of biofilms

    DEFF Research Database (Denmark)

    Syal, Kirtimaan

    2017-01-01

    Biofilm protects bacteria from stress and hostile environment. Crystal violet (CV) assay is the most popular method for biofilm determination adopted by different laboratories so far. However, biofilm layer formed at the liquid-air interphase known as pellicle is extremely sensitive to its washing...... and staining steps. Early phase biofilms are also prone to damage by the latter steps. In bacteria like mycobacteria, biofilm formation occurs largely at the liquid-air interphase which is susceptible to loss. In the proposed protocol, loss of such biofilm layer was prevented. In place of inverting...... and discarding the media which can lead to the loss of the aerobic biofilm layer in CV assay, media was removed from the formed biofilm with the help of a syringe and biofilm layer was allowed to dry. The staining and washing steps were avoided, and an organic solvent-tetrahydrofuran (THF) was deployed...

  4. An Estimation of Human Error Probability of Filtered Containment Venting System Using Dynamic HRA Method

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seunghyun; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2016-10-15

    The human failure events (HFEs) are considered in the development of system fault trees as well as accident sequence event trees as part of Probabilistic Safety Assessment (PSA). As methods for analyzing human error, several techniques, such as the Technique for Human Error Rate Prediction (THERP), Human Cognitive Reliability (HCR), and Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H), are used, and new methods for human reliability analysis (HRA) are currently under development. This paper presents a dynamic HRA method for assessing human failure events, and the estimation of the human error probability for the filtered containment venting system (FCVS) is performed. The action associated with implementation of containment venting during a station blackout sequence is used as an example. In this report, the dynamic HRA method was used to analyze the FCVS-related operator action. The distributions of the required time and the available time were developed by the MAAP code and LHS sampling. Though the numerical calculations given here are only for illustrative purposes, the dynamic HRA method can be a useful tool for estimating human error probabilities, and it can be applied to any kind of operator action, including the severe accident management strategy.
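    A simple time-reliability sketch of this kind of calculation takes the human error probability as the probability that the required time exceeds the available time. The lognormal distributions and their parameters below are assumptions for illustration; in the study the distributions were built from MAAP runs and LHS sampling.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

# Assumed lognormal distributions for the time required to complete the venting action
# and the time available before the action is no longer effective (minutes).
t_required = rng.lognormal(mean=np.log(30.0), sigma=0.4, size=n)
t_available = rng.lognormal(mean=np.log(60.0), sigma=0.3, size=n)

hep = np.mean(t_required > t_available)   # HEP = P(required time > available time)
print(f"estimated HEP: {hep:.2e}")
```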

  5. Validation and reliability of the sex estimation of the human os coxae using freely available DSP2 software for bioarchaeology and forensic anthropology.

    Science.gov (United States)

    Brůžek, Jaroslav; Santos, Frédéric; Dutailly, Bruno; Murail, Pascal; Cunha, Eugenia

    2017-10-01

    A new tool for skeletal sex estimation based on measurements of the human os coxae is presented using skeletons from a metapopulation of identified adult individuals from twelve independent population samples. For reliable sex estimation, a posterior probability greater than 0.95 was considered to be the classification threshold: below this value, estimates are considered indeterminate. By providing free software, we aim to develop an even more disseminated method for sex estimation. Ten metric variables collected from 2,040 ossa coxa of adult subjects of known sex were recorded between 1986 and 2002 (reference sample). To test both the validity and reliability, a target sample consisting of two series of adult ossa coxa of known sex (n = 623) was used. The DSP2 software (Diagnose Sexuelle Probabiliste v2) is based on Linear Discriminant Analysis, and the posterior probabilities are calculated using an R script. For the reference sample, any combination of four dimensions provides a correct sex estimate in at least 99% of cases. The percentage of individuals for whom sex can be estimated depends on the number of dimensions; for all ten variables it is higher than 90%. Those results are confirmed in the target sample. Our posterior probability threshold of 0.95 for sex estimate corresponds to the traditional sectioning point used in osteological studies. DSP2 software is replacing the former version that should not be used anymore. DSP2 is a robust and reliable technique for sexing adult os coxae, and is also user friendly. © 2017 Wiley Periodicals, Inc.
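    The decision rule described above (a linear discriminant with a 0.95 posterior-probability threshold and an "indeterminate" outcome otherwise) can be mimicked with scikit-learn as sketched below. The training data are synthetic stand-ins for the ten coxal measurements, not the DSP2 reference sample.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic stand-in for a sexed reference sample with ten measurements per individual.
rng = np.random.default_rng(0)
n_per_sex, n_vars = 500, 10
X = np.vstack([rng.normal(100.0, 5.0, (n_per_sex, n_vars)),    # "female" pattern (assumed)
               rng.normal(106.0, 5.0, (n_per_sex, n_vars))])   # "male" pattern (assumed)
y = np.array(["F"] * n_per_sex + ["M"] * n_per_sex)

lda = LinearDiscriminantAnalysis().fit(X, y)

def estimate_sex(measurements, threshold=0.95):
    # Return a sex estimate only when the posterior probability clears the threshold.
    proba = lda.predict_proba(measurements.reshape(1, -1))[0]
    best = proba.argmax()
    return lda.classes_[best] if proba[best] >= threshold else "indeterminate"

print(estimate_sex(rng.normal(101.0, 5.0, n_vars)))
```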

  6. Bayesian reliability analysis for non-periodic inspection with estimation of uncertain parameters; Bayesian shinraisei kaiseki wo tekiyoshita hiteiki kozo kensa ni kansuru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    Itagaki, H. [Yokohama National University, Yokohama (Japan). Faculty of Engineering; Asada, H.; Ito, S. [National Aerospace Laboratory, Tokyo (Japan); Shinozuka, M.

    1996-12-31

    Risk assessed structural positions in a pressurized fuselage of a transport-type aircraft applied with damage tolerance design are taken up as the subject of discussion. A small number of data obtained from inspections on the positions was used to discuss the Bayesian reliability analysis that can estimate also a proper non-periodic inspection schedule, while estimating proper values for uncertain factors. As a result, time period of generating fatigue cracks was determined according to procedure of detailed visual inspections. The analysis method was found capable of estimating values that are thought reasonable and the proper inspection schedule using these values, in spite of placing the fatigue crack progress expression in a very simple form and estimating both factors as the uncertain factors. Thus, the present analysis method was verified of its effectiveness. This study has discussed at the same time the structural positions, modeling of fatigue cracks generated and develop in the positions, conditions for destruction, damage factors, and capability of the inspection from different viewpoints. This reliability analysis method is thought effective also on such other structures as offshore structures. 18 refs., 8 figs., 1 tab.

  7. Bayesian reliability analysis for non-periodic inspection with estimation of uncertain parameters; Bayesian shinraisei kaiseki wo tekiyoshita hiteiki kozo kensa ni kansuru kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    Itagaki, H [Yokohama National University, Yokohama (Japan). Faculty of Engineering; Asada, H; Ito, S [National Aerospace Laboratory, Tokyo (Japan); Shinozuka, M

    1997-12-31

    Risk assessed structural positions in a pressurized fuselage of a transport-type aircraft applied with damage tolerance design are taken up as the subject of discussion. A small number of data obtained from inspections on the positions was used to discuss the Bayesian reliability analysis that can estimate also a proper non-periodic inspection schedule, while estimating proper values for uncertain factors. As a result, time period of generating fatigue cracks was determined according to procedure of detailed visual inspections. The analysis method was found capable of estimating values that are thought reasonable and the proper inspection schedule using these values, in spite of placing the fatigue crack progress expression in a very simple form and estimating both factors as the uncertain factors. Thus, the present analysis method was verified of its effectiveness. This study has discussed at the same time the structural positions, modeling of fatigue cracks generated and develop in the positions, conditions for destruction, damage factors, and capability of the inspection from different viewpoints. This reliability analysis method is thought effective also on such other structures as offshore structures. 18 refs., 8 figs., 1 tab.

  8. Novel Method for 5G Systems NLOS Channels Parameter Estimation

    Directory of Open Access Journals (Sweden)

    Vladeta Milenkovic

    2017-01-01

    Full Text Available For the development of new 5G systems to operate in mm-wave bands, there is a need for accurate radio propagation modelling at these bands. In this paper a novel approach for NLOS channel parameter estimation is presented. The estimation is performed based on the level crossing rate (LCR) performance measure, which enables us to estimate propagation parameters in real time and to avoid the weaknesses of ML and moment-method estimation approaches.

  9. Residual-based a posteriori error estimation for multipoint flux mixed finite element methods

    KAUST Repository

    Du, Shaohong; Sun, Shuyu; Xie, Xiaoping

    2015-01-01

    A novel residual-type a posteriori error analysis technique is developed for multipoint flux mixed finite element methods for flow in porous media in two or three space dimensions. The derived a posteriori error estimator for the velocity and pressure error in L-norm consists of discretization and quadrature indicators, and is shown to be reliable and efficient. The main tools of analysis are a locally postprocessed approximation to the pressure solution of an auxiliary problem and a quadrature error estimate. Numerical experiments are presented to illustrate the competitive behavior of the estimator.

  10. Residual-based a posteriori error estimation for multipoint flux mixed finite element methods

    KAUST Repository

    Du, Shaohong

    2015-10-26

    A novel residual-type a posteriori error analysis technique is developed for multipoint flux mixed finite element methods for flow in porous media in two or three space dimensions. The derived a posteriori error estimator for the velocity and pressure error in L-norm consists of discretization and quadrature indicators, and is shown to be reliable and efficient. The main tools of analysis are a locally postprocessed approximation to the pressure solution of an auxiliary problem and a quadrature error estimate. Numerical experiments are presented to illustrate the competitive behavior of the estimator.

  11. VHTRC experiment for verification test of H∞ reactivity estimation method

    International Nuclear Information System (INIS)

    Fujii, Yoshio; Suzuki, Katsuo; Akino, Fujiyoshi; Yamane, Tsuyoshi; Fujisaki, Shingo; Takeuchi, Motoyoshi; Ono, Toshihiko

    1996-02-01

    This experiment was performed at the VHTRC to acquire data for verifying the H∞ reactivity estimation method. In this report, the experimental method, the measuring circuits and the data processing software are described in detail. (author)

  12. A Novel Reliability Enhanced Handoff Method in Future Wireless Heterogeneous Networks

    Directory of Open Access Journals (Sweden)

    Wang YuPeng

    2016-01-01

    Full Text Available As demand increases, future networks will follow the trends of network variety and service flexibility, which require heterogeneous network deployment and reliable communication methods. In practice, most communication failures happen because of bad radio link quality; in particular, high-speed users suffer severely from radio link failures, which cause communication interruption and require radio link recovery. To make communication more reliable, especially for high-mobility users, we propose a novel handoff mechanism to reduce the occurrence of service interruption. Based on computer simulation, we find that the service reliability is greatly improved.

  13. Method for Estimating the Parameters of LFM Radar Signal

    Directory of Open Access Journals (Sweden)

    Tan Chuan-Zhang

    2017-01-01

    Full Text Available In order to obtain reliable estimates of the parameters, it is very important to preserve the integrity of the linear frequency modulation (LFM) signal. Therefore, in practical LFM radar signal processing, the length of the data frame is often greater than the pulse width (PW) of the signal. In this condition, estimating the parameters by the fractional Fourier transform (FrFT) will cause the signal-to-noise ratio (SNR) to decrease. To address this problem, we multiply the data frame by a Gaussian window to improve the SNR. In addition, for a further improvement in parameter estimation precision, a novel algorithm is derived via a Lagrange interpolation polynomial, and we enhance the algorithm by a logarithmic transformation. Simulation results demonstrate that the derived algorithm significantly reduces the estimation errors of the chirp rate and the initial frequency.

  14. Reliability analysis for thermal cutting method based non-explosive separation device

    International Nuclear Information System (INIS)

    Choi, Jun Woo; Hwang, Kuk Ha; Kim, Byung Kyu

    2016-01-01

    In order to increase the reliability of a separation device for a small satellite, a new non-explosive separation device is invented. This device is activated using a thermal cutting method with a Ni-Cr wire. A reliability analysis is carried out for the proposed non-explosive separation device by applying the Fault tree analysis (FTA) method. In the FTA results for the separation device, only ten single-point failure modes are found. The reliability modeling and analysis for the device are performed considering failure of the power supply, failure of the Ni-Cr wire to burn through and unwind, failure of the holder to separate, failure of the balls to separate, and failure of the pin to release. Ultimately, the reliability of the proposed device is calculated as 0.999989 with five Ni-Cr wire coils.

  15. Reliability analysis for thermal cutting method based non-explosive separation device

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jun Woo; Hwang, Kuk Ha; Kim, Byung Kyu [Korea Aerospace University, Goyang (Korea, Republic of)

    2016-12-15

    In order to increase the reliability of a separation device for a small satellite, a new non-explosive separation device is invented. This device is activated using a thermal cutting method with a Ni-Cr wire. A reliability analysis is carried out for the proposed non-explosive separation device by applying the Fault tree analysis (FTA) method. In the FTA results for the separation device, only ten single-point failure modes are found. The reliability modeling and analysis for the device are performed considering failure of the power supply, failure of the Ni-Cr wire to burn through and unwind, failure of the holder to separate, failure of the balls to separate, and failure of the pin to release. Ultimately, the reliability of the proposed device is calculated as 0.999989 with five Ni-Cr wire coils.
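    Purely as an illustration of how redundant ignition coils enter such a calculation (and not a reconstruction of the paper's fault tree), fully redundant, independent coils would combine as 1 - (1 - p)^n; the single-coil reliability below is a made-up number.

```python
# Illustrative only: redundancy of n independent Ni-Cr coils under an independence
# assumption. The single-coil reliability is hypothetical.
p_single_coil = 0.98
n_coils = 5
p_actuation = 1.0 - (1.0 - p_single_coil) ** n_coils
print(f"actuation reliability with {n_coils} redundant coils: {p_actuation:.9f}")
```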

  16. Carbon footprint: current methods of estimation.

    Science.gov (United States)

    Pandey, Divya; Agrawal, Madhoolika; Pandey, Jai Shanker

    2011-07-01

    Increasing greenhouse gas concentrations in the atmosphere are perturbing the environment, causing grievous global warming and its associated consequences. Following the rule that only what is measurable is manageable, measurement of the greenhouse gas intensiveness of different products, bodies, and processes is going on worldwide, expressed as their carbon footprints. The methodologies for carbon footprint calculations are still evolving, and the carbon footprint is emerging as an important tool for greenhouse gas management. The concept of carbon footprinting has permeated and is being commercialized in all areas of life and the economy, but there is little coherence in the definitions and calculations of carbon footprints among studies. There are disagreements in the selection of gases and the order of emissions to be covered in footprint calculations. Standards of greenhouse gas accounting are the common resources used in footprint calculations, although there is no mandatory provision for footprint verification. Since carbon footprinting is intended to be a tool to guide the relevant emission cuts and verifications, its standardization at the international level is necessary. The present review describes the prevailing carbon footprinting methods and raises the related issues.

  17. THE METHODS FOR ESTIMATING REGIONAL PROFESSIONAL MOBILE RADIO MARKET POTENTIAL

    Directory of Open Access Journals (Sweden)

    Y.À. Korobeynikov

    2008-12-01

    Full Text Available The paper presents the author's methods of estimating the regional professional mobile radio market potential, which belongs to high-tech B2B markets. These methods take into consideration such market peculiarities as the great range and complexity of products, technological constraints, and the infrastructure development required for the operation of the technological systems. The paper gives an estimate of the professional mobile radio potential in the Perm region. This estimate is already used by one of the system integrators for its strategy development.

  18. An attempt to use FMEA method for an approximate reliability assessment of machinery

    Directory of Open Access Journals (Sweden)

    Przystupa Krzysztof

    2017-01-01

    Full Text Available The paper presents a modified FMEA (Failure Mode and Effect Analysis) method to assess the reliability of the components that make up a type 2145 MAX Impactol TM driver wrench from the Ingersoll Rand Company. The case concerns the analysis of reliability in conditions where full service data are not known. The aim of the study is to determine the weakest element in the design of the tool.

  19. Complex method to calculate objective assessments of information systems protection to improve expert assessments reliability

    Science.gov (United States)

    Abdenov, A. Zh; Trushin, V. A.; Abdenova, G. A.

    2018-01-01

    The paper considers the question of populating the relevant SIEM nodes with calculated objective assessments in order to improve the reliability of subjective expert assessments. The proposed methodology is necessary for the most accurate security risk assessment of information systems. The technique is also intended for establishing real-time operational information protection in enterprise information systems. Risk calculations are based on objective estimates of the probabilities of adverse events and on predictions of the magnitude of damage from information security violations. Calculations of objective assessments are necessary to increase the reliability of the proposed expert assessments.

  20. Method for assessing software reliability of the document management system using the RFID technology

    Directory of Open Access Journals (Sweden)

    Kiedrowicz Maciej

    2016-01-01

    Full Text Available The deliberations presented in this study refer to a method for assessing the software reliability of a document management system using RFID technology. A method for determining the reliability structure of the discussed software, understood as the index vector for assessing the reliability of its components, is proposed. The model of the analyzed software is the control transfer graph, in which the probability of activating individual components during the system's operation results from the so-called operational profile, which characterizes the actual working environment. The reliability structure is established as a result of the solution of a specific mathematical software task. Knowledge of the reliability structure of the software makes it possible to properly plan the time and financial expenses necessary to build software that meets the reliability requirements. The application of the presented method is illustrated by a numerical example corresponding to the software reality of the RFID document management system.
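    A toy illustration of how operational-profile activation probabilities can enter a reliability figure is sketched below. This is not the paper's control-transfer-graph solution: it assumes independence and a series structure in which a component contributes to failure only when activated, and all component names and numbers are hypothetical.

```python
# Hypothetical components with activation probabilities (from an assumed operational
# profile) and per-activation reliabilities.
components = {
    "tag_reader":    {"p_activate": 0.9, "reliability": 0.9990},
    "db_gateway":    {"p_activate": 1.0, "reliability": 0.9995},
    "crypto_module": {"p_activate": 0.4, "reliability": 0.9970},
    "report_module": {"p_activate": 0.2, "reliability": 0.9900},
}

# Series assumption: the transaction fails if any activated component fails.
system_reliability = 1.0
for name, c in components.items():
    system_reliability *= 1.0 - c["p_activate"] * (1.0 - c["reliability"])

print(f"per-transaction system reliability: {system_reliability:.5f}")
```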

  1. TCS: a new multiple sequence alignment reliability measure to estimate alignment accuracy and improve phylogenetic tree reconstruction.

    Science.gov (United States)

    Chang, Jia-Ming; Di Tommaso, Paolo; Notredame, Cedric

    2014-06-01

    Multiple sequence alignment (MSA) is a key modeling procedure when analyzing biological sequences. Homology and evolutionary modeling are the most common applications of MSAs. Both are known to be sensitive to the underlying MSA accuracy. In this work, we show how this problem can be partly overcome using the transitive consistency score (TCS), an extended version of the T-Coffee scoring scheme. Using this local evaluation function, we show that one can identify the most reliable portions of an MSA, as judged from BAliBASE and PREFAB structure-based reference alignments. We also show how this measure can be used to improve phylogenetic tree reconstruction using both an established simulated data set and a novel empirical yeast data set. For this purpose, we describe a novel lossless alternative to site filtering that involves overweighting the trustworthy columns. Our approach relies on the T-Coffee framework; it uses libraries of pairwise alignments to evaluate any third party MSA. Pairwise projections can be produced using fast or slow methods, thus allowing a trade-off between speed and accuracy. We compared TCS with Heads-or-Tails, GUIDANCE, Gblocks, and trimAl and found it to lead to significantly better estimates of structural accuracy and more accurate phylogenetic trees. The software is available from www.tcoffee.org/Projects/tcs. © The Author 2014. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  2. A Comparative Study of Distribution System Parameter Estimation Methods

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Yannan; Williams, Tess L.; Gourisetti, Sri Nikhil Gup

    2016-07-17

    In this paper, we compare two parameter estimation methods for distribution systems: residual sensitivity analysis and state-vector augmentation with a Kalman filter. These two methods were originally proposed for transmission systems, and are still the most commonly used methods for parameter estimation. Distribution systems have much lower measurement redundancy than transmission systems. Therefore, estimating parameters is much more difficult. To increase the robustness of parameter estimation, the two methods are applied with combined measurement snapshots (measurement sets taken at different points in time), so that the redundancy for computing the parameter values is increased. The advantages and disadvantages of both methods are discussed. The results of this paper show that state-vector augmentation is a better approach for parameter estimation in distribution systems. Simulation studies are done on a modified version of IEEE 13-Node Test Feeder with varying levels of measurement noise and non-zero error in the other system model parameters.

  3. A Fast LMMSE Channel Estimation Method for OFDM Systems

    Directory of Open Access Journals (Sweden)

    Zhou Wen

    2009-01-01

    Full Text Available A fast linear minimum mean square error (LMMSE) channel estimation method has been proposed for Orthogonal Frequency Division Multiplexing (OFDM) systems. In comparison with the conventional LMMSE channel estimation, the proposed channel estimation method does not require statistical knowledge of the channel in advance and avoids the inverse operation of a large-dimension matrix by using the fast Fourier transform (FFT) operation. Therefore, the computational complexity can be reduced significantly. The normalized mean square errors (NMSEs) of the proposed method and the conventional LMMSE estimation have been derived. Numerical results show that the NMSE of the proposed method is very close to that of the conventional LMMSE method, which is also verified by computer simulation. In addition, computer simulation shows that the performance of the proposed method is almost the same as that of the conventional LMMSE method in terms of bit error rate (BER).
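    For reference, the conventional LMMSE channel estimate that such low-complexity methods approximate is commonly written in the following textbook form; the notation is assumed here rather than taken from the paper.

```latex
% Conventional LMMSE channel estimate built from the least-squares estimate:
\[
  \hat{\mathbf{h}}_{\mathrm{LMMSE}}
  \;=\;
  \mathbf{R}_{hh}\left(\mathbf{R}_{hh} + \frac{\beta}{\mathrm{SNR}}\,\mathbf{I}\right)^{-1}
  \hat{\mathbf{h}}_{\mathrm{LS}},
\]
% where $\mathbf{R}_{hh}$ is the channel autocorrelation matrix, $\hat{\mathbf{h}}_{\mathrm{LS}}$
% the least-squares estimate, and $\beta$ a constellation-dependent constant; the matrix
% inversion in this expression is the step that the FFT-based method avoids.
```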

  4. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    Science.gov (United States)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

    In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components has been proposed. A practical and efficient method for reliability analysis and sensitivity analysis of complex components with arbitrary distribution parameters is investigated by using the perturbation method, the response surface method, the Edgeworth series and the sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of the CMC components. By comparing with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability-analysis based finite element modeling engineering practice.

  5. A generic method for assignment of reliability scores applied to solvent accessibility predictions

    DEFF Research Database (Denmark)

    Petersen, Bent; Petersen, Thomas Nordahl; Andersen, Pernille

    2009-01-01

    : The performance of the neural networks was evaluated on a commonly used set of sequences known as the CB513 set. An overall Pearson's correlation coefficient of 0.72 was obtained, which is comparable to the performance of the currently best public available method, Real-SPINE. Both methods associate a reliability...... comparing the Pearson's correlation coefficient for the upper 20% of predictions sorted according to reliability. For this subset, values of 0.79 and 0.74 are obtained using our and the compared method, respectively. This tendency is true for any selected subset....

  6. Estimating Value of Congestion and of Reliability from Observation of Route Choice Behavior of Car Drivers

    DEFF Research Database (Denmark)

    Prato, Carlo Giacomo; Rasmussen, Thomas Kjær; Nielsen, Otto Anker

    2014-01-01

    In recent years, a consensus has been reached about the relevance of calculating the value of congestion and the value of reliability for better understanding and therefore better prediction of travel behavior. The current study proposed a revealed preference approach that used a large amount...... both congestion and reliability terms. Results illustrated that the value of time and the value of congestion were significantly higher in the peak period because of possible higher penalties for drivers being late and consequently possible higher time pressure. Moreover, results showed...... that the marginal rate of substitution between travel time reliability and total travel time did not vary across periods and traffic conditions, with the obvious caveat that the absolute values were significantly higher for the peak period. Last, results showed the immense potential of exploiting the growing...

  7. Power System Real-Time Monitoring by Using PMU-Based Robust State Estimation Method

    DEFF Research Database (Denmark)

    Zhao, Junbo; Zhang, Gexiang; Das, Kaushik

    2016-01-01

    Accurate real-time states provided by the state estimator are critical for reliable power system operation and control. This paper proposes a novel phasor measurement unit (PMU)-based robust state estimation method (PRSEM) for real-time monitoring of a power system under different operation conditions. A ......-based bad data (BD) detection method, which can handle the smearing effect and critical measurement errors, is presented. We evaluate PRSEM by using IEEE benchmark test systems and a realistic utility system. The numerical results indicate that, in short computation time, PRSEM can effectively track the system real-time states with good robustness and can address several kinds of BD.

  8. A method to evaluate performance reliability of individual subjects in laboratory research applied to work settings.

    Science.gov (United States)

    1978-10-01

    This report presents a method that may be used to evaluate the reliability of performance of individual subjects, particularly in applied laboratory research. The method is based on analysis of variance of a tasks-by-subjects data matrix, with all sc...

  9. Joint Pitch and DOA Estimation Using the ESPRIT method

    DEFF Research Database (Denmark)

    Wu, Yuntao; Amir, Leshem; Jensen, Jesper Rindom

    2015-01-01

    In this paper, the problem of joint multi-pitch and direction-of-arrival (DOA) estimation for multi-channel harmonic sinusoidal signals is considered. A spatio-temporal matrix signal model for a uniform linear array is defined, and then the ESPRIT method based on subspace techniques, which exploits...... the invariance property in the time domain, is first used to estimate the multi-pitch frequencies of multiple harmonic signals. Based on the estimated pitch frequencies, the DOA estimates obtained with the ESPRIT method are also presented, using the shift-invariance structure in the spatial domain. Compared...... to the existing state-of-the-art algorithms, the proposed method based on ESPRIT without 2-D searching is computationally more efficient but performs similarly. An asymptotic performance analysis of the DOA and pitch estimation of the proposed method is also presented. Finally, the effectiveness of the proposed...
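
    The shift-invariance idea behind ESPRIT can be illustrated in the simplest single-channel, frequency-only setting: the signal subspace of a Hankel data matrix is shifted by one sample, and the eigenvalues of the resulting rotation give the sinusoid frequencies. The sketch below uses assumed values (a three-component harmonic signal at 8 kHz sampling) and does not reproduce the paper's joint spatio-temporal pitch/DOA estimator.

        # Minimal sketch (illustrative, single-channel, frequency-only): estimating
        # sinusoid frequencies with ESPRIT via the shift-invariance of the signal
        # subspace. The paper applies the same principle jointly in time (pitch)
        # and space (DOA); only the temporal part is sketched here.
        import numpy as np

        rng = np.random.default_rng(2)
        fs, n = 8000.0, 512
        freqs_true = np.array([440.0, 880.0, 1320.0])        # a harmonic set (Hz), assumed
        t = np.arange(n) / fs
        x = sum(np.cos(2 * np.pi * f * t) for f in freqs_true)
        x = x + 0.05 * rng.normal(size=n)

        # Build a Hankel data matrix and take its dominant left singular vectors.
        m = 64                                               # subvector length
        X = np.column_stack([x[i:i + m] for i in range(n - m + 1)])
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        p = 2 * len(freqs_true)                              # real sinusoids -> 2 poles each
        Us = U[:, :p]

        # Shift invariance: Us[1:] is approximately Us[:-1] @ Phi; the eigenvalues
        # of Phi are exp(+-j*2*pi*f/fs).
        Phi, *_ = np.linalg.lstsq(Us[:-1, :], Us[1:, :], rcond=None)
        poles = np.linalg.eigvals(Phi)
        f_hat = np.sort(np.abs(np.angle(poles))) * fs / (2 * np.pi)
        print("estimated frequencies (Hz, in +/- pairs):", np.round(f_hat, 1))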

  10. Classification of Amazonian rosewood essential oil by Raman spectroscopy and PLS-DA with reliability estimation.

    Science.gov (United States)

    Almeida, Mariana R; Fidelis, Carlos H V; Barata, Lauro E S; Poppi, Ronei J

    2013-12-15

    The Amazon tree Aniba rosaeodora Ducke (rosewood) provides an essential oil valuable for the perfume industry, but after decades of predatory extraction it is at risk of extinction. Extraction of the essential oil from the wood implies cutting down the tree, so the study of oil extracted from the leaves is important as a sustainable alternative. The goal of this study was to test the applicability of Raman spectroscopy and Partial Least Squares Discriminant Analysis (PLS-DA) as a means to classify the essential oil extracted from different parts (wood, leaves, and branches) of the Brazilian tree A. rosaeodora. For the development of the classification models, the Raman spectra were split into two sets: training and test. The value of the limit that separates the classes was calculated based on the distribution of the training samples, in such a way that the classes are divided with the lowest probability of incorrect classification for future estimates. The best model presented sensitivity and specificity of 100%, and predictive accuracy and efficiency of 100%. These results give an overall vision of the behavior of the model but do not give information about individual samples; in this case, the classification confidence interval for each sample was also calculated using the bootstrap resampling technique. The methodology developed has the potential to be an alternative to the standard procedures used for oil analysis, and it can be employed as a screening method, since it is fast, non-destructive, and robust. © 2013 Elsevier B.V. All rights reserved.

  11. Estimating The Reliability of the Lawrence Livermore National Laboratory (LLNL) Flash X-ray (FXR) Machine

    International Nuclear Information System (INIS)

    Ong, M M; Kihara, R; Zentler, J M; Kreitzer, B R; DeHope, W J

    2007-01-01

    At Lawrence Livermore National Laboratory (LLNL), our flash X-ray accelerator (FXR) is used on multi-million dollar hydrodynamic experiments. Because of the importance of the radiographs, FXR must be ultra-reliable. Flash linear accelerators that can generate a 3 kA beam at 18 MeV are very complex. They have thousands, if not millions, of critical components that could prevent the machine from performing correctly. For the last five years, we have quantified and are tracking component failures. From this data, we have determined that the reliability of the high-voltage gas-switches that initiate the pulses, which drive the accelerator cells, dominates the statistics. The failure mode is a single-switch pre-fire that reduces the energy of the beam and degrades the X-ray spot-size. The unfortunate result is a lower resolution radiograph. FXR is a production machine that allows only a modest number of pulses for testing. Therefore, reliability switch testing that requires thousands of shots is performed on our test stand. Study of representative switches has produced pre-fire statistical information and probability distribution curves. This information is applied to FXR to develop test procedures and determine individual switch reliability using a minimal number of accelerator pulses

  12. R&D program benefits estimation: DOE Office of Electricity Delivery and Energy Reliability

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2006-12-04

    The overall mission of the U.S. Department of Energy’s Office of Electricity Delivery and Energy Reliability (OE) is to lead national efforts to modernize the electric grid, enhance the security and reliability of the energy infrastructure, and facilitate recovery from disruptions to the energy supply. In support of this mission, OE conducts a portfolio of research and development (R&D) activities to advance technologies to enhance electric power delivery. Multiple benefits are anticipated to result from the deployment of these technologies, including higher quality and more reliable power, energy savings, and lower cost electricity. In addition, OE engages State and local government decision-makers and the private sector to address issues related to the reliability and security of the grid, including responding to national emergencies that affect energy delivery. The OE R&D activities are comprised of four R&D lines: High Temperature Superconductivity (HTS), Visualization and Controls (V&C), Energy Storage and Power Electronics (ES&PE), and Distributed Systems Integration (DSI).

  13. Estimation of electromagnetic pumps reliability based on the results of their exploitation

    International Nuclear Information System (INIS)

    Vitkovskij, I.V.; Kirillov, I.R.; Chajka, P.Yu.; Kryuchkov, E.A.; Poplavskij, V.M.; Nosov, Yu.V.; Oshkanov, N.N.

    2007-01-01

    The main factors determining the service life of induction electromagnetic pumps (IEP) are analyzed. It is shown that IEP serviceability depends mainly on winding reliability. The main damaging factors acting on the windings are noted. Expressions for calculating the failure intensity of the coil and case insulation are obtained [ru]

  14. Attenuation of the Squared Canonical Correlation Coefficient under Varying Estimates of Score Reliability

    Science.gov (United States)

    Wilson, Celia M.

    2010-01-01

    Research pertaining to the distortion of the squared canonical correlation coefficient has traditionally been limited to the effects of sampling error and associated correction formulas. The purpose of this study was to compare the degree of attenuation of the squared canonical correlation coefficient under varying conditions of score reliability.…

  15. Reliability and validity of food portion size estimation from images using manual flexible digital virtual meshes

    Science.gov (United States)

    The eButton takes frontal images at 4 second intervals throughout the day. A three-dimensional (3D) manually administered wire mesh procedure has been developed to quantify portion sizes from the two-dimensional (2D) images. This paper reports a test of the interrater reliability and validity of use...

  16. The Evaluation Method of the Lightning Strike on Transmission Lines Aiming at Power Grid Reliability

    Science.gov (United States)

    Wen, Jianfeng; Wu, Jianwei; Huang, Liandong; Geng, Yinan; Yu, zhanqing

    2018-01-01

    Lightning protection of power systems has traditionally focused on reducing the flashover rate, distinguishing lines only by voltage level, without considering the functional differences between transmission lines or analyzing the effect of lightning strikes on power grid reliability. As a result, the lightning protection design of ordinary transmission lines tends to be excessive while remaining insufficient for key lines. To solve this problem, an analysis method for lightning strikes on transmission lines aimed at power grid reliability is given. Full-wave process theory is used to analyze lightning back striking, and a leader propagation model is used to describe the process of shielding failure of transmission lines. An index of power grid reliability is introduced, and the effect of transmission line faults on the reliability of the power system is discussed in detail.

  17. Investigation of Reliabilities of Bolt Distances for Bolted Structural Steel Connections by Monte Carlo Simulation Method

    Directory of Open Access Journals (Sweden)

    Ertekin Öztekin Öztekin

    2015-12-01

    Full Text Available The distances of bolts to each other and the distances of bolts to the edges of connection plates are designed based on the minimum and maximum boundary values proposed by structural codes. In this study, the reliabilities of those distances were investigated. For this purpose, loading types, bolt types, and plate thicknesses were taken as variable parameters. The Monte Carlo Simulation (MCS) method was used in the reliability computations performed for all combinations of those parameters. At the end of the study, the reliability index values for all of those distances are presented in graphs and tables. The results obtained from this study were compared with the values proposed by several structural codes, and some evaluations of those comparisons were made. Finally, it was emphasized that using the same bolt distances in both traditional designs and designs requiring a higher reliability level would be incorrect.
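
    The core of a Monte Carlo reliability computation can be sketched with a generic resistance-minus-load limit state: sample the random variables, count failures, and convert the failure probability into a reliability index. The distributions and numbers below are placeholders, not the bolt-distance limit states analyzed in the study.

        # Minimal sketch (illustrative values, not the paper's limit states): estimating
        # a failure probability and the corresponding reliability index by Monte Carlo
        # simulation for a simple limit state g = R - S.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(3)
        n = 1_000_000

        # Assumed lognormal resistance and normal load (means/CoVs are placeholders).
        R = rng.lognormal(mean=np.log(60.0), sigma=0.10, size=n)   # e.g. bearing capacity
        S = rng.normal(loc=40.0, scale=6.0, size=n)                # e.g. applied force

        pf = np.mean(R - S <= 0.0)          # probability of failure
        beta = -norm.ppf(pf)                # generalized reliability index
        print(f"P_f = {pf:.2e},  beta = {beta:.2f}")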

  18. An Intelligent Method for Structural Reliability Analysis Based on Response Surface

    Institute of Scientific and Technical Information of China (English)

    桂劲松; 刘红; 康海贵

    2004-01-01

    As water depth increases, the structural safety and reliability of a system become more and more important and challenging. Therefore, structural reliability methods must be applied in ocean engineering design, such as offshore platform design. If the performance function is known in structural reliability analysis, the first-order second-moment method is often used. If the performance function cannot be explicitly expressed, the response surface method is commonly used because its reasoning is clear and it is simple to program. However, the traditional response surface method fits a quadratic polynomial response surface, whose accuracy is limited because the true limit state surface can be fitted well only in the area near the checking point. In this paper, an intelligent computing method based on the whole response surface is proposed, which can be used when the performance function cannot be explicitly expressed in structural reliability analysis. In this method, a fuzzy neural network response surface covering the whole area is constructed first, and then the structural reliability is calculated by a genetic algorithm. In the proposed method, all the sample points for training the network come from the whole area, so the true limit state surface over the whole area can be fitted. Worked examples and comparative analysis show that the proposed method is much better than the traditional quadratic polynomial response surface method: the amount of finite element analysis is greatly reduced, the calculation accuracy is improved, and the true limit state surface can be fitted very well over the whole area. The method proposed in this paper is therefore suitable for engineering application.

  19. A Bayesian reliability evaluation method with integrated accelerated degradation testing and field information

    International Nuclear Information System (INIS)

    Wang, Lizhi; Pan, Rong; Li, Xiaoyang; Jiang, Tongmin

    2013-01-01

    Accelerated degradation testing (ADT) is a common approach in reliability prediction, especially for products with high reliability. However, the laboratory conditions of ADT often differ from the field conditions; thus, to predict field failures, one needs to calibrate the predictions made using ADT data. In this paper a Bayesian evaluation method is proposed to integrate ADT data from the laboratory with failure data from the field. Calibration factors are introduced to calibrate the difference between the lab and the field conditions so as to predict a product's actual field reliability more accurately. The information fusion and statistical inference procedure are carried out through a Bayesian approach and Markov chain Monte Carlo methods. The proposed method is demonstrated by two examples and a sensitivity analysis with respect to the prior distribution assumption.

  20. The Monte Carlo Simulation Method for System Reliability and Risk Analysis

    CERN Document Server

    Zio, Enrico

    2013-01-01

    Monte Carlo simulation is one of the best tools for performing realistic analysis of complex systems as it allows most of the limiting assumptions on system behavior to be relaxed. The Monte Carlo Simulation Method for System Reliability and Risk Analysis comprehensively illustrates the Monte Carlo simulation method and its application to reliability and system engineering. Readers are given a sound understanding of the fundamentals of Monte Carlo sampling and simulation and its application to realistic system modeling. Whilst many of the topics rely on a high-level understanding of calculus, probability and statistics, simple academic examples are provided in support of the explanation of the theoretical foundations to facilitate comprehension of the subject matter. Case studies are introduced to demonstrate the practical value of the most advanced techniques. This detailed approach makes The Monte Carlo Simulation Method for System Reliability and Risk Analysis a key reference for senior undergra...

  1. Study on reliability analysis based on multilevel flow models and fault tree method

    International Nuclear Information System (INIS)

    Chen Qiang; Yang Ming

    2014-01-01

    Multilevel flow models (MFM) and the fault tree method describe system knowledge in different forms, so the two methods express an equivalent logic of system reliability under the same boundary conditions and assumptions. Based on this, and combined with the characteristics of MFM, a method for mapping MFM to fault trees was put forward, providing a way to establish fault trees rapidly and to realize qualitative reliability analysis based on MFM. Taking the safety injection system of a pressurized water reactor nuclear power plant as an example, its MFM was established and its reliability was analyzed qualitatively. The analysis result shows that the logic of mapping MFM to fault trees is correct. The MFM is easily understood, created, and modified. Compared with traditional fault tree analysis, the workload is greatly reduced and modeling time is saved. (authors)
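
    Once an MFM has been mapped to a fault tree, the tree can be evaluated directly; the sketch below shows the basic gate logic for independent basic events. The tree, event names, and probabilities are hypothetical and are not the safety injection system model from the paper.

        # Minimal sketch (hypothetical events and probabilities): fault-tree logic
        # evaluated quantitatively for independent basic events, with OR gates
        # combining as 1 - prod(1 - p) and AND gates as prod(p).
        from math import prod

        def evaluate(gate):
            """Recursively evaluate a node: ('AND'|'OR', children) or a probability."""
            if isinstance(gate, float):
                return gate
            kind, children = gate
            probs = [evaluate(c) for c in children]
            if kind == "AND":
                return prod(probs)
            if kind == "OR":
                return 1.0 - prod(1.0 - p for p in probs)
            raise ValueError(f"unknown gate type: {kind}")

        # Top event: loss of injection flow = (pump A fails AND pump B fails) OR valve stuck.
        tree = ("OR", [
            ("AND", [1e-3,    # pump A fails to start (assumed)
                     1e-3]),  # pump B fails to start (assumed)
            5e-5,             # common discharge valve stuck closed (assumed)
        ])
        print(f"top event probability = {evaluate(tree):.2e}")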

  2. Reverse survival method of fertility estimation: An evaluation

    Directory of Open Access Journals (Sweden)

    Thomas Spoorenberg

    2014-07-01

    Full Text Available Background: For the most part, demographers have relied on the ever-growing body of sample surveys collecting full birth history to derive total fertility estimates in less statistically developed countries. Yet alternative methods of fertility estimation can return very consistent total fertility estimates by using only basic demographic information. Objective: This paper evaluates the consistency and sensitivity of the reverse survival method -- a fertility estimation method based on population data by age and sex collected in one census or a single-round survey. Methods: A simulated population was first projected over 15 years using a set of fertility and mortality age and sex patterns. The projected population was then reverse survived using the Excel template FE_reverse_4.xlsx, provided with Timæus and Moultrie (2012). Reverse survival fertility estimates were then compared for consistency with the total fertility rates used to project the population. The sensitivity was assessed by introducing a series of distortions in the projection of the population and comparing the difference implied in the resulting fertility estimates. Results: The reverse survival method produces total fertility estimates that are very consistent and hardly affected by erroneous assumptions on the age distribution of fertility or by the use of incorrect mortality levels, trends, and age patterns. The quality of the age and sex population data that is 'reverse survived' determines the consistency of the estimates. The contribution of the method to the estimation of past and present trends in total fertility is illustrated through its application to the population data of five countries characterized by distinct fertility levels and data quality issues. Conclusions: Notwithstanding its simplicity, the reverse survival method of fertility estimation has seldom been applied. The method can be applied to a large body of existing and easily available population data
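
    The arithmetic at the heart of the reverse survival method is simple enough to sketch: children enumerated at age x are divided by the probability of surviving from birth to age x to recover the number of births x years before the census. All numbers below are hypothetical, and the sketch omits the reverse survival of women needed to turn births into fertility rates.

        # Minimal sketch (entirely hypothetical numbers): the core of the reverse
        # survival idea - children enumerated at age x in a census are "reverse
        # survived" back to their year of birth by dividing by an (assumed,
        # approximate) probability of surviving from birth to the census date.
        import numpy as np

        # Census counts of children by single year of age 0..4 (hypothetical).
        children = np.array([95_000, 93_500, 92_800, 91_600, 90_900], dtype=float)

        # Assumed survival probabilities from birth to the census date for each cohort.
        survival_to_census = np.array([0.962, 0.955, 0.951, 0.948, 0.945])

        births = children / survival_to_census       # births 0..4 years before census
        for age, b in enumerate(births):
            print(f"estimated births {age} year(s) before the census: {b:,.0f}")

        # Dividing these births by the (reverse-survived) number of women of
        # reproductive age in each year would yield the total fertility estimates.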

  3. Investigating the error sources of the online state of charge estimation methods for lithium-ion batteries in electric vehicles

    Science.gov (United States)

    Zheng, Yuejiu; Ouyang, Minggao; Han, Xuebing; Lu, Languang; Li, Jianqiu

    2018-02-01

    State of charge (SOC) estimation is generally acknowledged as one of the most important functions in the battery management system for lithium-ion batteries in new energy vehicles. Though every effort is made in the various online SOC estimation methods to reliably increase the estimation accuracy as much as possible within the limited on-chip resources, little of the literature discusses the error sources of those SOC estimation methods. This paper first reviews the commonly studied SOC estimation methods from a conventional classification. A novel perspective focusing on the error analysis of the SOC estimation methods is proposed. SOC estimation methods are analyzed from the point of view of the measured values, models, algorithms, and state parameters. Subsequently, error flow charts are proposed to analyze the error sources, from the signal measurement to the models and algorithms, for the widely used online SOC estimation methods in new energy vehicles. Finally, with consideration of the working conditions, choosing more reliable and applicable SOC estimation methods is discussed, and the future development of promising online SOC estimation methods is suggested.
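
    A concrete way to see how measurement and parameter errors propagate into SOC estimates is to run the simplest estimator, ampere-hour counting, with a biased current sensor and an incorrect capacity value. The sketch below does this with made-up numbers; it is only an illustration of two of the error sources the paper classifies, not one of the reviewed algorithms.

        # Minimal sketch (made-up numbers): ampere-hour (coulomb) counting, the
        # simplest online SOC estimator, with two error sources injected: a
        # current-sensor bias and an erroneous capacity value.
        import numpy as np

        dt = 1.0                                   # s
        t = np.arange(0, 3600, dt)                 # one hour of operation
        true_current = 10.0 + 2.0 * np.sin(2 * np.pi * t / 600)   # A, discharge
        capacity_true = 50.0 * 3600                # 50 Ah expressed in coulombs

        # True SOC by integrating the true current from a known initial SOC.
        soc_true = 1.0 - np.cumsum(true_current) * dt / capacity_true

        # Estimator sees a biased current and assumes an incorrect (faded) capacity.
        sensor_bias = 0.2                          # A offset (assumed)
        capacity_assumed = 48.0 * 3600             # estimator believes 48 Ah (assumed)
        measured_current = true_current + sensor_bias
        soc_est = 1.0 - np.cumsum(measured_current) * dt / capacity_assumed

        err = (soc_est[-1] - soc_true[-1]) * 100
        print(f"SOC error after 1 h: {err:.2f} percentage points")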

  4. Method for estimating off-axis pulse tube losses

    Science.gov (United States)

    Fang, T.; Mulcahey, T. I.; Taylor, R. P.; Spoor, P. S.; Conrad, T. J.; Ghiaasiaan, S. M.

    2017-12-01

    Some Stirling-type pulse tube cryocoolers (PTCs) exhibit sensitivity to gravitational orientation and often exhibit significant cooling performance losses unless situated with the cold end pointing downward. Prior investigations have indicated that some coolers exhibit sensitivity while others do not; however, a reliable method of predicting the level of sensitivity during the design process has not been developed. In this study, we present a relationship that estimates an upper limit to gravitationally induced losses as a function of the dimensionless pulse tube convection number (NPTC) that can be used to ensure that a PTC would remain functional at adverse static tilt conditions. The empirical relationship is based on experimental data as well as experimentally validated 3-D computational fluid dynamics simulations that examine the effects of frequency, mass flow rate, pressure ratio, mass-pressure phase difference, hot and cold end temperatures, and static tilt angle. The validation of the computational model is based on experimental data collected from six commercial pulse tube cryocoolers. The simulation results are obtained from component-level models of the pulse tube and heat exchangers. Parameter ranges covered in component level simulations are 0-180° for tilt angle, 4-8 for length to diameter ratios, 4-80 K cold tip temperatures, -30° to +30° for mass flow to pressure phase angles, and 25-60 Hz operating frequencies. Simulation results and experimental data are aggregated to yield the relationship between inclined PTC performance and pulse tube convection numbers. The results indicate that the pulse tube convection number can be used as an order of magnitude indicator of the orientation sensitivity, but CFD simulations should be used to calculate the change in energy flow more accurately.

  5. NUMERICAL AND ANALYTIC METHODS OF ESTIMATION BRIDGES’ CONSTRUCTIONS

    Directory of Open Access Journals (Sweden)

    Y. Y. Luchko

    2010-03-01

    Full Text Available In this article, numerical and analytical methods for calculating the stress-strain state of bridge structures are considered. The problem of increasing the reliability and accuracy of the numerical method, and its solution by means of calculations in two bases, is formulated. The analytical solution of the differential equation of deformation of a reinforced-concrete plate under the action of local loads is also obtained.

  6. A study of digital hardware architectures for nuclear reactors protection systems applications - reliability and safety analysis methods

    International Nuclear Information System (INIS)

    Benko, Pedro Luiz

    1997-01-01

    A study of digital hardware architectures for protection systems of nuclear reactors is presented, including experience in many countries, topologies, and solutions for interface circuits. Methods for developing digital system architectures based on fault tolerance and safety requirements are proposed, and guidelines for assessing such requirements are suggested. Techniques and the most common tools employed in reliability and safety evaluation and in the modeling of hardware architectures are also presented. Markov chain modeling is used to evaluate the reliability of redundant architectures. In order to estimate software quality, several mechanisms to be used in design, specification, and validation and verification (V and V) procedures are suggested. A digital protection system architecture has been analyzed as a case study. (author)
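
    Markov chain modeling of a redundant architecture can be sketched with a two-channel (1-out-of-2) system whose generator matrix is integrated over time. The failure and repair rates below are assumed for illustration and do not correspond to the architectures analyzed in the study.

        # Minimal sketch (assumed failure/repair rates): continuous-time Markov model
        # of a duplicated (1-out-of-2) channel. States: 2 = both channels up,
        # 1 = one channel up, 0 = system failed (absorbing, so this is reliability,
        # not availability).
        import numpy as np
        from scipy.linalg import expm

        lam = 1e-4      # failure rate per channel, 1/h (assumed)
        mu = 1e-2       # repair rate of the degraded channel, 1/h (assumed)

        # Generator matrix Q over states [2, 1, 0]; each row sums to zero.
        Q = np.array([
            [-2 * lam,      2 * lam,  0.0],
            [      mu, -(mu + lam),   lam],
            [     0.0,          0.0,  0.0],
        ])

        p0 = np.array([1.0, 0.0, 0.0])          # start with both channels healthy
        for hours in (1_000, 10_000, 100_000):
            p = p0 @ expm(Q * hours)            # state probabilities at time t
            print(f"t = {hours:>7} h  ->  reliability = {1.0 - p[2]:.6f}")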

  7. A Method for Estimation of Death Tolls in Disastrous Earthquake

    Science.gov (United States)

    Pai, C.; Tien, Y.; Teng, T.

    2004-12-01

    whether the districts are more urbanized or not. As far as present research is concerned, there has been no good and reliable relationship between mortality and the characteristics of ground motions. We propose the concept of Equal Population Gaps to resolve the influence of mortality in rural versus urban districts and to decide the weighting function for each district. The relationship between the PGA Index and mortality determined in this study can be expressed as \[ M = \frac{28.9}{1 + \exp(1.67 - 0.0029 \times \mathrm{PGA})} \] where M is the mortality in % and PGA is the PGA Index in gals. The corresponding curve matches the data reasonably well, with R^2 = 0.91. We carry out the estimation for districts at different scales to verify the feasibility of the method. The mortality based on the PGA Index is particularly useful in real-time applications for death toll prediction and assessment, a piece of information most critical for post-earthquake emergency response operations.
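
    For a quick feel of the quoted relationship, the sketch below simply evaluates the logistic mortality curve for a few PGA values and scales it by a hypothetical district population; it is not the full Equal Population Gaps weighting procedure.

        # Minimal sketch: evaluating the logistic mortality-vs-PGA relationship quoted
        # in the abstract, M = 28.9 / (1 + exp(1.67 - 0.0029 * PGA)), for a few PGA
        # values (in gal). The district population is hypothetical.
        import math

        def mortality_percent(pga_gal: float) -> float:
            return 28.9 / (1.0 + math.exp(1.67 - 0.0029 * pga_gal))

        district_population = 50_000               # assumed
        for pga in (200.0, 400.0, 800.0):
            m = mortality_percent(pga)
            deaths = m / 100.0 * district_population
            print(f"PGA = {pga:4.0f} gal  ->  mortality = {m:5.2f}%  (~{deaths:,.0f} deaths)")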

  8. Structural system reliability calculation using a probabilistic fault tree analysis method

    Science.gov (United States)

    Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.

    1992-01-01

    The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computationally intensive calculations. A computer program has been developed to implement the PFTA.

  9. Reliability and Validity of the Footprint Assessment Method Using Photoshop CS5 Software.

    Science.gov (United States)

    Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam

    2015-05-01

    Several sophisticated methods of footprint analysis currently exist. However, it is sometimes useful to apply standard measurement methods of recognized evidence with an easy and quick application. We sought to assess the reliability and validity of a new method of footprint assessment in a healthy population using Photoshop CS5 software (Adobe Systems Inc, San Jose, California). Forty-two footprints, corresponding to 21 healthy individuals (11 men with a mean ± SD age of 20.45 ± 2.16 years and 10 women with a mean ± SD age of 20.00 ± 1.70 years) were analyzed. Footprints were recorded in static bipedal standing position using optical podography and digital photography. Three trials for each participant were performed. The Hernández-Corvo, Chippaux-Smirak, and Staheli indices and the Clarke angle were calculated by manual method and by computerized method using Photoshop CS5 software. Test-retest was used to determine reliability. Validity was obtained by intraclass correlation coefficient (ICC). The reliability test for all of the indices showed high values (ICC, 0.98-0.99). Moreover, the validity test clearly showed no difference between techniques (ICC, 0.99-1). The reliability and validity of a method to measure, assess, and record the podometric indices using Photoshop CS5 software has been demonstrated. This provides a quick and accurate tool useful for the digital recording of morphostatic foot study parameters and their control.

  10. Improved Accuracy of Nonlinear Parameter Estimation with LAV and Interval Arithmetic Methods

    Directory of Open Access Journals (Sweden)

    Humberto Muñoz

    2009-06-01

    Full Text Available The reliable solution of nonlinear parameter estimation problems is an important computational problem in many areas of science and engineering, including such applications as real time optimization. Its goal is to estimate accurate model parameters that provide the best fit to measured data, despite small-scale noise in the data or occasional large-scale measurement errors (outliers). In general, the estimation techniques are based on some kind of least squares or maximum likelihood criterion, and these require the solution of a nonlinear and non-convex optimization problem. Classical solution methods for these problems are local methods, and may not be reliable for finding the global optimum, with no guarantee the best model parameters have been found. Interval arithmetic can be used to compute completely and reliably the global optimum for the nonlinear parameter estimation problem. Finally, experimental results will compare the least squares, l2, and the least absolute value, l1, estimates using interval arithmetic in a chemical engineering application.
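
    The robustness contrast between the l2 and l1 criteria can be illustrated with a small synthetic fit containing one outlier, as sketched below. The data and the use of a generic local optimizer are assumptions for illustration; the interval-arithmetic global optimization that is the subject of the paper is not reproduced.

        # Minimal sketch (synthetic data): comparing least-squares (l2) and least
        # absolute value (l1) parameter estimates when one measurement is an outlier.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(4)
        a_true, b_true = 2.0, 1.0
        x = np.linspace(0.0, 10.0, 20)
        y = a_true * x + b_true + rng.normal(0.0, 0.2, x.size)
        y[15] += 15.0                              # one large-scale measurement error

        def l2_loss(p):  # sum of squared residuals
            return np.sum((y - (p[0] * x + p[1])) ** 2)

        def l1_loss(p):  # sum of absolute residuals
            return np.sum(np.abs(y - (p[0] * x + p[1])))

        p0 = np.array([1.0, 0.0])
        p_l2 = minimize(l2_loss, p0, method="Nelder-Mead").x
        p_l1 = minimize(l1_loss, p0, method="Nelder-Mead").x
        print(f"l2 estimate: a = {p_l2[0]:.3f}, b = {p_l2[1]:.3f}")
        print(f"l1 estimate: a = {p_l1[0]:.3f}, b = {p_l1[1]:.3f}  (closer to a=2, b=1)")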

  11. A human reliability based usability evaluation method for safety-critical software

    International Nuclear Information System (INIS)

    Boring, R. L.; Tran, T. Q.; Gertman, D. I.; Ragsdale, A.

    2006-01-01

    Boring and Gertman (2005) introduced a novel method that augments heuristic usability evaluation methods with that of the human reliability analysis method of SPAR-H. By assigning probabilistic modifiers to individual heuristics, it is possible to arrive at the usability error probability (UEP). Although this UEP is not a literal probability of error, it nonetheless provides a quantitative basis to heuristic evaluation. This method allows one to seamlessly prioritize and identify usability issues (i.e., a higher UEP requires more immediate fixes). However, the original version of this method required the usability evaluator to assign priority weights to the final UEP, thus allowing the priority of a usability issue to differ among usability evaluators. The purpose of this paper is to explore an alternative approach to standardize the priority weighting of the UEP in an effort to improve the method's reliability. (authors)

  12. Inter- and intra- observer reliability of risk assessment of repetitive work without an explicit method.

    Science.gov (United States)

    Eliasson, Kristina; Palm, Peter; Nyman, Teresia; Forsman, Mikael

    2017-07-01

    A common way to conduct practical risk assessments is to observe a job and report the observed long-term risks for musculoskeletal disorders. The aim of this study was to evaluate the inter- and intra-observer reliability of ergonomists' risk assessments without the support of an explicit risk assessment method. Twenty-one experienced ergonomists assessed the risk level (low, moderate, high risk) of eight upper body regions, as well as the global risk, of 10 video-recorded work tasks. Intra-observer reliability was assessed by having nine of the ergonomists repeat the procedure at least three weeks after the first assessment. The ergonomists made their risk assessments based on their experience and knowledge. The statistical parameters of reliability included agreement in %, kappa, linearly weighted kappa, intraclass correlation, and Kendall's coefficient of concordance. The average inter-observer agreement on the global risk was 53% and the corresponding weighted kappa (Kw) was 0.32, indicating fair reliability. The intra-observer agreement was 61% and 0.41 (Kw). This study indicates that risk assessments of the upper body, without the use of an explicit observational method, have unacceptable reliability. It is therefore recommended to use systematic risk assessment methods to a greater degree. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  13. Structural Reliability: An Assessment Using a New and Efficient Two-Phase Method Based on Artificial Neural Network and a Harmony Search Algorithm

    Directory of Open Access Journals (Sweden)

    Naser Kazemi Elaki

    2016-06-01

    Full Text Available In this research, a two-phase algorithm based on an artificial neural network (ANN) and a harmony search (HS) algorithm has been developed with the aim of assessing the reliability of structures with implicit limit state functions. The proposed method involves generating datasets by finite element analysis to be used specifically for training, establishing an ANN model and using the proven ANN model in the reliability assessment process as an analyzer for structures, and finally estimating the reliability index and failure probability by using the HS algorithm, without any requirement for the explicit form of the limit state function. The proposed algorithm is investigated here, and its accuracy and efficiency are demonstrated by using several numerical examples. The results obtained show that the proposed algorithm gives an appropriate estimate for the assessment of the reliability of structures.

  14. Unemployment estimation: Spatial point referenced methods and models

    KAUST Repository

    Pereira, Soraia

    2017-06-26

    The Portuguese Labor Force Survey, from the 4th quarter of 2014 onwards, started geo-referencing the sampling units, namely the dwellings in which the surveys are carried out. This opens new possibilities for analysing and estimating unemployment and its spatial distribution across any region. The labor force survey chooses, according to a pre-established sampling criterion, a certain number of dwellings across the nation and surveys the number of unemployed in these dwellings. Based on this survey, the National Statistical Institute of Portugal presently uses direct estimation methods to estimate the national unemployment figures. Recently, there has been increased interest in estimating these figures in smaller areas. Direct estimation methods, due to reduced sample sizes in small areas, tend to produce fairly large sampling variations; therefore, model-based methods, which tend to

  15. A Bayes linear Bayes method for estimation of correlated event rates.

    Science.gov (United States)

    Quigley, John; Wilson, Kevin J; Walls, Lesley; Bedford, Tim

    2013-12-01

    Typically, full Bayesian estimation of correlated event rates can be computationally challenging since estimators are intractable. When estimation of event rates represents one activity within a larger modeling process, there is an incentive to develop more efficient inference than provided by a full Bayesian model. We develop a new subjective inference method for correlated event rates based on a Bayes linear Bayes model under the assumption that events are generated from a homogeneous Poisson process. To reduce the elicitation burden we introduce homogenization factors to the model and, as an alternative to a subjective prior, an empirical method using the method of moments is developed. Inference under the new method is compared against estimates obtained under a full Bayesian model, which takes a multivariate gamma prior, where the predictive and posterior distributions are derived in terms of well-known functions. The mathematical properties of both models are presented. A simulation study shows that the Bayes linear Bayes inference method and the full Bayesian model provide equally reliable estimates. An illustrative example, motivated by a problem of estimating correlated event rates across different users in a simple supply chain, shows how ignoring the correlation leads to biased estimation of event rates. © 2013 Society for Risk Analysis.
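
    The building block of the full Bayesian model, a gamma prior updated by Poisson count data, can be sketched in a few lines. The numbers below are illustrative, and the sketch covers only a single event rate; the Bayes linear Bayes treatment of correlated rates and the multivariate gamma prior are not reproduced here.

        # Minimal sketch (illustrative numbers): the conjugate gamma-Poisson update
        # that underlies full Bayesian estimation of an event rate from a homogeneous
        # Poisson process. The paper's Bayes linear Bayes method for correlated rates
        # is not reproduced here.
        alpha0, beta0 = 2.0, 4.0       # gamma prior: shape, rate (prior mean = 0.5 events/yr)
        events, exposure = 7, 10.0     # observed events over 10 years of exposure (assumed)

        alpha_post = alpha0 + events
        beta_post = beta0 + exposure
        post_mean = alpha_post / beta_post
        post_sd = (alpha_post / beta_post ** 2) ** 0.5
        print(f"posterior rate: mean = {post_mean:.3f}/yr, sd = {post_sd:.3f}/yr")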

  16. Automatic training and reliability estimation for 3D ASM applied to cardiac MRI segmentation.

    Science.gov (United States)

    Tobon-Gomez, Catalina; Sukno, Federico M; Butakoff, Constantine; Huguet, Marina; Frangi, Alejandro F

    2012-07-07

    Training active shape models requires collecting manual ground-truth meshes in a large image database. While shape information can be reused across multiple imaging modalities, intensity information needs to be imaging modality and protocol specific. In this context, this study has two main purposes: (1) to test the potential of using intensity models learned from MRI simulated datasets and (2) to test the potential of including a measure of reliability during the matching process to increase robustness. We used a population of 400 virtual subjects (XCAT phantom), and two clinical populations of 40 and 45 subjects. Virtual subjects were used to generate simulated datasets (MRISIM simulator). Intensity models were trained both on simulated and real datasets. The trained models were used to segment the left ventricle (LV) and right ventricle (RV) from real datasets. Segmentations were also obtained with and without reliability information. Performance was evaluated with point-to-surface and volume errors. Simulated intensity models obtained average accuracy comparable to inter-observer variability for LV segmentation. The inclusion of reliability information reduced volume errors in hypertrophic patients (EF errors from 17 ± 57% to 10 ± 18%; LV MASS errors from -27 ± 22 g to -14 ± 25 g), and in heart failure patients (EF errors from -8 ± 42% to -5 ± 14%). The RV model of the simulated images needs further improvement to better resemble image intensities around the myocardial edges. Both for real and simulated models, reliability information increased segmentation robustness without penalizing accuracy.

  17. Population Estimation with Mark and Recapture Method Program

    International Nuclear Information System (INIS)

    Limohpasmanee, W.; Kaewchoung, W.

    1998-01-01

    Population estimation is important information required for insect control planning, especially for control with SIT. Moreover, it can be used to evaluate the efficiency of the control method. Due to the complexity of the calculations, population estimation with mark and recapture methods has not been widely used. Therefore, this program was developed in QBasic with the purpose of making the estimation accurate and easier. The program implements six methods: Seber's, Jolly-Seber's, Jackson's, Ito's, Hamada's, and Yamamura's. The results were compared with those of the original methods and found to be accurate and easier to apply.
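
    The simplest member of the mark and recapture family, the two-sample Lincoln-Petersen estimator with Chapman's correction, can be sketched as follows. The counts are hypothetical, and this estimator is only a baseline illustration; the six multi-occasion methods implemented in the program are more elaborate.

        # Minimal sketch (hypothetical counts): the classic Lincoln-Petersen two-sample
        # mark-recapture estimator with the Chapman correction.
        def chapman_estimate(marked_released: int, second_sample: int, recaptured: int) -> float:
            """Chapman's nearly unbiased version of the Lincoln-Petersen estimator."""
            return (marked_released + 1) * (second_sample + 1) / (recaptured + 1) - 1

        n_marked = 300        # insects marked and released on occasion 1 (assumed)
        n_caught = 250        # insects caught on occasion 2 (assumed)
        n_recaptured = 30     # marked insects among those caught (assumed)

        n_hat = chapman_estimate(n_marked, n_caught, n_recaptured)
        print(f"estimated population size = {n_hat:,.0f}")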

  18. Ore reserve estimation: a summary of principles and methods

    International Nuclear Information System (INIS)

    Marques, J.P.M.

    1985-01-01

    The mining industry has experienced substantial improvements with the increasing utilization of computerized and electronic devices over the last few years. In the field of ore reserve estimation, the main methods have undergone recent advances in order to improve their overall efficiency. This paper presents the three main groups of ore reserve estimation methods presently used worldwide, Conventional, Statistical, and Geostatistical, and provides a detailed description and comparative analysis of each. The Conventional methods are the oldest, least complex, and most widely employed. The Geostatistical methods are the most recent, most precise, and most complex. The Statistical methods are intermediate between the others in complexity, diffusion, and chronological order. (D.J.M.) [pt]

  19. Estimating the impact of structural directionality: How reliable are undirected connectomes?

    Directory of Open Access Journals (Sweden)

    Penelope Kale

    2018-06-01

    Full Text Available Directionality is a fundamental feature of network connections. Most structural brain networks are intrinsically directed because of the nature of chemical synapses, which comprise most neuronal connections. Because of the limitations of noninvasive imaging techniques, the directionality of connections between structurally connected regions of the human brain cannot be confirmed. Hence, connections are represented as undirected, and it is still unknown how this lack of directionality affects brain network topology. Using six directed brain networks from different species and parcellations (cat, mouse, C. elegans, and three macaque networks), we estimate the inaccuracies in network measures (degree, betweenness, clustering coefficient, path length, global efficiency, participation index, and small-worldness) associated with the removal of the directionality of connections. We employ three different methods to render directed brain networks undirected: (a) remove unidirectional connections, (b) add reciprocal connections, and (c) combine equal numbers of removed and added unidirectional connections. We quantify the extent of inaccuracy in network measures introduced through neglecting connection directionality for individual nodes and across the network. We find that the coarse division between core and peripheral nodes remains accurate for undirected networks. However, hub nodes differ considerably when directionality is neglected. Comparing the different methods to generate undirected networks from directed ones, we generally find that the addition of reciprocal connections (false positives) causes larger errors in graph-theoretic measures than the removal of the same number of directed connections (false negatives). These findings suggest that directionality plays an essential role in shaping brain networks and highlight some limitations of undirected connectomes. Most brain networks are inherently directed because of the nature of chemical synapses

  20. A Fast Optimization Method for Reliability and Performance of Cloud Services Composition Application

    Directory of Open Access Journals (Sweden)

    Zhao Wu

    2013-01-01

    Full Text Available At present, cloud computing is one of the newest trends in distributed computation and is propelling another important revolution in the software industry. Cloud services composition is one of the key techniques in software development. The optimization of the reliability and performance of a cloud services composition application, which is a typical stochastic optimization problem, is confronted with severe challenges due to its randomness and long transactions, as well as characteristics of cloud computing resources such as openness and dynamics. Traditional reliability and performance optimization techniques, for example Markov models and state space analysis, have some defects: they are too time consuming, they easily cause state space explosion, and they rely on assumptions of component execution independence that are not satisfied. To overcome these defects, we propose a fast optimization method for the reliability and performance of cloud services composition applications based on the universal generating function and a genetic algorithm. First, a reliability and performance model for cloud service composition applications based on multi-state system theory is presented. Then, a reliability and performance definition based on the universal generating function is proposed. Based on this, a fast reliability and performance optimization algorithm is presented. Finally, illustrative examples are given.
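
    The universal generating function part of the approach can be sketched by representing each service as a map from performance level to probability and composing the maps with a structure operator. The services, performance levels, and probabilities below are hypothetical, and the genetic-algorithm optimization layer is not included.

        # Minimal sketch (hypothetical service data): universal generating functions
        # (UGFs) represented as {performance: probability} maps, composed for two
        # services invoked in sequence (throughput limited by the slower one) and
        # used to compute the probability of meeting a demand level.
        from collections import defaultdict
        from itertools import product

        def compose(u1, u2, op):
            """Compose two UGFs with a structure operator `op` on performance levels."""
            out = defaultdict(float)
            for (g1, p1), (g2, p2) in product(u1.items(), u2.items()):
                out[op(g1, g2)] += p1 * p2
            return dict(out)

        # UGFs of two cloud services: performance (requests/s) -> probability (assumed).
        service_a = {0: 0.02, 50: 0.18, 100: 0.80}
        service_b = {0: 0.05, 80: 0.95}

        # Sequential composition: end-to-end throughput is the minimum of the two.
        system = compose(service_a, service_b, min)

        demand = 50
        reliability = sum(p for g, p in system.items() if g >= demand)
        print("system UGF:", system)
        print(f"P(throughput >= {demand} req/s) = {reliability:.4f}")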