WorldWideScience

Sample records for reliable accurate information

  1. Threshold Estimation of Generalized Pareto Distribution Based on Akaike Information Criterion for Accurate Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seunghoon; Lim, Woochul; Cho, Su-gil; Park, Sanghyun; Lee, Tae Hee [Hanyang University, Seoul (Korea, Republic of)]; Lee, Minuk; Choi, Jong-su; Hong, Sup [Korea Research Institute of Ships and Ocean Engineering, Daejeon (Korea, Republic of)]

    2015-02-15

    In order to perform estimations with high reliability, it is necessary to deal with the tail part of the cumulative distribution function (CDF) in greater detail compared to an overall CDF. The use of a generalized Pareto distribution (GPD) to model the tail part of a CDF is receiving more research attention with the goal of performing estimations with high reliability. Current studies on GPDs focus on ways to determine the appropriate number of sample points and their parameters. However, even if a proper estimation is made, it can be inaccurate as a result of an incorrect threshold value. Therefore, in this paper, a GPD based on the Akaike information criterion (AIC) is proposed to improve the accuracy of the tail model. The proposed method determines an accurate threshold value using the AIC with the overall samples before estimating the GPD over the threshold. To validate the accuracy of the method, its reliability is compared with that obtained using a general GPD model with an empirical CDF.
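
The AIC-based threshold idea above can be sketched generically: fit a GPD to the exceedances over each candidate threshold and keep the fit with the smallest AIC. This is a simplified illustration, not the paper's exact procedure (the paper applies the AIC to the overall samples, and comparing AICs across thresholds with differing exceedance counts is a known simplification); the candidate quantiles and minimum tail size below are arbitrary choices.

```python
import numpy as np
from scipy import stats

def gpd_threshold_by_aic(samples, quantiles=(0.80, 0.85, 0.90, 0.95)):
    """Pick a tail threshold by fitting a GPD to the exceedances over each
    candidate threshold and keeping the fit with the smallest AIC."""
    best = None
    for q in quantiles:
        u = np.quantile(samples, q)
        excess = samples[samples > u] - u
        if len(excess) < 10:          # too few points for a stable fit
            continue
        # Fit shape and scale, with location pinned at 0 for exceedances.
        c, loc, scale = stats.genpareto.fit(excess, floc=0)
        loglik = np.sum(stats.genpareto.logpdf(excess, c, loc=0, scale=scale))
        aic = 2 * 2 - 2 * loglik      # k = 2 free parameters (shape, scale)
        if best is None or aic < best[0]:
            best = (aic, u, c, scale)
    return best  # (aic, threshold, shape, scale)

rng = np.random.default_rng(0)
data = rng.lognormal(mean=0.0, sigma=1.0, size=5000)  # heavy right tail
aic, u, c, scale = gpd_threshold_by_aic(data)
```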

  2. Threshold Estimation of Generalized Pareto Distribution Based on Akaike Information Criterion for Accurate Reliability Analysis

    International Nuclear Information System (INIS)

    Kang, Seunghoon; Lim, Woochul; Cho, Su-gil; Park, Sanghyun; Lee, Tae Hee; Lee, Minuk; Choi, Jong-su; Hong, Sup

    2015-01-01

    In order to perform estimations with high reliability, it is necessary to deal with the tail part of the cumulative distribution function (CDF) in greater detail compared to an overall CDF. The use of a generalized Pareto distribution (GPD) to model the tail part of a CDF is receiving more research attention with the goal of performing estimations with high reliability. Current studies on GPDs focus on ways to determine the appropriate number of sample points and their parameters. However, even if a proper estimation is made, it can be inaccurate as a result of an incorrect threshold value. Therefore, in this paper, a GPD based on the Akaike information criterion (AIC) is proposed to improve the accuracy of the tail model. The proposed method determines an accurate threshold value using the AIC with the overall samples before estimating the GPD over the threshold. To validate the accuracy of the method, its reliability is compared with that obtained using a general GPD model with an empirical CDF.

  3. An accurate and efficient reliability-based design optimization using the second order reliability method and improved stability transformation method

    Science.gov (United States)

    Meng, Zeng; Yang, Dixiong; Zhou, Huanlin; Yu, Bo

    2018-05-01

    The first order reliability method has been extensively adopted for reliability-based design optimization (RBDO), but it is inaccurate in calculating the failure probability for highly nonlinear performance functions. Thus, the second order reliability method is required to evaluate reliability accurately. However, its application to RBDO is quite challenging owing to the expensive computational cost incurred by the repeated reliability evaluations and Hessian calculations of the probabilistic constraints. In this article, a new improved stability transformation method is proposed to search for the most probable point efficiently, and the Hessian matrix is calculated by the symmetric rank-one update. The computational capability of the proposed method is illustrated and compared with existing RBDO approaches through three mathematical and two engineering examples. The comparison results indicate that the proposed method is very efficient and accurate, providing an alternative tool for the RBDO of engineering structures.
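
The symmetric rank-one update mentioned above avoids recomputing the Hessian of the probabilistic constraints at every iteration. A minimal sketch (not the authors' full RBDO code) with the standard denominator safeguard; on a quadratic function with exact gradients, n independent steps recover the true Hessian, which makes quadratics a handy sanity check.

```python
import numpy as np

def sr1_update(B, s, y, tol=1e-8):
    """Symmetric rank-one (SR1) update of a Hessian approximation B.
    s = x_new - x_old, y = grad_new - grad_old."""
    r = y - B @ s
    denom = r @ s
    # Standard safeguard: skip the update when the denominator is tiny.
    if abs(denom) <= tol * np.linalg.norm(r) * np.linalg.norm(s):
        return B
    return B + np.outer(r, r) / denom

# For a quadratic f(x) = 0.5 x^T H x, gradients are exact: y = H s,
# so SR1 updates along independent steps recover H.
H = np.array([[4.0, 1.0], [1.0, 3.0]])
B = np.eye(2)
for s in (np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])):
    y = H @ s
    B = sr1_update(B, s, y)
```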

  4. Reliability of dynamic systems under limited information.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr. (.,; .); Grigoriu, Mircea

    2006-09-01

    A method is developed for reliability analysis of dynamic systems under limited information. The available information includes one or more samples of the system output; any known information on features of the output can be used if available. The method is based on the theory of non-Gaussian translation processes and is shown to be particularly suitable for problems of practical interest. For illustration, we apply the proposed method to a series of simple example problems and compare with results given by traditional statistical estimators in order to establish the accuracy of the method. It is demonstrated that the method delivers accurate results for the case of linear and nonlinear dynamic systems, and can be applied to analyze experimental data and/or mathematical model outputs. Two complex applications of direct interest to Sandia are also considered. First, we apply the proposed method to assess design reliability of a MEMS inertial switch. Second, we consider re-entry body (RB) component vibration response during normal re-entry, where the objective is to estimate the time-dependent probability of component failure. This last application is directly relevant to re-entry random vibration analysis at Sandia, and may provide insights on test-based and/or model-based qualification of weapon components for random vibration environments.

  5. ACCURACY AND RELIABILITY AS CRITERIA OF INFORMATIVENESS IN THE NEWS STORY

    Directory of Open Access Journals (Sweden)

    Melnikova Ekaterina Aleksandrovna

    2014-12-01

    The article clarifies the meaning of the terms accuracy and reliability as applied to the news story and offers a researcher's approach to obtaining objective data that help verify the linguistic means by which accuracy and reliability are present in the informative structure of the text. The accuracy of the news story is defined as a high degree of relevance in reflecting the event through language representation of its constituents; reliability is viewed as news story originality, proved by introducing into the text content citations and sources of information considered trustworthy. Basing the research on an event nominative density identification method, the author composed nominative charts of 115 news story texts collected at the websites of the BBC and CNN media corporations; distinguished qualitative and quantitative markers of accuracy and reliability in the news story text; and confirmed that the accuracy of the news story is achieved through terminological clearness in nominating event constituents in the text, thematic bonds between words, and the presence of onyms that help identify characteristics of the referent event in depth. The reliability of the text is discovered in eyewitness accounts, quotations, and references to sources considered trustworthy. Careful revision of the associations between accuracy, reliability, and informing strategies in digital news networks allowed the author to identify two variants of information delivery that differ in their communicative and pragmatic functions: developing (which informs about major and minor details of an event) and truncated (which gives some details, thus raising interest in the event and urging the reader to open the full story).

  6. Complex method to calculate objective assessments of information systems protection to improve expert assessments reliability

    Science.gov (United States)

    Abdenov, A. Zh; Trushin, V. A.; Abdenova, G. A.

    2018-01-01

    The paper considers how to fill the relevant SIEM nodes with calculated objective assessments in order to improve the reliability of subjective expert assessments. The proposed methodology is necessary for the most accurate security risk assessment of information systems. The technique is also intended for establishing real-time operational information protection in enterprise information systems. Risk calculations are based on objective estimates of the probabilities that adverse events will be realized and on predictions of the magnitude of damage from information security violations. Calculations of objective assessments are necessary to increase the reliability of the proposed expert assessments.
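
The abstract's notion of risk, based on objective event probabilities and damage magnitudes, reduces in its simplest quantitative form to an expected-loss sum. The event names and numbers below are invented for illustration, not taken from the paper:

```python
# Minimal illustration (not the paper's SIEM methodology): annualized risk
# as expected loss, summed over independent adverse events.
events = {
    # event: (probability per year, damage if it occurs)
    "phishing_compromise": (0.30, 50_000.0),
    "ransomware":          (0.05, 400_000.0),
    "insider_leak":        (0.02, 250_000.0),
}

def expected_annual_loss(events):
    """Sum of probability * damage over all modeled adverse events."""
    return sum(p * d for p, d in events.values())

risk = expected_annual_loss(events)  # ~= 40,000 per year
```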

  7. Ultra-accurate collaborative information filtering via directed user similarity

    Science.gov (United States)

    Guo, Q.; Song, W.-J.; Liu, J.-G.

    2014-07-01

    A key challenge in collaborative filtering (CF) for information filtering is how to obtain reliable and accurate results with the help of peers' recommendations. Since the similarities from small-degree users to large-degree users are larger than the ones in the opposite direction, the large-degree users' selections are recommended extensively by traditional second-order CF algorithms. By considering the direction of user similarity and the second-order correlations in order to depress the influence of mainstream preferences, we present the directed second-order CF (HDCF) algorithm, which specifically addresses the challenge of accuracy and diversity in CF algorithms. The numerical results for two benchmark data sets, MovieLens and Netflix, show that the accuracy of the new algorithm outperforms the state-of-the-art CF algorithms. Compared with the CF algorithm based on random walks proposed by Liu et al. (Int. J. Mod. Phys. C, 20 (2009) 285), the average ranking score reaches 0.0767 and 0.0402, an enhancement of 27.3% and 19.1% for MovieLens and Netflix, respectively. In addition, the diversity, precision, and recall are also greatly enhanced. Without relying on any context-specific information, tuning the similarity direction of CF algorithms can yield accurate and diverse recommendations. This work suggests that the direction of user similarity is an important factor in improving personalized recommendation performance.
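
A directed (asymmetric) user similarity can be sketched as follows; normalizing by the source user's degree is one generic way to make small-degree-to-large-degree similarities exceed the reverse direction, and is not claimed to be the exact HDCF formula:

```python
import numpy as np

def directed_similarity(A):
    """Directed user similarity for a binary user-item matrix A (users x items):
    s[i, j] = |items(i) ∩ items(j)| / k(i), so s[i, j] != s[j, i] in general.
    A generic asymmetric-similarity sketch, not the exact HDCF definition."""
    overlap = A @ A.T                       # co-selected item counts
    degrees = A.sum(axis=1, keepdims=True)  # k(i) per user
    s = overlap / np.maximum(degrees, 1)    # row i normalized by user i's degree
    np.fill_diagonal(s, 0.0)
    return s

# Toy example: user 0 selected 2 items, user 1 selected 4, sharing 2 items.
A = np.array([[1, 1, 0, 0, 0],
              [1, 1, 1, 1, 0],
              [0, 0, 0, 1, 1]], dtype=float)
S = directed_similarity(A)
# Small-degree -> large-degree similarity exceeds the reverse direction:
# S[0, 1] = 2/2 = 1.0, while S[1, 0] = 2/4 = 0.5.
```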

  8. [Toxoplasmosis and Pregnancy: Reliability of Internet Sources of Information].

    Science.gov (United States)

    Bobić, Branko; Štajner, Tijana; Nikolić, Aleksandra; Klun, Ivana; Srbljanović, Jelena; Djurković-Djaković, Olgica

    2015-01-01

    Health education of women of childbearing age has been shown to be an acceptable approach to the prevention of toxoplasmosis, the most frequent congenitally transmitted parasitic infection. The aim of this study was to evaluate the Internet as a source of health education on toxoplasmosis in pregnancy. A group of 100 pregnant women examined in the National Reference Laboratory for Toxoplasmosis was surveyed by a questionnaire on the source of their information on toxoplasmosis. We also analyzed information offered by websites in the Serbian and Croatian languages through the Google search engine, using "toxoplasmosis" as a keyword. The 23 top websites were evaluated for comprehensiveness and accuracy of information on the impact of toxoplasmosis on the course of pregnancy, diagnosis, and prevention. Knowledge of toxoplasmosis was confirmed by 64 (64.0%) of the examined women, of whom 40.6% (26/64) had learned about toxoplasmosis through the Internet, 48.4% from physicians, and 10.9% from friends. A higher degree of education was found to be associated with the probability that pregnant women would be informed via the Internet (RR=3.15, 95% CI=1.27-7.82, p=0.013). Analysis of four interactive websites (allowing users to ask questions) showed that routes of infection were the most common concern, particularly the risk presented by pet cats and dogs, followed by the diagnosis of infection (who should be tested and when, and how the results should be interpreted). Of 20 sites containing educational articles, only seven were authored and two listed sources. Evaluation confirmed that information relevant to pregnant women was significantly more accurate than comprehensive, but no site gave both comprehensive and completely accurate information. Only four sites (20%) were good sources of information for pregnant women. The Internet has proved itself an important source of information. However, despite numerous websites, only a few offer reliable information to pregnant women.

  9. Approaching system equilibrium with accurate or not accurate feedback information in a two-route system

    Science.gov (United States)

    Zhao, Xiao-mei; Xie, Dong-fan; Li, Qi

    2015-02-01

    With the development of intelligent transport systems, advanced information feedback strategies have been developed to reduce traffic congestion and enhance capacity. However, previous strategies provide accurate information to travelers, and our simulation results show that accurate information can bring negative effects, especially in the delayed case: travelers all prefer the route reported to be in the best condition, while delayed information reflects past rather than current traffic conditions. Travelers then make wrong routing decisions, decreasing the capacity, increasing oscillations, and driving the system away from equilibrium. To avoid these negative effects, bounded rationality is taken into account by introducing a boundedly rational threshold BR. When the difference between the two routes is less than BR, the routes have equal probability of being chosen. Bounded rationality helps to improve efficiency in terms of capacity, oscillation, and the gap from system equilibrium.
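
The boundedly rational choice rule described above is simple to state in code; the route labels and threshold value here are illustrative:

```python
import random

def choose_route(cost_a, cost_b, br_threshold, rng=random):
    """Boundedly rational route choice for a two-route system: when the
    reported cost difference is below the threshold BR, the routes are
    treated as equivalent and chosen with equal probability."""
    if abs(cost_a - cost_b) < br_threshold:
        return rng.choice(["A", "B"])   # indifferent within the threshold
    return "A" if cost_a < cost_b else "B"

# Outside the threshold the cheaper route is always taken;
# inside it, either route may be chosen.
picked = choose_route(10.0, 14.0, br_threshold=2.0)
```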

  10. A Bayesian reliability evaluation method with integrated accelerated degradation testing and field information

    International Nuclear Information System (INIS)

    Wang, Lizhi; Pan, Rong; Li, Xiaoyang; Jiang, Tongmin

    2013-01-01

    Accelerated degradation testing (ADT) is a common approach in reliability prediction, especially for products with high reliability. However, the laboratory conditions of ADT often differ from the field conditions; thus, to predict field failure, one needs to calibrate the prediction made using ADT data. In this paper a Bayesian evaluation method is proposed to integrate ADT data from the laboratory with failure data from the field. Calibration factors are introduced to account for the difference between the lab and field conditions so as to predict a product's actual field reliability more accurately. The information fusion and statistical inference procedures are carried out through a Bayesian approach and Markov chain Monte Carlo methods. The proposed method is demonstrated with two examples, along with a sensitivity analysis of the prior distribution assumption.
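
A toy version of Bayesian lab-to-field calibration can be sketched with a random-walk Metropolis sampler. The exponential-lifetime model, lognormal prior, and single calibration factor below are assumptions made for illustration, not the paper's actual model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: the lab ADT analysis predicts a failure rate lam_lab;
# field lifetimes are exponential with rate kappa * lam_lab, where kappa is
# a calibration factor with a lognormal(0, 1) prior centered on kappa = 1.
lam_lab = 0.01
field_lifetimes = rng.exponential(1.0 / (2.0 * lam_lab), size=40)  # true kappa = 2

def log_post(kappa):
    """Log posterior of the calibration factor, up to an additive constant."""
    if kappa <= 0:
        return -np.inf
    loglik = np.sum(np.log(kappa * lam_lab) - kappa * lam_lab * field_lifetimes)
    logprior = -0.5 * np.log(kappa) ** 2
    return loglik + logprior

# Random-walk Metropolis over kappa.
samples, kappa = [], 1.0
for _ in range(5000):
    prop = kappa + 0.2 * rng.standard_normal()
    if np.log(rng.uniform()) < log_post(prop) - log_post(kappa):
        kappa = prop
    samples.append(kappa)
posterior_mean = np.mean(samples[1000:])  # burn-in discarded
```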

  11. Reliability Assessment of Cloud Computing Platform Based on Semiquantitative Information and Evidential Reasoning

    Directory of Open Access Journals (Sweden)

    Hang Wei

    2016-01-01

    A reliability assessment method based on the evidential reasoning (ER) rule and semiquantitative information is proposed in this paper, in which a new reliability assessment architecture covering four aspects, with both quantitative data and qualitative knowledge, is established. The assessment architecture describes the complex, dynamic cloud computing environment more objectively than traditional methods. In addition, the ER rule, which performs well on multiple-attribute decision-making problems, is employed to integrate the different types of attributes in the assessment architecture, yielding more accurate assessment results. The assessment results of a case study on an actual cloud computing platform verify the effectiveness and advantages of the proposed method.

  12. Analysis of information security reliability: A tutorial

    International Nuclear Information System (INIS)

    Kondakci, Suleyman

    2015-01-01

    This article presents a concise reliability analysis of network security abstracted from stochastic modeling, reliability, and queuing theories. Network security analysis is composed of threats, their impacts, and recovery of the failed systems. A unique framework with a collection of the key reliability models is presented here to guide the determination of the system reliability based on the strength of malicious acts and performance of the recovery processes. A unique model, called Attack-obstacle model, is also proposed here for analyzing systems with immunity growth features. Most computer science curricula do not contain courses in reliability modeling applicable to different areas of computer engineering. Hence, the topic of reliability analysis is often too diffuse to most computer engineers and researchers dealing with network security. This work is thus aimed at shedding some light on this issue, which can be useful in identifying models, their assumptions and practical parameters for estimating the reliability of threatened systems and for assessing the performance of recovery facilities. It can also be useful for the classification of processes and states regarding the reliability of information systems. Systems with stochastic behaviors undergoing queue operations and random state transitions can also benefit from the approaches presented here. - Highlights: • A concise survey and tutorial in model-based reliability analysis applicable to information security. • A framework of key modeling approaches for assessing reliability of networked systems. • The framework facilitates quantitative risk assessment tasks guided by stochastic modeling and queuing theory. • Evaluation of approaches and models for modeling threats, failures, impacts, and recovery analysis of information systems
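
Two of the elementary quantities such reliability/queuing frameworks build on are the exponential survival probability and the steady-state availability of an alternating failure/recovery process; a minimal sketch with illustrative numbers:

```python
import math

def reliability(rate, t):
    """Survival probability of an exponentially distributed time to failure
    (or to a successful attack) with the given rate."""
    return math.exp(-rate * t)

def availability(mtbf, mttr):
    """Steady-state availability of an alternating failure/recovery process."""
    return mtbf / (mtbf + mttr)

# A system compromised on average every 500 h and restored in 10 h:
a = availability(500.0, 10.0)        # fraction of time the system is up
r = reliability(1 / 500.0, 100.0)    # chance of surviving 100 h without failure
```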

  13. Reliability of "Google" for obtaining medical information

    Directory of Open Access Journals (Sweden)

    Mihir Kothari

    2015-01-01

    The Internet is used by many patients to obtain relevant medical information. We assessed the impact of a "Google" search on the knowledge of parents whose child suffered from squint. In 21 consecutive patients, the "Google" search improved the mean score of correct answers from 47% to 62%. We found that the "Google" search was a useful and reliable source of information for patients with regard to the disease's etiopathogenesis and the problems caused by the disease. The Internet-based information, however, was incomplete and not reliable with regard to the disease's treatment.

  14. Remote patient monitoring: Information reliability challenges

    NARCIS (Netherlands)

    Petkovic, M.

    2009-01-01

    An increasing number of extramural applications in the personal healthcare domain pose new challenges regarding the security of medical data. In this paper, we focus on remote patient monitoring systems and the issues around information reliability. In these systems medical data is not collected by

  15. The reliability and usability of district health information software ...

    African Journals Online (AJOL)

    The reliability and usability of district health information software: case studies from Tanzania. ... The District Health Information System (DHIS) software from the Health Information System ...

  16. Information flow and data bank preparation in a nuclear power plant reliability information system

    International Nuclear Information System (INIS)

    Kolesa, K.; Vejvodova, I.

    1983-01-01

    In 1981 the reliability information system for nuclear power plants (ISS-JE) was established. The objective of the system is to make a statistical evaluation of the operation of nuclear power plants, to obtain information on the reliability of nuclear power plant equipment, and to transmit this information to manufacturers with the aim of inducing them to take corrective measures. An HP 1000 computer with the IMAGE 100 database system is used, which allows the processing of single queries and periodical outputs. The content of the periodical outputs designed for various groups of subcontractors is briefly described, and trends in the further development of the system are indicated. (Ha)

  17. Reliable and accurate point-based prediction of cumulative infiltration using soil readily available characteristics: A comparison between GMDH, ANN, and MLR

    Science.gov (United States)

    Rahmati, Mehdi

    2017-08-01

    Developing accurate and reliable pedo-transfer functions (PTFs) to predict soil characteristics that are not readily available is one of the topics of greatest concern in soil science, and selecting appropriate predictors is a crucial factor in PTF development. The group method of data handling (GMDH), which finds an approximate relationship between a set of input and output variables, not only provides an explicit procedure for selecting the most essential PTF input variables, but also results in more accurate and reliable estimates than other commonly applied methodologies. Therefore, the current research aimed to apply GMDH, in comparison with multivariate linear regression (MLR) and artificial neural networks (ANN), to develop several PTFs to predict soil cumulative infiltration on a point basis at specific time intervals (0.5-45 min) using soil readily available characteristics (RACs). In this regard, soil infiltration curves as well as several soil RACs, including soil primary particles (clay (CC), silt (Si), and sand (Sa)), saturated hydraulic conductivity (Ks), bulk (Db) and particle (Dp) densities, organic carbon (OC), wet-aggregate stability (WAS), electrical conductivity (EC), and soil antecedent (θi) and field-saturated (θfs) water contents, were measured at 134 different points in the Lighvan watershed, northwest of Iran. Then, applying the GMDH, MLR, and ANN methodologies, several PTFs were developed to predict cumulative infiltration using two sets of selected soil RACs, including and excluding Ks. According to the test data, the PTFs developed by the GMDH and MLR procedures using all soil RACs including Ks gave more accurate (with E values of 0.673-0.963) and reliable (with CV values lower than 11 percent) predictions of cumulative infiltration at the specific time steps. In contrast, the ANN procedure had lower accuracy (with E values of 0.356-0.890) and reliability (with CV values up to 50 percent) compared with GMDH and MLR. The results also revealed

  18. The dependence of human reliability upon task information content

    International Nuclear Information System (INIS)

    Hermanson, E.M.; Golay, M.W.

    1994-09-01

    The role of human error in safety mishaps is an important factor in system design. As systems become increasingly complex the capacity of the human to deal with the added complexity is diminished. It is therefore crucial to understand the relationship between system complexity and human reliability so that systems may be built in such a way as to minimize human error. One way of understanding this relationship is to quantify system complexity and then measure the human reaction in response to situations of varying complexity. The quantification of system complexity may be performed by determining the information content present in the tasks that the human must execute. The purpose of this work is therefore to build and perform a consistent experiment which will determine the extent to which human reliability depends upon task information content. Two main conclusions may be drawn from this work. The first is that human reliability depends upon task information content. Specifically, as the information content contained in a task increases, the capacity of a human to deal successfully with the task decreases monotonically. Here the definition of total success is the ability to complete the task at hand fully and correctly. Furthermore, there exists a value of information content below which a human can deal with the task successfully, but above which the success of an individual decreases monotonically with increasing information. These ideas should be generalizable to any model where system complexity can be clearly and consistently defined
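
Quantifying task complexity via information content, as described above, amounts in the simplest reading to computing the Shannon entropy of the task's relevant outcomes; a minimal sketch (the probabilities are illustrative):

```python
import math

def information_content(outcome_probs):
    """Shannon information content (entropy, in bits) of a task whose
    relevant state can take each outcome with the given probability."""
    return -sum(p * math.log2(p) for p in outcome_probs if p > 0)

# A task requiring the operator to discriminate 4 equally likely states
# carries 2 bits; a near-certain situation carries far less.
h_uniform = information_content([0.25, 0.25, 0.25, 0.25])  # 2.0 bits
h_skewed = information_content([0.9, 0.05, 0.05])
```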

  19. Assessing the performance of commercial Agisoft PhotoScan software to deliver reliable data for accurate 3D modelling

    Directory of Open Access Journals (Sweden)

    Jebur Ahmed

    2018-01-01

    3D models delivered by digital photogrammetric techniques have increased massively and developed to meet the requirements of many applications. The reliability of these models basically depends on the data processing cycle and the adopted tool solution, in addition to data quality. Agisoft PhotoScan is professional image-based 3D modelling software, which seeks to create orderly, precise 3D content from still images. It works with arbitrary images taken in both controlled and uncontrolled conditions. Following the recommendations of many users around the globe, Agisoft PhotoScan has become an important source for generating precise 3D data for different applications. How reliable these data are for accurate 3D modelling applications is the question that needs an answer. Therefore, in this paper, the performance of the Agisoft PhotoScan software is assessed and analyzed to show the potential of the software for accurate 3D modelling applications. To investigate this, a study was carried out at the University of Baghdad / Al-Jaderia campus using data collected by an airborne metric camera at a flying height of 457 m. Following statistical and shape-validation analysis, the Agisoft results show potential with respect to the research objective and the dataset quality.

  20. Evaluation of aileron actuator reliability with censored data

    Directory of Open Access Journals (Sweden)

    Li Huaiyuan

    2015-08-01

    For the purpose of enhancing the reliability of the aileron of the Airbus new-generation A350XWB, an evaluation of aileron reliability on the basis of maintenance data is presented in this paper. Practical maintenance data contain a large number of censored samples, whose information uncertainty makes it hard to evaluate the reliability of the aileron actuator. Considering that the true lifetime of a censored sample has the same distribution as a complete sample, if a censored sample is transformed into a complete sample, the conversion frequency of the censored sample can be estimated from the frequency of complete samples. On the one hand, the standard life-table estimate and the product-limit method are improved on the basis of this conversion frequency, enabling accurate estimation for various censored samples. On the other hand, by taking this frequency as one of the weight factors and incorporating the variance of order statistics under a standard distribution, a weighted least squares estimate is formed for accurately handling various censored samples. Extensive experiments and simulations show that the reliabilities given by the improved life table and the improved product-limit method are closer to the true value and more conservative; moreover, the weighted least squares estimate (WLSE), with the conversion frequency of censored samples and the variances of order statistics as the weights, can still estimate accurately with a high proportion of censored data in the samples. The algorithm in this paper performs well and can accurately estimate the reliability of the aileron actuator even with a small sample and a high censoring rate. This research has certain significance in theory and engineering practice.
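
The product-limit (Kaplan-Meier) method referenced above can be sketched directly; this is the textbook estimator, handling one event per time point, not the authors' improved variant:

```python
def product_limit(times, events):
    """Kaplan-Meier (product-limit) survival estimate.
    times: observed times; events: 1 = failure observed, 0 = right-censored.
    Assumes distinct time points (no ties) for simplicity."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival, s = [], 1.0
    for t, d in data:
        if d == 1:                       # a failure occurred at time t
            s *= (n_at_risk - 1) / n_at_risk
            survival.append((t, s))
        n_at_risk -= 1                   # censored units simply leave the risk set
    return survival

# Five actuators: failures at 100 h and 300 h, censored at 150 h, 200 h, 400 h.
est = product_limit([100, 150, 200, 300, 400], [1, 0, 0, 1, 0])
# S(100) = 4/5 = 0.8; S(300) = 0.8 * 1/2 = 0.4
```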

  1. Can Internet information on vertebroplasty be a reliable means of patient self-education?

    Science.gov (United States)

    Sullivan, T Barrett; Anderson, Joshua T; Ahn, Uri M; Ahn, Nicholas U

    2014-05-01

    regarding vertebroplasty is not only inadequate for proper patient education, but also potentially misleading as sites are more likely to present benefits of the procedure than risks. Although academic sites might be expected to offer higher-quality information than private, industry, or other sites, our data would suggest that they do not. HONCode certification cannot be used reliably as a means of qualifying website information quality. Academic sites should be expected to set a high standard and alter their Internet presence with adequate information distribution. Certification bodies also should alter their standards to necessitate provision of complete information in addition to emphasizing accurate information. Treating physicians may want to counsel their patients regarding the limitations of information present on the Internet and the pitfalls of current certification systems. Level IV, economic and decision analyses. See the Instructions for Authors for a complete description of levels of evidence.

  2. Software Estimation: Developing an Accurate, Reliable Method

    Science.gov (United States)

    2011-08-01

    ...based and size-based estimates is able to accurately plan, launch, and execute on schedule. Bob Sinclair, NAWCWD; Chris Rickets, NAWCWD; Brad Hodgins, NAWCWD. ...Office by Carnegie Mellon University. SMPSP and SMTSP are service marks of Carnegie Mellon University. References: 1. Rickets, Chris A., "A TSP Software Maintenance Life Cycle", CrossTalk, March 2005. 2. Koch, Alan S., "TSP Can Be the Building Blocks for CMMI", CrossTalk, March 2005. 3. Hodgins, Brad; Rickets, ...

  3. Reliability Modeling of Electromechanical System with Meta-Action Chain Methodology

    Directory of Open Access Journals (Sweden)

    Genbao Zhang

    2018-01-01

    To establish a more flexible and accurate reliability model, reliability modeling and a solving algorithm based on the meta-action chain concept are used in this paper. Instead of estimating the reliability of the whole system only in the standard operating mode, this paper adopts the structure chain and the operating action chain for system reliability modeling. The failure information and structure information for each component are integrated into the model to overcome the fixed factors assumed in traditional modeling. In industrial applications, there may be different operating modes for a multicomponent system. The meta-action chain methodology can estimate the system reliability under different operating modes by modeling the components with a variety of failure sensitivities. This approach has been verified by computing several electromechanical system cases. The results indicate that the process can improve system reliability estimation. It is an effective tool for solving the reliability estimation problem in systems under various operating modes.

  4. A Reliable Measure of Information Security Awareness and the Identification of Bias in Responses

    Directory of Open Access Journals (Sweden)

    Agata McCormac

    2017-11-01

    The Human Aspects of Information Security Questionnaire (HAIS-Q) is designed to measure information security awareness. More specifically, the tool measures an individual's knowledge, attitude, and self-reported behaviour relating to information security in the workplace. This paper reports on the reliability of the HAIS-Q, including test-retest reliability and internal consistency. The paper also assesses the reliability of three preliminary over-claiming items, designed specifically to complement the HAIS-Q and identify those individuals who provide socially desirable responses. A total of 197 working Australians completed two iterations of the HAIS-Q and the over-claiming items, approximately 4 weeks apart. Results of the analysis showed that the HAIS-Q was externally reliable and internally consistent. Therefore, the HAIS-Q can be used to reliably measure information security awareness. Reliability testing on the preliminary over-claiming items was not as robust, and further development is required and recommended. These findings mean that organisations can confidently use the HAIS-Q not only to measure the current state of employee information security awareness within their organisation, but also to measure the effectiveness and impact of training interventions, information security awareness programs, and campaigns. The influence of cultural changes and the effect of security incidents can also be assessed.
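
Internal consistency of a questionnaire such as the HAIS-Q is conventionally summarized by Cronbach's alpha; a minimal sketch with made-up response data (not HAIS-Q items):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for internal consistency.
    scores: respondents x items matrix of questionnaire answers."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Tiny illustration with 4 respondents and 3 Likert-style items:
scores = [[4, 5, 4],
          [2, 2, 3],
          [5, 4, 5],
          [3, 3, 2]]
alpha = cronbach_alpha(scores)
```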

  5. Differential contribution of visual and auditory information to accurately predict the direction and rotational motion of a visual stimulus.

    Science.gov (United States)

    Park, Seoung Hoon; Kim, Seonjin; Kwon, MinHyuk; Christou, Evangelos A

    2016-03-01

    Visual and auditory information are critical for perception and enhance an individual's ability to respond accurately to a stimulus. However, it was unknown whether visual and auditory information contribute differentially to identifying the direction and rotational motion of a stimulus. The purpose of this study was to determine the ability of an individual to accurately predict the direction and rotational motion of a stimulus based on visual and auditory information. We recruited 9 expert table-tennis players and used the table-tennis service as our experimental model. Participants watched recorded services with different levels of visual and auditory information. The goal was to anticipate the direction of the service (left or right) and its rotational motion (topspin, sidespin, or cut). We recorded their responses and quantified two outcomes: (i) directional accuracy and (ii) rotational-motion accuracy. Response accuracy was the number of accurate predictions relative to the total number of trials. The ability of the participants to predict the direction of the service accurately increased with additional visual information but not with auditory information. In contrast, the ability to predict the rotational motion of the service accurately increased when auditory information was added to visual information, but not with additional visual information alone. In conclusion, this finding demonstrates that visual information enhances the ability of an individual to accurately predict the direction of the stimulus, whereas additional auditory information enhances the ability to accurately predict its rotational motion.

  6. Social Information Is Integrated into Value and Confidence Judgments According to Its Reliability.

    Science.gov (United States)

    De Martino, Benedetto; Bobadilla-Suarez, Sebastian; Nouguchi, Takao; Sharot, Tali; Love, Bradley C

    2017-06-21

    How much we like something, whether it be a bottle of wine or a new film, is affected by the opinions of others. However, the social information that we receive can be contradictory and vary in its reliability. Here, we tested whether the brain incorporates these statistics when judging value and confidence. Participants provided value judgments about consumer goods in the presence of online reviews. We found that participants updated their initial value and confidence judgments in a Bayesian fashion, taking into account both the uncertainty of their initial beliefs and the reliability of the social information. Activity in dorsomedial prefrontal cortex tracked the degree of belief update. Analogous to how lower-level perceptual information is integrated, we found that the human brain integrates social information according to its reliability when judging value and confidence. SIGNIFICANCE STATEMENT The field of perceptual decision making has shown that the sensory system integrates different sources of information according to their respective reliability, as predicted by a Bayesian inference scheme. In this work, we hypothesized that a similar coding scheme is implemented by the human brain to process social signals and guide complex, value-based decisions. We provide experimental evidence that the human prefrontal cortex's activity is consistent with a Bayesian computation that integrates social information that differs in reliability and that this integration affects the neural representation of value and confidence. Copyright © 2017 De Martino et al.
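The reliability-weighted integration the authors describe follows the standard Bayesian cue-combination scheme. A minimal sketch, assuming Gaussian beliefs and signals (the abstract does not specify the paper's exact generative model; the numbers are invented):

```python
# Precision-weighted Bayesian update: each source is weighted by its
# precision (1/variance), so a more reliable social signal pulls harder.

def bayes_update(prior_mean, prior_var, signal_mean, signal_var):
    """Combine an initial value belief with a social signal."""
    w_prior = 1.0 / prior_var
    w_signal = 1.0 / signal_var
    post_mean = (w_prior * prior_mean + w_signal * signal_mean) / (w_prior + w_signal)
    post_var = 1.0 / (w_prior + w_signal)          # confidence always increases
    return post_mean, post_var

# A reliable review (low variance) shifts the judgment more than a noisy one.
m_reliable, _ = bayes_update(5.0, 1.0, 8.0, 0.5)   # -> 7.0
m_noisy, _ = bayes_update(5.0, 1.0, 8.0, 4.0)      # -> 5.6
```

The same precision-weighting rule is the one classically used to model multisensory perceptual integration, which is exactly the analogy the authors draw.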

  7. Reliability-based sensitivity of mechanical components with arbitrary distribution parameters

    International Nuclear Information System (INIS)

    Zhang, Yi Min; Yang, Zhou; Wen, Bang Chun; He, Xiang Dong; Liu, Qiaoling

    2010-01-01

    This paper presents a reliability-based sensitivity method for mechanical components with arbitrary distribution parameters. Techniques from the perturbation method, the Edgeworth series, the reliability-based design theory, and the sensitivity analysis approach were employed directly to calculate the reliability-based sensitivity of mechanical components on the condition that the first four moments of the original random variables are known. The reliability-based sensitivity information of the mechanical components can be accurately and quickly obtained using a practical computer program. The effects of the design parameters on the reliability of mechanical components were studied. The method presented in this paper provides the theoretic basis for the reliability-based design of mechanical components

  8. A New Method of Reliability Evaluation Based on Wavelet Information Entropy for Equipment Condition Identification

    International Nuclear Information System (INIS)

    He, Z J; Zhang, X L; Chen, X F

    2012-01-01

    Aiming at reliability evaluation for condition identification of mechanical equipment, it is necessary to analyze condition monitoring information. A new method of reliability evaluation is proposed, based on wavelet information entropy extracted from vibration signals of mechanical equipment. The method is quite different from traditional reliability evaluation models, which depend on statistical analysis of large sample data. The vibration signals of the equipment were analyzed by means of the second-generation wavelet packet (SGWP). The relative energy in each frequency band of the decomposed signal, i.e., each band's percentage of the whole signal energy, is taken as a probability. A normalized information entropy (IE) is obtained from these relative energies to describe the uncertainty of the system in place of a probability. The reliability degree is then transformed from the normalized wavelet information entropy. The method was successfully applied to evaluate the assembled-quality reliability of a dismountable disk-drum aero-engine, where the reliability degree indicated the assembled quality satisfactorily.
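The step from band energies to a reliability degree can be sketched as follows. This is a hedged illustration: the wavelet packet decomposition itself is replaced by given band energies, and the simple 1 − H mapping from entropy to reliability is an assumption, not necessarily the paper's exact transform.

```python
import math

def normalized_entropy(band_energies):
    """Shannon entropy of relative band energies, scaled to [0, 1]."""
    total = sum(band_energies)
    probs = [e / total for e in band_energies if e > 0]
    h = -sum(p * math.log(p) for p in probs)
    return h / math.log(len(band_energies))

def reliability_degree(band_energies):
    """Map uncertainty to a reliability degree (the 1 - H mapping is assumed)."""
    return 1.0 - normalized_entropy(band_energies)

# Energy concentrated in one band -> low uncertainty -> high reliability.
healthy = reliability_degree([100.0, 1.0, 1.0, 1.0])
# Energy spread evenly across bands -> maximal uncertainty -> reliability 0.
degraded = reliability_degree([1.0, 1.0, 1.0, 1.0])
```

The point of the normalization is that the entropy, and hence the reliability degree, stays in [0, 1] regardless of how many frequency bands the decomposition produces.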

  9. Quantum Monte Carlo: Faster, More Reliable, And More Accurate

    Science.gov (United States)

    Anderson, Amos Gerald

    2010-06-01

    The Schrodinger Equation has been available for about 83 years, but today we still strain to apply it accurately to molecules of interest. The difficulty is not theoretical but practical, since we are held back by a lack of sufficient computing power. Consequently, effort goes into finding acceptable approximations that facilitate timely solutions. In the meantime, computer technology has been advancing rapidly and changing the way we think about efficient algorithms. For those who can reorganize their formulas to take advantage of these changes, and thereby lift some approximations, incredible new opportunities await. Over the last decade we have seen the emergence of a new kind of computer processor, the graphics card. Designed to accelerate computer games by favoring processor quantity over individual processor quality, graphics cards have become powerful enough to be useful to scientists. In this thesis, we explore the first known use of a graphics card in computational chemistry by rewriting our Quantum Monte Carlo software in the requisite "data parallel" formalism. We find that, notwithstanding precision considerations, we are able to speed up our software by about a factor of 6. The success of a Quantum Monte Carlo calculation depends on more than processing power; it also requires the scientist to carefully design the trial wavefunction used to guide simulated electrons. We have studied the use of Generalized Valence Bond wavefunctions to capture, simply yet effectively, the essential static correlation in atoms and molecules. Furthermore, we have developed significantly improved two-particle correlation functions, designed with both flexibility and simplicity in mind, representing an effective and reliable way to add the necessary dynamic correlation. Lastly, we present our method for stabilizing the statistical nature of the calculation by manipulating configuration weights, thus facilitating efficient and robust calculations.

  10. Accurate protein structure modeling using sparse NMR data and homologous structure information.

    Science.gov (United States)

    Thompson, James M; Sgourakis, Nikolaos G; Liu, Gaohua; Rossi, Paolo; Tang, Yuefeng; Mills, Jeffrey L; Szyperski, Thomas; Montelione, Gaetano T; Baker, David

    2012-06-19

    While information from homologous structures plays a central role in X-ray structure determination by molecular replacement, such information is rarely used in NMR structure determination because it can be incorrect, both locally and globally, when evolutionary relationships are inferred incorrectly or there has been considerable evolutionary structural divergence. Here we describe a method that allows robust modeling of protein structures of up to 225 residues by combining (1)H(N), (13)C, and (15)N backbone and (13)Cβ chemical shift data, distance restraints derived from homologous structures, and a physically realistic all-atom energy function. Accurate models are distinguished from inaccurate models generated using incorrect sequence alignments by requiring that (i) the all-atom energies of models generated using the restraints are lower than models generated in unrestrained calculations and (ii) the low-energy structures converge to within 2.0 Å backbone rmsd over 75% of the protein. Benchmark calculations on known structures and blind targets show that the method can accurately model protein structures, even with very remote homology information, to a backbone rmsd of 1.2-1.9 Å relative to the conventional determined NMR ensembles and of 0.9-1.6 Å relative to X-ray structures for well-defined regions of the protein structures. This approach facilitates the accurate modeling of protein structures using backbone chemical shift data without need for side-chain resonance assignments and extensive analysis of NOESY cross-peak assignments.

  11. Fast Monte Carlo reliability evaluation using support vector machine

    International Nuclear Information System (INIS)

    Rocco, Claudio M.; Moreno, Jose Ali

    2002-01-01

    This paper deals with the feasibility of using a support vector machine (SVM) to build empirical models for use in reliability evaluation. The approach takes advantage of the speed of the SVM in the numerous model calculations typically required to perform a Monte Carlo reliability evaluation. The main idea is to develop an estimation algorithm by training a model on a restricted data set, replacing system performance evaluation with a simpler calculation that provides reasonably accurate outputs. The proposed approach is illustrated by several examples. Excellent system reliability results are obtained by training an SVM with a small amount of information.
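The surrogate idea can be sketched end to end: label a small sample with the expensive performance model, fit a cheap classifier, then run the large Monte Carlo through the classifier only. A hand-rolled logistic classifier stands in for the SVM so the sketch has no dependencies, and the two-component performance function is invented for illustration:

```python
import math
import random

random.seed(0)

def expensive_system_eval(x):
    # Stand-in for a costly performance model: the system
    # works when combined component capacity exceeds demand.
    return 1 if x[0] + x[1] > 1.0 else 0

# 1) Label a restricted training set with the expensive model.
train = [(random.random(), random.random()) for _ in range(300)]
labels = [expensive_system_eval(x) for x in train]

# 2) Fit a cheap linear surrogate (logistic regression by gradient
#    descent; an SVM would play this role in the paper's approach).
w, b, lr = [0.0, 0.0], 0.0, 2.0
for _ in range(500):
    gw, gb = [0.0, 0.0], 0.0
    for x, y in zip(train, labels):
        z = max(-30.0, min(30.0, w[0]*x[0] + w[1]*x[1] + b))
        p = 1.0 / (1.0 + math.exp(-z))
        gw[0] += (p - y) * x[0]
        gw[1] += (p - y) * x[1]
        gb += p - y
    n = len(train)
    w[0] -= lr * gw[0] / n
    w[1] -= lr * gw[1] / n
    b -= lr * gb / n

def surrogate(x):
    return 1 if w[0]*x[0] + w[1]*x[1] + b > 0 else 0

# 3) Run the large Monte Carlo through the cheap surrogate only.
n_mc = 20000
hits = sum(surrogate((random.random(), random.random())) for _ in range(n_mc))
reliability = hits / n_mc   # true value is 0.5 for this toy system
```

Only 300 expensive evaluations are spent on training; the remaining 20,000 samples cost a dot product each, which is the speed advantage the abstract refers to.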

  12. Can blind persons accurately assess body size from the voice?

    Science.gov (United States)

    Pisanski, Katarzyna; Oleszkiewicz, Anna; Sorokowska, Agnieszka

    2016-04-01

    Vocal tract resonances provide reliable information about a speaker's body size that human listeners use for biosocial judgements as well as speech recognition. Although humans can accurately assess men's relative body size from the voice alone, how this ability is acquired remains unknown. In this study, we test the prediction that accurate voice-based size estimation is possible without prior audiovisual experience linking low frequencies to large bodies. Ninety-one healthy congenitally or early blind, late blind and sighted adults (aged 20-65) participated in the study. On the basis of vowel sounds alone, participants assessed the relative body sizes of male pairs of varying heights. Accuracy of voice-based body size assessments significantly exceeded chance and did not differ among participants who were sighted, or congenitally blind or who had lost their sight later in life. Accuracy increased significantly with relative differences in physical height between men, suggesting that both blind and sighted participants used reliable vocal cues to size (i.e. vocal tract resonances). Our findings demonstrate that prior visual experience is not necessary for accurate body size estimation. This capacity, integral to both nonverbal communication and speech perception, may be present at birth or may generalize from broader cross-modal correspondences. © 2016 The Author(s).

  13. People consider reliability and cost when verifying their autobiographical memories.

    Science.gov (United States)

    Wade, Kimberley A; Nash, Robert A; Garry, Maryanne

    2014-02-01

    Because memories are not always accurate, people rely on a variety of strategies to verify whether the events that they remember really did occur. Several studies have examined which strategies people tend to use, but none to date has asked why people opt for certain strategies over others. Here we examined the extent to which people's beliefs about the reliability and the cost of different strategies would determine their strategy selection. Subjects described a childhood memory and then suggested strategies they might use to verify the accuracy of that memory. Next, they rated the reliability and cost of each strategy, and the likelihood that they might use it. Reliability and cost each predicted strategy selection, but a combination of the two ratings provided even greater predictive value. Cost was significantly more influential than reliability, which suggests that a tendency to seek and to value "cheap" information more than reliable information could underlie many real-world memory errors. Copyright © 2013 Elsevier B.V. All rights reserved.

  14. Toddlers favor communicatively presented information over statistical reliability in learning about artifacts.

    Directory of Open Access Journals (Sweden)

    Hanna Marno

    Full Text Available Observed associations between events can be validated by statistical information of reliability or by testament of communicative sources. We tested whether toddlers learn from their own observation of efficiency, assessed by statistical information on reliability of interventions, or from communicatively presented demonstration, when these two potential types of evidence of validity of interventions on a novel artifact are contrasted with each other. Eighteen-month-old infants observed two adults, one operating the artifact by a method that was more efficient (2/3 probability of success) than that of the other (1/3 probability of success). Compared to the Baseline condition, in which communicative signals were not employed, infants tended to choose the less reliable method to operate the artifact when this method was demonstrated in a communicative manner in the Experimental condition. This finding demonstrates that, in certain circumstances, communicative sanctioning of reliability may override statistical evidence for young learners. Such a bias can serve fast and efficient transmission of knowledge between generations.

  15. Reliability and Validity of Curriculum-Based Informal Reading Inventories.

    Science.gov (United States)

    Fuchs, Lynn; And Others

    A study was conducted to explore the reliability and validity of three prominent procedures used in informal reading inventories (IRIs): (1) choosing a 95% word recognition accuracy standard for determining student instructional level, (2) arbitrarily selecting a passage to represent the difficulty level of a basal reader, and (3) employing…

  16. Laboratory Information Management System Chain of Custody: Reliability and Security

    Science.gov (United States)

    Tomlinson, J. J.; Elliott-Smith, W.; Radosta, T.

    2006-01-01

    A chain of custody (COC) is required in many laboratories that handle forensics, drugs of abuse, environmental, clinical, and DNA testing, as well as other laboratories that want to assure reliability of reported results. Maintaining a dependable COC can be laborious, but with the recent establishment of the criteria for electronic records and signatures by US regulatory agencies, laboratory information management systems (LIMSs) are now being developed to fully automate COCs. The extent of automation and of data reliability can vary, and FDA- and EPA-compliant electronic signatures and system security are rare. PMID:17671623

  17. An information system supporting design for reliability and maintenance

    International Nuclear Information System (INIS)

    Rit, J.F.; Beraud, M.T.

    1997-01-01

    EDF is currently developing a methodology to integrate availability, operating experience and maintenance in the design of power plants. This involves studies that depend closely on the results and assumptions of each other about the reliability and operations of the plant. Therefore a support information system must be carefully designed. Concurrently with development of the methodology, a research oriented information system was designed and built. It is based on the database model of a logistic support repository that we tailored to our needs. (K.A.)

  18. An information system supporting design for reliability and maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Rit, J.F.; Beraud, M.T

    1997-12-31

    EDF is currently developing a methodology to integrate availability, operating experience and maintenance in the design of power plants. This involves studies that depend closely on the results and assumptions of each other about the reliability and operations of the plant. Therefore a support information system must be carefully designed. Concurrently with development of the methodology, a research oriented information system was designed and built. It is based on the database model of a logistic support repository that we tailored to our needs. (K.A.) 10 refs.

  19. Under Construction: Reviewing and Producing Information Reliability on the Web

    NARCIS (Netherlands)

    S.A. Adams (Samantha)

    2006-01-01

    textabstractSince 1995, medical professionals, governments and independent organizations have been developing special tools to help lay-persons find websites that are guaranteed to give only reliable medical or health-related information. However, as these different actors also recognize, such a

  20. Molecular acidity: An accurate description with information-theoretic approach in density functional reactivity theory.

    Science.gov (United States)

    Cao, Xiaofang; Rong, Chunying; Zhong, Aiguo; Lu, Tian; Liu, Shubin

    2018-01-15

    Molecular acidity is one of the important physiochemical properties of a molecular system, yet its accurate calculation and prediction are still an unresolved problem in the literature. In this work, we propose to make use of the quantities from the information-theoretic (IT) approach in density functional reactivity theory and provide an accurate description of molecular acidity from a completely new perspective. To illustrate our point, five different categories of acidic series, singly and doubly substituted benzoic acids, singly substituted benzenesulfinic acids, benzeneseleninic acids, phenols, and alkyl carboxylic acids, have been thoroughly examined. We show that using IT quantities such as Shannon entropy, Fisher information, Ghosh-Berkowitz-Parr entropy, information gain, Onicescu information energy, and relative Rényi entropy, one is able to simultaneously predict experimental pKa values of these different categories of compounds. Because of the universality of the quantities employed in this work, which are all density dependent, our approach should be general and be applicable to other systems as well. © 2017 Wiley Periodicals, Inc.
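As a toy illustration of the first of these density-dependent quantities, the Shannon entropy of a normalized density can be evaluated on a grid. The 1-D Gaussian density here is purely illustrative; the paper applies such functionals to molecular electron densities in three dimensions.

```python
import math

def shannon_entropy(density, dx):
    """S = -sum rho*ln(rho)*dx for a density sampled on a uniform 1-D grid."""
    return -sum(r * math.log(r) * dx for r in density if r > 0)

# Normalized unit Gaussian density on a fine grid over [-8, 8].
dx = 0.01
grid = [i * dx for i in range(-800, 801)]
rho = [math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi) for x in grid]

s = shannon_entropy(rho, dx)
# Analytic value for a unit Gaussian: 0.5 * ln(2*pi*e) ~ 1.4189,
# so the grid sum can be checked against a closed form.
```

A more diffuse (spread-out) density yields a larger entropy, which is the kind of electron delocalization signal these IT descriptors correlate with acidity.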

  1. Distributed Information and Control system reliability enhancement by fog-computing concept application

    Science.gov (United States)

    Melnik, E. V.; Klimenko, A. B.; Ivanov, D. Ya

    2018-03-01

    The paper focuses on the reliability of information and control systems. The authors propose a new complex approach to enhancing information and control system reliability by applying elements of the fog-computing concept. The proposed approach consists of a complex of optimization problems to be solved: estimating the computational load that can be shifted to the edge of the network and the fog layer, distributing computations among the data processing elements, and distributing computations among the sensors. These problems are formulated, and simulation results are presented and discussed, within this paper.

  2. Anatomical landmarks and skin markers are not reliable for accurate labeling of thoracic vertebrae on MRI

    International Nuclear Information System (INIS)

    Shabshin, Nogah; Schweitzer, Mark E.; Carrino, John A.

    2010-01-01

    Background: Numbering of the thoracic spine on MRI can be tedious if C2 and L5-S1 are not included and may lead to errors in lesion level. Purpose: To determine whether anatomic landmarks or external markers are reliable as an aid for accurate numbering of thoracic vertebrae on MRI. Material and Methods: Sixty-seven thoracic spine MR studies of 67 patients (30 males, 37 females, age range 18-83 years) were studied, composed of 52 consecutive MR studies and an additional 15 MRI in which vitamin E markers were placed over the skin. In the 52 thoracic MR examinations potential numbering aids such as the level of the sternal apex, pulmonary artery, aortic arch, and osseous or disc abnormalities were numbered on both cervical localizer (standard of reference) and thoracic sagittal images. The additional 15 examinations in which vitamin E markers were placed over the skin were evaluated for consistency in the level of the markers on different sequences in the same exam. Results: The sternal apex level ranged from T2 to T5 [T3 in 28/51 patients (55%), T2 in 10/51 (20%)]. The aortic arch level ranged from T2 to T4 [T4 in 18/48 (38%) and T3 in 17 (35%)]. Pulmonary artery level ranged from T4 to T6-7 disc [T5 in 20/52 patients (38%) and T6 in 14/52 (27%)]. In 3 of 12 patients who had abnormalities in a vertebral body or disc as definite point reference, the non-localizer image mislabelled the level. In 11/15 (73%) patients with vitamin E markers that were placed over the upper thoracic spine, the results showed consistency in the level of the markers in relation to the reference points or consistent inter-marker gap between the sequences. Conclusion: There are only two reliable ways to accurately define the levels if no landmarking feature is available on the magnet. The first is by including C2 in the thoracic sequence of a diagnostic quality, and the second is by using an abnormality in the discs or vertebral bodies as a point of reference

  3. Anatomical landmarks and skin markers are not reliable for accurate labeling of thoracic vertebrae on MRI

    Energy Technology Data Exchange (ETDEWEB)

    Shabshin, Nogah (Dept. of Diagnostic Imaging, Chaim Sheba Medical Center, Tel-HaShomer (Israel)), e-mail: shabshin@gmail.com; Schweitzer, Mark E. (Dept. of Diagnostic Imaging, Ottawa Hospital and Univ. of Ottawa, Ottawa (Canada)); Carrino, John A. (Dept. of Radiology, Johns Hopkins Univ. School of Medicine, Baltimore, MD (United States))

    2010-11-15

    Background: Numbering of the thoracic spine on MRI can be tedious if C2 and L5-S1 are not included and may lead to errors in lesion level. Purpose: To determine whether anatomic landmarks or external markers are reliable as an aid for accurate numbering of thoracic vertebrae on MRI. Material and Methods: Sixty-seven thoracic spine MR studies of 67 patients (30 males, 37 females, age range 18-83 years) were studied, composed of 52 consecutive MR studies and an additional 15 MRI in which vitamin E markers were placed over the skin. In the 52 thoracic MR examinations potential numbering aids such as the level of the sternal apex, pulmonary artery, aortic arch, and osseous or disc abnormalities were numbered on both cervical localizer (standard of reference) and thoracic sagittal images. The additional 15 examinations in which vitamin E markers were placed over the skin were evaluated for consistency in the level of the markers on different sequences in the same exam. Results: The sternal apex level ranged from T2 to T5 [T3 in 28/51 patients (55%), T2 in 10/51 (20%)]. The aortic arch level ranged from T2 to T4 [T4 in 18/48 (38%) and T3 in 17 (35%)]. Pulmonary artery level ranged from T4 to T6-7 disc [T5 in 20/52 patients (38%) and T6 in 14/52 (27%)]. In 3 of 12 patients who had abnormalities in a vertebral body or disc as definite point reference, the non-localizer image mislabelled the level. In 11/15 (73%) patients with vitamin E markers that were placed over the upper thoracic spine, the results showed consistency in the level of the markers in relation to the reference points or consistent inter-marker gap between the sequences. Conclusion: There are only two reliable ways to accurately define the levels if no landmarking feature is available on the magnet. The first is by including C2 in the thoracic sequence of a diagnostic quality, and the second is by using an abnormality in the discs or vertebral bodies as a point of reference

  4. Methods for Calculating Frequency of Maintenance of Complex Information Security System Based on Dynamics of Its Reliability

    Science.gov (United States)

    Varlataya, S. K.; Evdokimov, V. E.; Urzov, A. Y.

    2017-11-01

    This article describes the process of calculating the reliability of a complex information security system (CISS), using the example of a technospheric security management model, as well as the ability to determine the frequency of its maintenance from the system reliability parameter, which allows one to assess man-made risks and to forecast natural and man-made emergencies. The relevance of this article is explained by the fact that CISS reliability is closely related to information security (IS) risks. Since reliability (or resiliency) is a probabilistic characteristic of the system showing the possibility of its failure (and, as a consequence, the emergence of threats to the protected information assets), it is seen as a component of the overall IS risk in the system. As is known, there is a certain acceptable level of IS risk assigned by experts for a particular information system; where reliability is a risk-forming factor, maintaining an acceptable risk level should be carried out through routine analysis of the condition of the CISS and its elements and their timely service. The article presents a reliability parameter calculation for a CISS with a mixed type of element connection, and a formula for the dynamics of such a system's reliability is written. The chart of CISS reliability change is an S-shaped curve that can be divided into three periods: an almost invariably high level of reliability, uniform reliability reduction, and an almost invariably low level of reliability. Setting the minimum acceptable level of reliability, the graph (or formula) can be used to determine the period of time during which the system will meet requirements; ideally, this period should be no longer than the first period of the graph. Thus, the proposed method of calculating the CISS maintenance frequency helps to solve the voluminous and critical task of managing risks to information assets.
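The use of the S-shaped curve to pick a service interval can be sketched with an assumed decay law. The logistic form, and the parameters t0 and k, are invented for illustration; the abstract does not give the paper's actual reliability dynamics formula.

```python
import math

def reliability(t, t0=100.0, k=0.08):
    """Assumed S-shaped reliability decay (logistic form; illustrative only)."""
    return 1.0 / (1.0 + math.exp(k * (t - t0)))

def maintenance_deadline(r_min, t0=100.0, k=0.08):
    """Latest time at which reliability still meets the acceptable level r_min.

    Solves 1/(1 + exp(k*(t - t0))) = r_min for t in closed form.
    """
    return t0 + math.log(1.0 / r_min - 1.0) / k

# With an acceptable reliability level of 0.9, service must occur by:
deadline = maintenance_deadline(0.9)
```

Maintenance scheduled at or before `deadline` keeps the system inside the first, high-reliability period of the curve, which is exactly the criterion the article proposes.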

  5. Evaluation of Information Requirements of Reliability Methods in Engineering Design

    DEFF Research Database (Denmark)

    Marini, Vinicius Kaster; Restrepo-Giraldo, John Dairo; Ahmed-Kristensen, Saeema

    2010-01-01

    This paper aims to characterize the information needed to perform methods for robustness and reliability, and to verify their applicability to early design stages. Several methods were evaluated on their support for synthesis in engineering design. Of those methods, FMEA, FTA and HAZOP were selected...

  6. Application of Vibration and Oil Analysis for Reliability Information on Helicopter Main Rotor Gearbox

    Science.gov (United States)

    Murrad, Muhamad; Leong, M. Salman

    Based on the experiences of the Malaysian Armed Forces (MAF), failure of the main rotor gearbox (MRGB) was one of the major contributing factors to helicopter breakdowns. Even though vibration and oil analysis are effective techniques for monitoring the health of helicopter components, these two techniques were rarely combined to form an effective assessment tool in the MAF. Results of the oil analysis were often used only for the oil-changing schedule, while assessments of MRGB condition were based mainly on overall vibration readings. A study group was formed and given a mandate to improve the maintenance strategy of the S61-A4 helicopter fleet in the MAF. The improvement consisted of a structured approach to reassessing and redefining the maintenance actions that should be taken for the MRGB. Basic and enhanced tools for condition monitoring (CM) were investigated to address the predominant failures of the MRGB. Quantitative accelerated life testing (QALT) was considered in this work with the intent of obtaining the required reliability information in a shorter time, with tests under normal stress conditions. These tests, when performed correctly, can provide valuable information about MRGB performance under normal operating conditions, enabling maintenance personnel to make decisions more quickly, accurately and economically. The time-to-failure and probability-of-failure information for the MRGB were generated by applying QALT analysis principles. This study is anticipated to make a dramatic change in the MAF's approach to CM, bringing significant savings and various benefits.

  7. Accurate thermoelastic tensor and acoustic velocities of NaCl

    Energy Technology Data Exchange (ETDEWEB)

    Marcondes, Michel L., E-mail: michel@if.usp.br [Physics Institute, University of Sao Paulo, Sao Paulo, 05508-090 (Brazil); Chemical Engineering and Material Science, University of Minnesota, Minneapolis, 55455 (United States); Shukla, Gaurav, E-mail: shukla@physics.umn.edu [School of Physics and Astronomy, University of Minnesota, Minneapolis, 55455 (United States); Minnesota supercomputer Institute, University of Minnesota, Minneapolis, 55455 (United States); Silveira, Pedro da [Chemical Engineering and Material Science, University of Minnesota, Minneapolis, 55455 (United States); Wentzcovitch, Renata M., E-mail: wentz002@umn.edu [Chemical Engineering and Material Science, University of Minnesota, Minneapolis, 55455 (United States); Minnesota supercomputer Institute, University of Minnesota, Minneapolis, 55455 (United States)

    2015-12-15

    Despite the importance of thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, the approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids of any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  8. Improving Metrological Reliability of Information-Measuring Systems Using Mathematical Modeling of Their Metrological Characteristics

    Science.gov (United States)

    Kurnosov, R. Yu; Chernyshova, T. I.; Chernyshov, V. N.

    2018-05-01

    The algorithms for improving the metrological reliability of analogue blocks of measuring channels and information-measuring systems are developed. The proposed algorithms ensure the optimum values of their metrological reliability indices for a given analogue circuit block solution.

  9. FINANCIAL STATEMENTS – SUPPLIER OF FINANCIAL ACCOUNTING INFORMATION

    Directory of Open Access Journals (Sweden)

    Costi Boby

    2013-09-01

    Full Text Available Continuous improvement of the accounting information system centres on financial and accounting information, which must be true, accurate, reliable and timely presented to users, and constructed so as to meet the different goals of different users.

  10. Highly accurate fluorogenic DNA sequencing with information theory-based error correction.

    Science.gov (United States)

    Chen, Zitian; Zhou, Wenxiong; Qiao, Shuo; Kang, Li; Duan, Haifeng; Xie, X Sunney; Huang, Yanyi

    2017-12-01

    Eliminating errors in next-generation DNA sequencing has proved challenging. Here we present error-correction code (ECC) sequencing, a method to greatly improve sequencing accuracy by combining fluorogenic sequencing-by-synthesis (SBS) with an information theory-based error-correction algorithm. ECC embeds redundancy in sequencing reads by creating three orthogonal degenerate sequences, generated by alternate dual-base reactions. This is similar to encoding and decoding strategies that have proved effective in detecting and correcting errors in information communication and storage. We show that, when combined with a fluorogenic SBS chemistry with raw accuracy of 98.1%, ECC sequencing provides single-end, error-free sequences up to 200 bp. ECC approaches should enable accurate identification of extremely rare genomic variations in various applications in biology and medicine.
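The degenerate-sequence code in this record is specific to fluorogenic chemistry, but the redundancy principle it relies on can be sketched generically. The snippet below is our own illustration, not the paper's ECC scheme: independent substitution errors at distinct positions across three reads of the same template are corrected by per-position majority vote.

```python
# Generic illustration of redundancy-based error correction (NOT the
# paper's degenerate-sequence code): three independent noisy reads of
# the same template are combined by per-position majority vote.
from collections import Counter

def majority_consensus(reads):
    """Return the per-position majority base across equal-length reads."""
    assert len({len(r) for r in reads}) == 1, "reads must be aligned"
    return "".join(Counter(col).most_common(1)[0][0] for col in zip(*reads))

reads = ["ACGTTGCA",   # error-free read
         "ACGATGCA",   # one substitution error at position 3
         "ACGTTGGA"]   # one substitution error at position 6
print(majority_consensus(reads))
```

Because the two errors fall at different positions, the consensus recovers the original template; errors that coincide at the same position in a majority of reads would not be correctable, which is why the actual code embeds structured (orthogonal) rather than plain repeated redundancy.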

  11. Reliability-Based Decision Fusion in Multimodal Biometric Verification Systems

    Directory of Open Access Journals (Sweden)

    Kryszczuk Krzysztof

    2007-01-01

    Full Text Available We present a methodology of reliability estimation in the multimodal biometric verification scenario. Reliability estimation has been shown to be an efficient and accurate way of predicting and correcting erroneous classification decisions in both unimodal (speech, face, online signature) and multimodal (speech and face) systems. While the initial research results indicate the high potential of the proposed methodology, the performance of reliability estimation in a multimodal setting has not been sufficiently studied or evaluated. In this paper, we demonstrate the advantages of using unimodal reliability information to perform an efficient biometric fusion of two modalities. We further show the presented method to be superior to state-of-the-art multimodal decision-level fusion schemes. The experimental evaluation presented in this paper is based on the popular benchmarking bimodal BANCA database.

  12. Reliability and Validity of the Clinical Dementia Rating for Community-Living Elderly Subjects without an Informant.

    Science.gov (United States)

    Nyunt, Ma Shwe Zin; Chong, Mei Sian; Lim, Wee Shiong; Lee, Tih Shih; Yap, Philip; Ng, Tze Pin

    2013-01-01

    The Clinical Dementia Rating (CDR) scale is widely used to assess cognitive impairment in Alzheimer's disease. It requires collateral information from a reliable informant, who is not available in many instances. We adapted the original CDR scale for use with elderly subjects without an informant (CDR-NI) and evaluated its reliability and validity for assessing mild cognitive impairment (MCI) and dementia among community-dwelling elderly subjects. At two consecutive visits 1 week apart, nurses trained in CDR assessment interviewed, observed and rated cognitive and functional performance according to a protocol in 90 elderly subjects with suboptimal cognitive performance [Mini-Mental State Examination (MMSE) …]. The CDR-NI showed good inter-rater reliability (κ 0.77-1.00 for six domains and 0.95 for global rating) and test-retest reliability (κ 0.75-1.00 for six domains and 0.80 for global rating), good agreement (κ 0.79) with the clinical assessment status of MCI (n = 37) and dementia (n = 4) and significant differences in the mean scores for MMSE, MOCA and Instrumental Activities of Daily Living (ANOVA global p …), providing reliable assessment of MCI and dementia in community-living elderly subjects without an informant.

  13. Using electronic health records and Internet search information for accurate influenza forecasting.

    Science.gov (United States)

    Yang, Shihao; Santillana, Mauricio; Brownstein, John S; Gray, Josh; Richardson, Stewart; Kou, S C

    2017-05-08

    Accurate influenza activity forecasting helps public health officials prepare and allocate resources for unusual influenza activity. Traditional flu surveillance systems, such as the Centers for Disease Control and Prevention's (CDC) influenza-like illness reports, lag behind real time by one to two weeks, whereas information contained in cloud-based electronic health records (EHR) and in Internet users' search activity is typically available in near real time. We present a method that combines the information from these two data sources with historical flu activity to produce national flu forecasts for the United States up to 4 weeks ahead of the publication of CDC's flu reports. We extend a method originally designed to track flu using Google searches, named ARGO, to combine information from EHR and Internet searches with historical flu activity. Our regularized multivariate regression model dynamically selects the most appropriate variables for flu prediction every week. The model is assessed for the flu seasons within the time period 2013-2016 using multiple metrics, including root mean squared error (RMSE). Our method reduces the RMSE of the publicly available alternative (Healthmap flutrends) method by 33, 20, 17 and 21% for the four time horizons: real-time, one, two, and three weeks ahead, respectively. Such accuracy improvements are statistically significant at the 5% level. Our real-time estimates correctly identified the peak timing and magnitude of the studied flu seasons. Our method significantly reduces the prediction error when compared to historical publicly available Internet-based prediction systems, demonstrating that: (1) the method of combining data sources is as important as data quality; and (2) effectively extracting information from a cloud-based EHR and Internet search activity leads to accurate forecasting of flu.
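The core regression step can be sketched in a few lines. The following is a simplified stand-in for the ARGO-style model, not the paper's implementation: the data are synthetic, and the two-proxy setup, noise levels, and ridge penalty are our assumptions. Two noisy real-time proxies are combined by regularized least squares and compared against one-week persistence of the official signal, which in practice is not even available in real time because of the reporting lag.

```python
# Simplified ARGO-style sketch: combine two noisy real-time proxies
# (an EHR-based and a search-based activity signal) with a ridge
# regression to estimate flu activity on a held-out season.
import numpy as np

rng = np.random.default_rng(0)
weeks = 156
flu = 2.0 + np.sin(2 * np.pi * np.arange(weeks) / 52)   # "true" activity
ehr = flu + rng.normal(0, 0.05, weeks)                   # EHR proxy
search = flu + rng.normal(0, 0.10, weeks)                # search proxy

X = np.column_stack([np.ones(weeks), ehr, search])
train = slice(0, 104)                                    # first two seasons
lam = 0.1                                                # ridge penalty
P = np.eye(X.shape[1]); P[0, 0] = 0                      # don't penalize intercept
w = np.linalg.solve(X[train].T @ X[train] + lam * P, X[train].T @ flu[train])

pred = X[104:] @ w                                       # third, held-out season
rmse = float(np.sqrt(np.mean((pred - flu[104:]) ** 2)))
rmse_persist = float(np.sqrt(np.mean((flu[103:155] - flu[104:]) ** 2)))
print(rmse, rmse_persist)                                # model vs one-week persistence
```

The fitted weights come out close to inverse-variance weighting of the two proxies, which is why the combined estimate beats either proxy alone as well as the persistence baseline.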

  14. Pre-Proposal Assessment of Reliability for Spacecraft Docking with Limited Information

    Science.gov (United States)

    Brall, Aron

    2013-01-01

    This paper addresses the problem of estimating the reliability of a critical system function as well as its impact on the system reliability when limited information is available. The approach addresses the basic function reliability, and then the impact of multiple attempts to accomplish the function. The dependence of subsequent attempts on prior failure to accomplish the function is also addressed. The autonomous docking of two spacecraft was the specific example that generated the inquiry, and the resultant impact on total reliability generated substantial interest in presenting the results due to the relative insensitivity of overall performance to basic function reliability and moderate degradation given sufficient attempts to try and accomplish the required goal. The application of the methodology allows proper emphasis on the characteristics that can be estimated with some knowledge, and to insulate the integrity of the design from those characteristics that can't be properly estimated with any rational value of uncertainty. The nature of NASA's missions contains a great deal of uncertainty due to the pursuit of new science or operations. This approach can be applied to any function where multiple attempts at success, with or without degradation, are allowed.
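The multiple-attempt effect described above can be sketched directly. The numbers below are illustrative, not the paper's: each failed attempt degrades the next attempt's success probability by a constant factor, and the function succeeds overall if any attempt works.

```python
# Sketch of reliability under repeated attempts with degradation
# (illustrative model and numbers, not the paper's actual estimates).
def mission_success(p0, d, attempts):
    """P(at least one success) when attempt i succeeds with probability p0 * d**i."""
    p_all_fail = 1.0
    p = p0
    for _ in range(attempts):
        p_all_fail *= (1.0 - p)
        p *= d          # degradation conditioned on prior failure
    return 1.0 - p_all_fail

# Overall performance is relatively insensitive to basic function
# reliability once enough attempts are allowed:
print(round(mission_success(0.9, 1.0, 3), 6))   # no degradation  -> 0.999
print(round(mission_success(0.9, 0.8, 3), 6))   # 20% degradation -> 0.988128
```

This mirrors the record's observation: with sufficient attempts and only moderate degradation, total reliability stays high even when the basic per-attempt reliability is uncertain.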

  15. Analysis of an Internet Community about Pneumothorax and the Importance of Accurate Information about the Disease.

    Science.gov (United States)

    Kim, Bong Jun; Lee, Sungsoo

    2018-04-01

    The huge improvements in the speed of data transmission and the increasing amount of data available as the Internet has expanded have made it easy to obtain information about any disease. Since pneumothorax frequently occurs in young adolescents, patients often search the Internet for information on pneumothorax. This study analyzed an Internet community for exchanging information on pneumothorax, with an emphasis on the importance of accurate information and doctors' role in providing such information. This study assessed 599,178 visitors to the Internet community from June 2008 to April 2017. There was an average of 190 visitors, 2.2 posts, and 4.5 replies per day. A total of 6,513 posts were made, and 63.3% of them included questions about the disease. The visitors mostly searched for terms such as 'pneumothorax,' 'recurrent pneumothorax,' 'pneumothorax operation,' and 'obtaining a medical certification of having been diagnosed with pneumothorax.' However, 22% of the pneumothorax-related posts by visitors contained inaccurate information. Internet communities can be an important source of information. However, incorrect information about a disease can be harmful for patients. We, as doctors, should try to provide more in-depth information about diseases to patients and to disseminate accurate information about diseases in Internet communities.

  16. Limits on reliable information flows through stochastic populations.

    Science.gov (United States)

    Boczkowski, Lucas; Natale, Emanuele; Feinerman, Ofer; Korman, Amos

    2018-06-06

    Biological systems can share and collectively process information to yield emergent effects, despite inherent noise in communication. While man-made systems often employ intricate structural solutions to overcome noise, the structure of many biological systems is more amorphous. It is not well understood how communication noise may affect the computational repertoire of such groups. To approach this question we consider the basic collective task of rumor spreading, in which information from few knowledgeable sources must reliably flow into the rest of the population. We study the effect of communication noise on the ability of groups that lack stable structures to efficiently solve this task. We present an impossibility result which strongly restricts reliable rumor spreading in such groups. Namely, we prove that, in the presence of even moderate levels of noise that affect all facets of the communication, no scheme can significantly outperform the trivial one in which agents have to wait until directly interacting with the sources-a process which requires linear time in the population size. Our results imply that in order to achieve efficient rumor spread a system must exhibit either some degree of structural stability or, alternatively, some facet of the communication which is immune to noise. We then corroborate this claim by providing new analyses of experimental data regarding recruitment in Cataglyphis niger desert ants. Finally, in light of our theoretical results, we discuss strategies to overcome noise in other biological systems.
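A toy simulation conveys the contrast the impossibility result draws. The model below is our own simplification, not the paper's exact noise model: in the noiseless case every informed agent can pass the rumor on, while under heavy noise only direct contact with the source informs an agent.

```python
# Toy rumor-spreading simulation: noiseless push spreads in ~log(n)
# rounds, while source-only (noise-drowned) spread takes ~n log(n).
import random

def rounds_to_spread(n, noisy, rng, max_rounds=10_000):
    informed = {0}                      # agent 0 is the knowledgeable source
    for t in range(1, max_rounds + 1):
        for a in list(informed):
            b = rng.randrange(n)        # each informed agent pushes to a random peer
            if not noisy or a == 0:     # under noise, only the source's messages survive
                informed.add(b)
        if len(informed) == n:
            return t
    return max_rounds

rng = random.Random(42)
fast = rounds_to_spread(200, noisy=False, rng=rng)
slow = rounds_to_spread(200, noisy=True, rng=rng)
print(fast, slow)
```

With 200 agents the noiseless protocol finishes in a few dozen rounds at most, while the noise-limited variant needs at least 199 rounds (one effective inform per round) and typically far more, illustrating the linear-time lower bound.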

  17. Towards more accurate and reliable predictions for nuclear applications

    International Nuclear Information System (INIS)

    Goriely, S.

    2015-01-01

    The need for nuclear data far from the valley of stability, for applications such as nuclear astrophysics or future nuclear facilities, challenges the robustness as well as the predictive power of present nuclear models. Most nuclear data evaluation and prediction are still performed on the basis of phenomenological nuclear models. Over the last decades, important progress has been achieved in fundamental nuclear physics, making it now feasible to use more reliable, but also more complex, microscopic or semi-microscopic models in the evaluation and prediction of nuclear data for practical applications. In the present contribution, the reliability and accuracy of recent nuclear theories are discussed for most of the relevant quantities needed to estimate reaction cross sections and beta-decay rates, namely nuclear masses, nuclear level densities, gamma-ray strength, fission properties and beta-strength functions. It is shown that nowadays, mean-field models can be tuned to the same level of accuracy as the phenomenological models, renormalized on experimental data if needed, and therefore can replace the phenomenological inputs in the prediction of nuclear data. While fundamental nuclear physicists keep on improving state-of-the-art models, e.g. within the shell model or ab initio models, nuclear applications could make use of their most recent results as quantitative constraints or guides to improve the predictions in energy or mass domains that will remain inaccessible experimentally. (orig.)

  18. INNOVATIVE METHODS TO EVALUATE THE RELIABILITY OF INFORMATION CONSOLIDATED FINANCIAL STATEMENTS

    Directory of Open Access Journals (Sweden)

    Irina P. Kurochkina

    2014-01-01

    Full Text Available The article explores the possibility of using foreign innovative methods to assess the reliability of information in consolidated financial statements of Russian companies. Recommendations are made for their adaptation and application in commercial organizations. Banish method indicators are implemented in one of the world's largest vertically integrated steel and mining companies. Audit firms are proposed to use methods of assessing the reliability of information in the practical application of ISA.

  19. Digital Cadastres Facilitating Land Information Management ...

    African Journals Online (AJOL)

    However, to achieve better land management, there is a need for accurate, reliable and up-to-date information about land. Proper land management policies nevertheless remain a challenge for most governments in African nations. Problems with land information differ from case to case, but among the most common are the ...

  20. Modeling reliability measurement of interface on information system: Towards the forensic of rules

    Science.gov (United States)

    Nasution, M. K. M.; Sitompul, Darwin; Harahap, Marwan

    2018-02-01

    Today almost all machines depend on software. A software and hardware system in turn depends on rules, that is, the procedures for its use. If a procedure or program can be reliably characterized using the concepts of graphs, logic, and probability, then the strength of those rules can be measured accordingly. Therefore, this paper initiates an enumeration model to measure the reliability of interfaces, based on the case of information systems governed by rules of use issued by the relevant agencies. The enumeration model is obtained from a software reliability calculation.
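One minimal reading of "measuring rule strength with probability" can be sketched as a series structure over the rules. This is our own illustration, not the paper's enumeration model: if using an interface requires every rule in a chain of procedures to be complied with, and rule i is complied with independently with probability p_i, the rule-based reliability is the product over the chain.

```python
# Hedged sketch: series (all-rules-must-hold) reliability of an
# interface, with per-rule compliance probabilities as inputs.
from math import prod

def interface_reliability(rule_probs):
    """Reliability when every rule in the chain must be followed."""
    return prod(rule_probs)

print(interface_reliability([0.99, 0.95, 0.9]))  # ~0.84645
```

Even highly reliable individual rules compound: three rules at 90-99% compliance already drop the interface below 85%, which is why enumeration over the full rule set matters.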

  1. OPTIMUM DESIGN OF EXPERIMENTS FOR ACCELERATED RELIABILITY TESTING

    Directory of Open Access Journals (Sweden)

    Sebastian Marian ZAHARIA

    2014-05-01

    Full Text Available This paper presents a case study that demonstrates how design of experiments (DOE) information can be used to design better accelerated reliability tests. The case study compares and optimizes the main accelerated reliability test plans (3 Level Best Standard Plan, 3 Level Best Compromise Plan, 3 Level Best Equal Expected Number Failing Plan, 3 Level 4:2:1 Allocation Plan). Before starting an accelerated reliability test, it is advisable to have a plan that helps in accurately estimating reliability at operating conditions while minimizing test time and costs. A test plan should be used to decide on the appropriate stress levels (for each stress type) and the number of test units to allocate to the different stress levels (for each combination of the stress types' levels). For the case study, the ALTA 7 software was used, which provides a complete analysis of data from accelerated reliability tests.
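Two ingredients of such a plan can be sketched numerically. The 4:2:1 split below matches the last plan named above, but the activation energy and the temperatures are illustrative assumptions of ours, not values from the case study.

```python
# Sketch of two accelerated-test-plan ingredients: allocating units
# across three stress levels, and an Arrhenius acceleration factor
# for a thermal stress (illustrative parameter values).
import math

def allocate(n_units, weights=(4, 2, 1)):
    """Split test units across stress levels in the given ratio (low:mid:high)."""
    total = sum(weights)
    alloc = [n_units * w // total for w in weights]
    alloc[0] += n_units - sum(alloc)       # put the rounding remainder at low stress
    return alloc

def arrhenius_af(ea_ev, t_use_k, t_stress_k, boltzmann=8.617e-5):
    """Acceleration factor between use and stress temperatures (in Kelvin)."""
    return math.exp(ea_ev / boltzmann * (1 / t_use_k - 1 / t_stress_k))

print(allocate(35))                        # -> [20, 10, 5]
print(arrhenius_af(0.7, 323.0, 373.0))     # roughly 29x faster ageing at 100 degC
```

Placing most units at the lowest stress level is the usual rationale for 4:2:1 plans: the low-stress condition is closest to use conditions, so its (rarer) failures carry the most information for extrapolation.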

  2. Formulating informative, data-based priors for failure probability estimation in reliability analysis

    International Nuclear Information System (INIS)

    Guikema, Seth D.

    2007-01-01

    Priors play an important role in the use of Bayesian methods in risk analysis, and using all available information to formulate an informative prior can lead to more accurate posterior inferences. This paper examines the practical implications of using five different methods for formulating an informative prior for a failure probability based on past data. These methods are the method of moments, maximum likelihood (ML) estimation, maximum entropy estimation, starting from a non-informative 'pre-prior', and fitting a prior based on confidence/credible interval matching. The priors resulting from the use of these different methods are compared qualitatively, and the posteriors are compared quantitatively based on a number of different scenarios of observed data used to update the priors. The results show that the amount of information assumed in the prior makes a critical difference in the accuracy of the posterior inferences. For situations in which the data used to formulate the informative prior is an accurate reflection of the data that is later observed, the ML approach yields the minimum variance posterior. However, the maximum entropy approach is more robust to differences between the data used to formulate the prior and the observed data because it maximizes the uncertainty in the prior subject to the constraints imposed by the past data.
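The method-of-moments route, one of the five compared, can be sketched with a conjugate Beta prior. The numbers are illustrative, not from the paper: a Beta(a, b) prior is fitted to the mean and variance of past failure-probability data, then updated with newly observed binomial data.

```python
# Method-of-moments informative prior for a failure probability,
# followed by a conjugate Bayesian update (illustrative numbers).
def beta_prior_from_moments(mean, var):
    """Beta(a, b) parameters matching a given mean and variance."""
    common = mean * (1 - mean) / var - 1          # requires var < mean * (1 - mean)
    return mean * common, (1 - mean) * common

def update(a, b, failures, trials):
    """Conjugate update of a Beta prior with binomial failure data."""
    return a + failures, b + trials - failures

a, b = beta_prior_from_moments(0.2, 0.01)          # past data: mean 0.2, variance 0.01
a2, b2 = update(a, b, failures=1, trials=10)       # observe 1 failure in 10 demands
print((a, b), a2 / (a2 + b2))                      # prior ~Beta(3, 12); posterior mean 0.16
```

The posterior mean (0.16) sits between the prior mean (0.2) and the observed rate (0.1), weighted by the effective sample size a + b = 15 that the moment-matched prior encodes, which is exactly the sensitivity to assumed prior information the paper studies.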

  3. Microelectronics Reliability

    Science.gov (United States)

    2017-01-17

    Only fragments of this record survive, from the report's figure list and body text: "inverters connected in a chain"; "Figure 3: Typical graph showing frequency versus square root of …"; "developing an experimental reliability estimating methodology that could both illuminate the lifetime reliability of advanced devices, circuits and …"; "… or FIT of the device. In other words an accurate estimate of the device lifetime was found and thus the reliability that can be conveniently …"

  4. The Data Reliability of Volunteered Geographic Information with Using Traffic Accident Data

    Science.gov (United States)

    Sevinç, H. K.; Karaş, I. R.

    2017-11-01

    The development of mobile technologies plays an important role in human life; mobile devices occupy a large part of people's daily routine, to the point that checking a smartphone is the first thing many people do on waking. Users can share their positions via the GNSS sensors in mobile devices, or add information about their positions in mobile applications, and through this sharing they contribute to Geographic Information Systems. These contributors are local citizens living at the geographic location in question, not GIS specialists. Creating, collecting, sharing and disseminating geographic data provided by such volunteers constitutes Volunteered Geographic Information (VGI). Since VGI data come from amateur users, how scientifically reliable are data received from amateurs rather than field specialists? In this study, the reliability of data received from voluntary users through VGI is investigated against real data, which consist of traffic accident coordinates. The user data are obtained from the speed values at the relevant coordinates and from users' markings of possible accident points on the map.

  5. 78 FR 41339 - Electric Reliability Organization Proposal To Retire Requirements in Reliability Standards

    Science.gov (United States)

    2013-07-10

    ...] Electric Reliability Organization Proposal To Retire Requirements in Reliability Standards AGENCY: Federal... Reliability Standards identified by the North American Electric Reliability Corporation (NERC), the Commission-certified Electric Reliability Organization. FOR FURTHER INFORMATION CONTACT: Kevin Ryan (Legal Information...

  6. 78 FR 59053 - Agency Information Collection Activities: Notice of an Extension of an Information Collection...

    Science.gov (United States)

    2013-09-25

    ..., monitoring of invading populations; improving understanding of the ecology of invaders and factors in the... compiling and synthesizing accurate and reliable data and information on invasive species, and the... regarding the distribution of nonindigenous aquatic species, primarily fish, in open waters of the United...

  7. 78 FR 58492 - Generator Verification Reliability Standards

    Science.gov (United States)

    2013-09-24

    ... power capability that is available for planning models and bulk electric system reliability assessments... of generator equipment needed to support Bulk-Power System reliability and enhance coordination of... support Bulk-Power System reliability and will ensure that accurate data is verified and made available...

  8. The Effect of Information Access Strategy on Power Consumption and Reliability in Wireless Sensor Network

    DEFF Research Database (Denmark)

    Tobgay, Sonam; Olsen, Rasmus Løvenstein; Prasad, Ramjee

    2013-01-01

    This paper examines the effect of different information access strategies on power consumption and information reliability, considering the wireless sensor network as the source of information. Basically, the paper explores three different access strategies, namely reactive, periodic and hybrid...

  9. Use and perceptions of information among family physicians: sources considered accessible, relevant, and reliable.

    Science.gov (United States)

    Kosteniuk, Julie G; Morgan, Debra G; D'Arcy, Carl K

    2013-01-01

    The research determined (1) the information sources that family physicians (FPs) most commonly use to update their general medical knowledge and to make specific clinical decisions, and (2) the information sources FPs found to be most physically accessible, intellectually accessible (easy to understand), reliable (trustworthy), and relevant to their needs. A cross-sectional postal survey of 792 FPs and locum tenens, in full-time or part-time medical practice, currently practicing or on leave of absence in the Canadian province of Saskatchewan was conducted during the period of January to April 2008. Of 666 eligible physicians, 331 completed and returned surveys, resulting in a response rate of 49.7% (331/666). Medical textbooks and colleagues in the main patient care setting were the top 2 sources for the purpose of making specific clinical decisions. Medical textbooks were most frequently considered by FPs to be reliable (trustworthy), and colleagues in the main patient care setting were most physically accessible (easy to access). When making specific clinical decisions, FPs were most likely to use information from sources that they considered to be reliable and generally physically accessible, suggesting that FPs can best be supported by facilitating easy and convenient access to high-quality information.

  10. Exploratory Movement Generates Higher-Order Information That Is Sufficient for Accurate Perception of Scaled Egocentric Distance

    Science.gov (United States)

    Mantel, Bruno; Stoffregen, Thomas A.; Campbell, Alain; Bardy, Benoît G.

    2015-01-01

    Body movement influences the structure of multiple forms of ambient energy, including optics and gravito-inertial force. Some researchers have argued that egocentric distance is derived from inferential integration of visual and non-visual stimulation. We suggest that accurate information about egocentric distance exists in perceptual stimulation as higher-order patterns that extend across optics and inertia. We formalize a pattern that specifies the egocentric distance of a stationary object across higher-order relations between optics and inertia. This higher-order parameter is created by self-generated movement of the perceiver in inertial space relative to the illuminated environment. For this reason, we placed minimal restrictions on the exploratory movements of our participants. We asked whether humans can detect and use the information available in this higher-order pattern. Participants judged whether a virtual object was within reach. We manipulated relations between body movement and the ambient structure of optics and inertia. Judgments were precise and accurate when the higher-order optical-inertial parameter was available. When only optic flow was available, judgments were poor. Our results reveal that participants perceived egocentric distance from the higher-order, optical-inertial consequences of their own exploratory activity. Analysis of participants’ movement trajectories revealed that self-selected movements were complex, and tended to optimize availability of the optical-inertial pattern that specifies egocentric distance. We argue that accurate information about egocentric distance exists in higher-order patterns of ambient energy, that self-generated movement can generate these higher-order patterns, and that these patterns can be detected and used to support perception of egocentric distance that is precise and accurate. PMID:25856410

  11. A rule induction approach to improve Monte Carlo system reliability assessment

    International Nuclear Information System (INIS)

    Rocco S, Claudio M.

    2003-01-01

    A Decision Tree (DT) approach to build empirical models for use in Monte Carlo reliability evaluation is presented. The main idea is to develop an estimation algorithm, by training a model on a restricted data set, and replacing the Evaluation Function (EF) by a simpler calculation, which provides reasonably accurate model outputs. The proposed approach is illustrated with two systems of different size, represented by their equivalent networks. The robustness of the DT approach as an approximated method to replace the EF is also analysed. Excellent system reliability results are obtained by training a DT with a small amount of information
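The surrogate idea can be sketched on a 2-out-of-3 system, a stand-in for the paper's larger networks, with a simple count-threshold rule standing in for the trained decision tree: a cheap rule is learned from a restricted training sample of exactly evaluated states, and the Monte Carlo loop then calls the rule instead of the evaluation function (EF).

```python
# Sketch: learn a cheap surrogate for the evaluation function (EF)
# from a small training sample, then run Monte Carlo on the surrogate.
import random

def ef(state):                    # "expensive" exact evaluation function
    return sum(state) >= 2        # system works if >= 2 of 3 components work

def train_threshold(samples):
    """Learn the smallest working-component count ever labelled 'works'."""
    working_counts = [sum(s) for s in samples if ef(s)]
    return min(working_counts)

rng = random.Random(1)
train = [tuple(rng.random() < 0.5 for _ in range(3)) for _ in range(50)]
k = train_threshold(train)        # learned rule: system works iff count >= k

p = 0.9                           # component reliability
n = 20_000                        # Monte Carlo samples using the cheap rule
hits = sum(sum(rng.random() < p for _ in range(3)) >= k for _ in range(n))
estimate = hits / n
print(k, estimate)                # analytic reliability: 3p^2 - 2p^3 = 0.972
```

Here the learned rule happens to reproduce the EF exactly, so the only error is Monte Carlo sampling error; in the paper's larger networks the DT is an approximation of the EF, and its robustness as a replacement is what is analysed.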

  12. Interventions to assist health consumers to find reliable online health information: a comprehensive review.

    Directory of Open Access Journals (Sweden)

    Kenneth Lee

    Full Text Available BACKGROUND: Health information on the Internet is ubiquitous, and its use by health consumers prevalent. Finding and understanding relevant online health information, and determining content reliability, pose real challenges for many health consumers. PURPOSE: To identify the types of interventions that have been implemented to assist health consumers to find reliable online health information, and where possible, describe and compare the types of outcomes studied. DATA SOURCES: PubMed, PsycINFO, CINAHL Plus and Cochrane Library databases; WorldCat and Scirus 'gray literature' search engines; and manual review of reference lists of selected publications. STUDY SELECTION: Publications were selected by firstly screening title, abstract, and then full text. DATA EXTRACTION: Seven publications met the inclusion criteria, and were summarized in a data extraction form. The form incorporated the PICOS (Population, Intervention, Comparators, Outcomes and Study Design) Model. Two eligible gray literature papers were also reported. DATA SYNTHESIS: Relevant data from included studies were tabulated to enable descriptive comparison. A brief critique of each study was included in the tables. This review was unable to follow systematic review methods due to the paucity of research and humanistic interventions reported. LIMITATIONS: While extensive, the gray literature search may have had limited reach in some countries. The paucity of research on this topic limits conclusions that may be drawn. CONCLUSIONS: The few eligible studies predominantly adopted a didactic approach to assisting health consumers, whereby consumers were either taught how to find credible websites, or how to use the Internet. Common types of outcomes studied include knowledge and skills pertaining to Internet use and searching for reliable health information. These outcomes were predominantly self-assessed by participants. There is potential for further research to explore other avenues for assisting health consumers to find reliable online health information, and to assess outcomes via objective measures.

  13. Interventions to assist health consumers to find reliable online health information: a comprehensive review.

    Science.gov (United States)

    Lee, Kenneth; Hoti, Kreshnik; Hughes, Jeffery D; Emmerton, Lynne M

    2014-01-01

    Health information on the Internet is ubiquitous, and its use by health consumers prevalent. Finding and understanding relevant online health information, and determining content reliability, pose real challenges for many health consumers. To identify the types of interventions that have been implemented to assist health consumers to find reliable online health information, and where possible, describe and compare the types of outcomes studied. PubMed, PsycINFO, CINAHL Plus and Cochrane Library databases; WorldCat and Scirus 'gray literature' search engines; and manual review of reference lists of selected publications. Publications were selected by firstly screening title, abstract, and then full text. Seven publications met the inclusion criteria, and were summarized in a data extraction form. The form incorporated the PICOS (Population Intervention Comparators Outcomes and Study Design) Model. Two eligible gray literature papers were also reported. Relevant data from included studies were tabulated to enable descriptive comparison. A brief critique of each study was included in the tables. This review was unable to follow systematic review methods due to the paucity of research and humanistic interventions reported. While extensive, the gray literature search may have had limited reach in some countries. The paucity of research on this topic limits conclusions that may be drawn. The few eligible studies predominantly adopted a didactic approach to assisting health consumers, whereby consumers were either taught how to find credible websites, or how to use the Internet. Common types of outcomes studied include knowledge and skills pertaining to Internet use and searching for reliable health information. These outcomes were predominantly self-assessed by participants. There is potential for further research to explore other avenues for assisting health consumers to find reliable online health information, and to assess outcomes via objective measures.

  14. Media and Information Literacy (MIL) in journalistic learning: strategies for accurately engaging with information and reporting news

    Science.gov (United States)

    Inayatillah, F.

    2018-01-01

    In the era of digital technology, abundant information is available from various sources. This ease of access needs to be accompanied by the ability to engage with information wisely; thus, information and media literacy is required. Preliminary observations found that students of Universitas Negeri Surabaya majoring in Indonesian Literature who take the journalism course lack media and information literacy (MIL) skills. Therefore, they need to be equipped with MIL. The method used is descriptive qualitative, which includes data collection, data analysis, and presentation of data analysis. Observation and documentation techniques were used to obtain data on MIL's impact on journalistic learning for students. This study aims to describe the important role of MIL for journalism students and its impact on journalistic learning for students of Indonesian Literature, batch 2014. The results of this research indicate that journalism is an essential discipline for students because it affects how a person perceives news reports. Through the reinforcement of the course, students can avoid hoaxes. MIL-based journalistic learning makes students more skillful at absorbing, processing, and presenting information accurately. The subject influences how students engage with information so that they can report news credibly.

  15. Possibility of obtaining reliable information on component safety by means of large-scale tensile samples with Orowan-Soete flaws

    International Nuclear Information System (INIS)

    Aurich, D.; Wobst, K.; Kafka, H.

    1984-01-01

The aim of this paper is to review present knowledge of the ability of wide-plate tensile specimens with saw-cut through-center flaws to provide accurate information on component reliability, and to point out the advantages and disadvantages of this specimen geometry. The effects of temperature, specimen geometry, ligament size and notch radius are discussed in comparison with other specimen geometries. This is followed by a comparison of the results of such tests with tests on internally stressed tanks. Conclusions: wide-plate tensile specimens are generally appropriate for assessing welded joints. However, compared with the results obtained with three-point bending samples, they evaluate low-toughness steels more favourably with respect to crack growth than high-toughness and soft steels under stresses with incipient cracks. (orig.) [de

  16. Reliability and Validity of the Clinical Dementia Rating for Community-Living Elderly Subjects without an Informant

    Directory of Open Access Journals (Sweden)

    Ma Shwe Zin Nyunt

    2013-10-01

Full Text Available Background: The Clinical Dementia Rating (CDR) scale is widely used to assess cognitive impairment in Alzheimer's disease. It requires collateral information from a reliable informant, who is not available in many instances. We adapted the original CDR scale for use with elderly subjects without an informant (CDR-NI) and evaluated its reliability and validity for assessing mild cognitive impairment (MCI) and dementia among community-dwelling elderly subjects. Method: At two consecutive visits 1 week apart, nurses trained in CDR assessment interviewed, observed and rated cognitive and functional performance according to a protocol in 90 elderly subjects with suboptimal cognitive performance [Mini-Mental State Examination (MMSE Results: The CDR-NI scores (0, 0.5, 1) showed good internal consistency (Cronbach's α 0.83-0.84), inter-rater reliability (κ 0.77-1.00 for six domains and 0.95 for global rating), test-retest reliability (κ 0.75-1.00 for six domains and 0.80 for global rating), good agreement (κ 0.79) with the clinical assessment status of MCI (n = 37) and dementia (n = 4), and significant differences in the mean scores for MMSE, MOCA and Instrumental Activities of Daily Living (ANOVA global p Conclusion: Owing to the protocol of interviews, assessments and structured observations gathered during the two visits, CDR-NI provides valid and reliable assessment of MCI and dementia in community-living elderly subjects without an informant.

  17. The Impact of the Reliability of Teleinformation Systems on the Quality of Transmitted Information

    Directory of Open Access Journals (Sweden)

    Stawowy Marek

    2016-10-01

Full Text Available The work describes the impact of reliability on the information quality (IQ) of information and communication systems. Among the components of IQ are reliability-related properties such as relevance, accuracy, timeliness, completeness, consistency, adequacy, accessibility, credibility and congruence. Each of these components of IQ is independent, and to estimate the value of IQ properly, one of the methods of modeling uncertainty must be used. In this article we use a hybrid method developed jointly by one of the authors. This method is based on the mathematical theory of evidence known as Dempster-Shafer (DS) theory and on serial links of dependent hybrids, named IQ (hyb).
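The Dempster-Shafer machinery mentioned in this record can be illustrated with a minimal sketch of Dempster's rule of combination. The two-element frame ("reliable"/"unreliable") and the mass values below are invented for illustration; they are not taken from the paper, and the paper's hybrid serial-link extension is not reproduced here.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts: frozenset -> mass) via Dempster's rule."""
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    # renormalize by the non-conflicting mass
    return {s: m / (1.0 - conflict) for s, m in combined.items()}

R, U = frozenset({"reliable"}), frozenset({"unreliable"})
theta = R | U  # the whole frame, i.e. ignorance

# hypothetical evidence from two IQ components (e.g. timeliness and accuracy)
m_timeliness = {R: 0.6, U: 0.1, theta: 0.3}
m_accuracy = {R: 0.7, U: 0.2, theta: 0.1}
combined = dempster_combine(m_timeliness, m_accuracy)
# the combined belief concentrates on "reliable" (mass ≈ 0.85)
```

The renormalization step is what makes the sources' agreement reinforce itself: conflicting mass (one source says "reliable", the other "unreliable") is discarded and the rest rescaled.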

  18. Accurate facade feature extraction method for buildings from three-dimensional point cloud data considering structural information

    Science.gov (United States)

    Wang, Yongzhi; Ma, Yuqing; Zhu, A.-xing; Zhao, Hui; Liao, Lixia

    2018-05-01

    Facade features represent segmentations of building surfaces and can serve as a building framework. Extracting facade features from three-dimensional (3D) point cloud data (3D PCD) is an efficient method for 3D building modeling. By combining the advantages of 3D PCD and two-dimensional optical images, this study describes the creation of a highly accurate building facade feature extraction method from 3D PCD with a focus on structural information. The new extraction method involves three major steps: image feature extraction, exploration of the mapping method between the image features and 3D PCD, and optimization of the initial 3D PCD facade features considering structural information. Results show that the new method can extract the 3D PCD facade features of buildings more accurately and continuously. The new method is validated using a case study. In addition, the effectiveness of the new method is demonstrated by comparing it with the range image-extraction method and the optical image-extraction method in the absence of structural information. The 3D PCD facade features extracted by the new method can be applied in many fields, such as 3D building modeling and building information modeling.

  19. Laboratory Information Management System Chain of Custody: Reliability and Security

    OpenAIRE

    Tomlinson, J. J.; Elliott-Smith, W.; Radosta, T.

    2006-01-01

    A chain of custody (COC) is required in many laboratories that handle forensics, drugs of abuse, environmental, clinical, and DNA testing, as well as other laboratories that want to assure reliability of reported results. Maintaining a dependable COC can be laborious, but with the recent establishment of the criteria for electronic records and signatures by US regulatory agencies, laboratory information management systems (LIMSs) are now being developed to fully automate COCs. The extent of a...

  20. REMOTE SENSING APPLICATIONS WITH HIGH RELIABILITY IN CHANGJIANG WATER RESOURCE MANAGEMENT

    Directory of Open Access Journals (Sweden)

    L. Ma

    2018-04-01

Full Text Available Remote sensing technology has been widely used in many fields, but most applications cannot obtain information with high reliability and high accuracy at large scale, especially applications using automatic interpretation methods. We have designed an application-oriented technology system (PIR) composed of a series of accurate interpretation techniques, which achieves over 85 % correctness in water resource management from the viewpoints of photogrammetry and expert knowledge. The techniques comprise spatial positioning techniques from the viewpoint of photogrammetry, feature interpretation techniques from the viewpoint of expert knowledge, and rationality analysis techniques from the viewpoint of data mining. Each interpreted polygon is accurate enough to be applied to accuracy-sensitive projects, such as the Three Gorges Project and the South-to-North Water Diversion Project. In this paper, we present several remote sensing applications with high reliability in Changjiang water resource management, including water pollution investigation, illegal construction inspection, and water conservation monitoring.

  1. Remote Sensing Applications with High Reliability in Changjiang Water Resource Management

    Science.gov (United States)

    Ma, L.; Gao, S.; Yang, A.

    2018-04-01

Remote sensing technology has been widely used in many fields, but most applications cannot obtain information with high reliability and high accuracy at large scale, especially applications using automatic interpretation methods. We have designed an application-oriented technology system (PIR) composed of a series of accurate interpretation techniques, which achieves over 85 % correctness in water resource management from the viewpoints of photogrammetry and expert knowledge. The techniques comprise spatial positioning techniques from the viewpoint of photogrammetry, feature interpretation techniques from the viewpoint of expert knowledge, and rationality analysis techniques from the viewpoint of data mining. Each interpreted polygon is accurate enough to be applied to accuracy-sensitive projects, such as the Three Gorges Project and the South-to-North Water Diversion Project. In this paper, we present several remote sensing applications with high reliability in Changjiang water resource management, including water pollution investigation, illegal construction inspection, and water conservation monitoring.

  2. A Method to Increase Drivers' Trust in Collision Warning Systems Based on Reliability Information of Sensor

    Science.gov (United States)

    Tsutsumi, Shigeyoshi; Wada, Takahiro; Akita, Tokihiko; Doi, Shun'ichi

Driver workload tends to increase under complicated traffic environments such as a lane change. In such cases, rear collision warning is effective in reducing cognitive workload. On the other hand, it has been pointed out that false alarms or missed alarms caused by sensor errors decrease the driver's trust in the warning system, which can result in low efficiency of the system. Suppose that reliability information for the sensor is provided in real time. In this paper, we propose a new warning method that utilizes this sensor reliability information to increase the driver's trust in the system even when sensor reliability is low. The effectiveness of the warning method is shown by driving simulator experiments.

  3. Advancing methods for reliably assessing motivational interviewing fidelity using the motivational interviewing skills code.

    Science.gov (United States)

    Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W; Imel, Zac E; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C

    2015-02-01

    The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance-level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. Copyright © 2015 Elsevier Inc. All rights reserved.
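Utterance-level agreement of the kind this record describes is commonly summarized with Cohen's kappa, which corrects raw agreement for chance. The sketch below shows the calculation on made-up behavior codes; the code labels are invented and are not actual MISC 2.1 categories, and the paper's own (more elaborate) reliability estimators are not reproduced.

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Chance-corrected agreement between two raters coding the same utterances."""
    assert len(r1) == len(r2) and r1
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    # expected agreement if raters coded independently with their marginal rates
    c1, c2 = Counter(r1), Counter(r2)
    expected = sum(c1[k] * c2.get(k, 0) for k in c1) / (n * n)
    return (observed - expected) / (1 - expected)

# two raters' codes for six utterances (invented labels, not real MISC codes)
rater1 = ["REF", "REF", "Q", "Q", "MI+", "Q"]
rater2 = ["REF", "Q", "Q", "Q", "MI+", "Q"]
kappa = cohens_kappa(rater1, rater2)  # 5/6 observed vs 15/36 expected -> 5/7
```

Session-level tallies hide disagreements that cancel out within a session, which is one way the utterance-level estimate can come out lower, as the abstract reports.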

  4. PET-MR image fusion in soft tissue sarcoma: accuracy, reliability and practicality of interactive point-based and automated mutual information techniques

    International Nuclear Information System (INIS)

    Somer, Edward J.R.; Marsden, Paul K.; Benatar, Nigel A.; O'Doherty, Michael J.; Goodey, Joanne; Smith, Michael A.

    2003-01-01

    The fusion of functional positron emission tomography (PET) data with anatomical magnetic resonance (MR) or computed tomography images, using a variety of interactive and automated techniques, is becoming commonplace, with the technique of choice dependent on the specific application. The case of PET-MR image fusion in soft tissue is complicated by a lack of conspicuous anatomical features and deviation from the rigid-body model. Here we compare a point-based external marker technique with an automated mutual information algorithm and discuss the practicality, reliability and accuracy of each when applied to the study of soft tissue sarcoma. Ten subjects with suspected sarcoma in the knee, thigh, groin, flank or back underwent MR and PET scanning after the attachment of nine external fiducial markers. In the assessment of the point-based technique, three error measures were considered: fiducial localisation error (FLE), fiducial registration error (FRE) and target registration error (TRE). FLE, which represents the accuracy with which the fiducial points can be located, is related to the FRE minimised by the registration algorithm. The registration accuracy is best characterised by the TRE, which is the distance between corresponding points in each image space after registration. In the absence of salient features within the target volume, the TRE can be measured at fiducials excluded from the registration process. To assess the mutual information technique, PET data, acquired after physically removing the markers, were reconstructed in a variety of ways and registered with MR. Having applied the transform suggested by the algorithm to the PET scan acquired before the markers were removed, the residual distance between PET and MR marker-pairs could be measured. The manual point-based technique yielded the best results (RMS TRE =8.3 mm, max =22.4 mm, min =1.7 mm), performing better than the automated algorithm (RMS TRE =20.0 mm, max =30.5 mm, min =7.7 mm) when
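The error measures in this record (FLE, FRE, TRE) can be made concrete with a deliberately simplified, translation-only registration sketch. Real marker-based registration also solves for rotation (e.g. via the Kabsch/Procrustes algorithm); the coordinates, the offset, and the localisation noise below are all invented.

```python
def centroid(pts):
    n = len(pts)
    return tuple(sum(p[i] for p in pts) / n for i in range(3))

def rmse(a, b):
    """Root-mean-square distance between corresponding 3D points."""
    return (sum(sum((u - v) ** 2 for u, v in zip(p, q))
                for p, q in zip(a, b)) / len(a)) ** 0.5

# MR-space fiducials and a held-out target point inside the volume (invented, mm)
mr_fids = [(0.0, 0.0, 0.0), (100.0, 0.0, 0.0), (0.0, 100.0, 0.0)]
mr_target = (50.0, 50.0, 10.0)

# PET space: same points rigidly shifted; one marker gets localisation error (FLE)
offset = (5.0, -2.0, 1.0)
pet_fids = [tuple(x + o for x, o in zip(p, offset)) for p in mr_fids]
pet_fids[1] = (pet_fids[1][0] + 0.9, pet_fids[1][1], pet_fids[1][2])
pet_target = tuple(x + o for x, o in zip(mr_target, offset))

# translation-only registration: align the fiducial centroids
shift = tuple(f - m for m, f in zip(centroid(pet_fids), centroid(mr_fids)))
aligned_fids = [tuple(x + s for x, s in zip(p, shift)) for p in pet_fids]
mapped_target = tuple(x + s for x, s in zip(pet_target, shift))

fre = rmse(aligned_fids, mr_fids)         # fiducial registration error
tre = rmse([mapped_target], [mr_target])  # target registration error (held-out point)
```

Note how the single noisy fiducial leaks into both numbers: the registration minimises FRE over the fiducials, but the abstract's point stands that TRE at a point excluded from the fit is the honest measure of accuracy.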

  5. Activity assays and immunoassays for plasma Renin and prorenin: information provided and precautions necessary for accurate measurement

    DEFF Research Database (Denmark)

    Campbell, Duncan J; Nussberger, Juerg; Stowasser, Michael

    2009-01-01

    into focus the differences in information provided by activity assays and immunoassays for renin and prorenin measurement and has drawn attention to the need for precautions to ensure their accurate measurement. CONTENT: Renin activity assays and immunoassays provide related but different information...... provided by these assays and of the precautions necessary to ensure their accuracy....

  6. Fog-computing concept usage as means to enhance information and control system reliability

    Science.gov (United States)

    Melnik, E. V.; Klimenko, A. B.; Ivanov, D. Ya

    2018-05-01

This paper focuses on the reliability of information and control systems (ICS). The authors propose using elements of the fog-computing concept to enhance the reliability function. The key idea of fog computing is to shift computations to the fog layer of the network, and thus to decrease the workload of the communication environment and data processing components. In an ICS, workload can likewise be distributed among sensors, actuators and network infrastructure facilities near the sources of data. The authors simulated typical workload distribution situations for the "traditional" ICS architecture and for one using elements of the fog-computing concept. The paper contains models, selected simulation results and conclusions about the prospects of fog computing as a means to enhance ICS reliability.

  7. Efficient surrogate models for reliability analysis of systems with multiple failure modes

    International Nuclear Information System (INIS)

    Bichon, Barron J.; McFarland, John M.; Mahadevan, Sankaran

    2011-01-01

    Despite many advances in the field of computational reliability analysis, the efficient estimation of the reliability of a system with multiple failure modes remains a persistent challenge. Various sampling and analytical methods are available, but they typically require accepting a tradeoff between accuracy and computational efficiency. In this work, a surrogate-based approach is presented that simultaneously addresses the issues of accuracy, efficiency, and unimportant failure modes. The method is based on the creation of Gaussian process surrogate models that are required to be locally accurate only in the regions of the component limit states that contribute to system failure. This approach to constructing surrogate models is demonstrated to be both an efficient and accurate method for system-level reliability analysis. - Highlights: → Extends efficient global reliability analysis to systems with multiple failure modes. → Constructs locally accurate Gaussian process models of each response. → Highly efficient and accurate method for assessing system reliability. → Effectiveness is demonstrated on several test problems from the literature.
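As a baseline for the surrogate approach this record describes, the failure probability of a series system ("fails if any limit state is violated") can be estimated by crude Monte Carlo on the true limit-state functions; this sampling cost is exactly what the locally accurate Gaussian-process surrogates aim to cut. The two limit states and the input distribution below are toy examples, not from the paper.

```python
import random

def system_failure_prob(limit_states, sample, n=200_000, seed=7):
    """Crude Monte Carlo for a series system: failure when ANY g(x) <= 0."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        x = sample(rng)
        if any(g(x) <= 0 for g in limit_states):
            fails += 1
    return fails / n

# toy limit states of a standard-normal input: the system fails when |x| >= 3
g1 = lambda x: 3.0 - x
g2 = lambda x: 3.0 + x
pf = system_failure_prob([g1, g2], lambda rng: rng.gauss(0.0, 1.0))
# pf should be close to 2 * (1 - Phi(3)), about 0.0027
```

Each of the 200,000 samples calls every limit-state function; when those are expensive finite-element models, replacing them with surrogates that are accurate only near g = 0 is what makes the estimate affordable.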

  8. Prediction of safety critical software operational reliability from test reliability using testing environment factors

    International Nuclear Information System (INIS)

    Jung, Hoan Sung; Seong, Poong Hyun

    1999-01-01

Predicting safety-critical software reliability has been a critical issue in the nuclear engineering area. For many years research has focused on the quantification of software reliability, and many models have been developed to quantify it. Most software reliability models estimate reliability from the failure data collected during testing, assuming that the test environments represent the operational profile well. The user's interest, however, is in the operational reliability rather than the test reliability, and experience shows that operational reliability is higher than test reliability. Under the assumption that this difference results from the change of environment from testing to operation, testing environment factors comprising an aging factor and a coverage factor are developed in this paper and used to predict the ultimate operational reliability from the failure data of the testing phase. This is done by incorporating test environments applied beyond the operational profile into the testing environment factors. The application results show that the proposed method can estimate the operational reliability accurately. (Author). 14 refs., 1 tab., 1 fig

  9. Investigation of reliability indicators of information analysis systems based on Markov’s absorbing chain model

    Science.gov (United States)

    Gilmanshin, I. R.; Kirpichnikov, A. P.

    2017-09-01

A study of the algorithm governing the functioning of the early-detection module for excessive losses proves that it can be modeled using absorbing Markov chains. Of particular interest are the probabilistic characteristics of the module's algorithm, studied in order to relate the reliability indicators of individual elements, or the probabilities of occurrence of certain events, to the likelihood of transmitting reliable information. The relations identified during the analysis allow threshold reliability characteristics to be set for the system components.
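The standard absorbing-chain calculation behind such an analysis uses the fundamental matrix N = (I − Q)⁻¹ and the absorption probabilities B = N·R, where Q holds transient-to-transient and R transient-to-absorbing transition probabilities. The two transient states and all transition probabilities below are invented for illustration, not the paper's actual model.

```python
def mat_inv(A):
    """Gauss-Jordan inverse of a small dense matrix (pure Python)."""
    n = len(A)
    M = [row[:] + [float(i == j) for j in range(n)] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        p = M[col][col]
        M[col] = [v / p for v in M[col]]
        for r in range(n):
            if r != col:
                f = M[r][col]
                M[r] = [v - f * w for v, w in zip(M[r], M[col])]
    return [row[n:] for row in M]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

# transient states: 0 = "check data", 1 = "retry"; absorbing: "deliver ok", "fail"
Q = [[0.0, 0.3], [0.2, 0.0]]    # transient -> transient
R = [[0.65, 0.05], [0.7, 0.1]]  # transient -> absorbing
I2 = [[1.0, 0.0], [0.0, 1.0]]
N = mat_inv([[i - q for i, q in zip(ri, rq)] for ri, rq in zip(I2, Q)])
B = matmul(N, R)  # B[i][j]: probability of ending in absorbing state j from i
```

Here B[0][0] is the probability that data entering the module is ultimately delivered reliably, which is the kind of end-to-end indicator the abstract relates to the element-level reliabilities.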

  10. Photogrammetry: an accurate and reliable tool to detect thoracic musculoskeletal abnormalities in preterm infants.

    Science.gov (United States)

    Davidson, Josy; dos Santos, Amelia Miyashiro N; Garcia, Kessey Maria B; Yi, Liu C; João, Priscila C; Miyoshi, Milton H; Goulart, Ana Lucia

    2012-09-01

To analyse the accuracy and reproducibility of photogrammetry in detecting thoracic abnormalities in infants born prematurely. Cross-sectional study. The Premature Clinic at the Federal University of São Paulo. Fifty-eight infants born prematurely in their first year of life. Measurement of the manubrium/acromion/trapezius angle (degrees) and the deepest thoracic retraction (cm). Digitised photographs were analysed by two blinded physiotherapists using a computer program (SAPO; http://SAPO.incubadora.fapesp.br) to detect shoulder elevation and thoracic retraction. Physical examinations performed independently by two physiotherapists were used to assess the accuracy of the new tool. Thoracic alterations were detected in 39 (67%) and in 40 (69%) infants by Physiotherapists 1 and 2, respectively (kappa coefficient=0.80). Using a receiver operating characteristic curve, measurement of the manubrium/acromion/trapezius angle and the deepest thoracic retraction indicated accuracy of 0.79 and 0.91, respectively. For measurement of the manubrium/acromion/trapezius angle, the Bland and Altman limits of agreement were -6.22 to 7.22° [mean difference (d)=0.5] for repeated measures by one physiotherapist, and -5.29 to 5.79° (d=0.75) between two physiotherapists. For thoracic retraction, the intra-rater limits of agreement were -0.14 to 0.18cm (d=0.02) and the inter-rater limits of agreement were -0.20 to -0.17cm (d=0.02). SAPO provided an accurate and reliable tool for the detection of thoracic abnormalities in preterm infants. Copyright © 2011 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.
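The limits of agreement quoted in this record follow the standard Bland-Altman recipe: mean difference of the paired measurements ± 1.96 standard deviations of the differences. The sketch below shows the arithmetic on fabricated angle measurements; the numbers are not the study's data.

```python
def bland_altman(x, y):
    """Return (lower limit, mean difference, upper limit) of agreement."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    d = sum(diffs) / n                                   # mean difference
    sd = (sum((v - d) ** 2 for v in diffs) / (n - 1)) ** 0.5  # SD of differences
    return d - 1.96 * sd, d, d + 1.96 * sd

# two raters' manubrium/acromion/trapezius angles in degrees (invented data)
rater1 = [78.0, 81.5, 85.0, 79.0, 83.0]
rater2 = [77.5, 82.0, 84.0, 79.5, 82.0]
lo, d, hi = bland_altman(rater1, rater2)  # d = 0.3 degrees here
```

A narrow interval around a mean difference near zero, as reported in the abstract, indicates that one rater's measurement can stand in for the other's.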

  11. Analyzing the reliability of shuffle-exchange networks using reliability block diagrams

    International Nuclear Information System (INIS)

    Bistouni, Fathollah; Jahanshahi, Mohsen

    2014-01-01

Supercomputers and multi-processor systems are comprised of thousands of processors that need to communicate in an efficient way. One reasonable solution is the utilization of multistage interconnection networks (MINs), where the challenge is to analyze the reliability of such networks. One method to increase the reliability and fault tolerance of MINs is the use of additional switching stages. Recently, therefore, the reliability of one of the most common MINs, namely the shuffle-exchange network (SEN), has been evaluated by investigating the impact of increasing the number of switching stages. That work concluded that the reliability of SEN with one additional stage (SEN+) is better than that of SEN or SEN with two additional stages (SEN+2), and that SEN itself is more reliable than SEN+2. Here we re-evaluate the reliability of these networks; the results of the terminal, broadcast, and network reliability analyses demonstrate that SEN+ and SEN+2 consistently outperform SEN and are very alike in terms of reliability. - Highlights: • The impact of increasing the number of stages on the reliability of MINs is investigated. • The RBD method, as an accurate method, is used for the reliability analysis of MINs. • Complex series–parallel RBDs are used to determine the reliability of the MINs. • All measures of reliability (i.e. terminal, broadcast, and network reliability) are analyzed. • All reliability equations are calculated for different sizes N×N
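The series-parallel reductions behind such reliability block diagrams come down to two composition rules: a series chain works only if every block works, and a parallel group works if at least one block works. The sketch below applies them to a hypothetical three-stage network of redundant switch pairs; the structure and the per-switch reliability of 0.95 are invented, not SEN's actual topology.

```python
def series(*rs):
    """Series blocks: the path works only if every block works."""
    p = 1.0
    for r in rs:
        p *= r
    return p

def parallel(*rs):
    """Parallel blocks: the path works if at least one block works."""
    q = 1.0
    for r in rs:
        q *= 1.0 - r  # probability that this block fails
    return 1.0 - q

r_switch = 0.95                        # invented per-switch reliability
stage = parallel(r_switch, r_switch)   # redundant switch pair: 0.9975
terminal_reliability = series(stage, stage, stage)
```

Adding a redundant stage raises path reliability through the `parallel` rule, but also lengthens the series chain; the abstract's SEN vs. SEN+ vs. SEN+2 comparison is exactly this trade-off evaluated on real topologies.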

  12. Corrections for criterion reliability in validity generalization: The consistency of Hermes, the utility of Midas

    Directory of Open Access Journals (Sweden)

    Jesús F. Salgado

    2016-04-01

Full Text Available There is criticism in the literature about the use of interrater coefficients to correct for criterion reliability in validity generalization (VG) studies, and dispute about whether .52 is an accurate and non-dubious estimate of the interrater reliability of overall job performance (OJP) ratings. We present a second-order meta-analysis of three independent meta-analytic studies of the interrater reliability of job performance ratings and make a number of comments and reflections on LeBreton et al.'s paper. The results of our meta-analysis indicate that the interrater reliability for a single rater is .52 (k = 66, N = 18,582, SD = .105). Our main conclusions are: (a) the value of .52 is an accurate estimate of the interrater reliability of overall job performance for a single rater; (b) it is not reasonable to conclude that past VG studies that used .52 as the criterion reliability value have a less than secure statistical foundation; (c) based on interrater reliability, test-retest reliability, and coefficient alpha, supervisor ratings are a useful and appropriate measure of job performance and can be confidently used as a criterion; (d) validity correction for criterion unreliability has been unanimously recommended by "classical" psychometricians and I/O psychologists as the proper way to estimate predictor validity, and is still recommended at present; (e) the substantive contribution of VG procedures to inform HRM practices in organizations should not be lost in these technical points of debate.
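The "validity correction for criterion unreliability" endorsed in this record is the classical correction for attenuation: the observed predictor-criterion correlation is divided by the square root of the criterion reliability. The observed validity of .25 below is an invented figure used only to show the arithmetic; .52 is the single-rater reliability reported in the abstract.

```python
def correct_for_attenuation(r_xy, r_yy):
    """Correct an observed validity coefficient for criterion unreliability."""
    return r_xy / r_yy ** 0.5

observed_validity = 0.25       # invented observed correlation, for illustration
criterion_reliability = 0.52   # single-rater interrater reliability from the abstract
corrected = correct_for_attenuation(observed_validity, criterion_reliability)
# corrected ≈ 0.347: the estimated validity against a perfectly reliable criterion
```

The debate summarized in the abstract is not about this formula but about whether .52 is the right value of r_yy to plug into it.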

  13. Trust in Testimony about Strangers: Young Children Prefer Reliable Informants Who Make Positive Attributions

    Science.gov (United States)

    Boseovski, Janet J.

    2012-01-01

    Young children have been described as critical consumers of information, particularly in the domain of language learning. Indeed, children are more likely to learn novel words from people with accurate histories of object labeling than with inaccurate ones. But what happens when informant testimony conflicts with a tendency to see the world in a…

  14. Perceived Physician-informed Weight Status Predicts Accurate Weight Self-Perception and Weight Self-Regulation in Low-income, African American Women.

    Science.gov (United States)

    Harris, Charlie L; Strayhorn, Gregory; Moore, Sandra; Goldman, Brian; Martin, Michelle Y

    2016-01-01

    Obese African American women under-appraise their body mass index (BMI) classification and report fewer weight loss attempts than women who accurately appraise their weight status. This cross-sectional study examined whether physician-informed weight status could predict weight self-perception and weight self-regulation strategies in obese women. A convenience sample of 118 low-income women completed a survey assessing demographic characteristics, comorbidities, weight self-perception, and weight self-regulation strategies. BMI was calculated during nurse triage. Binary logistic regression models were performed to test hypotheses. The odds of obese accurate appraisers having been informed about their weight status were six times greater than those of under-appraisers. The odds of those using an "approach" self-regulation strategy having been physician-informed were four times greater compared with those using an "avoidance" strategy. Physicians are uniquely positioned to influence accurate weight self-perception and adaptive weight self-regulation strategies in underserved women, reducing their risk for obesity-related morbidity.

  15. The role of quality tools in assessing reliability of the internet for health information.

    Science.gov (United States)

    Hanif, Faisal; Read, Janet C; Goodacre, John A; Chaudhry, Afzal; Gibbs, Paul

    2009-12-01

The Internet has made it possible for patients and their families to access vast quantities of information that previously would have been difficult for anyone but a physician or librarian to obtain. Health information websites, however, are recognised to differ widely in the quality and reliability of their content. This has led to the development of various codes of conduct or quality rating tools to assess the quality of health websites. However, the validity and reliability of these quality tools and their applicability to different health websites also vary. In principle, rating tools should be available to consumers, require a limited number of elements to be assessed, be assessable in all elements, be readable, and be able to gauge the readability and consistency of the information provided from a patient's viewpoint. This article reviews the literature on trends in Internet use for health and analyses the various codes of conduct/ethics or 'quality tools' available to monitor the quality of health websites from a patient perspective.

  16. Towards accurate emergency response behavior

    International Nuclear Information System (INIS)

    Sargent, T.O.

    1981-01-01

    Nuclear reactor operator emergency response behavior has persisted as a training problem through lack of information. The industry needs an accurate definition of operator behavior in adverse stress conditions, and training methods which will produce the desired behavior. Newly assembled information from fifty years of research into human behavior in both high and low stress provides a more accurate definition of appropriate operator response, and supports training methods which will produce the needed control room behavior. The research indicates that operator response in emergencies is divided into two modes, conditioned behavior and knowledge based behavior. Methods which assure accurate conditioned behavior, and provide for the recovery of knowledge based behavior, are described in detail

  17. The reliability and validity of self-reported reproductive history and obstetric morbidity amongst birth to ten mothers in Soweto

    Directory of Open Access Journals (Sweden)

    GTH Ellison

    2000-09-01

Full Text Available Objective: To assess whether self-reports of reproductive history and obstetric morbidity provide an accurate basis for clinical decision-making. Setting, participants and methods: Self-reports of maternal age and reproductive history, together with clinical measurements of five medical disorders, were abstracted from the obstetric notes of 517 mothers whose children were enrolled in the Birth to Ten study. These data were compared to self-reported information collected by interview during the Birth to Ten study. Findings: The reliability of self-reported age and gravidity was high (R = 0.810-0.993), yet self-reports of previous miscarriages, terminations, premature- and stillbirths were only fairly reliable (Kappa = 0.48-0.50). Self-reported diabetes and high blood pressure had specificities of more than 95% for glycosuria, hypertension and pre-eclampsia. However, the specificity of self-reported oedema for hypertensive disorders and the specificity of self-reported urinary tract infection for STD seropositivity were only around 65%. Conclusions: The modest reliability and limited validity of self-reported obstetric morbidity undermine the clinical utility of this information. Recommendations: These results strengthen the case for providing mothers with "Home-based Maternal Records" to facilitate access to accurate obstetric information during subsequent clinical consultations.

  18. Information about robustness, reliability and safety in early design phases

    DEFF Research Database (Denmark)

    Marini, Vinicius Kaster

    methods, and an industrial case to assess how the use of information about robustness, reliability and safety as practised by current methods influences concept development. Current methods cannot be used in early design phases due to their dependence on detailed design information for the identification...... alternatives. This prompts designers to reuse working principles that are inherently flawed, as they are liable to disturbances, failures and hazards. To address this issue, an approach based upon individual records of early design issues consists of comparing failures and benefits from prior working...... principles, before making a decision, and improving the more suitable alternatives through this feedback. Workshops were conducted with design practitioners to evaluate the potential of the approach and to simulate decision-making and gain feedback on a proof-of-concept basis. The evaluation has demonstrated...

  19. Biomimetic Approach for Accurate, Real-Time Aerodynamic Coefficients, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Aerodynamic and structural reliability and efficiency depends critically on the ability to accurately assess the aerodynamic loads and moments for each lifting...

  20. Reliability and Failure in NASA Missions: Blunders, Normal Accidents, High Reliability, Bad Luck

    Science.gov (United States)

    Jones, Harry W.

    2015-01-01

    NASA emphasizes crew safety and system reliability but several unfortunate failures have occurred. The Apollo 1 fire was mistakenly unanticipated. After that tragedy, the Apollo program gave much more attention to safety. The Challenger accident revealed that NASA had neglected safety and that management underestimated the high risk of shuttle. Probabilistic Risk Assessment was adopted to provide more accurate failure probabilities for shuttle and other missions. NASA's "faster, better, cheaper" initiative and government procurement reform led to deliberately dismantling traditional reliability engineering. The Columbia tragedy and Mars mission failures followed. Failures can be attributed to blunders, normal accidents, or bad luck. Achieving high reliability is difficult but possible.

  1. Frontiers of reliability

    CERN Document Server

    Basu, Asit P; Basu, Sujit K

    1998-01-01

    This volume presents recent results in reliability theory by leading experts in the world. It will prove valuable for researchers and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiment, mixture of Weibul

  2. 76 FR 23171 - Electric Reliability Organization Interpretations of Interconnection Reliability Operations and...

    Science.gov (United States)

    2011-04-26

    ... Reliability Standards for the Bulk-Power System, Order No. 693, FERC Stats. & Regs. ] 31,242, order on reh'g...-Power System reliability may request an interpretation of a Reliability Standard.\\7\\ The ERO's standards... information in its reliability assessments. The Reliability Coordinator must monitor Bulk Electric System...

  3. Accurate and precise determination of small quantity uranium by means of automatic potentiometric titration

    International Nuclear Information System (INIS)

    Liu Quanwei; Luo Zhongyan; Zhu Haiqiao; Wu Jizong

    2007-01-01

    Because of the high radioactivity of dissolved spent fuel solutions and uranium product solutions, the radiation hazard must be considered and reduced as far as possible during accurate determination of uranium. In this work, automatic potentiometric titration was applied, and a sample containing only 10 mg of uranium was taken in order to reduce the radiation exposure of the analyzer. The RSD was <0.06%, and at the same time the result can be corrected for a more reliable and accurate measurement. The method effectively reduces the radiation exposure of the analyzer and meets the requirement of reliable, accurate measurement of uranium. (authors)
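The RSD acceptance criterion above can be sketched numerically. The replicate values below are hypothetical, chosen only to illustrate the <0.06% check:

```python
import statistics

def relative_std_dev(results):
    """Percent relative standard deviation (RSD) of replicate measurements."""
    return statistics.stdev(results) / statistics.mean(results) * 100.0

# Hypothetical replicate results for a ~10 mg uranium sample (mg):
replicates = [10.002, 10.004, 10.001, 10.003, 10.002]
rsd = relative_std_dev(replicates)
print(f"RSD = {rsd:.4f}%")
print("meets the <0.06% criterion:", rsd < 0.06)
```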

  4. METHODS OF IMPROVING THE RELIABILITY OF THE CONTROL SYSTEM TRACTION POWER SUPPLY OF ELECTRIC TRANSPORT BASED ON AN EXPERT INFORMATION

    Directory of Open Access Journals (Sweden)

    O. O. Matusevych

    2009-03-01

    Full Text Available The author proposes numerous methods of solving a multi-criterion task: increasing the reliability of a control system on the basis of expert information. The information presented allows a well-considered choice of the method of reliability improvement for an electric transport control system.

  5. The Accelerator Reliability Forum

    CERN Document Server

    Lüdeke, Andreas; Giachino, R

    2014-01-01

    High reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with high reliability. In order to optimize the overall reliability of an accelerator, one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution describes the forum and advertises its usage in the community.

  6. Extracting information from an ensemble of GCMs to reliably assess future global runoff change

    NARCIS (Netherlands)

    Sperna Weiland, F.C.; Beek, L.P.H. van; Weerts, A.H.; Bierkens, M.F.P.

    2011-01-01

    Future runoff projections derived from different global climate models (GCMs) show large differences. Therefore, within this study, information from multiple GCMs has been combined to better assess hydrological changes. For projections of precipitation and temperature the Reliability ensemble

  7. Effectiveness of different approaches to disseminating traveler information on travel time reliability. [supporting datasets

    Science.gov (United States)

    2013-11-30

    Travel time reliability information includes static data about traffic speeds or trip times that capture historic variations from day to day, and it can help individuals understand the level of variation in traffic. Unlike real-time travel time infor...

  8. The reliability and validity of the informant AD8 by comparison with a series of cognitive assessment tools in primary healthcare.

    Science.gov (United States)

    Shaik, Muhammad Amin; Xu, Xin; Chan, Qun Lin; Hui, Richard Jor Yeong; Chong, Steven Shih Tsze; Chen, Christopher Li-Hsian; Dong, YanHong

    2016-03-01

    The validity and reliability of the informant AD8 in primary healthcare have not been established. Therefore, the present study examined the validity and reliability of the informant AD8 in government-subsidized primary healthcare centers in Singapore. Eligible patients (≥60 years old) were recruited from primary healthcare centers and their informants received the AD8. Patient-informant dyads who agreed to further cognitive assessments received the Mini-Mental State Examination (MMSE), Montreal Cognitive Assessment (MoCA), Clinical Dementia Rating (CDR), and a locally validated formal neuropsychological battery at a research center in a tertiary hospital. 1,082 informants completed the AD8 assessment at two primary healthcare centers. Of these, 309 patient-informant dyads were further assessed, of whom 243 (78.6%) were CDR = 0; 22 (7.1%) were CDR = 0.5; and 44 (14.2%) were CDR ≥ 1. The mean administration time of the informant AD8 was 2.3 ± 1.0 minutes. The informant AD8 demonstrated good internal consistency (Cronbach's α = 0.85), inter-rater reliability (Intraclass Correlation Coefficient (ICC) = 0.85), and test-retest reliability (weighted κ = 0.80). Concurrent validity, as measured by the correlation between total AD8 scores and global CDR (R = 0.65), and construct validity, as measured by convergent validity (R ≥ 0.4) between individual items of the AD8 and the CDR and neuropsychological domains, were acceptable. The informant AD8 demonstrated good concurrent and construct validity and is a reliable measure to detect cognitive dysfunction in primary healthcare.
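Cronbach's α, the internal-consistency statistic reported above, can be computed directly from an item-score matrix. A minimal sketch with simulated binary responses (illustrative data, not the study's):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of total scores
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# Simulated informant responses: 8 binary items driven by a common
# latent impairment level (purely illustrative, not the study's data).
rng = np.random.default_rng(0)
latent = rng.random(200)
scores = (rng.random((200, 8)) < latent[:, None]).astype(int)
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")
```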

  9. The Outcome and Assessment Information Set (OASIS): A Review of Validity and Reliability

    Science.gov (United States)

    O’CONNOR, MELISSA; DAVITT, JOAN K.

    2015-01-01

    The Outcome and Assessment Information Set (OASIS) is the patient-specific, standardized assessment used in Medicare home health care to plan care, determine reimbursement, and measure quality. Since its inception in 1999, there has been debate over the reliability and validity of the OASIS as a research tool and outcome measure. A systematic literature review of English-language articles identified 12 studies published in the last 10 years examining the validity and reliability of the OASIS. Empirical findings indicate the validity and reliability of the OASIS range from low to moderate but vary depending on the item studied. Limitations in the existing research include: nonrepresentative samples; inconsistencies in methods used, items tested, measurement, and statistical procedures; and the changes to the OASIS itself over time. The inconsistencies suggest that these results are tentative at best; additional research is needed to confirm the value of the OASIS for measuring patient outcomes, research, and quality improvement. PMID:23216513

  10. An Accurate Integral Method for Vibration Signal Based on Feature Information Extraction

    Directory of Open Access Journals (Sweden)

    Yong Zhu

    2015-01-01

    Full Text Available After summarizing the advantages and disadvantages of current integral methods, a novel vibration signal integral method based on feature information extraction was proposed. This method took full advantage of the self-adaptive filter characteristic and waveform correction feature of ensemble empirical mode decomposition in dealing with nonlinear and nonstationary signals. This research merged the superiorities of kurtosis, mean square error, energy, and singular value decomposition on signal feature extraction. The values of the four indexes aforementioned were combined into a feature vector. Then, the connotative characteristic components in vibration signal were accurately extracted by Euclidean distance search, and the desired integral signals were precisely reconstructed. With this method, the interference problem of invalid signal such as trend item and noise which plague traditional methods is commendably solved. The great cumulative error from the traditional time-domain integral is effectively overcome. Moreover, the large low-frequency error from the traditional frequency-domain integral is successfully avoided. Comparing with the traditional integral methods, this method is outstanding at removing noise and retaining useful feature information and shows higher accuracy and superiority.
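The feature-vector selection step described above (kurtosis, mean square error, energy, and a singular value combined, then matched by Euclidean distance) can be sketched as follows. The signals, window length, and normalization are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def feature_vector(x, ref):
    """Kurtosis, mean-square error vs. a reference, energy, and the leading
    singular value of a sliding-window (trajectory) matrix of x."""
    x = np.asarray(x, dtype=float)
    kurt = np.mean((x - x.mean()) ** 4) / x.var() ** 2
    mse = np.mean((x - ref) ** 2)
    energy = np.sum(x ** 2)
    H = np.lib.stride_tricks.sliding_window_view(x, 32)
    sv = np.linalg.svd(H, compute_uv=False)[0]
    return np.array([kurt, mse, energy, sv])

def select_component(components, ref):
    """Index of the component whose feature vector lies closest (Euclidean,
    per-feature normalized) to the reference's feature vector."""
    feats = np.array([feature_vector(c, ref) for c in components])
    target = feature_vector(ref, ref)
    scale = feats.std(axis=0) + 1e-12              # avoid division by zero
    d = np.linalg.norm((feats - target) / scale, axis=1)
    return int(np.argmin(d))

t = np.linspace(0, 1, 512, endpoint=False)
ref = np.sin(2 * np.pi * 10 * t)                   # desired oscillation
rng = np.random.default_rng(1)
components = [0.3 * t,                              # trend item
              ref + 0.05 * rng.standard_normal(512),  # signal + noise
              0.05 * rng.standard_normal(512)]        # pure noise
print("selected component:", select_component(components, ref))
```

With these inputs the noisy copy of the reference, not the trend item or the noise, is the nearest component, which is the behaviour the abstract's Euclidean-distance search relies on.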

  11. Accurate Information, Virtual Reality, Good Librarianship Doğru Bilgi, Sanal Gerçeklik, İyi Kütüphanecilik

    Directory of Open Access Journals (Sweden)

    M. Tayfun Gülle

    2010-03-01

    Full Text Available Departing from the idea that the internet, which has become a deep information tunnel, causes a problem in accessing "accurate information", it is argued that societies are imprisoned within the world of "virtual reality" by web 2.0/web 3.0 technologies and social media applications. In order to diagnose this problem correctly, the media used from past to present for accessing information are briefly described as "social tools." Furthermore, it is emphasised, from an editorial viewpoint, that the means of reaching accurate information can be increased via the freedom of expression channel which will be opened by "good librarianship" applications. IFLA Principles of Freedom of Expression and Good Librarianship is referred to at the end of the editorial.

  12. GP preferences for information systems: conjoint analysis of speed, reliability, access and users.

    Science.gov (United States)

    Wyatt, Jeremy C; Batley, Richard P; Keen, Justin

    2010-10-01

    To elicit the preferences and trade-offs of UK general practitioners about key features of health information systems, to help inform the design of such systems in future. A stated choice study to uncover implicit preferences based on a binary choice between scenarios presented in random order. Participants were all 303 general practice members of the UK Internet service provider Medix, who were approached by email to participate. The main outcome measure was the number of seconds of delay in system response that general practitioners were willing to trade off for each key system feature: the reliability of the system, the sites from which the system could be accessed, and which staff are able to view patient data. Doctors valued speed of response most in information systems but would be prepared to wait 28 seconds to access a system in exchange for improved reliability from 95% to 99%, a further 2 seconds for an improvement to 99.9%, and 27 seconds for access to data from anywhere, including their own home, compared with one place in a single health care premises. However, they would require a system that was 14 seconds faster to compensate for allowing social care as well as National Health Service staff to read patient data. These results provide important new evidence about which system characteristics doctors value highly, and hence which characteristics designers need to focus on when large-scale health information systems are planned. © 2010 Blackwell Publishing Ltd.
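In a stated-choice (binary logit) model, the willingness-to-wait for a feature is the ratio of that feature's utility coefficient to the per-second delay coefficient. The coefficients below are hypothetical, picked only to reproduce the trade-offs reported above:

```python
# Part-worth utilities from a binary-choice (logit) model; the numbers are
# hypothetical, chosen to reproduce the trade-offs reported in the abstract.
beta_delay = -0.05            # utility per extra second of response delay
betas = {
    "reliability 95% -> 99%": 1.40,
    "reliability 99% -> 99.9%": 0.10,
    "access from anywhere": 1.35,
    "social care staff may read data": -0.70,   # a disutility
}

# Willingness to wait (seconds) = beta_feature / -beta_delay
for feature, b in betas.items():
    print(f"{feature}: {b / -beta_delay:+.0f} s")
```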

  13. Enhancing operability and reliability through configuration management

    International Nuclear Information System (INIS)

    Hancock, L.R.

    1993-01-01

    This paper describes the evolution of plant design control techniques from the early 1970s to today's operating environment, which demands accurate, up-to-date design data. This evolution of design control is responsible for the increasingly troublesome scenario of design data being very difficult to locate and, when found, of questionable credibility. The design information could be suspect because there are discrepancies between two or more source documents or because there is a difference between the design documents and the physical configuration of the plant. This paper discusses the impact these design control problems are having on plant operations and presents common sense solutions for improving configuration management techniques to ultimately enhance operability and reliability

  14. A data-informed PIF hierarchy for model-based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Groth, Katrina M.; Mosleh, Ali

    2012-01-01

    This paper addresses three problems associated with the use of Performance Shaping Factors in Human Reliability Analysis. (1) There are more than a dozen Human Reliability Analysis (HRA) methods that use Performance Influencing Factors (PIFs) or Performance Shaping Factors (PSFs) to model human performance, but there is no standard set of PIFs used among the methods, nor is there a framework available to compare the PIFs used in various methods. (2) The PIFs currently in use are not defined specifically enough to ensure consistent interpretation of similar PIFs across methods. (3) There are few rules governing the creation, definition, and usage of PIF sets. This paper introduces a hierarchical set of PIFs that can be used for both qualitative and quantitative HRA. The proposed PIF set is arranged in a hierarchy that can be collapsed or expanded to meet multiple objectives. The PIF hierarchy has been developed with respect to a set of fundamental principles necessary for PIF sets, which are also introduced in this paper. This paper includes definitions of the PIFs to allow analysts to map the proposed PIFs onto current and future HRA methods. The standardized PIF hierarchy will allow analysts to combine different types of data and will therefore make the best use of the limited data in HRA. The collapsible hierarchy provides the structure necessary to combine multiple types of information without reducing the quality of the information.

  15. Measuring time and risk preferences: Reliability, stability, domain specificity

    NARCIS (Netherlands)

    Wölbert, E.M.; Riedl, A.M.

    2013-01-01

    To accurately predict behavior economists need reliable measures of individual time preferences and attitudes toward risk and typically need to assume stability of these characteristics over time and across decision domains. We test the reliability of two choice tasks for eliciting discount rates,

  16. As reliable as the sun

    Science.gov (United States)

    Leijtens, J. A. P.

    2017-11-01

    Fortunately, there is almost nothing as reliable as the sun, which can consequently be utilized as a very reliable source of spacecraft power. In order to harvest this power, the solar panels have to be pointed towards the sun as accurately and reliably as possible. To this end, sun sensors are available on almost every satellite to support vital sun-pointing capability throughout the mission, even in the deployment and safe mode phases of the satellite's life. Given the criticality of the application, one would expect that after more than 50 years of sun sensor utilisation, such sensors would be fully matured and optimised. In actual fact though, the majority of sun sensors employed are still coarse sun sensors, which have proven extremely reliable but present major issues regarding albedo sensitivity and pointing accuracy.

  17. Constructing the Best Reliability Data for the Job

    Science.gov (United States)

    Kleinhammer, R. K.; Kahn, J. C.

    2014-01-01

    Modern business and technical decisions are based on the results of analyses. When considering assessments using "reliability data", the concern is how long a system will continue to operate as designed. Generally, the results are only as good as the data used. Ideally, a large set of pass/fail tests or observations to estimate the probability of failure of the item under test would produce the best data. However, this is a costly endeavor if used for every analysis and design. Developing specific data is costly and time consuming. Instead, analysts rely on available data to assess reliability. Finding data relevant to the specific use and environment for any project is difficult, if not impossible. Instead, we attempt to develop the "best" or composite analog data to support our assessments. One method used incorporates processes for reviewing existing data sources and identifying the available information based on similar equipment, then using that generic data to derive an analog composite. Dissimilarities in equipment descriptions, environment of intended use, quality and even failure modes impact the "best" data incorporated in an analog composite. Once developed, this composite analog data provides a "better" representation of the reliability of the equipment or component and can be used to support early risk or reliability trade studies, or analytical models to establish the predicted reliability data points. Data that is more representative of reality and more project specific would provide more accurate analysis, and hopefully a better final decision.

  18. Constructing the "Best" Reliability Data for the Job

    Science.gov (United States)

    DeMott, D. L.; Kleinhammer, R. K.

    2014-01-01

    Modern business and technical decisions are based on the results of analyses. When considering assessments using "reliability data", the concern is how long a system will continue to operate as designed. Generally, the results are only as good as the data used. Ideally, a large set of pass/fail tests or observations to estimate the probability of failure of the item under test would produce the best data. However, this is a costly endeavor if used for every analysis and design. Developing specific data is costly and time consuming. Instead, analysts rely on available data to assess reliability. Finding data relevant to the specific use and environment for any project is difficult, if not impossible. Instead, we attempt to develop the "best" or composite analog data to support our assessments. One method used incorporates processes for reviewing existing data sources and identifying the available information based on similar equipment, then using that generic data to derive an analog composite. Dissimilarities in equipment descriptions, environment of intended use, quality and even failure modes impact the "best" data incorporated in an analog composite. Once developed, this composite analog data provides a "better" representation of the reliability of the equipment or component and can be used to support early risk or reliability trade studies, or analytical models to establish the predicted reliability data points. Data that is more representative of reality and more project specific would provide more accurate analysis, and hopefully a better final decision.

  19. Measure of Truck Delay and Reliability at the Corridor Level

    Science.gov (United States)

    2018-04-01

    Freight transportation provides a significant contribution to our nation's economy. A reliable and accessible freight network enables businesses in the Twin Cities to be more competitive in the Upper Midwest region. Accurate and reliable freight data...

  20. Factors to consider when planning a pipeline inspection: making an informed best choice

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Ben [GE Oil and Gas, PII Pipeline Solutions, Cramlington Northumberland (United Kingdom)

    2009-07-01

    When managing pipeline integrity, the quality of inspection information is critical to determining the true condition of the pipeline, and predicting its future condition over time. The cost of a pipeline failure is nearly always much more than the cost of using a quality inspection service to obtain accurate information on the condition of the line. With pressure to reduce costs for all services, the focus often falls on the two most visible areas, tool selection and short-term service cost. These are important factors, but what is equally important is the quality and reliability of the data obtained and its effect on pipeline integrity. Without reliable data our information on pipeline condition is uncertain at best. These issues of data quality are often not well understood since many specialist technical factors are involved. This paper highlights some of the issues that need to be considered so that their importance and the effort that goes into them can be appreciated better. At the end of the day, the prime requirement is a safe, accurate and reliable inspection, delivering a good specification over a wide range of pipeline conditions and flow velocities. (author)

  1. Do we need 3D tube current modulation information for accurate organ dosimetry in chest CT? Protocols dose comparisons.

    Science.gov (United States)

    Lopez-Rendon, Xochitl; Zhang, Guozhi; Coudyzer, Walter; Develter, Wim; Bosmans, Hilde; Zanca, Federica

    2017-11-01

    To compare the lung and breast dose associated with three chest protocols: standard, organ-based tube current modulation (OBTCM) and fast-speed scanning; and to estimate the error associated with organ dose when modelling the longitudinal (z-) TCM versus the 3D-TCM in Monte Carlo (MC) simulations for these three protocols. Five adult and three paediatric cadavers with different BMI were scanned. The CTDIvol of the OBTCM and the fast-speed protocols were matched to the patient-specific CTDIvol of the standard protocol. Lung and breast doses were estimated using MC with both z- and 3D-TCM simulated and compared between protocols. The fast-speed scanning protocol delivered the highest doses. A slight reduction in breast dose (up to 5.1%) was observed for two of the three female cadavers with the OBTCM in comparison to the standard. For both adult and paediatric cadavers, implementing the z-TCM data only for organ dose estimation resulted in 10.0% accuracy for the standard and fast-speed protocols, while relative dose differences were up to 15.3% for the OBTCM protocol. At identical CTDIvol values, the standard protocol delivered the lowest overall doses. Only for the OBTCM protocol is the 3D-TCM needed if accurate (<10.0%) organ dosimetry is desired. • The z-TCM information is sufficient for accurate dosimetry for standard protocols. • The z-TCM information is sufficient for accurate dosimetry for fast-speed scanning protocols. • For organ-based TCM schemes, the 3D-TCM information is necessary for accurate dosimetry. • At identical CTDIvol, the fast-speed scanning protocol delivered the highest doses. • Lung dose was higher in XCare than in the standard protocol at identical CTDIvol.

  2. Online patient information on Vagus Nerve Stimulation: How reliable is it for facilitating shared decision making?

    Science.gov (United States)

    Ved, Ronak; Cobbold, Naomi; Igbagiri, Kueni; Willis, Mark; Leach, Paul; Zaben, Malik

    2017-08-01

    This study evaluates the quality of information available on the internet for carers of children with epilepsy considering treatment with Vagus Nerve Stimulation (VNS). Selected key phrases were entered into two popular search engines (Google™, Yahoo™). These phrases were: "Vagus nerve stimulator", alone and in combination with "childhood epilepsy", "paediatric epilepsy" and "epilepsy in childhood"; "VNS", and "VNS epilepsy". The first 50 hits per search were then screened. Of 600 identified sites, duplicated (262), irrelevant (230) and inaccessible (15) results were excluded, leaving 93 websites for evaluation using the DISCERN instrument, an online validation tool for patient information websites. The mean DISCERN score of all analysed websites was 39/80 (49%; SD 13.5). This equates to Fair to borderline Poor global quality (Excellent=80-63; Good=62-51; Fair=50-39; Poor=38-27; Very poor=26-15). None of the analysed sites obtained an Excellent quality rating. 13% (12) obtained a Good score, 40% (37) an Average score, 35% (33) a Poor score, and 12% (11) a Very poor score. The cohort of websites scored particularly poorly on whether reliable, holistic information was presented, for instance provision of reliable sources (28%, SD 18) and discussion of alternative treatments (30%, SD 14). To facilitate patient-centred shared decision-making, high quality information needs to be available for patients and families considering VNS. This study identifies that such information is difficult to locate on the internet. There is a need to develop focussed and reliable online patient resources for VNS. Copyright © 2017 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.
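The DISCERN quality bands quoted above map total scores (15-80) to labels; a minimal sketch of that banding:

```python
def discern_band(total_score):
    """Map a total DISCERN score (range 15-80) to the quality bands
    quoted in the abstract: Excellent=80-63; Good=62-51; Fair=50-39;
    Poor=38-27; Very poor=26-15."""
    if not 15 <= total_score <= 80:
        raise ValueError("DISCERN totals range from 15 to 80")
    if total_score >= 63:
        return "Excellent"
    if total_score >= 51:
        return "Good"
    if total_score >= 39:
        return "Fair"
    if total_score >= 27:
        return "Poor"
    return "Very poor"

print(discern_band(39))   # the mean score reported for the analysed sites
```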

  3. On Bayesian reliability analysis with informative priors and censoring

    International Nuclear Information System (INIS)

    Coolen, F.P.A.

    1996-01-01

    In the statistical literature many methods have been presented to deal with censored observations, both within the Bayesian and non-Bayesian frameworks, and such methods have been successfully applied to, e.g., reliability problems. Also, in reliability theory it is often emphasized that, through shortage of statistical data and limited possibilities for experiments, one often needs to rely heavily on the judgements of engineers or other experts, for which Bayesian methods are attractive. It is therefore important that such judgements can be elicited easily to provide informative prior distributions that reflect the knowledge of the engineers well. In this paper we focus on this aspect, especially on the situation where the judgements of the consulted engineers are based on experiences in environments where censoring has also been present previously. We suggest the use of the attractive interpretation of hyperparameters of conjugate prior distributions when these are available for assumed parametric models for lifetimes, and we show how one may go beyond the standard conjugate priors, using similar interpretations of hyperparameters, to enable easier elicitation when censoring has been present in the past. This may even lead to more flexibility for modelling prior knowledge than when using standard conjugate priors, whereas the disadvantage of more complicated calculations that may be needed to determine posterior distributions plays a minor role due to the advanced mathematical and statistical software that is widely available these days
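A standard example of the conjugate-prior interpretation discussed above (an assumed textbook model, not the paper's own): for exponential lifetimes with a Gamma prior on the failure rate, right-censored units enter the posterior only through their accumulated exposure time:

```python
# Conjugate updating for exponential lifetimes with right censoring
# (a standard textbook model, assumed here for illustration): with a
# Gamma(a, b) prior on the failure rate, d observed failures, and total
# time on test T (failure plus censored exposure), the posterior is
# Gamma(a + d, b + T) -- censoring enters only through T.
failures = [120.0, 340.0, 510.0]      # observed failure times (hours)
censored = [600.0, 600.0]             # units still running at 600 h

a, b = 2.0, 1000.0                    # prior (mean rate a/b = 0.002 per h)
d = len(failures)
T = sum(failures) + sum(censored)

a_post, b_post = a + d, b + T
print(f"posterior mean failure rate: {a_post / b_post:.5f} per hour")
print(f"1 / posterior mean rate: {b_post / a_post:.0f} hours")
```

The hyperparameters read naturally as "a prior failures observed over b prior hours of exposure", which is the kind of elicitation-friendly interpretation the abstract refers to.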

  4. An automated method for estimating reliability of grid systems using Bayesian networks

    International Nuclear Information System (INIS)

    Doguc, Ozge; Emmanuel Ramirez-Marquez, Jose

    2012-01-01

    Grid computing has become relevant due to its applications to large-scale resource sharing, wide-area information transfer, and multi-institutional collaboration. In general, in grid computing a service requests the use of a set of resources, available in a grid, to complete certain tasks. Although analysis tools and techniques for these types of systems have been studied, grid reliability analysis is generally computation-intensive due to the complexity of the system. Moreover, conventional reliability models rest on common assumptions that cannot be applied to grid systems. Therefore, new analytical methods are needed for effective and accurate assessment of grid reliability. This study presents a new method for estimating grid service reliability which, unlike previous studies, does not require prior knowledge about the grid system structure. Moreover, the proposed method does not rely on any assumptions about the link and node failure rates. The approach is based on a data-mining algorithm, K2, that discovers the grid system structure from raw historical system data and allows finding minimum resource spanning trees (MRST) within the grid; it then uses Bayesian networks (BN) to model the MRST and estimate grid service reliability.
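A rough sketch of the MRST idea: once the minimal resource sets are known, service reliability can be estimated by sampling element states. The topology, element reliabilities, and MRSTs below are hypothetical; the paper's K2/BN machinery is not reproduced here:

```python
import random

# Hypothetical grid: three nodes and three links with assumed reliabilities.
element_reliability = {"n1": 0.95, "n2": 0.90, "n3": 0.98,
                       "l12": 0.99, "l13": 0.97, "l23": 0.96}
# Hypothetical minimum resource spanning trees for one service request.
mrsts = [{"n1", "l12", "n2"}, {"n1", "l13", "n3"}]

def service_reliability(trials=100_000, seed=42):
    """Fraction of sampled element-state vectors in which at least one
    MRST has all of its elements up."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(trials):
        up = {e for e, p in element_reliability.items() if rng.random() < p}
        if any(tree <= up for tree in mrsts):
            ok += 1
    return ok / trials

print(f"estimated service reliability: {service_reliability():.3f}")
```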

  5. Identification of Classified Information in Unclassified DoD Systems During the Audit of Internal Controls and Data Reliability in the Deployable Disbursing System

    Science.gov (United States)

    2009-02-17

    Identification of Classified Information in Unclassified DoD Systems During the Audit of Internal Controls and Data Reliability in the Deployable Disbursing System (Report No. D-2009-054). We are providing this

  6. Epistemic Trust and Education: Effects of Informant Reliability on Student Learning of Decimal Concepts

    Science.gov (United States)

    Durkin, Kelley; Shafto, Patrick

    2016-01-01

    The epistemic trust literature emphasizes that children's evaluations of informants' trustworthiness affect learning, but there is no evidence that epistemic trust affects learning in academic domains. The current study investigated how informant reliability affects decimal learning. Fourth and fifth graders (N = 122; mean age = 10.1 years)…

  7. Reliability, Validity, Comparability and Practical Utility of Cybercrime-Related Data, Metrics, and Information

    OpenAIRE

    Nir Kshetri

    2013-01-01

    With an increasing pervasiveness, prevalence and severity of cybercrimes, various metrics, measures and statistics have been developed and used to measure various aspects of this phenomenon. Cybercrime-related data, metrics, and information, however, pose important and difficult dilemmas regarding the issues of reliability, validity, comparability and practical utility. While many of the issues of the cybercrime economy are similar to other underground and underworld industries, this economy ...

  8. The Data Evaluation for Obtaining Accuracy and Reliability

    International Nuclear Information System (INIS)

    Kim, Chang Geun; Chae, Kyun Shik; Lee, Sang Tae; Bhang, Gun Woong

    2012-01-01

    Numerous scientific measurement results flood in from papers, data books, and other sources with the fast growth of the internet. We encounter many different measurement results for the same measurand, and at that moment we must choose the most reliable one among them. But it is not as easy to choose and use accurate and reliable data as it is to choose at an ice cream parlor. Even expert users find it difficult to distinguish accurate and reliable scientific data from the huge volume of measurement results. For this reason, data evaluation is becoming more important with the fast growth of the internet and globalization. Furthermore, the expression of measurement results is not standardized. In response to these needs, an international movement has gathered pace. As a first step, the global harmonization of terminology used in metrology and of the expression of uncertainty in measurement was published by ISO. These methods have spread widely across many areas of science to obtain accuracy and reliability in measurement. In this paper, the GUM, SRD and data evaluation on atomic collisions are introduced.
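The GUM's law of propagation of uncertainty, central to the harmonized expression of measurement results mentioned above, can be sketched for uncorrelated inputs. The measurement model P = V²/R and its numbers are illustrative assumptions:

```python
import math

# GUM law of propagation of uncertainty for uncorrelated inputs:
#   u_c(y)^2 = sum_i (dy/dx_i)^2 * u(x_i)^2
# Illustrative measurement model (assumed): P = V**2 / R.
V, u_V = 10.0, 0.02       # volts and standard uncertainty
R, u_R = 50.0, 0.10       # ohms and standard uncertainty

c_V = 2 * V / R           # sensitivity coefficient dP/dV
c_R = -(V ** 2) / R ** 2  # sensitivity coefficient dP/dR
u_P = math.hypot(c_V * u_V, c_R * u_R)

print(f"P = {V ** 2 / R:.3f} W, combined standard uncertainty = {u_P:.4f} W")
```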

  9. A dynamic discretization method for reliability inference in Dynamic Bayesian Networks

    International Nuclear Information System (INIS)

    Zhu, Jiandao; Collette, Matthew

    2015-01-01

    The material and modeling parameters that drive structural reliability analysis for marine structures are subject to a significant uncertainty. This is especially true when time-dependent degradation mechanisms such as structural fatigue cracking are considered. Through inspection and monitoring, information such as crack location and size can be obtained to improve these parameters and the corresponding reliability estimates. Dynamic Bayesian Networks (DBNs) are a powerful and flexible tool to model dynamic system behavior and update reliability and uncertainty analysis with life cycle data for problems such as fatigue cracking. However, a central challenge in using DBNs is the need to discretize certain types of continuous random variables to perform network inference while still accurately tracking low-probability failure events. Most existing discretization methods focus on getting the overall shape of the distribution correct, with less emphasis on the tail region. Therefore, a novel scheme is presented specifically to estimate the likelihood of low-probability failure events. The scheme is an iterative algorithm which dynamically partitions the discretization intervals at each iteration. Through applications to two stochastic crack-growth example problems, the algorithm is shown to be robust and accurate. Comparisons are presented between the proposed approach and existing methods for the discretization problem. - Highlights: • A dynamic discretization method is developed for low-probability events in DBNs. • The method is compared to existing approaches on two crack growth problems. • The method is shown to improve on existing methods for low-probability events
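The effect the authors target, that a fixed discretization biases low-probability tail estimates while iteratively splitting the interval containing the failure threshold removes the bias, can be illustrated with a standard normal variable. The scheme below is a simplified stand-in, not the paper's algorithm:

```python
import math

def Phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def tail_estimate(edges, threshold):
    """Failure probability when each interval is classified whole by its
    midpoint (the coarse rule a fixed discretization implies)."""
    return sum(Phi(hi) - Phi(lo)
               for lo, hi in zip(edges, edges[1:])
               if (lo + hi) / 2.0 > threshold)

threshold = 1.7                                  # hypothetical failure limit
true_tail = 1.0 - Phi(threshold)
edges = [float(x) for x in range(-8, 9)]         # coarse unit-width grid
coarse = tail_estimate(edges, threshold)
for _ in range(12):                              # iteratively split the
    i = next(k for k in range(len(edges) - 1)    # interval containing the
             if edges[k] <= threshold < edges[k + 1])  # threshold
    edges.insert(i + 1, (edges[i] + edges[i + 1]) / 2.0)
refined = tail_estimate(edges, threshold)
print(f"true {true_tail:.5f}  coarse {coarse:.5f}  refined {refined:.5f}")
```

The error of the refined estimate is bounded by the probability mass of the ever-shrinking interval containing the threshold, which is the intuition behind partitioning intervals dynamically at each iteration.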

  10. Leveraging Two Kinect Sensors for Accurate Full-Body Motion Capture

    Directory of Open Access Journals (Sweden)

    Zhiquan Gao

    2015-09-01

    Full Text Available Accurate motion capture plays an important role in sports analysis, the medical field and virtual reality. Current methods for motion capture often suffer from occlusions, which limit the accuracy of their pose estimation. In this paper, we propose a complete system to measure the pose parameters of the human body accurately. Different from previous monocular depth camera systems, we leverage two Kinect sensors to acquire more information about human movements, which ensures that we can still get an accurate estimation even when significant occlusion occurs. Because human motion is temporally coherent, we adopt a learning-based analysis to mine the temporal information across posture variations. Using this information, we estimate human pose parameters accurately, regardless of rapid movement. Our experimental results show that our system can perform an accurate pose estimation of the human body with the constraint of information from the temporal domain.

  11. PREP KITT, System Reliability by Fault Tree Analysis. PREP, Min Path Set and Min Cut Set for Fault Tree Analysis, Monte-Carlo Method. KITT, Component and System Reliability Information from Kinetic Fault Tree Theory

    International Nuclear Information System (INIS)

    Vesely, W.E.; Narum, R.E.

    1997-01-01

    1 - Description of problem or function: The PREP/KITT computer program package obtains system reliability information from a system fault tree. The PREP program finds the minimal cut sets and/or the minimal path sets of the system fault tree. (A minimal cut set is a smallest set of components such that if all the components are simultaneously failed the system is failed. A minimal path set is a smallest set of components such that if all of the components are simultaneously functioning the system is functioning.) The KITT programs determine reliability information for the components of each minimal cut or path set, for each minimal cut or path set, and for the system. Exact, time-dependent reliability information is determined for each component and for each minimal cut set or path set. For the system, reliability results are obtained by upper bound approximations or by a bracketing procedure in which various upper and lower bounds may be obtained as close to one another as desired. The KITT programs can handle independent components which are non-repairable or which have a constant repair time. Any assortment of non-repairable components and components having constant repair times can be considered. Any inhibit conditions having constant probabilities of occurrence can be handled. The failure intensity of each component is assumed to be constant with respect to time. The KITT2 program can also handle components which during different time intervals, called phases, may have different reliability properties. 2 - Method of solution: The PREP program obtains minimal cut sets by either direct deterministic testing or by an efficient Monte Carlo algorithm. The minimal path sets are obtained using the Monte Carlo algorithm. The reliability information is obtained by the KITT programs from numerical solution of the simple integral balance equations of kinetic tree theory. 
3 - Restrictions on the complexity of the problem: The PREP program will obtain the minimal cut and
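
    The top-down expansion PREP performs to find minimal cut sets can be sketched as a small MOCUS-style algorithm. The gate encoding, function name, and two-gate example tree below are illustrative assumptions, not taken from PREP itself:

```python
def minimal_cut_sets(gates, top):
    """Expand a fault tree into its minimal cut sets (MOCUS-style sketch).

    `gates` maps a gate name to ("AND" | "OR", [inputs]); any name not in
    `gates` is treated as a basic component event.
    """
    rows = [[top]]
    while True:
        expanded = False
        new_rows = []
        for row in rows:
            gate = next((e for e in row if e in gates), None)
            if gate is None:
                new_rows.append(row)
                continue
            expanded = True
            op, inputs = gates[gate]
            rest = [e for e in row if e != gate]
            if op == "AND":
                # AND gate: all inputs join the same row.
                new_rows.append(rest + inputs)
            else:
                # OR gate: split the row, one copy per input.
                for inp in inputs:
                    new_rows.append(rest + [inp])
        rows = new_rows
        if not expanded:
            break
    # Minimize: drop any cut set that strictly contains another.
    sets = {frozenset(r) for r in rows}
    return {s for s in sets if not any(t < s for t in sets)}

# Toy tree: TOP fails if component C fails, or if A and B both fail.
gates = {"TOP": ("OR", ["G1", "C"]), "G1": ("AND", ["A", "B"])}
cuts = minimal_cut_sets(gates, "TOP")  # two minimal cut sets: {C} and {A, B}
```

A real fault-tree code would add the Monte Carlo path-set search and the kinetic-tree reliability integrals on top of this expansion step.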

  12. Recommendations for certification or measurement of reliability for reliable digital archival repositories with emphasis on access

    Directory of Open Access Journals (Sweden)

    Paula Regina Ventura Amorim Gonçalez

    2017-04-01

    Full Text Available Introduction: Considering the guidelines of ISO 16363:2012 (Space data and information transfer systems -- Audit and certification of trustworthy digital repositories) and the text of CONARQ Resolution 39 for certification of a Reliable Digital Archival Repository (RDC-Arq), we verify which technical recommendations should serve as the basis for a digital archival repository to be considered reliable. Objective: Identify requirements for the creation of Reliable Digital Archival Repositories, with emphasis on access to information, from ISO 16363:2012 and CONARQ Resolution 39. Methodology: The study is an exploratory, descriptive and documentary theoretical investigation, since it is based on ISO 16363:2012 and CONARQ Resolution 39. From the perspective of the problem approach, the study is qualitative and quantitative, since the data were collected, tabulated, and analyzed through interpretation of their contents. Results: We present a set of checklist recommendations for reliability measurement and/or certification of an RDC-Arq, focused on identifying requirements with emphasis on access to information. Conclusions: The right to information, as well as access to reliable information, is a premise for digital archival repositories; the set of recommendations is therefore directed to archivists who work in digital repositories and wish to verify the requirements necessary to evaluate a repository's reliability, and may also guide information professionals in collecting requirements for repository reliability certification.

  13. A Simple and Accurate Method for Measuring Enzyme Activity.

    Science.gov (United States)

    Yip, Din-Yan

    1997-01-01

    Presents methods commonly used for investigating enzyme activity using catalase and presents a new method for measuring catalase activity that is more reliable and accurate. Provides results that are readily reproduced and quantified. Can also be used for investigations of enzyme properties such as the effects of temperature, pH, inhibitors,…

  14. Visual reliability and information rate in the retina of a nocturnal bee.

    Science.gov (United States)

    Frederiksen, Rikard; Wcislo, William T; Warrant, Eric J

    2008-03-11

    Nocturnal animals relying on vision typically have eyes that are optically and morphologically adapted for both increased sensitivity and greater information capacity in dim light. Here, we investigate whether adaptations for increased sensitivity also are found in their photoreceptors by using closely related and fast-flying nocturnal and diurnal bees as model animals. The nocturnal bee Megalopta genalis is capable of foraging and homing by using visually discriminated landmarks at starlight intensities. Megalopta's near relative, Lasioglossum leucozonium, performs these tasks only in bright sunshine. By recording intracellular responses to Gaussian white-noise stimuli, we show that photoreceptors in Megalopta actually code less information at most light levels than those in Lasioglossum. However, as in several other nocturnal arthropods, Megalopta's photoreceptors possess a much greater gain of transduction, indicating that nocturnal photoreceptors trade information capacity for sensitivity. By sacrificing photoreceptor signal-to-noise ratio and information capacity in dim light for an increased gain and, thus, an increased sensitivity, this strategy can benefit nocturnal insects that use neural summation to improve visual reliability at night.

  15. Improved Reliability-Based Optimization with Support Vector Machines and Its Application in Aircraft Wing Design

    Directory of Open Access Journals (Sweden)

    Yu Wang

    2015-01-01

    Full Text Available A new reliability-based design optimization (RBDO) method based on support vector machines (SVM) and the Most Probable Point (MPP) is proposed in this work. SVM is used to create a surrogate model of the limit-state function at the MPP with the gradient information in the reliability analysis. This guarantees that the surrogate model not only passes through the MPP but also is tangent to the limit-state function at the MPP. Then, importance sampling (IS) is used to calculate the probability of failure based on the surrogate model. This treatment significantly improves the accuracy of reliability analysis. For RBDO, the Sequential Optimization and Reliability Assessment (SORA) is employed as well, which decouples deterministic optimization from the reliability analysis. The improved SVM-based reliability analysis is used to amend the error from the linear approximation of the limit-state function in SORA. A mathematical example and a simplified aircraft wing design demonstrate that the improved SVM-based reliability analysis is more accurate than FORM and requires fewer training points than Monte Carlo simulation, and that the proposed optimization strategy is efficient.
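
    The importance-sampling step described above can be sketched in isolation. The sketch below assumes a toy linear limit state in standard normal space instead of an SVM surrogate; the function name, sample size, and MPP value are illustrative:

```python
import math
import random

def failure_probability_is(g, mpp, n=20000, seed=1):
    """Estimate P(g(u) <= 0) for independent standard-normal u by
    importance sampling centered at the most probable point (MPP)."""
    rng = random.Random(seed)
    norm_mu2 = sum(m * m for m in mpp)
    total = 0.0
    for _ in range(n):
        u = [rng.gauss(m, 1.0) for m in mpp]
        if g(u) <= 0:
            # Likelihood ratio phi(u) / phi(u - mpp) for unit covariance.
            dot = sum(ui * mi for ui, mi in zip(u, mpp))
            total += math.exp(0.5 * norm_mu2 - dot)
    return total / n

# Toy limit state g(u) = 5 - u1 - u2; the MPP is at (2.5, 2.5) and the
# exact failure probability is Phi(-5/sqrt(2)) ~ 2.0e-4.
g = lambda u: 5.0 - u[0] - u[1]
pf = failure_probability_is(g, mpp=[2.5, 2.5])
```

Sampling around the MPP places roughly half the samples in the failure domain, so a rare event that plain Monte Carlo would need millions of samples to resolve is estimated accurately with a few thousand.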

  16. The Centralized Reliability Data Organization (CREDO); an advanced nuclear reactor reliability, availability, and maintainability data bank and data analysis center

    International Nuclear Information System (INIS)

    Knee, H.E.

    1991-01-01

    The Centralized Reliability Data Organization (CREDO) is a data bank and data analysis center, which since 1985 has been jointly sponsored by the US Department of Energy's (US DOE's) Office of Technology Support Programs and Japan's Power Reactor and Nuclear Fuel Development Corporation (PNC). It focuses on reliability, availability and maintainability (RAM) data for components (e.g. valves, pumps, etc.) operating in advanced nuclear reactor facilities. As originally intended, the purpose of the CREDO system was to provide a centralized source of accurate, up-to-date data and information for use in RAM analyses necessary for meeting DOE's data needs in the areas of advanced reactor safety assessments, design and licensing. In particular, creation of the CREDO system was considered an essential element needed to fulfill the DOE Breeder Reactor Safety Program's commitment of 'identifying and exploiting areas in which probabilistic methods can be developed and used in making reactor safety Research and Development choices and optimizing designs of safety systems'. CREDO and its operation are explained. (author)

  17. Plant Reliability - an Integrated System for Management (PR-ISM)

    International Nuclear Information System (INIS)

    Aukeman, M.C.; Leininger, E.G.; Carr, P.

    1984-01-01

    The Toledo Edison Company, located in Toledo, Ohio, United States of America, recently implemented a comprehensive maintenance management information system for the Davis-Besse Nuclear Power Station. The system is called PR-ISM, meaning Plant Reliability - An Integrated System for Management. PR-ISM provides the tools needed by station management to effectively plan and control maintenance and other plant activities. The PR-ISM system as it exists today consists of four integrated computer applications: equipment data base maintenance, maintenance work order control, administrative activity tracking, and technical specification compliance. PR-ISM is designed as an integrated on-line system and incorporates strong human factors features. PR-ISM provides each responsible person information to do his job on a daily basis and to look ahead towards future events. It goes beyond 'after the fact' reporting. In this respect, PR-ISM is an 'interactive' control system which: captures work requirements and commitments as they are identified, provides accurate and up-to-date status immediately to those who need it, simplifies paperwork and reduces the associated time delays, provides the information base for work management and reliability analysis, and improves productivity by replacing clerical tasks and consolidating maintenance activities. The functional and technical features of PR-ISM, the experience of Toledo Edison during the first year of operation, and the factors which led to the success of the development project are highlighted. (author)

  18. Mathematical reliability an expository perspective

    CERN Document Server

    Mazzuchi, Thomas; Singpurwalla, Nozer

    2004-01-01

    In this volume consideration was given to more advanced theoretical approaches and novel applications of reliability to ensure that topics having a futuristic impact were specifically included. Topics like finance, forensics, information, and orthopedics, as well as the more traditional reliability topics were purposefully undertaken to make this collection different from the existing books in reliability. The entries have been categorized into seven parts, each emphasizing a theme that seems poised for the future development of reliability as an academic discipline with relevance. The seven parts are networks and systems; recurrent events; information and design; failure rate function and burn-in; software reliability and random environments; reliability in composites and orthopedics, and reliability in finance and forensics. Embedded within the above are some of the other currently active topics such as causality, cascading, exchangeability, expert testimony, hierarchical modeling, optimization and survival...

  19. Geometric information provider platform

    Directory of Open Access Journals (Sweden)

    Meisam Yousefzadeh

    2015-07-01

    Full Text Available Renovation of existing buildings is known as an essential stage in the reduction of energy loss. A considerable part of the renovation process depends on geometric reconstruction of the building based on semantic parameters. Following many research projects which focused on parameterizing energy usage, various energy modelling methods were developed during the last decade. On the other hand, with the development of accurate measuring tools such as laser scanners, interest in accurate 3D building models is rapidly growing. But automating 3D building generation from laser point clouds, or detecting specific objects within them, is still a challenge. The goal is to design a platform through which the required geometric information can be efficiently produced to support energy simulation software. Developing a reliable procedure which extracts the required information from measured data and delivers it to a standard energy modelling system is the main purpose of the project.

  20. Reliability Engineering

    CERN Document Server

    Lazzaroni, Massimo

    2012-01-01

    This book gives a practical guide for designers and users in the Information and Communication Technology context. In particular, in the first section, the definitions of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing the laboratory tests, highlights the reliability concept from the experimental point of view. In the ICT context, the failure rate for a given system can be

  1. Reliability training

    Science.gov (United States)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  2. The European reliability data system. An organized information exchange on the operation of European nuclear reactors

    International Nuclear Information System (INIS)

    Mancini, G.; Amesz, J.; Bastianini, P.; Capobianchi, S.

    1983-01-01

    The paper reviews the aims and objectives of the European Reliability Data System (ERDS), a centralized system collecting and organizing, at the European level, information related to the operation of LWRs. The ERDS project was started in 1977 and, after a preliminary feasibility study that ended in 1979, is now proceeding towards the final design and implementation stages. ERDS exploits information collected in national data systems and information deriving from single reactor sources. The paper first describes the development of the four data banks constituting the system: Component Event Data Bank, CEDB; Abnormal Occurrences Reporting System, AORS; Operating Unit Status Report, OUSR; and Generic Reliability Parameter Data Bank, GRPDB. Several typical aspects of the project are then outlined, from the need for homogenization of data, and therefore for setting up reference classifications, to the problem of data transcoding and input into the system. Furthermore, the need to involve nuclear power plant operators much more deeply in the process of data acquisition, by providing them with a useful feedback from the data analysis, is stressed. (author)

  3. Toward accurate and precise estimates of lion density.

    Science.gov (United States)

    Elliot, Nicholas B; Gopalaswamy, Arjun M

    2017-08-01

    Reliable estimates of animal density are fundamental to understanding ecological processes and population dynamics. Furthermore, their accuracy is vital to conservation because wildlife authorities rely on estimates to make decisions. However, it is notoriously difficult to accurately estimate density for wide-ranging carnivores that occur at low densities. In recent years, significant progress has been made in density estimation of Asian carnivores, but the methods have not been widely adapted to African carnivores, such as lions (Panthera leo). Although abundance indices for lions may produce poor inferences, they continue to be used to estimate density and inform management and policy. We used sighting data from a 3-month survey and adapted a Bayesian spatially explicit capture-recapture (SECR) model to estimate spatial lion density in the Maasai Mara National Reserve and surrounding conservancies in Kenya. Our unstructured spatial capture-recapture sampling design incorporated search effort to explicitly estimate detection probability and density on a fine spatial scale, making our approach robust in the context of varying detection probabilities. Overall posterior mean lion density was estimated to be 17.08 (posterior SD 1.310) lions >1 year old/100 km², and the sex ratio was estimated at 2.2 females to 1 male. Our modeling framework and narrow posterior SD demonstrate that SECR methods can produce statistically rigorous and precise estimates of population parameters, and we argue that they should be favored over less reliable abundance indices. Furthermore, our approach is flexible enough to incorporate different data types, which enables robust population estimates over relatively short survey periods in a variety of systems. Trend analyses are essential to guide conservation decisions but are frequently based on surveys of differing reliability. 
We therefore call for a unified framework to assess lion numbers in key populations to improve management and

  4. Reliability data banks

    International Nuclear Information System (INIS)

    Cannon, A.G.; Bendell, A.

    1991-01-01

    Following an introductory chapter on reliability (what it is, why it is needed, and how it is achieved and measured), the principles of reliability data bases and analysis methodologies are the subject of the next two chapters. Achievements due to the development of data banks are mentioned for different industries in the next chapter. FACTS, a comprehensive information system for industrial safety and reliability data collection in process plants, is covered next. CREDO, the Central Reliability Data Organization, is described in the next chapter and is indexed separately, as is the chapter on DANTE, the fabrication reliability data analysis system. Reliability data banks at Electricite de France and the IAEA's experience in compiling a generic component reliability data base are also separately indexed. The European reliability data system, ERDS, and the development of a large data bank come next. The last three chapters look at 'Reliability data banks - friend, foe or a waste of time?' and future developments. (UK)

  5. Reliability, Validity, Comparability and Practical Utility of Cybercrime-Related Data, Metrics, and Information

    Directory of Open Access Journals (Sweden)

    Nir Kshetri

    2013-02-01

    Full Text Available With an increasing pervasiveness, prevalence and severity of cybercrimes, various metrics, measures and statistics have been developed and used to measure various aspects of this phenomenon. Cybercrime-related data, metrics, and information, however, pose important and difficult dilemmas regarding the issues of reliability, validity, comparability and practical utility. While many of the issues of the cybercrime economy are similar to other underground and underworld industries, this economy also has various unique aspects. For one thing, this industry also suffers from a problem partly rooted in the incredibly broad definition of the term “cybercrime”. This article seeks to provide insights and analysis into this phenomenon, which is expected to advance our understanding into cybercrime-related information.

  6. Creation of reliable relevance judgments in information retrieval systems evaluation experimentation through crowdsourcing: a review.

    Science.gov (United States)

    Samimi, Parnia; Ravana, Sri Devi

    2014-01-01

    Test collections are used to evaluate information retrieval systems in laboratory-based evaluation experimentation. In a classic setting, generating relevance judgments involves human assessors and is a costly and time-consuming task. Researchers and practitioners are still challenged to perform reliable and low-cost evaluation of retrieval systems. Crowdsourcing, as a novel method of data acquisition, is broadly used in many research fields. It has been proven that crowdsourcing is an inexpensive and quick solution, as well as a reliable alternative, for creating relevance judgments. One of the applications of crowdsourcing in IR is to judge the relevance of query-document pairs. In order to have a successful crowdsourcing experiment, the relevance judgment tasks should be designed precisely, with emphasis on quality control. This paper explores different factors that influence the accuracy of relevance judgments made by workers, and how to strengthen the reliability of judgments in a crowdsourcing experiment.

  7. Inference on the reliability of Weibull distribution with multiply Type-I censored data

    International Nuclear Information System (INIS)

    Jia, Xiang; Wang, Dong; Jiang, Ping; Guo, Bo

    2016-01-01

    In this paper, we focus on the reliability of Weibull distribution under multiply Type-I censoring, which is a general form of Type-I censoring. In multiply Type-I censoring in this study, all units in the life testing experiment are terminated at different times. Reliability estimation with the maximum likelihood estimate of Weibull parameters is conducted. With the delta method and Fisher information, we propose a confidence interval for reliability and compare it with the bias-corrected and accelerated bootstrap confidence interval. Furthermore, a scenario involving a few expert judgments of reliability is considered. A method is developed to generate extended estimations of reliability according to the original judgments and transform them to estimations of Weibull parameters. With Bayes theory and the Monte Carlo Markov Chain method, a posterior sample is obtained to compute the Bayes estimate and credible interval for reliability. Monte Carlo simulation demonstrates that the proposed confidence interval outperforms the bootstrap one. The Bayes estimate and credible interval for reliability are both satisfactory. Finally, a real example is analyzed to illustrate the application of the proposed methods. - Highlights: • We focus on reliability of Weibull distribution under multiply Type-I censoring. • The proposed confidence interval for the reliability is superior after comparison. • The Bayes estimates with a few expert judgements on reliability are satisfactory. • We specify the cases where the MLEs do not exist and present methods to remedy it. • The distribution of estimate of reliability should be used for accurate estimate.
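
    The delta-method confidence interval for Weibull reliability described above can be sketched as follows. For R(t) = exp(-(t/η)^β), the interval comes from propagating the covariance of the MLEs through the gradient of R; the MLEs and covariance matrix below are made-up illustrative numbers, not values from the paper:

```python
import math

def weibull_reliability_ci(t, beta, eta, cov, z=1.96):
    """Point estimate and delta-method CI for R(t) = exp(-(t/eta)^beta),
    given MLEs (beta, eta) and their 2x2 covariance matrix `cov`."""
    z_t = (t / eta) ** beta
    r = math.exp(-z_t)
    # Gradient of R with respect to (beta, eta).
    dr_dbeta = -r * z_t * math.log(t / eta)
    dr_deta = r * beta * z_t / eta
    var = (dr_dbeta ** 2 * cov[0][0]
           + 2 * dr_dbeta * dr_deta * cov[0][1]
           + dr_deta ** 2 * cov[1][1])
    half = z * math.sqrt(var)
    # Clamp to [0, 1] since R is a probability.
    return r, max(0.0, r - half), min(1.0, r + half)

# Illustrative MLEs and covariance (assumed, not from the paper's example).
r, lo, hi = weibull_reliability_ci(t=50.0, beta=2.0, eta=100.0,
                                   cov=[[0.01, 0.0], [0.0, 25.0]])
```

The paper's point that the normal approximation can misbehave near R = 1 is visible here: without the clamp, the symmetric interval can exceed 1, which is one motivation for the bootstrap and Bayesian alternatives it compares against.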

  8. Reliability assessment based on subjective inferences

    International Nuclear Information System (INIS)

    Ma Zhibo; Zhu Jianshi; Xu Naixin

    2003-01-01

    The reliability information which comes from subjective analysis is often an incomplete prior. This information can generally be assumed to exist in the form of either a stated prior mean of R (reliability) or a stated prior credibility interval on R. An efficient approach is developed to determine a complete beta prior distribution from the subjective information according to the principle of maximum entropy, and the reliability of a survival/failure product is assessed via Bayes theorem. Numerical examples are presented to illustrate the methods.
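
    A minimal sketch of the Bayes update for survival/failure (pass/fail) reliability data. Note an assumption: the paper completes the prior via the maximum-entropy principle, whereas the sketch below simply moment-matches a stated prior mean using an assumed "equivalent sample size", so the prior construction here is an illustrative simplification:

```python
def beta_prior_from_mean(mean, strength=10.0):
    """Turn a stated prior mean of R into a Beta(a, b) prior by fixing
    a / (a + b) = mean with an assumed equivalent sample size `strength`.
    (Simplified stand-in for the paper's maximum-entropy construction.)"""
    return mean * strength, (1.0 - mean) * strength

def update_with_pass_fail(a, b, successes, failures):
    """Conjugate Bayes update for binomial pass/fail reliability data."""
    return a + successes, b + failures

a0, b0 = beta_prior_from_mean(0.9, strength=10.0)   # ~Beta(9, 1)
a1, b1 = update_with_pass_fail(a0, b0, successes=18, failures=2)
posterior_mean = a1 / (a1 + b1)                     # ~Beta(27, 3), mean ~0.9
```

Because the beta family is conjugate to pass/fail data, the update is a one-line parameter shift, which is why a complete beta prior is so convenient once the incomplete subjective information has been turned into one.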

  9. Is the information about dengue available on Brazilian websites of quality and reliable?

    Directory of Open Access Journals (Sweden)

    Thiago Henrique de Lima

    2016-12-01

    Full Text Available The objective of the present study was to identify and evaluate the content of information about dengue available on Brazilian websites. Thirty-two websites were selected for the analysis. For the evaluation of the content of information about dengue, a form was prepared with 16 topics grouped in six information blocks: etiology/transmission, vector, control and prevention, disease/diagnosis, treatment and epidemiology. The websites were also evaluated according to the following criteria: authorship, update, language, interactivity, scientific basis and graphic elements. The results showed a predominant lack of information in relation to the topics analyzed in each information block. Regarding the technical quality of the websites, only 28.1% showed some indication of scientific basis and 34.3% contained the date of publication or of the last update. Such results attest to the low reliability of the selected websites. Knowing that the internet is an efficient mechanism for disseminating information on health topics, we conclude that the creation of mechanisms to disseminate correct and comprehensive information about dengue is necessary in order to apply this useful tool to the prevention and control of the disease in Brazil.

  10. Can Consumers Trust Web-Based Information About Celiac Disease? Accuracy, Comprehensiveness, Transparency, and Readability of Information on the Internet

    Science.gov (United States)

    McNally, Shawna L; Donohue, Michael C; Newton, Kimberly P; Ogletree, Sandra P; Conner, Kristen K; Ingegneri, Sarah E

    2012-01-01

    Background Celiac disease is an autoimmune disease that affects approximately 1% of the US population. Disease is characterized by damage to the small intestinal lining and malabsorption of nutrients. Celiac disease is activated in genetically susceptible individuals by dietary exposure to gluten in wheat and gluten-like proteins in rye and barley. Symptoms are diverse and include gastrointestinal and extraintestinal manifestations. Treatment requires strict adherence to a gluten-free diet. The Internet is a major source of health information about celiac disease. Nonetheless, information about celiac disease that is available on various websites often is questioned by patients and other health care professionals regarding its reliability and content. Objectives To determine the accuracy, comprehensiveness, transparency, and readability of information on 100 of the most widely accessed websites that provide information on celiac disease. Methods Using the search term celiac disease, we analyzed 100 of the top English-language websites published by academic, commercial, nonprofit, and other professional (nonacademic) sources for accuracy, comprehensiveness, transparency, and reading grade level. Each site was assessed independently by 3 reviewers. Website accuracy and comprehensiveness were probed independently using a set of objective core information about celiac disease. We used 19 general criteria to assess website transparency. Website readability was determined by the Flesch-Kincaid reading grade level. Results for each parameter were analyzed independently. In addition, we weighted and combined parameters to generate an overall score, termed website quality. Results We included 98 websites in the final analysis. Of these, 47 (48%) provided specific information about celiac disease that was less than 95% accurate (ie, the predetermined cut-off considered a minimum acceptable level of accuracy). 
Independent of whether the information posted was accurate, 51 of

  11. Can consumers trust web-based information about celiac disease? Accuracy, comprehensiveness, transparency, and readability of information on the internet.

    Science.gov (United States)

    McNally, Shawna L; Donohue, Michael C; Newton, Kimberly P; Ogletree, Sandra P; Conner, Kristen K; Ingegneri, Sarah E; Kagnoff, Martin F

    2012-04-04

    Celiac disease is an autoimmune disease that affects approximately 1% of the US population. Disease is characterized by damage to the small intestinal lining and malabsorption of nutrients. Celiac disease is activated in genetically susceptible individuals by dietary exposure to gluten in wheat and gluten-like proteins in rye and barley. Symptoms are diverse and include gastrointestinal and extraintestinal manifestations. Treatment requires strict adherence to a gluten-free diet. The Internet is a major source of health information about celiac disease. Nonetheless, information about celiac disease that is available on various websites often is questioned by patients and other health care professionals regarding its reliability and content. To determine the accuracy, comprehensiveness, transparency, and readability of information on 100 of the most widely accessed websites that provide information on celiac disease. Using the search term celiac disease, we analyzed 100 of the top English-language websites published by academic, commercial, nonprofit, and other professional (nonacademic) sources for accuracy, comprehensiveness, transparency, and reading grade level. Each site was assessed independently by 3 reviewers. Website accuracy and comprehensiveness were probed independently using a set of objective core information about celiac disease. We used 19 general criteria to assess website transparency. Website readability was determined by the Flesch-Kincaid reading grade level. Results for each parameter were analyzed independently. In addition, we weighted and combined parameters to generate an overall score, termed website quality. We included 98 websites in the final analysis. Of these, 47 (48%) provided specific information about celiac disease that was less than 95% accurate (ie, the predetermined cut-off considered a minimum acceptable level of accuracy). 
Independent of whether the information posted was accurate, 51 of 98 (52%) websites contained less than
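
    The Flesch-Kincaid reading grade level used in both website studies follows a standard formula: grade = 0.39 × (words/sentences) + 11.8 × (syllables/words) − 15.59. The formula is standard; the regexes and the syllable counter below are a rough heuristic of this sketch, so individual word counts may be off by a syllable:

```python
import re

def count_syllables(word):
    """Rough heuristic: count vowel groups, discount a trailing silent 'e'."""
    word = word.lower()
    n = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and not word.endswith(("le", "ee")) and n > 1:
        n -= 1
    return max(1, n)

def flesch_kincaid_grade(text):
    """Flesch-Kincaid grade level of a plain-text passage."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syllables / len(words) - 15.59
```

Short common-word sentences can score below zero; health-literacy guidance typically asks patient-facing text to score around grade 8 or lower, which is the kind of threshold such website audits apply.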

  12. Accurate Modeling of Ionospheric Electromagnetic Fields Generated by a Low Altitude VLF Transmitter

    Science.gov (United States)

    2009-03-31

    AFRL-RV-HA-TR-2009-1055: Accurate Modeling of Ionospheric Electromagnetic Fields Generated by a Low Altitude VLF Transmitter. Final scientific report covering 02-08-2006 to 31-12-2008. ... m (or even 500 m) at mid to high latitudes. At low latitudes, the FDTD model exhibits variations that make it difficult to determine a reliable ...

  13. A Scalable and Reliable Message Transport Service for the ATLAS Trigger and Data Acquisition System

    CERN Document Server

    Kazarov, A; The ATLAS collaboration; Kolos, S; Lehmann Miotto, G; Soloviev, I

    2014-01-01

    The ATLAS Trigger and Data Acquisition (TDAQ) system is a large distributed computing system composed of several thousand interconnected computers and tens of thousands of applications. During a run, TDAQ applications produce many control and information messages at variable rates, addressed to TDAQ operators or to other applications. Reliable, fast and accurate delivery of the messages is important for the functioning of the whole TDAQ system. The Message Transport Service (MTS) provides facilities for the reliable transport, filtering and routing of the messages, based on a publish-subscribe-notify communication pattern with content-based message filtering. During the ongoing LHC shutdown, the MTS was re-implemented, taking into account important requirements such as reliability, scalability and performance, handling of the slow-subscriber case, and simplicity of the design and implementation. MTS uses CORBA middleware, a common layer for the TDAQ infrastructure, and provides sending/subscribing APIs i...

  14. Waste container weighing data processing to create reliable information of household waste generation.

    Science.gov (United States)

    Korhonen, Pirjo; Kaila, Juha

    2015-05-01

    Household mixed waste container weighing data was processed by knowledge discovery and data mining techniques to create reliable information of household waste generation. The final data set included 27,865 weight measurements covering the whole year 2013 and was selected from a database of the Helsinki Region Environmental Services Authority, Finland. The data set contains mixed household waste arising in 6 m³ containers, and it was processed by identifying missing values and inconsistently low and high values as errors. The share of missing values and errors in the data set was 0.6%. This provides evidence that the waste weighing data gives reliable information of mixed waste generation at the collection point level. Mixed household waste arising at the waste collection point level is characterized by wide variation between pickups. The seasonal variation pattern, a result of collective similarities in the behaviour of households, was clearly detected by smoothed medians of the waste weight time series. The evaluation of the collection time series against the defined distribution range of pickup weights at the waste collection point level shows that 65% of the pickups were from collection points with optimally dimensioned container capacity, and collection points with over- and under-dimensioned container capacities were noted in 9.5% and 3.4% of all pickups, respectively. Occasional extra waste in containers occurred in 21.2% of the pickups, indicating the irregular behaviour of individual households. The results of this analysis show that processing waste weighing data using knowledge discovery and data mining techniques provides trustworthy information of household waste generation and its variations. Copyright © 2015 Elsevier Ltd. All rights reserved.
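
The error-screening and median-smoothing steps described above can be sketched with stdlib Python. The plausibility thresholds and window size below are illustrative assumptions, not the values used by the study:

```python
from statistics import median

def flag_errors(weights, low=5.0, high=1500.0):
    """Mark a pickup weight (kg) as an error if it is missing (None)
    or implausibly low/high. Thresholds are assumed for illustration."""
    return [w is None or not (low <= w <= high) for w in weights]

def smoothed_medians(series, window=5):
    """Centred rolling median of a pickup-weight time series, the kind
    of smoothing used to reveal the seasonal variation pattern."""
    half = window // 2
    out = []
    for i in range(len(series)):
        chunk = [v for v in series[max(0, i - half): i + half + 1]
                 if v is not None]
        out.append(median(chunk) if chunk else None)
    return out
```

A single outlier pickup (e.g. a mis-read 500 kg value among ~100 kg pickups) barely moves the rolling median, which is why medians rather than means are used to expose the seasonal pattern.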

  15. Design for reliability information and computer-based systems

    CERN Document Server

    Bauer, Eric

    2010-01-01

    "System reliability, availability and robustness are often not well understood by system architects, engineers and developers. They often don't understand what drives customers' availability expectations, how to frame verifiable availability/robustness requirements, how to manage and budget availability/robustness, how to methodically architect and design systems that meet robustness requirements, and so on. The book takes a very pragmatic approach of framing reliability and robustness as a functional aspect of a system so that architects, designers, developers and testers can address it as a concrete, functional attribute of a system, rather than an abstract, non-functional notion"--Provided by publisher.

  16. AMID: Accurate Magnetic Indoor Localization Using Deep Learning

    Directory of Open Access Journals (Sweden)

    Namkyoung Lee

    2018-05-01

    Full Text Available Geomagnetic-based indoor positioning has drawn great attention from academia and industry due to its advantage of being operable without infrastructure support and its reliable signal characteristics. However, it must overcome the problem of ambiguity that originates from the nature of geomagnetic data. Most studies manage this problem by incorporating particle filters along with inertial sensors. However, they cannot yield reliable positioning results because the inertial sensors in smartphones cannot precisely predict the movement of users. There have been attempts to recognize the magnetic sequence pattern, but these attempts have been proven only in a one-dimensional space, because magnetic intensity fluctuates severely with even a slight change of location. This paper proposes accurate magnetic indoor localization using deep learning (AMID), an indoor positioning system that recognizes magnetic sequence patterns using a deep neural network. Features are extracted from magnetic sequences, and then the deep neural network is used to classify the sequences by patterns that are generated by nearby magnetic landmarks. Locations are estimated by detecting the landmarks. AMID demonstrated that the proposed features and deep learning form an outstanding classifier, revealing the potential of accurate magnetic positioning with smartphone sensors alone. The landmark detection accuracy was over 80% in a two-dimensional environment.

  17. Component reliability for electronic systems

    CERN Document Server

    Bajenescu, Titu-Marius I

    2010-01-01

    The main reason for the premature breakdown of today's electronic products (computers, cars, tools, appliances, etc.) is the failure of the components used to build these products. Today professionals are looking for effective ways to minimize the degradation of electronic components to help ensure longer-lasting, more technically sound products and systems. This practical book offers engineers specific guidance on how to design more reliable components and build more reliable electronic systems. Professionals learn how to optimize a virtual component prototype, accurately monitor product reliability during the entire production process, and add the burn-in and selection procedures that are the most appropriate for the intended applications. Moreover, the book helps system designers ensure that all components are correctly applied, margins are adequate, wear-out failure modes are prevented during the expected duration of life, and system interfaces cannot lead to failure.

  18. Metrological Reliability of Medical Devices

    Science.gov (United States)

    Costa Monteiro, E.; Leon, L. F.

    2015-02-01

    The prominent development of health technologies of the 20th century triggered demands for metrological reliability of physiological measurements comprising physical, chemical and biological quantities, essential to ensure accurate and comparable results of clinical measurements. In the present work, aspects concerning metrological reliability in premarket and postmarket assessments of medical devices are discussed, pointing out challenges to be overcome. In addition, considering the social relevance of the biomeasurements results, Biometrological Principles to be pursued by research and innovation aimed at biomedical applications are proposed, along with the analysis of their contributions to guarantee the innovative health technologies compliance with the main ethical pillars of Bioethics.

  19. Designing reliability information flows

    International Nuclear Information System (INIS)

    Petkova, Valia T.; Lu Yuan; Ion, Roxana A.; Sander, Peter C.

    2005-01-01

    It is well-known [Reliab. Eng. Syst. Saf. 75 (2002) 295] that in modern development processes it is essential to have an information flow structure that facilitates fast feedback from product users (customers) to departments at the front end, in particular development and production. As information is only relevant if it is used when taking decisions, this paper presents a guideline for building field feedback information flows that facilitate decision taking during the product creation and realisation process. The guideline takes into consideration that the type of decisions depends on the span-of-control; therefore, following Parsons [Structure and Process in Modern Societies (1990)], the span-of-control is subdivided into the following three levels: strategic, tactical, and executive. The guideline is illustrated with a case in which it is used for analysing the quality of existing field feedback flows

  20. Minimal information: an urgent need to assess the functional reliability of recombinant proteins used in biological experiments

    Directory of Open Access Journals (Sweden)

    de Marco Ario

    2008-07-01

    Full Text Available Abstract Structural characterization of proteins used in biological experiments is largely neglected. In most publications, the information available is totally insufficient to judge the functionality of the proteins used and, therefore, the significance of identified protein-protein interactions (was the interaction specific, or due to unspecific binding of misfolded protein regions?) or the reliability of kinetic and thermodynamic data (how much protein was in its native form?). As a consequence, the results of single experiments might not only become questionable, but the whole reliability of systems biology, built on these foundations, would be weakened. The introduction of Minimal Information concerning purified proteins to add as metadata to the main body of a manuscript would render straightforward the assessment of their functional and structural qualities and, consequently, of results obtained using these proteins. Furthermore, accepted standards for protein annotation would simplify data comparison and exchange. This article has been envisaged as a proposal for aggregating scientists who share the opinion that the scientific community needs a platform for Minimum Information for Protein Functionality Evaluation (MIPFE).

  1. Neutron logging reliability techniques and apparatus

    International Nuclear Information System (INIS)

    Johnstone, C.W.

    1978-01-01

    This invention relates in general to neutron logging of earth formations, and in particular, to novel apparatus and procedures for determining the validity, or reliability, of data derived at least in part by logging neutron characteristics of earth formations and, if desired, for affording verifiably accurate indications of such data

  2. Neutron logging reliability techniques and apparatus

    International Nuclear Information System (INIS)

    Johnstone, C.W.

    1974-01-01

    This invention relates in general to neutron logging of earth formations, and in particular, to novel apparatus and procedures for determining the validity, or reliability, of data derived at least in part by logging neutron characteristics of earth formations and, if desired, for affording verifiably accurate indications of such data. (author)

  3. Mental models accurately predict emotion transitions.

    Science.gov (United States)

    Thornton, Mark A; Tamir, Diana I

    2017-06-06

    Successful social interactions depend on people's ability to predict others' future actions and emotions. People possess many mechanisms for perceiving others' current emotional states, but how might they use this information to predict others' future states? We hypothesized that people might capitalize on an overlooked aspect of affective experience: current emotions predict future emotions. By attending to regularities in emotion transitions, perceivers might develop accurate mental models of others' emotional dynamics. People could then use these mental models of emotion transitions to predict others' future emotions from currently observable emotions. To test this hypothesis, studies 1-3 used data from three extant experience-sampling datasets to establish the actual rates of emotional transitions. We then collected three parallel datasets in which participants rated the transition likelihoods between the same set of emotions. Participants' ratings of emotion transitions predicted others' experienced transitional likelihoods with high accuracy. Study 4 demonstrated that four conceptual dimensions of mental state representation-valence, social impact, rationality, and human mind-inform participants' mental models. Study 5 used 2 million emotion reports on the Experience Project to replicate both of these findings: again people reported accurate models of emotion transitions, and these models were informed by the same four conceptual dimensions. Importantly, neither these conceptual dimensions nor holistic similarity could fully explain participants' accuracy, suggesting that their mental models contain accurate information about emotion dynamics above and beyond what might be predicted by static emotion knowledge alone.

  4. Mental models accurately predict emotion transitions

    Science.gov (United States)

    Thornton, Mark A.; Tamir, Diana I.

    2017-01-01

    Successful social interactions depend on people’s ability to predict others’ future actions and emotions. People possess many mechanisms for perceiving others’ current emotional states, but how might they use this information to predict others’ future states? We hypothesized that people might capitalize on an overlooked aspect of affective experience: current emotions predict future emotions. By attending to regularities in emotion transitions, perceivers might develop accurate mental models of others’ emotional dynamics. People could then use these mental models of emotion transitions to predict others’ future emotions from currently observable emotions. To test this hypothesis, studies 1–3 used data from three extant experience-sampling datasets to establish the actual rates of emotional transitions. We then collected three parallel datasets in which participants rated the transition likelihoods between the same set of emotions. Participants’ ratings of emotion transitions predicted others’ experienced transitional likelihoods with high accuracy. Study 4 demonstrated that four conceptual dimensions of mental state representation—valence, social impact, rationality, and human mind—inform participants’ mental models. Study 5 used 2 million emotion reports on the Experience Project to replicate both of these findings: again people reported accurate models of emotion transitions, and these models were informed by the same four conceptual dimensions. Importantly, neither these conceptual dimensions nor holistic similarity could fully explain participants’ accuracy, suggesting that their mental models contain accurate information about emotion dynamics above and beyond what might be predicted by static emotion knowledge alone. PMID:28533373
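
The rate-estimation step in studies 1-3 above amounts to tabulating how often each emotion follows another in an experience-sampling sequence. A toy sketch of that step follows; the labels and the tiny sequence are illustrative assumptions, not the authors' datasets or pipeline:

```python
from collections import Counter

def transition_probabilities(sequence):
    """Estimate P(next emotion | current emotion) from a sequence of
    emotion labels by counting adjacent pairs and row-normalizing."""
    pair_counts = Counter(zip(sequence, sequence[1:]))
    totals = Counter(sequence[:-1])  # how often each state is a 'current' state
    return {(a, b): c / totals[a] for (a, b), c in pair_counts.items()}

# Toy experience-sampling sequence (hypothetical labels):
probs = transition_probabilities(
    ["calm", "calm", "anxious", "calm", "anxious", "anxious"])
```

Participants' rated transition likelihoods can then be correlated against such empirical probabilities to quantify the accuracy of their mental models.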

  5. A Method of Nuclear Software Reliability Estimation

    International Nuclear Information System (INIS)

    Park, Gee Yong; Eom, Heung Seop; Cheon, Se Woo; Jang, Seung Cheol

    2011-01-01

    A method for estimating software reliability for nuclear safety software is proposed. This method is based on the software reliability growth model (SRGM), in which the behavior of software failure is assumed to follow a non-homogeneous Poisson process. Several modeling schemes are presented in order to estimate and predict more precisely the number of software defects based on a small amount of software failure data. Bayesian statistical inference is employed to estimate the model parameters by incorporating the software test cases into the model. It is shown that this method is capable of accurately estimating the remaining number of software defects of the on-demand type, which directly affect safety trip functions. The software reliability can be estimated from a model equation, and one method of obtaining the software reliability is proposed
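
A non-homogeneous Poisson process SRGM of the kind the record describes can be illustrated with the classic Goel-Okumoto mean value function. This is a standard textbook model, assumed here for illustration; the paper's own modeling schemes and Bayesian parameter estimation are more elaborate:

```python
import math

def mean_failures(t, a, b):
    """Goel-Okumoto NHPP mean value function m(t) = a(1 - e^{-bt}),
    where a is the total expected number of defects and b is the
    per-defect detection rate."""
    return a * (1.0 - math.exp(-b * t))

def remaining_defects(t, a, b):
    """Expected defects not yet found after test time t."""
    return a - mean_failures(t, a, b)

def reliability(x, t, a, b):
    """Probability of no failure in (t, t+x], given testing up to t:
    R(x|t) = exp(-(m(t+x) - m(t))) under the NHPP assumption."""
    return math.exp(-(mean_failures(t + x, a, b) - mean_failures(t, a, b)))
```

With estimated parameters in hand, the same equations yield both the remaining-defect count and a reliability figure for the next mission interval, which mirrors how an SRGM supports safety assessment.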

  6. Development of a Tablet-based symbol digit modalities test for reliably assessing information processing speed in patients with stroke.

    Science.gov (United States)

    Tung, Li-Chen; Yu, Wan-Hui; Lin, Gong-Hong; Yu, Tzu-Ying; Wu, Chien-Te; Tsai, Chia-Yin; Chou, Willy; Chen, Mei-Hsiang; Hsieh, Ching-Lin

    2016-09-01

    To develop a Tablet-based Symbol Digit Modalities Test (T-SDMT) and to examine the test-retest reliability and concurrent validity of the T-SDMT in patients with stroke. The study had two phases. In the first phase, six experts, nine college students and five outpatients participated in the development and testing of the T-SDMT. In the second phase, 52 outpatients were evaluated twice (2 weeks apart) with the T-SDMT and SDMT to examine the test-retest reliability and concurrent validity of the T-SDMT. The T-SDMT was developed via expert input and college student/patient feedback. Regarding test-retest reliability, the practise effects of the T-SDMT and SDMT were both trivial (d=0.12) but significant (p≦0.015). The improvement in the T-SDMT (4.7%) was smaller than that in the SDMT (5.6%). The minimal detectable changes (MDC%) of the T-SDMT and SDMT were 6.7 (22.8%) and 10.3 (32.8%), respectively. The T-SDMT and SDMT were highly correlated with each other at the two time points (Pearson's r=0.90-0.91). The T-SDMT demonstrated good concurrent validity with the SDMT. Because the T-SDMT had a smaller practise effect and less random measurement error (superior test-retest reliability), it is recommended over the SDMT for assessing information processing speed in patients with stroke. Implications for Rehabilitation The Symbol Digit Modalities Test (SDMT), a common measure of information processing speed, showed a substantial practise effect and considerable random measurement error in patients with stroke. The Tablet-based SDMT (T-SDMT) has been developed to reduce the practise effect and random measurement error of the SDMT in patients with stroke. The T-SDMT had smaller practise effect and random measurement error than the SDMT, which can provide more reliable assessments of information processing speed.
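
The minimal detectable change figures quoted above follow the standard MDC90 construction from the standard error of measurement. A sketch using the textbook formulas; the SD, ICC, and mean values passed in below would come from the study's data and are not reproduced here:

```python
import math

def sem(sd, icc):
    """Standard error of measurement: SEM = SD * sqrt(1 - ICC)."""
    return sd * math.sqrt(1.0 - icc)

def mdc90(sd, icc):
    """Minimal detectable change at 90% confidence:
    MDC90 = 1.645 * sqrt(2) * SEM."""
    return 1.645 * math.sqrt(2.0) * sem(sd, icc)

def mdc_percent(sd, icc, mean_score):
    """MDC90 expressed as a percentage of the mean score (MDC%)."""
    return 100.0 * mdc90(sd, icc) / mean_score
```

A higher test-retest ICC shrinks the SEM and hence the MDC90, which is exactly why the T-SDMT's superior reliability translates into less random measurement error than the paper SDMT.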

  7. Emergency pediatric anesthesia - accessibility of information.

    Science.gov (United States)

    King, Hannah; Pipe, Georgina E M; Linford, Sarah L; Moppett, Iain K; Armstrong, James A M

    2015-03-01

    Emergency pediatric situations are stressful for all involved. Variation in weight, physiology, and anatomy can be substantial, and errors in calculating drugs and fluids can be catastrophic. To evaluate the reliability of information resources that anesthetic trainees might use when faced with common pediatric emergencies. Anesthetic trainees from a single UK deanery were recruited and timed while they identified 18 predetermined pieces of information from three Advanced Pediatric Life Support (APLS) scenarios. The two most popular smartphone applications identified from a previous survey, PaedsED (PaedsED. iED limited, Version 1.0.8, Updated March 2011. ©2009) and Anapaed (AnaPaed. Thierry Girard, Version 1.4.2, Updated Nov 2, 2012. ©Thierry Girard), the British National Formulary for Children (cBNF) and the trainees' inherent knowledge were compared with a local, check-list style, handbook of pediatric emergency algorithms - Pediatric Anesthetic Emergency Data sheets (PAEDs). Twenty anesthetic trainees were recruited. The fastest source of information was the trainees' own knowledge (median 61 s, IQR 51-83 s). Second fastest was PAEDs (80, [59-110] s), followed by PaedsED (84, [65-111]). The most accurate source overall was PaedsED (100, [83-100]) although the accuracy varied between scenarios. The handbook was rated as the most popular resource by the trainees. Although fastest, the trainees' own knowledge is inaccurate, highlighting the need for additional, rapidly accessible, information. Of the two smartphone applications, PaedsED proved to be fast, accurate, and more popular, while Anapaed was accurate but slow to use. The PAEDs handbook, with its checklist-style format, was also fast, accurate and rated the most popular information source. © 2014 John Wiley & Sons Ltd.

  8. Reliability database development and plant performance improvement effort at Korea Hydro and Nuclear Power Co

    International Nuclear Information System (INIS)

    Oh, S. J.; Hwang, S. W.; Na, J. H.; Lim, H. S.

    2008-01-01

    Nuclear utilities in recent years have focused on improved plant performance and equipment reliability. In the U.S., there is a movement toward process integration. Examples are the INPO AP-913 equipment reliability program and the standard nuclear performance model developed by NEI. The synergistic effect of an integrated approach can be far greater than the individual effects of each program. In Korea, PSA for all Korean NPPs (Nuclear Power Plants) has been completed. Plant performance monitoring and improvement is an important goal for KHNP (Korea Hydro and Nuclear Power Company), and a risk monitoring system called RIMS has been developed for all nuclear plants. KHNP is in the process of voluntarily implementing a maintenance rule program similar to that in the U.S. In the future, KHNP would like to expand this effort to an equipment reliability program and to achieve the highest equipment reliability and improved plant performance. For improving equipment reliability, the current trend is moving from corrective maintenance toward preventive/predictive maintenance. With the emphasis on preventive maintenance, the failure cause, operation history, and environment are important. Hence, the development of an accurate reliability database is necessary. Furthermore, the database should be updated regularly and maintained as a living program to reflect the current status of equipment reliability. This paper examines the development of a reliability database system and its application to maintenance optimization and Risk Informed Application (RIA). (authors)

  9. FINANCIAL INFORMATION, EFFECTS OF FINANCIAL INFORMATION ON ECONOMIC DECISION

    Directory of Open Access Journals (Sweden)

    TAK ISA

    2010-12-01

    Full Text Available Financial information has, indisputably, an important effect in economics. To form an effective capital market, financial information must be reliable and accurate. Misleading financial information always has a negative impact on economic decisions taken by users. It is known that financial information, as the cornerstone of financial markets, can improve economic performance in several ways. Nowadays we are facing an economic crisis due to irregularities in the presentation of financial statements to users; such misunderstandings contribute to economic recession. Detection of fraudulent financial information is an important issue facing the auditing profession. Currently, bankruptcies of companies around the world leave millions of people without jobs, and many of these failures are caused by financial information manipulated by companies. The purpose of this paper is to analyze the effects of errors and manipulation committed in the financial information sector on the real economy, to analyze how errors and fraud in financial statements affect the real economy, and to examine the reasons for committing fraud in financial statements. Several suggestions are also included in this study about actions that can be taken to prevent errors and manipulation in financial information.

  10. 2017 NREL Photovoltaic Reliability Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-15

    NREL's Photovoltaic (PV) Reliability Workshop (PVRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology -- both critical goals for moving PV technologies deeper into the electricity marketplace.

  11. Based on Weibull Information Fusion Analysis Semiconductors Quality the Key Technology of Manufacturing Execution Systems Reliability

    Science.gov (United States)

    Huang, Zhi-Hui; Tang, Ying-Chun; Dai, Kai

    2016-05-01

    Semiconductor material and product qualification rates are directly related to manufacturing costs and the survival of the enterprise. A dynamic reliability growth analysis method is applied to study manufacturing execution system reliability growth and thereby improve product quality. Referring to classical Duane model assumptions and the TGP programming model for tracking growth forecasts, a Weibull distribution model was established from the failure data. Combined with the median rank and average rank methods, Weibull information fusion reliability growth curves were fitted by linear regression and least squares estimation. This model overcomes a weakness of the Duane model, namely the low accuracy of its MTBF point estimation; analysis of the failure data shows that the method and the test-and-evaluation modeling process are basically identical. The median rank is used in statistics to determine the distribution function of a random variable, and is a good way to solve problems such as the limited sample sizes of complex systems. Therefore, this method has great engineering application value.
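
The median-rank-plus-least-squares fitting the abstract describes can be sketched as textbook median-rank regression for a two-parameter Weibull distribution. Bernard's approximation for the median ranks is an assumption on my part; the paper may use a different rank variant:

```python
import math

def weibull_median_rank_fit(failure_times):
    """Median-rank regression for a two-parameter Weibull model:
    approximate the i-th median rank with Bernard's formula
    F_i = (i - 0.3)/(n + 0.4), then fit a least-squares line through
    (ln t, ln(-ln(1 - F))) to recover the shape beta and scale eta."""
    t = sorted(failure_times)
    n = len(t)
    xs = [math.log(ti) for ti in t]
    ys = [math.log(-math.log(1.0 - (i + 1 - 0.3) / (n + 0.4)))
          for i in range(n)]
    mx = sum(xs) / n
    my = sum(ys) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
    eta = math.exp(mx - my / beta)  # intercept gives -beta * ln(eta)
    return beta, eta
```

Because the linearization is exact for data lying on a Weibull CDF, the fit recovers known parameters from synthetic failure times, which is a convenient sanity check for this style of estimator.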

  12. Cue reliability and a landmark stability heuristic determine relative weighting between egocentric and allocentric visual information in memory-guided reach.

    Science.gov (United States)

    Byrne, Patrick A; Crawford, J Douglas

    2010-06-01

    It is not known how egocentric visual information (location of a target relative to the self) and allocentric visual information (location of a target relative to external landmarks) are integrated to form reach plans. Based on behavioral data from rodents and humans we hypothesized that the degree of stability in visual landmarks would influence the relative weighting. Furthermore, based on numerous cue-combination studies we hypothesized that the reach system would act like a maximum-likelihood estimator (MLE), where the reliability of both cues determines their relative weighting. To predict how these factors might interact we developed an MLE model that weighs egocentric and allocentric information based on their respective reliabilities, and also on an additional stability heuristic. We tested the predictions of this model in 10 human subjects by manipulating landmark stability and reliability (via variable amplitude vibration of the landmarks and variable amplitude gaze shifts) in three reach-to-touch tasks: an egocentric control (reaching without landmarks), an allocentric control (reaching relative to landmarks), and a cue-conflict task (involving a subtle landmark "shift" during the memory interval). Variability from all three experiments was used to derive parameters for the MLE model, which was then used to simulate egocentric-allocentric weighting in the cue-conflict experiment. As predicted by the model, landmark vibration--despite its lack of influence on pointing variability (and thus allocentric reliability) in the control experiment--had a strong influence on egocentric-allocentric weighting. A reduced model without the stability heuristic was unable to reproduce this effect. These results suggest heuristics for extrinsic cue stability are at least as important as reliability for determining cue weighting in memory-guided reaching.
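
The MLE baseline in the model above weighs each cue by its reliability, i.e. its inverse variance. A minimal sketch of that baseline follows; the landmark-stability heuristic the authors add on top of the MLE weighting is not modeled here:

```python
def mle_combine(x_ego, var_ego, x_allo, var_allo):
    """Maximum-likelihood (inverse-variance) combination of an
    egocentric and an allocentric position estimate. Returns the
    fused estimate, the egocentric weight, and the fused variance,
    which is always smaller than either single-cue variance."""
    w_ego = (1.0 / var_ego) / (1.0 / var_ego + 1.0 / var_allo)
    fused = w_ego * x_ego + (1.0 - w_ego) * x_allo
    fused_var = 1.0 / (1.0 / var_ego + 1.0 / var_allo)
    return fused, w_ego, fused_var
```

Manipulations such as landmark vibration or large gaze shifts change the single-cue variances, and in a pure MLE account that alone should shift the weights; the paper's key finding is that a stability heuristic shifts them further.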

  13. Development of reliability-based safety enhancement technology

    International Nuclear Information System (INIS)

    Kim, Kil Yoo; Han, Sang Hoon; Jang, Seung Cherl

    2002-04-01

    This project aims to develop critical technologies and the necessary reliability DB for maximizing the economics of NPP operation while maintaining safety, using risk (or reliability) information. Toward this goal, firstly, four critical technologies (Risk Informed Tech. Spec. Optimization, Risk Informed Inservice Testing, On-line Maintenance, Maintenance Rule) for RIR and A have been developed. Secondly, KIND (Korea Information System for Nuclear Reliability Data) has been developed. Using KIND, YGN 3,4 and UCN 3,4 component reliability DBs have been established. A reactor trip history DB for all NPPs in Korea has also been developed and analyzed. Finally, a detailed reliability analysis of RPS/ESFAS for KNSP has been performed. With the results of this analysis, a sensitivity analysis has also been performed to optimize the AOT/STI of the tech. spec. A statistical analysis procedure and computer code have been developed for the set point drift analysis

  14. Reliability of Visual and Somatosensory Feedback in Skilled Movement: The Role of the Cerebellum.

    Science.gov (United States)

    Mizelle, J C; Oparah, Alexis; Wheaton, Lewis A

    2016-01-01

    The integration of vision and somatosensation is required to allow for accurate motor behavior. While both sensory systems contribute to an understanding of the state of the body through continuous updating and estimation, how the brain processes unreliable sensory information remains to be fully understood in the context of complex action. Using functional brain imaging, we sought to understand the role of the cerebellum in weighting visual and somatosensory feedback by selectively reducing the reliability of each sense individually during a tool use task. We broadly hypothesized upregulated activation of the sensorimotor and cerebellar areas during movement with reduced visual reliability, and upregulated activation of occipital brain areas during movement with reduced somatosensory reliability. As specifically compared to reduced somatosensory reliability, we expected greater activations of ipsilateral sensorimotor cerebellum for intact visual and somatosensory reliability. Further, we expected that ipsilateral posterior cognitive cerebellum would be affected with reduced visual reliability. We observed that reduced visual reliability results in a trend towards the relative consolidation of sensorimotor activation and an expansion of cerebellar activation. In contrast, reduced somatosensory reliability was characterized by the absence of cerebellar activations and a trend towards the increase of right frontal, left parietofrontal activation, and temporo-occipital areas. Our findings highlight the role of the cerebellum for specific aspects of skillful motor performance. This has relevance to understanding basic aspects of brain functions underlying sensorimotor integration, and provides a greater understanding of cerebellar function in tool use motor control.

  15. Human factor reliability program

    International Nuclear Information System (INIS)

    Knoblochova, L.

    2017-01-01

    The human factor reliability program was introduced at the nuclear power plants of Slovenske elektrarne, a.s. (SE) as one of the components of the Initiatives of Excellent Performance in 2011. The initiative's goal was to increase the reliability of both people and facilities, in response to 3 major areas of improvement: the need to improve results, troubleshooting support, and supporting the achievement of the company's goals. In practice, the human factor reliability program included: tools to prevent human error; managerial observation and coaching; human factor analysis; quick information about events involving the human factor; human reliability timelines and performance indicators; and basic, periodic and extraordinary training in human factor reliability (authors)

  16. A Highly Reliable and Cost-Efficient Multi-Sensor System for Land Vehicle Positioning

    Directory of Open Access Journals (Sweden)

    Xu Li

    2016-05-01

    Full Text Available In this paper, we propose a novel positioning solution for land vehicles which is highly reliable and cost-efficient. The proposed positioning system fuses information from a MEMS-based reduced inertial sensor system (RISS), which consists of one vertical gyroscope and two horizontal accelerometers, a low-cost GPS, and supplementary sensors and sources. First, the pitch and roll angles are accurately estimated based on a vehicle kinematic model. Meanwhile, the negative effect of the uncertain nonlinear drift of the MEMS inertial sensors is eliminated by an H∞ filter. Further, a distributed-dual-H∞ filtering (DDHF) mechanism is adopted to address the uncertain nonlinear drift of the MEMS-RISS and make full use of the supplementary sensors and sources. The DDHF is composed of a main H∞ filter (MHF) and an auxiliary H∞ filter (AHF). Finally, a generalized regression neural network (GRNN) module with good approximation capability is specially designed for the MEMS-RISS. A hybrid methodology which combines the GRNN module and the AHF is utilized to compensate for RISS position errors during GPS outages. To verify the effectiveness of the proposed solution, road-test experiments with various scenarios were performed. The experimental results illustrate that the proposed system can achieve accurate and reliable positioning for land vehicles.
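The GRNN used for outage compensation is, in essence, a one-pass kernel regressor. As a hedged illustration only (the record does not give the paper's inputs, training scheme, or spread parameter), a minimal GRNN predictor can be sketched as:

```python
import math

def grnn_predict(train_x, train_y, query, sigma=1.0):
    """Generalized regression neural network (Nadaraya-Watson kernel
    regression): the prediction is a Gaussian-weighted average of the
    training targets, so no iterative training is required."""
    weights = [math.exp(-sum((xi - qi) ** 2 for xi, qi in zip(row, query))
                        / (2.0 * sigma ** 2)) for row in train_x]
    return sum(w * y for w, y in zip(weights, train_y)) / sum(weights)

# Toy usage: learn a 1-D mapping, then query it as one would during a
# simulated GPS outage (data and spread are purely illustrative).
xs = [[0.0], [1.0], [2.0], [3.0]]
ys = [0.0, 1.0, 4.0, 9.0]
print(round(grnn_predict(xs, ys, [2.0], sigma=0.3), 2))
```

The smoothing parameter sigma controls the bias-variance trade-off; in a navigation setting the inputs would typically be INS-derived quantities and the targets the position errors observed while GPS is still available.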

  17. The reliability, accuracy and minimal detectable difference of a multi-segment kinematic model of the foot-shoe complex.

    Science.gov (United States)

    Bishop, Chris; Paul, Gunther; Thewlis, Dominic

    2013-04-01

    Kinematic models are commonly used to quantify foot and ankle kinematics, yet no marker sets or models have been proven reliable or accurate when wearing shoes. Further, the minimal detectable difference of a developed model is often not reported. We present a kinematic model that is reliable, accurate and sufficiently sensitive to describe the kinematics of the foot-shoe complex and lower leg during walking gait. In order to achieve this, a new marker set was established, consisting of 25 markers applied to the shoe and skin surface, which informed a four-segment kinematic model of the foot-shoe complex and lower leg. Three independent experiments were conducted to determine the reliability, accuracy and minimal detectable difference of the marker set and model. Inter-rater reliability of marker placement on the shoe was good to excellent (ICC=0.75-0.98), indicating that markers could be applied reliably between raters. Intra-rater reliability was better for the experienced rater (ICC=0.68-0.99) than the inexperienced rater (ICC=0.38-0.97). The accuracy of marker placement along each axis was <6.7 mm for all markers studied. Minimal detectable difference (MDD90) thresholds were defined for each joint: tibiocalcaneal joint, MDD90=2.17-9.36°; tarsometatarsal joint, MDD90=1.03-9.29°; and metatarsophalangeal joint, MDD90=1.75-9.12°. The proposed thresholds are specific to the description of shod motion and can be used in future research aimed at comparing different footwear. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. Reliability of neural encoding

    DEFF Research Database (Denmark)

    Alstrøm, Preben; Beierholm, Ulrik; Nielsen, Carsten Dahl

    2002-01-01

    The reliability with which a neuron is able to create the same firing pattern when presented with the same stimulus is of critical importance to the understanding of neuronal information processing. We show that reliability is closely related to the process of phaselocking. Experimental results f...

  19. Component reliability analysis for development of component reliability DB of Korean standard NPPs

    International Nuclear Information System (INIS)

    Choi, S. Y.; Han, S. H.; Kim, S. H.

    2002-01-01

    Reliability data for Korean NPPs that reflect plant-specific characteristics are necessary for PSA and risk-informed applications. We have performed a project to develop a component reliability DB and to calculate component reliability measures such as failure rate and unavailability. We collected the component operation data and failure/repair data of Korean standard NPPs, and analyzed the failure data using a data analysis method developed to suit the domestic data situation. We then compared the reliability results with generic data for foreign NPPs
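As a hedged sketch of the quantities mentioned above (the estimators actually used in the Korean standard NPP study are not reproduced in the record), a constant failure rate and the corresponding steady-state unavailability can be computed as:

```python
def failure_rate(n_failures, operating_hours):
    """Maximum-likelihood point estimate of a constant failure rate
    (failures per hour) from pooled component operating experience."""
    return n_failures / operating_hours

def steady_state_unavailability(rate, mean_repair_hours):
    """Asymptotic unavailability q = lambda*tau / (1 + lambda*tau)
    for an alternating failure/repair (renewal) process."""
    lt = rate * mean_repair_hours
    return lt / (1.0 + lt)

# Illustrative numbers, not taken from the Korean standard NPP data:
lam = failure_rate(3, 1.0e6)                 # 3 failures in 1e6 component-hours
q = steady_state_unavailability(lam, 24.0)   # 24 h mean time to repair
print(f"{lam:.1e} /h, unavailability {q:.1e}")
```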

  20. Efficient Estimation of Extreme Non-linear Roll Motions using the First-order Reliability Method (FORM)

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher

    2007-01-01

    In on-board decision support systems efficient procedures are needed for real-time estimation of the maximum ship responses to be expected within the next few hours, given on-line information on the sea state and user defined ranges of possible headings and speeds. For linear responses standard...... frequency domain methods can be applied. To non-linear responses like the roll motion, standard methods like direct time domain simulations are not feasible due to the required computational time. However, the statistical distribution of non-linear ship responses can be estimated very accurately using...... the first-order reliability method (FORM), well-known from structural reliability problems. To illustrate the proposed procedure, the roll motion is modelled by a simplified non-linear procedure taking into account non-linear hydrodynamic damping, time-varying restoring and wave excitation moments...
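FORM as applied in this paper handles a non-linear roll-motion model; as a much simpler hedged illustration of the underlying idea, the Hasofer-Lind reliability index for a linear limit state g = R − S with independent normal variables (a special case where FORM is exact) can be sketched as:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the complementary error function."""
    return 0.5 * math.erfc(-x / math.sqrt(2.0))

def form_linear(mu_r, sig_r, mu_s, sig_s):
    """FORM for the linear limit state g = R - S with independent
    normal capacity R and demand S. Here the Hasofer-Lind index is
    exact: beta = (mu_R - mu_S) / sqrt(sig_R^2 + sig_S^2), and the
    failure probability is Pf = Phi(-beta)."""
    beta = (mu_r - mu_s) / math.hypot(sig_r, sig_s)
    return beta, norm_cdf(-beta)

# Illustrative capacity/demand statistics (not the paper's roll model):
beta, pf = form_linear(mu_r=10.0, sig_r=1.0, mu_s=6.0, sig_s=1.5)
print(round(beta, 3), f"{pf:.2e}")
```

For non-linear limit states the design point must instead be found iteratively in standard normal space, which is what makes FORM attractive for the roll-motion statistics described above.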

  1. Accurate guitar tuning by cochlear implant musicians.

    Directory of Open Access Journals (Sweden)

    Thomas Lu

    Full Text Available Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show the unexpected result that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with his CI than with his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that the tuning error was larger, at ∼30 Hz, for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between the CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task.

  2. Fast and accurate methods for phylogenomic analyses

    Directory of Open Access Journals (Sweden)

    Warnow Tandy

    2011-10-01

    Full Text Available Abstract Background Species phylogenies are not estimated directly, but rather through phylogenetic analyses of different gene datasets. However, true gene trees can differ from the true species tree (and hence from one another) due to biological processes such as horizontal gene transfer, incomplete lineage sorting, and gene duplication and loss, so that no single gene tree is a reliable estimate of the species tree. Several methods have been developed to estimate species trees from estimated gene trees, differing according to the specific algorithmic technique used and the biological model used to explain differences between species and gene trees. Relatively little is known about the relative performance of these methods. Results We report on a study evaluating several different methods for estimating species trees from sequence datasets, simulating sequence evolution under a complex model including indels (insertions and deletions), substitutions, and incomplete lineage sorting. The most important finding of our study is that some fast and simple methods are nearly as accurate as the most accurate methods, which employ sophisticated statistical methods and are computationally quite intensive. We also observe that methods that explicitly consider errors in the estimated gene trees produce more accurate trees than methods that assume the estimated gene trees are correct. Conclusions Our study shows that highly accurate estimations of species trees are achievable, even when gene trees differ from each other and from the species tree, and that these estimations can be obtained using fairly simple and computationally tractable methods.

  3. Monitoring nutritional status accurately and reliably in adolescents with anorexia nervosa.

    Science.gov (United States)

    Martin, Andrew C; Pascoe, Elaine M; Forbes, David A

    2009-01-01

    Accurate assessment of nutritional status is a vital aspect of caring for individuals with anorexia nervosa (AN), and body mass index (BMI) is considered an appropriate and easy-to-use tool. Because of the intense fear of weight gain, some individuals may attempt to mislead the physician. Mid-upper arm circumference (MUAC) is a simple, objective method of assessing nutritional status. The setting is an eating disorders clinic in a tertiary paediatric hospital in Western Australia. The aim of this study is to evaluate how well MUAC correlates with BMI in adolescents with AN. Prospective observational study to evaluate nutritional status in adolescents with AN. Fifty-five adolescents aged 12-17 years with AN were assessed between January 1, 2004 and January 1, 2006. MUAC was highly correlated with BMI (r = 0.79), and adolescents with MUAC ≥ 20 cm rarely required hospitalisation (negative predictive value 93%). MUAC reflects nutritional status as defined by BMI in adolescents with AN. Lack of consistency between longitudinal measurements of BMI and MUAC should be viewed suspiciously and prompt a more detailed nutritional assessment.
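The two screening statistics reported above (a correlation and a negative predictive value) are simple to reproduce. A minimal sketch, using hypothetical counts rather than the study's data:

```python
def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

def negative_predictive_value(tn, fn):
    """NPV = TN / (TN + FN): how often a negative screen (here, a
    MUAC above the cut-off) correctly rules out the outcome."""
    return tn / (tn + fn)

# Hypothetical 2x2 counts, chosen only to illustrate the calculation:
print(round(negative_predictive_value(tn=26, fn=2), 2))
```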

  4. On-Line Self-Calibrating Single Crystal Sapphire Optical Sensor Instrumentation for Accurate and Reliable Coal Gasifier Temperature Measurement

    Energy Technology Data Exchange (ETDEWEB)

    Kristie Cooper; Gary Pickrell; Anbo Wang

    2005-11-01

    This report summarizes technical progress from April to September 2005 on the Phase II program ''On-Line Self-Calibrating Single Crystal Sapphire Optical Sensor Instrumentation for Accurate and Reliable Coal Gasifier Temperature Measurement'', funded by the Federal Energy Technology Center of the U.S. Department of Energy, and performed by the Center for Photonics Technology of the Bradley Department of Electrical and Computer Engineering at Virginia Tech. The outcome of the first phase of this program was the selection of broadband polarimetric differential interferometry (BPDI) for further prototype instrumentation development. This approach is based on the measurement of the optical path difference (OPD) between two orthogonally polarized light beams in a single-crystal sapphire disk. The objective of this program is to bring the sensor technology, which has already been demonstrated in the laboratory, to a level where the sensor can be deployed in harsh industrial environments and will become commercially viable. Due to the difficulties described in the last report, field testing of the BPDI system has not continued to date. However, we have developed an alternative high temperature sensing solution, which is described in this report. The sensing system will be installed and tested at TECO's Polk Power Station. Following a site visit in June 2005, our efforts have been focused on preparing for that field test, including the design of the sensor mechanical packaging, the sensor electronics, the data transfer module, and the necessary software to accommodate this application. We are currently ready to start sensor fabrication.

  5. Reliability and safety engineering

    CERN Document Server

    Verma, Ajit Kumar; Karanki, Durga Rao

    2016-01-01

    Reliability and safety are core issues that must be addressed throughout the life cycle of engineering systems. Reliability and Safety Engineering presents an overview of the basic concepts, together with simple and practical illustrations. The authors present reliability terminology in various engineering fields, viz., electronics engineering, software engineering, mechanical engineering, structural engineering and power systems engineering. The book describes the latest applications in the area of probabilistic safety assessment, such as technical specification optimization, risk monitoring and risk-informed in-service inspection. Reliability and safety studies must, inevitably, deal with uncertainty, so the book includes uncertainty propagation methods: Monte Carlo simulation, fuzzy arithmetic, Dempster-Shafer theory and probability bounds. Reliability and Safety Engineering also highlights advances in system reliability and safety assessment including dynamic system modeling and uncertainty management. Cas...

  6. Sequential decision reliability concept and failure rate assessment

    International Nuclear Information System (INIS)

    Ciftcioglu, O.

    1990-11-01

    Conventionally, a reliability concept is considered for each basic unit as well as for their integration in a complicated large-scale system such as a nuclear power plant (NPP). Basically, as the plant's operational status is determined by the information obtained from various sensors, the plant's reliability and risk assessment are closely related to the reliability of the sensory information and hence of the sensor components. However, considering the relevant information-processing systems, e.g. fault detection processors, there exists a further question about the reliability of such systems, specifically the reliability of the systems' decision-based outcomes by means of which further actions are performed. To this end, a general sequential decision reliability concept and a failure rate assessment methodology are introduced. The implications of the methodology are investigated and the importance of the decision reliability concept in system operation is demonstrated by means of real-time sensory signals from the Borssele NPP in the Netherlands. (author). 21 refs.; 8 figs

  7. Reliability data book

    International Nuclear Information System (INIS)

    Bento, J.P.; Boerje, S.; Ericsson, G.; Hasler, A.; Lyden, C.O.; Wallin, L.; Poern, K.; Aakerlund, O.

    1985-01-01

    The main objective for the report is to improve failure data for reliability calculations as parts of safety analyses for Swedish nuclear power plants. The work is based primarily on evaluations of failure reports as well as information provided by the operation and maintenance staff of each plant. In the report are presented charts of reliability data for: pumps, valves, control rods/rod drives, electrical components, and instruments. (L.E.)

  8. The quadrant method measuring four points is as reliable and accurate as the quadrant method in the evaluation after anatomical double-bundle ACL reconstruction.

    Science.gov (United States)

    Mochizuki, Yuta; Kaneko, Takao; Kawahara, Keisuke; Toyoda, Shinya; Kono, Norihiko; Hada, Masaru; Ikegami, Hiroyasu; Musha, Yoshiro

    2017-11-20

    The quadrant method was described by Bernard et al. and has been widely used for postoperative evaluation of anterior cruciate ligament (ACL) reconstruction. The purpose of this research was to further develop the quadrant method by measuring four points, which we named the four-point quadrant method, and to compare it with the quadrant method. Three-dimensional computed tomography (3D-CT) analyses were performed in 25 patients who underwent double-bundle ACL reconstruction using the outside-in technique. The four points in this study's quadrant method were defined as point 1 (highest), point 2 (deepest), point 3 (lowest), and point 4 (shallowest) in the femoral tunnel position. The depth and height values at each point were measured. The antero-medial (AM) tunnel is (depth1, height2) and the postero-lateral (PL) tunnel is (depth3, height4) in this four-point quadrant method. The 3D-CT images were evaluated independently by 2 orthopaedic surgeons. A second measurement was performed by both observers after a 4-week interval. Intra- and inter-observer reliability was calculated by means of the intra-class correlation coefficient (ICC). The accuracy of the method was also evaluated against the quadrant method. Intra-observer reliability was almost perfect for both the AM and PL tunnels (ICC > 0.81). Inter-observer reliability of the AM tunnel was substantial (ICC > 0.61) and that of the PL tunnel was almost perfect (ICC > 0.81). The AM tunnel position was 0.13% deep, 0.58% high and the PL tunnel position was 0.01% shallow, 0.13% low compared to the quadrant method. The four-point quadrant method was found to have high intra- and inter-observer reliability and accuracy. This method can evaluate the tunnel position regardless of the shape and morphology of the bone tunnel aperture and can provide measurements that can be compared across various reconstruction methods. The four-point quadrant method of this study is considered to have clinical relevance in that it is a detailed and accurate tool for
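The intra-class correlation coefficients reported above quantify rater agreement. As a hedged sketch (the record does not state which ICC form the study used; a one-way random-effects ICC(1,1) is assumed here for simplicity):

```python
def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) from ratings[target][rater]:
    (MSB - MSW) / (MSB + (k - 1) * MSW), where MSB and MSW are the
    between- and within-target mean squares."""
    n, k = len(ratings), len(ratings[0])
    grand = sum(sum(row) for row in ratings) / (n * k)
    means = [sum(row) / k for row in ratings]
    msb = k * sum((m - grand) ** 2 for m in means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(ratings, means) for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Toy data: 4 tunnel-position measurements, each scored twice.
scores = [[9.0, 10.0], [6.0, 7.0], [8.0, 8.0], [2.0, 3.0]]
print(round(icc_oneway(scores), 3))
```

Values above 0.81 are conventionally read as "almost perfect" agreement, the threshold quoted in the abstract.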

  9. Structured information analysis for human reliability analysis of emergency tasks in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kim, Jae Whan; Park, Jin Kyun; Ha, Jae Joo [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-02-01

    More than twenty HRA (Human Reliability Analysis) methodologies have been developed and used for the safety analysis in nuclear field during the past two decades. However, no methodology appears to have universally been accepted, as various limitations have been raised for more widely used ones. One of the most important limitations of conventional HRA is insufficient analysis of the task structure and problem space. To resolve this problem, we suggest SIA (Structured Information Analysis) for HRA. The proposed SIA consists of three parts. The first part is the scenario analysis that investigates the contextual information related to the given task on the basis of selected scenarios. The second is the goals-means analysis to define the relations between the cognitive goal and task steps. The third is the cognitive function analysis module that identifies the cognitive patterns and information flows involved in the task. Through the three-part analysis, systematic investigation is made possible from the macroscopic information on the tasks to the microscopic information on the specific cognitive processes. It is expected that analysts can attain a structured set of information that helps to predict the types and possibility of human error in the given task. 48 refs., 12 figs., 11 tabs. (Author)

  10. Fast, accurate, and reliable molecular docking with QuickVina 2.

    Science.gov (United States)

    Alhossary, Amr; Handoko, Stephanus Daniel; Mu, Yuguang; Kwoh, Chee-Keong

    2015-07-01

    The need for efficient molecular docking tools for high-throughput screening is growing alongside the rapid growth of drug-fragment databases. AutoDock Vina ('Vina') is a widely used docking tool with parallelization for speed. QuickVina ('QVina 1') then further enhanced the speed via heuristics, but required high exhaustiveness; with low exhaustiveness, its accuracy was compromised. We present in this article the latest version of QuickVina ('QVina 2'), which inherits both the speed of QVina 1 and the reliability of the original Vina. We tested the efficacy of QVina 2 on the core set of PDBbind 2014. With the default exhaustiveness level of Vina (i.e. 8), a maximum of 20.49-fold and an average of 2.30-fold acceleration, with a correlation coefficient of 0.967 for the first mode and 0.911 for the sum of all modes, was attained over the original Vina. A tendency for higher acceleration with an increased number of rotatable bonds as the design variables was observed. On accuracy, Vina wins over QVina 2 on 30% of the data, with an average energy difference of only 0.58 kcal/mol. On the same dataset, GOLD produced RMSD smaller than 2 Å on 56.9% of the data while QVina 2 attained 63.1%. The C++ source code of QVina 2 is available at www.qvina.org. aalhossary@pmail.ntu.edu.sg Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
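The 2 Å success criterion used above is a pose RMSD. A minimal sketch of that metric, ignoring the symmetry corrections a real docking evaluation would apply:

```python
import math

def pose_rmsd(coords_a, coords_b):
    """Root-mean-square deviation between two equally ordered atom
    coordinate lists, in the same units as the input (typically Å)."""
    assert len(coords_a) == len(coords_b)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return math.sqrt(sq / len(coords_a))

# A rigid 1 Å shift of a two-atom pose gives an RMSD of exactly 1.0,
# comfortably inside the 2 Å success criterion.
ref  = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
pose = [(0.0, 0.0, 1.0), (1.0, 0.0, 1.0)]
print(pose_rmsd(ref, pose))
```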

  11. Human reliability

    International Nuclear Information System (INIS)

    Bubb, H.

    1992-01-01

    This book resulted from the activity of Task Force 4.2 - 'Human Reliability'. This group was established on February 27th, 1986, at the plenary meeting of the Technical Reliability Committee of VDI, within the framework of the joint committee of VDI on industrial systems technology - GIS. It is composed of representatives of industry, representatives of research institutes, of technical control boards and universities, whose job it is to study how man fits into the technical side of the world of work and to optimize this interaction. In a total of 17 sessions, information from the part of ergonomy dealing with human reliability in using technical systems at work was exchanged, and different methods for its evaluation were examined and analyzed. The outcome of this work was systematized and compiled in this book. (orig.) [de

  12. An examination of reliability critical items in liquid metal reactors: An analysis by the Centralized Reliability Data Organization (CREDO)

    International Nuclear Information System (INIS)

    Humphrys, B.L.; Haire, M.J.; Koger, K.H.; Manneschmidt, J.F.; Setoguchi, K.; Nakai, R.; Okubo, Y.

    1987-01-01

    The Centralized Reliability Data Organization (CREDO) is the largest repository of liquid metal reactor (LMR) component reliability data in the world. It is jointly sponsored by the US Department of Energy (DOE) and the Power Reactor and Nuclear Fuel Development Corporation (PNC) of Japan. The CREDO data base contains information on a population of more than 21,000 components and approximately 1300 event records. A conservative estimate is that total component operating hours are approaching 3.5 billion. Because data gathering for CREDO concentrates on event (failure) information, the work reported here focuses on the reliability information contained in CREDO and the development of reliability-critical items lists. That is, components are ranked in prioritized lists from worst to best performers from a reliability standpoint. For the data contained in the CREDO data base, FFTF and JOYO show reliability growth; EBR-II shows a slight decline in reliability for those components tracked by CREDO. However, tabulations of events which cause reactor shutdowns decrease with time at each site

  13. Models on reliability of non-destructive testing

    International Nuclear Information System (INIS)

    Simola, K.; Pulkkinen, U.

    1998-01-01

    The reliability of ultrasonic inspections has been studied in, e.g., the international PISC (Programme for the Inspection of Steel Components) exercises. These exercises have produced a large amount of information on the effect of various factors on the reliability of inspections. The information obtained from reliability experiments is used to model the dependency of flaw detection probability on various factors and to evaluate the performance of inspection equipment, including sizing accuracy. The information from experiments is utilised most effectively when mathematical models are applied. Here, some statistical models for the reliability of non-destructive tests are introduced. In order to demonstrate the use of inspection reliability models, they have been applied to the inspection results for intergranular stress corrosion cracking (IGSCC) type flaws in the PISC III exercise (PISC 1995). The models are applied both to the flaw detection frequency data of all inspection teams and to the flaw sizing data of one participating team. (author)
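A common way to model the flaw-detection probabilities discussed above is the log-odds POD curve, logit(POD) = b0 + b1·ln(a), with a the flaw size. As a hedged sketch (a crude binned least-squares fit is used here in place of the maximum-likelihood hit/miss fit a real analysis would prefer):

```python
import math

def fit_pod_logodds(bins):
    """Fit the log-odds POD model logit(POD) = b0 + b1*ln(a) by least
    squares on binned hit/miss data, where each bin is
    (flaw_size, hits, trials)."""
    pts = []
    for size, hits, trials in bins:
        p = min(max(hits / trials, 0.01), 0.99)  # clip to keep logit finite
        pts.append((math.log(size), math.log(p / (1.0 - p))))
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    b1 = (sum((x - mx) * (y - my) for x, y in pts)
          / sum((x - mx) ** 2 for x, _ in pts))
    return my - b1 * mx, b1

def pod(size, b0, b1):
    """Probability of detection for a flaw of the given size."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * math.log(size))))

# Illustrative hit/miss counts (not PISC data): detection improves
# with flaw size, and the fitted curve should reflect that.
bins = [(1.0, 1, 10), (2.0, 3, 10), (4.0, 7, 10), (8.0, 9, 10)]
b0, b1 = fit_pod_logodds(bins)
print(round(pod(4.0, b0, b1), 2))
```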

  14. Models of Information Security for Highly Reliable Computing Systems

    Directory of Open Access Journals (Sweden)

    Vsevolod Ozirisovich Chukanov

    2016-03-01

    Full Text Available Methods of combined redundancy are considered. Reliability models of systems that account for the restoration (repair) and preventive-maintenance parameters of system blocks are described. Expressions are given for the average number of preventive-maintenance actions and for the availability factor of the system blocks.
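The availability factor of a block can be sketched with the standard steady-state formulas; the treatment of preventive maintenance below is a simplified assumption for illustration, not the model of the paper:

```python
def availability(mtbf, mttr):
    """Steady-state availability of a repairable block:
    A = MTBF / (MTBF + MTTR)."""
    return mtbf / (mtbf + mttr)

def availability_with_pm(mtbf, mttr, pm_interval, pm_duration):
    """Availability when, in addition to unplanned repairs, planned
    preventive maintenance of length pm_duration is performed every
    pm_interval operating hours (a simplified renewal-type model)."""
    downtime_per_uptime_hour = mttr / mtbf + pm_duration / pm_interval
    return 1.0 / (1.0 + downtime_per_uptime_hour)

print(round(availability(1000.0, 10.0), 4))              # repair only
print(round(availability_with_pm(1000.0, 10.0, 500.0, 5.0), 4))
```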

  15. Are YouTube videos accurate and reliable on basic life support and cardiopulmonary resuscitation?

    Science.gov (United States)

    Yaylaci, Serpil; Serinken, Mustafa; Eken, Cenker; Karcioglu, Ozgur; Yilmaz, Atakan; Elicabuk, Hayri; Dal, Onur

    2014-10-01

    The objective of this study was to investigate the reliability and accuracy of the information in YouTube videos related to CPR and BLS against the 2010 CPR guidelines. YouTube was queried using four search terms, 'CPR', 'cardiopulmonary resuscitation', 'BLS' and 'basic life support', between 2011 and 2013. The sources that uploaded the videos, the recording time, the number of viewers in the study period, and the inclusion of humans or manikins were recorded. The videos were rated according to whether or not they displayed the correct order of resuscitative efforts in full accord with the 2010 CPR guidelines. Two hundred and nine videos meeting the inclusion criteria comprised the study sample subjected to the analysis. The median score of the videos was 5 (IQR: 3.5-6). Only 11.5% (n = 24) of the videos were found to be compatible with the 2010 CPR guidelines with regard to the sequence of interventions. Videos uploaded by guideline bodies had significantly higher rates of download when compared with videos uploaded by other sources. The source of a video and its date of upload (year) were not shown to have any significant effect on the scores received (P = 0.615 and 0.513, respectively). The videos' number of downloads did not differ according to whether the videos were compatible with the guidelines (P = 0.832). Videos downloaded more than 10,000 times had a higher score than the others (P = 0.001). The majority of YouTube video clips purporting to be about CPR are not relevant educational material. Of those that are focused on teaching CPR, only a small minority optimally meet the 2010 Resuscitation Guidelines. © 2014 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.

  16. Reliability demonstration test planning: A three dimensional consideration

    International Nuclear Information System (INIS)

    Yadav, Om Prakash; Singh, Nanua; Goel, Parveen S.

    2006-01-01

    Increasing customer demand for reliability, fierce market competition on time-to-market and cost, and highly reliable products are making reliability testing a more challenging task. This paper presents a systematic approach for identifying critical elements (subsystems and components) of a system and deciding the types of test to be performed to demonstrate reliability. It decomposes the system into three dimensions (physical, functional and time) and identifies critical elements in the design by allocating system-level reliability to each candidate. The decomposition of system-level reliability is achieved by using a criticality index. The numerical value of the criticality index for each candidate is derived from the information available in the failure mode and effects analysis (FMEA) document or from warranty data for a prior system. This information is then used to develop a reliability demonstration test plan for the identified (critical) failure mechanisms and physical elements. The paper also highlights the benefits of using prior information to locate critical spots in the design and in the subsequent development of test plans. A case example is presented to demonstrate the proposed approach
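Decomposing a system-level reliability target via criticality indices can be sketched in the ARINC style, where elements with a larger expected share of system failures receive a proportionally tougher (lower) target. The weights and target below are illustrative, not the paper's:

```python
def allocate_reliability(r_system, criticality):
    """ARINC-style apportionment of a series-system reliability target:
    each element gets R_i = R_sys ** w_i with normalized weights w_i,
    so the product of the element targets equals the system target.
    Higher criticality means a larger weight, hence a lower target."""
    total = sum(criticality.values())
    return {name: r_system ** (c / total) for name, c in criticality.items()}

# Illustrative weights: the pump is judged three times as critical.
targets = allocate_reliability(0.95, {"pump": 3.0, "valve": 1.0, "sensor": 1.0})
print({k: round(v, 4) for k, v in targets.items()})
```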

  17. Reference gene identification for reliable normalisation of quantitative RT-PCR data in Setaria viridis.

    Science.gov (United States)

    Nguyen, Duc Quan; Eamens, Andrew L; Grof, Christopher P L

    2018-01-01

    Quantitative real-time polymerase chain reaction (RT-qPCR) is the key platform for the quantitative analysis of gene expression in a wide range of experimental systems and conditions. However, the accuracy and reproducibility of gene expression quantification via RT-qPCR is entirely dependent on the identification of reliable reference genes for data normalisation. Green foxtail (Setaria viridis) has recently been proposed as a potential experimental model for the study of C4 photosynthesis and is closely related to many economically important crop species of the Panicoideae subfamily of grasses, including Zea mays (maize), Sorghum bicolor (sorghum) and Saccharum officinarum (sugarcane). Setaria viridis (Accession 10) possesses a number of key traits as an experimental model, namely: (i) a small, sequenced and well annotated genome; (ii) short stature and generation time; (iii) prolific seed production; and (iv) amenability to Agrobacterium tumefaciens-mediated transformation. There is currently, however, a lack of reference gene expression information for Setaria viridis (S. viridis). We therefore aimed to identify a cohort of suitable S. viridis reference genes for accurate and reliable normalisation of S. viridis RT-qPCR expression data. Eleven putative candidate reference genes were identified and examined across thirteen different S. viridis tissues. Of these, the geNorm and NormFinder analysis software identified SERINE/THREONINE-PROTEIN PHOSPHATASE 2A (PP2A), 5'-ADENYLYLSULFATE REDUCTASE 6 (ASPR6) and DUAL SPECIFICITY PHOSPHATASE (DUSP) as the most suitable combination of reference genes for the accurate and reliable normalisation of S. viridis RT-qPCR expression data. To demonstrate the suitability of the three selected reference genes, PP2A, ASPR6 and DUSP were used to normalise the expression of CINNAMYL ALCOHOL DEHYDROGENASE (CAD) genes across the same tissues. This approach readily demonstrated the suitability of the three
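The geNorm algorithm used above ranks candidate reference genes by the average pairwise variation of their expression ratios. A minimal sketch of that stability measure on hypothetical expression values (not the S. viridis data):

```python
import math
import statistics

def stability_m(expr):
    """geNorm-style stability M: for each gene, the mean, over all
    other genes, of the standard deviation of the log2 expression
    ratios across samples. Lower M means a more stable reference."""
    m = {}
    for g in expr:
        sds = [statistics.stdev(math.log2(a / b)
                                for a, b in zip(expr[g], expr[h]))
               for h in expr if h != g]
        m[g] = sum(sds) / len(sds)
    return m

# Hypothetical normalized expression across three tissues:
expr = {"geneA": [1.0, 1.1, 0.9],
        "geneB": [2.0, 2.2, 1.8],   # perfectly co-varies with geneA
        "geneC": [1.0, 3.0, 0.2]}   # unstable candidate
m = stability_m(expr)
print(min(m, key=m.get))
```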

  18. 2015 NREL Photovoltaic Module Reliability Workshops

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-14

    NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology--both critical goals for moving PV technologies deeper into the electricity marketplace.

  19. 2016 NREL Photovoltaic Module Reliability Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-07

    NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology - both critical goals for moving PV technologies deeper into the electricity marketplace.

  20. Revisiting the online health information reliability debate in the wake of "web 2.0": an inter-disciplinary literature and website review.

    Science.gov (United States)

    Adams, Samantha A

    2010-06-01

    The purpose of this inter-disciplinary literature review was to explore renewed concerns about the reliability of online health information in light of the increasing popularity of web applications that enable more end-user-generated content ("web 2.0"). The findings are based on a literature and web review. Literature was collected at four different points between October 2006 and October 2008 and included 56 sources from 10 academic disciplines. The web review consisted of following 6 blogs (including both new and archived posts, with comments) and one wiki for a period of 1.5 months and assessing the content for relevancy on six points, for a total of 63 sources. The reliability issues that are identified with respect to "web 2.0" reiterate more general concerns expressed about the web over the last 15 years. The difference, however, lies in the scope and scale of potential problems. Social scientists have also pointed to new issues that can be especially relevant for the use of web 2.0 applications in health care. Specific points of renewed concern include: disclosure of authorship and information quality, anonymity and privacy, and the ability of individuals to apply information to their personal situation. Whether or not end-users understand what social scientists call "negative network externalities" is a new concern. Finally, not all reliability issues are negative: social networking and the shift from text-based information to symbolic information, images or interactive information are considered to enhance patient education and to provide opportunities to reach diverse groups of patients. Interactive and collaborative web applications undeniably offer new opportunities for reaching patients and other health care consumers by facilitating lay information creation, sharing and retrieval. However, researchers must be careful and critical when incorporating applications or practices from other fields in health care. We must not easily dismiss concerns about

  1. Accurate overlaying for mobile augmented reality

    NARCIS (Netherlands)

    Pasman, W; van der Schaaf, A; Lagendijk, RL; Jansen, F.W.

    1999-01-01

    Mobile augmented reality requires accurate alignment of virtual information with objects visible in the real world. We describe a system for mobile communications to be developed to meet these strict alignment criteria using a combination of computer vision, inertial tracking and low-latency

  2. Reliable retrieval of atmospheric and aquatic parameters in coastal and inland environments from polar-orbiting and geostationary platforms: challenges and opportunities

    Science.gov (United States)

    Stamnes, Knut; Li, Wei; Lin, Zhenyi; Fan, Yongzhen; Chen, Nan; Gatebe, Charles; Ahn, Jae-Hyun; Kim, Wonkook; Stamnes, Jakob J.

    2017-04-01

    Simultaneous retrieval of aerosol and surface properties by means of inverse techniques based on a coupled atmosphere-surface radiative transfer model, neural networks, and optimal estimation can yield considerable improvements in retrieval accuracy in complex aquatic environments compared with traditional methods. Remote sensing of such environments represents specific challenges due to (i) the complexity of the atmosphere and water inherent optical properties, (ii) unique bidirectional dependencies of the water-leaving radiance, and (iii) the desire to do retrievals for large solar zenith and viewing angles. We will discuss (a) how challenges related to atmospheric gaseous absorption, absorbing aerosols, and turbid waters can be addressed by using a coupled atmosphere-surface radiative transfer (forward) model in the retrieval process, (b) how the need to correct for bidirectional effects can be accommodated in a systematic and reliable manner, (c) how polarization information can be utilized, (d) how the curvature of the atmosphere can be taken into account, and (e) how neural networks and optimal estimation can be used to obtain fast yet accurate retrievals. Special emphasis will be placed on how information from existing and future sensors deployed on polar-orbiting and geostationary platforms can be obtained in a reliable and accurate manner. The need to provide uncertainty assessments and error budgets will also be discussed.

  3. Management Control for Reliable Financial Information

    Directory of Open Access Journals (Sweden)

    Victoria María Antonieta Martín Granados

    2010-06-01

    Full Text Available Financial information is the document through which the administration of a juridical entity reports its financial situation. Financial information is useful and reliable for its users when it has been prepared under conditions of certainty. That certainty is provided by the administration when it establishes internal control policies and procedures, as well as surveillance of compliance with the internal control. This control bears directly on the financial information, since it is inherent to the operating flow and results in information that is relevant, truthful and comparable. This is important for users of the financial information, because it enables them to make timely and objective decisions.

  4. A three-dimensional sorting reliability algorithm for coastline deformation monitoring, using interferometric data

    International Nuclear Information System (INIS)

    Genderen, J v; Marghany, M

    2014-01-01

    The paper focuses on three-dimensional (3-D) coastline deformation using interferometric synthetic aperture radar (InSAR) data. Conventional InSAR procedures were implemented on three repeat passes of ENVISAT ASAR data. Furthermore, the three-dimensional sorting reliability algorithm (3D-SRA) was implemented with the phase unwrapping technique. Subsequently, the 3D-SRA was used to eliminate the phase decorrelation impact from the interferograms. The study showed that the performance of the InSAR method using the 3D-SRA algorithm is better than the conventional InSAR procedure. In conclusion, the integration of the 3D-SRA, together with phase unwrapping, can produce accurate 3-D coastline deformation information

  5. Design methodologies for reliability of SSL LED boards

    NARCIS (Netherlands)

    Jakovenko, J.; Formánek, J.; Perpiñà, X.; Jorda, X.; Vellvehi, M.; Werkhoven, R.J.; Husák, M.; Kunen, J.M.G.; Bancken, P.; Bolt, P.J.; Gasse, A.

    2013-01-01

    This work presents a comparison of various LED board technologies from the thermal, mechanical and reliability points of view, provided by accurate 3-D modelling. LED boards are proposed as a possible technology replacement of FR4 LED boards used in 400 lumen retrofit SSL lamps. Presented design

  6. On Bayesian System Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen Ringi, M

    1995-05-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so-called frequentist school. A new model for system reliability prediction is given in two papers. The model incorporates the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non-identical environments. 85 refs.

  7. On Bayesian System Reliability Analysis

    International Nuclear Information System (INIS)

    Soerensen Ringi, M.

    1995-01-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so-called frequentist school. A new model for system reliability prediction is given in two papers. The model incorporates the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non-identical environments. 85 refs
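The learning-from-data idea described above can be illustrated with a minimal conjugate Beta-Binomial sketch. This is an illustration only, not the thesis's actual model, which additionally captures dependence between components through a shared operational environment; all numbers are assumed.

```python
def update_reliability(alpha, beta, successes, failures):
    """Conjugate Beta-Binomial update: return the posterior Beta
    parameters and the posterior mean reliability."""
    a = alpha + successes
    b = beta + failures
    return a, b, a / (a + b)

# Prior state of knowledge: reliability around 0.9, encoded as Beta(9, 1).
# Then 50 demands are observed, 45 of them successful.
a, b, mean = update_reliability(9, 1, successes=45, failures=5)
print(round(mean, 3))  # -> 0.9
```

Each new batch of observations simply shifts the Beta parameters, so the state of knowledge (and hence the reliability estimate) evolves as information accumulates.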

  8. [The external evaluation of study quality: the role in maintaining the reliability of laboratory information].

    Science.gov (United States)

    Men'shikov, V V

    2013-08-01

    The external evaluation of the quality of clinical laboratory examinations was gradually introduced in USSR medical laboratories beginning in the 1970s. In Russia, in the mid-1990s, a unified all-national system of external quality evaluation was organized, known as the Federal center of external evaluation of quality, on the basis of a laboratory of the state research center of preventive medicine. The main positions of policy in this area were clearly formulated in the guidance documents of the Ministry of Health. Nowadays, the center of external evaluation of quality offers more than 100 types of control studies and permanently extends their range, guided by the interests of different disciplines of clinical medicine. The consistent participation of laboratories in the cycles of external evaluation of quality intrinsically promotes improvement of the trueness and precision of analysis results and increases the reliability of laboratory information. However, a significant percentage of laboratories do not participate at all in external evaluation of quality, or take part in the control process irregularly and for a limited number of tests. The managers of a number of medical organizations disregard the proposed possibilities to increase the reliability of laboratory information and limit the financing of studies in the field of quality control. The article proposes to adopt a national standard on the basis of ISO 17043 "Conformity assessment. General requirements for proficiency testing".

  9. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  10. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  11. Bayesian Reliability Estimation for Deteriorating Systems with Limited Samples Using the Maximum Entropy Approach

    Directory of Open Access Journals (Sweden)

    Ning-Cong Xiao

    2013-12-01

    Full Text Available In this paper a combination of the maximum entropy method and Bayesian inference for reliability assessment of deteriorating systems is proposed. Due to various uncertainties, limited data and incomplete information, system parameters usually cannot be determined precisely. These uncertain parameters can be modeled by fuzzy set theory and Bayesian inference, which have been proved useful for deteriorating systems under small sample sizes. The maximum entropy approach can be used to calculate the maximum entropy density function of uncertain parameters more accurately, for it does not need any additional information or assumptions. Finally, two optimization models are presented which can be used to determine the lower and upper bounds of a system's probability of failure under vague environmental conditions. Two numerical examples are investigated to demonstrate the proposed method.
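For a discrete uncertain parameter with a known mean, the maximum-entropy distribution takes an exponential form. The sketch below solves that simplified setting by bisection on the Lagrange multiplier; it is a generic illustration, not the paper's fuzzy/Bayesian formulation, and the support and target mean are assumed values.

```python
import math

def maxent_discrete(xs, target_mean, lo=-50.0, hi=50.0):
    """Maximum-entropy pmf on support xs subject to a fixed mean.
    The solution has the exponential form p_i ∝ exp(lam * x_i); the
    multiplier lam is found by bisection (the mean is increasing in lam)."""
    def mean_for(lam):
        w = [math.exp(lam * x) for x in xs]
        z = sum(w)
        return sum(x * wi for x, wi in zip(xs, w)) / z

    for _ in range(200):  # bisection on the Lagrange multiplier
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * x) for x in xs]
    z = sum(w)
    return [wi / z for wi in w]

# With the target mean at the centre of the support, the maximum-entropy
# answer is the uniform pmf, as expected.
p = maxent_discrete([0, 1, 2, 3, 4], target_mean=2.0)
```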

  12. The usefulness and scientific accuracy of private sector Arabic language patient drug information leaflets.

    Science.gov (United States)

    Sukkari, Sana R; Al Humaidan, Abdullah S; Sasich, Larry D

    2012-07-01

    Inadequate access to useful scientifically accurate patient information is a major cause of the inappropriate use of drugs resulting in serious personal injury and related costs to the health care system. The definition of useful scientifically accurate patient information for prescription drugs was accepted by the US Secretary of the Department of Health and Human Services in 1996 as that derived from or consistent with the US FDA approved professional product label for a drug. Previous quality content studies found that English language patient drug information leaflets distributed by US pharmacies failed to meet minimum criteria defining useful and scientifically accurate information. Evaluation forms containing the explicit elements that define useful scientifically accurate information for three drugs with known serious adverse drug reactions were created based on the current US FDA approved professional product labels. The Arabic language patient drug information leaflets for celecoxib, paroxetine, and lamotrigine were obtained locally and evaluated using a methodology similar to that used in previous quality content patient drug information studies in the US. The Arabic leaflets failed to meet the definition of useful scientifically accurate information. The celecoxib leaflet contained 30% of the required information and the paroxetine and lamotrigine leaflets contained 24% and 20%, respectively. There are several limitations to this study. The Arabic leaflets from only one commercial North American vendor were evaluated and the evaluation included a limited number of drugs. A larger study is necessary to be able to generalize these results. The study results are consistent with those of previous quality content studies of commercially available English patient drug information leaflets. The results have important implications for patients as access to a reliable source of drug information may prevent harm or limit the suffering from serious adverse drug

  13. Determination of accurate metal silicide layer thickness by RBS

    International Nuclear Information System (INIS)

    Kirchhoff, J.F.; Baumann, S.M.; Evans, C.; Ward, I.; Coveney, P.

    1995-01-01

    Rutherford Backscattering Spectrometry (RBS) is a proven analytical tool for determining compositional information for a wide variety of materials. One of the most widely utilized applications of RBS is the study of the composition of metal silicides (MSix), also referred to as polycides. A key quantity obtained from an analysis of a metal silicide is the ratio of silicon to metal (Si/M). Although compositional information is very reliable in these applications, metal silicide layer thicknesses determined by RBS techniques can differ from true layer thicknesses by more than 40%. The cause of these differences lies in how the densities utilized in the RBS analysis are calculated. The standard RBS analysis software packages calculate layer densities from each element's bulk density weighted by its fractional atomic presence. This calculation causes large discrepancies in metal silicide thicknesses because most films crystallize into structures with distinct densities. Assuming a constant layer density for a full spectrum of Si/M values for metal silicide samples improves layer thickness determination but ignores the underlying physics of the films. We will present results of RBS determination of the thickness of various metal silicide films with a range of Si/M values using a physically accurate model for the calculation of layer densities. The thicknesses are compared to scanning electron microscopy (SEM) cross-section micrographs. We have also developed supporting software that incorporates these calculations into routine analyses. (orig.)
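The thickness discrepancy described above is purely a density-model effect: RBS measures an areal density (atoms/cm²), and the inferred thickness depends on which atomic number density is divided out. A sketch with illustrative assumed values, not data from the paper:

```python
def thickness_nm(areal_atoms_cm2, atoms_per_cm3):
    """Layer thickness implied by an RBS areal density (atoms/cm^2)
    and an assumed atomic number density (atoms/cm^3)."""
    return areal_atoms_cm2 / atoms_per_cm3 * 1e7  # cm -> nm

areal = 5.0e17             # atoms/cm^2, as an RBS analysis might report
weighted_bulk = 7.3e22     # atoms/cm^3, fraction-weighted bulk elements (assumed)
silicide_crystal = 9.6e22  # atoms/cm^3, actual silicide phase (assumed)

t_naive = thickness_nm(areal, weighted_bulk)      # ~68 nm
t_actual = thickness_nm(areal, silicide_crystal)  # ~52 nm
# The same RBS spectrum thus yields a ~30% thickness discrepancy
# purely from the choice of density model.
```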

  14. Sacrificing information for the greater good

    DEFF Research Database (Denmark)

    Stensbo-Smidt, Kristoffer; Gieseke, Fabian Cristian; Igel, Christian

    2017-01-01

    Large-scale surveys make huge amounts of photometric data available. Because of the sheer number of objects, spectral data cannot be obtained for all of them. It is therefore important to devise techniques for reliably estimating physical properties of objects from photometric information alone. These estimates are needed to automatically identify interesting objects worth a follow-up investigation, as well as to produce the required data for a statistical analysis of the space covered by a survey. We argue that machine learning techniques are suitable to compute these estimates accurately and efficiently ... Sky Survey (SDSS). For estimating sSFRs, we demonstrate that our method produces better estimates than traditional spectral energy distribution (SED) fitting. For estimating photo-z's, we show that our method produces more accurate photo-z's than the method employed by SDSS. The study highlights

  15. Development of a setup to enable stable and accurate flow conditions for membrane biofouling studies

    KAUST Repository

    Bucs, Szilard; Farhat, Nadia; Siddiqui, Amber; Valladares Linares, Rodrigo; Radu, Andrea; Kruithof, Joop C.; Vrouwenvelder, Johannes S.

    2015-01-01

    on membrane performance parameters such as feed channel pressure drop. There is a suite of available monitors to study biofouling, but systems to operate monitors have not been well designed to achieve an accurate, constant water flow required for a reliable

  16. Reliability analysis of reinforced concrete grids with nonlinear material behavior

    Energy Technology Data Exchange (ETDEWEB)

    Neves, Rodrigo A [EESC-USP, Av. Trabalhador Sao Carlense, 400, 13566-590 Sao Carlos (Brazil); Chateauneuf, Alaa [LaMI-UBP and IFMA, Campus de Clermont-Fd, Les Cezeaux, BP 265, 63175 Aubiere cedex (France)]. E-mail: alaa.chateauneuf@ifma.fr; Venturini, Wilson S [EESC-USP, Av. Trabalhador Sao Carlense, 400, 13566-590 Sao Carlos (Brazil)]. E-mail: venturin@sc.usp.br; Lemaire, Maurice [LaMI-UBP and IFMA, Campus de Clermont-Fd, Les Cezeaux, BP 265, 63175 Aubiere cedex (France)

    2006-06-15

    Reinforced concrete grids are usually used to support large floor slabs. These grids are characterized by a great number of critical cross-sections, where the overall failure is usually sudden. However, the nonlinear behavior of concrete leads to the redistribution of internal forces, and accurate reliability assessment becomes mandatory. This paper presents a reliability study of reinforced concrete (RC) grids based on coupling Monte Carlo simulations with response surface techniques. This approach allows us to analyze real RC grids with a large number of failure components. The response surface is used to evaluate the structural safety by using first order reliability methods. The application to simple grids shows the interest of the proposed method and the role of moment redistribution in the reliability assessment.
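The Monte Carlo half of such a coupling can be sketched on a toy limit state. In the paper the limit state is evaluated through a fitted response surface standing in for the expensive nonlinear RC grid model; here a closed-form g = R - S is used, and the distributions are illustrative assumptions.

```python
import math
import random

def failure_probability(g, sample, n=200_000, seed=42):
    """Crude Monte Carlo estimate of P[g(X) <= 0]."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n) if g(*sample(rng)) <= 0.0)
    return fails / n

# Toy limit state g = R - S with resistance R ~ N(10, 1.5) and
# load S ~ N(6, 1.0) (illustrative values).
pf = failure_probability(
    lambda r, s: r - s,
    lambda rng: (rng.gauss(10.0, 1.5), rng.gauss(6.0, 1.0)),
)
beta = (10.0 - 6.0) / math.sqrt(1.5**2 + 1.0**2)  # reliability index ≈ 2.2
# pf comes out near the analytic value Phi(-beta) ≈ 0.013
```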

  17. The Most Likely Nemesis to Timely, Accurate Electronic Information

    Science.gov (United States)

    2002-02-04

    Keywords: networks, training, commercial off-the-shelf, information technology, internet, communications equipment, electronic information. ...infected over 200,000 Internet computers. While the objective appeared to be to create a log-jam on the Internet and not actually alter information on... Brigadier General Robert M. Shea, USMC, Director for Command, Control, Communications and Computers for the Marine Corps, cites information overload problems

  18. [The analytical reliability of clinical laboratory information and role of the standards in its support].

    Science.gov (United States)

    Men'shikov, V V

    2012-12-01

    The article deals with the factors impacting the reliability of clinical laboratory information. The differences in quality among laboratory analysis tools produced by various manufacturers are discussed. These differences are the cause of discrepancies between the results of laboratory analyses of the same analyte. The role of the reference system in supporting the comparability of laboratory analysis results is demonstrated. A draft national standard is presented to regulate the requirements for standards and calibrators for the analysis of qualitative and non-metrical characteristics of components of biomaterials.

  19. Problematics of Reliability of Road Rollers

    Science.gov (United States)

    Stawowiak, Michał; Kuczaj, Mariusz

    2018-06-01

    This article addresses the reliability of road rollers used in a selected roadworks company. Information is presented on how the road rollers are serviced and how servicing affects their reliability. Attention was paid to the implementation of the maintenance plan with regard to the machines' operational time. The reliability of the road rollers was analyzed by determining and interpreting readiness coefficients.
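A readiness (availability) coefficient of the kind mentioned above is commonly computed from mean time between failures and mean time to repair; a minimal sketch with illustrative figures, not the article's data:

```python
def availability(mtbf_hours, mttr_hours):
    """Steady-state availability: the fraction of time the machine is operable."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Illustrative figures: mean time between failures 190 h, mean repair time 10 h.
print(round(availability(190.0, 10.0), 3))  # -> 0.95
```

Tracking how this ratio drifts across maintenance cycles is one way to relate the service regime to the machine's operational time.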

  20. Direct Calculation of Permeability by High-Accurate Finite Difference and Numerical Integration Methods

    KAUST Repository

    Wang, Yi

    2016-07-21

    Velocity of fluid flow in underground porous media is 6-12 orders of magnitude lower than that in pipelines. If numerical errors are not carefully controlled in this kind of simulation, high distortion of the final results may occur [1-4]. To meet the high accuracy demands of fluid flow simulations in porous media, traditional finite difference methods and numerical integration methods are discussed and corresponding high-accuracy methods are developed. When applied to the direct calculation of full-tensor permeability for underground flow, the high-accuracy finite difference method is confirmed to have a numerical error as low as 10^-5 % while the high-accuracy numerical integration method has a numerical error around 0%. Thus, the approach combining the high-accuracy finite difference and numerical integration methods is a reliable way to efficiently determine the characteristics of a general full-tensor permeability, such as the maximum and minimum permeability components, principal direction and anisotropy ratio. Copyright © Global-Science Press 2016.
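The accuracy gap between traditional and higher-order finite differences can be illustrated generically. This is a standard 1-D sketch of second- versus fourth-order central differences, not the paper's permeability scheme:

```python
import math

def d_2nd(f, x, h):
    """Second-order central difference for f'(x)."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

def d_4th(f, x, h):
    """Fourth-order central difference for f'(x)."""
    return (-f(x + 2*h) + 8*f(x + h) - 8*f(x - h) + f(x - 2*h)) / (12.0 * h)

x, h = 1.0, 1e-3
exact = math.cos(x)                        # d/dx sin(x) at x = 1
err2 = abs(d_2nd(math.sin, x, h) - exact)  # O(h^2) truncation error
err4 = abs(d_4th(math.sin, x, h) - exact)  # O(h^4): several orders smaller
```

At the same step size, the fourth-order stencil cuts the truncation error by roughly a factor of h², which is exactly the kind of headroom needed when the quantities of interest span many orders of magnitude.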

  1. Nuclear plant reliability data system. 1979 annual reports of cumulative system and component reliability

    International Nuclear Information System (INIS)

    1979-01-01

    The primary purposes of the information in these reports are the following: to provide operating statistics of safety-related systems within a unit which may be used to compare and evaluate reliability performance and to provide failure mode and failure rate statistics on components which may be used in failure mode effects analysis, fault hazard analysis, probabilistic reliability analysis, and so forth

  2. Improvement of the reliability on nondestructive inspection

    International Nuclear Information System (INIS)

    Song, Sung Jin; Kim, Young H.; Lee, Hyang Beom; Shin, Young Kil; Jung, Hyun Jo; Park, Ik Keun; Park, Eun Soo

    2002-03-01

    Maintaining the reliability of nondestructive testing is essential for the life-time maintenance of a Nuclear Power Plant. The nondestructive testing methods most frequently used in Nuclear Power Plants are eddy current testing for the inspection of steam generator tubes and ultrasonic testing for the inspection of weldments. In order to improve the reliability of ultrasonic testing and eddy current testing, the subjects carried out in this study are as follows: development of a BEM analysis technique for ECT of SG tubes, development of a neural network technique for the intelligent analysis of ECT flaw signals of SG tubes, development of RFECT technology for the inspection of SG tubes, FEM analysis of the ultrasonic scattering field, evaluation of the statistical reliability of the PD-RR test of ultrasonic testing, and development of a multi-Gaussian beam modeling technique to predict accurate single-beam ultrasonic testing signals with computational efficiency

  3. Improvement of the reliability on nondestructive inspection

    Energy Technology Data Exchange (ETDEWEB)

    Song, Sung Jin; Kim, Young H. [Sungkyunkwan Univ., Suwon (Korea, Republic of); Lee, Hyang Beom [Soongsil Univ., Seoul (Korea, Republic of); Shin, Young Kil [Kunsan National Univ., Gunsan (Korea, Republic of); Jung, Hyun Jo [Wonkwang Univ., Iksan (Korea, Republic of); Park, Ik Keun; Park, Eun Soo [Seoul Nationl Univ., Seoul (Korea, Republic of)

    2002-03-15

    Maintaining the reliability of nondestructive testing is essential for the life-time maintenance of a Nuclear Power Plant. The nondestructive testing methods most frequently used in Nuclear Power Plants are eddy current testing for the inspection of steam generator tubes and ultrasonic testing for the inspection of weldments. In order to improve the reliability of ultrasonic testing and eddy current testing, the subjects carried out in this study are as follows: development of a BEM analysis technique for ECT of SG tubes, development of a neural network technique for the intelligent analysis of ECT flaw signals of SG tubes, development of RFECT technology for the inspection of SG tubes, FEM analysis of the ultrasonic scattering field, evaluation of the statistical reliability of the PD-RR test of ultrasonic testing, and development of a multi-Gaussian beam modeling technique to predict accurate single-beam ultrasonic testing signals with computational efficiency.

  4. ASSESSING AND COMBINING RELIABILITY OF PROTEIN INTERACTION SOURCES

    Science.gov (United States)

    LEACH, SONIA; GABOW, AARON; HUNTER, LAWRENCE; GOLDBERG, DEBRA S.

    2008-01-01

    Integrating diverse sources of interaction information to create protein networks requires strategies sensitive to differences in accuracy and coverage of each source. Previous integration approaches calculate reliabilities of protein interaction information sources based on congruity to a designated ‘gold standard.’ In this paper, we provide a comparison of the two most popular existing approaches and propose a novel alternative for assessing reliabilities which does not require a gold standard. We identify a new method for combining the resultant reliabilities and compare it against an existing method. Further, we propose an extrinsic approach to evaluation of reliability estimates, considering their influence on the downstream tasks of inferring protein function and learning regulatory networks from expression data. Results using this evaluation method show 1) our method for reliability estimation is an attractive alternative to those requiring a gold standard and 2) the new method for combining reliabilities is less sensitive to noise in reliability assignments than the similar existing technique. PMID:17990508
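One simple way to combine per-source reliabilities into a confidence for an interaction reported by several sources is a noisy-OR rule. This is a generic sketch under an independence assumption, not necessarily the combination method the paper proposes, and the reliability values are illustrative:

```python
def combined_confidence(reliabilities):
    """Confidence that an interaction is real, given the reliabilities of
    the sources reporting it, assuming the sources err independently
    (a noisy-OR combination)."""
    p_all_wrong = 1.0
    for r in reliabilities:
        p_all_wrong *= (1.0 - r)
    return 1.0 - p_all_wrong

# An interaction reported by three sources of modest individual reliability:
print(round(combined_confidence([0.7, 0.5, 0.4]), 2))  # -> 0.91
```

Corroboration across weak sources can thus yield high combined confidence, which is precisely why the downstream sensitivity of the combination rule to noisy reliability assignments matters.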

  5. Completeness and reliability of mortality data in Viet Nam: Implications for the national routine health management information system.

    Science.gov (United States)

    Hong, Tran Thi; Phuong Hoa, Nguyen; Walker, Sue M; Hill, Peter S; Rao, Chalapati

    2018-01-01

    Mortality statistics form a crucial component of national Health Management Information Systems (HMIS). However, there are limitations in the availability and quality of mortality data at the national level in Viet Nam. This study assessed the completeness of recorded deaths and the reliability of recorded causes of death (COD) in the A6 death registers in the national routine HMIS in Viet Nam. 1477 identified deaths in 2014 were reviewed in two provinces. A capture-recapture method was applied to assess the completeness of the A6 death registers. 1365 household verbal autopsy (VA) interviews were successfully conducted, and these were reviewed by physicians who assigned multiple causes of death and an underlying cause of death (UCOD). These UCODs from VA were then compared with the CODs recorded in the A6 death registers, using kappa scores to assess the reliability of the A6 death register diagnoses. The overall completeness of the A6 death registers in the two provinces was 89.3% (95% CI: 87.8-90.8). No COD recorded in the A6 death registers demonstrated good reliability. Reliability was very low for the recording of cardiovascular deaths (kappa for stroke = 0.47, kappa for ischaemic heart disease = 0.42) and diabetes (kappa = 0.33). The reporting of deaths due to road traffic accidents, HIV and some cancers showed a moderate level of reliability, with kappa scores ranging between 0.57 and 0.69. These findings have implications for the national routine HMIS in Viet Nam.
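The capture-recapture idea behind the completeness assessment can be sketched with the classical two-source (Lincoln-Petersen / Chandra Sekar-Deming) estimator; the counts below are illustrative, not the study's data:

```python
def capture_recapture(n1, n2, m):
    """Two-source capture-recapture estimate.
    n1, n2: deaths found by each source; m: deaths matched in both.
    Returns the estimated total deaths and the completeness of source 1."""
    total = n1 * n2 / m
    return total, n1 / total

# Illustrative counts: register vs. an independent survey of the same area.
total, completeness = capture_recapture(n1=1340, n2=1180, m=1120)
# total ≈ 1412 deaths, register completeness ≈ 95%
```

The estimator assumes the two sources capture deaths independently; violations of that assumption bias the completeness figure, which is one reason such estimates are reported with confidence intervals.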

  6. Reliable and Accurate Release of Micro-Sized Objects with a Gripper that Uses the Capillary-Force Method

    Directory of Open Access Journals (Sweden)

    Suzana Uran

    2017-06-01

    Full Text Available There have been recent developments in grippers that are based on capillary force and condensed water droplets; these are used for manipulating micro-sized objects. Recently, one-finger grippers have been produced that are able to grip reliably using the capillary force. To release objects, either the van der Waals, gravitational or inertial-force method is used. This article presents methods for reliably gripping and releasing micro-objects using the capillary force. Moisture from the surrounding air is condensed into a thin layer of water on the contact surfaces of the objects. From this thin layer of water, a water meniscus is created between the micro-sized object, the gripper and the releasing surface. Consequently, the water meniscus between the object and the releasing surface produces a capillary force high enough to release the micro-sized object from the tip of the one-finger gripper. In this case, either polystyrene or glass beads with diameters between 5 and 60 µm, or irregularly shaped dust particles of similar sizes, were used. 3D structures made up of micro-sized objects could be constructed using this method. The method is reliable for release during assembly and also for gripping when objects are removed from the top of the 3D structure, the so-called "disassembling gripping" process. The release accuracy was better than 0.5 µm.

  7. Life cycle reliability assessment of new products—A Bayesian model updating approach

    International Nuclear Information System (INIS)

    Peng, Weiwen; Huang, Hong-Zhong; Li, Yanfeng; Zuo, Ming J.; Xie, Min

    2013-01-01

    The rapidly increasing pace and continuously evolving reliability requirements of new products have made life cycle reliability assessment of new products an imperative yet difficult task. While much work has been done to separately estimate the reliability of new products in specific stages, a gap exists in carrying out life cycle reliability assessment throughout all life cycle stages. We present a Bayesian model updating approach (BMUA) for life cycle reliability assessment of new products. Novel features of this approach are the development of Bayesian information toolkits that separately include a “reliability improvement factor” and an “information fusion factor”, which allow the integration of subjective information in a specific life cycle stage and the transition of integrated information between adjacent life cycle stages. They lead to the unique characteristic of the BMUA that information generated throughout the life cycle stages is integrated coherently. To illustrate the approach, an application to the life cycle reliability assessment of a newly developed Gantry Machining Center is shown.
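    The paper's specific "reliability improvement" and "information fusion" factors are not reproduced here, but the underlying idea of carrying information between life cycle stages can be sketched with a plain conjugate Beta-binomial update, where each stage's posterior becomes the next stage's prior (the stage data below are hypothetical):

```python
def beta_update(a, b, successes, failures):
    """Conjugate Beta-binomial update of a Beta(a, b) prior on reliability."""
    return a + successes, b + failures

def posterior_mean(a, b):
    return a / (a + b)

# Weak prior from engineering judgment, then two life-cycle stages of test data.
a, b = 1.0, 1.0                     # uniform prior on success probability
a, b = beta_update(a, b, 18, 2)     # development testing: 18 of 20 trials pass
a, b = beta_update(a, b, 47, 3)     # field trials: 47 of 50 missions succeed
print(round(posterior_mean(a, b), 3))  # → 0.917
```

    Each update simply accumulates evidence; a stage-transition factor like the paper's would modify the carried-over parameters before the next update.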

  8. Teaching accuracy and reliability for student projects

    Science.gov (United States)

    Fisher, Nick

    2002-09-01

    Physics students at Rugby School follow the Salters Horners A-level course, which involves working on a two-week practical project of their own choosing. Pupils often misunderstand the concepts of accuracy and reliability, believing, for example, that repeating readings makes them more accurate and more reliable, whereas all it does is help to check repeatability. The course emphasizes the ideas of checking anomalous points, improving accuracy and making readings more sensitive. This article describes how we teach pupils in preparation for their projects. Based on many years of running such projects, much of this material is from a short booklet that we give out to pupils, when we train them in practical project skills.

  9. Health Information Needs and Reliability of Sources Among Nondegree Health Sciences Students: A Prerequisite for Designing eHealth Literacy.

    Science.gov (United States)

    Haruna, Hussein; Tshuma, Ndumiso; Hu, Xiao

    Understanding health information needs and health-seeking behavior is a prerequisite for developing an electronic health information literacy (EHIL) or eHealth literacy program for nondegree health sciences students. At present, interest in researching health information needs and reliable-sources paradigms has gained momentum in many countries. However, most studies focus on health professionals and students in higher education institutions. The present study was aimed at providing new insight and filling the existing gap by examining health information needs and the reliability of sources among nondegree health sciences students in Tanzania. A cross-sectional study was conducted in 15 conveniently selected health training institutions, in which 403 health sciences students participated. Thirty health sciences students were purposely and conveniently chosen from each health training institution. The selected students were pursuing nursing and midwifery, clinical medicine, dentistry, environmental health sciences, pharmacy, and medical laboratory sciences courses. The students involved were in their first, second, or third year of study. Health sciences students' health information needs focus on their educational requirements, clinical practice, and personal information. They use print, human, and electronic health information sources. They lack eHealth research skills for navigating health information resources and face insufficient facilities for accessing eHealth information, a shortage of health information specialists, high subscription costs for electronic information, and unawareness of freely available Internet resources and other online health-related databases. This study found that nondegree health sciences students have limited EHIL skills. Thus, designing and incorporating EHIL skills programs into the curriculum of nondegree health sciences students is vital. EHIL is a requirement common to all health settings, learning environments, and

  10. Technical information report: Plasma melter operation, reliability, and maintenance analysis

    International Nuclear Information System (INIS)

    Hendrickson, D.W.

    1995-01-01

    This document provides a technical report on the operability, reliability, and maintenance of a plasma melter for low-level waste vitrification, in support of the Hanford Tank Waste Remediation System (TWRS) Low-Level Waste (LLW) Vitrification Program. A description is provided of a process designed to minimize maintenance and downtime; it includes material and energy balances, equipment sizes and arrangement, startup/operation/maintenance/shutdown cycle descriptions, and the basis for scale-up to a 200 metric ton/day production facility. Operational requirements are provided, including utilities, feeds, labor, and maintenance. Equipment reliability estimates and maintenance requirements are provided, including a list of failure modes, responses, and consequences

  11. Methods to compute reliabilities for genomic predictions of feed intake

    Science.gov (United States)

    For new traits without historical reference data, cross-validation is often the preferred method to validate reliability (REL). Time truncation is less useful because few animals gain substantial REL after the truncation point. Accurate cross-validation requires separating genomic gain from pedigree...

  12. Methods of Estimation the Reliability and Increasing the Informativeness of the Laboratory Results (Analysis of the Laboratory Case of Measurement the Indicators of Thyroid Function

    Directory of Open Access Journals (Sweden)

    N A Kovyazina

    2014-06-01

    Full Text Available The goal of the study was to demonstrate a multilevel laboratory quality management system and to present methods for estimating the reliability and increasing the informativeness of laboratory results, using a laboratory case as an example. Results. The article examines the stages of laboratory quality management that helped to estimate the reliability of the results of determining Free T3, Free T4 and TSH. The measurement results are presented with their expanded uncertainty and an evaluation of the dynamics. Conclusion. Compliance with mandatory measures of a laboratory quality management system enables laboratories to obtain reliable results and to calculate parameters that increase the informativeness of laboratory tests in clinical decision making.
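    Expanded uncertainty, as reported in this record, is conventionally the combined standard uncertainty multiplied by a coverage factor k (k = 2 for roughly 95 % coverage). A sketch with an invented uncertainty budget, not the laboratory's actual components:

```python
import math

def combined_uncertainty(components):
    """Root-sum-of-squares of independent standard uncertainty components."""
    return math.sqrt(sum(u * u for u in components))

# Hypothetical budget for a TSH result (mIU/L): repeatability,
# calibrator, and between-run components.
u_c = combined_uncertainty([0.05, 0.03, 0.04])
U = 2 * u_c  # expanded uncertainty, coverage factor k = 2
print(round(U, 3))  # → 0.141
```

    The reported result would then read "value ± U (k = 2)", which tells the clinician how much of an observed change is attributable to measurement alone.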

  13. Photovoltaic performance and reliability workshop

    Energy Technology Data Exchange (ETDEWEB)

    Mrig, L. [ed.

    1993-12-01

    This workshop was the sixth in a series of workshops sponsored by NREL/DOE under the general subject of photovoltaic testing and reliability during the period 1986--1993. PV performance and PV reliability are at least as important as PV cost, if not more so. In the US, PV manufacturers, DOE laboratories, electric utilities, and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in the field were brought together to exchange technical knowledge and field experience in this evolving field of PV reliability. The papers presented here reflect this effort since the last workshop, held in September 1992. The topics covered include: cell and module characterization, module and system testing, durability and reliability, system field experience, and standards and codes.

  14. Application of reliability methods in Ontario Hydro

    International Nuclear Information System (INIS)

    Jeppesen, R.; Ravishankar, T.J.

    1985-01-01

    Ontario Hydro has established a reliability program in support of its substantial nuclear program. Application of the reliability program to achieve both production and safety goals is described. The value of such a reliability program is evident in the record of Ontario Hydro's operating nuclear stations. The factors that have contributed to the success of the reliability program are identified as line management's commitment to reliability; selective and judicious application of reliability methods; establishing performance goals and monitoring in-service performance; and the collection, distribution, review and utilization of performance information to facilitate cost-effective achievement of goals and improvements. (orig.)

  15. Effectiveness of different approaches to disseminating traveler information on travel time reliability.

    Science.gov (United States)

    2014-01-01

    The second Strategic Highway Research Program (SHRP 2) Reliability program aims to improve trip time reliability by reducing the frequency and effects of events that cause travel times to fluctuate unpredictably. Congestion caused by unreliable, or n...

  16. Validation of Land Cover Products Using Reliability Evaluation Methods

    Directory of Open Access Journals (Sweden)

    Wenzhong Shi

    2015-06-01

    Full Text Available Validation of land cover products is a fundamental task prior to data applications. Current validation schemes and methods are, however, suited only for assessing classification accuracy and disregard the reliability of land cover products. The reliability evaluation of land cover products should be undertaken to provide reliable land cover information. In addition, the lack of high-quality reference data often constrains validation and affects the reliability results of land cover products. This study proposes a validation schema to evaluate the reliability of land cover products, including two methods, namely, result reliability evaluation and process reliability evaluation. Result reliability evaluation computes the reliability of land cover products using seven reliability indicators. Process reliability evaluation analyzes the reliability propagation in the data production process to obtain the reliability of land cover products. Fuzzy fault tree analysis is introduced and improved in the reliability analysis of a data production process. Research results show that the proposed reliability evaluation scheme is reasonable and can be applied to validate land cover products. Through the analysis of the seven indicators of result reliability evaluation, more information on land cover can be obtained for strategic decision-making and planning, compared with traditional accuracy assessment methods. Process reliability evaluation without the need for reference data can facilitate the validation and reflect the change trends of reliabilities to some extent.

  17. Reliability of Using Motion Sensors to Measure Children’s Physical Activity Levels in Exergaming

    Directory of Open Access Journals (Sweden)

    Nan Zeng

    2018-05-01

    Full Text Available Objectives: This study examined the reliability of two objective measurement tools in assessing children’s physical activity (PA) levels in an exergaming setting. Methods: A total of 377 children (190 girls, Mage = 8.39, SD = 1.55) attended a 30-min exergaming class every other day for 18 weeks. Children’s PA levels were concurrently measured by the NL-1000 pedometer and the ActiGraph GT3X accelerometer, which estimated children’s steps per minute and time engaged in sedentary, light, and moderate-to-vigorous PA, respectively. Results: The intraclass correlation coefficient (ICC) indicated a low degree of reliability for accelerometers (single-measures ICC = 0.03). ANOVA detected a possible learning effect for 27 classes (p < 0.01), and the single-measures ICC was 0.20 for pedometers. Moreover, there was no significant positive relationship between steps per minute and time spent in moderate-to-vigorous physical activity (MVPA). Finally, only 1.3% of variance was explained by the pedometer as a predictor when Hierarchical Linear Modeling was used to further explore the relationship between pedometer and accelerometer data. Conclusions: The NL-1000 pedometers and ActiGraph GT3X accelerometers have low reliability in assessing elementary school children’s PA levels during exergaming. More research is warranted to determine reliable and accurate measurement approaches for modern devices in exergaming settings.
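    The single-measures ICC used in this record can be computed from a one-way random-effects ANOVA. A compact sketch (the step counts below are invented, not the study's):

```python
def icc_oneway(scores):
    """ICC(1,1): one-way random-effects, single-measures intraclass correlation.
    `scores` holds one list per subject, one value per trial or device."""
    n = len(scores)                      # subjects
    k = len(scores[0])                   # measurements per subject
    grand = sum(sum(row) for row in scores) / (n * k)
    ms_between = k * sum((sum(row) / k - grand) ** 2 for row in scores) / (n - 1)
    ms_within = sum((x - sum(row) / k) ** 2
                    for row in scores for x in row) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical steps/min for four children over two sessions.
rows = [[60, 62], [75, 70], [50, 55], [80, 78]]
print(round(icc_oneway(rows), 2))  # → 0.95
```

    Values near 1 indicate consistent measurements across sessions; the low ICCs reported above (0.03 and 0.20) would indicate poor agreement.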

  18. Key attributes of the SAPHIRE risk and reliability analysis software for risk-informed probabilistic applications

    International Nuclear Information System (INIS)

    Smith, Curtis; Knudsen, James; Kvarfordt, Kellie; Wood, Ted

    2008-01-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state-of-the-practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis methods and solution methods build upon pioneering work done 30-40 years ago. We contrast this work with the current capabilities in the SAPHIRE analysis package. As part of this discussion, we provide information on both the typical features and the special analysis capabilities that are available. We also present the application and results typically found with state-of-the-practice PRRA models. By providing both a high-level and a detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena.

  19. Accurate monoenergetic electron parameters of laser wakefield in a bubble model

    Science.gov (United States)

    Raheli, A.; Rahmatallahpur, S. H.

    2012-11-01

    A reliable analytical expression for the potential of plasma waves with phase velocities near the speed of light is derived. The presented spheroid cavity model is more consistent than the previous spherical and ellipsoidal models, and it explains the mono-energetic electron trajectory more accurately, especially in the relativistic region. As a result, the quasi-mono-energetic electron output beam interacting with the laser plasma can be more appropriately described with this model.

  20. A reliable method for the stability analysis of structures ...

    African Journals Online (AJOL)

    The detection of structural configurations with singular tangent stiffness matrix is essential because they can be unstable. The secondary paths, especially in unstable buckling, can play the most important role in the loss of stability and collapse of the structure. A new method for reliable detection and accurate computation of ...

  1. Reliability analysis in intelligent machines

    Science.gov (United States)

    Mcinroy, John E.; Saridis, George N.

    1990-01-01

    Given an explicit task to be executed, an intelligent machine must be able to find the probability of success, or reliability, of alternative control and sensing strategies. By using concepts from information theory and reliability theory, new techniques are proposed for finding the reliability corresponding to alternative subsets of control and sensing strategies, such that a desired set of specifications can be satisfied. The analysis is straightforward, provided that a set of Gaussian random state variables is available. An example problem illustrates the technique, and general reliability results are presented for visual servoing with a computed-torque control algorithm. Moreover, the example illustrates the principle of increasing precision with decreasing intelligence at the execution level of an intelligent machine.
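    If each step of a strategy is modeled as independent, the probability of task success is the product of the step reliabilities, which makes comparing alternative control-and-sensing strategies straightforward. A toy sketch (the strategy names and numbers are invented, not from the paper):

```python
def mission_reliability(step_reliabilities):
    """Series combination: the task succeeds only if every step succeeds."""
    r = 1.0
    for p in step_reliabilities:
        r *= p
    return r

# Hypothetical strategies: [sensing, pose estimation, control] reliabilities.
strategies = {
    "camera+torque": [0.99, 0.97, 0.995],
    "laser+torque":  [0.995, 0.98, 0.995],
}
best = max(strategies, key=lambda s: mission_reliability(strategies[s]))
print(best, round(mission_reliability(strategies[best]), 4))  # → laser+torque 0.9702
```

    The machine would pick the cheapest strategy whose reliability still meets the task specification.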

  2. Scaled CMOS Technology Reliability Users Guide

    Science.gov (United States)

    White, Mark

    2010-01-01

    The desire to assess the reliability of emerging scaled microelectronics technologies through faster reliability trials and more accurate acceleration models is the precursor for further research and experimentation in this relevant field. The effect of semiconductor scaling on microelectronics product reliability is an important aspect for the high-reliability application user. From the perspective of a customer or user, who in many cases must deal with very limited, if any, manufacturer's reliability data to assess the product for a highly-reliable application, product-level testing is critical in the characterization and reliability assessment of advanced nanometer semiconductor scaling effects on microelectronics reliability. A methodology on how to accomplish this and techniques for deriving the expected product-level reliability of commercial memory products are provided. Competing mechanism theory and the multiple failure mechanism model are applied to the experimental results of scaled SDRAM products. Accelerated stress testing at multiple conditions is applied at the product level of several scaled memory products to assess the performance degradation and product reliability. Acceleration models are derived for each case. For several scaled SDRAM products, retention time degradation is studied and two distinct soft error populations are observed with each technology generation: early breakdown, characterized by randomly distributed weak bits with Weibull slope (beta)=1, and a main population breakdown with an increasing failure rate. Retention time soft error rates are calculated and a multiple failure mechanism acceleration model with parameters is derived for each technology. Defect densities are calculated and reflect a decreasing trend in the percentage of random defective bits for each successive product generation. A normalized soft error failure rate of the memory data retention time in FIT/Gb and FIT/cm2 for several scaled SDRAM generations is
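    Temperature-accelerated stress testing of the kind described here commonly uses the Arrhenius model to map stress-condition failure times back to use conditions. A generic sketch (the activation energy and temperatures are illustrative, not the report's values):

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_af(ea_ev, t_use_c, t_stress_c):
    """Arrhenius acceleration factor between use and stress temperatures."""
    t_use_k = t_use_c + 273.15
    t_stress_k = t_stress_c + 273.15
    return math.exp(ea_ev / BOLTZMANN_EV * (1.0 / t_use_k - 1.0 / t_stress_k))

# Example: 0.7 eV activation energy, 55 °C use vs. 125 °C stress.
af = arrhenius_af(0.7, 55.0, 125.0)
print(round(af, 1))
```

    A failure observed after t hours at stress is then credited as roughly af × t equivalent hours at use conditions, which is what makes the faster trials possible.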

  3. Financial reporting reliability concept and impact factors in its determination

    Directory of Open Access Journals (Sweden)

    Vygivska I.М.

    2017-03-01

    Full Text Available This article aims to assess the essence of financial reporting reliability and to summarize the factors that affect it. The article grounds the urgency of financial reporting reliability and characterizes its dual nature: on the one hand, reliable accounting contains sound data, is free of errors, and can be correctly perceived by users; on the other hand, financial statements are considered reliable when they are prepared according to the rules regulated by normative documents. It is shown that decision making by interested users on the basis of financial statements is closely related to information risk, which arises from restrictions affecting the flow of information on which management decisions are based: accounting information that is difficult for unskilled users to understand, the impact of the human factor, and the level of information importance. The author establishes that the reliability of financial reporting is affected by many factors whose influence can be seen both outside and within an enterprise. The main factors include informational, political, legal, human, organizational and financial factors, and they should be taken into account when making management decisions.

  4. "A Comparison of Consensus, Consistency, and Measurement Approaches to Estimating Interrater Reliability"

    OpenAIRE

    Steven E. Stemler

    2004-01-01

    This article argues that the general practice of describing interrater reliability as a single, unified concept is at best imprecise, and at worst potentially misleading. Rather than representing a single concept, different statistical methods for computing interrater reliability can be more accurately classified into one of three categories based upon the underlying goals of analysis. The three general categories introduced and described in this paper are: 1) consensus estimates, 2) consistency estimates, and 3) measurement estimates.
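    The distinction between consensus and consistency estimates can be seen with two raters who differ by a constant offset: exact agreement (a consensus estimate) is zero, while correlation (a consistency estimate) is perfect. The ratings below are invented for illustration:

```python
def percent_agreement(a, b):
    """Consensus estimate: proportion of exactly matching ratings."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def pearson(a, b):
    """Consistency estimate: Pearson correlation of the two raters' scores."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

# Rater B is systematically one point more lenient than rater A.
rater_a = [1, 2, 3, 4, 5, 3, 2]
rater_b = [2, 3, 4, 5, 6, 4, 3]
print(percent_agreement(rater_a, rater_b), round(pearson(rater_a, rater_b), 2))  # → 0.0 1.0
```

    Which estimate is "right" depends on the goal: consensus matters when raters must assign identical scores, consistency when only rank ordering matters.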

  5. Using reliability analysis to support decision making in phased mission systems

    OpenAIRE

    Zhang, Yang; Prescott, Darren

    2017-01-01

    Due to the environments in which they will operate, future autonomous systems must be capable of reconfiguring quickly and safely following faults or environmental changes. Past research has shown how, by considering autonomous systems to perform phased missions, reliability analysis can support decision making by allowing comparison of the probability of success of different missions following reconfiguration. Binary Decision Diagrams (BDDs) offer fast, accurate reliability analysis that cou...

  6. Reliability Evaluation of Service-Oriented Architecture Systems Considering Fault-Tolerance Designs

    Directory of Open Access Journals (Sweden)

    Kuan-Li Peng

    2014-01-01

    strategies. Sensitivity analysis of SOA at both coarse and fine grain levels is also studied, which can be used to efficiently identify the critical parts within the system. Two SOA system scenarios based on real industrial practices are studied. Experimental results show that the proposed SOA model can be used to accurately depict the behavior of SOA systems. Additionally, a sensitivity analysis that quantizes the effects of system structure as well as fault tolerance on the overall reliability is also studied. On the whole, the proposed reliability modeling and analysis framework may help the SOA system service provider to evaluate the overall system reliability effectively and also make smarter improvement plans by focusing resources on enhancing reliability-sensitive parts within the system.

  7. Reliability of operating WWER monitoring systems

    International Nuclear Information System (INIS)

    Yastrebenetsky, M.A.; Goldrin, V.M.; Garagulya, A.V.

    1996-01-01

    The elaboration of reliability measures for WWER monitoring systems is described in this paper. The evaluation is based on statistical data about failures collected at the Ukrainian operating nuclear power plants (NPPs). The main attention is devoted to the radiation safety monitoring system and the unit information computer system, which collects information from different sensors and systems of the unit. Reliability measures were used to address problems connected with the life extension of the instruments, and for other purposes. (author). 6 refs, 6 figs

  8. Reliability of operating WWER monitoring systems

    Energy Technology Data Exchange (ETDEWEB)

    Yastrebenetsky, M A; Goldrin, V M; Garagulya, A V [Ukrainian State Scientific Technical Center of Nuclear and Radiation Safety, Kharkov (Ukraine). Instrumentation and Control Systems Dept.

    1997-12-31

    The elaboration of reliability measures for WWER monitoring systems is described in this paper. The evaluation is based on statistical data about failures collected at the Ukrainian operating nuclear power plants (NPPs). The main attention is devoted to the radiation safety monitoring system and the unit information computer system, which collects information from different sensors and systems of the unit. Reliability measures were used to address problems connected with the life extension of the instruments, and for other purposes. (author). 6 refs, 6 figs.

  9. Reliability of Maximal Strength Testing in Novice Weightlifters

    Science.gov (United States)

    Loehr, James A.; Lee, Stuart M. C.; Feiveson, Alan H.; Ploutz-Snyder, Lori L.

    2009-01-01

    The one repetition maximum (1RM) is a criterion measure of muscle strength. However, the reliability of 1RM testing in novice subjects has received little attention. Understanding this information is crucial to accurately interpret changes in muscle strength. To evaluate the test-retest reliability of a squat (SQ), heel raise (HR), and deadlift (DL) 1RM in novice subjects. Twenty healthy males (31 ± 5 y, 179.1 ± 6.1 cm, 81.4 ± 10.6 kg) with no weight training experience in the previous six months participated in four 1RM testing sessions, with each session separated by 5-7 days. SQ and HR 1RM were conducted using a Smith machine; DL 1RM was assessed using free weights. Session 1 was considered a familiarization and was not included in the statistical analyses. Repeated measures analysis of variance with Tukey's post-hoc tests was used to detect between-session differences in 1RM (p < 0.05). Test-retest reliability was evaluated by intraclass correlation coefficients (ICC). During Session 2, the SQ and DL 1RM (SQ: 90.2 ± 4.3, DL: 75.9 ± 3.3 kg) were less than in Session 3 (SQ: 95.3 ± 4.1, DL: 81.5 ± 3.5 kg) and Session 4 (SQ: 96.6 ± 4.0, DL: 82.4 ± 3.9 kg), but there were no differences between Session 3 and Session 4. HR 1RM measured during Session 2 (150.1 ± 3.7 kg) and Session 3 (152.5 ± 3.9 kg) were not different from one another, but both were less than Session 4 (157.5 ± 3.8 kg). The reliability (ICC) of 1RM measures for Sessions 2-4 was 0.88, 0.83, and 0.87 for SQ, HR, and DL, respectively. When considering only Sessions 3 and 4, the reliability was 0.93, 0.91, and 0.86 for SQ, HR, and DL, respectively. One familiarization session and two test sessions (for SQ and DL) were required to obtain excellent reliability (ICC greater than or equal to 0.90) in 1RM values with novice subjects.
    We were unable to attain this level of reliability following three HR testing sessions; therefore, additional sessions may be required to obtain an

  10. Influence Of Inspection Intervals On Mechanical System Reliability

    International Nuclear Information System (INIS)

    Zilberman, B.

    1998-01-01

    In this paper a methodology for the reliability analysis of mechanical systems with latent failures is described. Reliability analysis of such systems must include appropriate use of check intervals for latent failure detection. The methodology suggests that, based on system logic, the analyst decide at the outset whether a system can fail actively or latently, and propagate this approach through all system levels. All inspections are assumed to be perfect (all failures are detected and repaired, and no new failures are introduced as a result of the maintenance). Additional assumptions are that mission time is much smaller than the check intervals and that all components have constant failure rates. Analytical expressions for reliability calculations are provided, based on fault tree and Markov modeling techniques (for two- and three-redundant systems with inspection intervals). The proposed methodology yields more accurate results than are obtained by ignoring check intervals or by using half the check interval time. The conventional analysis, which assumes that at the beginning of each mission the system is as new, gives an optimistic prediction of system reliability. Some examples of reliability calculations for mechanical systems with latent failures and of establishing optimum check intervals are provided
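    For a latent (unrevealed) failure with constant rate λ that is only discovered at inspections every τ hours, the interval-average unavailability has a simple closed form, often approximated as λτ/2 when λτ is small. A sketch with illustrative numbers, not the paper's:

```python
import math

def mean_unavailability(lam, tau):
    """Interval-average unavailability of a periodically inspected latent failure.
    Exact form; reduces to lam * tau / 2 when lam * tau << 1."""
    return 1.0 - (1.0 - math.exp(-lam * tau)) / (lam * tau)

lam = 1e-5   # failures per hour (constant failure rate, hypothetical)
tau = 720.0  # inspection interval, hours (monthly checks)
print(mean_unavailability(lam, tau), lam * tau / 2)
```

    Shortening τ lowers the average unavailability but raises inspection cost, which is the trade-off behind the optimum check intervals mentioned above.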

  11. Calculating system reliability with SRFYDO

    Energy Technology Data Exchange (ETDEWEB)

    Morzinski, Jerome [Los Alamos National Laboratory; Anderson - Cook, Christine M [Los Alamos National Laboratory; Klamann, Richard M [Los Alamos National Laboratory

    2010-01-01

    SRFYDO is a process for estimating the reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system that share some common components. It models reliability as a function of age and up to two other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.
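    SRFYDO's internals are not reproduced here, but the core Bayesian idea of propagating component uncertainty up to a series system can be sketched by Monte Carlo: draw each component's reliability from its posterior and multiply the draws. The posterior parameters below are hypothetical:

```python
import random

def series_reliability_draws(component_posteriors, n_draws=20000, seed=7):
    """Monte Carlo: draw each component's reliability from its Beta posterior
    and multiply, giving draws of system (series) reliability."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        r = 1.0
        for a, b in component_posteriors:
            r *= rng.betavariate(a, b)
        draws.append(r)
    return draws

# Hypothetical Beta posteriors: (successes + 1, failures + 1) per component.
posteriors = [(49, 2), (98, 3), (29, 2)]
draws = sorted(series_reliability_draws(posteriors))
median = draws[len(draws) // 2]
lo, hi = draws[int(0.05 * len(draws))], draws[int(0.95 * len(draws))]
print(round(median, 3), round(lo, 3), round(hi, 3))
```

    The spread of the draws is the uncertainty estimate that accompanies the point prediction at the system level.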

  12. NASA reliability preferred practices for design and test

    Science.gov (United States)

    1991-01-01

    Given here is a manual that was produced to communicate within the aerospace community design practices that have contributed to NASA mission success. The information represents the best technical advice that NASA has to offer on reliability design and test practices. Topics covered include reliability practices, including design criteria, test procedures, and analytical techniques that have been applied to previous space flight programs; and reliability guidelines, including techniques currently applied to space flight projects, where sufficient information exists to certify that the technique will contribute to mission success.

  13. Construct validity and reliability of automated body reaction test ...

    African Journals Online (AJOL)

    The Automated Body Reaction Test (ABRT) is a new skills and physical assessment instrument that measures the ability to react and move quickly and accurately in response to a stimulus. A total of 474 subjects aged 7-17 years old were randomly selected for the construct validity (n=330) and reliability (n=144) studies. The ABRT ...

  14. DATMAN: A reliability data analysis program using Bayesian updating

    International Nuclear Information System (INIS)

    Becker, M.; Feltus, M.A.

    1996-01-01

    Preventive maintenance (PM) techniques focus on the prevention of failures, in particular, system components that are important to plant functions. Reliability-centered maintenance (RCM) improves on the PM techniques by introducing a set of guidelines by which to evaluate the system functions. It also minimizes intrusive maintenance, labor, and equipment downtime without sacrificing system performance when its function is essential for plant safety. Both the PM and RCM approaches require that system reliability data be updated as more component failures and operation time are acquired. Systems reliability and the likelihood of component failures can be calculated by Bayesian statistical methods, which can update these data. The DATMAN computer code has been developed at Penn State to simplify the Bayesian analysis by performing tedious calculations needed for RCM reliability analysis. DATMAN reads data for updating, fits a distribution that best fits the data, and calculates component reliability. DATMAN provides a user-friendly interface menu that allows the user to choose from several common prior and posterior distributions, insert new failure data, and visually select the distribution that matches the data most accurately
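    One common conjugate pair for the kind of failure data DATMAN updates is a Gamma prior on a constant failure rate combined with exposure-time (Poisson) evidence. A minimal sketch, with invented prior parameters and operating experience:

```python
def gamma_update(alpha, beta, failures, exposure_hours):
    """Conjugate Gamma-Poisson update for a constant failure rate (per hour)."""
    return alpha + failures, beta + exposure_hours

alpha, beta = 1.0, 2.0e4                           # vague prior: ~5e-5 failures/hour
alpha, beta = gamma_update(alpha, beta, 2, 1.5e5)  # first batch of plant experience
alpha, beta = gamma_update(alpha, beta, 1, 2.0e5)  # second batch of plant experience
rate = alpha / beta                                # posterior mean failure rate
print(f"{rate:.2e}")  # → 1.08e-05
```

    As more component failures and operating hours are logged, the posterior tightens, which is exactly the updating behavior the PM and RCM approaches require.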

  15. Reliability-based design code calibration for concrete containment structures

    International Nuclear Information System (INIS)

    Han, B.K.; Cho, H.N.; Chang, S.P.

    1991-01-01

    In this study, load combination criteria for design and a probability-based reliability analysis were proposed on the basis of an FEM-based random vibration analysis. The limit state model defined for the study is a serviceability limit state for crack failure, which would allow the emission of radioactive materials; the results are compared with the strength limit state case. More accurate reliability analyses under various dynamic loads, such as earthquake loads, were made possible by combining the FEM with random vibration theory, which differs from the conventional reliability analysis method. The uncertainties in loads and resistance available in Korea and in the references were adapted to Korean conditions; in the case of earthquakes in particular, the design earthquake was assessed from the available data for a probabilistic description of earthquake ground acceleration on the Korean peninsula. SAP V-2 is used for a three-dimensional finite element analysis of the concrete containment structure, and the reliability analysis is carried out by modifying the HRAS reliability analysis program for this study. (orig./GL)
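    When resistance R and load effect S are modeled as independent normals, the reliability index and failure probability used in this kind of code calibration reduce to a one-line formula. A sketch with invented margins, not the paper's containment data:

```python
import math

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def reliability_index(mu_r, sig_r, mu_s, sig_s):
    """First-order reliability index for limit state g = R - S,
    with R (resistance) and S (load effect) independent normals."""
    return (mu_r - mu_s) / math.sqrt(sig_r ** 2 + sig_s ** 2)

# Hypothetical serviceability margins (units arbitrary).
beta = reliability_index(mu_r=12.0, sig_r=1.5, mu_s=7.0, sig_s=2.0)
pf = normal_cdf(-beta)
print(round(beta, 2), f"{pf:.2e}")  # → 2.0 2.28e-02
```

    Code calibration then adjusts partial safety factors until the implied β meets a target value for the limit state in question.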

  16. Adaptation of the ToxRTool to Assess the Reliability of Toxicology Studies Conducted with Genetically Modified Crops and Implications for Future Safety Testing.

    Science.gov (United States)

    Koch, Michael S; DeSesso, John M; Williams, Amy Lavin; Michalek, Suzanne; Hammond, Bruce

    2016-01-01

    To determine the reliability of food safety studies carried out in rodents with genetically modified (GM) crops, a Food Safety Study Reliability Tool (FSSRTool) was adapted from the European Centre for the Validation of Alternative Methods' (ECVAM) ToxRTool. Reliability was defined as the inherent quality of the study with regard to use of standardized testing methodology, full documentation of experimental procedures and results, and the plausibility of the findings. Codex guidelines for GM crop safety evaluations indicate toxicology studies are not needed when comparability of the GM crop to its conventional counterpart has been demonstrated. This guidance notwithstanding, animal feeding studies have routinely been conducted with GM crops, but their conclusions on safety are not always consistent. To accurately evaluate potential risks from GM crops, risk assessors need clearly interpretable results from reliable studies. The development of the FSSRTool, which provides the user with a means of assessing the reliability of a toxicology study to inform risk assessment, is discussed. Its application to the body of literature on GM crop food safety studies demonstrates that reliable studies report no toxicologically relevant differences between rodents fed GM crops or their non-GM comparators.

  17. Reliable design of electronic equipment an engineering guide

    CERN Document Server

    Natarajan, Dhanasekharan

    2014-01-01

    This book explains reliability techniques with examples from electronics design for the benefit of engineers. It presents the application of de-rating, FMEA, overstress analyses and reliability improvement tests for designing reliable electronic equipment. Adequate information is provided for designing a computerized reliability database system to support the application of the techniques by designers. Pedantic terms and the associated mathematics of the reliability engineering discipline are excluded for ease of comprehension and practical application. This book offers excellent support

  18. The Effect of Incorrect Reliability Information on Expectations, Perceptions, and Use of Automation.

    Science.gov (United States)

    Barg-Walkow, Laura H; Rogers, Wendy A

    2016-03-01

    We examined how providing artificially high or low statements about automation reliability affected expectations, perceptions, and use of automation over time. One common method of introducing automation is providing explicit statements about the automation's capabilities. Research is needed to understand how expectations from such introductions affect perceptions and use of automation. Explicit-statement introductions were manipulated to set higher-than (90%), same-as (75%), or lower-than (60%) levels of expectations in a dual-task scenario with 75% reliable automation. Two experiments were conducted to assess expectations, perceptions, compliance, reliance, and task performance over (a) 2 days and (b) 4 days. The baseline assessments showed initial expectations of automation reliability matched introduced levels of expectation. For the duration of each experiment, the lower-than groups' perceptions were lower than the actual automation reliability. However, the higher-than groups' perceptions were no different from actual automation reliability after Day 1 in either study. There were few differences between groups for automation use, which generally stayed the same or increased with experience using the system. Introductory statements describing artificially low automation reliability have a long-lasting impact on perceptions about automation performance. Statements including incorrect automation reliability do not appear to affect use of automation. Introductions should be designed according to desired outcomes for expectations, perceptions, and use of the automation. Low expectations have long-lasting effects. © 2015, Human Factors and Ergonomics Society.

  19. Design and experimentation of an empirical multistructure framework for accurate, sharp and reliable hydrological ensembles

    Science.gov (United States)

    Seiller, G.; Anctil, F.; Roy, R.

    2017-09-01

    This paper outlines the design and experimentation of an Empirical Multistructure Framework (EMF) for lumped conceptual hydrological modeling. This concept is inspired from modular frameworks, empirical model development, and multimodel applications, and encompasses the overproduce and select paradigm. The EMF concept aims to reduce subjectivity in conceptual hydrological modeling practice and includes model selection in the optimisation steps, reducing initial assumptions on the prior perception of the dominant rainfall-runoff transformation processes. EMF generates thousands of new modeling options from, for now, twelve parent models that share their functional components and parameters. Optimisation resorts to ensemble calibration, ranking and selection of individual child time series based on optimal bias and reliability trade-offs, as well as accuracy and sharpness improvement of the ensemble. Results on 37 snow-dominated Canadian catchments and 20 climatically-diversified American catchments reveal the excellent potential of the EMF in generating new individual model alternatives, with high respective performance values, that may be pooled efficiently into ensembles of seven to sixty constitutive members, with low bias and high accuracy, sharpness, and reliability. A group of 1446 new models is highlighted to offer good potential on other catchments or applications, based on their individual and collective interests. An analysis of the preferred functional components reveals the importance of the production and total flow elements. Overall, results from this research confirm the added value of ensemble and flexible approaches for hydrological applications, especially in uncertain contexts, and open up new modeling possibilities.

  20. Markerless motion capture can provide reliable 3D gait kinematics in the sagittal and frontal plane

    DEFF Research Database (Denmark)

    Sandau, Martin; Koblauch, Henrik; Moeslund, Thomas B.

    2014-01-01

    Estimating 3D joint rotations in the lower extremities accurately and reliably remains unresolved in markerless motion capture, despite extensive studies in the past decades. The main problems have been ascribed to the limited accuracy of the 3D reconstructions. The study analysed the hip, knee and ankle joints of the participating subjects. Flexion/extension angles as well as hip abduction/adduction closely resembled those obtained from the marker-based system. However, the internal/external rotations, knee abduction/adduction and ankle inversion/eversion were less reliable.

  1. An economic perspective on the reliability of lighting systems in building with highly efficient energy: A case study

    International Nuclear Information System (INIS)

    Salata, F.; Lieto Vollaro, A. de; Ferraro, A.

    2014-01-01

    Highlights: • Proper design of efficient lighting systems. • The reliability and durability of the light sources. • Maintenance of lighting systems. • Quality standards of LED lamps. • Optimum economic choice of light sources. - Abstract: The performance of the lighting system must be calculated in order to determine the energy requirements of the building. The standard [EN 12464-1] establishes lighting requirements that affect energy needs, and the European standard [EN 15193] provides guidance on that evaluation. The easiest way to comply with reduced energy requirements is to replace traditional lamps with LED ones, but once the reliability parameters are also calculated, the economic return is not guaranteed. Using bibliographic data, we compared lighting results for a museum (LED lamps versus CFL and halogen lamps). The objective function of the study is to optimize the energy consumption of lighting systems while at the same time assessing their reliability (the MTTF of the lamps). Without accurate information about these last parameters, the right choice of lamps cannot be made successfully
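
    The trade-off the study quantifies, purchase price and MTTF-driven replacements against energy consumption, can be sketched as a simple life-cycle cost comparison; all prices, powers, MTTF values and the tariff below are assumed for illustration, not the paper's case-study data:

```python
# Life-cycle cost of a lamp technology over a planning horizon, folding in
# expected MTTF-driven replacements. All figures are illustrative assumptions.

def life_cycle_cost(unit_price, power_w, mttf_h, hours, tariff_per_kwh):
    replacements = hours / mttf_h            # expected number of lamps used
    energy_kwh = power_w * hours / 1000.0    # energy drawn over the horizon
    return replacements * unit_price + energy_kwh * tariff_per_kwh

HOURS = 50_000    # total burning hours over the horizon (assumed)
TARIFF = 0.20     # EUR/kWh (assumed)

led = life_cycle_cost(unit_price=15.0, power_w=10.0, mttf_h=40_000,
                      hours=HOURS, tariff_per_kwh=TARIFF)
halogen = life_cycle_cost(unit_price=3.0, power_w=50.0, mttf_h=2_000,
                          hours=HOURS, tariff_per_kwh=TARIFF)
print(f"LED: {led:.2f} EUR, halogen: {halogen:.2f} EUR")
```

    With these assumed numbers the LED wins easily, but a shorter real-world MTTF or a higher LED price can erase the advantage, which is the paper's point about needing reliable MTTF data.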

  2. Validity and reliability testing of two instruments to measure breast cancer patients' concerns and information needs relating to radiation therapy

    Directory of Open Access Journals (Sweden)

    Kristjanson Linda J

    2007-11-01

    Full Text Available Abstract Background It is difficult to determine the most effective approach to patient education, or to tailor education interventions for patients in radiotherapy, without tools that assess patients' specific radiation therapy information needs and concerns. Therefore, the aim of this study was to develop psychometrically sound tools to adequately determine the concerns and information needs of cancer patients during radiation therapy. Patients and Methods Two tools were developed to (1) determine patients' concerns about radiation therapy (RT Concerns Scale) and (2) ascertain patients' information needs at different time points during their radiation therapy (RT Information Needs Scale). The tools were based on previous research by the authors, published literature on breast cancer and radiation therapy, and information behaviour research. Thirty-one breast cancer patients completed the questionnaire on one occasion and thirty participants completed the questionnaire on a second occasion to facilitate test-retest reliability. One participant's responses were removed from the analysis. Results were analysed for content validity, internal consistency and stability over time. Results Both tools demonstrated high internal consistency and adequate stability over time. The nine items in the RT Concerns Scale were retained because they met all pre-set psychometric criteria. Two items were deleted from the RT Information Needs Scale because they did not meet content validity criteria and did not achieve the pre-specified criteria for internal consistency. This tool now contains 22 items. Conclusion This paper provides preliminary data suggesting that the two tools presented are reliable and valid and would be suitable for use in trials or in the clinical setting.

  3. The FLUKA code: An accurate simulation tool for particle therapy

    CERN Document Server

    Battistoni, Giuseppe; Böhlen, Till T; Cerutti, Francesco; Chin, Mary Pik Wai; Dos Santos Augusto, Ricardo M; Ferrari, Alfredo; Garcia Ortega, Pablo; Kozlowska, Wioletta S; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically-based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in-vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with bot...

  4. Distribution-level electricity reliability: Temporal trends using statistical analysis

    International Nuclear Information System (INIS)

    Eto, Joseph H.; LaCommare, Kristina H.; Larsen, Peter; Todd, Annika; Fisher, Emily

    2012-01-01

    This paper helps to address the lack of comprehensive, national-scale information on the reliability of the U.S. electric power system by assessing trends in U.S. electricity reliability based on the information reported by the electric utilities on power interruptions experienced by their customers. The research analyzes up to 10 years of electricity reliability information collected from 155 U.S. electric utilities, which together account for roughly 50% of total U.S. electricity sales. We find that reported annual average duration and annual average frequency of power interruptions have been increasing over time at a rate of approximately 2% annually. We find that, independent of this trend, installation or upgrade of an automated outage management system is correlated with an increase in the reported annual average duration of power interruptions. We also find that reliance on IEEE Standard 1366-2003 is correlated with higher reported reliability compared to reported reliability not using the IEEE standard. However, we caution that we cannot attribute reliance on the IEEE standard as having caused or led to higher reported reliability because we could not separate the effect of reliance on the IEEE standard from other utility-specific factors that may be correlated with reliance on the IEEE standard. - Highlights: ► We assess trends in electricity reliability based on the information reported by the electric utilities. ► We use rigorous statistical techniques to account for utility-specific differences. ► We find modest declines in reliability analyzing interruption duration and frequency experienced by utility customers. ► Installation or upgrade of an OMS is correlated to an increase in reported duration of power interruptions. ► We find reliance on IEEE Standard 1366 is correlated with higher reported reliability.
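
    An annual trend of the kind reported above is the slope of a log-linear fit of a reliability index against time. A sketch on synthetic SAIDI-like data constructed with a 2% trend (the data are invented; the paper's panel regressions are far richer):

```python
import math

# Log-linear trend fit, ln(SAIDI) = a + b * year, via ordinary least squares;
# exp(b) - 1 is the implied annual growth rate. Synthetic data built with an
# exact 2% trend, purely to illustrate the calculation.

years = list(range(2000, 2010))
saidi = [120.0 * (1.02 ** (y - 2000)) for y in years]  # minutes per customer

n = len(years)
xs = [y - years[0] for y in years]
ys = [math.log(v) for v in saidi]
xbar, ybar = sum(xs) / n, sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
    sum((x - xbar) ** 2 for x in xs)
growth = math.exp(b) - 1.0
print(f"estimated annual change: {growth:.1%}")
```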

  5. Novel serologic biomarkers provide accurate estimates of recent Plasmodium falciparum exposure for individuals and communities.

    Science.gov (United States)

    Helb, Danica A; Tetteh, Kevin K A; Felgner, Philip L; Skinner, Jeff; Hubbard, Alan; Arinaitwe, Emmanuel; Mayanja-Kizza, Harriet; Ssewanyana, Isaac; Kamya, Moses R; Beeson, James G; Tappero, Jordan; Smith, David L; Crompton, Peter D; Rosenthal, Philip J; Dorsey, Grant; Drakeley, Christopher J; Greenhouse, Bryan

    2015-08-11

    Tools to reliably measure Plasmodium falciparum (Pf) exposure in individuals and communities are needed to guide and evaluate malaria control interventions. Serologic assays can potentially produce precise exposure estimates at low cost; however, current approaches based on responses to a few characterized antigens are not designed to estimate exposure in individuals. Pf-specific antibody responses differ by antigen, suggesting that selection of antigens with defined kinetic profiles will improve estimates of Pf exposure. To identify novel serologic biomarkers of malaria exposure, we evaluated responses to 856 Pf antigens by protein microarray in 186 Ugandan children, for whom detailed Pf exposure data were available. Using data-adaptive statistical methods, we identified combinations of antibody responses that maximized information on an individual's recent exposure. Responses to three novel Pf antigens accurately classified whether an individual had been infected within the last 30, 90, or 365 d (cross-validated area under the curve = 0.86-0.93), whereas responses to six antigens accurately estimated an individual's malaria incidence in the prior year. Cross-validated incidence predictions for individuals in different communities provided accurate stratification of exposure between populations and suggest that precise estimates of community exposure can be obtained from sampling a small subset of that community. In addition, serologic incidence predictions from cross-sectional samples characterized heterogeneity within a community similarly to 1 y of continuous passive surveillance. Development of simple ELISA-based assays derived from the successful selection strategy outlined here offers the potential to generate rich epidemiologic surveillance data that will be widely accessible to malaria control programs.

  6. A new efficient algorithm for computing the imprecise reliability of monotone systems

    International Nuclear Information System (INIS)

    Utkin, Lev V.

    2004-01-01

    Reliability analysis of complex systems under partial information about the reliability of components, and under various assumptions about their independence, may be carried out by means of imprecise probability theory, which provides a unified framework (natural extension, lower and upper previsions) for computing the system reliability. However, the application of imprecise probabilities to reliability analysis runs into the complexity of the optimization problems that have to be solved to obtain the system reliability measures. Therefore, an efficient simplified algorithm to solve and decompose the optimization problems is proposed in the paper. This algorithm makes it practical to implement reliability analysis of monotone systems under partial and heterogeneous information about the reliability of components, and under conditions of component independence or a lack of information about independence. A numerical example illustrates the algorithm
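
    For the special case of a monotone system with independent components and interval-valued component reliabilities, bounds propagate directly through the structure function, because system reliability is non-decreasing in each component reliability. A sketch of this easy case (not the paper's general natural-extension algorithm), for an assumed series-parallel layout:

```python
# Interval bounds on the reliability of a monotone (coherent) system with
# independent components: evaluating the structure function at all component
# lower bounds gives the system lower bound, and at all upper bounds the
# system upper bound. Layout and numbers are assumed for illustration.

def series(*rs):
    p = 1.0
    for r in rs:
        p *= r
    return p

def parallel(*rs):
    q = 1.0
    for r in rs:
        q *= (1.0 - r)
    return 1.0 - q

def system(r1, r2, r3):
    # component 1 in series with a redundant (parallel) pair 2, 3
    return series(r1, parallel(r2, r3))

lo = system(0.90, 0.70, 0.70)   # all components at their lower bounds
hi = system(0.99, 0.90, 0.90)   # all components at their upper bounds
print(f"system reliability lies in [{lo:.4f}, {hi:.4f}]")
```

    The hard cases the paper addresses are heterogeneous judgements and unknown dependence, where these simple endpoint evaluations no longer suffice and optimization is required.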

  7. Full 3-D stratigraphic inversion with a priori information: a powerful way to optimize data integration

    Energy Technology Data Exchange (ETDEWEB)

    Grizon, L.; Leger, M.; Dequirez, P.Y.; Dumont, F.; Richard, V.

    1998-12-31

    Integration of seismic and geological data is crucial to ensure that a reservoir study is accurate and reliable. To reach this goal, a post-stack stratigraphic inversion with a priori information is used. The global cost function combines two types of constraints: one relevant to seismic amplitudes, and the other to an a priori impedance model. This paper presents this flexible and interpretative inversion to determine acoustic impedances constrained by seismic data, log data and geologic information. 5 refs., 8 figs.

  8. Fast Reliability Assessing Method for Distribution Network with Distributed Renewable Energy Generation

    Science.gov (United States)

    Chen, Fan; Huang, Shaoxiong; Ding, Jinjin; Gao, Bo; Xie, Yuguang; Wang, Xiaoming

    2018-01-01

    This paper proposes a fast reliability assessing method for distribution grids with distributed renewable energy generation. First, the Weibull distribution and the Beta distribution are used to describe the probability distribution characteristics of wind speed and solar irradiance respectively, and models of the wind farm, solar park and local load are built for reliability assessment. Then, based on power-system production cost simulation, probability discretization and linearized power flow, an optimal power flow problem whose objective is the minimum cost of conventional power generation is solved. A reliability assessment for the distribution grid is thus implemented quickly and accurately. The Loss Of Load Probability (LOLP) and Expected Energy Not Supplied (EENS) are selected as the reliability indices; a simulation of the IEEE RBTS BUS6 system in MATLAB indicates that the method calculates the reliability indices much faster than the Monte Carlo method while preserving accuracy.
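
    A crude Monte Carlo version of the LOLP/EENS calculation, with Weibull wind speeds and Beta irradiance as in the abstract, can be sketched as follows; the turbine power curve, capacities and load are invented for illustration, and the paper's method avoids exactly this sampling via discretization and linearized power flow:

```python
import random

# Monte Carlo sketch of LOLP and EENS for a toy system: Weibull wind speeds
# and Beta-distributed normalized irradiance feed simplified power models,
# and shortfalls against a fixed load are tallied. All parameters assumed.

random.seed(42)

def wind_power(v, rated=2.0, v_in=3.0, v_rated=12.0, v_out=25.0):
    """Piecewise-linear turbine power curve (MW), an assumed simplification."""
    if v < v_in or v >= v_out:
        return 0.0
    if v >= v_rated:
        return rated
    return rated * (v - v_in) / (v_rated - v_in)

N, load = 100_000, 3.0          # trials; fixed load in MW (assumed)
lol, ens = 0, 0.0
for _ in range(N):
    v = random.weibullvariate(8.0, 2.0)     # scale 8 m/s, shape 2 (assumed)
    irr = random.betavariate(2.0, 2.0)      # normalized irradiance in [0, 1]
    gen = wind_power(v) + 1.5 * irr + 1.0   # 1.5 MW solar park + 1 MW backup
    deficit = load - gen
    if deficit > 0:
        lol += 1
        ens += deficit
LOLP, EENS = lol / N, ens / N
print(f"LOLP = {LOLP:.3f}, EENS = {EENS:.3f} MWh/h")
```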

  9. Reliability databases: State-of-the-art and perspectives

    DEFF Research Database (Denmark)

    Akhmedjanov, Farit

    2001-01-01

    The report gives a history of development and an overview of the existing reliability databases. This overview also describes some other sources of reliability and failure information (other than computer databases), e.g. reliability handbooks, but the main attention is paid to standard models and software packages containing the data mentioned. The standards governing the collection and exchange of reliability data are reviewed as well. Finally, promising directions in the development of such data sources are shown.

  10. Reliability and optimization of structural systems

    International Nuclear Information System (INIS)

    Thoft-Christensen, P.

    1987-01-01

    The proceedings contain 28 papers presented at the 1st working conference. The working conference was organized by the IFIP Working Group 7.5. The proceedings also include 4 papers which were submitted, but for various reasons not presented at the working conference. The working conference was attended by 50 participants from 18 countries. The conference was the first scientific meeting of the new IFIP Working Group 7.5 on 'Reliability and Optimization of Structural Systems'. The purpose of the Working Group 7.5 is to promote modern structural system optimization and reliability theory, to advance international cooperation in the field, to stimulate research, development and application of structural system optimization and reliability theory, to further the dissemination and exchange of information on the subject, and to encourage education in structural system optimization and reliability theory. (orig./HP)

  11. Redefining reliability

    International Nuclear Information System (INIS)

    Paulson, S.L.

    1995-01-01

    Want to buy some reliability? The question would have been unthinkable in some markets served by the natural gas business even a few years ago, but in the new gas marketplace, industrial, commercial and even some residential customers have the opportunity to choose from among an array of options about the kind of natural gas service they need--and are willing to pay for. The complexities of this brave new world of restructuring and competition have sent the industry scrambling to find ways to educate and inform its customers about the increased responsibility they will have in determining the level of gas reliability they choose. This article discusses the new options and new responsibilities of customers, the need for continuous education, and MidAmerican Energy Company's experiment in direct marketing of natural gas

  12. The reliability of commonly used electrophysiology measures.

    Science.gov (United States)

    Brown, K E; Lohse, K R; Mayer, I M S; Strigaro, G; Desikan, M; Casula, E P; Meunier, S; Popa, T; Lamy, J-C; Odish, O; Leavitt, B R; Durr, A; Roos, R A C; Tabrizi, S J; Rothwell, J C; Boyd, L A; Orth, M

    Electrophysiological measures can help understand brain function both in healthy individuals and in the context of a disease. Given the amount of information that can be extracted from these measures and their frequent use, it is essential to know more about their inherent reliability. To understand the reliability of electrophysiology measures in healthy individuals. We hypothesized that measures of threshold and latency would be the most reliable and least susceptible to methodological differences between study sites. Somatosensory evoked potentials from 112 control participants; long-latency reflexes, transcranial magnetic stimulation with resting and active motor thresholds, motor evoked potential latencies, input/output curves, and short-latency sensory afferent inhibition and facilitation from 84 controls were collected at 3 visits over 24 months at 4 Track-On HD study sites. Reliability was assessed using intra-class correlation coefficients for absolute agreement, and the effects of reliability on statistical power are demonstrated for different sample sizes and study designs. Measures quantifying latencies, thresholds, and evoked responses at high stimulator intensities had the highest reliability, and required the smallest sample sizes to adequately power a study. Very few between-site differences were detected. Reliability and susceptibility to between-site differences should be evaluated for electrophysiological measures before including them in study designs. Levels of reliability vary substantially across electrophysiological measures, though there are few between-site differences. To address this, reliability should be used in conjunction with theoretical calculations to inform sample size and ensure studies are adequately powered to detect true change in measures of interest. Copyright © 2017 Elsevier Inc. All rights reserved.
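
    The intra-class correlation for absolute agreement used in studies of this kind can be computed with a short ANOVA decomposition. A minimal sketch of ICC(2,1) (Shrout-Fleiss: two-way random effects, absolute agreement, single measurement) on made-up latency data; dedicated packages offer the full family of ICC variants:

```python
import numpy as np

# ICC(2,1): rows = subjects (or visits), columns = raters (or sessions).
# Sums of squares come from the standard two-way ANOVA decomposition.

def icc_2_1(data):
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    row_means = data.mean(axis=1)
    col_means = data.mean(axis=0)
    ssr = k * ((row_means - grand) ** 2).sum()          # between subjects
    ssc = n * ((col_means - grand) ** 2).sum()          # between raters
    sse = ((data - row_means[:, None]
                 - col_means[None, :] + grand) ** 2).sum()  # residual
    msr = ssr / (n - 1)
    msc = ssc / (k - 1)
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Two sessions measuring the same latency (ms) in five participants (invented)
scores = [[20.1, 20.3], [22.4, 22.0], [19.8, 20.2], [25.0, 24.6], [21.2, 21.5]]
print(f"ICC(2,1) = {icc_2_1(scores):.3f}")
```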

  13. Uncertainties and reliability theories for reactor safety

    International Nuclear Information System (INIS)

    Veneziano, D.

    1975-01-01

    What makes the safety problem of nuclear reactors particularly challenging is the demand for high levels of reliability and the limitation of statistical information. The latter is an unfortunate circumstance, which forces deductive theories of reliability to use models and parameter values with weak factual support. The uncertainty about probabilistic models and parameters which are inferred from limited statistical evidence can be quantified and incorporated rationally into inductive theories of reliability. In such theories, the starting point is the information actually available, as opposed to an estimated probabilistic model. But, while the necessity of introducing inductive uncertainty into reliability theories has been recognized by many authors, no satisfactory inductive theory is presently available. The paper presents: a classification of uncertainties and of reliability models for reactor safety; a general methodology to include these uncertainties into reliability analysis; a discussion about the relative advantages and the limitations of various reliability theories (specifically, of inductive and deductive, parametric and nonparametric, second-moment and full-distribution theories). For example, it is shown that second-moment theories, which were originally suggested to cope with the scarcity of data, and which have been proposed recently for the safety analysis of secondary containment vessels, are the least capable of incorporating statistical uncertainty. The focus is on reliability models for external threats (seismic accelerations and tornadoes). As an application example, the effect of statistical uncertainty on seismic risk is studied using parametric full-distribution models

  14. Reliability of fossil-fuel and nuclear power installations

    International Nuclear Information System (INIS)

    1983-01-01

    The conference heard a total of 37 papers, of which 24 were input into INIS. The subject areas were mainly the use of reliability information systems and the creation of data banks for these systems, the application of reliability theory, and the reliability analysis of equipment and systems of nuclear power plants. (J.P.)

  15. Dynamic reliability of digital-based transmitters

    Energy Technology Data Exchange (ETDEWEB)

    Brissaud, Florent, E-mail: florent.brissaud.2007@utt.f [Institut National de l' Environnement Industriel et des Risques (INERIS), Parc Technologique Alata, BP 2, 60550 Verneuil-en-Halatte (France) and Universite de Technologie de Troyes - UTT, Institut Charles Delaunay - ICD and UMR CNRS 6279 STMR, 12 rue Marie Curie, BP 2060, 10010 Troyes Cedex (France); Smidts, Carol [Ohio State University (OSU), Nuclear Engineering Program, Department of Mechanical Engineering, Scott Laboratory, 201 W 19th Ave, Columbus OH 43210 (United States); Barros, Anne; Berenguer, Christophe [Universite de Technologie de Troyes (UTT), Institut Charles Delaunay (ICD) and UMR CNRS 6279 STMR, 12 rue Marie Curie, BP 2060, 10010 Troyes Cedex (France)

    2011-07-15

    Dynamic reliability explicitly handles the interactions between the stochastic behaviour of system components and the deterministic behaviour of process variables. While dynamic reliability provides a more efficient and realistic way to perform probabilistic risk assessment than 'static' approaches, its industrial level applications are still limited. Factors contributing to this situation are the inherent complexity of the theory and the lack of a generic platform. More recently the increased use of digital-based systems has also introduced additional modelling challenges related to specific interactions between system components. Typical examples are the 'intelligent transmitters' which are able to exchange information, and to perform internal data processing and advanced functionalities. To make a contribution to solving these challenges, the mathematical framework of dynamic reliability is extended to handle the data and information which are processed and exchanged between systems components. Stochastic deviations that may affect system properties are also introduced to enhance the modelling of failures. A formalized Petri net approach is then presented to perform the corresponding reliability analyses using numerical methods. Following this formalism, a versatile model for the dynamic reliability modelling of digital-based transmitters is proposed. Finally the framework's flexibility and effectiveness is demonstrated on a substantial case study involving a simplified model of a nuclear fast reactor.

  16. RELIABILITY ASSESSMENT OF ENTROPY METHOD FOR SYSTEM CONSISTED OF IDENTICAL EXPONENTIAL UNITS

    Institute of Scientific and Technical Information of China (English)

    Sun Youchao; Shi Jun

    2004-01-01

    The reliability assessment spanning the unit and system levels is the most important part of multi-level reliability synthesis for complex systems. Introducing information theory into system reliability assessment, and using the additive property of information quantity and the principle of information-quantity equivalence, an entropy method of data information conversion is presented for systems consisting of identical exponential units. The basic conversion formulae of the entropy method for unit test data are derived from the principle of information-quantity equivalence. General models for entropy-method synthesis assessment of approximate lower limits on system reliability are established according to the fundamental principles of unit reliability assessment. Applications of the entropy method are discussed by way of practical examples. Compared with traditional methods, the entropy method is found to be valid and practicable, and the assessment results are very satisfactory.

  17. Reliability data collection and processing for Romanian TRIGA-SSR 14MW

    International Nuclear Information System (INIS)

    Mladin, Daniela; Mladin, Mirea; Cristea, Dumitru

    2002-01-01

    The use of site-specific reliability data for PSA is highly recommended because it enhances the accuracy and credibility of the risk analysis. In order to obtain the database statistics for the reactor components it is necessary to: develop a brief reactor operation history; identify the components which can be monitored and their specific failure modes; clearly define the component boundaries; select and run through the sources of recorded information related to failures and operational details; and process the data. The paper presents how these steps are completed for obtaining failure rates and confidence interval limits (95% and 5%) for a series of Romanian TRIGA components such as pumps, motors, valves, compressors, fans, etc. The identification of component boundaries and failure modes is performed according to the IAEA guides for research reactor databases. (author)
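
    The classical chi-square confidence limits for a constant failure rate estimated from n failures in T component-hours can be sketched as follows; the Wilson-Hilferty approximation to the chi-square quantile keeps the sketch standard-library only, and the counts are illustrative rather than TRIGA data:

```python
from statistics import NormalDist

# Point estimate lambda = n/T with classical chi-square confidence limits
# (5% lower, 95% upper). The chi-square quantile uses the Wilson-Hilferty
# approximation; for production work use an exact quantile function.

def chi2_ppf(p, k):
    """Wilson-Hilferty approximation to the chi-square quantile."""
    z = NormalDist().inv_cdf(p)
    return k * (1.0 - 2.0 / (9.0 * k) + z * (2.0 / (9.0 * k)) ** 0.5) ** 3

def rate_interval(n, t_hours, low=0.05, high=0.95):
    lower = chi2_ppf(low, 2 * n) / (2.0 * t_hours)
    upper = chi2_ppf(high, 2 * n + 2) / (2.0 * t_hours)
    return lower, n / t_hours, upper

# Assumed: 5 failures observed over 100,000 component-hours
lo, point, hi = rate_interval(n=5, t_hours=100_000.0)
print(f"lambda = {point:.1e}/h, 90% bounds [{lo:.1e}, {hi:.1e}]")
```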

  18. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    Science.gov (United States)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data collection and evaluation issues and a narrow focus on the methods needed to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining the risk-informed decision-making environment that is being sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).

  19. Role of information systems in public health services.

    Science.gov (United States)

    Hartshorne, J E; Carstens, I L

    1990-07-01

    The purpose of this review is to establish a conceptual framework on the role of information systems in public health care. Information is indispensable for effective management and development of health services and therefore considered as an important operational asset or resource. A Health Information System is mainly required to support management and operations at four levels: namely transactional and functional; operational control; management planning and control; and strategic planning. To provide the necessary information needs of users at these levels of management in the health care system, a structured information system coupled with appropriate information technology is required. Adequate and relevant information is needed regarding population characteristics, resources available and expended, output and outcome of health care activities. Additionally information needs to be reliable, accurate, timely, easily accessible and presented in a compact and meaningful form. With a well-planned health information system health authorities would be in a position to provide a quality, cost-effective and efficient health service for as many people as need it, optimal utilisation of resources and to maintain and improve the community's health status.

  20. The establishment and application of an equipment reliability database in a nuclear power plant

    International Nuclear Information System (INIS)

    Zheng Wei; Li He

    2006-03-01

    Taking Daya Bay Nuclear Power Plant as a case study, the collection and handling of equipment reliability data, the calculation methods for reliability parameters, and the establishment and application of the reliability database are discussed. The data sources include equipment design information, operation information, maintenance information, and periodic test records. The equipment reliability database, built on a base of operating experience, provides a valid tool for thoroughly and objectively recording the operating history and present condition of the plant's equipment; for supervising the performance of equipment, especially safety-related equipment, it provides very practical information for enhancing the safety and availability management of the equipment and ensuring the safe and economic operation of the plant; and it provides essential data for research and applications in safety management, reliability analysis, probabilistic safety assessment, reliability-centered maintenance, and economic management in nuclear power plants. (authors)

  1. PV Systems Reliability Final Technical Report.

    Energy Technology Data Exchange (ETDEWEB)

    Lavrova, Olga [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Flicker, Jack David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Johnson, Jay [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Armijo, Kenneth Miguel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gonzalez, Sigifredo [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schindelholz, Eric John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sorensen, Neil R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Yang, Benjamin Bing-Yeh [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    The continued exponential growth of photovoltaic technologies paves a path to a solar-powered world, but requires continued progress toward low-cost, high-reliability, high-performance photovoltaic (PV) systems. High reliability is an essential element in achieving low-cost solar electricity by reducing operation and maintenance (O&M) costs and extending system lifetime and availability, but these attributes are difficult to verify at the time of installation. Utilities, financiers, homeowners, and planners are demanding this information in order to evaluate their financial risk as a prerequisite to large investments. Reliability research and development (R&D) is needed to build market confidence by improving product reliability and by improving predictions of system availability, O&M cost, and lifetime. This project is focused on understanding, predicting, and improving the reliability of PV systems. The two areas being pursued include PV arc-fault and ground fault issues, and inverter reliability.

  2. Fast and accurate edge orientation processing during object manipulation

    Science.gov (United States)

    Flanagan, J Randall; Johansson, Roland S

    2018-01-01

    Quickly and accurately extracting information about a touched object’s orientation is a critical aspect of dexterous object manipulation. However, the speed and acuity of tactile edge orientation processing with respect to the fingertips as reported in previous perceptual studies appear inadequate in these respects. Here we directly establish the tactile system’s capacity to process edge-orientation information during dexterous manipulation. Participants extracted tactile information about edge orientation very quickly, using it within 200 ms of first touching the object. Participants were also strikingly accurate. With edges spanning the entire fingertip, edge-orientation resolution was better than 3° in our object manipulation task, which is several times better than reported in previous perceptual studies. Performance remained impressive even with edges as short as 2 mm, consistent with our ability to precisely manipulate very small objects. Taken together, our results radically redefine the spatial processing capacity of the tactile system. PMID:29611804

  3. On the reliability of finite element solutions

    International Nuclear Information System (INIS)

    Prasad, K.S.R.K.

    1975-01-01

    The extent to which finite element solutions can be relied upon for the analysis of nuclear reactor structures, and of reactor vessels in particular, is examined, and the need for the engineer to guard against pitfalls that may arise from both the physical and the mathematical models is highlighted. A systematic way of checking the model to obtain reasonably accurate solutions is presented. Quite often sophisticated elements are suggested for specific design and stress concentration problems. The desirability or otherwise of these elements, and their scope and utility vis-a-vis the use of a large stack of conventional elements, are discussed from the viewpoint of the stress analyst. Methods of obtaining a check on the reliability of finite element solutions, either through modelling changes or through an extrapolation technique, are discussed. (author)

  4. MOV reliability evaluation and periodic verification scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Bunte, B.D.

    1996-12-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long-term reliability of gate or globe motor-operated valves (MOVs). The methodology in this position paper determines the nominal (best-estimate) design margin for any MOV based on the best available information pertaining to the MOV's design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety-related MOVs.
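    The comparison of nominal margin to its uncertainty described above can be illustrated with a simple normal-distribution model: if the design margin has best-estimate mean μ and standard deviation σ, the probability that the margin remains positive is Φ(μ/σ). This is a generic sketch of that idea, not the paper's actual statistical procedure, and the numbers are made up:

```python
import math

def margin_reliability(mean_margin: float, margin_sd: float) -> float:
    """P(margin > 0) for a normally distributed design margin:
    Phi(mu / sigma), computed via the error function."""
    z = mean_margin / margin_sd
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative: 20% nominal thrust margin with 10% (1-sigma) uncertainty
r = margin_reliability(20.0, 10.0)  # z = 2, so r is about 0.977
```

A valve whose nominal margin is twice its uncertainty would, under this model, close successfully about 97.7% of the time; trending μ and σ over successive test intervals is what drives the verification schedule.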

  5. MOV reliability evaluation and periodic verification scheduling

    International Nuclear Information System (INIS)

    Bunte, B.D.

    1996-01-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long-term reliability of gate or globe motor-operated valves (MOVs). The methodology in this position paper determines the nominal (best-estimate) design margin for any MOV based on the best available information pertaining to the MOV's design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety-related MOVs.

  6. Integrated Markov-neural reliability computation method: A case for multiple automated guided vehicle system

    International Nuclear Information System (INIS)

    Fazlollahtabar, Hamed; Saidi-Mehrabad, Mohammad; Balakrishnan, Jaydeep

    2015-01-01

    This paper proposes an integrated Markovian and back-propagation neural network approach to computing the reliability of a system. Since the states in which failures occur are significant elements for accurate reliability computation, a Markovian reliability assessment method is designed. Owing to drawbacks of the Markovian model for steady-state reliability computation and of the neural network for the initial training pattern, an integration called Markov-neural is developed and evaluated. Comparative analyses are performed to show the efficiency of the proposed approach. For managerial implications, an application case for multiple automated guided vehicles (AGVs) in manufacturing networks is conducted. - Highlights: • An integrated Markovian and back-propagation neural network approach to computing reliability. • A Markovian-based reliability assessment method. • Managerial implications are shown in an application case for multiple AGVs in manufacturing networks.
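    The Markovian side of such an approach can be illustrated with its simplest instance: a two-state (up/down) Markov model with failure rate λ and repair rate μ, whose steady-state availability is μ/(λ+μ). The paper's actual model and the neural-network integration are more involved; this sketch just iterates the discretized chain to steady state and checks it against the analytic result, with made-up AGV rates:

```python
def steady_state_availability(fail_rate: float, repair_rate: float,
                              dt: float = 1e-3, steps: int = 200_000) -> float:
    """Iterate the discretized two-state Markov chain (up <-> down)
    until it settles, then return the steady-state P(up)."""
    p_up = 1.0  # start in the working state
    for _ in range(steps):
        p_down = 1.0 - p_up
        p_up += dt * (repair_rate * p_down - fail_rate * p_up)
    return p_up

# Illustrative AGV: fails once per 100 h, repaired in 2 h on average
lam, mu = 1 / 100, 1 / 2
a_iter = steady_state_availability(lam, mu)
a_exact = mu / (lam + mu)  # analytic steady-state availability
```

The fixed point of the iteration satisfies the balance equation μ·P(down) = λ·P(up) exactly, so the iterative and analytic answers agree; in a multi-AGV network the same balance equations are solved over a much larger state space.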

  7. Assessing Ambiguity of Context Data in Intelligent Environments: Towards a More Reliable Context Managing System

    Directory of Open Access Journals (Sweden)

    Diego López-de-Ipiña

    2012-04-01

    Modeling and managing the user context correctly in Smart Environments is important in order to achieve robust and reliable systems. When modeling reality we must take into account its ambiguous nature. By considering the uncertainty and vagueness in context data it is possible to attain a more precise picture of the environment, thus leading to a more accurate inference process. To achieve these goals we present an ontology that models the ambiguity in intelligent environments and a data fusion and inference process that takes advantage of that extra information to provide better results. Our system can assess the certainty of the captured measurements, discarding the unreliable ones and combining the rest into a unified vision of the current user context. It also models the vagueness of the system, combining it with the uncertainty to obtain a richer inference process.

  8. LocARNA-P: Accurate boundary prediction and improved detection of structural RNAs

    DEFF Research Database (Denmark)

    Will, Sebastian; Joshi, Tejal; Hofacker, Ivo L.

    2012-01-01

    Current genomic screens for noncoding RNAs (ncRNAs) predict a large number of genomic regions containing potential structural ncRNAs. The analysis of these data requires highly accurate prediction of ncRNA boundaries and discrimination of promising candidate ncRNAs from weak predictions. Existing methods struggle with these goals because they rely on sequence-based multiple sequence alignments, which regularly misalign RNA structure and therefore do not support identification of structural similarities. To overcome this limitation, we compute columnwise and global reliabilities of alignments based on sequence and structure similarity; we refer to these structure-based alignment reliabilities as STARs. The columnwise STARs of alignments, or STAR profiles, provide a versatile tool for the manual and automatic analysis of ncRNAs. In particular, we improve the boundary prediction of the widely used nc...

  9. Integrating generation and transmission networks reliability for unit commitment solution

    International Nuclear Information System (INIS)

    Jalilzadeh, S.; Shayeghi, H.; Hadadian, H.

    2009-01-01

    This paper presents a new method that integrates generation and transmission network reliability into the solution of the unit commitment (UC) problem. In order to achieve a more accurate assessment of the system reserve requirement, the unavailability of transmission lines is taken into account in addition to the unavailability of generation units. The required spinning reserve (SR) capacity is evaluated by applying reliability constraints based on the loss-of-load probability and expected energy not supplied (EENS) indices. Calculation of these parameters is accomplished by a novel procedure based on linear programming, which also minimizes them to achieve the optimum level of SR capacity and, consequently, a cost-benefit reliability-constrained UC schedule. In addition, a powerful solution technique called the integer-coded genetic algorithm (ICGA) is used to solve the proposed method. Numerical results on the IEEE reliability test system show that the consideration of transmission network unavailability has an important influence on the reliability indices of the UC schedules.
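    The reliability indices named here can be illustrated with a toy capacity-outage table: enumerating unit on/off states weighted by forced-outage rates gives the probability of each available-capacity level, from which loss-of-load probability (LOLP) and EENS follow directly. This is a minimal sketch with made-up units, not the paper's linear-programming formulation:

```python
from itertools import product

def outage_table(units):
    """units: list of (capacity_MW, forced_outage_rate).
    Returns {available_capacity: probability} over all on/off combinations."""
    table = {}
    for states in product([0, 1], repeat=len(units)):
        cap, prob = 0.0, 1.0
        for up, (c, q) in zip(states, units):
            cap += c * up
            prob *= (1 - q) if up else q
        table[cap] = table.get(cap, 0.0) + prob
    return table

def lolp_and_eens(units, load_mw, hours=1.0):
    """LOLP = P(available capacity < load);
    EENS = expected shortfall (MW) times the exposure time (h)."""
    table = outage_table(units)
    short = [(cap, p) for cap, p in table.items() if cap < load_mw]
    lolp = sum(p for _, p in short)
    eens = hours * sum(p * (load_mw - cap) for cap, p in short)
    return lolp, eens

# Illustrative: a 100 MW unit (FOR 0.1) and a 50 MW unit (FOR 0.2), 120 MW load
lolp, eens = lolp_and_eens([(100, 0.1), (50, 0.2)], 120)
```

With these two units the four capacity states are 150, 100, 50, and 0 MW; the three deficient states give LOLP = 0.28 and EENS = 11.6 MWh per hour of exposure. Adding transmission outages, as the paper does, enlarges the state space in the same way.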

  10. Chip-Level Electromigration Reliability for Cu Interconnects

    International Nuclear Information System (INIS)

    Gall, M.; Oh, C.; Grinshpon, A.; Zolotov, V.; Panda, R.; Demircan, E.; Mueller, J.; Justison, P.; Ramakrishna, K.; Thrasher, S.; Hernandez, R.; Herrick, M.; Fox, R.; Boeck, B.; Kawasaki, H.; Haznedar, H.; Ku, P.

    2004-01-01

    Even after the successful introduction of Cu-based metallization, the electromigration (EM) failure risk has remained one of the most important reliability concerns for most advanced process technologies. Ever-increasing operating current densities and the introduction of low-k materials in the backend process scheme are some of the issues that threaten reliable, long-term operation at elevated temperatures. The traditional method of verifying EM reliability only through current density limit checks is proving to be inadequate in general, or quite expensive at best. A Statistical EM Budgeting (SEB) methodology has been proposed to assess more realistic chip-level EM reliability from the complex statistical distribution of currents in a chip. To be valuable, this approach requires accurate estimation of currents for all interconnect segments in a chip. However, no efficient technique to manage the complexity of such a task for very large chip designs is known. We present an efficient method to estimate currents exhaustively for all interconnects in a chip. The proposed method uses pre-characterization of cells and macros, and steps to identify and filter out symmetrically bi-directional interconnects. We illustrate the strength of the proposed approach using a high-performance microprocessor design for embedded applications as a case study

  11. Reliability and Measurement Error of Tensiomyography to Assess Mechanical Muscle Function: A Systematic Review.

    Science.gov (United States)

    Martín-Rodríguez, Saúl; Loturco, Irineu; Hunter, Angus M; Rodríguez-Ruiz, David; Munguia-Izquierdo, Diego

    2017-12-01

    Martín-Rodríguez, S, Loturco, I, Hunter, AM, Rodríguez-Ruiz, D, and Munguia-Izquierdo, D. Reliability and measurement error of tensiomyography to assess mechanical muscle function: A systematic review. J Strength Cond Res 31(12): 3524-3536, 2017-Interest in studying mechanical skeletal muscle function through tensiomyography (TMG) has increased in recent years. This systematic review aimed to (a) report the reliability and measurement error of all TMG parameters (i.e., maximum radial displacement of the muscle belly [Dm], contraction time [Tc], delay time [Td], half-relaxation time [½ Tr], and sustained contraction time [Ts]) and (b) provide critical reflection on how to perform accurate and appropriate measurements for informing clinicians, exercise professionals, and researchers. A comprehensive literature search was performed of the Pubmed, Scopus, Science Direct, and Cochrane databases up to July 2017. Eight studies were included in this systematic review. Meta-analysis could not be performed because of the low quality of the evidence of some studies evaluated. Overall, the review of the studies, involving 158 participants, revealed high relative reliability (intraclass correlation coefficient [ICC]) for Dm (0.91-0.99); moderate-to-high ICC for Ts (0.80-0.96), Tc (0.70-0.98), and ½ Tr (0.77-0.93); and low-to-high ICC for Td (0.60-0.98), independently of the evaluated muscles. In addition, absolute reliability (coefficient of variation [CV]) was low for all TMG parameters except for ½ Tr (CV = >20%), whereas measurement error indexes were high for this parameter. In conclusion, this study indicates that 3 of the TMG parameters (Dm, Td, and Tc) are highly reliable, whereas ½ Tr demonstrates insufficient reliability, and thus should not be used in future studies.
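    The two reliability indices reported above can be computed from a test-retest matrix. A minimal sketch of a two-way random-effects, absolute-agreement, single-measure ICC (the ICC(2,1) form of the usual ANOVA decomposition) and the coefficient of variation; whether the reviewed studies used this exact ICC form varies by study, and the Dm values below are invented for illustration:

```python
def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single measure.
    data: one list per subject, each holding k repeated measurements."""
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(row[j] for row in data) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for row in data for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # trials/raters
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

def cv_percent(values):
    """Coefficient of variation: sample SD as a percentage of the mean."""
    m = sum(values) / len(values)
    sd = (sum((v - m) ** 2 for v in values) / (len(values) - 1)) ** 0.5
    return 100.0 * sd / m

# Illustrative test-retest Dm values (mm) for 4 participants, 2 trials each
dm = [[9.8, 10.1], [7.2, 7.0], [11.5, 11.3], [8.4, 8.6]]
icc = icc_2_1(dm)
```

Large between-subject spread with small within-subject differences yields an ICC near 1, matching the high relative reliability the review reports for Dm; the CV captures absolute trial-to-trial variation for a single participant.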

  12. Validity and reliability testing of two instruments to measure breast cancer patients' concerns and information needs relating to radiation therapy

    International Nuclear Information System (INIS)

    Halkett, Georgia KB; Kristjanson, Linda J

    2007-01-01

    It is difficult to determine the most effective approach to patient education, or to tailor education interventions for patients in radiotherapy, without tools that assess patients' specific radiation therapy information needs and concerns. Therefore, the aim of this study was to develop psychometrically sound tools to adequately determine the concerns and information needs of cancer patients during radiation therapy. Two tools were developed to (1) determine patients' concerns about radiation therapy (RT Concerns Scale) and (2) ascertain patients' information needs at different time points during their radiation therapy (RT Information Needs Scale). The tools were based on previous research by the authors, published literature on breast cancer and radiation therapy, and information behaviour research. Thirty-one breast cancer patients completed the questionnaire on one occasion and thirty participants completed the questionnaire on a second occasion to facilitate test-retest reliability. One participant's responses were removed from the analysis. Results were analysed for content validity, internal consistency and stability over time. Both tools demonstrated high internal consistency and adequate stability over time. The nine items in the RT Concerns Scale were retained because they met all pre-set psychometric criteria. Two items were deleted from the RT Information Needs Scale because they did not meet content validity criteria and did not achieve pre-specified criteria for internal consistency. This tool now contains 22 items. This paper provides preliminary data suggesting that the two tools presented are reliable and valid and would be suitable for use in trials or in the clinical setting

  13. Accurate quasiparticle calculation of x-ray photoelectron spectra of solids.

    Science.gov (United States)

    Aoki, Tsubasa; Ohno, Kaoru

    2018-05-31

    It has been highly desired to provide an accurate and reliable method to calculate core electron binding energies (CEBEs) of crystals and to understand the final-state screening effect on a core hole in high-resolution x-ray photoelectron spectroscopy (XPS), because the ΔSCF method cannot be simply used for bulk systems. We propose to use the quasiparticle calculation based on many-body perturbation theory for this problem. In this study, CEBEs of band-gapped crystals, silicon, diamond, β-SiC, BN, and AlP, are investigated by means of the GW approximation (GWA) using the full ω integration and compared with the preexisting XPS data. The screening effect on a deep core hole is also investigated in detail by evaluating the relaxation energy (RE) from the core and valence contributions separately. Calculated results show that not only the valence electrons but also the core electrons make an important contribution to the RE, and that the GWA has a tendency to underestimate CEBEs due to the excess RE. This underestimation can be improved by introducing the self-screening correction to the GWA. The resulting C1s, B1s, N1s, Si2p, and Al2p CEBEs are in excellent agreement with the experiments within a 1 eV absolute error range. The present self-screening-corrected GW approach has the capability to achieve highly accurate prediction of CEBEs without any empirical parameter for band-gapped crystals, and provides a more reliable theoretical approach than the conventional ΔSCF-DFT method.

  15. ISO 14000 information platform. Quarterly report, January 1--March 31, 1998

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-09-01

    GETE, through its ISO 14000 Information Platform, globeNet™ (www.iso14000.net), seeks to provide FETC and the US Department of Energy (DOE) with a timely, accurate, and reliable information resource on the issue of environmental management standards. This resource allows FETC to keep track of the latest developments in the ISO 14000 arena, in DOE as well as throughout the federal complex and industry. Utilizing this information, FETC and DOE can make critical decisions on their own use and implementation of the ISO 14000 standards. The information platform also provides FETC with a forum to present information on its activities, such as the DOE ISO 14000 Pilot Projects, and as an information gathering and dissemination resource for its publications. GETE works with an array of national and international contacts and content sources to gather information and provide daily updates on ISO 14000 and related environmental management issues.

  16. Accident Sequence Evaluation Program: Human reliability analysis procedure

    International Nuclear Information System (INIS)

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the "ASEP HRA Procedure," is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs.

  17. Identification of reliable gridded reference data for statistical downscaling methods in Alberta

    Science.gov (United States)

    Eum, H. I.; Gupta, A.

    2017-12-01

    Climate models provide essential information for assessing the impacts of climate change at regional and global scales. However, statistical downscaling methods must be applied to prepare climate model data for applications such as hydrologic and ecologic modelling at the watershed scale. As the reliability and (spatial and temporal) resolution of statistically downscaled climate data mainly depend on the reference data, identifying the most reliable reference data is crucial for statistical downscaling. A growing number of gridded climate products are available for the key climate variables that are the main inputs to regional modelling systems. However, inconsistencies among these climate products, for example different combinations of climate variables, varying data domains and data lengths, and accuracy that varies with the physiographic characteristics of the landscape, have caused significant challenges in selecting the most suitable reference climate data for various environmental studies and modelling. Employing various observation-based daily gridded climate products available in the public domain, namely thin-plate-spline regression products (ANUSPLIN and TPS), an inverse-distance method (Alberta Townships), a numerical climate model (North American Regional Reanalysis), and an optimum interpolation technique (Canadian Precipitation Analysis), this study evaluates the accuracy of the climate products at each grid point by comparison with the Adjusted and Homogenized Canadian Climate Data (AHCCD) observations for precipitation and minimum and maximum temperature over the province of Alberta. Based on the performance of the climate products at AHCCD stations, we ranked the reliability of these publicly available climate products with respect to station elevation, discretized into several classes. According to the rank of climate products for each elevation class, we identified the most reliable climate products based on the elevation of target points. A web-based system
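    The station-based evaluation described here amounts to scoring each gridded product against AHCCD observations at co-located points and ranking the products. A schematic sketch: the product names match those in the abstract, but the series and the use of RMSE as the skill metric are placeholders for whatever scoring the study actually used:

```python
def rmse(obs, est):
    """Root-mean-square error between paired series."""
    return (sum((o - e) ** 2 for o, e in zip(obs, est)) / len(obs)) ** 0.5

def rank_products(station_obs, product_series):
    """station_obs: observed values at one AHCCD station.
    product_series: {product_name: co-located gridded series}.
    Returns product names ordered best (lowest RMSE) first."""
    scores = {name: rmse(station_obs, series)
              for name, series in product_series.items()}
    return sorted(scores, key=scores.get)

# Placeholder daily precipitation (mm) at one station
obs = [0.0, 2.1, 5.4, 0.0, 1.2]
ranking = rank_products(obs, {
    "ANUSPLIN": [0.1, 2.0, 5.0, 0.2, 1.1],
    "NARR":     [1.0, 3.5, 2.0, 0.0, 2.5],
    "CaPA":     [0.0, 2.2, 5.5, 0.1, 1.3],
})
```

Repeating this per station, then aggregating the ranks within each elevation class, yields the elevation-dependent recommendation the study describes.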

  18. Geometric optimisation of an accurate cosine correcting optic fibre coupler for solar spectral measurement

    Science.gov (United States)

    Cahuantzi, Roberto; Buckley, Alastair

    2017-09-01

    Making accurate and reliable measurements of solar irradiance is important for understanding performance in the photovoltaic energy sector. In this paper, we present design details and performance of a number of fibre optic couplers for use in irradiance measurement systems employing remote light sensors, applicable to either spectrally resolved or broadband measurement. The angular and spectral characteristics of different coupler designs are characterised and compared with existing state-of-the-art commercial technology. The new coupler designs are fabricated from polytetrafluoroethylene (PTFE) rods and operate through forward scattering of incident sunlight on the front surfaces of the structure into an optic fibre located in a cavity to the rear of the structure. The PTFE couplers exhibit up to 4.8% variation in scattered transmission intensity between 425 nm and 700 nm and show minimal specular reflection, making the designs accurate and reliable over the visible region. Through careful geometric optimisation, near-perfect cosine dependence of the coupler's angular response can be achieved. The PTFE designs represent a significant improvement over the state of the art, with less than 0.01% error compared with the ideal cosine response for angles of incidence up to 50°.
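    The figure of merit quoted (deviation from the ideal cosine response up to 50°) can be computed for any measured angular-response curve. A sketch with a synthetic response curve, assuming the error is the maximum relative deviation of the normalized response from cos(θ); the paper may define its error metric differently:

```python
import math

def max_cosine_error(angles_deg, response, limit_deg=50.0):
    """Maximum relative deviation of a normalized angular response
    from the ideal cos(theta), over angles up to limit_deg."""
    worst = 0.0
    for a, r in zip(angles_deg, response):
        if a > limit_deg:
            continue
        ideal = math.cos(math.radians(a))
        worst = max(worst, abs(r - ideal) / ideal)
    return worst

# Synthetic response: ideal cosine with a tiny droop growing at high angles
angles = list(range(0, 81, 10))
resp = [math.cos(math.radians(a)) * (1 - 1e-4 * (a / 50) ** 2) for a in angles]
err = max_cosine_error(angles, resp)  # relative error within 0-50 degrees
```

For the synthetic curve the droop reaches 0.01% at 50°, so the reported error is 1e-4; feeding in goniometer data for a real coupler gives the paper's figure of merit directly.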

  20. A Study of the DeLone & McLean Information System Success Model Among Users of the Accurate Accounting Information System in Sukabumi City

    OpenAIRE

    Hudin, Jamal Maulana; Riana, Dwiza

    2016-01-01

    The Accurate accounting information system is one of the accounting information systems used in six companies in the city of Sukabumi. The DeLone and McLean information system success model is a suitable model for measuring the success of the application of information systems in an organization or company. This study analyzes the factors of the DeLone & McLean information system success model that measure success among users of the Accurate accounting information system in six companies in the city of Sukabumi. ...

  1. Fuel reliability experience in Finland

    International Nuclear Information System (INIS)

    Kekkonen, L.

    2015-01-01

    Four nuclear reactors have now operated in Finland for 35-38 years. The two VVER-440 units at the Loviisa Nuclear Power Plant are operated by Fortum, and the two BWRs at Olkiluoto are operated by Teollisuuden Voima Oyj (TVO). The fuel reliability experience of the four reactors currently operating in Finland has been very good, and fuel failure rates have been very low. Systematic inspection of spent fuel assemblies, and especially of all failed assemblies, is a good practice employed in Finland to improve fuel reliability and operational safety. Investigation of the root cause of fuel failures is important in developing ways to prevent similar failures in the future. The operational and fuel reliability experience at the Loviisa Nuclear Power Plant has also been reported earlier in the international seminars on WWER Fuel Performance, Modelling and Experimental Support. In this paper the information on fuel reliability experience at Loviisa NPP is updated, and a short summary of the fuel reliability experience at Olkiluoto NPP is given. Keywords: VVER-440, fuel reliability, operational experience, poolside inspections, fuel failure identification. (author)

  2. Analysis of Parking Reliability Guidance of Urban Parking Variable Message Sign System

    OpenAIRE

    Zhenyu Mei; Ye Tian; Dongping Li

    2012-01-01

    Operators of parking guidance and information systems (PGIS) often encounter difficulty in determining when and how to provide reliable car park availability information to drivers. Reliability has become a key factor to ensure the benefits of urban PGIS. The present paper is the first to define the guiding parking reliability of urban parking variable message signs (VMSs). By analyzing the parking choice under guiding and optional parking lots, a guiding parking reliability model was constru...

  3. A method of bias correction for maximal reliability with dichotomous measures.

    Science.gov (United States)

    Penev, Spiridon; Raykov, Tenko

    2010-02-01

    This paper is concerned with the reliability of weighted combinations of a given set of dichotomous measures. Maximal reliability for such measures has been discussed in the past, but the pertinent estimator exhibits a considerable bias and mean squared error for moderate sample sizes. We examine this bias, propose a procedure for bias correction, and develop a more accurate asymptotic confidence interval for the resulting estimator. In most empirically relevant cases, the bias correction and mean squared error correction can be performed simultaneously. We propose an approximate (asymptotic) confidence interval for the maximal reliability coefficient, discuss the implementation of this estimator, and investigate the mean squared error of the associated asymptotic approximation. We illustrate the proposed methods using a numerical example.

  4. High accurate time system of the Low Latitude Meridian Circle.

    Science.gov (United States)

    Yang, Jing; Wang, Feng; Li, Zhiming

    In order to obtain a highly accurate time signal for the Low Latitude Meridian Circle (LLMC), a new GPS-based accurate time system was developed, which includes a GPS receiver, a 1 MHz frequency source, and a self-made clock system. The one-second signal from GPS is used to synchronize the clock system, and the information is collected automatically by a computer. This system overcomes the difficulty of dispensing with a manual time keeper.

  5. A self-interaction-free local hybrid functional: Accurate binding energies vis-à-vis accurate ionization potentials from Kohn-Sham eigenvalues

    International Nuclear Information System (INIS)

    Schmidt, Tobias; Kümmel, Stephan; Kraisler, Eli; Makmal, Adi; Kronik, Leeor

    2014-01-01

    We present and test a new approximation for the exchange-correlation (xc) energy of Kohn-Sham density functional theory. It combines exact exchange with a compatible non-local correlation functional. The functional is by construction free of one-electron self-interaction, respects constraints derived from uniform coordinate scaling, and has the correct asymptotic behavior of the xc energy density. It contains one parameter that is not determined ab initio. We investigate whether it is possible to construct a functional that yields accurate binding energies and affords other advantages, specifically Kohn-Sham eigenvalues that reliably reflect ionization potentials. Tests for a set of atoms and small molecules show that within our local-hybrid form accurate binding energies can be achieved by proper optimization of the free parameter in our functional, along with an improvement in dissociation energy curves and in Kohn-Sham eigenvalues. However, the correspondence of the latter to experimental ionization potentials is not yet satisfactory, and if we choose to optimize their prediction, a rather different value of the functional's parameter is obtained. We put this finding in a larger context by discussing similar observations for other functionals and possible directions for further functional development that our findings suggest.

  6. Photovoltaic Module Reliability Workshop 2010: February 18-19, 2010

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, J.

    2013-11-01

    NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology--both critical goals for moving PV technologies deeper into the electricity marketplace.

  7. Photovoltaic Module Reliability Workshop 2011: February 16-17, 2011

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.

    2013-11-01

    NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology--both critical goals for moving PV technologies deeper into the electricity marketplace.

  8. Photovoltaic Module Reliability Workshop 2013: February 26-27, 2013

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.

    2013-10-01

    NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology--both critical goals for moving PV technologies deeper into the electricity marketplace.

  9. Photovoltaic Module Reliability Workshop 2014: February 25-26, 2014

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.

    2014-02-01

    NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology--both critical goals for moving PV technologies deeper into the electricity marketplace.

  10. Reliability of Soft Tissue Model Based Implant Surgical Guides; A Methodological Mistake.

    Science.gov (United States)

    Sabour, Siamak; Dastjerdi, Elahe Vahid

    2012-08-20

    We were interested to read the paper by Maney P and colleagues published in the July 2012 issue of J Oral Implantol. The authors aimed to assess the reliability of soft tissue model based implant surgical guides and reported that accuracy was evaluated using software.1 We found the manuscript title of Maney P, et al. incorrect and misleading. Moreover, they reported that twenty-two sites (46.81%) were considered accurate (13 of 24 maxillary and 9 of 23 mandibular sites). As the authors point out in their conclusion, soft tissue models do not always provide sufficient accuracy for implant surgical guide fabrication. Reliability (precision) and validity (accuracy) are two different methodological issues in research. Sensitivity, specificity, PPV, NPV, the positive likelihood ratio (true positive rate/false positive rate) and the negative likelihood ratio (false negative rate/true negative rate), as well as the odds ratio (true results/false results, preferably more than 50), are among the measures used to evaluate the validity (accuracy) of a single test compared to a gold standard.2-4 It is not clear to which of the above-mentioned estimates the reported twenty-two accurate sites (46.81%) relate. Reliability (repeatability or reproducibility) is often assessed with statistical tests such as Pearson's r, least squares, and the paired t-test, all of which are common mistakes in reliability analysis.5 Briefly, for quantitative variables the intraclass correlation coefficient (ICC) and for qualitative variables weighted kappa should be used, with the caution that kappa has its own limitations too. Regarding reliability or agreement, it is good to know that in computing the kappa value only the concordant cells are considered, whereas the discordant cells should also be taken into account in order to reach a correct estimate of agreement (weighted kappa).2-4 As a take-home message, for reliability and validity analysis, appropriate tests should be used.
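    The letter's point about concordant versus discordant cells can be made concrete. A minimal sketch of Cohen's kappa for two raters on a k x k table (counts are hypothetical): crude agreement uses only the diagonal cells, while kappa additionally subtracts the agreement expected by chance:

```python
def cohens_kappa(table):
    """Cohen's kappa for a square two-rater agreement table."""
    n = sum(sum(row) for row in table)
    k = len(table)
    p_o = sum(table[i][i] for i in range(k)) / n             # observed (crude) agreement
    rows = [sum(table[i][j] for j in range(k)) for i in range(k)]
    cols = [sum(table[i][j] for i in range(k)) for j in range(k)]
    p_e = sum(rows[i] * cols[i] for i in range(k)) / n ** 2  # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical counts: diagonal = concordant, off-diagonal = discordant.
agreement_table = [[40, 5],
                   [10, 45]]
```

    Here crude agreement is 85%, yet kappa is only 0.7, because half of the agreement would be expected by chance alone; weighted kappa extends this by giving partial credit to near-misses in the discordant cells.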

  11. A Deep Learning Framework for Robust and Accurate Prediction of ncRNA-Protein Interactions Using Evolutionary Information.

    Science.gov (United States)

    Yi, Hai-Cheng; You, Zhu-Hong; Huang, De-Shuang; Li, Xiao; Jiang, Tong-Hai; Li, Li-Ping

    2018-06-01

    The interactions between non-coding RNAs (ncRNAs) and proteins play an important role in many biological processes, and their biological functions are primarily achieved by binding with a variety of proteins. High-throughput biological techniques are used to identify protein molecules bound with specific ncRNA, but they are usually expensive and time consuming. Deep learning provides a powerful solution to computationally predict RNA-protein interactions. In this work, we propose the RPI-SAN model by using the deep-learning stacked auto-encoder network to mine the hidden high-level features from RNA and protein sequences and feed them into a random forest (RF) model to predict ncRNA binding proteins. Stacked assembling is further used to improve the accuracy of the proposed method. Four benchmark datasets, including RPI2241, RPI488, RPI1807, and NPInter v2.0, were employed for the unbiased evaluation of five established prediction tools: RPI-Pred, IPMiner, RPISeq-RF, lncPro, and RPI-SAN. The experimental results show that our RPI-SAN model achieves much better performance than the other methods, with accuracies of 90.77%, 89.7%, 96.1%, and 99.33%, respectively. It is anticipated that RPI-SAN can be used as an effective computational tool for future biomedical research and can accurately predict potential ncRNA-protein interacting pairs, which provides reliable guidance for biological research. Copyright © 2018 The Author(s). Published by Elsevier Inc. All rights reserved.

  12. The importance of accurate meteorological input fields and accurate planetary boundary layer parameterizations, tested against ETEX-1

    International Nuclear Information System (INIS)

    Brandt, J.; Ebel, A.; Elbern, H.; Jakobs, H.; Memmesheimer, M.; Mikkelsen, T.; Thykier-Nielsen, S.; Zlatev, Z.

    1997-01-01

    Atmospheric transport of air pollutants is, in principle, a well understood process. If information about the state of the atmosphere is given in all details (infinitely accurate information about wind speed, etc.) and infinitely fast computers are available, then the advection equation could in principle be solved exactly. This is, however, not the case: discretization of the equations and input data introduces some uncertainties and errors in the results. Therefore many different issues have to be carefully studied in order to diminish these uncertainties and to develop an accurate transport model. Some of these are e.g. the numerical treatment of the transport equation, the accuracy of the mean meteorological input fields, and parameterizations of sub-grid scale phenomena (such as parameterizations of the 2nd- and higher-order turbulence terms in order to reach closure in the perturbation equation). A tracer model for studying transport and dispersion of air pollution caused by a single but strong source is under development. The model simulations from the first ETEX release illustrate the differences caused by using various analyzed fields directly in the tracer model or using a meteorological driver. Also different parameterizations of the mixing height and the vertical exchange are compared. (author)
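    The point that discretization itself introduces error can be illustrated with a toy one-dimensional advection solver. The first-order upwind scheme below conserves mass on a periodic grid but numerically diffuses the tracer peak, even though the continuous advection equation would transport it unchanged; the grid, Courant number, and initial condition are illustrative, not from the ETEX study:

```python
import numpy as np

def advect_upwind(c0, courant, steps):
    """First-order upwind advection on a periodic 1-D grid.

    courant = u * dt / dx; stable for 0 <= courant <= 1 with u > 0.
    """
    c = np.array(c0, dtype=float)
    for _ in range(steps):
        c = c - courant * (c - np.roll(c, 1))  # upwind difference for u > 0
    return c

initial = np.zeros(20)
initial[5] = 1.0                               # a point release of tracer
final = advect_upwind(initial, courant=0.5, steps=10)
```

    After ten steps the total tracer mass is unchanged, but the peak concentration has dropped: that loss of sharpness is purely a discretization artifact of the kind the abstract warns about.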

  13. Reliability analysis and operator modelling

    International Nuclear Information System (INIS)

    Hollnagel, Erik

    1996-01-01

    The paper considers the state of operator modelling in reliability analysis. Operator models are needed in reliability analysis because operators are needed in process control systems. HRA methods must therefore be able to account both for human performance variability and for the dynamics of the interaction. A selected set of first-generation HRA approaches is briefly described in terms of the operator model they use, their classification principle, and the actual method they propose. In addition, two examples of second-generation methods are also considered. It is concluded that first-generation HRA methods generally have very simplistic operator models, referring either to the time-reliability relationship or to elementary information processing concepts. It is argued that second-generation HRA methods must recognise that cognition is embedded in a context, and be able to account for that in the way human reliability is analysed and assessed.

  14. Reliability and Maintainability (RAM) Training

    Science.gov (United States)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Packard, Michael H. (Editor)

    2000-01-01

    The theme of this manual is failure physics - the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low-cost reliable products. In a broader sense the manual should do more. It should underscore the urgent need for mature attitudes toward reliability. Five of the chapters were originally presented as a classroom course to over 1000 Martin Marietta engineers and technicians. Another four chapters and three appendixes have been added. We begin with a view of reliability from the years 1940 to 2000. Chapter 2 starts the training material with a review of mathematics and a description of what elements contribute to product failures. The remaining chapters elucidate basic reliability theory and the disciplines that allow us to control and eliminate failures.

  15. Isokinetic Strength and Endurance Tests used Pre- and Post-Spaceflight: Test-Retest Reliability

    Science.gov (United States)

    Laughlin, Mitzi S.; Lee, Stuart M. C.; Loehr, James A.; Amonette, William E.

    2009-01-01

    To assess changes in muscular strength and endurance after microgravity exposure, NASA measures isokinetic strength and endurance across multiple sessions before and after long-duration space flight. Accurate interpretation of pre- and post-flight measures depends upon the reliability of each measure. The purpose of this study was to evaluate the test-retest reliability of the NASA International Space Station (ISS) isokinetic protocol. Twenty-four healthy subjects (12 M/12 F, 32.0 +/- 5.6 years) volunteered to participate. Isokinetic knee, ankle, and trunk flexion and extension strength as well as endurance of the knee flexors and extensors were measured using a Cybex NORM isokinetic dynamometer. The first weekly session was considered a familiarization session. Data were collected and analyzed for weeks 2-4. Repeated measures analysis of variance (alpha=0.05) was used to identify weekly differences in isokinetic measures. Test-retest reliability was evaluated by intraclass correlation coefficients (ICC(3,1)). No significant differences were found between weeks in any of the strength measures, and the reliability of the strength measures was considered excellent (ICC greater than 0.9), except for concentric ankle dorsi-flexion (ICC=0.67). Although a significant difference was noted in the weekly endurance measures of knee extension (p less than 0.01), the reliability of the endurance measures by week was considered excellent for knee flexion (ICC=0.97) and knee extension (ICC=0.96). Except for concentric ankle dorsi-flexion, the isokinetic strength and endurance measures are highly reliable when following the NASA ISS protocol. This protocol should allow accurate interpretation of isokinetic data even with a small number of crew members.
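    For reference, ICC(3,1) is the two-way mixed, single-measures, consistency form of the intraclass correlation. A minimal sketch of its computation from the mean squares of a two-way ANOVA; the data matrix below is hypothetical (rows = subjects, columns = weekly sessions), not the study's measurements:

```python
import numpy as np

def icc_3_1(x):
    """ICC(3,1): two-way mixed, single measures, consistency.

    x: (n_subjects, k_sessions) matrix of scores.
    """
    x = np.asarray(x, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ms_subj = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # between subjects
    resid = (x - x.mean(axis=1, keepdims=True)
               - x.mean(axis=0, keepdims=True) + grand)
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))               # residual
    return (ms_subj - ms_err) / (ms_subj + (k - 1) * ms_err)

# Hypothetical scores: session 2 is uniformly 1 unit higher, which the
# consistency form ignores, so agreement is perfect.
sessions = [[10, 11, 10],
            [20, 21, 20],
            [30, 31, 30],
            [40, 41, 40]]
```

    The consistency form discounts a constant session effect (e.g. a uniform learning gain), which is why a familiarization session plus ICC(3,1) is a common pairing in test-retest designs.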

  16. NDE reliability gains from combining eddy-current and ultrasonic testing

    International Nuclear Information System (INIS)

    Horn, D.; Mayo, W.R.

    1999-01-01

    We investigate statistical methods for combining the results of two complementary inspection techniques, eddy-current and ultrasonic testing. The reliability of rejection/acceptance decisions based on combined information is compared with that based on each inspection technique individually. The measured reliability increases with the amount of information incorporated in the decision. (author)
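    One simple way to see why combining techniques helps: if the two inspections respond to a flaw independently, the combined probability of detection (POD) exceeds either technique alone. A sketch under that independence assumption (the POD values are hypothetical, not the paper's measurements):

```python
def combined_pod(pod_et, pod_ut):
    """POD when a flaw is rejected if either eddy-current (ET) or
    ultrasonic (UT) inspection detects it, assuming the two
    inspections respond to the flaw independently."""
    return 1 - (1 - pod_et) * (1 - pod_ut)

# Hypothetical per-technique detection probabilities:
pod_both = combined_pod(0.8, 0.7)
```

    Correlated responses (e.g. both techniques struggling with the same flaw morphology) would reduce the gain below this independent-evidence bound, which is why the paper measures the combined reliability rather than assuming it.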

  17. Evaluation of the reliability of Levine method of wound swab for ...

    African Journals Online (AJOL)

    The aim of this paper is to evaluate the reliability of the Levine swab technique in accurately identifying the microorganisms present in a wound, and to identify the need for further studies in this regard. Methods: A semi-structured questionnaire was administered and physical examination was performed on patients with chronic wounds ...

  18. Inter-Rater Reliability of Historical Data Collected by Non-Medical Research Assistants and Physicians in Patients with Acute Abdominal Pain

    Directory of Open Access Journals (Sweden)

    Mills, Angela M

    2009-02-01

    OBJECTIVES: In many academic emergency departments (EDs), physicians are asked to record clinical data for research that may be time consuming and distracting from patient care. We hypothesized that non-medical research assistants (RAs) could obtain historical information from patients with acute abdominal pain as accurately as physicians. METHODS: Prospective comparative study conducted in an academic ED of 29 RAs and 32 resident physicians (RPs) to assess inter-rater reliability in obtaining historical information from abdominal pain patients. Historical features were independently recorded on standardized data forms by an RA and an RP blinded to each other's answers. Discrepancies were resolved by a third person (RA) who asked the patient to state the correct answer on a third questionnaire, constituting the "criterion standard." Inter-rater reliability was assessed using kappa statistics (kappa) and percent crude agreement (CrA). RESULTS: Sixty-five patients were enrolled (mean age 43). Of 43 historical variables assessed, the median agreement was moderate (kappa 0.59 [interquartile range 0.37-0.69]; CrA 85.9%) and varied across data categories: initial pain location (kappa 0.61 [0.59-0.73]; CrA 87.7%), current pain location (kappa 0.60 [0.47-0.67]; CrA 82.8%), past medical history (kappa 0.60 [0.48-0.74]; CrA 93.8%), associated symptoms (kappa 0.38 [0.37-0.74]; CrA 87.7%), and aggravating/alleviating factors (kappa 0.09 [-0.01-0.21]; CrA 61.5%). When there was disagreement between the RP and the RA, the RA more often agreed with the criterion standard (64% [55-71%]) than the RP did (36% [29-45%]). CONCLUSION: Non-medical research assistants who focus on clinical research are often more accurate than physicians, who may be distracted by patient care responsibilities, at obtaining historical information from ED patients with abdominal pain.

  19. Reliability of the MicroScan WalkAway PC21 panel in identifying and detecting oxacillin resistance in clinical coagulase-negative staphylococci strains.

    Science.gov (United States)

    Olendzki, A N; Barros, E M; Laport, M S; Dos Santos, K R N; Giambiagi-Demarval, M

    2014-01-01

    The purpose of this study was to determine the reliability of the MicroScan WalkAway PosCombo21 (PC21) system for the identification of coagulase-negative staphylococci (CNS) strains and the detection of oxacillin resistance. Using molecular and phenotypic methods, 196 clinical strains were evaluated. The automated system demonstrated 100 % reliability for the identification of the clinical strains Staphylococcus haemolyticus, Staphylococcus hominis and Staphylococcus cohnii; 98.03 % reliability for the identification of Staphylococcus epidermidis; 70 % reliability for the identification of Staphylococcus lugdunensis; 40 % reliability for the identification of Staphylococcus warneri; and 28.57 % reliability for the identification of Staphylococcus capitis, but no reliability for the identification of Staphylococcus auricularis, Staphylococcus simulans and Staphylococcus xylosus. We concluded that the automated system provides accurate results for the more common CNS species but often fails to accurately identify less prevalent species. For the detection of oxacillin resistance, the automated system showed 100 % specificity and 90.22 % sensitivity. Thus, the PC21 panel detects oxacillin-resistant strains, but is limited by the heteroresistance that is observed when using most phenotypic methods.

  20. A fast and reliable method for simultaneous waveform, amplitude and latency estimation of single-trial EEG/MEG data.

    Directory of Open Access Journals (Sweden)

    Wouter D Weeda

    The amplitude and latency of single-trial EEG/MEG signals may provide valuable information concerning human brain functioning. In this article we propose a new method to reliably estimate the single-trial amplitude and latency of EEG/MEG signals. The advantages of the method are fourfold. First, no a priori specified template function is required. Second, the method allows for multiple signals that may vary independently in amplitude and/or latency. Third, the method is less sensitive to noise as it models data with a parsimonious set of basis functions. Finally, the method is very fast since it is based on an iterative linear least squares algorithm. A simulation study shows that the method yields reliable estimates under different levels of latency variation and signal-to-noise ratios. Furthermore, it shows that the existence of multiple signals can be correctly determined. An application to empirical data from a choice reaction time study indicates that the method describes these data accurately.
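    The linear least-squares core of such a method can be sketched in a few lines: given a set of basis waveforms, per-trial amplitudes follow from an ordinary least-squares solve. The basis and trial data below are synthetic, and the latency estimation that the full method handles iteratively is omitted:

```python
import numpy as np

def trial_amplitudes(trials, basis):
    """Least-squares amplitude estimates per trial.

    trials: (n_trials, n_samples) single-trial recordings.
    basis:  (n_basis, n_samples) basis waveforms.
    Returns an (n_trials, n_basis) matrix of amplitudes.
    """
    coef, *_ = np.linalg.lstsq(basis.T, trials.T, rcond=None)
    return coef.T

# Synthetic example: two sinusoidal basis waveforms, two noise-free trials
# built from known amplitudes that the solve should recover.
t = np.linspace(0, 1, 200)
basis = np.vstack([np.sin(2 * np.pi * 3 * t), np.sin(2 * np.pi * 7 * t)])
true_amps = np.array([[2.0, 0.5], [1.0, 3.0]])
trials = true_amps @ basis
estimated = trial_amplitudes(trials, basis)
```

    In the full method this solve is one step of an iteration that also shifts each basis waveform in time to estimate latency, which is what makes the overall algorithm fast.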

  1. Novel approach for evaluation of service reliability for electricity customers

    Institute of Scientific and Technical Information of China (English)

    Jiang, John N.

    2009-01-01

    Understanding the value of reliability to electricity customers is important for market-based reliability management. This paper proposes a novel approach to evaluating reliability for electricity customers by using an indifference curve between economic compensation for power interruption and the service reliability of electricity. The indifference curve is formed by calculating different planning schemes of network expansion for different reliability requirements of customers, which reveals the economic values of different reliability levels for electricity customers, so that reliability based on a market supply-demand mechanism can be established and economic signals can be provided for reliability management and enhancement.

  2. Integrated reliability condition monitoring and maintenance of equipment

    CERN Document Server

    Osarenren, John

    2015-01-01

    Consider a viable and cost-effective platform for the Industries of the Future (IOF). Benefit from improved safety, performance, and product deliveries to your customers. Achieve a higher rate of equipment availability, performance, product quality, and reliability. Integrated Reliability: Condition Monitoring and Maintenance of Equipment incorporates reliability engineering and mathematical modeling to help you move toward sustainable development in reliability condition monitoring and maintenance. This text introduces a cost-effective integrated reliability growth monitor, an integrated reliability degradation monitor, technological inheritance coefficient sensors, and a maintenance tool that supplies real-time information for predicting and preventing potential failures of manufacturing processes and equipment. The author highlights five key elements that are essential to any improvement program: improving overall equipment and part effectiveness, quality, and reliability; improving process performance with maint...

  3. Proceedings of the workshop on reliability data collection

    International Nuclear Information System (INIS)

    1999-01-01

    The main purpose of the Workshop was to provide a forum for exchanging information and experience on reliability data collection and analysis to support Living Probabilistic Safety Assessments (LPSA). The Workshop was divided into four sessions, whose titles are: Session 1: Reliability Data - Database Systems (3 papers); Session 2: Reliability Data Collection for PSA (5 papers); Session 3: NPP Data Collection (3 papers); Session 4: Reliability Data Assessment (Part 1: General - 2 papers; Part 2: CCF - 2 papers; Part 3: Reactor Protection Systems / External Event Data - 2 papers; Part 4: Human Errors - 2 papers).

  4. An adaptive neuro fuzzy model for estimating the reliability of component-based software systems

    Directory of Open Access Journals (Sweden)

    Kirti Tyagi

    2014-01-01

    Although many algorithms and techniques have been developed for estimating the reliability of component-based software systems (CBSSs), much more research is needed. Accurate estimation of the reliability of a CBSS is difficult because it depends on two factors: component reliability and glue code reliability. Moreover, reliability is a real-world phenomenon with many associated real-time problems. Soft computing techniques can help to solve problems whose solutions are uncertain or unpredictable. A number of soft computing approaches for estimating CBSS reliability have been proposed. These techniques learn from the past and capture existing patterns in data. The two basic elements of soft computing are neural networks and fuzzy logic. In this paper, we propose a model for estimating CBSS reliability, known as an adaptive neuro fuzzy inference system (ANFIS), that is based on these two basic elements of soft computing, and we compare its performance with that of a plain FIS (fuzzy inference system) for different data sets.
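    The fuzzy-logic half of such a model can be illustrated with a toy zero-order Sugeno FIS. This is not the paper's ANFIS; the two rules, rule outputs, and membership functions below are assumptions chosen only to show how component and glue-code reliability are combined into a system estimate:

```python
def fis_reliability(comp, glue):
    """Toy zero-order Sugeno FIS on inputs in [0, 1].

    Rule 1: IF comp is high AND glue is high THEN system = 0.95
    Rule 2: IF comp is low  OR  glue is low  THEN system = 0.40
    Membership: 'high' of x is x, 'low' is 1 - x; AND = min, OR = max.
    Output is the firing-strength-weighted average of rule outputs.
    """
    w1 = min(comp, glue)                 # rule 1 firing strength
    w2 = max(1 - comp, 1 - glue)         # rule 2 firing strength
    return (w1 * 0.95 + w2 * 0.40) / (w1 + w2)
```

    ANFIS goes further by tuning the memberships and rule outputs from data with a neural-network-style learning rule, rather than fixing them by hand as done here.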

  5. Interactive Reliability-Based Optimal Design

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle; Siemaszko, A.

    1994-01-01

    Interactive design/optimization of large, complex structural systems is considered. The objective function is assumed to model the expected costs. The constraints are reliability-based and/or related to deterministic code requirements. Solution of this optimization problem is divided into four main tasks, namely finite element analyses, sensitivity analyses, reliability analyses and application of an optimization algorithm. In the paper it is shown how these four tasks can be linked effectively and how existing information on design variables, Lagrange multipliers and the Hessian matrix can ...

  6. Accurate halo-galaxy mocks from automatic bias estimation and particle mesh gravity solvers

    Science.gov (United States)

    Vakili, Mohammadjavad; Kitaura, Francisco-Shu; Feng, Yu; Yepes, Gustavo; Zhao, Cheng; Chuang, Chia-Hsun; Hahn, ChangHoon

    2017-12-01

    Reliable extraction of cosmological information from clustering measurements of galaxy surveys requires estimation of the error covariance matrices of observables. The accuracy of covariance matrices is limited by our ability to generate a sufficiently large number of independent mock catalogues that can describe the physics of galaxy clustering across a wide range of scales. Furthermore, galaxy mock catalogues are required to study systematics in galaxy surveys and to test analysis tools. In this investigation, we present a fast and accurate approach for the generation of mock catalogues for the upcoming galaxy surveys. Our method relies on low-resolution approximate gravity solvers to simulate the large-scale dark matter field, which we then populate with haloes according to a flexible non-linear and stochastic bias model. In particular, we extend the PATCHY code with an efficient particle mesh algorithm to simulate the dark matter field (the FASTPM code), and with a robust MCMC method relying on the EMCEE code for constraining the parameters of the bias model. Using the haloes in the BigMultiDark high-resolution N-body simulation as a reference catalogue, we demonstrate that our technique can model the bivariate probability distribution function (counts-in-cells), power spectrum and bispectrum of haloes in the reference catalogue. Specifically, we show that the new ingredients permit us to reach percentage accuracy in the power spectrum up to k ∼ 0.4 h Mpc-1 (within 5 per cent up to k ∼ 0.6 h Mpc-1) with accurate bispectra, improving previous results based on Lagrangian perturbation theory.

  7. Application of Reliability in Breakwater Design

    DEFF Research Database (Denmark)

    Christiani, Erik

    Much information exists on structural response, but in one area information has been lacking: bearing capacity has not been treated in depth in a probabilistic manner for breakwaters. Reliability analysis of conventional rubble mound breakwaters and conventional vertical breakwaters is exemplified for the purpose of establishing new methods to design certain types of breakwaters. Reliability analyses of the main armour and toe berm interaction are exemplified to show the effect of a multiple set of failure mechanisms. First, the limit state equations of the main armour and toe interaction are derived from laboratory tests performed by Bologna University. Thereafter a multiple system of failure for the interaction is established. Relevant stochastic parameters are characterized prior to the reliability evaluation. Application of reliability in crown wall design is illustrated by deriving relevant single foundation failure modes ...

  8. Mission Reliability Estimation for Repairable Robot Teams

    Science.gov (United States)

    Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen

    2010-01-01

    A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares. This suggests that the ...
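    The redundancy-versus-repairability trade-off examined here can be made concrete with a back-of-envelope model. All numbers and the independence assumptions below are illustrative, not the study's data: a robot is a series system of m components, each with reliability r, and the mission succeeds if one robot works:

```python
def spare_robot_reliability(r, m):
    """Redundancy: two identical robots, each a series system of m
    components of reliability r; mission succeeds if either works."""
    robot = r ** m
    return 1 - (1 - robot) ** 2

def spare_parts_reliability(r, m):
    """Repairability: one robot carrying one spare per component; each
    component slot works if either of its two units works (instant
    component swap assumed)."""
    slot = 1 - (1 - r) ** 2
    return slot ** m

# Illustrative numbers: 5 components at 95% reliability each.
whole_spare = spare_robot_reliability(0.95, 5)
part_spares = spare_parts_reliability(0.95, 5)
```

    Under these assumptions the spare-parts configuration comes out more reliable, because redundancy is applied exactly where failures occur, echoing the finding that component spares can substitute for whole-robot redundancy at lower cost.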

  9. Reliability issues at the LHC

    CERN Multimedia

    CERN. Geneva. Audiovisual Unit; Gillies, James D

    2002-01-01

The Lectures on reliability issues at the LHC will be focused on five main Modules on five days. Module 1: Basic Elements in Reliability Engineering Some basic terms, definitions and methods, from components up to the system and the plant, common cause failures and human factor issues. Module 2: Interrelations of Reliability & Safety (R&S) Reliability and risk informed approach, living models, risk monitoring. Module 3: The ideal R&S Process for Large Scale Systems From R&S goals via the implementation into the system to the proof of the compliance. Module 4: Some Applications of R&S on LHC Master logic, anatomy of risk, cause - consequence diagram, decomposition and aggregation of the system. Module 5: Lessons learned from R&S Application in various Technologies Success stories, pitfalls, constraints in data and methods, limitations per se, experienced in aviation, space, process, nuclear, offshore and transport systems and plants. The Lectures will reflect in summary the compromise in...

  10. A computational Bayesian approach to dependency assessment in system reliability

    International Nuclear Information System (INIS)

    Yontay, Petek; Pan, Rong

    2016-01-01

    Due to the increasing complexity of engineered products, it is of great importance to develop a tool to assess reliability dependencies among components and systems under the uncertainty of system reliability structure. In this paper, a Bayesian network approach is proposed for evaluating the conditional probability of failure within a complex system, using a multilevel system configuration. Coupling with Bayesian inference, the posterior distributions of these conditional probabilities can be estimated by combining failure information and expert opinions at both system and component levels. Three data scenarios are considered in this study, and they demonstrate that, with the quantification of the stochastic relationship of reliability within a system, the dependency structure in system reliability can be gradually revealed by the data collected at different system levels. - Highlights: • A Bayesian network representation of system reliability is presented. • Bayesian inference methods for assessing dependencies in system reliability are developed. • Complete and incomplete data scenarios are discussed. • The proposed approach is able to integrate reliability information from multiple sources at multiple levels of the system.
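The combination of failure data and expert opinion described above can be illustrated, in highly simplified form, by a conjugate Beta-binomial update of a component failure probability. The prior parameters below are hypothetical stand-ins for an elicited expert opinion, and this is only a single-node sketch of what the paper does across a full Bayesian network.

```python
def update_failure_belief(a_prior, b_prior, failures, trials):
    # Beta(a, b) prior encodes expert opinion about the failure probability;
    # binomial test data (failures out of trials) updates it conjugately
    a_post = a_prior + failures
    b_post = b_prior + (trials - failures)
    posterior_mean = a_post / (a_post + b_post)
    return a_post, b_post, posterior_mean
```

For instance, an expert prior of Beta(1, 9) (roughly a 10% failure probability) combined with 2 failures in 10 trials gives a Beta(3, 17) posterior with mean 0.15, showing how data gradually revises the expert's estimate.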

  11. Distributed Pedestrian Detection Alerts Based on Data Fusion with Accurate Localization

    Directory of Open Access Journals (Sweden)

    Arturo de la Escalera

    2013-09-01

Full Text Available Among Advanced Driver Assistance Systems (ADAS) pedestrian detection is a common issue due to the vulnerability of pedestrians in the event of accidents. In the present work, a novel approach for pedestrian detection based on data fusion is presented. Data fusion helps to overcome the limitations inherent to each detection system (computer vision and laser scanner) and provides accurate and trustable tracking of any pedestrian movement. The application is complemented by an efficient communication protocol, able to alert vehicles in the surroundings by a fast and reliable communication. The combination of a powerful location, based on a GPS with inertial measurement, and accurate obstacle localization based on data fusion has allowed locating the detected pedestrians with high accuracy. Tests proved the viability of the detection system and the efficiency of the communication, even at long distances. By the use of the alert communication, dangerous situations such as occlusions or misdetections can be avoided.

  12. Distributed pedestrian detection alerts based on data fusion with accurate localization.

    Science.gov (United States)

    García, Fernando; Jiménez, Felipe; Anaya, José Javier; Armingol, José María; Naranjo, José Eugenio; de la Escalera, Arturo

    2013-09-04

    Among Advanced Driver Assistance Systems (ADAS) pedestrian detection is a common issue due to the vulnerability of pedestrians in the event of accidents. In the present work, a novel approach for pedestrian detection based on data fusion is presented. Data fusion helps to overcome the limitations inherent to each detection system (computer vision and laser scanner) and provides accurate and trustable tracking of any pedestrian movement. The application is complemented by an efficient communication protocol, able to alert vehicles in the surroundings by a fast and reliable communication. The combination of a powerful location, based on a GPS with inertial measurement, and accurate obstacle localization based on data fusion has allowed locating the detected pedestrians with high accuracy. Tests proved the viability of the detection system and the efficiency of the communication, even at long distances. By the use of the alert communication, dangerous situations such as occlusions or misdetections can be avoided.

  13. Improving photometric redshift estimation using GPZ: size information, post processing, and improved photometry

    Science.gov (United States)

    Gomes, Zahra; Jarvis, Matt J.; Almosallam, Ibrahim A.; Roberts, Stephen J.

    2018-03-01

    The next generation of large-scale imaging surveys (such as those conducted with the Large Synoptic Survey Telescope and Euclid) will require accurate photometric redshifts in order to optimally extract cosmological information. Gaussian Process for photometric redshift estimation (GPZ) is a promising new method that has been proven to provide efficient, accurate photometric redshift estimations with reliable variance predictions. In this paper, we investigate a number of methods for improving the photometric redshift estimations obtained using GPZ (but which are also applicable to others). We use spectroscopy from the Galaxy and Mass Assembly Data Release 2 with a limiting magnitude of r Program Data Release 1 and find that it produces significant improvements in accuracy, similar to the effect of including additional features.

  14. Reliability assessment of complex mechatronic systems using a modified nonparametric belief propagation algorithm

    International Nuclear Information System (INIS)

    Zhong, X.; Ichchou, M.; Saidi, A.

    2010-01-01

    Various parametric skewed distributions are widely used to model the time-to-failure (TTF) in the reliability analysis of mechatronic systems, where many items are unobservable due to the high cost of testing. Estimating the parameters of those distributions becomes a challenge. Previous research has failed to consider this problem due to the difficulty of dependency modeling. Recently the methodology of Bayesian networks (BNs) has greatly contributed to the reliability analysis of complex systems. In this paper, the problem of system reliability assessment (SRA) is formulated as a BN considering the parameter uncertainty. As the quantitative specification of BN, a normal distribution representing the stochastic nature of TTF distribution is learned to capture the interactions between the basic items and their output items. The approximation inference of our continuous BN model is performed by a modified version of nonparametric belief propagation (NBP) which can avoid using a junction tree that is inefficient for the mechatronic case because of the large treewidth. After reasoning, we obtain the marginal posterior density of each TTF model parameter. Other information from diverse sources and expert priors can be easily incorporated in this SRA model to achieve more accurate results. Simulation in simple and complex cases of mechatronic systems demonstrates that the posterior of the parameter network fits the data well and the uncertainty passes effectively through our BN based SRA model by using the modified NBP.

  15. Reliability, validity and usefulness of 30-15 Intermittent Fitness Test in Female Soccer Players

    Directory of Open Access Journals (Sweden)

    Nedim Čović

    2016-11-01

Full Text Available PURPOSE: The aim of this study was to examine the reliability, validity and usefulness of the 30-15IFT in competitive female soccer players. METHODS: Seventeen elite female soccer players participated in the study. A within-subject test-retest study design was utilized to assess the reliability of the 30-15 intermittent fitness test (IFT). Seven days prior to the 30-15IFT, subjects performed a continuous aerobic running test (CT) under laboratory conditions to assess the criterion validity of the 30-15IFT. End running velocity (VCT and VIFT), peak heart rate (HRpeak) and maximal oxygen consumption (VO2max) were collected and/or estimated for both tests. RESULTS: VIFT (ICC = 0.91; CV = 1.8%), HRpeak (ICC = 0.94; CV = 1.2%) and VO2max (ICC = 0.94; CV = 1.6%) obtained from the 30-15IFT were all deemed highly reliable (p > 0.05). Pearson product moment correlations between the CT and 30-15IFT for VO2max, HRpeak and end running velocity were large (r = 0.67, p = 0.013), very large (r = 0.77, p = 0.02) and large (r = 0.57, p = 0.042), respectively. CONCLUSION: Current findings suggest that the 30-15IFT is a valid and reliable intermittent aerobic fitness test for elite female soccer players. The findings have also provided practitioners with evidence to support the accurate detection of meaningful individual changes in VIFT of 0.5 km/h (1 stage) and HRpeak of 2 bpm. This information may assist coaches in monitoring ‘real’ aerobic fitness changes to better inform training of female intermittent team sport athletes. Lastly, coaches could use the 30-15IFT as a practical alternative to laboratory-based assessments to assess and monitor intermittent aerobic fitness changes in their athletes. Keywords: 30-15 intermittent fitness test, aerobic, cardiorespiratory fitness, intermittent activity, soccer, high intensity interval training.
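Coefficient-of-variation figures like those reported above are commonly computed from test-retest data as the typical error expressed as a percentage of the grand mean. The sketch below shows that generic calculation; it is an illustration of the statistic, not the exact procedure of this study, and the data are made up.

```python
import math
from statistics import mean, stdev

def within_subject_cv(test, retest):
    # typical error = SD of the test-retest differences divided by sqrt(2),
    # expressed as a percentage of the grand mean of all observations
    diffs = [t - r for t, r in zip(test, retest)]
    typical_error = stdev(diffs) / math.sqrt(2.0)
    grand_mean = mean(list(test) + list(retest))
    return 100.0 * typical_error / grand_mean
```

A CV of around 2%, as found for VIFT here, means a retest would typically land within about 2% of the first measurement, which is what justifies treating a 0.5 km/h change as meaningful.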

  16. 78 FR 44909 - Regional Reliability Standard BAL-002-WECC-2-Contingency Reserve

    Science.gov (United States)

    2013-07-25

    ...\\ Mandatory Reliability Standards for the Bulk-Power System, Order No. 693, FERC Stats. & Regs. ] 31,242...-002-WECC-2 (Contingency Reserve). The North American Electric Reliability Corporation (NERC) and... (Technical Information), Office of Electric Reliability, Division of Reliability Standards, Federal Energy...

  17. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

This book starts by asking what reliability is, covering the origin of reliability problems, the definition of reliability, and the uses of reliability. It also deals with probability and the calculation of reliability, the reliability function and failure rate, probability distributions used in reliability, estimation of MTBF, processes of probability distributions, down time, maintainability and availability, breakdown maintenance and preventive maintenance, design for reliability, reliability prediction and statistics, reliability testing, reliability data, and the design and management of reliability.

  18. Results of a Demonstration Assessment of Passive System Reliability Utilizing the Reliability Method for Passive Systems (RMPS)

    Energy Technology Data Exchange (ETDEWEB)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia; Grelle, Austin

    2015-04-26

    Advanced small modular reactor designs include many advantageous design features such as passively driven safety systems that are arguably more reliable and cost effective relative to conventional active systems. Despite their attractiveness, a reliability assessment of passive systems can be difficult using conventional reliability methods due to the nature of passive systems. Simple deviations in boundary conditions can induce functional failures in a passive system, and intermediate or unexpected operating modes can also occur. As part of an ongoing project, Argonne National Laboratory is investigating various methodologies to address passive system reliability. The Reliability Method for Passive Systems (RMPS), a systematic approach for examining reliability, is one technique chosen for this analysis. This methodology is combined with the Risk-Informed Safety Margin Characterization (RISMC) approach to assess the reliability of a passive system and the impact of its associated uncertainties. For this demonstration problem, an integrated plant model of an advanced small modular pool-type sodium fast reactor with a passive reactor cavity cooling system is subjected to a station blackout using RELAP5-3D. This paper discusses important aspects of the reliability assessment, including deployment of the methodology, the uncertainty identification and quantification process, and identification of key risk metrics.

  19. Reliability Estimation of Aero-engine Based on Mixed Weibull Distribution Model

    Science.gov (United States)

    Yuan, Zhongda; Deng, Junxiang; Wang, Dawei

    2018-02-01

    Aero-engine is a complex mechanical electronic system, based on analysis of reliability of mechanical electronic system, Weibull distribution model has an irreplaceable role. Till now, only two-parameter Weibull distribution model and three-parameter Weibull distribution are widely used. Due to diversity of engine failure modes, there is a big error with single Weibull distribution model. By contrast, a variety of engine failure modes can be taken into account with mixed Weibull distribution model, so it is a good statistical analysis model. Except the concept of dynamic weight coefficient, in order to make reliability estimation result more accurately, three-parameter correlation coefficient optimization method is applied to enhance Weibull distribution model, thus precision of mixed distribution reliability model is improved greatly. All of these are advantageous to popularize Weibull distribution model in engineering applications.
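The reliability function of such a mixture can be written down directly: each failure mode contributes a Weibull survival term weighted by its coefficient. The sketch below assumes fixed weight coefficients rather than the dynamically estimated ones the paper proposes, and the parameter values are illustrative only.

```python
import math

def mixed_weibull_reliability(t, mixture):
    # mixture: list of (weight, shape, scale) triples, one per failure mode;
    # the weights are the mixing coefficients and should sum to 1
    return sum(w * math.exp(-((t / scale) ** shape)) for w, shape, scale in mixture)

# two hypothetical failure modes: early-life (shape < 1) and wear-out (shape > 1)
engine_modes = [(0.3, 0.8, 500.0), (0.7, 3.0, 2000.0)]
```

With a single mode of weight 1 the expression reduces to the ordinary two-parameter Weibull reliability exp(-(t/scale)^shape), which is the sanity check used below.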

  20. Reliable computer systems design and evaluatuion

    CERN Document Server

    Siewiorek, Daniel

    2014-01-01

Enhance your hardware/software reliability. Enhancement of system reliability has been a major concern of computer users and designers, and this major revision of the 1982 classic meets users' continuing need for practical information on this pressing topic. Included are case studies of reliable systems from manufacturers such as Tandem, Stratus, IBM, and Digital, as well as coverage of special systems such as the Galileo Orbiter fault protection system and AT&T telephone switching processors.

  1. Structural reliability of atomic power plant

    International Nuclear Information System (INIS)

    Klemin, A.I.; Polyakov, E.F.

    1980-01-01

    In 1978 the first specialized technical manual ''Technique of Calculating the Structural Reliability of an Atomic Power Plant and Its Systems in the Design Stage'' was developed. The present article contains information about the main characteristics and capabilities of the manual. The manual gives recommendations concerning the calculations of the reliability of such specific systems as the reactor control and safety system, the system of instrumentation and automatic control, and safety systems. 2 refs

  2. Reliability analysis of operator's monitoring behavior in digital main control room of nuclear power plants and its application

    International Nuclear Information System (INIS)

    Zhang Li; Hu Hong; Li Pengcheng; Jiang Jianjun; Yi Cannan; Chen Qingqing

    2015-01-01

    In order to build a quantitative model to analyze operators' monitoring behavior reliability of digital main control room of nuclear power plants, based on the analysis of the design characteristics of digital main control room of a nuclear power plant and operator's monitoring behavior, and combining with operators' monitoring behavior process, monitoring behavior reliability was divided into three parts including information transfer reliability among screens, inside-screen information sampling reliability and information detection reliability. Quantitative calculation model of information transfer reliability among screens was established based on Senders's monitoring theory; the inside screen information sampling reliability model was established based on the allocation theory of attention resources; and considering the performance shaping factor causality, a fuzzy Bayesian method was presented to quantify information detection reliability and an example of application was given. The results show that the established model of monitoring behavior reliability gives an objective description for monitoring process, which can quantify the monitoring reliability and overcome the shortcomings of traditional methods. Therefore, it provides theoretical support for operator's monitoring behavior reliability analysis in digital main control room of nuclear power plants and improves the precision of human reliability analysis. (authors)

  3. A study on the real-time reliability of on-board equipment of train control system

    Science.gov (United States)

    Zhang, Yong; Li, Shiwei

    2018-05-01

Real-time reliability evaluation is conducive to establishing a condition-based maintenance system for the purpose of guaranteeing continuous train operation. According to the inherent characteristics of the on-board equipment, the connotation of reliability evaluation of on-board equipment is defined and an evaluation index of real-time reliability is provided in this paper. From the perspectives of methodology and practical application, the real-time reliability of the on-board equipment is discussed in detail, and a method of evaluating the real-time reliability of on-board equipment at the component level based on the Hidden Markov Model (HMM) is proposed. In this method the performance degradation data is used directly to realize accurate perception of the hidden state transition process of on-board equipment, which achieves a better description of the real-time reliability of the equipment.

  4. Using an electronic prescribing system to ensure accurate medication lists in a large multidisciplinary medical group.

    Science.gov (United States)

    Stock, Ron; Scott, Jim; Gurtel, Sharon

    2009-05-01

Although medication safety has largely focused on reducing medication errors in hospitals, the scope of adverse drug events in the outpatient setting is immense. A fundamental problem occurs when a clinician lacks immediate access to an accurate list of the medications that a patient is taking. Since 2001, PeaceHealth Medical Group (PHMG), a multispecialty physician group, has been using an electronic prescribing system that includes medication-interaction warnings and allergy checks. Yet, most practitioners recognized the remaining potential for error, especially because there was no assurance regarding the accuracy of information on the electronic medical record (EMR)-generated medication list. PeaceHealth developed and implemented a standardized approach to (1) review and reconcile the medication list for every patient at each office visit and (2) report on the results obtained within the PHMG clinics. In 2005, PeaceHealth established the ambulatory medication reconciliation project to develop a reliable, efficient process for maintaining accurate patient medication lists. Each of PeaceHealth's five regions created a medication reconciliation task force to redesign its clinical practice, incorporating the systemwide aims and agreed-on key process components for every ambulatory visit. Implementation of the medication reconciliation process at the PHMG clinics resulted in a substantial increase in the number of accurate medication lists, with fewer discrepancies between what the patient is actually taking and what is recorded in the EMR. The PeaceHealth focus on patient safety, and particularly the reduction of medication errors, has involved a standardized approach for reviewing and reconciling medication lists for every patient visiting a physician office. The standardized processes can be replicated at other ambulatory clinics, whether or not electronic tools are available.

  5. Accident Sequence Evaluation Program: Human reliability analysis procedure

    Energy Technology Data Exchange (ETDEWEB)

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis With emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the ''ASEP HRA Procedure,'' is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs.

  6. Validity and reliability of naturalistic driving scene categorization Judgments from crowdsourcing.

    Science.gov (United States)

    Cabrall, Christopher D D; Lu, Zhenji; Kyriakidis, Miltos; Manca, Laura; Dijksterhuis, Chris; Happee, Riender; de Winter, Joost

    2018-05-01

A common challenge with processing naturalistic driving data is that humans may need to categorize great volumes of recorded visual information. By means of the online platform CrowdFlower, we investigated the potential of crowdsourcing to categorize driving scene features (i.e., presence of other road users, straight road segments, etc.) at greater scale than a single person or a small team of researchers would be capable of. In total, 200 workers from 46 different countries participated in 1.5 days. Validity and reliability were examined, both with and without embedding researcher generated control questions via the CrowdFlower mechanism known as Gold Test Questions (GTQs). By employing GTQs, we found significantly more valid (accurate) and reliable (consistent) identification of driving scene items from external workers. Specifically, at a small scale CrowdFlower Job of 48 three-second video segments, an accuracy (i.e., relative to the ratings of a confederate researcher) of 91% on items was found with GTQs compared to 78% without. A difference in bias was found, where without GTQs, external workers returned more false positives than with GTQs. At a larger scale CrowdFlower Job making exclusive use of GTQs, 12,862 three-second video segments were released for annotation. Infeasible (and self-defeating) to check the accuracy of each at this scale, a random subset of 1012 categorizations was validated and returned similar levels of accuracy (95%). In the small scale Job, where full video segments were repeated in triplicate, the percentage of unanimous agreement on the items was found significantly more consistent when using GTQs (90%) than without them (65%). Additionally, in the larger scale Job (where a single second of a video segment was overlapped by ratings of three sequentially neighboring segments), a mean unanimity of 94% was obtained with validated-as-correct ratings and 91% with non-validated ratings. Because the video segments overlapped in full for

  7. Supersonic shear imaging provides a reliable measurement of resting muscle shear elastic modulus

    International Nuclear Information System (INIS)

    Lacourpaille, Lilian; Hug, François; Bouillard, Killian; Nordez, Antoine; Hogrel, Jean-Yves

    2012-01-01

    The aim of the present study was to assess the reliability of shear elastic modulus measurements performed using supersonic shear imaging (SSI) in nine resting muscles (i.e. gastrocnemius medialis, tibialis anterior, vastus lateralis, rectus femoris, triceps brachii, biceps brachii, brachioradialis, adductor pollicis obliquus and abductor digiti minimi) of different architectures and typologies. Thirty healthy subjects were randomly assigned to the intra-session reliability (n = 20), inter-day reliability (n = 21) and the inter-observer reliability (n = 16) experiments. Muscle shear elastic modulus ranged from 2.99 (gastrocnemius medialis) to 4.50 kPa (adductor digiti minimi and tibialis anterior). On the whole, very good reliability was observed, with a coefficient of variation (CV) ranging from 4.6% to 8%, except for the inter-operator reliability of adductor pollicis obliquus (CV = 11.5%). The intraclass correlation coefficients were good (0.871 ± 0.045 for the intra-session reliability, 0.815 ± 0.065 for the inter-day reliability and 0.709 ± 0.141 for the inter-observer reliability). Both the reliability and the ease of use of SSI make it a potentially interesting technique that would be of benefit to fundamental, applied and clinical research projects that need an accurate assessment of muscle mechanical properties. (note)

  8. Re-assessing reliability based on survived loads

    NARCIS (Netherlands)

    Schweckendiek, T.

    2011-01-01

    The reliability of flood defenses is often dictated by large uncertainties in the hydraulic loading and the structural resistance. Additional information decreases uncertainty, however, acquiring it is often costly. One source of information, even though in many cases readily available, is hardly

  9. Accurate method of the magnetic field measurement of quadrupole magnets

    International Nuclear Information System (INIS)

    Kumada, M.; Sakai, I.; Someya, H.; Sasaki, H.

    1983-01-01

We present an accurate method for the magnetic field measurement of quadrupole magnets. The method of obtaining the information of the field gradient and the effective focussing length is given. A new scheme to obtain the information of the skew field components is also proposed. The relative accuracy of the measurement was 1 × 10^-4 or less. (author)

  10. Vehicle State Information Estimation with the Unscented Kalman Filter

    Directory of Open Access Journals (Sweden)

    Hongbin Ren

    2014-01-01

Full Text Available The vehicle state information plays an important role in vehicle active safety systems; this paper proposes a new concept to estimate the instantaneous vehicle speed, yaw rate, tire forces, and tire kinematics information in real time. The estimator is based on the 3DoF vehicle model combined with the piecewise linear tire model. The estimator is realized using the unscented Kalman filter (UKF), since it is based on the unscented transform technique and considers high order terms during the measurement and update stage. The numerical simulations are carried out to further investigate the performance of the estimator under high friction and low friction road conditions in MATLAB/Simulink combined with the Carsim environment. The simulation results are compared with the numerical results from the Carsim software, which indicate that the UKF can estimate the vehicle state information accurately and in real time; the proposed estimator will provide the necessary and reliable state information to the vehicle controller in the future.
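The core of a UKF is the unscented transform, which propagates a mean and covariance through a nonlinear function via deterministically chosen sigma points instead of linearizing. The sketch below is a generic textbook-style implementation with the usual scaling parameters, not the vehicle estimator of the paper.

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate (mean, cov) through a nonlinear function f via sigma points."""
    n = len(mean)
    lam = alpha**2 * (n + kappa) - n
    # columns of S are sigma-point offsets: S @ S.T == (n + lam) * cov
    S = np.linalg.cholesky((n + lam) * cov)
    sigma = [mean] + [mean + S[:, i] for i in range(n)] \
                   + [mean - S[:, i] for i in range(n)]
    wm = np.full(2 * n + 1, 0.5 / (n + lam))   # mean weights
    wc = wm.copy()                             # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + 1.0 - alpha**2 + beta
    Y = np.array([f(s) for s in sigma])
    mean_y = wm @ Y
    diff = Y - mean_y
    cov_y = (wc[:, None] * diff).T @ diff
    return mean_y, cov_y
```

The transform is exact for linear functions, which makes a convenient sanity check: pushing variance 4 through f(x) = 2x must return variance 16.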

  11. Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria

    Science.gov (United States)

    Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong

    2017-08-01

In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for reliability analysis and sensitivity analysis of complex components with arbitrary distribution parameters is investigated by using the perturbation method, the response surface method, the Edgeworth series and the sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparing with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability-analysis-based finite element modeling in engineering practice.
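In its simplest first-order second-moment (FOSM) form, perturbation-based reliability analysis of the kind mentioned above reduces to a reliability index computed from means and standard deviations. The sketch below handles only the linear limit state g = R - S with independent normal capacity R and demand S; it is a didactic special case, not the paper's method for arbitrary distributions.

```python
import math

def fosm_reliability_index(mu_r, sigma_r, mu_s, sigma_s):
    # reliability index beta for the limit state g = R - S,
    # assuming R (capacity) and S (demand) are independent and normal
    return (mu_r - mu_s) / math.sqrt(sigma_r**2 + sigma_s**2)

def failure_probability(beta):
    # standard normal tail probability Phi(-beta)
    return 0.5 * math.erfc(beta / math.sqrt(2.0))
```

A reliability index of beta = 2, for example, corresponds to a failure probability of about 2.3%, which is the kind of figure a Monte Carlo comparison would have to reproduce.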

  12. An integrated approach to human reliability analysis -- decision analytic dynamic reliability model

    International Nuclear Information System (INIS)

    Holmberg, J.; Hukki, K.; Norros, L.; Pulkkinen, U.; Pyy, P.

    1999-01-01

The reliability of human operators in process control is sensitive to the context. In many contemporary human reliability analysis (HRA) methods, this is not sufficiently taken into account. The aim of this article is to show that integration between probabilistic and psychological approaches in human reliability should be attempted. This is achieved, first, by adopting methods that adequately reflect the essential features of the process control activity, and secondly, by carrying out an interactive HRA process. Description of the activity context, probabilistic modeling, and psychological analysis form an iterative interdisciplinary sequence of analysis in which the results of one sub-task may be input to another. The analysis of the context is carried out first with the help of a common set of conceptual tools. The resulting descriptions of the context promote the probabilistic modeling, through which new results regarding the probabilistic dynamics can be achieved. These can be incorporated in the context descriptions used as reference in the psychological analysis of actual performance. The results also provide new knowledge of the constraints of activity, by providing information on the premises of the operator's actions. Finally, the stochastic marked point process model gives a tool by which psychological methodology may be interpreted and utilized for reliability analysis

  13. Evaluation of Quality and Readability of Health Information Websites Identified through India's Major Search Engines.

    Science.gov (United States)

    Raj, S; Sharma, V L; Singh, A J; Goel, S

    2016-01-01

    Background. The available health information on websites should be reliable and accurate in order to make informed decisions by community. This study was done to assess the quality and readability of health information websites on World Wide Web in India. Methods. This cross-sectional study was carried out in June 2014. The key words "Health" and "Information" were used on search engines "Google" and "Yahoo." Out of 50 websites (25 from each search engines), after exclusion, 32 websites were evaluated. LIDA tool was used to assess the quality whereas the readability was assessed using Flesch Reading Ease Score (FRES), Flesch-Kincaid Grade Level (FKGL), and SMOG. Results. Forty percent of websites (n = 13) were sponsored by government. Health On the Net Code of Conduct (HONcode) certification was present on 50% (n = 16) of websites. The mean LIDA score (74.31) was average. Only 3 websites scored high on LIDA score. Only five had readability scores at recommended sixth-grade level. Conclusion. Most health information websites had average quality especially in terms of usability and reliability and were written at high readability levels. Efforts are needed to develop the health information websites which can help general population in informed decision making.
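The readability formulas used in the study are published and straightforward to reproduce. The sketch below computes Flesch Reading Ease and Flesch-Kincaid Grade Level with a naive vowel-group syllable counter, so scores are approximate; it is an illustration, not the tool used by the authors.

```python
import re

def count_syllables(word):
    # naive heuristic: each run of consecutive vowels counts as one syllable
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    wps = len(words) / sentences          # words per sentence
    spw = syllables / len(words)          # syllables per word
    fres = 206.835 - 1.015 * wps - 84.6 * spw   # Flesch Reading Ease
    fkgl = 0.39 * wps + 11.8 * spw - 15.59      # Flesch-Kincaid Grade Level
    return fres, fkgl
```

Short sentences of one-syllable words score above 90 on FRES (very easy) and below grade 3 on FKGL, which is roughly the sixth-grade-or-lower target the study recommends for health information websites.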

  14. Information criteria and higher Eigenmode estimation in Monte Carlo calculations

    International Nuclear Information System (INIS)

    Nease, B. R.; Ueki, T.

    2007-01-01

    Recently developed Monte Carlo methods of estimating the dominance ratio (DR) rely on autoregressive (AR) fitting of a computed time series. This time series is obtained by applying a projection vector to the fission source distribution of the problem. The AR fitting order necessary to accurately extract the mode corresponding to DR depends on the number of fission source bins used. This makes it necessary to examine the convergence of DR as the AR fitting order increases. Therefore, we have investigated whether the AR fitting order determined by information criteria can be reliably used to estimate DR. Two information criteria have been investigated: the corrected Akaike Information Criterion (AICc) and the Minimum Description Length (MDL) criterion. These criteria appear to work well when applied to computations with a fine bin structure where the projection vector is applied. (authors)
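    The order-selection idea can be sketched independently of the Monte Carlo setting: fit least-squares AR(p) models of increasing order and keep the order minimizing AICc. The synthetic AR(2) series and constants below are illustrative, not taken from the paper:

```python
import numpy as np

def fit_ar(x, p):
    # Least-squares AR(p): x[t] = sum_k a_k * x[t-k] + e[t]
    X = np.column_stack([x[p - k - 1: len(x) - k - 1] for k in range(p)])
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = np.mean((y - X @ coef) ** 2)
    return coef, sigma2

def aicc(x, p):
    # AICc = AIC + small-sample correction, with k = p coefficients + noise variance
    n = len(x) - p
    _, sigma2 = fit_ar(x, p)
    k = p + 1
    aic = n * np.log(sigma2) + 2 * k
    return aic + 2 * k * (k + 1) / (n - k - 1)

# Synthetic stationary AR(2) series: x[t] = 1.2*x[t-1] - 0.4*x[t-2] + e[t]
rng = np.random.default_rng(0)
e = rng.standard_normal(2000)
x = np.zeros(2000)
for t in range(2, 2000):
    x[t] = 1.2 * x[t - 1] - 0.4 * x[t - 2] + e[t]

best_p = min(range(1, 9), key=lambda p: aicc(x, p))
coef2, _ = fit_ar(x, 2)
```

    With 2000 samples the criterion should recover an order at or near the true value of 2, and the fitted coefficients should be close to (1.2, -0.4).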

  15. Improving preimplantation genetic diagnosis (PGD) reliability by selection of sperm donor with the most informative haplotype.

    Science.gov (United States)

    Malcov, Mira; Gold, Veronica; Peleg, Sagit; Frumkin, Tsvia; Azem, Foad; Amit, Ami; Ben-Yosef, Dalit; Yaron, Yuval; Reches, Adi; Barda, Shimi; Kleiman, Sandra E; Yogev, Leah; Hauser, Ron

    2017-04-26

    The study aims to describe a novel strategy that increases the accuracy and reliability of PGD in patients using sperm donation, by pre-selecting a donor whose haplotype does not overlap the carrier's. A panel of 4-9 informative polymorphic markers, flanking the mutation in carriers of autosomal dominant/X-linked disorders, was tested in the DNA of sperm donors before PGD. Whenever the lengths of donors' repeats overlapped those of the women, additional donors' DNA samples were analyzed. The donor that demonstrated minimal overlap with the patient was selected for IVF. In 8 out of 17 carriers the markers of the initially chosen donors overlapped the patients' alleles, and 2-8 additional sperm donors for each patient were haplotyped. The selection of additional sperm donors increased the number of informative markers and reduced the misdiagnosis risk from 6.00% ± 7.48 to 0.48% ± 0.68. The PGD results were confirmed and no misdiagnosis was detected. Our study demonstrates that pre-selecting a sperm donor whose haplotype has minimal overlap with the female's haplotype is critical for reducing the misdiagnosis risk and ensuring reliable PGD. This strategy may contribute to preventing the transmission of affected IVF-PGD embryos using a simple and economical procedure. All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. DNA testing of donors was approved by the institutional Helsinki committee (registration number 319-08TLV, 2008). The present study was approved by the institutional Helsinki committee (registration number 0385-13TLV, 2013).

  16. Photovoltaic Module Reliability Workshop 2012: February 28 - March 1, 2012

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.

    2013-11-01

    NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology--both critical goals for moving PV technologies deeper into the electricity marketplace.

  17. Time-variant reliability assessment through equivalent stochastic process transformation

    International Nuclear Information System (INIS)

    Wang, Zequn; Chen, Wei

    2016-01-01

    Time-variant reliability measures the probability that an engineering system successfully performs intended functions over a certain period of time under various sources of uncertainty. In practice, it is computationally prohibitive to propagate uncertainty in time-variant reliability assessment based on expensive or complex numerical models. This paper presents an equivalent stochastic process transformation approach for cost-effective prediction of reliability deterioration over the life cycle of an engineering system. To reduce the high dimensionality, a time-independent reliability model is developed by translating random processes and time parameters into random parameters in order to equivalently cover all potential failures that may occur during the time interval of interest. With the time-independent reliability model, an instantaneous failure surface is attained by using a Kriging-based surrogate model to identify all potential failure events. To enhance the efficacy of failure surface identification, a maximum confidence enhancement method is utilized to update the Kriging model sequentially. Then, the time-variant reliability is approximated using Monte Carlo simulations of the Kriging model, where system failures over a time interval are predicted by the instantaneous failure surface. The results of two case studies demonstrate that the proposed approach is able to accurately predict the time evolution of system reliability while requiring much less computational effort compared with the existing analytical approach. - Highlights: • Developed a new approach for time-variant reliability analysis. • Proposed a novel stochastic process transformation procedure to reduce the dimensionality. • Employed Kriging models with confidence-based adaptive sampling scheme to enhance computational efficiency. • The approach is effective for handling random processes in time-variant reliability analysis. • Two case studies are used to demonstrate the efficacy.
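    The Kriging surrogate and adaptive sampling are beyond a short sketch, but the Monte Carlo step they accelerate can be illustrated on a hypothetical limit state with linear strength degradation. All distributions and constants below are invented for illustration, not taken from the paper's case studies:

```python
import numpy as np

rng = np.random.default_rng(1)

def limit_state(r0, s, t):
    # Illustrative limit state: resistance degrades linearly in time,
    # load effect is random. Failure occurs when g <= 0.
    return r0 - 0.05 * t - s

n = 50_000
r0 = rng.normal(5.0, 0.5, n)   # initial resistance (assumed distribution)
s = rng.normal(2.0, 0.3, n)    # static load effect (assumed distribution)
t_grid = np.linspace(0.0, 20.0, 41)

# The system survives the interval iff g(X, t) > 0 at every time point;
# evaluating g on the full (sample, time) grid mimics checking the
# instantaneous failure surface over the interval of interest.
g = limit_state(r0[:, None], s[:, None], t_grid[None, :])
reliability = np.mean(np.all(g > 0.0, axis=1))
```

    In a real application each limit-state evaluation would be an expensive model run, which is exactly why the paper replaces it with a sequentially updated Kriging surrogate.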

  18. Rapid and reliable protein structure determination via chemical shift threading.

    Science.gov (United States)

    Hafsa, Noor E; Berjanskii, Mark V; Arndt, David; Wishart, David S

    2018-01-01

    Protein structure determination using nuclear magnetic resonance (NMR) spectroscopy can be both time-consuming and labor-intensive. Here we demonstrate how chemical shift threading can permit rapid, robust, and accurate protein structure determination using only chemical shift data. Threading is a relatively old bioinformatics technique that uses a combination of sequence information and predicted (or experimentally acquired) low-resolution structural data to generate high-resolution 3D protein structures. The key motivations behind using NMR chemical shifts for protein threading lie in the fact that they are easy to measure, they are available prior to 3D structure determination, and they contain vital structural information. The method we have developed uses not only sequence and chemical shift similarity but also chemical shift-derived secondary structure, shift-derived super-secondary structure, and shift-derived accessible surface area to generate a high-quality protein structure regardless of the sequence similarity (or lack thereof) to a known structure already in the PDB. The method (called E-Thrifty) was found to be very fast. Together with chemical shift refinement, these results suggest that protein structure determination using only NMR chemical shifts is becoming increasingly practical and reliable. E-Thrifty is available as a web server at http://ethrifty.ca .

  19. HiRel: Hybrid Automated Reliability Predictor (HARP) integrated reliability tool system, (version 7.0). Volume 1: HARP introduction and user's guide

    Science.gov (United States)

    Bavuso, Salvatore J.; Rothmann, Elizabeth; Dugan, Joanne Bechta; Trivedi, Kishor S.; Mittal, Nitin; Boyd, Mark A.; Geist, Robert M.; Smotherman, Mark D.

    1994-01-01

    The Hybrid Automated Reliability Predictor (HARP) integrated Reliability (HiRel) tool system for reliability/availability prediction offers a toolbox of integrated reliability/availability programs that can be used to customize the user's application in a workstation or nonworkstation environment. HiRel consists of interactive graphical input/output programs and four reliability/availability modeling engines that provide analytical and simulative solutions to a wide range of fault-tolerant system architectures; it is also applicable to electronic systems in general. The tool system was designed to be compatible with most computing platforms and operating systems, and some programs have been beta tested within the aerospace community for over 8 years. Volume 1 provides an introduction to the HARP program. Comprehensive information on HARP mathematical models can be found in the references.

  20. Finite Element Modelling of a Field-Sensed Magnetic Suspended System for Accurate Proximity Measurement Based on a Sensor Fusion Algorithm with Unscented Kalman Filter.

    Science.gov (United States)

    Chowdhury, Amor; Sarjaš, Andrej

    2016-09-15

    The presented paper describes accurate distance measurement for a field-sensed magnetic suspension system. The proximity measurement is based on a Hall effect sensor. The proximity sensor is installed directly on the lower surface of the electro-magnet, which means that it is very sensitive to external magnetic influences and disturbances. External disturbances interfere with the information signal and reduce the usability and reliability of the proximity measurements and, consequently, the whole application operation. A sensor fusion algorithm is deployed for the aforementioned reasons. The sensor fusion algorithm is based on the Unscented Kalman Filter, where a nonlinear dynamic model was derived with the Finite Element Modelling approach. The advantage of such modelling is a more accurate dynamic model parameter estimation, especially in the case when the real structure, materials and dimensions of the real-time application are known. The novelty of the paper is the design of a compact electro-magnetic actuator with a built-in low cost proximity sensor for accurate proximity measurement of the magnetic object. The paper successively presents a modelling procedure with the finite element method, design and parameter settings of a sensor fusion algorithm with Unscented Kalman Filter and, finally, the implementation procedure and results of real-time operation.
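    At the core of the Unscented Kalman Filter described above is the unscented transform, which propagates a mean and covariance through a nonlinear function via deterministically chosen sigma points. A minimal sketch follows, using standard scaled sigma-point weights; the 1-D quadratic "measurement" is an invented example, not the paper's magnet model:

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-1, beta=2.0, kappa=0.0):
    """Propagate (mean, cov) through a nonlinear f using 2n+1 sigma points."""
    n = len(mean)
    lam = alpha ** 2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)
    sigma = [mean] + [mean + S[:, i] for i in range(n)] + [mean - S[:, i] for i in range(n)]
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))   # mean weights
    wc = wm.copy()                                   # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1 - alpha ** 2 + beta)          # beta=2 is optimal for Gaussians
    ys = np.array([f(s) for s in sigma])
    y_mean = wm @ ys
    diff = ys - y_mean
    y_cov = (wc[:, None] * diff).T @ diff
    return y_mean, y_cov

# Example: nonlinear measurement h(x) = x^2 of a 1-D Gaussian state
m, P = np.array([1.0]), np.array([[0.1]])
ym, yP = unscented_transform(m, P, lambda x: np.array([x[0] ** 2]))
```

    For x ~ N(1, 0.1) the true mean and variance of x² are 1.1 and 0.42; the transform recovers both without any Jacobian, which is the advantage over linearized (extended) Kalman filtering.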

  1. Waste package reliability analysis

    International Nuclear Information System (INIS)

    Pescatore, C.; Sastre, C.

    1983-01-01

    Proof of future performance of a complex system such as a high-level nuclear waste package over a period of hundreds to thousands of years cannot be had in the ordinary sense of the word. The general method of probabilistic reliability analysis could provide an acceptable framework to identify, organize, and convey the information necessary to satisfy the criterion of reasonable assurance of waste package performance according to the regulatory requirements set forth in 10 CFR 60. General principles which may be used to evaluate the qualitative and quantitative reliability of a waste package design are indicated and illustrated with a sample calculation of a repository concept in basalt. 8 references, 1 table

  2. Methods of Estimation the Reliability and Increasing the Informativeness of the Laboratory Results (Analysis of the Laboratory Case of Measurement the Indicators of Thyroid Function)

    OpenAIRE

    N A Kovyazina; N A Alhutova; N N Zybina; N M Kalinina

    2014-01-01

    The goal of the study was to demonstrate the multilevel laboratory quality management system and to point out methods of estimating the reliability and increasing the information content of laboratory results (using a laboratory case as an example). Results. The article examines the stages of laboratory quality management, which helped to estimate the reliability of the results of determining Free T3, Free T4 and TSH. The measurement results are presented with the expanded unce...

  3. Towards accurate prediction of unbalance response, oil whirl and oil whip of flexible rotors supported by hydrodynamic bearings

    NARCIS (Netherlands)

    Eling, R.P.T.; te Wierik, M.; van Ostayen, R.A.J.; Rixen, D.J.

    2016-01-01

    Journal bearings are used to support rotors in a wide range of applications. In order to ensure reliable operation, accurate analyses of these rotor-bearing systems are crucial. Coupled analysis of the rotor and the journal bearing is essential in the case that the rotor is flexible. The accuracy of

  4. Reliability assessment using Bayesian networks. Case study on quantative reliability estimation of a software-based motor protection relay

    International Nuclear Information System (INIS)

    Helminen, A.; Pulkkinen, U.

    2003-06-01

    In this report a quantitative reliability assessment of the motor protection relay SPAM 150 C has been carried out. The assessment focuses on the methodological analysis of quantitative reliability assessment, using the software-based motor protection relay as a case study. The assessment method is based on Bayesian networks and builds on previous work done in the Programmable Automation System Safety Integrity Assessment (PASSI) project. From the results and experiences achieved during the work it is justified to claim that the assessment method presented here enables a flexible use of qualitative and quantitative elements of reliability-related evidence in a single reliability assessment. At the same time, the assessment method offers a coherent way of reasoning about one's beliefs and assumptions regarding the reliability of the system. Full advantage of the assessment method is taken when it is used as a way to cultivate the information related to the reliability of software-based systems. The method can also be used as a communication instrument in the licensing process for software-based systems. (orig.)
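    As a toy illustration of the Bayesian idea of combining prior belief with observed evidence (not the paper's actual network for the SPAM 150 C relay), a Beta prior on a per-demand failure probability can be updated with observed demand counts; the prior parameters below are invented:

```python
from fractions import Fraction

# Beta(a, b) prior on the per-demand failure probability. After observing
# `failures` failed demands and `successes` successful ones, the posterior
# is Beta(a + failures, b + successes), whose mean is computed exactly here.
def posterior_mean(a, b, failures, successes):
    return Fraction(a + failures, a + b + failures + successes)

prior_a, prior_b = 1, 99  # assumed prior belief: failure probability around 1%
post = posterior_mean(prior_a, prior_b, failures=0, successes=500)
```

    Observing 500 failure-free demands pulls the estimated failure probability from 1/100 down to 1/600; qualitative evidence (e.g. development process quality) enters a full Bayesian network as additional conditioning nodes rather than demand counts.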

  5. The availability of reliable information about medicines in Serbia for health professionals summary of product characteristics

    Directory of Open Access Journals (Sweden)

    Đukić Ljiljana C.

    2015-01-01

    Introduction: Today, there are many drugs for the treatment of a large number of indication areas. Significant financial resources are invested in research with the aim of introducing reliable therapeutics into therapy. Therefore, it is necessary to provide health care professionals with exact information about new therapies. The exchange of scientific data, ideas and information is made possible by numerous modern IT tools. Methodology: According to the law, key information on a registered drug is included in the Summary of Product Characteristics (SPC) for health professionals, which is harmonized with EU directives and regulations (SmPC). The content and structure of the information provided in the SPC is determined in EU guidelines; therefore, a unique set of data is established for all drugs registered in Serbia. Topic: This paper presents the key segments of the SPC, with special reference to the regulations governing data related to indications, mechanism of action, dosage, contraindications, side effects, interactions and other important information regarding the profile of the drug, which are standardized and harmonized with the structure of identical documents at the EU level (EMEA). Conclusions: The SPC is the regulatory technical document on medicinal products in the Republic of Serbia that lists scientifically proven clinical and pharmacological data and information on the profile of the drug, which are essential for health professionals - doctors and pharmacists - in the implementation of pharmacotherapy. This document is the starting point for the development of applied pharmacoinformatics; it makes available data and information for monitoring indicators of the national policy on drugs and supports modern, effective drug treatment.

  6. Reliability-based optimization of an active vibration controller using evolutionary algorithms

    Science.gov (United States)

    Saraygord Afshari, Sajad; Pourtakdoust, Seid H.

    2017-04-01

    Many modern industrialized systems such as aircraft, rotating turbines, satellite booms, etc. cannot perform their desired tasks accurately if their uninhibited structural vibrations are not controlled properly. Structural health monitoring and online reliability calculations are emerging means of handling system-imposed uncertainties. Since stochastic forcing is unavoidable in most engineering systems, it often needs to be taken into account in the control design process. In this research, smart material technology is utilized for structural health monitoring and control in order to keep the system in a reliable performance range. In this regard, a reliability-based cost function is assigned both for controller gain optimization and for sensor placement. The proposed scheme is implemented and verified for a wing section. Comparison of results for the frequency responses is considered to show the potential applicability of the presented technique.

  7. Guidelines for Reporting Reliability and Agreement Studies (GRRAS) were proposed

    DEFF Research Database (Denmark)

    Kottner, Jan; Audigé, Laurent; Brorson, Stig

    2011-01-01

    Results of reliability and agreement studies are intended to provide information about the amount of error inherent in any diagnosis, score, or measurement. The level of reliability and agreement among users of scales, instruments, or classifications is widely unknown. Therefore, there is a need for such studies; however, standards or guidelines for reporting reliability and agreement in the health care and medical field are lacking. The objective was to develop guidelines for reporting reliability and agreement studies.

  8. Reliability of the Alzheimer's disease assessment scale (ADAS-Cog) in longitudinal studies.

    Science.gov (United States)

    Khan, Anzalee; Yavorsky, Christian; DiClemente, Guillermo; Opler, Mark; Liechti, Stacy; Rothman, Brian; Jovic, Sofija

    2013-11-01

    Considering the scarcity of longitudinal assessments of reliability, there is a need for a more precise understanding of cognitive decline in Alzheimer's Disease (AD). The primary goal was to assess longitudinal changes in inter-rater reliability, test-retest reliability and internal consistency of scores on the ADAS-Cog. 2,618 AD subjects were enrolled in seven randomized, double-blind, placebo-controlled, multicenter trials from 1986 to 2009. Reliability, internal consistency and cross-sectional analyses of the ADAS-Cog and MMSE across seven visits were examined. Intra-class correlation (ICC) for the ADAS-Cog was moderate to high, supporting its reliability. Absolute-agreement ICCs ranged from 0.806 (Visit 2) to 0.392 (Visit 7), showing a progressive decrease in correlations across time. Item analysis revealed a decrease in item correlations, with the lowest correlations at Visit 7 for Commands (ICC = 0.148), Comprehension (ICC = 0.092), and Spoken Language (ICC = 0.044). Suitable assessment of AD treatments is maintained through accurate measurement of clinically significant outcomes. Targeted rater education on ADAS-Cog items over time can improve the ability to administer and score the scale.
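    The absolute-agreement ICC reported above comes from a two-way ANOVA decomposition of a subjects-by-raters score matrix. A sketch of ICC(2,1) follows, using the classic Shrout-Fleiss textbook example data rather than the trial data from this study:

```python
import numpy as np

def icc_absolute_agreement(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    ratings: (n_subjects, k_raters) score matrix."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # subject means
    col_means = x.mean(axis=0)   # rater means
    ms_rows = k * np.sum((row_means - grand) ** 2) / (n - 1)
    ms_cols = n * np.sum((col_means - grand) ** 2) / (k - 1)
    sse = np.sum((x - row_means[:, None] - col_means[None, :] + grand) ** 2)
    ms_err = sse / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Shrout & Fleiss (1979) example: 6 subjects rated by 4 judges
scores = [[9, 2, 5, 8], [6, 1, 3, 2], [8, 4, 6, 8],
          [7, 1, 2, 6], [10, 5, 6, 9], [6, 2, 4, 7]]
icc = icc_absolute_agreement(scores)
```

    For this data set the published ICC(2,1) value is about 0.29; absolute agreement penalizes systematic rater differences, which is why it declines when raters drift apart over visits.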

  9. Reliability Information Analysis Center 1st Quarter 2007, Technical Area Task (TAT) Report

    Science.gov (United States)

    2007-02-05

    … of collected reliability data and have discovered that even with sparse data, analysis of the data shows clustering of reliability data by equipment … intended search target. Conceptually cluster discovered data to allow more detailed analysis by equipment type. For example, it may be useful to …

  10. Culture Representation in Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    David Gertman; Julie Marble; Steven Novack

    2006-12-01

    Understanding human-system response is critical to being able to plan and predict mission success in the modern battlespace. Commonly, human reliability analysis has been used to predict failures of human performance in complex, critical systems. However, most human reliability methods fail to take culture into account. This paper takes an easily understood state of the art human reliability analysis method and extends that method to account for the influence of culture, including acceptance of new technology, upon performance. The cultural parameters used to modify the human reliability analysis were determined from two standard industry approaches to cultural assessment: Hofstede’s (1991) cultural factors and Davis’ (1989) technology acceptance model (TAM). The result is called the Culture Adjustment Method (CAM). An example is presented that (1) reviews human reliability assessment with and without cultural attributes for a Supervisory Control and Data Acquisition (SCADA) system attack, (2) demonstrates how country specific information can be used to increase the realism of HRA modeling, and (3) discusses the differences in human error probability estimates arising from cultural differences.

  11. A universal and reliable assay for molecular sex identification of three-spined sticklebacks (Gasterosteus aculeatus).

    Science.gov (United States)

    Toli, E-A; Calboli, F C F; Shikano, T; Merilä, J

    2016-11-01

    In heterogametic species, biological differences between the two sexes are ubiquitous, and hence, errors in sex identification can be a significant source of noise and bias in studies where sex-related sources of variation are of interest or need to be controlled for. We developed and validated a universal multimarker assay for reliable sex identification of three-spined sticklebacks (Gasterosteus aculeatus). The assay makes use of genotype scores from three sex-linked loci and utilizes Bayesian probabilistic inference to identify sex of the genotyped individuals. The results, validated with 286 phenotypically sexed individuals from six populations of sticklebacks representing all major genetic lineages (cf. Pacific, Atlantic and Japan Sea), indicate that in contrast to commonly used single-marker-based sex identification assays, the developed multimarker assay should be 100% accurate. As the markers in the assay can be scored from agarose gels, it provides a quick and cost-efficient tool for universal sex identification of three-spined sticklebacks. The general principle of combining information from multiple markers to improve the reliability of sex identification is transferable and can be utilized to develop and validate similar assays for other species. © 2016 John Wiley & Sons Ltd.
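    The principle of combining evidence from multiple markers can be sketched with Bayes' rule under an independence assumption; the per-locus likelihoods below are hypothetical placeholders, not the validated values from the stickleback panel:

```python
# Hypothetical per-locus probabilities of observing a "male-type" genotype
# score given the true sex; a real assay would estimate these from
# phenotypically sexed reference individuals.
likelihoods = {
    "locus1": {"M": 0.95, "F": 0.10},
    "locus2": {"M": 0.90, "F": 0.05},
    "locus3": {"M": 0.80, "F": 0.20},
}

def posterior_male(observed_male_type, prior_male=0.5):
    """Posterior P(male) given loci that showed a male-type genotype score."""
    pm, pf = prior_male, 1.0 - prior_male
    for locus in observed_male_type:
        pm *= likelihoods[locus]["M"]
        pf *= likelihoods[locus]["F"]
    return pm / (pm + pf)

p = posterior_male(["locus1", "locus2", "locus3"])
```

    A single imperfect marker leaves noticeable uncertainty, but three concordant markers drive the posterior close to 1, which is the intuition behind the multimarker assay's improved reliability.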

  12. Standardization of reliability reporting for cochlear implants: an interim report.

    Science.gov (United States)

    Backous, Douglas D; Watson, Stacey D

    2007-04-01

    still in situ due to patient choice not to be re-implanted are considered category C and included in CSR reports. Implants that cannot be classified at explant are placed in an "under investigation" category while evaluation is completed. If no classification is made by 6 months, these devices will be included in the CSR report. Notification to the implant center regarding "in" or "out of specification" will be made within 60 d of the explant arriving at the manufacturer with final root cause of failure reported to centers when complete. Information will be passed on to patients by members of the implant team. A standardized form will be created to provide the manufacturers with necessary patient information to guide reliability analysis, including performance after re-implant. The standard for reliability reporting described in this paper improves patient care by presenting data which are understandable to clinicians delivering cochlear implant services. It fosters fair and accurate reporting without discriminating or granting perceived advantage to any manufacturer. This standard provides a basis for reporting research related to or including device reliability in the medical literature.

  13. Reliability estimation for check valves and other components

    International Nuclear Information System (INIS)

    McElhaney, K.L.; Staunton, R.H.

    1996-01-01

    For years the nuclear industry has depended upon component operational reliability information compiled from reliability handbooks and other generic sources as well as private databases generated by recognized experts both within and outside the nuclear industry. Regrettably, these technical bases lacked the benefit of large-scale operational data and comprehensive data verification, and did not take into account the parameters and combinations of parameters that affect the determination of failure rates. This paper briefly examines the historic use of generic component reliability data, its sources, and its limitations. The concept of using a single failure rate for a particular component type is also examined. Particular emphasis is placed on check valves due to the information available on those components. The Appendix presents some of the results of the extensive analyses done by Oak Ridge National Laboratory (ORNL) on check valve performance

  14. Validation of the Malay version of the Amsterdam Preoperative Anxiety and Information Scale (APAIS).

    Science.gov (United States)

    Mohd Fahmi, Z; Lai, L L; Loh, P S

    2015-08-01

    Preoperative anxiety is a significant problem worldwide that may affect patients' surgical outcomes. By using a simple and reliable tool such as the Amsterdam Preoperative Anxiety and Information Scale (APAIS), anaesthesiologists can assess preoperative anxiety adequately and accurately. The purpose of this study was to develop and validate the Malay version of the APAIS (Malay-APAIS) and to assess the factors associated with higher anxiety scores. The authors performed forward and backward translation of the APAIS into Malay and then tested it on 200 patients in the anaesthetic clinic of University Malaya Medical Centre. Psychometric analysis was performed with factor analysis, internal consistency and correlation with Spielberger's State-Trait Anxiety Inventory (STAI-state). A good correlation was shown with STAI-state (r = 0.59). Anxiety and need for information both emerged with high internal consistency (Cronbach's alpha 0.93 and 0.90, respectively). Female gender, higher-risk surgery and a greater need for information were found to be associated with higher anxiety scores. On the other hand, patients with previous surgical experience had a lower need for information. The Malay-APAIS is a valid and reliable tool for the assessment of patients' preoperative anxiety and their need for information. By understanding and measuring patients' concerns objectively, perioperative management can improve to a much higher standard of care.
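    The internal-consistency statistic reported above, Cronbach's alpha, is straightforward to compute from an item-score matrix. A sketch with invented toy data (not the APAIS responses):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha: items is an (n_respondents, k_items) score matrix."""
    x = np.asarray(items, dtype=float)
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1)          # variance of each item
    total_var = x.sum(axis=1).var(ddof=1)      # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Toy data: three items that largely rise and fall together across 5 respondents
data = [[3, 4, 3], [5, 5, 4], [1, 2, 2], [4, 4, 5], [2, 1, 1]]
alpha = cronbach_alpha(data)
```

    When items co-vary strongly, as in this toy matrix, alpha approaches 1; values around 0.9, as reported for the Malay-APAIS subscales, indicate high internal consistency.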

  15. Systems reliability/structural reliability

    International Nuclear Information System (INIS)

    Green, A.E.

    1980-01-01

    The question of reliability technology using quantified techniques is considered for systems and structures. Systems reliability analysis has progressed to a viable and proven methodology, whereas this has yet to be fully achieved for large-scale structures. Structural loading variations over the lifetime of the plant are considered more difficult to analyse than those for systems, even though a relatively crude model may be a necessary starting point. Various reliability characteristics and environmental conditions which enter this problem are considered. The rare-event situation is briefly mentioned, together with aspects of proof testing and normal and upset loading conditions. (orig.)

  16. The instantaneous linear motion information measurement method based on inertial sensors for ships

    Science.gov (United States)

    Yang, Xu; Huang, Jing; Gao, Chen; Quan, Wei; Li, Ming; Zhang, Yanshun

    2018-05-01

    Instantaneous linear motion information is an important foundation for ship control and needs to be measured accurately. For this purpose, an instantaneous linear motion measurement method based on inertial sensors is put forward for ships. By introducing a half-fixed coordinate system to separate instantaneous linear motion from the ship's main motion, the instantaneous linear acceleration of the ship can be obtained with higher accuracy. Then, a digital high-pass filter is applied to suppress the velocity error caused by low-frequency signals such as the Schuler period. Finally, the instantaneous linear motion displacement of the ship can be measured accurately. Simulation results show that the method is reliable and effective, and can realize precise measurement of the velocity and displacement of instantaneous linear motion for ships.
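    The high-pass filtering step can be sketched with a first-order discretized RC filter: a slow drift component (standing in for Schuler-period error, whose real period is about 84.4 minutes) is attenuated while a fast motion signal passes. The sample rate, cutoff and signal frequencies below are illustrative:

```python
import numpy as np

def highpass(x, dt, f_cut):
    """First-order digital high-pass filter (discretized RC filter)."""
    rc = 1.0 / (2.0 * np.pi * f_cut)
    a = rc / (rc + dt)
    y = np.zeros_like(x)
    for i in range(1, len(x)):
        y[i] = a * (y[i - 1] + x[i] - x[i - 1])
    return y

dt = 0.1                                   # 10 Hz sampling (illustrative)
t = np.arange(0, 200, dt)
slow = np.sin(2 * np.pi * t / 84.4)        # slow drift (stand-in for Schuler error)
fast = 0.5 * np.sin(2 * np.pi * 1.0 * t)   # 1 Hz instantaneous motion
y = highpass(slow + fast, dt, f_cut=0.1)
```

    With a 0.1 Hz cutoff the slow component is strongly attenuated while the 1 Hz component passes nearly unchanged; a production filter would typically be higher order to sharpen this separation.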

  17. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book is about reliability engineering. It covers the definition and importance of reliability; the development of reliability engineering; the failure rate and failure probability density function and their types; CFR and the exponential distribution; IFR and the normal and Weibull distributions; maintainability and availability; reliability testing and reliability estimation for the exponential, normal and Weibull distribution types; reliability sampling tests; system reliability; design for reliability; and functional failure analysis by FTA.

  18. Frame-of-Reference Training: Establishing Reliable Assessment of Teaching Effectiveness.

    Science.gov (United States)

    Newman, Lori R; Brodsky, Dara; Jones, Richard N; Schwartzstein, Richard M; Atkins, Katharyn Meredith; Roberts, David H

    2016-01-01

    Frame-of-reference (FOR) training has been used successfully to teach faculty how to produce accurate and reliable workplace-based ratings when assessing a performance. We engaged 21 Harvard Medical School faculty members in our pilot and implementation studies to determine the effectiveness of using FOR training to assess health professionals' teaching performances. All faculty were novices at rating their peers' teaching effectiveness. Before FOR training, we asked participants to evaluate a recorded lecture using a criterion-based peer assessment of medical lecturing instrument. At the start of training, we discussed the instrument and emphasized its precise behavioral standards. During training, participants practiced rating lectures and received immediate feedback on how well they categorized and scored performances as compared with expert-derived scores of the same lectures. At the conclusion of the training, we asked participants to rate a post-training recorded lecture to determine agreement with the experts' scores. Participants and experts had greater rating agreement for the post-training lecture compared with the pretraining lecture. Through this investigation, we determined that FOR training is a feasible method to teach faculty how to accurately and reliably assess medical lectures. Medical school instructors and continuing education presenters should have the opportunity to be observed and receive feedback from trained peer observers. Our results show that it is possible to use FOR rater training to teach peer observers how to accurately rate medical lectures. The process is time efficient and offers the prospect for assessment and feedback beyond traditional learner evaluation of instruction.

  18. An Effective Method to Accurately Calculate the Phase Space Factors for β⁻β⁻ Decay

    International Nuclear Information System (INIS)

    Horoi, Mihai; Neacsu, Andrei

    2016-01-01

    Accurate calculations of the electron phase space factors are necessary for reliable predictions of double-beta decay rates and for the analysis of the associated electron angular and energy distributions. We present an effective method to calculate these phase space factors that takes into account the distorted Coulomb field of the daughter nucleus, yet it allows one to easily calculate the phase space factors with good accuracy relative to the most exact methods available in the recent literature.

  20. Reliability metrics extraction for power electronics converter stressed by thermal cycles

    DEFF Research Database (Denmark)

    Ma, Ke; Choi, Uimin; Blaabjerg, Frede

    2017-01-01

    Due to the continuous demands for highly reliable and cost-effective power conversion, quantified reliability performance of the power electronics converter is becoming an emerging need. The existing reliability modelling approaches for the power electronics converter mainly focus on the pr...... performance of power electronics system. The final predicted results showed good accuracy with much more reliability information compared to the existing approaches, and the quantified reliability correlation to the mission profiles of the converter is mathematically established....

  1. Reliability of community health worker collected data for planning and policy in a peri-urban area of Kisumu, Kenya.

    Science.gov (United States)

    Otieno, C F; Kaseje, D; Ochieng', B M; Githae, M N

    2012-02-01

    community. The general objective of this article is to investigate the validity and reliability of community-based information, addressing the research question: what is the reliability of data collected at the community level by community health workers (CHWs)? To answer this question, ten percent of all households visited by CHWs for data collection were revisited by a technically trained team, and the test/retest method was applied to the data to establish reliability. The Kappa score, sensitivity, specificity and positive predictive values were also used to measure reliability. Our findings are as follows: latrine availability and antenatal care showed good correspondence between the two sets of data, as did the exclusive breastfeeding indicator. Measles immunization coverage showed less consistency than the rest of the child health indicators. We conclude that CHWs can accurately and reliably collect household data which can be used for health decisions and actions, especially in resource-poor settings where other approaches to population-based data are too expensive.
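The test/retest agreement statistics named in this record (Kappa, sensitivity, specificity) are straightforward to compute. A minimal sketch of Cohen's kappa follows; the binary codes below are invented for illustration and are not the study's data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Test/retest agreement between two sets of categorical codes."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal frequencies.
    pa, pb = Counter(rater_a), Counter(rater_b)
    expected = sum((pa[k] / n) * (pb[k] / n) for k in set(rater_a) | set(rater_b))
    return (observed - expected) / (1 - expected)

# Hypothetical latrine-availability codes: CHW visit vs trained-team revisit.
chw     = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
revisit = [1, 1, 0, 1, 1, 1, 1, 0, 1, 1]
print(round(cohens_kappa(chw, revisit), 3))  # → 0.737
```

A kappa near 0.7 would usually be read as substantial agreement; perfect agreement gives 1.0.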

  2. Short-Term and Medium-Term Reliability Evaluation for Power Systems With High Penetration of Wind Power

    DEFF Research Database (Denmark)

    Ding, Yi; Singh, Chanan; Goel, Lalit

    2014-01-01

    The expanding share of the fluctuating and less predictable wind power generation can introduce complexities in power system reliability evaluation and management. This entails a need for the system operator to assess the system status more accurately for securing real-time balancing. The existing reliability evaluation techniques for power systems are well developed. These techniques are more focused on steady-state (time-independent) reliability evaluation and have been successfully applied in power system planning and expansion. In the operational phase, however, they may be too rough an approximation of the time-varying behavior of power systems with high penetration of wind power. This paper proposes a time-varying reliability assessment technique. Time-varying reliability models for wind farms, conventional generating units, and rapid start-up generating units are developed and represented...

  3. NDE reliability and SAFT-UT final development

    International Nuclear Information System (INIS)

    Doctor, S.R.; Deffenbaugh, J.D.; Good, M.S.; Green, E.R.; Heasler, P.G.; Reid, L.D.; Simonen, F.A.; Spanner, J.C.; Taylor, T.T.; Vo, T.V.

    1990-01-01

    The Evaluation and Improvement of NDE Reliability for Inservice Inspection of Light Water Reactors (NDE Reliability) program at the Pacific Northwest Laboratory (PNL) was established by the US Nuclear Regulatory Commission (NRC) to determine the reliability of current inservice inspection (ISI) techniques and to develop recommendations that will ensure a suitably high inspection reliability. This is a progress report covering the programmatic work from October 1987 through September 1988. The program for Validation and Technology Transfer for SAFT-UT is designed to accomplish the final step of moving research results into beneficial application. Accomplishments for FY88 in Synthetic Aperture Focusing of Ultrasonic Test data (SAFT-UT) under this program are discussed in this paper. The information is treated under the topics of Code Activities, Field Validation, and Seminars. (orig.)

  4. Reliability in maintenance and design of elastomer sealed closures

    International Nuclear Information System (INIS)

    Lake, W.H.

    1978-01-01

    Reliability methods are considered for the maintenance and design of elastomer sealed containment closures. Component reliability is used to establish a replacement schedule for system maintenance. Reliability data on elastomer seals is used to evaluate the common practice of annual replacement, and to calculate component reliability values for several typical shipment time periods. System reliability methods are used to examine the relative merits of typical closure designs. These include single component and redundant seal closures, with and without closure verification testing. The paper presents a general method of quantifying the merits of closure designs through the use of reliability analysis, which is a probabilistic technique. The reference list offers a general source of information in the field of reliability, and should offer the opportunity to extend the procedures discussed in this paper to other design safety applications.
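The single-versus-redundant seal comparison described in this record can be sketched with a constant-failure-rate model. The failure rate below is illustrative only; real values would come from elastomer seal reliability data:

```python
import math

def seal_reliability(failure_rate_per_year, years):
    """Constant-failure-rate (exponential) model for one elastomer seal."""
    return math.exp(-failure_rate_per_year * years)

# Illustrative rate; the annual-replacement interval is the one the paper evaluates.
r = seal_reliability(0.05, 1.0)       # one seal surviving the annual interval
single = r
redundant = 1.0 - (1.0 - r) ** 2      # containment holds if either of two seals holds
print(f"single={single:.4f} redundant={redundant:.6f}")
```

The redundant closure's unreliability is the square of the single seal's, which is why redundancy (with verification testing) dominates the comparison.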

  5. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.

  6. Analysis of Parking Reliability Guidance of Urban Parking Variable Message Sign System

    Directory of Open Access Journals (Sweden)

    Zhenyu Mei

    2012-01-01

    Operators of parking guidance and information systems (PGIS often encounter difficulty in determining when and how to provide reliable car park availability information to drivers. Reliability has become a key factor to ensure the benefits of urban PGIS. The present paper is the first to define the guiding parking reliability of urban parking variable message signs (VMSs. By analyzing the parking choice under guiding and optional parking lots, a guiding parking reliability model was constructed. A mathematical program was formulated to determine the guiding parking reliability of VMS. The procedures were applied to a numerical example, and the factors that affect guiding reliability were analyzed. The quantitative changes of the parking berths and the display conditions of VMS were found to be the most important factors influencing guiding reliability. The parking guiding VMS achieved the best benefit when the parking supply was close to or was less than the demand. The combination of a guiding parking reliability model and parking choice behavior offers potential for PGIS operators to reduce traffic congestion in central city areas.

  7. How Do Qataris Source Health Information?

    Directory of Open Access Journals (Sweden)

    Sopna M Choudhury

    population. Internet search engines can be utilized to guide users to websites, developed and monitored by healthcare providers, to help convey reliable and accurate health information to Qatar's growing population.

  8. Component aging and reliability trends in Loviisa Nuclear Power Plant

    International Nuclear Information System (INIS)

    Jankala, K.E.; Vaurio, J.K.

    1989-01-01

    A plant-specific reliability data collection and analysis system has been developed at the Loviisa Nuclear Power Plant to perform tests for component aging and analysis of reliability trends. The system yields both mean values and uncertainty distribution information for reliability parameters to be used in the PSA project underway and in living-PSA applications. Several different trend models are included in the reliability analysis system. Simple analytical expressions have been derived for the parameters of these models, and their variances have been obtained using the information matrix. This paper is focused on the details of the learning/aging models and the estimation of their parameters and statistical accuracies. Applications to the historical data of the Loviisa plant are presented. The results indicate both up- and down-trends in failure rates as well as individuality between nominally identical components.

  9. Application of fault tree analysis for customer reliability assessment of a distribution power system

    International Nuclear Information System (INIS)

    Abdul Rahman, Fariz; Varuttamaseni, Athi; Kintner-Meyer, Michael; Lee, John C.

    2013-01-01

    A new method is developed for predicting customer reliability of a distribution power system using the fault tree approach with customer weighted values of component failure frequencies and downtimes. Conventional customer reliability prediction of the electric grid employs the system average (SA) component failure frequency and downtime that are weighted by only the quantity of the components in the system. These SA parameters are then used to calculate the reliability and availability of components in the system, and eventually to find the effect on customer reliability. Although this approach is intuitive, information is lost regarding customer disturbance experiences when customer information is not utilized in the SA parameter calculations, contributing to inaccuracies when predicting customer reliability indices in our study. Hence our new approach directly incorporates customer disturbance information in component failure frequency and downtime calculations by weighting these parameters with information of customer interruptions. This customer weighted (CW) approach significantly improves the prediction of customer reliability indices when applied to our reliability model with fault tree and two-state Markov chain formulations. Our method has been successfully applied to an actual distribution power system that serves over 2.1 million customers. Our results show an improved benchmarking performance on the system average interruption frequency index (SAIFI) by 26% between the SA-based and CW-based reliability calculations. - Highlights: ► We model the reliability of a power system with fault tree and two-state Markov chain. ► We propose using customer weighted component failure frequencies and downtimes. ► Results show customer weighted values perform superior to component average values. ► This method successfully incorporates customer disturbance information into the model.
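The contrast between system-average (SA) and customer-weighted (CW) parameters in this record can be illustrated with a toy SAIFI calculation. All component names and numbers below are hypothetical:

```python
# Hypothetical outage records: annual failure frequency per component and the
# number of customers downstream of each.
components = [
    {"name": "transformer", "failures_per_year": 3, "customers": 1200},
    {"name": "breaker",     "failures_per_year": 1, "customers": 5000},
    {"name": "line",        "failures_per_year": 6, "customers": 300},
]
customers_served = 10000

# System-average failure frequency: weighted only by component count.
sa_freq = sum(c["failures_per_year"] for c in components) / len(components)

# Customer-weighted failure frequency: weighted by customers affected.
cw_freq = (sum(c["failures_per_year"] * c["customers"] for c in components)
           / sum(c["customers"] for c in components))

# SAIFI: total customer interruptions per customer served.
saifi = (sum(c["failures_per_year"] * c["customers"] for c in components)
         / customers_served)

print(round(sa_freq, 3), round(cw_freq, 3), round(saifi, 3))  # → 3.333 1.6 1.04
```

Here the frequently failing line serves few customers, so the CW frequency is far below the SA one; this is the kind of discrepancy the customer-weighted approach is designed to capture.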

  10. Reliability of Bluetooth Technology for Travel Time Estimation

    DEFF Research Database (Denmark)

    Araghi, Bahar Namaki; Olesen, Jonas Hammershøj; Krishnan, Rajesh

    2015-01-01

    . However, their corresponding impacts on accuracy and reliability of estimated travel time have not been evaluated. In this study, a controlled field experiment is conducted to collect both Bluetooth and GPS data for 1000 trips to be used as the basis for evaluation. Data obtained by GPS logger is used...... to calculate actual travel time, referred to as ground truth, and to geo-code the Bluetooth detection events. In this setting, reliability is defined as the percentage of devices captured per trip during the experiment. It is found that, on average, Bluetooth-enabled devices will be detected 80% of the time......-range antennae detect Bluetooth-enabled devices in a closer location to the sensor, thus providing a more accurate travel time estimate. However, the smaller the size of the detection zone, the lower the penetration rate, which could itself influence the accuracy of estimates. Therefore, there has to be a trade...

  11. An integrated reliability management system for nuclear power plants

    International Nuclear Information System (INIS)

    Kimura, T.; Shimokawa, H.; Matsushima, H.

    1998-01-01

    The responsibility in the nuclear field of the Government, utilities and manufacturers has increased in the past years due to the need for stable operation and great reliability of nuclear power plants. The need to improve reliability applies not only to new plants but also to those now running, so several measures have been taken to improve reliability. In particular, the plant manufacturers have developed a reliability management system for each phase (planning, construction, maintenance and operation), and these have been integrated as a unified system. This integrated reliability management system for nuclear power plants contains information about plant performance, failures and incidents which have occurred in the plants. (author)

  12. Evaluation of Quality and Readability of Health Information Websites Identified through India’s Major Search Engines

    Directory of Open Access Journals (Sweden)

    S. Raj

    2016-01-01

    Background. The health information available on websites should be reliable and accurate in order for the community to make informed decisions. This study was done to assess the quality and readability of health information websites on the World Wide Web in India. Methods. This cross-sectional study was carried out in June 2014. The key words "Health" and "Information" were used on the search engines "Google" and "Yahoo." Out of 50 websites (25 from each search engine), after exclusion, 32 websites were evaluated. The LIDA tool was used to assess quality, whereas readability was assessed using the Flesch Reading Ease Score (FRES), Flesch-Kincaid Grade Level (FKGL), and SMOG. Results. Forty percent of websites (n=13) were sponsored by government. Health On the Net Code of Conduct (HONcode) certification was present on 50% (n=16) of websites. The mean LIDA score (74.31) was average. Only 3 websites scored high on the LIDA score. Only five had readability scores at the recommended sixth-grade level. Conclusion. Most health information websites had average quality, especially in terms of usability and reliability, and were written at high readability levels. Efforts are needed to develop health information websites which can help the general population in informed decision making.
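The Flesch Reading Ease Score used in this record has a simple closed form: 206.835 − 1.015 × (words per sentence) − 84.6 × (syllables per word). The sketch below uses a crude vowel-group syllable heuristic; real readability tools use pronunciation dictionaries:

```python
import re

def syllables(word):
    # Crude heuristic: count vowel groups; real tools use dictionaries.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    syl = sum(syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syl / len(words))

print(round(flesch_reading_ease("The cat sat on the mat. It was happy."), 1))  # → 108.3
```

Scores above 90 read at roughly a fifth-grade level; the study's finding is that most sites scored far lower, i.e., well above the recommended sixth-grade reading level.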

  13. Evaluation for nuclear safety-critical software reliability of DCS

    International Nuclear Information System (INIS)

    Liu Ying

    2015-01-01

    With the development of control and information technology at NPPs, software reliability is important because software failure is usually considered as one form of common cause failure in Digital I and C Systems (DCS). The reliability analysis of DCS, particularly qualitative and quantitative evaluation of nuclear safety-critical software reliability, poses a great challenge. To solve this problem, a comprehensive evaluation model and stage evaluation models are built in this paper, and prediction and sensitivity analysis are given for the models. This can provide a basis for evaluating the reliability and safety of DCS. (author)

  14. FRELIB, Failure Reliability Index Calculation

    International Nuclear Information System (INIS)

    Parkinson, D.B.; Oestergaard, C.

    1984-01-01

    1 - Description of problem or function: Calculation of the reliability index given the failure boundary. A linearization point (design point) is found on the failure boundary for a stationary reliability index (min) and a stationary failure probability density function along the failure boundary, provided that the basic variables are normally distributed. 2 - Method of solution: Iteration along the failure boundary which must be specified - together with its partial derivatives with respect to the basic variables - by the user in a subroutine FSUR. 3 - Restrictions on the complexity of the problem: No distribution information included (first-order-second-moment-method). 20 basic variables (could be extended)
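The iteration along the failure boundary that FRELIB performs can be sketched as a Hasofer-Lind first-order (FOSM) iteration. This is a generic sketch under the record's assumptions (independent normal basic variables, user-supplied failure function and partial derivatives), not FRELIB's actual code, and the limit state is illustrative:

```python
import math

def hl_reliability_index(g, grad, mu, sigma, iters=20):
    """Hasofer-Lind iteration for the reliability index beta, assuming
    independent normally distributed basic variables (first-order second-moment)."""
    u = [0.0] * len(mu)                        # start at the mean point
    beta = 0.0
    for _ in range(iters):
        x = [m + s * ui for m, s, ui in zip(mu, sigma, u)]
        gx = g(x)
        gu = [s * d for s, d in zip(sigma, grad(x))]   # gradient in u-space
        norm = math.sqrt(sum(d * d for d in gu))
        alpha = [d / norm for d in gu]
        beta = (gx - sum(d * ui for d, ui in zip(gu, u))) / norm
        u = [-a * beta for a in alpha]         # updated design point on g = 0
    return beta

# Illustrative linear failure boundary g = R - S (strength minus load).
g = lambda x: x[0] - x[1]
grad = lambda x: [1.0, -1.0]
beta = hl_reliability_index(g, grad, mu=[100.0, 60.0], sigma=[10.0, 15.0])
print(round(beta, 3))  # → 2.219
```

For a linear boundary the iteration converges in one step to the analytic value (μ_R − μ_S)/√(σ_R² + σ_S²); nonlinear boundaries require the full loop, which is the stationary-point search the record describes.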

  15. Reliability of Current Biokinetic and Dosimetric Models for Radionuclides: A Pilot Study

    Energy Technology Data Exchange (ETDEWEB)

    Leggett, Richard Wayne [ORNL; Eckerman, Keith F [ORNL; Meck, Robert A. [U.S. Nuclear Regulatory Commission

    2008-10-01

    This report describes the results of a pilot study of the reliability of the biokinetic and dosimetric models currently used by the U.S. Nuclear Regulatory Commission (NRC) as predictors of dose per unit internal or external exposure to radionuclides. The study examines the feasibility of critically evaluating the accuracy of these models for a comprehensive set of radionuclides of concern to the NRC. Each critical evaluation would include: identification of discrepancies between the models and current databases; characterization of uncertainties in model predictions of dose per unit intake or unit external exposure; characterization of variability in dose per unit intake or unit external exposure; and evaluation of prospects for development of more accurate models. Uncertainty refers here to the level of knowledge of a central value for a population, and variability refers to quantitative differences between different members of a population. This pilot study provides a critical assessment of models for selected radionuclides representing different levels of knowledge of dose per unit exposure. The main conclusions of this study are as follows: (1) To optimize the use of available NRC resources, the full study should focus on radionuclides most frequently encountered in the workplace or environment. A list of 50 radionuclides is proposed. (2) The reliability of a dose coefficient for inhalation or ingestion of a radionuclide (i.e., an estimate of dose per unit intake) may depend strongly on the specific application. Multiple characterizations of the uncertainty in a dose coefficient for inhalation or ingestion of a radionuclide may be needed for different forms of the radionuclide and different levels of information of that form available to the dose analyst. (3) A meaningful characterization of variability in dose per unit intake of a radionuclide requires detailed information on the biokinetics of the radionuclide and hence is not feasible for many infrequently

  16. A systematic review of reliability and objective criterion-related validity of physical activity questionnaires

    NARCIS (Netherlands)

    Helmerhorst, Hendrik J. F.; Brage, Søren; Warren, Janet; Besson, Herve; Ekelund, Ulf

    2012-01-01

    Physical inactivity is one of the four leading risk factors for global mortality. Accurate measurement of physical activity (PA) and in particular by physical activity questionnaires (PAQs) remains a challenge. The aim of this paper is to provide an updated systematic review of the reliability and

  17. Reliability measures in item response theory: manifest versus latent correlation functions.

    Science.gov (United States)

    Milanzi, Elasma; Molenberghs, Geert; Alonso, Ariel; Verbeke, Geert; De Boeck, Paul

    2015-02-01

    For item response theory (IRT) models, which belong to the class of generalized linear or non-linear mixed models, reliability at the scale of observed scores (i.e., manifest correlation) is more difficult to calculate than latent correlation based reliability, but usually of greater scientific interest. This is not least because it cannot be calculated explicitly when the logit link is used in conjunction with normal random effects. As such, approximations such as Fisher's information coefficient, Cronbach's α, or the latent correlation are calculated, allegedly because it is easy to do so. Cronbach's α has well-known and serious drawbacks, Fisher's information is not meaningful under certain circumstances, and there is an important but often overlooked difference between latent and manifest correlations. Here, manifest correlation refers to correlation between observed scores, while latent correlation refers to correlation between scores at the latent (e.g., logit or probit) scale. Thus, using one in place of the other can lead to erroneous conclusions. Taylor series based reliability measures, which are based on manifest correlation functions, are derived and a careful comparison of reliability measures based on latent correlations, Fisher's information, and exact reliability is carried out. The latent correlations are virtually always considerably higher than their manifest counterparts, Fisher's information measure shows no coherent behaviour (it is even negative in some cases), while the newly introduced Taylor series based approximations reflect the exact reliability very closely. Comparisons among the various types of correlations, for various IRT models, are made using algebraic expressions, Monte Carlo simulations, and data analysis. Given the light computational burden and the performance of Taylor series based reliability measures, their use is recommended. © 2014 The British Psychological Society.

  18. Reverse transcription-quantitative polymerase chain reaction: description of a RIN-based algorithm for accurate data normalization

    Directory of Open Access Journals (Sweden)

    Boissière-Michot Florence

    2009-04-01

    Background Reverse transcription-quantitative polymerase chain reaction (RT-qPCR is the gold standard technique for mRNA quantification, but appropriate normalization is required to obtain reliable data. Normalization to accurately quantitated RNA has been proposed as the most reliable method for in vivo biopsies. However, this approach does not correct differences in RNA integrity. Results In this study, we evaluated the effect of RNA degradation on the quantification of the relative expression of nine genes (18S, ACTB, ATUB, B2M, GAPDH, HPRT, POLR2L, PSMB6 and RPLP0 that cover a wide expression spectrum. Our results show that RNA degradation could introduce up to 100% error in gene expression measurements when RT-qPCR data were normalized to total RNA. To achieve greater resolution of small differences in transcript levels in degraded samples, we improved this normalization method by developing a corrective algorithm that compensates for the loss of RNA integrity. This approach allowed us to achieve higher accuracy, since the average error for quantitative measurements was reduced to 8%. Finally, we applied our normalization strategy to the quantification of EGFR, HER2 and HER3 in 104 rectal cancer biopsies. Taken together, our data show that normalization of gene expression measurements by taking into account also RNA degradation allows much more reliable sample comparison. Conclusion We developed a new normalization method of RT-qPCR data that compensates for loss of RNA integrity and therefore allows accurate gene expression quantification in human biopsies.

  19. The reliability of financial information of charitable organizations: an exploratory study based on the Benford’s Law

    Directory of Open Access Journals (Sweden)

    Marco Antonio Figueiredo Milani Filho

    2013-08-01

    Benford's Law (BL) is a logarithmic distribution which is useful to detect abnormal patterns of digits in number sets. It is often used as a primary data auditing method for detecting traces of errors, illegal practices or undesired occurrences, such as fraud and earnings management. In this descriptive study, I analyzed the financial information (revenue and expenditure) of the registered charitable hospitals located in Ontario and Quebec, which have the majority (71.4%) of these organizations within Canada. The aim of this study was to verify the reliability of the financial data of the respective hospitals, using the probability distribution predicted by Benford's Law as a proxy of reliability. The sample was composed of 1,334 observations related to 339 entities operating in the tax year 2009 and 328 entities in 2010, gathered from the Canada Revenue Agency's database. To analyze the discrepancies between the actual and expected frequencies of the first significant digit, two statistics were calculated: the Z-test and Pearson's chi-square test. The results show that, with a confidence level of 95%, the data sets of the organizations located in Ontario and Quebec have distributions similar to the BL, suggesting that, in a preliminary analysis, their financial data are free from bias.
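The first-digit test described in this record is easy to sketch: Benford's Law predicts a first-digit frequency of log₁₀(1 + 1/d), and Pearson's chi-square measures the discrepancy. The sketch below uses powers of two, a classic Benford-conforming set, as a stand-in for the hospitals' revenue data:

```python
import math
from collections import Counter

BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit(v):
    return int(str(abs(v)).lstrip("0.")[0])

def benford_chi_square(values):
    """Pearson chi-square of observed first significant digits vs Benford."""
    digits = [first_digit(v) for v in values if v]
    n = len(digits)
    obs = Counter(digits)
    return sum((obs[d] - n * p) ** 2 / (n * p) for d, p in BENFORD.items())

# Powers of two conform closely to Benford's Law (illustrative data only).
chi2 = benford_chi_square([2 ** k for k in range(60)])
print(chi2 < 15.51)   # below the 5% critical value for 8 degrees of freedom → True
```

A chi-square below the 8-degree-of-freedom critical value (15.51 at the 95% level) is the "similar distribution to the BL" conclusion the study reports.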

  20. Analysis of fatigue reliability for high temperature and high pressure multi-stage decompression control valve

    Science.gov (United States)

    Yu, Long; Xu, Juanjuan; Zhang, Lifang; Xu, Xiaogang

    2018-03-01

    A reliability mathematical model for the high temperature and high pressure multi-stage decompression control valve (HMDCV) is established based on stress-strength interference theory, and a temperature correction coefficient is introduced to revise the material fatigue limit at high temperature. The reliability of key dangerous components and the fatigue sensitivity curve of each component are calculated and analyzed by combining the fatigue life analysis of the control valve with the reliability model. The contribution of each component to fatigue failure of the control valve system is obtained. The results show that the temperature correction factor makes the theoretical reliability calculations more accurate, the predicted life of the main pressure parts meets the technical requirements, and the valve body and the sleeve have an obvious influence on system reliability; stress concentration in key parts of the control valve can be reduced in the design process by improving the structure.
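Stress-strength interference reliability has a closed form when stress and strength are independent normals: R = Φ((μ_strength − μ_stress)/√(σ_strength² + σ_stress²)). The sketch below also mimics the record's temperature correction by derating the strength; all numbers and the derating coefficient are illustrative, not the paper's values:

```python
import math

def interference_reliability(mu_strength, sd_strength, mu_stress, sd_stress):
    """P(strength > stress) for independent, normally distributed
    stress and strength (stress-strength interference theory)."""
    margin = (mu_strength - mu_stress) / math.sqrt(sd_strength ** 2 + sd_stress ** 2)
    return 0.5 * (1.0 + math.erf(margin / math.sqrt(2.0)))  # standard normal CDF

# Hypothetical temperature correction coefficient derating the fatigue limit.
k_temp = 0.85
r_nominal = interference_reliability(400.0, 30.0, 250.0, 25.0)
r_hot = interference_reliability(400.0 * k_temp, 30.0, 250.0, 25.0)
print(r_nominal > r_hot)   # derated strength lowers predicted reliability → True
```

Ignoring the high-temperature derating overstates reliability, which is the inaccuracy the temperature correction factor is meant to remove.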

  1. Dynamic sensing model for accurate detectability of environmental phenomena using event wireless sensor network

    Science.gov (United States)

    Missif, Lial Raja; Kadhum, Mohammad M.

    2017-09-01

    Wireless Sensor Network (WSN) has been widely used for monitoring, where sensors are deployed to operate independently to sense abnormal phenomena. Most of the proposed environmental monitoring systems are designed based on a predetermined sensing range which does not reflect the sensor reliability, event characteristics, and the environment conditions. Measuring the capability of a sensor node to accurately detect an event within a sensing field is of great importance for monitoring applications. This paper presents an efficient mechanism for event detection based on a probabilistic sensing model. Different models are presented theoretically in this paper to examine their adaptability and applicability to real environment applications. The numerical results of the experimental evaluation have shown that the probabilistic sensing model provides accurate observation and detectability of an event, and it can be utilized for different environment scenarios.
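A common probabilistic sensing model of the kind this record contrasts with the binary-disk assumption is the Elfes-style model: certain detection up to one radius, exponentially decaying detection confidence out to a maximum radius, zero beyond. The parameters below are illustrative, not the paper's:

```python
import math

def detection_probability(distance, r_certain=5.0, r_max=15.0, lam=0.4):
    """Elfes-style probabilistic sensing model: certain detection inside
    r_certain, exponentially decaying confidence out to r_max, zero beyond."""
    if distance <= r_certain:
        return 1.0
    if distance >= r_max:
        return 0.0
    return math.exp(-lam * (distance - r_certain))

for d in (3.0, 8.0, 14.0, 20.0):
    print(d, round(detection_probability(d), 3))
```

A binary-disk model would report 1.0 out to a fixed range and 0.0 beyond; the smooth decay is what lets the probabilistic model reflect sensor reliability and environment conditions.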

  2. A simulation model for reliability evaluation of Space Station power systems

    Science.gov (United States)

    Singh, C.; Patton, A. D.; Kumar, Mudit; Wagner, H.

    1988-01-01

    A detailed simulation model for the hybrid Space Station power system is presented which allows photovoltaic and solar dynamic power sources to be mixed in varying proportions. The model considers the dependence of reliability and storage characteristics during the sun and eclipse periods, and makes it possible to model the charging and discharging of the energy storage modules in a relatively accurate manner on a continuous basis.
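The sun/eclipse charge-discharge bookkeeping that such a simulation performs each orbit can be sketched deterministically. Every number below is invented for illustration; the actual model is stochastic and far more detailed:

```python
# Minimal orbit-by-orbit energy bookkeeping for a power source plus storage.
SUN_MIN, ECLIPSE_MIN = 60, 35        # minutes per orbit phase (illustrative)
GEN_KW, LOAD_KW = 75.0, 50.0         # generation in sunlight, constant load
CAP_KWH = 40.0                       # storage module capacity

charge = CAP_KWH
for orbit in range(3):
    surplus = (GEN_KW - LOAD_KW) * SUN_MIN / 60.0     # charging in sunlight
    charge = min(CAP_KWH, charge + surplus)           # storage saturates
    charge -= LOAD_KW * ECLIPSE_MIN / 60.0            # discharge in eclipse
    print(orbit, round(charge, 2))
```

When the per-orbit deficit exceeds the recharge, storage drains over successive orbits; tracking exactly this continuous behavior through sun and eclipse periods is what the simulation model does, with failure and repair events added on top.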

  3. Analytical approach for confirming the achievement of LMFBR reliability goals

    International Nuclear Information System (INIS)

    Ingram, G.E.; Elerath, J.G.; Wood, A.P.

    1981-01-01

    The approach, recommended by GE-ARSD, for confirming the achievement of LMFBR reliability goals relies upon a comprehensive understanding of the physical and operational characteristics of the system and the environments to which the system will be subjected during its operational life. This kind of understanding is required for an approach based on system hardware testing or analyses, as recommended in this report. However, for a system as complex and expensive as the LMFBR, an approach which relies primarily on system hardware testing would be prohibitive both in cost and time to obtain the required system reliability test information. By using an analytical approach, results of tests (reliability and functional) at a low level within the specific system of interest, as well as results from other similar systems can be used to form the data base for confirming the achievement of the system reliability goals. This data, along with information relating to the design characteristics and operating environments of the specific system, will be used in the assessment of the system's reliability

  4. Our Commitment to Reliable Health and Medical Information

    Science.gov (United States)

    ... the intent of a website to publish transparent information. The transparency of the website will improve the usefulness and objectivity of the information and the publication of correct data. The HONcode ...

  5. Identifying Complementary and Alternative Medicine Usage Information from Internet Resources. A Systematic Review.

    Science.gov (United States)

    Sharma, Vivekanand; Holmes, John H; Sarkar, Indra N

    2016-08-05

    Identify and highlight research issues and methods used in studying Complementary and Alternative Medicine (CAM) information needs, access, and exchange over the Internet. A literature search was conducted using Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines from PubMed to identify articles that have studied Internet use in the CAM context. Additional searches were conducted at Nature.com and Google Scholar. The Internet provides a major medium for attaining CAM information and can also serve as an avenue for conducting CAM-related surveys. Based on the literature analyzed in this review, there seems to be significant interest in developing methodologies for identifying CAM treatments, including the analysis of search query data and social media platform discussions. Several studies have also underscored the challenges in developing approaches for identifying the reliability of CAM-related information on the Internet, which may not be supported with reliable sources. The overall findings of this review suggest that there are opportunities for developing approaches for making available accurate information and developing ways to restrict the spread and sale of potentially harmful CAM products and information. Advances in Internet research are yet to be used in the context of understanding CAM prevalence and perspectives. Such approaches may provide valuable insights into the current trends and needs in the context of CAM use and spread.

  6. IDENTIFYING COMPLEMENTARY AND ALTERNATIVE MEDICINE USAGE INFORMATION FROM INTERNET RESOURCES: A SYSTEMATIC REVIEW

    Science.gov (United States)

    Sharma, V.; Holmes, J.H.; Sarkar, I.N.

    2016-01-01

    SUMMARY Objective Identify and highlight research issues and methods used in studying Complementary and Alternative Medicine (CAM) information needs, access, and exchange over the Internet. Methods A literature search was conducted using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines on PubMed to identify articles that have studied Internet use in the CAM context. Additional searches were conducted on Nature.com and Google Scholar. Results The Internet provides a major medium for obtaining CAM information and can also serve as an avenue for conducting CAM-related surveys. Based on the literature analyzed in this review, there is significant interest in developing methodologies for identifying CAM treatments, including the analysis of search query data and social media platform discussions. Several studies have also underscored the challenges in developing approaches for assessing the reliability of CAM-related information on the Internet, which may not be supported by reliable sources. The overall findings of this review suggest that there are opportunities for developing approaches for making accurate information available and for restricting the spread and sale of potentially harmful CAM products and information. Conclusions Advances in Internet research are yet to be used in the context of understanding CAM prevalence and perspectives. Such approaches may provide valuable insights into current trends and needs in the context of CAM use and spread. PMID:27352304

  7. Reliability analysis framework for computer-assisted medical decision systems

    International Nuclear Information System (INIS)

    Habas, Piotr A.; Zurada, Jacek M.; Elmaghraby, Adel S.; Tourassi, Georgia D.

    2007-01-01

    We present a technique that enhances computer-assisted decision (CAD) systems with the ability to assess the reliability of each individual decision they make. Reliability assessment is achieved by measuring the accuracy of a CAD system with known cases similar to the one in question. The proposed technique analyzes the feature space neighborhood of the query case to dynamically select an input-dependent set of known cases relevant to the query. This set is used to assess the local (query-specific) accuracy of the CAD system. The estimated local accuracy is utilized as a reliability measure of the CAD response to the query case. The underlying hypothesis of the study is that CAD decisions with higher reliability are more accurate. The above hypothesis was tested using a mammographic database of 1337 regions of interest (ROIs) with biopsy-proven ground truth (681 with masses, 656 with normal parenchyma). Three types of decision models, (i) a back-propagation neural network (BPNN), (ii) a generalized regression neural network (GRNN), and (iii) a support vector machine (SVM), were developed to detect masses based on eight morphological features automatically extracted from each ROI. The performance of all decision models was evaluated using Receiver Operating Characteristic (ROC) analysis. The study showed that the proposed reliability measure is a strong predictor of the CAD system's case-specific accuracy. Specifically, the ROC area index for CAD predictions with high reliability was significantly better than for those with low reliability values. This result was consistent across all decision models investigated in the study. The proposed case-specific reliability analysis technique could be used to alert the CAD user when an opinion that is unlikely to be reliable is offered. The technique can be easily deployed in the clinical environment because it is applicable with a wide range of classifiers regardless of their structure and it requires neither additional
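    The neighborhood-based reliability idea above can be sketched in a few lines: score a query by the accuracy the classifier achieved on the k nearest known cases in feature space. This is an illustrative sketch, not the authors' implementation; the function name and toy arrays are hypothetical, and a real CAD system would use its own features and distance metric.

```python
import numpy as np

def local_reliability(query, train_X, train_y, train_pred, k=5):
    """Estimate query-specific reliability of a classifier as the accuracy
    it achieved on the k nearest known cases in feature space."""
    d = np.linalg.norm(train_X - query, axis=1)   # Euclidean distances
    idx = np.argsort(d)[:k]                       # k nearest neighbours
    return float(np.mean(train_y[idx] == train_pred[idx]))
```

    A decision served with a low score could then trigger the "unlikely to be reliable" alert the authors describe.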

  8. Development of RBDGG Solver and Its Application to System Reliability Analysis

    International Nuclear Information System (INIS)

    Kim, Man Cheol

    2010-01-01

    For the purpose of making system reliability analysis easier and more intuitive, the RBDGG (Reliability Block Diagram with General Gates) methodology was introduced as an extension of the conventional reliability block diagram. The advantage of the RBDGG methodology is that the structure of an RBDGG model is very similar to the actual structure of the analyzed system, so modeling a system for reliability and unavailability analysis becomes very intuitive and easy. The main idea behind the development of the RBDGG methodology is similar to that of the RGGG (Reliability Graph with General Gates) methodology, which is an extension of the conventional reliability graph. The newly proposed methodology is now implemented in a software tool, RBDGG Solver, developed as a Win32 console application. RBDGG Solver receives information on the failure modes and failure probabilities of each component in the system, along with the connection structure and connection logic among the components. Based on this information, RBDGG Solver automatically generates a system reliability analysis model and then provides the analysis results. In this paper, the application of RBDGG Solver to the reliability analysis of an example system and verification of the calculation results are provided to demonstrate how RBDGG Solver is used for system reliability analysis
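    Conventional reliability block diagrams, which RBDGG generalizes, reduce to two combination rules for independent components. A minimal sketch of those rules (not part of RBDGG Solver, whose gate logic is richer):

```python
def series(*rs):
    """Series blocks: the system works only if every component works."""
    p = 1.0
    for r in rs:
        p *= r
    return p

def parallel(*rs):
    """Parallel (redundant) blocks: the system works if any component works."""
    q = 1.0
    for r in rs:
        q *= 1.0 - r
    return 1.0 - q

# A component (0.99) in series with a redundant pair (0.9 each)
system_reliability = series(0.99, parallel(0.9, 0.9))
```

    The redundant pair yields 1 − 0.1 × 0.1 = 0.99, so the whole system reliability is 0.99 × 0.99 = 0.9801.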

  9. A Data-Driven Reliability Estimation Approach for Phased-Mission Systems

    Directory of Open Access Journals (Sweden)

    Hua-Feng He

    2014-01-01

    Full Text Available We address the issues associated with reliability estimation for phased-mission systems (PMS) and present a novel data-driven approach to reliability estimation for PMS using condition monitoring information and degradation data of such systems under dynamic operating scenarios. In this sense, this paper differs from existing methods, which consider only the static scenario without using real-time information and aim to estimate the reliability for a population rather than for an individual. In the presented approach, to establish a linkage between the historical data and real-time information of an individual PMS, we adopt a stochastic filtering model for the phase duration and obtain an updated estimate of the mission time by Bayes' law at each phase. Meanwhile, the lifetime of the PMS is estimated from degradation data, which are modeled by an adaptive Brownian motion. As such, the mission reliability can be obtained in real time through the estimated distribution of the mission time in conjunction with the estimated lifetime distribution. We demonstrate the usefulness of the developed approach via a numerical example.
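    The per-phase Bayesian update of the mission-time estimate can be illustrated with a discrete toy model: multiply a prior over mission-time hypotheses by the likelihood of the newly observed phase duration, then renormalize. The hypothesis grid, prior, and likelihood values below are invented for illustration; the paper's actual model uses stochastic filtering over continuous phase durations.

```python
import numpy as np

# Hypothetical discrete prior over total mission time (hours)
times = np.array([10.0, 20.0, 30.0])
prior = np.array([0.2, 0.5, 0.3])

def bayes_update(prior, likelihood):
    """Posterior over mission-time hypotheses given the likelihood of the
    newly observed phase duration under each hypothesis."""
    post = prior * likelihood
    return post / post.sum()

# Assumed likelihood of the observed phase duration under each hypothesis
likelihood = np.array([0.1, 0.6, 0.3])
posterior = bayes_update(prior, likelihood)
```

    After the observation, probability mass shifts toward the 20-hour hypothesis, which best explains the observed phase duration.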

  10. 78 FR 22773 - Revisions to Reliability Standard for Transmission Vegetation Management; Correction

    Science.gov (United States)

    2013-04-17

    ...; Order No. 777] Revisions to Reliability Standard for Transmission Vegetation Management; Correction... modifying certain Reliability Standards. DATES: Effective on May 28, 2013. FOR FURTHER INFORMATION CONTACT... Requirement R2 of Reliability Standard FAC-003-2 within 45 days of the effective date of the Final Rule, while...

  11. On reliable discovery of molecular signatures

    Directory of Open Access Journals (Sweden)

    Björkegren Johan

    2009-01-01

    Full Text Available Abstract Background Molecular signatures are sets of genes, proteins, genetic variants or other variables that can be used as markers for a particular phenotype. Reliable signature discovery methods could yield valuable insight into cell biology and mechanisms of human disease. However, it is currently not clear how to control error rates such as the false discovery rate (FDR) in signature discovery. Moreover, signatures for cancer gene expression have been shown to be unstable, that is, difficult to replicate in independent studies, casting doubt on their reliability. Results We demonstrate that with modern prediction methods, signatures that yield accurate predictions may still have a high FDR. Further, we show that even signatures with low FDR may fail to replicate in independent studies due to limited statistical power. Thus, neither stability nor predictive accuracy is relevant when FDR control is the primary goal. We therefore develop a general statistical hypothesis testing framework that for the first time provides FDR control for signature discovery. Our method is demonstrated to be correct in simulation studies. When applied to five cancer data sets, the method was able to discover molecular signatures with 5% FDR in three cases, while two data sets yielded no significant findings. Conclusion Our approach enables reliable discovery of molecular signatures from genome-wide data with current sample sizes. The statistical framework developed herein is potentially applicable to a wide range of prediction problems in bioinformatics.
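    FDR control in multiple testing is classically achieved with the Benjamini-Hochberg step-up procedure. The sketch below shows that standard procedure, not the authors' signature-discovery framework:

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Return indices of hypotheses rejected at FDR level alpha using the
    Benjamini-Hochberg step-up procedure: reject the k smallest p-values,
    where k is the largest rank with p_(k) <= k * alpha / m."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices by ascending p
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * alpha / m:
            k = rank
    return sorted(order[:k])
```

    For example, `benjamini_hochberg([0.01, 0.02, 0.2, 0.8])` rejects only the first two hypotheses at the 5% FDR level.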

  12. Accuracy and reliability of facial soft tissue depth measurements using cone beam computer tomography

    NARCIS (Netherlands)

    Fourie, Zacharias; Damstra, Janalt; Gerrits, Pieter; Ren, Yijin

    2010-01-01

    It is important to have accurate and reliable measurements of soft tissue thickness for specific landmarks of the face and scalp when producing a facial reconstruction. In the past several methods have been created to measure facial soft tissue thickness (FSTT) in cadavers and in the living. The

  13. A hybrid algorithm for reliability analysis combining Kriging and subset simulation importance sampling

    International Nuclear Information System (INIS)

    Tong, Cao; Sun, Zhili; Zhao, Qianli; Wang, Qibin; Wang, Shuang

    2015-01-01

    To reduce the large computational cost of calculating a failure probability with a time-consuming numerical model, we propose an improved active learning reliability method called AK-SSIS, based on the AK-IS algorithm. First, an improved iterative stopping criterion for active learning is presented so that the number of iterations decreases dramatically. Second, the proposed method introduces subset simulation importance sampling (SSIS) into the active learning reliability calculation, and a learning function suitable for SSIS is proposed. Finally, the efficiency of AK-SSIS is demonstrated on two academic examples from the literature. The results show that AK-SSIS requires fewer calls to the performance function than AK-IS, and the failure probability obtained from AK-SSIS is very robust and accurate. The method is then applied to a spur gear pair for tooth contact fatigue reliability analysis.

  14. A hybrid algorithm for reliability analysis combining Kriging and subset simulation importance sampling

    Energy Technology Data Exchange (ETDEWEB)

    Tong, Cao; Sun, Zhili; Zhao, Qianli; Wang, Qibin [Northeastern University, Shenyang (China); Wang, Shuang [Jiangxi University of Science and Technology, Ganzhou (China)

    2015-08-15

    To reduce the large computational cost of calculating a failure probability with a time-consuming numerical model, we propose an improved active learning reliability method called AK-SSIS, based on the AK-IS algorithm. First, an improved iterative stopping criterion for active learning is presented so that the number of iterations decreases dramatically. Second, the proposed method introduces subset simulation importance sampling (SSIS) into the active learning reliability calculation, and a learning function suitable for SSIS is proposed. Finally, the efficiency of AK-SSIS is demonstrated on two academic examples from the literature. The results show that AK-SSIS requires fewer calls to the performance function than AK-IS, and the failure probability obtained from AK-SSIS is very robust and accurate. The method is then applied to a spur gear pair for tooth contact fatigue reliability analysis.
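    One building block of SSIS is ordinary importance sampling for small failure probabilities: sample from a density shifted toward the failure region and reweight each sample by the likelihood ratio. The sketch below shows that generic idea for a standard-normal input; it is not AK-SSIS, which additionally uses a Kriging surrogate and subset simulation, and the function name and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def importance_sampling_pf(g, mu_shift, n=200_000):
    """Estimate a small failure probability P[g(x) <= 0] for x ~ N(0, I)
    by sampling from N(mu_shift, I) near the failure region and
    reweighting with the likelihood ratio phi(x) / phi_shifted(x)."""
    d = len(mu_shift)
    x = rng.normal(mu_shift, 1.0, size=(n, d))
    # likelihood ratio for unit-variance normals: exp(-x.mu + |mu|^2 / 2)
    w = np.exp(-x @ mu_shift + 0.5 * mu_shift @ mu_shift)
    return float(np.mean((g(x) <= 0) * w))
```

    For g(x) = 3 − x₁ the true failure probability is 1 − Φ(3) ≈ 1.35 × 10⁻³, which the estimator recovers with far fewer samples than crude Monte Carlo would need.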

  15. Reliability Characteristics of Power Plants

    Directory of Open Access Journals (Sweden)

    Zbynek Martinek

    2017-01-01

    Full Text Available This paper describes the phenomenon of reliability of power plants. It explains the terms connected with this topic, since their proper understanding is important for understanding the relations and equations that model possible real situations. The reliability phenomenon is analysed using both the exponential distribution and the Weibull distribution. The results of our analysis are specific equations giving information about the characteristics of the power plants, the mean time of operation and the probability of failure-free operation. Equations solved for the Weibull distribution take into account the failures as well as the actual operating hours. Thanks to our results, we are able to create a model of dynamic reliability for predicting future states. It can be useful for improving the current situation of the unit as well as for creating an optimal maintenance plan, and thus has an impact on the overall economics of the operation of these power plants.
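    For a Weibull failure-time model with shape β and scale η, the quantities named in the abstract have closed forms: the probability of failure-free operation R(t) = exp(−(t/η)^β) and the mean time to failure η·Γ(1 + 1/β). A minimal sketch of those formulas (illustrative only, not the paper's code):

```python
import math

def weibull_reliability(t, beta, eta):
    """Probability of failure-free operation up to time t
    for a Weibull(beta, eta) failure-time model."""
    return math.exp(-((t / eta) ** beta))

def weibull_mttf(beta, eta):
    """Mean time to failure: eta * Gamma(1 + 1/beta)."""
    return eta * math.gamma(1.0 + 1.0 / beta)
```

    With β = 1 the model reduces to the exponential distribution, so the MTTF equals η and R(η) = e⁻¹.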

  16. Feasibility study for the European Reliability Data System (ERDS)

    International Nuclear Information System (INIS)

    Mancini, G.

    1980-01-01

    In the framework of the Reactor Safety Programme of the Commission of the European Communities, the JRC - Ispra Establishment has performed a feasibility study for an integrated European Reliability Data System, the aim of which is the collection and organization of information related to the operation of LWRs with regard to component and system behaviour, abnormal occurrences, outages, etc. The Component Event Data Bank (CEDB), Abnormal Occurrences Reporting System, Generic Reliability Parameter Data Bank, and Operating Unit Status Reports, along with the main activities carried out during the last two years, are described. The most important achievements are briefly reported, such as: Reference Classification for Systems, Components and Failure Events; Informatic Structure of the Pilot Experiment of the CEDB; Information Retrieval System for Abnormal Occurrences Reports; Data Bank on Component Reliability Parameters; System on the Exchange of Operating Experience of LWRs; Statistical Data Treatment. Finally, the general conclusions of the feasibility study are summarized: the possibility of, and the usefulness of, creating an integrated European Reliability Data System are outlined. (author)

  17. The relationship between cost estimates reliability and BIM adoption: SEM analysis

    Science.gov (United States)

    Ismail, N. A. A.; Idris, N. H.; Ramli, H.; Rooshdi, R. R. Raja Muhammad; Sahamir, S. R.

    2018-02-01

    This paper presents the usage of the Structural Equation Modelling (SEM) approach in analysing the effects of Building Information Modelling (BIM) technology adoption on improving the reliability of cost estimates. Based on the questionnaire survey results, SEM analysis using the SPSS-AMOS application examined the relationships between BIM-improved information and cost estimates reliability factors, leading to BIM technology adoption. Six hypotheses were established prior to the SEM analysis, employing two types of SEM models, namely the Confirmatory Factor Analysis (CFA) model and the full structural model. The SEM models were then validated through assessment of their uni-dimensionality, validity, reliability, and fitness index, in line with the hypotheses tested. The final SEM model fit measures are: P-value=0.000, RMSEA=0.079, TLI=0.956>0.90, NFI=0.935>0.90 and ChiSq/df=2.259, indicating that the overall index values achieved the required level of model fitness. The model supports all the hypotheses evaluated, confirming that all relationships amongst the constructs are positive and significant. Ultimately, the analysis verified that most respondents foresee a better understanding of project input information through BIM visualization, its reliable database and coordinated data, when developing more reliable cost estimates. They also expect BIM adoption to accelerate their cost estimating tasks.

  18. Integrated Evaluation of Reliability and Power Consumption of Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Antônio Dâmaso

    2017-11-01

    Full Text Available Power consumption is a primary interest in Wireless Sensor Networks (WSNs, and a large number of strategies have been proposed to evaluate it. However, those approaches usually neither consider reliability issues nor the power consumption of applications executing in the network. A central concern is the lack of consolidated solutions that enable us to evaluate the power consumption of applications and the network stack also considering their reliabilities. To solve this problem, we introduce a fully automatic solution to design power consumption aware WSN applications and communication protocols. The solution presented in this paper comprises a methodology to evaluate the power consumption based on the integration of formal models, a set of power consumption and reliability models, a sensitivity analysis strategy to select WSN configurations and a toolbox named EDEN to fully support the proposed methodology. This solution allows accurately estimating the power consumption of WSN applications and the network stack in an automated way.

  19. Integrated Evaluation of Reliability and Power Consumption of Wireless Sensor Networks

    Science.gov (United States)

    Dâmaso, Antônio; Maciel, Paulo

    2017-01-01

    Power consumption is a primary interest in Wireless Sensor Networks (WSNs), and a large number of strategies have been proposed to evaluate it. However, those approaches usually neither consider reliability issues nor the power consumption of applications executing in the network. A central concern is the lack of consolidated solutions that enable us to evaluate the power consumption of applications and the network stack also considering their reliabilities. To solve this problem, we introduce a fully automatic solution to design power consumption aware WSN applications and communication protocols. The solution presented in this paper comprises a methodology to evaluate the power consumption based on the integration of formal models, a set of power consumption and reliability models, a sensitivity analysis strategy to select WSN configurations and a toolbox named EDEN to fully support the proposed methodology. This solution allows accurately estimating the power consumption of WSN applications and the network stack in an automated way. PMID:29113078

  20. Inter-arch digital model vs. manual cast measurements: Accuracy and reliability.

    Science.gov (United States)

    Kiviahde, Heikki; Bukovac, Lea; Jussila, Päivi; Pesonen, Paula; Sipilä, Kirsi; Raustia, Aune; Pirttiniemi, Pertti

    2017-06-28

    The purpose of this study was to evaluate the accuracy and reliability of inter-arch measurements using digital dental models and conventional dental casts. Thirty sets of dental casts with permanent dentition were examined. Manual measurements were done with a digital caliper directly on the dental casts, and digital measurements were made on 3D models by two independent examiners. Intra-class correlation coefficients (ICC), a paired sample t-test or Wilcoxon signed-rank test, and Bland-Altman plots were used to evaluate intra- and inter-examiner error and to determine the accuracy and reliability of the measurements. The ICC values were generally good for manual and excellent for digital measurements. The Bland-Altman plots of all the measurements showed good agreement between the manual and digital methods and excellent inter-examiner agreement using the digital method. Inter-arch occlusal measurements on digital models are accurate and reliable and are superior to manual measurements.
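    The Bland-Altman agreement analysis used in the study above is straightforward to reproduce: compute the mean bias of the paired differences and the 95% limits of agreement, bias ± 1.96·SD. A generic sketch with made-up measurements (not the study's data):

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman bias and 95% limits of agreement between two
    measurement methods applied to the same cases."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)          # sample standard deviation
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired measurements (e.g., caliper vs. digital, in mm)
bias, loa = bland_altman([10.0, 11.0, 12.0, 13.0],
                         [10.1, 10.9, 12.2, 12.8])
```

    Agreement is judged by whether the limits of agreement are narrow enough to be clinically acceptable, not by the bias alone.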

  1. A Two-Phase Space Resection Model for Accurate Topographic Reconstruction from Lunar Imagery with Pushbroom Scanners.

    Science.gov (United States)

    Xu, Xuemiao; Zhang, Huaidong; Han, Guoqiang; Kwan, Kin Chung; Pang, Wai-Man; Fang, Jiaming; Zhao, Gansen

    2016-04-11

    Exterior orientation parameters' (EOP) estimation using space resection plays an important role in topographic reconstruction for push broom scanners. However, existing models of space resection are highly sensitive to errors in the data. Unfortunately, for lunar imagery, the altitude data at the ground control points (GCPs) used for space resection are error-prone, so existing models fail to produce reliable EOPs. Motivated by the finding that, for push broom scanners, the angular rotations of the EOPs can be estimated independently of the altitude data, using only the geographic data at the GCPs (which are already provided), we divide the modeling of space resection into two phases. First, we estimate the angular rotations from the reliable geographic data using our proposed mathematical model. Then, with accurate angular rotations, the collinearity equations for space resection simplify to a linear problem, and the globally optimal solution for the spatial position of the EOPs can always be achieved. Moreover, a certainty term is integrated to penalize the unreliable altitude data and increase the error tolerance. Experimental results show that our model obtains more accurate EOPs and topographic maps than the existing space resection model, not only for simulated data but also for real data from Chang'E-1.

  2. A Two-Phase Space Resection Model for Accurate Topographic Reconstruction from Lunar Imagery with Pushbroom Scanners

    Directory of Open Access Journals (Sweden)

    Xuemiao Xu

    2016-04-01

    Full Text Available Exterior orientation parameters’ (EOP) estimation using space resection plays an important role in topographic reconstruction for push broom scanners. However, existing models of space resection are highly sensitive to errors in the data. Unfortunately, for lunar imagery, the altitude data at the ground control points (GCPs) used for space resection are error-prone, so existing models fail to produce reliable EOPs. Motivated by the finding that, for push broom scanners, the angular rotations of the EOPs can be estimated independently of the altitude data, using only the geographic data at the GCPs (which are already provided), we divide the modeling of space resection into two phases. First, we estimate the angular rotations from the reliable geographic data using our proposed mathematical model. Then, with accurate angular rotations, the collinearity equations for space resection simplify to a linear problem, and the globally optimal solution for the spatial position of the EOPs can always be achieved. Moreover, a certainty term is integrated to penalize the unreliable altitude data and increase the error tolerance. Experimental results show that our model obtains more accurate EOPs and topographic maps than the existing space resection model, not only for simulated data but also for real data from Chang’E-1.

  3. An efficient and accurate 3D displacements tracking strategy for digital volume correlation

    Science.gov (United States)

    Pan, Bing; Wang, Bo; Wu, Dafang; Lubineau, Gilles

    2014-07-01

    Owing to its inherent computational complexity, practical implementation of digital volume correlation (DVC) for internal displacement and strain mapping faces important challenges in improving its computational efficiency. In this work, an efficient and accurate 3D displacement tracking strategy is proposed for fast DVC calculation. The efficiency advantage is achieved through three improvements. First, to eliminate the need to update the Hessian matrix in each iteration, an efficient 3D inverse compositional Gauss-Newton (3D IC-GN) algorithm is introduced to replace existing forward additive algorithms for accurate sub-voxel displacement registration. Second, to ensure that the 3D IC-GN algorithm converges accurately and rapidly, and to avoid time-consuming integer-voxel displacement searching, a generalized reliability-guided displacement tracking strategy is designed to transfer an accurate and complete initial guess of deformation to each calculation point from its computed neighbors. Third, to avoid repeated computation of sub-voxel intensity interpolation coefficients, an interpolation coefficient lookup table is established for tricubic interpolation. The computational complexity of the proposed fast DVC and of existing typical DVC algorithms is first analyzed quantitatively in terms of the necessary arithmetic operations. Then, numerical tests are performed to verify the performance of the fast DVC algorithm in terms of measurement accuracy and computational efficiency. The experimental results indicate that, compared with the existing DVC algorithm, the presented fast DVC algorithm produces similar precision and slightly higher accuracy at a substantially reduced computational cost.

  4. An efficient and accurate 3D displacements tracking strategy for digital volume correlation

    KAUST Repository

    Pan, Bing

    2014-07-01

    Owing to its inherent computational complexity, practical implementation of digital volume correlation (DVC) for internal displacement and strain mapping faces important challenges in improving its computational efficiency. In this work, an efficient and accurate 3D displacement tracking strategy is proposed for fast DVC calculation. The efficiency advantage is achieved through three improvements. First, to eliminate the need to update the Hessian matrix in each iteration, an efficient 3D inverse compositional Gauss-Newton (3D IC-GN) algorithm is introduced to replace existing forward additive algorithms for accurate sub-voxel displacement registration. Second, to ensure that the 3D IC-GN algorithm converges accurately and rapidly, and to avoid time-consuming integer-voxel displacement searching, a generalized reliability-guided displacement tracking strategy is designed to transfer an accurate and complete initial guess of deformation to each calculation point from its computed neighbors. Third, to avoid repeated computation of sub-voxel intensity interpolation coefficients, an interpolation coefficient lookup table is established for tricubic interpolation. The computational complexity of the proposed fast DVC and of existing typical DVC algorithms is first analyzed quantitatively in terms of the necessary arithmetic operations. Then, numerical tests are performed to verify the performance of the fast DVC algorithm in terms of measurement accuracy and computational efficiency. The experimental results indicate that, compared with the existing DVC algorithm, the presented fast DVC algorithm produces similar precision and slightly higher accuracy at a substantially reduced computational cost. © 2014 Elsevier Ltd.
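    The reliability-guided scan order underlying the second improvement can be sketched independently of the correlation math: keep a priority queue of candidate points keyed by correlation quality and always expand the best pending point, so each new point inherits its initial guess from an already-converged neighbour. The grid, quality values, and function name below are illustrative only.

```python
import heapq

def reliability_guided_order(quality, seed):
    """Visit grid points in reliability-guided order: repeatedly pop the
    pending point with the highest correlation quality (max-heap via
    negated keys) and enqueue its unvisited 4-neighbours."""
    rows, cols = len(quality), len(quality[0])
    visited, order = set(), []
    heap = [(-quality[seed[0]][seed[1]], seed)]
    while heap:
        _, (r, c) = heapq.heappop(heap)
        if (r, c) in visited:
            continue
        visited.add((r, c))
        order.append((r, c))
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in visited:
                heapq.heappush(heap, (-quality[nr][nc], (nr, nc)))
    return order
```

    In a real DVC pipeline the quality value would be the ZNCC coefficient of the just-converged point, and each expansion would seed the neighbour's IC-GN iteration with the parent's displacement.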

  5. Analysis of operating reliability of WWER-1000 unit

    International Nuclear Information System (INIS)

    Bortlik, J.

    1985-01-01

    The nuclear power unit was divided into 33 technological units. Input data for the reliability analysis were surveys of operating results obtained from the IAEA information system and certain reliability indexes of technological equipment determined using the Bayes formula. Missing reliability data for technological equipment were taken from the basic variant. The fault tree of the WWER-1000 unit was constructed for the top event, defined as the inability to reach 100%, 75% or 50% of rated power. The periods of operation at reduced output owing to defects were recorded, together with the respective equipment repair times. The availability of the WWER-1000 unit was calculated for different variant situations. Certain operating reliability indexes of the WWER-1000 unit resulting from the detailed reliability analysis are tabulated for selected variants. (E.S.)

  6. Coefficient Alpha: A Reliability Coefficient for the 21st Century?

    Science.gov (United States)

    Yang, Yanyun; Green, Samuel B.

    2011-01-01

    Coefficient alpha is almost universally applied to assess reliability of scales in psychology. We argue that researchers should consider alternatives to coefficient alpha. Our preference is for structural equation modeling (SEM) estimates of reliability because they are informative and allow for an empirical evaluation of the assumptions…

  7. Reliability demonstration methodology for products with Gamma Process by optimal accelerated degradation testing

    International Nuclear Information System (INIS)

    Zhang, Chunhua; Lu, Xiang; Tan, Yuanyuan; Wang, Yashun

    2015-01-01

    For products with high reliability and long lifetime, accelerated degradation testing (ADT) may be adopted during the product development phase to verify whether reliability satisfies the predetermined level within a feasible test duration. Actual degradation in engineering is usually a strictly monotonic process, such as fatigue crack growth, wear, and erosion. However, reliability demonstration by ADT for monotonic degradation processes has not been investigated so far. This paper proposes a reliability demonstration methodology by ADT for this kind of product. We first apply a Gamma process to describe the monotonic degradation. Next, we present a reliability demonstration method that converts the required reliability level into an allowable cumulative degradation in ADT and compares the actual cumulative degradation with the allowable level. Further, we suggest an analytical optimal ADT design method for more efficient reliability demonstration, minimizing the asymptotic variance of the decision variable under constraints on sample size, test duration, test cost, and predetermined decision risks. The method is validated and illustrated with an example on reliability demonstration of an alloy product, and is finally applied to demonstrate the wear reliability of a spherical plain bearing over a long service duration. - Highlights: • We present a reliability demonstration method by ADT for products with a monotonic degradation process, which may be applied to verify reliability over a long service life within a feasible test duration. • We suggest an analytical optimal ADT design method for more efficient reliability demonstration, which differs from existing optimal ADT designs (aimed at more accurate reliability estimation) in its objective function and constraints. • The methods are applied to demonstrate the wear reliability within long service duration of
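    A Gamma process models strictly monotone degradation through independent Gamma-distributed increments, so the accumulated degradation satisfies X(t) ~ Gamma(αt, β) for shape rate α and scale β, and reliability at time t is the probability that X(t) has not crossed the failure threshold. A Monte Carlo sketch with invented parameter values (not the paper's ADT design):

```python
import numpy as np

rng = np.random.default_rng(0)

def gamma_process_reliability(t, shape_rate, scale, threshold, n_paths=100_000):
    """Monte Carlo reliability at time t for a Gamma degradation process:
    X(t) ~ Gamma(shape_rate * t, scale); the unit survives while X(t)
    stays below the failure threshold."""
    x_t = rng.gamma(shape_rate * t, scale, size=n_paths)
    return float(np.mean(x_t < threshold))

r_100 = gamma_process_reliability(100.0, 0.05, 1.0, 10.0)
r_200 = gamma_process_reliability(200.0, 0.05, 1.0, 10.0)
```

    Reliability decreases with t because the shape parameter, and hence the mean degradation α·t·β, grows linearly with time.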

  8. Validity and Reliability of Scores Obtained on Multiple-Choice Questions: Why Functioning Distractors Matter

    Science.gov (United States)

    Ali, Syed Haris; Carr, Patrick A.; Ruit, Kenneth G.

    2016-01-01

    Plausible distractors are important for accurate measurement of knowledge via multiple-choice questions (MCQs). This study demonstrates the impact of higher distractor functioning on the validity and reliability of scores obtained on MCQs. Free-response (FR) and MCQ versions of a neurohistology practice exam were given to four cohorts of Year 1 medical…

  9. To the problem of reliability of high-voltage accelerators for industrial purposes

    International Nuclear Information System (INIS)

    Al'bertinskij, B.I.; Svin'in, M.P.; Tsepakin, S.G.

    1979-01-01

    Statistical data characterizing the reliability of ELECTRON and AVRORA-2 type accelerators are presented. The mean time to failure of the main accelerator units was used as the reliability index. The analysis of accelerator failures allowed a number of conclusions to be drawn. The high failure rate is connected with inadequate training of the servicing personnel and a natural period of equipment adjustment. The mathematical analysis of the failure rate showed that the main responsibility for the insufficient reliability rests with the selenium diodes employed in the high-voltage power supply; substituting silicon diodes for the selenium ones increases the time between failures. It is shown that accumulation and processing of operational statistical data will permit more accurate prediction of the reliability of produced high-voltage accelerators, make it possible to plan optimal, timely preventive inspections and repairs, and allow optimal safety factors and test procedures to be selected.

  10. Safe and reliable solutions for Internet application in power sector

    International Nuclear Information System (INIS)

    Eichelburg, W. K.

    2004-01-01

    The communication requirements of various information systems (control systems, EMS, ERP) continually increase. At present, the Internet is predominantly a universal medium for interconnecting distant systems. However important communication with the outside world is, the internal system must be protected safely and reliably. The goal of the article is to acquaint practitioners with verified solutions for safe and reliable use of the Internet to interconnect control systems at the supervisory level, for remote management and diagnostics, and to interconnect information systems. An added value is represented by the solutions using the Internet for image and sound transmission. (author)

  11. Three-dimensional magnetic resonance imaging of physeal injury: reliability and clinical utility.

    Science.gov (United States)

    Lurie, Brett; Koff, Matthew F; Shah, Parina; Feldmann, Eric James; Amacker, Nadja; Downey-Zayas, Timothy; Green, Daniel; Potter, Hollis G

    2014-01-01

    Injuries to the physis are common in children, with a subset resulting in an osseous bar and potential growth disturbance. Magnetic resonance imaging allows for detailed assessment of the physis with the ability to generate 3-dimensional physeal models from volumetric data. The purpose of this study was to assess the interrater reliability of physeal bar area measurements generated using a validated semiautomated segmentation technique and to highlight the clinical utility of quantitative 3-dimensional (3D) physeal mapping in pediatric orthopaedic practice. The Radiology Information System/Picture Archiving Communication System (PACS) at our institution was searched to find consecutive patients who were imaged for the purpose of assessing a physeal bar or growth disturbance between December 2006 and October 2011. Physeal segmentation was retrospectively performed by 2 independent operators using semiautomated software to generate physeal maps and bar area measurements from 3-dimensional spoiled gradient recalled echo sequences. Interrater reliability was statistically analyzed. Subsequent surgical management for each patient was recorded from the patient notes and surgical records. We analyzed 24 patients (12M/12F) with a mean age of 11.4 years (range, 5 to 15 y) and 25 physeal bars. Of the physeal bars, 9 (36%) were located in the distal tibia; 8 (32%) in the proximal tibia; 5 (20%) in the distal femur; 1 (4%) in the proximal femur; 1 (4%) in the proximal humerus; and 1 (4%) in the distal radius. The independent operator measurements of physeal bar area were highly correlated, with a Pearson correlation coefficient (r) of 0.96 and an intraclass correlation coefficient for average measures of 0.99 (95% confidence interval, 0.97-0.99). Four patients underwent resection of the identified physeal bars, 9 patients were treated with epiphysiodesis, and 1 patient underwent bilateral tibial osteotomies. Semiautomated segmentation of the physis is a reproducible
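The headline agreement statistic in this record, the Pearson correlation between the two operators' bar-area measurements, is a one-line computation; the measurements below are hypothetical, and the study's ICC would need a separate variance-components calculation not shown here.

```python
import numpy as np

# Hypothetical physeal bar-area measurements (mm^2) by two independent operators.
operator_a = np.array([120.0, 85.0, 240.0, 60.0, 150.0, 95.0])
operator_b = np.array([118.0, 90.0, 235.0, 63.0, 149.0, 97.0])

# Pearson correlation coefficient between the two raters' measurements.
r = np.corrcoef(operator_a, operator_b)[0, 1]
```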

  12. Reliability analysis of protection system of advanced pressurized water reactor - APR 1400

    International Nuclear Information System (INIS)

    Varde, P. V.; Choi, J. G.; Lee, D. Y.; Han, J. B.

    2003-04-01

    Reliability analysis was carried out for the protection system of the Korean Advanced Pressurized Water Reactor - APR 1400. The main focus of this study was the reliability analysis of digital protection system, however, towards giving an integrated statement of complete protection reliability an attempt has been made to include the shutdown devices and other related aspects based on the information available to date. The sensitivity analysis has been carried out for the critical components / functions in the system. Other aspects like importance analysis and human error reliability for the critical human actions form part of this work. The framework provided by this study and the results obtained shows that this analysis has potential to be utilized as part of risk informed approach for future design / regulatory applications

  13. Can emergency physicians accurately and reliably assess acute vertigo in the emergency department?

    Science.gov (United States)

    Vanni, Simone; Nazerian, Peiman; Casati, Carlotta; Moroni, Federico; Risso, Michele; Ottaviani, Maddalena; Pecci, Rudi; Pepe, Giuseppe; Vannucchi, Paolo; Grifoni, Stefano

    2015-04-01

    To validate a clinical diagnostic tool, used by emergency physicians (EPs), to diagnose the central cause of patients presenting with vertigo, and to determine interrater reliability of this tool. A convenience sample of adult patients presenting to a single academic ED with isolated vertigo (i.e. vertigo without other neurological deficits) was prospectively evaluated with STANDING (SponTAneous Nystagmus, Direction, head Impulse test, standiNG) by five trained EPs. The first step focused on the presence of spontaneous nystagmus, the second on the direction of nystagmus, the third on head impulse test and the fourth on gait. The local standard practice, senior audiologist evaluation corroborated by neuroimaging when deemed appropriate, was considered the reference standard. Sensitivity and specificity of STANDING were calculated. On the first 30 patients, inter-observer agreement among EPs was also assessed. Five EPs with limited experience in nystagmus assessment volunteered to participate in the present study enrolling 98 patients. Their average evaluation time was 9.9 ± 2.8 min (range 6-17). Central acute vertigo was suspected in 16 (16.3%) patients. There were 13 true positives, three false positives, 81 true negatives and one false negative, with a high sensitivity (92.9%, 95% CI 70-100%) and specificity (96.4%, 95% CI 93-38%) for central acute vertigo according to senior audiologist evaluation. The Cohen's kappas of the first, second, third and fourth steps of the STANDING were 0.86, 0.93, 0.73 and 0.78, respectively. The whole test showed a good inter-observer agreement (k = 0.76, 95% CI 0.45-1). In the hands of EPs, STANDING showed a good inter-observer agreement and accuracy validated against the local standard of care. © 2015 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.
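The per-step agreement figures in this record are Cohen's kappas. The standard formula can be sketched as follows, using hypothetical central/peripheral ratings from two EPs rather than the study's data.

```python
import numpy as np

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    rater1, rater2 = np.asarray(rater1), np.asarray(rater2)
    categories = np.union1d(rater1, rater2)
    p_observed = np.mean(rater1 == rater2)          # raw agreement
    p_expected = sum(np.mean(rater1 == c) * np.mean(rater2 == c)
                     for c in categories)            # agreement expected by chance
    return (p_observed - p_expected) / (1.0 - p_expected)

# Two EPs classifying 10 hypothetical cases as central (1) or peripheral (0).
a = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0]
b = [1, 0, 0, 1, 0, 0, 0, 0, 0, 0]
kappa = cohens_kappa(a, b)
```

Here raw agreement is 0.9 but chance agreement is 0.62, so kappa is noticeably lower, which is exactly why kappa is reported instead of raw agreement.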

  14. Reliability and Validity of Ten Consumer Activity Trackers Depend on Walking Speed.

    Science.gov (United States)

    Fokkema, Tryntsje; Kooiman, Thea J M; Krijnen, Wim P; VAN DER Schans, Cees P; DE Groot, Martijn

    2017-04-01

    To examine the test-retest reliability and validity of ten activity trackers for step counting at three different walking speeds. Thirty-one healthy participants walked twice on a treadmill for 30 min while wearing 10 activity trackers (Polar Loop, Garmin Vivosmart, Fitbit Charge HR, Apple Watch Sport, Pebble Smartwatch, Samsung Gear S, Misfit Flash, Jawbone Up Move, Flyfit, and Moves). Participants walked at three speeds for 10 min each: slow (3.2 km·h⁻¹), average (4.8 km·h⁻¹), and vigorous (6.4 km·h⁻¹). To measure test-retest reliability, intraclass correlations (ICC) were determined between the first and second treadmill test. Validity was determined by comparing the trackers with the gold standard (hand counting), using mean differences, mean absolute percentage errors, and ICC. Statistical differences were calculated by paired-sample t tests, Wilcoxon signed-rank tests, and by constructing Bland-Altman plots. Test-retest reliability varied, with ICC ranging from -0.02 to 0.97. Validity varied between trackers and walking speeds, with mean differences between the gold standard and activity trackers ranging from 0.0 to 26.4%. Most trackers showed relatively low ICC and broad limits of agreement in the Bland-Altman plots at the different speeds. For the slow walking speed, the Garmin Vivosmart and Fitbit Charge HR showed the most accurate results. The Garmin Vivosmart and Apple Watch Sport demonstrated the best accuracy at an average walking speed. For vigorous walking, the Apple Watch Sport, Pebble Smartwatch, and Samsung Gear S exhibited the most accurate results. Test-retest reliability and validity of activity trackers depend on walking speed. In general, consumer activity trackers perform better at average and vigorous walking speeds than at a slower walking speed.
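The mean absolute percentage error used here to validate trackers against hand counting is a one-line calculation; the step counts below are hypothetical, not the study's data.

```python
import numpy as np

def mean_absolute_percentage_error(gold, measured):
    """MAPE (%) of measured counts against a gold standard."""
    gold, measured = np.asarray(gold, float), np.asarray(measured, float)
    return 100.0 * np.mean(np.abs(measured - gold) / gold)

# Hypothetical hand-counted vs tracker-recorded steps for five 10-min walks.
hand_counted = [1000, 1050, 980, 1100, 1020]
tracker      = [ 960, 1040, 900, 1110, 1000]
mape = mean_absolute_percentage_error(hand_counted, tracker)
```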

  15. Extension of the Accurate Voltage-Sag Fault Location Method in Electrical Power Distribution Systems

    Directory of Open Access Journals (Sweden)

    Youssef Menchafou

    2016-03-01

    Accurate fault location in an Electric Power Distribution System (EPDS) is important in maintaining system reliability. Several methods have been proposed in the past; however, these methods either prove inefficient or depend on the fault type (fault classification), because they require an appropriate algorithm for each fault type. In contrast to traditional approaches, an accurate impedance-based fault location (FL) method is presented in this paper. It is based on the voltage-sag calculation between two measurement points chosen carefully from the available strategic measurement points of the line, the network topology, and current measurements at the substation. The effectiveness and accuracy of the proposed technique are demonstrated for different fault types using a radial power flow system. The test results are obtained from numerical simulation using the data of a distribution line known from the literature.

  16. Characterizing reliability in a product/process design-assurance program

    Energy Technology Data Exchange (ETDEWEB)

    Kerscher, W.J. III [Delphi Energy and Engine Management Systems, Flint, MI (United States); Booker, J.M.; Bement, T.R.; Meyer, M.A. [Los Alamos National Lab., NM (United States)

    1997-10-01

    Over the years many advancing techniques in the area of reliability engineering have surfaced in the military sphere of influence, and one of these techniques is Reliability Growth Testing (RGT). Private industry has reviewed RGT as part of the solution to its reliability concerns, but many practical considerations have slowed its implementation. Its objective is to demonstrate the reliability requirement of a new product with a specified confidence. This paper speaks directly to that objective but discusses a somewhat different approach to achieving it. Rather than conducting testing as a continuum and developing statistical confidence bands around the results, this Bayesian updating approach starts with a reliability estimate characterized by large uncertainty and then proceeds to reduce the uncertainty by folding in fresh information in a Bayesian framework.
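A minimal sketch of the Bayesian updating idea the abstract describes, using a conjugate Beta prior on pass/fail reliability; the prior and the test campaigns are hypothetical, and the paper's actual likelihood model may differ.

```python
# Beta(alpha, beta) prior on reliability; wide uncertainty initially.
alpha, beta = 1.0, 1.0                     # uniform prior over [0, 1]

# Fold in fresh test information: (successes, failures) per campaign, hypothetical.
campaigns = [(18, 2), (27, 3), (48, 2)]
for s, f in campaigns:
    alpha, beta = alpha + s, beta + f      # conjugate Bayesian update

posterior_mean = alpha / (alpha + beta)
posterior_var = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))
```

Each update shrinks the posterior variance, which is the "reducing the uncertainty by folding in fresh information" step in the abstract.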

  17. Study on Feasibility of Applying Function Approximation Moment Method to Achieve Reliability-Based Design Optimization

    International Nuclear Information System (INIS)

    Huh, Jae Sung; Kwak, Byung Man

    2011-01-01

    Robust optimization and reliability-based design optimization are methodologies employed to take into account the uncertainties of a system at the design stage. To apply such methodologies to industrial problems, accurate and efficient methods for estimating statistical moments and failure probability are required; further, the results of the sensitivity analysis, which provides the search direction during the optimization process, should also be accurate. The aim of this study is to employ the function approximation moment method in the sensitivity analysis formulation, which is expressed in integral form, to verify the accuracy of the sensitivity results, and to solve a typical reliability-based design optimization problem. These results are compared with those of other moment methods, and the feasibility of the function approximation moment method is verified. The integral-form sensitivity analysis formula is efficient for evaluating sensitivity because no additional function evaluations are needed once the failure probability or statistical moments have been calculated

  18. Enhance pump reliability through improved inservice testing

    International Nuclear Information System (INIS)

    Healy, J.J.

    1990-01-01

    EPRI has undertaken a study to assess the effectiveness of existing testing programs to accurately monitor and predict performance changes before either pump performance degrades or an actual failure occurs. Anticipated changes in inservice testing techniques are directed towards enhancing the validity of test data, ensuring its repeatability, and avoiding deterioration of the pump assembly. There is a new-found interest in test programs of all types that has occurred, in part, because of an increase in reported pump degradation and pump failure. Inservice testing of pumps, which has long been a basis for assuring operability, has apparently produced an opposite effect; namely, the appearance of a reduction in reliability

  19. Quantitative assessment of probability of failing safely for the safety instrumented system using reliability block diagram method

    International Nuclear Information System (INIS)

    Jin, Jianghong; Pang, Lei; Zhao, Shoutang; Hu, Bin

    2015-01-01

    Highlights: • Models of PFS for SIS were established by using the reliability block diagram. • A more accurate calculation of PFS for the SIS can be acquired by using the SL. • Degraded operation of a complex SIS does not affect the availability of the SIS. • The safe undetected failure is the largest contribution to the PFS of the SIS. - Abstract: The spurious trip of a safety instrumented system (SIS) brings great economic losses to production, so ensuring that the SIS is both reliable and available has become a pressing concern. The existing models of spurious trip rate (STR) or probability of failing safely (PFS) are too simplified and not accurate; in-depth studies of availability are required to obtain a more accurate PFS for the SIS. Based on an analysis of the factors that influence the PFS for the SIS, a quantitative study of the PFS for the SIS is carried out using the reliability block diagram method (RBD), and some application examples are given. The results show that the common cause failure will increase the PFS; degraded operation does not affect the availability of the SIS; if the equipment is tested and repaired one by one, the unavailability of the SIS can be ignored; the corresponding occurrence time of an independent safe undetected failure should be the system lifecycle (SL) rather than the proof test interval; and the independent safe undetected failure is the largest contribution to the PFS for the SIS
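Reliability block diagram evaluation of the kind used here reduces to series and parallel combinations of block reliabilities; a minimal sketch with hypothetical values, not the paper's SIS architecture.

```python
import numpy as np

def series(reliabilities):
    """Series blocks: the system works only if every block works."""
    return float(np.prod(reliabilities))

def parallel(reliabilities):
    """Parallel (redundant) blocks: the system fails only if all blocks fail."""
    return 1.0 - float(np.prod([1.0 - r for r in reliabilities]))

# Hypothetical 1oo2 sensor pair, in series with a logic solver and a valve.
r_system = series([parallel([0.95, 0.95]), 0.999, 0.98])
```

The redundant sensor pair contributes 1 - 0.05² = 0.9975, so the weakest series element (the valve at 0.98) dominates the overall figure.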

  20. Methodology for allocating reliability and risk

    International Nuclear Information System (INIS)

    Cho, N.Z.; Papazoglou, I.A.; Bari, R.A.

    1986-05-01

    This report describes a methodology for reliability and risk allocation in nuclear power plants. The work investigates the technical feasibility of allocating reliability and risk, which are expressed in a set of global safety criteria and which may not necessarily be rigid, to various reactor systems, subsystems, components, operations, and structures in a consistent manner. The report also provides general discussions on the problem of reliability and risk allocation. The problem is formulated as a multiattribute decision analysis paradigm. The work mainly addresses the first two steps of a typical decision analysis, i.e., (1) identifying alternatives, and (2) generating information on outcomes of the alternatives, by performing a multiobjective optimization on a PRA model and reliability cost functions. The multiobjective optimization serves as the guiding principle to reliability and risk allocation. The concept of "noninferiority" is used in the multiobjective optimization problem. Finding the noninferior solution set is the main theme of the current approach. The final step of decision analysis, i.e., assessment of the decision maker's preferences, could then be performed more easily on the noninferior solution set. Some results of the methodology applications to a nontrivial risk model are provided, and several outstanding issues such as generic allocation, preference assessment, and uncertainty are discussed. 29 refs., 44 figs., 39 tabs
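The noninferior (Pareto-optimal) set central to the report's multiobjective formulation can be illustrated with a brute-force dominance filter; the (cost, risk) alternatives below are hypothetical.

```python
def noninferior(points):
    """Return the noninferior (Pareto-optimal) subset when minimizing
    both objectives: a point is kept if no other point is at least as
    good in both coordinates."""
    result = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)
        if not dominated:
            result.append(p)
    return result

# Hypothetical (reliability-improvement cost, residual risk) alternatives.
alternatives = [(1.0, 9.0), (2.0, 7.0), (3.0, 8.0), (4.0, 4.0), (5.0, 5.0)]
frontier = noninferior(alternatives)
```

Preference assessment, the final decision-analysis step in the report, would then operate only on `frontier` rather than on all alternatives.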

  1. An Integrative Approach to Accurate Vehicle Logo Detection

    Directory of Open Access Journals (Sweden)

    Hao Pan

    2013-01-01

    required for many applications in intelligent transportation systems and automatic surveillance. The task is challenging considering the small target of logos and the wide range of variability in shape, color, and illumination. A fast and reliable vehicle logo detection approach is proposed following the visual attention mechanism of human vision. Two pre-logo detection steps, that is, vehicle region detection and small RoI segmentation, rapidly focalize a small logo target. An enhanced Adaboost algorithm, together with two types of features, Haar and HOG, is proposed to detect vehicles. An RoI that covers logos is segmented based on prior knowledge about the logos' position relative to license plates, which can be accurately localized from frontal vehicle images. A two-stage cascade classifier proceeds with the segmented RoI, using a hybrid of Gentle Adaboost and Support Vector Machine (SVM), resulting in precise logo positioning. Extensive experiments were conducted to verify the efficiency of the proposed scheme.

  2. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    Science.gov (United States)

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes of the building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be selected to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses the quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making analysis and decision. The presented method can offer valuable references for risk computing of building construction projects.
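One common way to integrate information entropy into this kind of index-based evaluation is the entropy weight method, sketched below with hypothetical scheme scores; the paper's exact formulation may differ.

```python
import numpy as np

def entropy_weights(matrix):
    """Entropy weight method: indexes whose scores vary more across schemes
    carry more information and therefore receive larger weights."""
    X = np.asarray(matrix, float)
    P = X / X.sum(axis=0)                           # normalize each index (column)
    m = X.shape[0]
    E = -(P * np.log(P)).sum(axis=0) / np.log(m)    # entropy per index, in [0, 1]
    d = 1.0 - E                                     # degree of diversification
    return d / d.sum()

# Rows: candidate schemes; columns: cost, progress, quality, safety scores (hypothetical).
scores = [[0.8, 0.7, 0.9, 0.6],
          [0.6, 0.7, 0.8, 0.9],
          [0.9, 0.7, 0.7, 0.8]]
w = entropy_weights(scores)
```

Note that the progress column, identical across all schemes, gets essentially zero weight: a non-discriminating index carries no information for ranking.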

  3. Multinomial-exponential reliability function: a software reliability model

    International Nuclear Information System (INIS)

    Saiz de Bustamante, Amalio; Saiz de Bustamante, Barbara

    2003-01-01

    The multinomial-exponential reliability function (MERF) was developed during a detailed study of the software failure/correction processes. Later on, MERF was approximated by a much simpler exponential reliability function (EARF), which keeps most of MERF's mathematical properties, so the two functions together make up a single reliability model. The reliability model MERF/EARF considers the software failure process as a non-homogeneous Poisson process (NHPP), and the repair (correction) process as a multinomial distribution. The model supposes that both processes are statistically independent. The paper discusses the model's theoretical basis, its mathematical properties and its application to software reliability. Applications of the model to inspection and maintenance of physical systems are also foreseen. The paper includes a complete numerical example of the model application to a software reliability analysis
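The abstract does not give MERF's formulas, so as a stand-in the sketch below uses the Goel-Okumoto exponential NHPP, a standard exponential software-reliability model of the same NHPP family; the parameters are hypothetical.

```python
import math

def expected_failures(a, b, t):
    """Mean value function m(t) = a * (1 - exp(-b t)) of the exponential
    (Goel-Okumoto) NHPP: a is the eventual fault count, b the detection rate."""
    return a * (1.0 - math.exp(-b * t))

def reliability(a, b, t, dt):
    """P(no failure in (t, t + dt]) = exp(-(m(t + dt) - m(t))) for an NHPP."""
    return math.exp(-(expected_failures(a, b, t + dt) - expected_failures(a, b, t)))

a, b = 50.0, 0.1          # hypothetical total-fault count and detection rate
m10 = expected_failures(a, b, 10.0)   # expected failures observed by t = 10
r = reliability(a, b, 10.0, 1.0)      # chance of surviving the next unit of time
```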

  4. Development of the Japanese version of an information aid to provide accurate information on prognosis to patients with advanced non-small-cell lung cancer receiving chemotherapy: a pilot study.

    Science.gov (United States)

    Nakano, Kikuo; Kitahara, Yoshihiro; Mito, Mineyo; Seno, Misato; Sunada, Shoji

    2018-02-27

    Without explicit prognostic information, patients may overestimate their life expectancy and make poor choices at the end of life. We sought to design the Japanese version of an information aid (IA) to provide accurate information on prognosis to patients with advanced non-small-cell lung cancer (NSCLC) and to assess the effects of the IA on hope, psychosocial status, and perception of curability. We developed the Japanese version of an IA, which provided information on survival and cure rates as well as numerical survival estimates for patients with metastatic NSCLC receiving first-line chemotherapy. We then assessed the pre- and post-intervention effects of the IA on hope, anxiety, and perception of curability and treatment benefits. A total of 20 (95%) of 21 patients (65% male; median age, 72 years) completed the IA pilot test. Based on the results, scores on the Distress and Impact Thermometer screening tool for adjustment disorders and major depression tended to decrease (from 4.5 to 2.5; p = 0.204), whereas no significant changes were seen in scores for anxiety on the Japanese version of the Support Team Assessment Schedule or in scores on the Herth Hope Index (from 41.9 to 41.5; p = 0.204). The majority of the patients (16/20, 80%) had high expectations regarding the curative effects of chemotherapy. The Japanese version of the IA appeared to help patients with NSCLC maintain hope, and did not increase their anxiety when they were given explicit prognostic information; however, the IA did not appear to help such patients understand the goal of chemotherapy. Further research is needed to test the findings in a larger sample and measure the outcomes of explicit prognostic information on hope, psychological status, and perception of curability.

  5. Proceedings of the SRESA national conference on reliability and safety engineering

    International Nuclear Information System (INIS)

    Varde, P.V.; Vaishnavi, P.; Sujatha, S.; Valarmathi, A.

    2014-01-01

    The objective of this conference was to provide a forum for technical discussions on recent developments in the area of risk-based approaches and prognostic health management of critical systems in decision making. Reliability and safety engineering methods are concerned with the ways in which a product fails and with the effects of failure, in order to understand how the product works and to assure acceptable levels of safety. Reliability engineering addresses all the anticipated and possibly unanticipated causes of failure, to ensure that the occurrence of failure is prevented or minimized. The topics discussed in the conference were: Reliability in Engineering Design, Safety Assessment and Management, Reliability Analysis and Assessment, Stochastic Petri Nets for Reliability Modeling, Dynamic Reliability, Reliability Prediction, Hardware Reliability, Software Reliability in Safety Critical Issues, Probabilistic Safety Assessment, Risk Informed Approach, Dynamic Models for Reliability Analysis, Reliability Based Design and Analysis, Prognostics and Health Management, Remaining Useful Life (RUL), Human Reliability Modeling, Risk Based Applications, Hazard and Operability Study (HAZOP), Reliability in Network Security, and Quality Assurance and Management, etc. The papers relevant to INIS are indexed separately

  6. Consensus-based Distributed Control for Accurate Reactive, Harmonic and Imbalance Power Sharing in Microgrids

    DEFF Research Database (Denmark)

    Zhou, Jianguo; Kim, Sunghyok; Zhang, Huaguang

    2018-01-01

    This paper investigates the issue of accurate reactive, harmonic and imbalance power sharing in a microgrid. Harmonic and imbalance droop controllers are developed to proportionally share the harmonic power and the imbalance power among distributed generation (DG) units and improve the voltage… With the proposed methods, the microgrid system reliability and flexibility can be enhanced, and knowledge of the line impedance is not required. The reactive, harmonic and imbalance power can be proportionally shared among the DG units. Moreover, the quality of the voltage at PCC can…

  7. YouTube as a source of information on rhinosinusitis: the good, the bad and the ugly.

    Science.gov (United States)

    Biggs, T C; Bird, J H; Harries, P G; Salib, R J

    2013-08-01

    YouTube is an internet-based repository of user-generated content. This study aimed to determine whether YouTube represents a valid and reliable patient information resource for the lay person on the topic of rhinosinusitis. The study included the first 100 YouTube videos found using the search term 'sinusitis'. Videos were graded on their ability to inform the lay person on the subject of rhinosinusitis. Forty-five per cent of the videos were deemed to provide some useful information. Fifty-five per cent of the videos contained little or no useful information, and 27 per cent of these contained potentially misleading or even dangerous information. Videos uploaded by medical professionals or those from health information websites contained more useful information than those uploaded by independent users. YouTube appears to be an unreliable resource for accurate and up-to-date medical information relating to rhinosinusitis. However, it may provide some useful information if mechanisms exist to direct lay people to verifiable and credible sources.

  8. Activity-based costing as an information basis for an efficient strategic management process

    Directory of Open Access Journals (Sweden)

    Kaličanin Đorđe

    2013-01-01

    Activity-based costing (ABC) provides an information basis for monitoring and controlling one of two possible sources of competitive advantage: low-cost production and low-cost distribution. On the basis of cost information about particular processes and activities, management may determine their contribution to the success of a company, and may decide to transfer certain processes and activities to another company. Accuracy of cost information is conditioned by finding an adequate relation between overhead costs and cost objects, identifying and tracing cost drivers and output measures of activities, and monitoring the cost behaviour of different levels of a product. Basic characteristics of the ABC approach, such as more accurate cost price accounting of objects, focusing on process and activity output (rather than only on resource consumption) and on understanding and interpretation of cost structure (rather than on cost measurement), enable managers to estimate and control future costs more reliably. Thus the ABC methodology provides a foundation for cost tracing, analysis, and management, which entails making quality, accurate operative and strategic decisions as a basis for the long-term orientation of a company. ABC is also complementary to the widely accepted technique of strategic planning and strategy implementation known as the Balanced Scorecard (BSC).
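The core ABC mechanics, tracing overhead to cost objects through activity cost drivers, can be sketched with hypothetical activity pools and products:

```python
# Activity-based costing: trace overhead to products via activity cost drivers.
activities = {                               # pool: (overhead cost, total driver units)
    "machine_setups": (20000.0, 100),
    "quality_checks": (9000.0, 300),
}
driver_usage = {                             # driver units consumed per product
    "product_a": {"machine_setups": 60, "quality_checks": 100},
    "product_b": {"machine_setups": 40, "quality_checks": 200},
}

def abc_overhead(product):
    """Overhead traced to one product: sum of (pool rate x driver usage)."""
    total = 0.0
    for activity, (cost, volume) in activities.items():
        rate = cost / volume                 # cost per driver unit
        total += rate * driver_usage[product][activity]
    return total

oh_a = abc_overhead("product_a")
oh_b = abc_overhead("product_b")
```

The traced amounts exhaust the pools exactly, which is the sense in which ABC ties overhead to the activities that actually consume it.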

  9. Reliability of thermal interface materials: A review

    International Nuclear Information System (INIS)

    Due, Jens; Robinson, Anthony J.

    2013-01-01

    Thermal interface materials (TIMs) are used extensively to improve thermal conduction across two mating parts. They are particularly crucial in electronics thermal management, since excessive junction-to-ambient thermal resistances can cause elevated temperatures which can negatively influence device performance and reliability. Of particular interest to electronic package designers is the thermal resistance of the TIM layer at the end of its design life. Estimating this allows the package to be designed to perform adequately over its entire useful life. To this end, TIM reliability studies have been performed using accelerated stress tests. This paper reviews the body of work which has been performed on TIM reliability. It focuses on the various test methodologies, with commentary on the results which have been obtained for the different TIM materials. Based on the information available in the open literature, a test procedure is proposed for TIM selection based on beginning and end of life performance. - Highlights: ► This paper reviews the body of work which has been performed on TIM reliability. ► Test methodologies for reliability testing are outlined. ► Reliability results for the different TIM materials are discussed. ► A test procedure is proposed for TIM selection based on beginning- and end-of-life performance.
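The junction-to-ambient resistance chain the review refers to is a simple series stack; a sketch with hypothetical resistances, comparing a fresh TIM against a degraded end-of-life one:

```python
def junction_temperature(t_ambient, power, r_jc, r_tim, r_sa):
    """Junction temperature (deg C) from a series thermal-resistance stack:
    junction -> case -> TIM -> heat sink -> ambient, all in K/W."""
    return t_ambient + power * (r_jc + r_tim + r_sa)

# Hypothetical 50 W package: beginning-of-life vs degraded end-of-life TIM.
tj_bol = junction_temperature(t_ambient=35.0, power=50.0, r_jc=0.3, r_tim=0.2, r_sa=0.5)
tj_eol = junction_temperature(t_ambient=35.0, power=50.0, r_jc=0.3, r_tim=0.6, r_sa=0.5)
```

Tripling the TIM resistance over life raises the junction by 20 deg C here, which is why end-of-life TIM resistance, not the fresh value, should drive the selection.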

  10. Practical application of reliability engineering in detailed design and maintenance

    International Nuclear Information System (INIS)

    Barden, S.E.

    1975-01-01

    Modern plant systems are closely coupled combinations of sophisticated and expensive equipment, some important parts of which may be in the development stage (the high-technology sector), and simpler, crude but not necessarily cheap equipment (the low-technology sector). Manpower resources involved with such plant systems can also be placed in high- and low-technology categories (i.e. specialist design and construction staff, and production staff, respectively). Neither can operate effectively without the other, and both are equally important. A sophisticated on-line computer controlling plant or analysing fault symptoms is useless, if not unsafe, if the peripheral sensing and control equipment on the plant providing input data is poorly designed, inaccurate, and/or unreliable because of inadequate maintenance. Similarly, the designer can be misled and misinformed, and subsequent design evolution can be wrongly directed, if production records do not accurately reflect what is actually happening on the plant. The application of reliability technology can be counterproductive if it demands more effort in the collection of data than it saves in facilitating quick, correct engineering decisions and more accurate assessments of resource requirements. Reliability engineering techniques must be simplified to make their use widely adopted in the important low-technology sector, and established in all financial and contractual procedures associated with design specification and production management. This paper develops this theme with practical examples. (author)

  11. On the safeness of examinees and the reliability of system

    International Nuclear Information System (INIS)

    Kudo, Kazumi; Kanda, Kosuke; Saito, Kazuhiko; Maesawa, Tsuneharu; Idekami, Tomio

    1979-01-01

    The control technique for the reliability of the examination system was investigated from the viewpoints of patient safety and image information, based on the prevention of microshock in circulatory-organ checking systems. As for hospital equipment, room size, air conditioning, power source installation, earthing, and piping arrangements should be fully discussed at the planning stage. An EPR system must be introduced to prevent microshock. Intensive education and training are required for operators to ensure safe operation. Thorough care should be taken to prevent bacterial infection. Further examinations were made of the control technique for the reliability of the photographing system from the viewpoint of image information; it is necessary to study the factors for obtaining the reliability of compound machinery components and radiation-generating devices. (Kobatake, H.)

  12. Development of the Centralized Reliability Data Organization (CREDO)

    International Nuclear Information System (INIS)

    Haas, P.M.; Bott, T.F.; Knee, H.E.; Manning, J.J.; Hudson, S.D.; Greene, N.M.; Woodside, M.A.

    1979-01-01

    CREDO has been established to meet the needs of the US Breeder Reactor Program and the advanced reactor community for a centralized source of accurate reliability/maintainability data and data-related services. The center provides for a comprehensive program of collection, evaluation, and dissemination of data and for necessary user services. A steering committee of nationally recognized experts has been formed to guide CREDO development and operation. Historic data have been collected, analyzed, and published. Routine, continuous data collection and processing has been initiated at US sites. Interfaces with existing data bases have been established. Special user services are under development

  13. Analysis of the Reliability of the "Alternator- Alternator Belt" System

    Directory of Open Access Journals (Sweden)

    Ivan Mavrin

    2012-10-01

    Full Text Available Before starting and also during the exploitation of various systems, it is very important to know how the system and its parts will behave during operation regarding breakdowns, i.e. failures. It is possible to predict the service behaviour of a system by determining the functions of reliability, as well as frequency and intensity of failures. The paper considers the theoretical basics of the functions of reliability, frequency and intensity of failures for the two main approaches. One includes 6 equal intervals and the other 13 unequal intervals for the concrete case taken from practice. The reliability of the "alternator - alternator belt" system installed in the buses has been analysed, according to the empirical data on failures. The empirical data on failures provide empirical functions of reliability and frequency and intensity of failures, which are presented in tables and graphically. The first analysis, performed by dividing the mean time between failures into 6 equal time intervals, has given the forms of empirical functions of failure frequency and intensity that approximately correspond to typical functions. By dividing the failure phase into 13 unequal intervals with two failures in each interval, these functions indicate explicit transitions from the early failure interval into the random failure interval, i.e. into the ageing interval. Functions thus obtained are more accurate and represent a better solution for the given case. In order to estimate the reliability of these systems with greater accuracy, a greater number of failures needs to be analysed.

  14. A reliable method for ageing of whiting (Merlangius merlangus) for use in stock assessment and management

    DEFF Research Database (Denmark)

    Ross, Stine Dalmann; Hüssy, Karin

    2013-01-01

    Accurate age estimation is important for stock assessment and management. The importance of reliable ageing is emphasized by the impending analytical assessment of whiting (Merlangius merlangus) in the Baltic Sea. Whiting is a top predator in the western Baltic Sea, where it is fished commercially...

  15. Representing Geospatial Environment Observation Capability Information: A Case Study of Managing Flood Monitoring Sensors in the Jinsha River Basin

    Science.gov (United States)

    Hu, Chuli; Guan, Qingfeng; Li, Jie; Wang, Ke; Chen, Nengcheng

    2016-01-01

    Sensor inquirers cannot understand comprehensive or accurate observation capability information because current observation capability modeling does not consider the union of multiple sensors nor the effect of geospatial environmental features on the observation capability of sensors. These limitations result in a failure to discover credible sensors or plan for their collaboration for environmental monitoring. The Geospatial Environmental Observation Capability (GEOC) is proposed in this study and can be used as an information basis for the reliable discovery and collaborative planning of multiple environmental sensors. A field-based GEOC (GEOCF) information representation model is built. Quintuple GEOCF feature components and two GEOCF operations are formulated based on the geospatial field conceptual framework. The proposed GEOCF markup language is used to formalize the proposed GEOCF. A prototype system called GEOCapabilityManager is developed, and a case study is conducted for flood observation in the lower reaches of the Jinsha River Basin. The applicability of the GEOCF is verified through the reliable discovery of flood monitoring sensors and planning for the collaboration of these sensors. PMID:27999247

  16. Improving Reliability of Information Leakage Detection and Prevention Systems

    Directory of Open Access Journals (Sweden)

    A. V. Mamaev

    2011-03-01

    Full Text Available The problem of protection against deliberate leaks of information is one of the most difficult. Integrated systems of information protection against insiders have a serious drawback. Exploiting this drawback, an offender gains the possibility of unauthorized theft of information from a working machine.

  17. reliability reliability

    African Journals Online (AJOL)

    eobe

    Corresponding author, Tel: +234-703. RELIABILITY .... V , , given by the code of practice. However, checks must .... an optimization procedure over the failure domain F corresponding .... of Concrete Members based on Utility Theory,. Technical ...

  18. ARCHITECTURE AND RELIABILITY OF OPERATING SYSTEMS

    Directory of Open Access Journals (Sweden)

    Stanislav V. Nazarov

    2018-03-01

    Full Text Available Progress in the production technology of microprocessors has significantly increased the reliability and performance of computer systems hardware. The same cannot be said of the corresponding characteristics of the software and its basis, the operating system (OS); the achievements of software engineering in this field are more modest. Both directions of OS improvement (increasing productivity and reliability) are connected with the development of effective structures for these systems. The functional complexity of the OS leads to the complexity of its architecture, which is further enhanced by the specialization of the operating system depending on the computer system application area (complex scientific calculations, real time, information retrieval systems, automated and automatic control systems, etc.). That fact has led to the variety of modern OS. It is possible to estimate the reliability of different OS structures only from the results of long-term field experiments or simulation modeling. However, this is most often unacceptable because of the time and funds required for such research. This survey attempts to evaluate the reliability of two main OS architectures: a large multi-layered modular core and a multiserver (client-server) system. Models of these systems are developed and represented by continuous Markov chains, which are explored in the stationary mode by passing from Kolmogorov's system of differential equations to a system of linear algebraic equations.
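
    The stationary Markov analysis described in this abstract reduces a system of Kolmogorov differential equations to linear balance equations. A minimal sketch for a two-state repairable component (the failure and repair rates are illustrative assumptions, not values from the paper):

```python
# Stationary availability of a two-state (up/down) repairable component,
# modeled as a continuous-time Markov chain. Setting the time derivatives
# in Kolmogorov's equations to zero gives pi @ Q = 0 with sum(pi) = 1,
# which for two states has a closed-form solution.
lam = 0.01   # failure rate, per hour (illustrative)
mu = 0.5     # repair rate, per hour (illustrative)

# Generator matrix Q over states [up, down]
Q = [[-lam, lam],
     [mu, -mu]]

# Balance equation: pi_up * lam = pi_down * mu, with pi_up + pi_down = 1
pi_up = mu / (lam + mu)
pi_down = lam / (lam + mu)

print(round(pi_up, 4))  # stationary availability, 0.9804
```

For larger state spaces (such as the multi-layered and client-server models discussed), the same balance equations are solved as a general linear system rather than in closed form.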

  19. Reliability of containment and safety-related structures

    International Nuclear Information System (INIS)

    Nessim, M.A.

    1995-09-01

    A research program on Reliability of Containment and Safety-related Structures has been developed and is described in this document. This program is designed to support AECB's regulatory activities aimed at ensuring the safety of these structures. These activities include evaluating submissions by operators and requesting special assessments when necessary. The results of the proposed research will also be useful in revising and enhancing the CSA design standards for containment and safety-related structures. The process of developing the research program started with an information collection and review phase. The sources of information included C-FER's previous work in the area, various recent research publications, regulatory documents and relevant design standards, and a detailed discussion with AECB staff. The second step was to outline the process of reliability evaluation, and identify the required models and parameters. Comparison between the required and available information was used to identify gaps in the state-of-the-art, and the research program was designed to fill these gaps. The program is organized in four major topics, namely: development of an approach for reliability analysis; compilation and development of the required analysis tools; application to specific problems related to design, assessment, maintenance and testing of structures; and testing and validation. It is suggested that the program should be supported by an on-going process of communication and consultation between AECB staff and industry experts. This will lend credibility to the results and facilitate their future application. (author). 1 fig

  20. The fair value of operational reliability

    International Nuclear Information System (INIS)

    Patino-Echeverri, Dalia; Morel, Benoit

    2005-01-01

    Information about the uncertainties that surround the operation of the power system can be used to enlighten the debate over how much reliability should be pursued and how resources should be allocated to pursue it. In this paper we present a method to determine the value of having flexible generators available to react to load fluctuations. This value can be seen as the value of hedging against uncertainty in the load due to the volatility of demand and the possibility of congestion. Because having this flexibility can be related to a financial option, we use an extension of options theory, and in particular the risk-neutral valuation method, to find a risk-neutral quantification of its value. We illustrate our point by valuing the flexibility that leads to "operational reliability" in the PJM market. Our formula for that value is what we call "the fair value" of operational reliability. (Author)
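
    The risk-neutral valuation the authors refer to can be sketched with a toy Monte Carlo estimate that treats flexibility as a European call option on a volatile spot price. All parameters below are illustrative assumptions, not PJM data or the paper's formula:

```python
import math
import random

# Toy risk-neutral Monte Carlo valuation: flexibility is modeled as a call
# option that pays off when the spot price exceeds the cost of providing
# the flexible capacity. Parameters are purely illustrative.
random.seed(1)
S0, K = 40.0, 50.0        # current price and strike (cost of flexibility)
r, sigma, T = 0.05, 0.4, 1.0  # risk-free rate, volatility, horizon (years)
n = 100_000

total = 0.0
for _ in range(n):
    z = random.gauss(0.0, 1.0)
    # Risk-neutral geometric Brownian motion terminal price
    ST = S0 * math.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
    total += max(ST - K, 0.0)

value = math.exp(-r * T) * total / n  # discounted expected payoff
print(f"option value ≈ {value:.2f}")
```

The option-like payoff structure is what makes risk-neutral pricing applicable; the paper's actual formula for the fair value of operational reliability is more specific than this generic sketch.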

  1. Accurate and Reliable Prediction of the Binding Affinities of Macrocycles to Their Protein Targets.

    Science.gov (United States)

    Yu, Haoyu S; Deng, Yuqing; Wu, Yujie; Sindhikara, Dan; Rask, Amy R; Kimura, Takayuki; Abel, Robert; Wang, Lingle

    2017-12-12

    Macrocycles have been emerging as a very important drug class in the past few decades, largely due to their expanded chemical diversity benefiting from advances in synthetic methods. Macrocyclization has been recognized as an effective way to restrict the conformational space of acyclic small molecule inhibitors with the hope of improving potency, selectivity, and metabolic stability. Because of their relatively larger size as compared to typical small molecule drugs and the complexity of their structures, efficient sampling of the accessible macrocycle conformational space and accurate prediction of their binding affinities to their target protein receptors pose a great challenge of central importance in computational macrocycle drug design. In this article, we present a novel method for relative binding free energy calculations between macrocycles with different ring sizes and between the macrocycles and their corresponding acyclic counterparts. We have applied the method to seven pharmaceutically interesting data sets taken from recent drug discovery projects including 33 macrocyclic ligands covering a diverse chemical space. The predicted binding free energies are in good agreement with experimental data with an overall root-mean-square error (RMSE) of 0.94 kcal/mol. This is to our knowledge the first time where the free energy of the macrocyclization of linear molecules has been directly calculated with rigorous physics-based free energy calculation methods, and we anticipate the outstanding accuracy demonstrated here across a broad range of target classes may have significant implications for macrocycle drug discovery.

  2. Reliability of surface electromyography timing parameters in gait in cervical spondylotic myelopathy.

    LENUS (Irish Health Repository)

    Malone, Ailish

    2012-02-01

    The aims of this study were to validate a computerised method to detect muscle activity from surface electromyography (SEMG) signals in gait in patients with cervical spondylotic myelopathy (CSM), and to evaluate the test-retest reliability of the activation times designated by this method. SEMG signals were recorded from rectus femoris (RF), biceps femoris (BF), tibialis anterior (TA), and medial gastrocnemius (MG) during gait in 12 participants with CSM on two separate test days. Four computerised activity detection methods, based on the Teager-Kaiser Energy Operator (TKEO), were applied to a subset of signals and compared to visual interpretation of muscle activation. The most accurate method was then applied to all signals for evaluation of test-retest reliability. A detection method based on a combined slope and amplitude threshold showed the highest agreement (87.5%) with visual interpretation. With respect to reliability, the standard error of measurement (SEM) of the timing of RF, TA and MG between test days was 5.5% of stride duration or less, while the SEM of BF was 9.4%. The timing parameters of RF, TA and MG designated by this method were considered sufficiently reliable for use in clinical practice; however, the reliability of BF was questionable.
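
    The Teager-Kaiser Energy Operator at the core of these detection methods is simple to state: psi[n] = x[n]^2 - x[n-1]*x[n+1]. A minimal sketch with a plain amplitude threshold (the study's combined slope-and-amplitude rule is more elaborate; the signal and threshold below are made up for illustration):

```python
def tkeo(x):
    """Teager-Kaiser Energy Operator: psi[n] = x[n]^2 - x[n-1]*x[n+1]."""
    return [x[n] ** 2 - x[n - 1] * x[n + 1] for n in range(1, len(x) - 1)]

# Toy "SEMG" trace: quiet baseline followed by a burst of activity
signal = [0.01, -0.02, 0.01, 0.0, 1.0, -1.2, 1.1, -0.9, 1.0, 0.02, -0.01]
energy = tkeo(signal)

# A simple amplitude threshold on the TKEO output flags active samples.
threshold = 0.1
active = [e > threshold for e in energy]
print(active)
```

The TKEO emphasizes instantaneous amplitude and frequency changes, which is why it sharpens onset detection relative to thresholding the raw signal amplitude alone.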

  3. Reliability of the Suicide Opinion Questionnaire.

    Science.gov (United States)

    Rogers, James R.; DeShon, Richard P.

    The lack of systematic psychometric information on the Suicide Opinion Questionnaire (SOQ) was addressed by investigating the factor structure and reliability of the eight-factor clinical scale model (mental illness, cry for help, right to die, religion, impulsivity, normality, aggression, and moral evil), developed for interpreting responses to…

  4. Fast, reliable sexing of prosimian DNA

    DEFF Research Database (Denmark)

    Fredsted, Tina; Villesen, Palle

    2004-01-01

    to identify conserved regions in the amelogenin gene. Using these conserved regions, we can target species that have no sequence information. We designed a single, conserved primer pair that is useful for fast and reliable molecular sexing of prosimian primates. A single PCR yields two fragments in males...

  5. Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.

    Science.gov (United States)

    Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander

    2018-04-10

    Many models have been made for predicting software reliability. These reliability models are restricted to particular types of methodologies and a restricted number of parameters. There are a number of techniques and methodologies that may be used for reliability prediction. There is a need to focus on parameter selection while estimating reliability. The reliability of a system may increase or decrease depending on the selection of the different parameters used. Thus there is a need to identify the factors that heavily affect the reliability of the system. At present, reusability is used in various areas of research. Reusability is the basis of Component-Based Systems (CBS). Cost, time and human skill can be saved using Component-Based Software Engineering (CBSE) concepts. CBSE metrics may be used to assess which techniques are more suitable for estimating system reliability. Soft computing is used for small as well as large-scale problems where it is difficult to find accurate results due to uncertainty or randomness. Several possibilities are available for applying soft computing techniques to medicine-related problems. The clinical science of medicine uses fuzzy logic and neural network methodology significantly, while the basic science of medicine uses neural-network-genetic-algorithm methods most frequently and preferably. There is unavoidable interest shown by medical scientists in using the various soft computing methodologies in the genetics, physiology, radiology, cardiology and neurology disciplines. CBSE encourages users to reuse past and existing software for making new products, providing quality with savings of time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques such as Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). This paper presents working of soft computing

  6. Measuring reliability under epistemic uncertainty: Review on non-probabilistic reliability metrics

    Directory of Open Access Journals (Sweden)

    Kang Rui

    2016-06-01

    Full Text Available In this paper, a systematic review of non-probabilistic reliability metrics is conducted to assist the selection of appropriate reliability metrics to model the influence of epistemic uncertainty. Five frequently used non-probabilistic reliability metrics are critically reviewed, i.e., evidence-theory-based reliability metrics, interval-analysis-based reliability metrics, fuzzy-interval-analysis-based reliability metrics, possibility-theory-based reliability metrics (posbist reliability) and uncertainty-theory-based reliability metrics (belief reliability). It is pointed out that a qualified reliability metric that is able to consider the effect of epistemic uncertainty needs to (1) compensate for the conservatism in the estimations of the component-level reliability metrics caused by epistemic uncertainty, and (2) satisfy the duality axiom, otherwise it might lead to paradoxical and confusing results in engineering applications. The five commonly used non-probabilistic reliability metrics are compared in terms of these two properties, and the comparison can serve as a basis for the selection of the appropriate reliability metrics.

  7. The complex network reliability and influential nodes

    Science.gov (United States)

    Li, Kai; He, Yongfeng

    2017-08-01

    In order to study complex network node importance and reliability, we consider semi-local centrality, betweenness centrality and the PageRank algorithm, using a simulation method that gradually removes nodes and recalculates importance in random networks, small-world networks and scale-free networks. Studying the relationship between the largest connected component and the proportion of nodes removed, the results show that betweenness centrality and the PageRank algorithm, which are based on global network information, are more effective for evaluating the importance of nodes, and that the reliability of the network is related to the network topology.
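
    The removal experiment can be sketched on a toy graph: delete a high-centrality node and recompute the size of the largest connected component. A stdlib-only sketch (the graph is illustrative, and simply targeting the hub stands in for the centrality measures named in the abstract):

```python
from collections import deque

def largest_cc(nodes, adj):
    """Size of the largest connected component restricted to `nodes`."""
    seen, best = set(), 0
    for s in nodes:
        if s in seen:
            continue
        # BFS from an unvisited node, counting the component size
        queue, size = deque([s]), 0
        seen.add(s)
        while queue:
            u = queue.popleft()
            size += 1
            for v in adj[u]:
                if v in nodes and v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, size)
    return best

# Toy network: node 0 is a hub joining two branches, so removing this
# high-centrality node fragments the network.
adj = {0: {1, 2, 3, 4}, 1: {0}, 2: {0}, 3: {0, 5}, 4: {0}, 5: {3}}
nodes = set(adj)
print(largest_cc(nodes, adj))        # 6: fully connected
print(largest_cc(nodes - {0}, adj))  # 2: hub removal fragments it
```

Repeating this over a sequence of removals, ranked by a chosen centrality measure, yields the "largest component vs. fraction removed" curves the abstract describes.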

  8. National Aeronautics and Space Administration: Leadership and Systems Needed to Effect Financial Management Improvements

    National Research Council Canada - National Science Library

    Kutz, Gregory

    2002-01-01

    ...). This implied that NASA not only could generate reliable information once a year for external financial reporting purposes but also could provide accurate, reliable information for day-to-day decision-making...

  9. A Technique Using Calibrated Photography and Photoshop for Accurate Shade Analysis and Communication.

    Science.gov (United States)

    McLaren, Edward A; Figueira, Johan; Goldstein, Ronald E

    2017-02-01

    This article reviews the critical aspects of controlling the shade-taking environment and discusses various modalities introduced throughout the years to acquire and communicate shade information. Demonstrating a highly calibrated digital photographic technique for capturing shade information, this article shows how to use Photoshop® to standardize images and extract color information from the tooth and shade tab for use by a ceramist for an accurate shade-matching restoration.

  10. Accurate location estimation of moving object In Wireless Sensor network

    Directory of Open Access Journals (Sweden)

    Vinay Bhaskar Semwal

    2011-12-01

    Full Text Available One of the central issues in wireless sensor networks is tracking the location of a moving object, which carries the overhead of saving data, while obtaining an accurate estimation of the target object's location under energy constraints. We do not have any mechanism that controls and maintains the data, and the wireless communication bandwidth is also very limited. Some fields using this technique are flood and typhoon detection, forest fire detection, and temperature and humidity monitoring; once we have this information, it can be fed back to central air conditioning and ventilation. In this research paper, we propose a protocol based on a prediction and adaptation algorithm, which reduces the number of sensor nodes used through an accurate estimation of the target location. We show that our tracking method performs well in terms of energy saving regardless of the mobility pattern of the mobile target, and that it extends the lifetime of the network with fewer sensor nodes. Once a new object is detected, a mobile agent is initiated to track the roaming path of the object.
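
    Prediction-based tracking protocols of this kind typically extrapolate the target's next position so that only sensors near the predicted point need to stay awake. A minimal sketch (the constant-velocity model and the sensor layout are assumptions for illustration, not the paper's exact algorithm):

```python
import math

def predict_next(p_prev, p_curr):
    """Constant-velocity prediction: next = current + (current - previous)."""
    return tuple(2 * c - p for p, c in zip(p_prev, p_curr))

def sensors_to_wake(sensors, point, radius):
    """Only sensors within `radius` of the predicted point stay active."""
    return [s for s, pos in sensors.items()
            if math.dist(pos, point) <= radius]

# Two past observations of the target (x, y), in arbitrary units
track = [(1.0, 0.5), (2.1, 1.0)]
pred = predict_next(track[0], track[1])

# Hypothetical sensor field; everything outside the prediction radius sleeps
sensors = {"s1": (3.0, 1.4), "s2": (0.5, 0.5), "s3": (3.5, 1.8)}
print(pred, sensors_to_wake(sensors, pred, radius=0.8))
```

Keeping distant nodes asleep between predictions is the source of the energy saving the abstract reports; when the prediction misses, the protocol falls back to waking a wider neighbourhood.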

  11. Automated reliability assessment for spectroscopic redshift measurements

    Science.gov (United States)

    Jamal, S.; Le Brun, V.; Le Fèvre, O.; Vibert, D.; Schmitt, A.; Surace, C.; Copin, Y.; Garilli, B.; Moresco, M.; Pozzetti, L.

    2018-03-01

    Context. Future large-scale surveys, such as the ESA Euclid mission, will produce a large set of galaxy redshifts (≥106) that will require fully automated data-processing pipelines to analyze the data, extract crucial information and ensure that all requirements are met. A fundamental element in these pipelines is to associate to each galaxy redshift measurement a quality, or reliability, estimate. Aim. In this work, we introduce a new approach to automate the spectroscopic redshift reliability assessment based on machine learning (ML) and characteristics of the redshift probability density function. Methods: We propose to rephrase the spectroscopic redshift estimation into a Bayesian framework, in order to incorporate all sources of information and uncertainties related to the redshift estimation process and produce a redshift posterior probability density function (PDF). To automate the assessment of a reliability flag, we exploit key features in the redshift posterior PDF and machine learning algorithms. Results: As a working example, public data from the VIMOS VLT Deep Survey is exploited to present and test this new methodology. We first tried to reproduce the existing reliability flags using supervised classification in order to describe different types of redshift PDFs, but due to the subjective definition of these flags (classification accuracy 58%), we soon opted for a new homogeneous partitioning of the data into distinct clusters via unsupervised classification. After assessing the accuracy of the new clusters via resubstitution and test predictions (classification accuracy 98%), we projected unlabeled data from preliminary mock simulations for the Euclid space mission into this mapping to predict their redshift reliability labels. Conclusions: Through the development of a methodology in which a system can build its own experience to assess the quality of a parameter, we are able to set a preliminary basis of an automated reliability assessment for

  12. Establishing magnetic resonance imaging as an accurate and reliable tool to diagnose and monitor esophageal cancer in a rat model.

    Directory of Open Access Journals (Sweden)

    Juliann E Kosovec

    Full Text Available OBJECTIVE: To assess the reliability of magnetic resonance imaging (MRI for detection of esophageal cancer in the Levrat model of end-to-side esophagojejunostomy. BACKGROUND: The Levrat model has proven utility in terms of its ability to replicate Barrett's carcinogenesis by inducing gastroduodenoesophageal reflux (GDER. Due to lack of data on the utility of non-invasive methods for detection of esophageal cancer, treatment efficacy studies have been limited, as adenocarcinoma histology has only been validated post-mortem. It would therefore be of great value if the validity and reliability of MRI could be established in this setting. METHODS: Chronic GDER reflux was induced in 19 male Sprague-Dawley rats using the modified Levrat model. At 40 weeks post-surgery, all animals underwent endoscopy, MRI scanning, and post-mortem histological analysis of the esophagus and anastomosis. With post-mortem histology serving as the gold standard, assessment of presence of esophageal cancer was made by five esophageal specialists and five radiologists on endoscopy and MRI, respectively. RESULTS: The accuracy of MRI and endoscopic analysis to correctly identify cancer vs. no cancer was 85.3% and 50.5%, respectively. ROC curves demonstrated that MRI rating had an AUC of 0.966 (p<0.001 and endoscopy rating had an AUC of 0.534 (p = 0.804. The sensitivity and specificity of MRI for identifying cancer vs. no-cancer was 89.1% and 80% respectively, as compared to 45.5% and 57.5% for endoscopy. False positive rates of MRI and endoscopy were 20% and 42.5%, respectively. CONCLUSIONS: MRI is a more reliable diagnostic method than endoscopy in the Levrat model. The non-invasiveness of the tool and its potential to volumetrically quantify the size and number of tumors likely makes it even more useful in evaluating novel agents and their efficacy in treatment studies of esophageal cancer.
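
    Sensitivity, specificity and accuracy figures like those reported follow directly from a 2x2 confusion matrix. A minimal sketch (the raw counts below are hypothetical, chosen only to approximately reproduce the reported MRI rates):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and accuracy from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)          # true positive rate
    specificity = tn / (tn + fp)          # true negative rate
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, accuracy

# Hypothetical counts only -- the study reports rates, not raw counts.
sens, spec, acc = diagnostic_metrics(tp=49, fp=8, fn=6, tn=32)
print(round(sens, 3), round(spec, 3), round(acc, 3))  # 0.891 0.8 0.853
```

The false positive rate quoted in the abstract is simply 1 - specificity, which is why MRI's 20% corresponds to its 80% specificity.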

  13. Reliability Approach of a Compressor System using Reliability Block ...

    African Journals Online (AJOL)

    pc

    2018-03-05

    Mar 5, 2018 ... This paper presents a reliability analysis of such a system using reliability ... Keywords-compressor system, reliability, reliability block diagram, RBD .... the same structure has been kept with the three subsystems: air flow, oil flow and .... and Safety in Engineering Design", Springer, 2009. [3] P. O'Connor ...

  14. A methodology and success/failure criteria for determining emergency diesel generator reliability

    International Nuclear Information System (INIS)

    Wyckoff, H.L.

    1986-01-01

    In the U.S., comprehensive records of nationwide emergency diesel generator (EDG) reliability at nuclear power plants have not been consistently collected. Those surveys that have been undertaken have not always been complete and accurate. Moreover, they have been based on an extremely conservative methodology and success/failure criteria that are specified in U.S. Nuclear Regulatory Commission Reg. Guide 1.108. This Reg. Guide was one of the NRC's earlier efforts and does not yield the caliber of statistically defensible reliability values that are now needed. On behalf of the U.S. utilities, EPRI is taking the lead in organizing, investigating, and compiling a realistic database of EDG operating success/failure experience for the years 1983, 1984 and 1985. These data will be analyzed to provide an overall picture of EDG reliability. This paper describes the statistical methodology and start and run success/failure criteria that EPRI is using. The survey is scheduled to be completed in March 1986. (author)

  15. A methodology and success/failure criteria for determining emergency diesel generator reliability

    Energy Technology Data Exchange (ETDEWEB)

    Wyckoff, H. L. [Electric Power Research Institute, Palo Alto, California (United States)

    1986-02-15

    In the U.S., comprehensive records of nationwide emergency diesel generator (EDG) reliability at nuclear power plants have not been consistently collected. Those surveys that have been undertaken have not always been complete and accurate. Moreover, they have been based on an extremely conservative methodology and success/failure criteria that are specified in U.S. Nuclear Regulatory Commission Reg. Guide 1.108. This Reg. Guide was one of the NRC's earlier efforts and does not yield the caliber of statistically defensible reliability values that are now needed. On behalf of the U.S. utilities, EPRI is taking the lead in organizing, investigating, and compiling a realistic database of EDG operating success/failure experience for the years 1983, 1984 and 1985. These data will be analyzed to provide an overall picture of EDG reliability. This paper describes the statistical methodology and start and run success/failure criteria that EPRI is using. The survey is scheduled to be completed in March 1986. (author)

  16. An enhanced reliability-oriented workforce planning model for process industry using combined fuzzy goal programming and differential evolution approach

    Science.gov (United States)

    Ighravwe, D. E.; Oke, S. A.; Adebiyi, K. A.

    2018-03-01

    This paper draws on the "human reliability" concept as a structure for gaining insight into maintenance workforce assessment in a process industry. Human reliability hinges on developing the reliability of humans to a threshold that guides the maintenance workforce to execute accurate decisions within the limits of resource and time allocations. This concept offers a worthwhile point of departure to encompass three elegant adjustments to the literature model in terms of maintenance time, workforce performance and return-on-workforce investments. The presented structure breaks new ground in maintenance workforce theory and practice from a number of perspectives. First, we have successfully implemented fuzzy goal programming (FGP) and differential evolution (DE) techniques for the solution of an optimisation problem in the maintenance of a process plant for the first time. The results obtained in this work showed better quality of solution from the DE algorithm compared with those of the genetic algorithm and the particle swarm optimisation algorithm, demonstrating the superiority of the proposed procedure over them. Second, the analytical discourse, which was framed on stochastic theory and focused on a specific application to a process plant in Nigeria, is a novelty. The work provides more insights into maintenance workforce planning during overhaul rework and overtime maintenance activities in manufacturing systems and demonstrates the capacity to generate substantially helpful information for practice.
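
    Differential evolution itself is compact to sketch. A minimal DE/rand/1/bin minimizer applied to a sphere test function (this is a generic textbook illustration, not the paper's fuzzy-goal-programming formulation or its parameter settings):

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           gens=300, seed=0):
    """Minimal DE/rand/1/bin minimizer over box-constrained variables."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # Mutation: combine three distinct individuals (rand/1)
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)  # guarantees one mutated component
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == jrand:  # binomial crossover
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    trial.append(min(max(v, lo), hi))
                else:
                    trial.append(pop[i][j])
            ft = f(trial)
            if ft <= fit[i]:  # greedy selection
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=lambda k: fit[k])
    return pop[best], fit[best]

# Sphere function: global minimum 0 at the origin
x_best, f_best = differential_evolution(
    lambda v: sum(t * t for t in v), [(-5.0, 5.0)] * 3)
print(f"best value ≈ {f_best:.2e}")
```

In the paper's setting, the objective would instead be the fuzzy-goal-programming achievement function over workforce variables, with the same mutation/crossover/selection loop.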

  17. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  18. Reliability of thermal-hydraulic passive safety systems

    International Nuclear Information System (INIS)

    D'Auria, F.; Araneo, D.; Pierro, F.; Galassi, G.

    2014-01-01

    The scholar will be informed of reliability concepts applied to passive systems adopted for nuclear reactors. Namely, for classical components and systems the failure concept is associated with malfunction or breakage of hardware. In the case of passive systems the failure is associated with phenomena. A method for studying the reliability of passive systems is discussed and applied. The paper deals with the description of the REPAS (Reliability Evaluation of Passive Safety System) methodology developed by the University of Pisa (UNIPI) and with results from its application. The general objective of the REPAS methodology is to characterize the performance of a passive system in order to increase the confidence in its operation and to compare the performances of active and passive systems and the performances of different passive systems

  19. Determining the reliability function of the thermal power system in power plant "Nikola Tesla, Block B1"

    Directory of Open Access Journals (Sweden)

    Kalaba Dragan V.

    2015-01-01

    Full Text Available The presentation of a probabilistic technique for evaluating thermal power system reliability is the main subject of this paper. The thermal power plant system under study consists of three subsystems, and the reliability assessment is based on a sixteen-year failure database. By applying the mathematical theory of reliability to exploitation research data and using the complex two-parameter Weibull distribution, the theoretical reliability functions of the specified system have been determined. The obtained probabilistic laws of failure occurrence have confirmed the hypothesis that the distribution of the observed random variable fully describes the behaviour of such a system in terms of reliability. The results shown make it possible to acquire better knowledge of the current state of the system, as well as a more accurate estimate of its behaviour during future exploitation. The final benefit is the opportunity for potential improvement of complex system maintenance policies aimed at reducing unexpected failure occurrences.
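
    The fitted Weibull parameters are not given in this abstract, so the sketch below only illustrates how a two-parameter Weibull model yields a reliability function and a mean time between failures; the shape and scale values are assumed for illustration:

```python
import math

def weibull_reliability(t, beta, eta):
    """R(t) = exp(-(t/eta)^beta) for the two-parameter Weibull model."""
    return math.exp(-((t / eta) ** beta))

def weibull_mtbf(beta, eta):
    """Mean time between failures: MTBF = eta * Gamma(1 + 1/beta)."""
    return eta * math.gamma(1.0 + 1.0 / beta)

# Illustrative parameters only (the paper's fitted values are not given here):
beta, eta = 1.8, 12000.0          # shape, scale [h]
r_one_year = weibull_reliability(8760.0, beta, eta)
```

    With beta = 1 the model reduces to the exponential (constant failure rate) case; beta > 1, as assumed here, models wear-out behaviour.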

  20. System ergonomics as an approach to improve human reliability

    International Nuclear Information System (INIS)

    Bubb, H.

    1988-01-01

    The application of systems techniques to ergonomic problems is called system ergonomics. It enables improvements in human reliability through design measures. The precondition for this is knowledge of how information processing is performed by man and machine. By considering sensory processing, cognitive processing, and motor processing separately, it is possible to form a more exact idea of the system element 'man'. The system element 'machine' is well described by differential equations, which allow an ergonomic assessment of manoeuvrability. Knowledge of the information processing of man and machine enables a task analysis. This reveals, on the one hand, the human limits arising from the different properties of the task and, on the other, suitable ergonomic design proposals that improve the reliability of the total system. A disadvantage, however, is that the change in human reliability achieved by such measures cannot at present be quantified numerically. (orig.)

  1. Case study on the use of PSA methods: Human reliability analysis

    International Nuclear Information System (INIS)

    1991-04-01

    The overall objective of treating human reliability in a probabilistic safety analysis is to ensure that the key human interactions of typical crews are accurately and systematically incorporated into the study in a traceable manner. An additional objective is to make the human reliability analysis (HRA) as realistic as possible, taking into account the emergency procedures, the man-machine interface, the focus of the training process, and the knowledge and experience of the crews. Section 3 of the paper gives an overview of this analytical process, which leads to three more detailed example problems described in Section 4. Section 5 discusses a peer review process. References useful in performing HRAs are presented. In addition, appendices are provided for definitions, selected data and a generic list of performance shaping factors. 35 refs, figs and tabs

  2. System Reliability Engineering

    International Nuclear Information System (INIS)

    Lim, Tae Jin

    2005-02-01

    This book covers reliability engineering, including quality and reliability, reliability data, the importance of reliability engineering, reliability measures, the Poisson process (goodness-of-fit tests and the Poisson arrival model), reliability estimation (e.g., for the exponential distribution), reliability of systems, availability, preventive maintenance (replacement policies, minimal repair policies, shock models, spares, group maintenance and periodic inspection), analysis of common cause failures, and a model for analysing the effect of repair.
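
    As a minimal illustration of one topic listed above, reliability estimation under the exponential distribution reduces to the maximum-likelihood failure rate lambda_hat = n / sum(t_i); the failure times below are invented for the example:

```python
import math

def exponential_mle_rate(failure_times):
    """MLE of the constant failure rate: lambda_hat = n / sum(t_i)."""
    return len(failure_times) / sum(failure_times)

def reliability(t, lam):
    """R(t) = exp(-lambda * t) under the exponential model."""
    return math.exp(-lam * t)

times = [120.0, 95.0, 210.0, 160.0, 75.0]   # illustrative failure times [h]
lam = exponential_mle_rate(times)
mttf = 1.0 / lam                             # mean time to failure [h]
```

    The same estimator appears throughout classical reliability texts because the exponential model's constant hazard rate makes the likelihood maximisation closed-form.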

  3. Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR)

    International Nuclear Information System (INIS)

    Gilbert, B.G.; Reece, W.J.; Gertman, D.I.; Gilmore, W.E.; Galyean, W.J.

    1990-12-01

    The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) is an automated data base management system for processing and storing human error probability and hardware component failure data. The NUCLARR system software resides on an IBM (or compatible) personal computer. NUCLARR can furnish the end user with data inputs for both human and hardware reliability analysis in support of a variety of risk assessment activities. The NUCLARR system is documented in a five-volume series of reports. Volume 5: Data Manual provides a hard-copy representation of all data and related information available within the NUCLARR system software. This document is organized in three sections. Part 1 is the summary description, which presents an overview of the NUCLARR system and data processing procedures. Part 2 contains all data and information relevant to the human error probability (HEP) data side of NUCLARR. Data and information for the hardware component failure data (HCFD) side are presented in Part 3. 7 refs

  4. A Novel Evaluation Method for Building Construction Project Based on Integrated Information Entropy with Reliability Theory

    Directory of Open Access Journals (Sweden)

    Xiao-ping Bai

    2013-01-01

    Full Text Available Selecting construction schemes for a building engineering project is a complex multiobjective optimization decision process in which many indexes need to be considered to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses a quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theory, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents the detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing the score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computing of building construction projects.

  5. Characterizing Information Processing With a Mobile Device: Measurement of Simple and Choice Reaction Time.

    Science.gov (United States)

    Burke, Daniel; Linder, Susan; Hirsch, Joshua; Dey, Tanujit; Kana, Daniel; Ringenbach, Shannon; Schindler, David; Alberts, Jay

    2017-10-01

    Information processing is typically evaluated using simple reaction time (SRT) and choice reaction time (CRT) paradigms in which a specific response is initiated following a given stimulus. The measurement of reaction time (RT) has evolved from monitoring the timing of mechanical switches to computerized paradigms. The proliferation of mobile devices with touch screens makes them a natural next technological approach to assessing information processing. The aims of this study were to determine the validity and reliability of using a mobile device (Apple iPad or iTouch) to accurately measure RT. Sixty healthy young adults completed SRT and CRT tasks using a traditional test platform and mobile platforms on two occasions. The SRT was similar across test modalities: 300, 287, and 280 milliseconds (ms) for the traditional, iPad, and iTouch, respectively. The CRT was similar within mobile devices, though slightly faster on the traditional platform: 359, 408, and 384 ms for the traditional, iPad, and iTouch, respectively. Intraclass correlation coefficients ranged from 0.79 to 0.85 for SRT and from 0.75 to 0.83 for CRT. The similarity and reliability of SRT across platforms and the consistency of SRT and CRT across test conditions indicate that mobile devices provide the next generation of assessment platforms for information processing.

  6. Information and Informality

    DEFF Research Database (Denmark)

    Larsson, Magnus; Segerstéen, Solveig; Svensson, Cathrin

    2011-01-01

    leaders on the basis of their possession of reliable knowledge in technical as well as organizational domains. The informal leaders engaged in interpretation and brokering of information and knowledge, as well as in mediating strategic values and priorities on both formal and informal arenas. Informal...... leaders were thus seen to function on the level of the organization as a whole, and in cooperation with formal leaders. Drawing on existing theory of leadership in creative and professional contexts, this cooperation can be specified to concern task structuring. The informal leaders in our study...... contributed to task structuring through sensemaking activities, while formal leaders focused on aspects such as clarifying output expectations, providing feedback, project structure, and diversity....

  7. Prediction of Accurate Mixed Mode Fatigue Crack Growth Curves using the Paris' Law

    Science.gov (United States)

    Sajith, S.; Krishna Murthy, K. S. R.; Robi, P. S.

    2017-12-01

    Accurate information regarding crack growth times and structural strength as a function of crack size is mandatory in damage tolerance analysis. Various equivalent stress intensity factor (SIF) models are available for predicting mixed mode fatigue life using the Paris' law. In the present investigation these models have been compared to assess their efficacy in predicting life close to experimental findings, as there are no guidelines/suggestions available on the selection of these models for accurate and/or conservative predictions of fatigue life. Within the limitations of the available experimental data and current numerical simulation techniques, the present study attempts to outline models that would provide accurate and conservative life predictions.
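
    The equivalent-SIF models compared in the paper are not detailed in this abstract, but the underlying Paris' law integration can be sketched as follows; the geometry factor, material constants and stress range are illustrative assumptions, not values from the study:

```python
import math

def paris_life(a0, ac, C, m, delta_sigma, Y=1.0, n_steps=10000):
    """Cycles to grow a crack from a0 to ac by integrating Paris' law
    da/dN = C * (delta_K)^m, with delta_K = Y * delta_sigma * sqrt(pi * a),
    using the midpoint rule over crack length."""
    da = (ac - a0) / n_steps
    cycles = 0.0
    for i in range(n_steps):
        a = a0 + (i + 0.5) * da                       # midpoint crack length
        delta_k = Y * delta_sigma * math.sqrt(math.pi * a)
        cycles += da / (C * delta_k ** m)             # dN = da / (da/dN)
    return cycles

# Illustrative values only (units: m, MPa, with C in (m/cycle)/(MPa*sqrt(m))^m):
N = paris_life(a0=1e-3, ac=10e-3, C=1e-11, m=3.0, delta_sigma=100.0)
```

    For mixed mode loading, delta_K would be replaced by one of the equivalent-SIF expressions the paper compares; the integration scheme is the same.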

  8. On the use of NDT Data for Reliability-Based Assessment of Existing Timber Structures

    DEFF Research Database (Denmark)

    Sousa, Hélder S.; Sørensen, John Dalsgaard; Kirkegaard, Poul Henning

    2013-01-01

    The objective of this paper is to address the possibilities of using non-destructive testing (NDT) data for updating information and obtaining adequate characterization of the reliability level of existing timber structures and, also, for assessing the evolution in time of performance...... of these structures when exposed to deterioration. By improving the knowledge upon the mechanical properties of timber, better and more substantiated decisions after a reliability safety assessment are aimed at. Bayesian methods are used to update the mechanical properties of timber and reliability assessment......, and information of NDT is also used to calibrate these models. The proposed approach is used for reliability assessment of different structural timber systems. Reliability of the structural system is assessed regarding the failure consequences of individual elements defined as key elements which were determined...

  9. Educating patients to evaluate web-based health care information: the GATOR approach to healthy surfing.

    Science.gov (United States)

    Weber, Bryan A; Derrico, David J; Yoon, Saunjoo L; Sherwill-Navarro, Pamela

    2010-05-01

    Teaching patients to assess web resources effectively has become an important need in primary care. The acronym GATOR (genuine, accurate, trustworthy, origin and readability), an easily memorized strategy for assessing web-based health information, is presented in this paper. Despite the fact that many patients consult the World-Wide Web (or Internet) daily to find information related to health concerns, a lack of experience, knowledge, or education may limit their ability to accurately evaluate health-related sites and the information they contain. Health information on the Web is not subject to regulation, oversight, or mandatory updates, and sites are often transient due to ever-changing budget priorities. This makes it difficult, if not impossible, for patients to develop a list of stable sites containing current, reliable information. This commentary is aimed at improving patients' use of web-based health care information. The GATOR acronym is easy to remember and understand and may assist patients in making knowledgeable decisions as they traverse the sometimes misleading and often overwhelming amount of health information on the Web. It provides a mechanism that can be used to structure frank discussion with patients and assist in health promotion through education. When properly educated about how to find and evaluate web-based health information, patients may avoid the negative consequences of trying unsafe recommendations drawn from untrustworthy sites. They may also be empowered not only to seek more information about their health conditions, treatment and available alternatives, but also to discuss their feelings, ideas and concerns with their healthcare providers.

  10. A reliability simulation language for reliability analysis

    International Nuclear Information System (INIS)

    Deans, N.D.; Miller, A.J.; Mann, D.P.

    1986-01-01

    The results of work being undertaken to develop a Reliability Description Language (RDL) which will enable reliability analysts to describe complex reliability problems in a simple, clear and unambiguous way are described. Component and system features can be stated in a formal manner and subsequently used, along with control statements to form a structured program. The program can be compiled and executed on a general-purpose computer or special-purpose simulator. (DG)

  11. A study of operational and testing reliability in software reliability analysis

    International Nuclear Information System (INIS)

    Yang, B.; Xie, M.

    2000-01-01

    Software reliability is an important aspect of any complex equipment today. Software reliability is usually estimated based on reliability models such as nonhomogeneous Poisson process (NHPP) models. Software systems improve during the testing phase, while they normally do not change in the operational phase. Depending on whether reliability is to be predicted for the testing phase or the operational phase, different measures should be used. In this paper, two different reliability concepts, namely operational reliability and testing reliability, are clarified and studied in detail. These concepts have been mixed up or even misused in some of the existing literature. Using different reliability concepts leads to different estimated reliability values and hence to different reliability-based decisions. The difference between the estimated reliabilities is studied and the effect on the optimal release time is investigated

  12. The fair value of operational reliability

    Energy Technology Data Exchange (ETDEWEB)

    Patino-Echeverri, Dalia; Morel, Benoit

    2005-12-15

    Information about the uncertainties that surround the operation of the power system can be used to enlighten the debate over how much reliability should be pursued and how resources should be allocated to pursue it. In this paper we present a method to determine the value of having flexible generators that can react to load fluctuations. This value can be seen as the value of hedging against uncertainty in the load due to the volatility of demand and the possibility of congestion. Because having this flexibility can be related to a financial option, we use an extension of options theory, in particular the risk-neutral valuation method, to find a risk-neutral quantification of its value. We illustrate our point by valuing the flexibility that leads to ''operational reliability'' in the PJM market. Our formula for that value is what we call ''the fair value'' of operational reliability. (Author)
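
    The paper's own valuation formula is not given in this abstract; as a loose illustration of the risk-neutral valuation idea, flexible capacity can be viewed like a European call on the spot price, here priced with the standard Black-Scholes formula under entirely hypothetical parameters (the authors' actual extension of options theory will differ):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, r, sigma, T):
    """Black-Scholes risk-neutral value of a European call option."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# Hypothetical reading: flexibility as a call on spot price above marginal cost.
value = bs_call(S=50.0, K=45.0, r=0.03, sigma=0.4, T=1.0)
```

    The key property illustrated is that the option value grows with volatility, which mirrors the abstract's claim that flexibility is worth more when load uncertainty is larger.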

  13. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. Here, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed, and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment, the nature of human error, classification of errors in man-machine systems, practical aspects, human reliability modelling in complex situations, quantification and examination of human reliability, judgement-based approaches, holistic techniques and decision analytic approaches. (UK)

  14. A flexible and accurate digital volume correlation method applicable to high-resolution volumetric images

    Science.gov (United States)

    Pan, Bing; Wang, Bo

    2017-10-01

    Digital volume correlation (DVC) is a powerful technique for quantifying interior deformation within solid opaque materials and biological tissues. In the last two decades, great efforts have been made to improve the accuracy and efficiency of the DVC algorithm. However, there is still a lack of a flexible, robust and accurate version that can be efficiently implemented in personal computers with limited RAM. This paper proposes an advanced DVC method that can realize accurate full-field internal deformation measurement applicable to high-resolution volume images with up to billions of voxels. Specifically, a novel layer-wise reliability-guided displacement tracking strategy combined with dynamic data management is presented to guide the DVC computation from slice to slice. The displacements at specified calculation points in each layer are computed using the advanced 3D inverse-compositional Gauss-Newton algorithm with the complete initial guess of the deformation vector accurately predicted from the computed calculation points. Since only limited slices of interest in the reference and deformed volume images rather than the whole volume images are required, the DVC calculation can thus be efficiently implemented on personal computers. The flexibility, accuracy and efficiency of the presented DVC approach are demonstrated by analyzing computer-simulated and experimentally obtained high-resolution volume images.

  15. Effective and accurate approach for modeling of commensurate-incommensurate transition in krypton monolayer on graphite.

    Science.gov (United States)

    Ustinov, E A

    2014-10-07

    The commensurate-incommensurate (C-IC) transition of a krypton molecular layer on graphite has received much attention in recent decades in theoretical and experimental research. However, there still exists a possibility of generalizing the phenomenon from a thermodynamic viewpoint on the basis of accurate molecular simulation. Recently, a new technique was developed for the analysis of two-dimensional (2D) phase transitions in systems involving a crystalline phase, which is based on accounting for the effect of temperature and the chemical potential on the lattice constant of the 2D layer using the Gibbs-Duhem equation [E. A. Ustinov, J. Chem. Phys. 140, 074706 (2014)]. The technique has allowed for the determination of phase diagrams of 2D argon layers on the uniform surface and in slit pores. This paper extends the developed methodology to systems accounting for the periodic modulation of the substrate potential. The main advantage of the developed approach is that it provides a highly accurate evaluation of the chemical potential of crystalline layers, which allows reliable determination of the temperature and other parameters of various 2D phase transitions. The applicability of the methodology is demonstrated on the krypton-graphite system. Analysis of the phase diagram of the krypton molecular layer, the thermodynamic functions of coexisting phases, and a method for predicting adsorption isotherms are considered, accounting for compression of the graphite due to the krypton-carbon interaction. The temperature and heat of the C-IC transition have been reliably determined for the gas-solid and solid-solid systems.

  16. Effective and accurate approach for modeling of commensurate–incommensurate transition in krypton monolayer on graphite

    International Nuclear Information System (INIS)

    Ustinov, E. A.

    2014-01-01

    The commensurate–incommensurate (C-IC) transition of a krypton molecular layer on graphite has received much attention in recent decades in theoretical and experimental research. However, there still exists a possibility of generalizing the phenomenon from a thermodynamic viewpoint on the basis of accurate molecular simulation. Recently, a new technique was developed for the analysis of two-dimensional (2D) phase transitions in systems involving a crystalline phase, which is based on accounting for the effect of temperature and the chemical potential on the lattice constant of the 2D layer using the Gibbs–Duhem equation [E. A. Ustinov, J. Chem. Phys. 140, 074706 (2014)]. The technique has allowed for the determination of phase diagrams of 2D argon layers on the uniform surface and in slit pores. This paper extends the developed methodology to systems accounting for the periodic modulation of the substrate potential. The main advantage of the developed approach is that it provides a highly accurate evaluation of the chemical potential of crystalline layers, which allows reliable determination of the temperature and other parameters of various 2D phase transitions. The applicability of the methodology is demonstrated on the krypton–graphite system. Analysis of the phase diagram of the krypton molecular layer, the thermodynamic functions of coexisting phases, and a method for predicting adsorption isotherms are considered, accounting for compression of the graphite due to the krypton–carbon interaction. The temperature and heat of the C-IC transition have been reliably determined for the gas–solid and solid–solid systems

  17. Reliability assessment of passive containment isolation system using APSRA methodology

    International Nuclear Information System (INIS)

    Nayak, A.K.; Jain, Vikas; Gartia, M.R.; Srivastava, A.; Prasad, Hari; Anthony, A.; Gaikwad, A.J.; Bhatia, S.; Sinha, R.K.

    2008-01-01

    In this paper, a methodology known as APSRA (Assessment of Passive System ReliAbility) has been employed for the evaluation of the reliability of passive systems. The methodology has been applied to the passive containment isolation system (PCIS) of the Indian advanced heavy water reactor (AHWR). In the APSRA methodology, the passive system reliability evaluation is based on the probability of the system failing to carry out the desired function. The methodology first determines the operational characteristics of the system and the failure conditions by assigning a predetermined failure criterion. The failure surface is predicted using a best estimate code considering deviations of the operating parameters from their nominal states, which affect the PCIS performance. APSRA proposes to compare the code predictions with test data to generate the uncertainties in the failure parameter prediction, which are later considered in the code for accurate prediction of the failure surface of the system. Once the failure surface of the system is predicted, the cause of failure is examined through root diagnosis; failure occurs mainly due to the failure of mechanical components. The failure probability of these components is evaluated through a classical PSA treatment using generic data. The reliability of the PCIS is evaluated from the probability of availability of the components required for the success of the passive containment isolation system

  18. DOE Human Reliability Program Removals Report 2004-2006

    International Nuclear Information System (INIS)

    Center for Human Reliability Studies

    2007-01-01

    This report presents results of the comprehensive data analysis and assessment of all U.S. Department of Energy (DOE) and National Nuclear Security Administration (NNSA) facilities that have positions requiring workers to be certified in the Human Reliability Program (HRP). Those facilities include: Albuquerque, Amarillo, DOE Headquarters, Hanford, Idaho, Nevada, Oak Ridge, Oakland, and Savannah River. The HRP was established to ensure, through continuous review and evaluation, the reliability of individuals who have access to the DOE's most sensitive facilities, materials, and information

  19. Interrater reliability and accuracy of clinicians and trained research assistants performing prospective data collection in emergency department patients with potential acute coronary syndrome.

    Science.gov (United States)

    Cruz, Carlos O; Meshberg, Emily B; Shofer, Frances S; McCusker, Christine M; Chang, Anna Marie; Hollander, Judd E

    2009-07-01

    Clinical research requires high-quality data collection. Data collected at the emergency department evaluation are generally considered more precise than data collected through chart abstraction, but such collection is cumbersome and time-consuming. We test whether trained research assistants without a medical background can obtain clinical research data as accurately as physicians. We hypothesize that they would be at least as accurate because they would not be distracted by clinical requirements. We conducted a prospective comparative study of 33 trained research assistants and 39 physicians (35 residents) to assess interrater reliability with respect to guideline-recommended clinical research data. Immediately after the research assistant and clinician evaluations, discordant responses were resolved by a tiebreaker third person who asked the patient to choose one of the two answers as the correct one. Crude percentage agreement and interrater reliability were assessed (kappa statistic). One hundred forty-three patients were recruited (mean age 50.7 years; 47% female patients). Overall, the median agreement was 81% (interquartile range [IQR] 73% to 92%) and interrater reliability was fair (kappa value 0.36 [IQR 0.26 to 0.52]) but varied across categories of data: cardiac risk factors (median 86% [IQR 81% to 93%]; median 0.69 [IQR 0.62 to 0.83]), other cardiac history (median 93% [IQR 79% to 95%]; median 0.56 [IQR 0.29 to 0.77]), pain location (median 92% [IQR 86% to 94%]; median 0.37 [IQR 0.25 to 0.29]), radiation (median 86% [IQR 85% to 87%]; median 0.37 [IQR 0.26 to 0.42]), quality (median 85% [IQR 75% to 94%]; median 0.29 [IQR 0.23 to 0.40]), and associated symptoms (median 74% [IQR 65% to 78%]; median 0.28 [IQR 0.20 to 0.40]). When discordant information was obtained, the research assistant was more often correct (median 64% [IQR 53% to 72%]). The relatively fair interrater reliability observed in our study is consistent with previous studies evaluating
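
    The kappa statistic reported above can be computed for two raters as kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected by chance. A minimal sketch with invented yes/no chart items (not the study's data):

```python
def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same items:
    kappa = (p_o - p_e) / (1 - p_e)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    labels = set(rater_a) | set(rater_b)
    # Observed agreement: fraction of items where both raters match.
    p_o = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal label frequencies.
    p_e = sum((rater_a.count(c) / n) * (rater_b.count(c) / n) for c in labels)
    if p_e == 1.0:
        return 1.0
    return (p_o - p_e) / (1.0 - p_e)

# Hypothetical yes/no items scored by a clinician and a research assistant:
clinician = ["y", "y", "n", "y", "n", "n", "y", "n"]
assistant = ["y", "n", "n", "y", "n", "y", "y", "n"]
kappa = cohen_kappa(clinician, assistant)
```

    Here the raw agreement is 6/8 = 0.75 but chance agreement is 0.50, so kappa = 0.50, illustrating why the study reports kappa alongside crude percentage agreement.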

  20. Contaminants in landfill soils - Reliability of prefeasibility studies.

    Science.gov (United States)

    Hölzle, Ingo

    2017-05-01

    Recent landfill mining studies have researched the potential for resource recovery using samples from core drilling or grab cranes. However, most studies used small sample numbers, which may not represent the heterogeneous landfill composition. As a consequence, there is a high risk of an incorrect economic and/or ecological evaluation. The main objective of this work is to investigate the possibilities and limitations of preliminary investigations concerning the crucial soil composition. The preliminary samples of landfill investigations were compared to the excavation samples from three completely excavated landfills in Germany. In addition, the research compared the reliability of prediction of the two investigation methods, core drilling and grab crane. Sampling using a grab crane led to better results, even for smaller investigations of 10 samples. Analyses of both methods showed sufficiently accurate results to make predictions (standard error 5%, level of confidence 95%) for most heavy metals, cyanide and PAH in the dry substance and for sulphate, barium, benzo[a]pyrene, pH and the electrical conductivity in leachate analyses of soil-type waste. While chromium and nickel showed less accurate results, the concentrations of hydrocarbons, TOC, DOC, PCB and fluorine (leachate) were not predictable even for sample numbers of up to 59. Overestimations of pollutant concentrations were more frequent in drilling, and underestimations when using a grab crane. The dispersion of the element and elemental composition had no direct impact on the reliability of prediction. Thus, individual consideration of the particular element or elemental composition for dry substance and leachate analyses is recommended in order to adapt the sampling strategy and calculate an optimum sample number. Copyright © 2016 Elsevier Ltd. All rights reserved.
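
    The closing recommendation to calculate an optimum sample number can be sketched with the usual margin-of-error formula for estimating a mean, n = (z * s / E)^2 with E the allowed half-width; the concentration statistics below are invented for illustration and z = 1.96 matches the abstract's 95% confidence level:

```python
import math

def required_sample_size(std_dev, mean, rel_error=0.05, z=1.96):
    """Samples needed so the confidence-interval half-width for the mean
    is rel_error * mean: n = ceil((z * s / (rel_error * mean))^2)."""
    e = rel_error * mean
    return math.ceil((z * std_dev / e) ** 2)

# Hypothetical heavy-metal concentrations from a preliminary campaign [mg/kg]:
mean, std_dev = 120.0, 18.0
n = required_sample_size(std_dev, mean)   # samples for a 5% relative error
```

    This is the standard normal-approximation formula, not the paper's own procedure; a planner would recompute s per analyte, which is why highly dispersed parameters such as hydrocarbons demand far larger (sometimes impractical) sample numbers.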