WorldWideScience

Sample records for reliable prior information

  1. On Bayesian reliability analysis with informative priors and censoring

    International Nuclear Information System (INIS)

    Coolen, F.P.A.

    1996-01-01

    In the statistical literature many methods have been presented to deal with censored observations, within both the Bayesian and non-Bayesian frameworks, and such methods have been successfully applied to, e.g., reliability problems. In reliability theory it is also often emphasized that, owing to a shortage of statistical data and limited possibilities for experiments, one often needs to rely heavily on the judgements of engineers or other experts, for which Bayesian methods are attractive. It is therefore important that such judgements can be elicited easily to provide informative prior distributions that reflect the knowledge of the engineers well. In this paper we focus on this aspect, especially on the situation where the judgements of the consulted engineers are based on experience in environments where censoring has also been present previously. We suggest using the attractive interpretation of the hyperparameters of conjugate prior distributions when these are available for assumed parametric lifetime models, and we show how one may go beyond the standard conjugate priors, using similar interpretations of the hyperparameters, to enable easier elicitation when censoring has been present in the past. This may even provide more flexibility for modelling prior knowledge than the standard conjugate priors, whereas the disadvantage of the more complicated calculations that may be needed to determine posterior distributions plays a minor role thanks to the advanced mathematical and statistical software that is now widely available.
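
    A minimal sketch of the conjugate update behind this interpretation, assuming exponential lifetimes with a gamma prior on the failure rate: the shape acts as a pseudo-count of failures and the rate as pseudo time on test, so right-censored units contribute exposure but no failure count. All numbers below are hypothetical.

    import numpy as np
    from scipy import stats

    # Hypothetical elicited prior: judgement equivalent to having seen 2 failures
    # over 1000 hours of (possibly censored) operation.
    a0, b0 = 2.0, 1000.0                              # shape = pseudo-failures, rate = pseudo-hours

    failure_times = np.array([350.0, 720.0, 1100.0])  # observed failure times (hours)
    censor_times = np.array([1500.0, 1500.0])         # right-censored run times (hours)

    # Conjugate update: censored units add exposure time but no failure count.
    a_post = a0 + len(failure_times)
    b_post = b0 + failure_times.sum() + censor_times.sum()

    post = stats.gamma(a=a_post, scale=1.0 / b_post)
    print(f"posterior mean failure rate: {post.mean():.2e} per hour")
    print(f"95% credible interval: [{post.ppf(0.025):.2e}, {post.ppf(0.975):.2e}]")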

  2. Incorporation of local dependent reliability information into the Prior Image Constrained Compressed Sensing (PICCS) reconstruction algorithm

    International Nuclear Information System (INIS)

    Vaegler, Sven; Sauer, Otto; Stsepankou, Dzmitry; Hesser, Juergen

    2015-01-01

    The reduction of dose in cone beam computer tomography (CBCT) arises from the decrease of the tube current for each projection as well as from the reduction of the number of projections. In order to maintain good image quality, sophisticated image reconstruction techniques are required. Prior Image Constrained Compressed Sensing (PICCS) incorporates prior images into the reconstruction algorithm and outperforms the widely used Feldkamp-Davis-Kress (FDK) algorithm when the number of projections is reduced. However, prior images that contain major variations have so far not been appropriately considered in PICCS. We therefore propose the partial-PICCS (pPICCS) algorithm. This framework is a problem-specific extension of PICCS that additionally enables the incorporation of the reliability of the prior images. We assumed that the prior images are composed of areas with large and small deviations. Accordingly, a weighting matrix accounts for the assigned areas in the objective function. We applied our algorithm to the problem of image reconstruction from few views, using simulations with a computer phantom as well as clinical CBCT projections from a head-and-neck case. All prior images contained large local variations. The reconstructed images were compared to the reconstruction results of the FDK algorithm, of Compressed Sensing (CS) and of PICCS. To show the gain in image quality we compared image details with the reference image and used quantitative metrics (root-mean-square error (RMSE), contrast-to-noise ratio (CNR)). The pPICCS reconstruction framework yields images with substantially improved quality even when the number of projections is very small. The images contained less streaking, blurring and fewer inaccurately reconstructed structures compared to the images reconstructed by FDK, CS and conventional PICCS. The increased image quality is also reflected in large RMSE differences. We thus proposed a modification of the original PICCS algorithm, the pPICCS algorithm.

  3. Formulating informative, data-based priors for failure probability estimation in reliability analysis

    International Nuclear Information System (INIS)

    Guikema, Seth D.

    2007-01-01

    Priors play an important role in the use of Bayesian methods in risk analysis, and using all available information to formulate an informative prior can lead to more accurate posterior inferences. This paper examines the practical implications of using five different methods for formulating an informative prior for a failure probability based on past data. These methods are the method of moments, maximum likelihood (ML) estimation, maximum entropy estimation, starting from a non-informative 'pre-prior', and fitting a prior based on confidence/credible interval matching. The priors resulting from the use of these different methods are compared qualitatively, and the posteriors are compared quantitatively based on a number of different scenarios of observed data used to update the priors. The results show that the amount of information assumed in the prior makes a critical difference in the accuracy of the posterior inferences. For situations in which the data used to formulate the informative prior are an accurate reflection of the data that are later observed, the ML approach yields the minimum variance posterior. However, the maximum entropy approach is more robust to differences between the data used to formulate the prior and the observed data, because it maximizes the uncertainty in the prior subject to the constraints imposed by the past data.
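
    As one concrete instance of the method-of-moments approach (a sketch with hypothetical numbers; scipy/numpy assumed for illustration), a beta prior can be fitted to past failure proportions and then updated with newly observed data:

    import numpy as np
    from scipy import stats

    # Hypothetical past data: failure proportions seen on similar components.
    p_hat = np.array([0.02, 0.05, 0.03, 0.04, 0.06])
    m, v = p_hat.mean(), p_hat.var(ddof=1)

    # Method of moments for Beta(a, b): mean = a/(a+b), var = mean*(1-mean)/(a+b+1).
    s = m * (1.0 - m) / v - 1.0          # implied total a + b
    a0, b0 = m * s, (1.0 - m) * s

    # Conjugate update with newly observed data: x failures in n demands.
    x, n = 1, 40
    posterior = stats.beta(a0 + x, b0 + n - x)
    print(f"prior Beta({a0:.2f}, {b0:.2f}) -> posterior mean {posterior.mean():.4f}")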

  4. Prior information in structure estimation

    Czech Academy of Sciences Publication Activity Database

    Kárný, Miroslav; Nedoma, Petr; Khailova, Natalia; Pavelková, Lenka

    2003-01-01

    Vol. 150, No. 6 (2003), pp. 643-653. ISSN 1350-2379. R&D Projects: GA AV ČR IBS1075102; GA AV ČR IBS1075351; GA ČR GA102/03/0049. Institutional research plan: CEZ:AV0Z1075907. Keywords: prior knowledge * structure estimation * autoregressive models. Subject RIV: BC - Control Systems Theory. Impact factor: 0.745, year: 2003. http://library.utia.cas.cz/separaty/historie/karny-0411258.pdf

  5. Bayesian nonparametric system reliability using sets of priors

    NARCIS (Netherlands)

    Walter, G.M.; Aslett, L.J.M.; Coolen, F.P.A.

    2016-01-01

    An imprecise Bayesian nonparametric approach to system reliability with multiple types of components is developed. This allows modelling partial or imperfect prior knowledge on component failure distributions in a flexible way through bounds on the functioning probability. Given component level test

  6. Designing reliability information flows

    International Nuclear Information System (INIS)

    Petkova, Valia T.; Lu Yuan; Ion, Roxana A.; Sander, Peter C.

    2005-01-01

    It is well known [Reliab. Eng. Syst. Saf. 75 (2002) 295] that in modern development processes it is essential to have an information flow structure that facilitates fast feedback from product users (customers) to departments at the front end, in particular development and production. As information is only relevant if it is used when taking decisions, this paper presents a guideline for building field feedback information flows that facilitate decision taking during the product creation and realisation process. The guideline takes into consideration that the type of decision depends on the span of control; therefore, following Parsons [Structure and Process in Modern Societies (1990)], the span of control is subdivided into three levels: strategic, tactical, and executive. The guideline is illustrated with a case in which it is used for analysing the quality of existing field feedback flows.

  7. Varying prior information in Bayesian inversion

    International Nuclear Information System (INIS)

    Walker, Matthew; Curtis, Andrew

    2014-01-01

    Bayes' rule is used to combine likelihood and prior probability distributions. The former represents knowledge derived from new data, the latter represents pre-existing knowledge; the Bayesian combination is the so-called posterior distribution, representing the resultant new state of knowledge. While varying the likelihood due to differing data observations is common, there are also situations where the prior distribution must be changed or replaced repeatedly. For example, in mixture density neural network (MDN) inversion, using current methods the neural network employed for inversion needs to be retrained every time prior information changes. We develop a method of prior replacement to vary the prior without re-training the network. Thus the efficiency of MDN inversions can be increased, typically by orders of magnitude when applied to geophysical problems. We demonstrate this for the inversion of seismic attributes in a synthetic subsurface geological reservoir model. We also present results which suggest that prior replacement can be used to control the statistical properties (such as variance) of the final estimate of the posterior in more general (e.g., Monte Carlo based) inverse problem solutions. (paper)
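
    The prior-replacement idea has a simple Monte Carlo analogue: posterior draws obtained under an old prior can be reweighted by the ratio of the new prior to the old, without re-running the inference. The sketch below illustrates this for assumed Gaussian priors; it is a generic importance-reweighting demonstration, not the authors' MDN implementation.

    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in for posterior draws of a model parameter obtained under an
    # old Gaussian prior N(0, 1) (in practice: MDN or MCMC output).
    samples = rng.normal(0.5, 0.8, size=20_000)

    def log_normal_pdf(x, mu, sigma):
        return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)

    # Replace the old prior N(0, 1) with a new prior N(1, 0.5): weight each
    # draw by new_prior / old_prior, then renormalize.
    log_w = log_normal_pdf(samples, 1.0, 0.5) - log_normal_pdf(samples, 0.0, 1.0)
    w = np.exp(log_w - log_w.max())
    w /= w.sum()

    new_mean = np.sum(w * samples)
    new_sd = np.sqrt(np.sum(w * (samples - new_mean) ** 2))
    print(f"reweighted posterior: mean {new_mean:.3f}, sd {new_sd:.3f}")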

  8. Random template placement and prior information

    International Nuclear Information System (INIS)

    Roever, Christian

    2010-01-01

    In signal detection problems, one is usually faced with the task of searching a parameter space for peaks in the likelihood function which indicate the presence of a signal. Random searches have proven to be very efficient as well as easy to implement, compared, e.g., to searches along regular grids in parameter space. Knowledge of the parameterised shape of the signal searched for adds structure to the parameter space, i.e., there are usually regions that require a dense search while in other regions a coarser search is sufficient. On the other hand, prior information identifies the regions in which a search will actually be promising or is likely to be in vain. Defining specific figures of merit allows one to combine both template metric and prior distribution and devise optimal sampling schemes over the parameter space. We show an example related to the gravitational wave signal from a binary inspiral event. Here the template metric and prior information are particularly contradictory, since signals from low-mass systems tolerate the least mismatch in parameter space while high-mass systems are far more likely, as they imply a greater signal-to-noise ratio (SNR) and hence are detectable to greater distances. The derived sampling strategy is implemented in a Markov chain Monte Carlo (MCMC) algorithm, where it improves convergence.

  9. Gamma prior distribution selection for Bayesian analysis of failure rate and reliability

    International Nuclear Information System (INIS)

    Waller, R.A.; Johnson, M.M.; Waterman, M.S.; Martz, H.F. Jr.

    1977-01-01

    It is assumed that the phenomenon under study is such that the time-to-failure may be modeled by an exponential distribution with failure-rate parameter, lambda. For Bayesian analyses of the assumed model, the family of gamma distributions provides conjugate prior models for lambda. Thus, an experimenter needs to select a particular gamma model to conduct a Bayesian reliability analysis. The purpose of this paper is to present a methodology which can be used to translate engineering information, experience, and judgment into a choice of a gamma prior distribution. The proposed methodology assumes that the practicing engineer can provide percentile data relating to either the failure rate or the reliability of the phenomenon being investigated. For example, the methodology will select the gamma prior distribution which conveys an engineer's belief that the failure rate, lambda, simultaneously satisfies the probability statements, P(lambda less than 1.0 x 10^-3) = 0.50 and P(lambda less than 1.0 x 10^-5) = 0.05. That is, two percentiles provided by an engineer are used to determine a gamma prior model which agrees with the specified percentiles. For those engineers who prefer to specify reliability percentiles rather than the failure-rate percentiles illustrated above, one can use the induced negative-log gamma prior distribution which satisfies the probability statements, P(R(t_0) less than 0.99) = 0.50 and P(R(t_0) less than 0.99999) = 0.95, for some operating time t_0. Also, the paper includes graphs for selected percentiles which assist an engineer in applying the methodology.
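
    The percentile-matching step can be reproduced numerically: solve for the gamma shape and rate whose CDF passes through the two elicited points. A minimal sketch using scipy (log-parameterized to keep both parameters positive):

    import numpy as np
    from scipy import optimize, stats

    # Elicited percentiles from the abstract:
    #   P(lambda < 1.0e-3) = 0.50 and P(lambda < 1.0e-5) = 0.05.
    def mismatch(params):
        a, b = np.exp(params)                 # shape, rate (kept positive via log)
        g = stats.gamma(a=a, scale=1.0 / b)
        return [g.cdf(1.0e-3) - 0.50, g.cdf(1.0e-5) - 0.05]

    sol = optimize.root(mismatch, x0=[0.0, np.log(1.0e3)])
    a, b = np.exp(sol.x)
    print(f"gamma prior: shape {a:.3f}, rate {b:.3e}")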

  10. Gamma prior distribution selection for Bayesian analysis of failure rate and reliability

    International Nuclear Information System (INIS)

    Waller, R.A.; Johnson, M.M.; Waterman, M.S.; Martz, H.F. Jr.

    1976-07-01

    It is assumed that the phenomenon under study is such that the time-to-failure may be modeled by an exponential distribution with failure rate lambda. For Bayesian analyses of the assumed model, the family of gamma distributions provides conjugate prior models for lambda. Thus, an experimenter needs to select a particular gamma model to conduct a Bayesian reliability analysis. The purpose of this report is to present a methodology that can be used to translate engineering information, experience, and judgment into a choice of a gamma prior distribution. The proposed methodology assumes that the practicing engineer can provide percentile data relating to either the failure rate or the reliability of the phenomenon being investigated. For example, the methodology will select the gamma prior distribution which conveys an engineer's belief that the failure rate lambda simultaneously satisfies the probability statements, P(lambda less than 1.0 x 10^-3) equals 0.50 and P(lambda less than 1.0 x 10^-5) equals 0.05. That is, two percentiles provided by an engineer are used to determine a gamma prior model which agrees with the specified percentiles. For those engineers who prefer to specify reliability percentiles rather than the failure rate percentiles illustrated above, it is possible to use the induced negative-log gamma prior distribution which satisfies the probability statements, P(R(t_0) less than 0.99) equals 0.50 and P(R(t_0) less than 0.99999) equals 0.95, for some operating time t_0. The report also includes graphs for selected percentiles which assist an engineer in applying the procedure. 28 figures, 16 tables

  11. Application of Bayesian Decision Theory Based on Prior Information in the Multi-Objective Optimization Problem

    Directory of Open Access Journals (Sweden)

    Xia Lei

    2010-12-01

    With general multi-objective optimization methods it is hard to obtain prior information, and how to utilize prior information has been a challenge. This paper analyzes the characteristics of Bayesian decision making based on the maximum entropy principle and prior information, in particular how to effectively improve decision-making reliability when reference samples are deficient. The paper demonstrates the effectiveness of the proposed method on the real application of multi-frequency offset estimation in a distributed multiple-input multiple-output system. The simulation results demonstrate that Bayesian decision making based on prior information has better global searching capability when sampling data are deficient.

  12. Estimating security betas using prior information based on firm fundamentals

    NARCIS (Netherlands)

    Cosemans, M.; Frehen, R.; Schotman, P.C.; Bauer, R.

    2010-01-01

    This paper proposes a novel approach for estimating time-varying betas of individual stocks that incorporates prior information based on fundamentals. We shrink the rolling window estimate of beta towards a firm-specific prior that is motivated by asset pricing theory. The prior captures structural

  13. Reliable Dual Tensor Model Estimation in Single and Crossing Fibers Based on Jeffreys Prior

    Science.gov (United States)

    Yang, Jianfei; Poot, Dirk H. J.; Caan, Matthan W. A.; Su, Tanja; Majoie, Charles B. L. M.; van Vliet, Lucas J.; Vos, Frans M.

    2016-01-01

    Purpose: This paper presents and studies a framework for reliable modeling of diffusion MRI using a data-acquisition adaptive prior. Methods: Automated relevance determination estimates the mean of the posterior distribution of a rank-2 dual tensor model exploiting Jeffreys prior (JARD). This data-acquisition prior is based on the Fisher information matrix and enables the assessment of whether two tensors are mandatory to describe the data. The method is compared to Maximum Likelihood Estimation (MLE) of the dual tensor model and to FSL’s ball-and-stick approach. Results: Monte Carlo experiments demonstrated that JARD’s volume fractions correlated well with the ground truth for single and crossing fiber configurations. In single fiber configurations JARD automatically reduced the volume fraction of one compartment to (almost) zero. The variance in fractional anisotropy (FA) of the main tensor component was thereby reduced compared to MLE. JARD and MLE gave a comparable outcome in data simulating crossing fibers. On brain data, JARD yielded a smaller spread in FA along the corpus callosum compared to MLE. Tract-based spatial statistics demonstrated a higher sensitivity in detecting age-related white matter atrophy using JARD compared to both MLE and the ball-and-stick approach. Conclusions: The proposed framework offers accurate and precise estimation of diffusion properties in single and dual fiber regions. PMID:27760166

  14. Testability evaluation using prior information of multiple sources

    Directory of Open Access Journals (Sweden)

    Wang Chao

    2014-08-01

    Testability plays an important role in improving the readiness and decreasing the life-cycle cost of equipment. Testability demonstration and evaluation is of significance in measuring such testability indexes as the fault detection rate (FDR) and fault isolation rate (FIR), which is useful to the producer in mastering the testability level and improving the testability design, and helpful to the consumer in making purchase decisions. Aiming at the problems of a small sample of testability demonstration test data (TDTD), such as low evaluation confidence and inaccurate results, a testability evaluation method is proposed based on the prior information of multiple sources and Bayes theory. Firstly, the types of prior information are analyzed. The maximum entropy method is applied to prior information in the mean and interval estimate forms on the testability index to obtain the parameters of the prior probability density function (PDF), and the empirical Bayesian method is used to get the parameters for prior information in a success-fail form. Then, a parametrical data consistency check method is used to check the compatibility between all the sources of prior information and the TDTD. For the prior information that passes the check, the prior credibility is calculated. A mixed prior distribution is formed from the prior PDFs and the corresponding credibilities. The Bayesian posterior distribution model is acquired with the mixed prior distribution and TDTD, based on which the point and interval estimates are calculated. Finally, examples of a flight control system are used to verify the proposed method. The results show that the proposed method is feasible and effective.

  15. Testability evaluation using prior information of multiple sources

    Institute of Scientific and Technical Information of China (English)

    Wang Chao; Qiu Jing; Liu Guanjun; Zhang Yong

    2014-01-01

    Testability plays an important role in improving the readiness and decreasing the life-cycle cost of equipment. Testability demonstration and evaluation is of significance in measuring such testability indexes as the fault detection rate (FDR) and fault isolation rate (FIR), which is useful to the producer in mastering the testability level and improving the testability design, and helpful to the consumer in making purchase decisions. Aiming at the problems of a small sample of testability demonstration test data (TDTD), such as low evaluation confidence and inaccurate results, a testability evaluation method is proposed based on the prior information of multiple sources and Bayes theory. Firstly, the types of prior information are analyzed. The maximum entropy method is applied to prior information in the mean and interval estimate forms on the testability index to obtain the parameters of the prior probability density function (PDF), and the empirical Bayesian method is used to get the parameters for prior information in a success-fail form. Then, a parametrical data consistency check method is used to check the compatibility between all the sources of prior information and the TDTD. For the prior information that passes the check, the prior credibility is calculated. A mixed prior distribution is formed from the prior PDFs and the corresponding credibilities. The Bayesian posterior distribution model is acquired with the mixed prior distribution and TDTD, based on which the point and interval estimates are calculated. Finally, examples of a flight control system are used to verify the proposed method. The results show that the proposed method is feasible and effective.
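
    A minimal sketch of the mixed-prior update both records describe, with hypothetical numbers: each source's prior is assumed already cast in beta form, the credibilities weight the components, and a demonstration test updates the mixture, with the component weights re-balanced by each component's beta-binomial marginal likelihood.

    import numpy as np
    from scipy.special import betaln

    # Hypothetical component priors for a fault detection rate, each already
    # cast in beta form, with credibilities from the consistency check.
    prior_params = [(8.0, 2.0), (18.0, 2.0)]
    credibility = np.array([0.4, 0.6])

    s, n = 17, 20                      # demonstration test: s detections in n faults

    post_params, log_ml = [], []
    for a, b in prior_params:
        post_params.append((a + s, b + n - s))
        # Beta-binomial marginal likelihood, up to a constant shared by components.
        log_ml.append(betaln(a + s, b + n - s) - betaln(a, b))

    w = credibility * np.exp(np.array(log_ml) - max(log_ml))
    w /= w.sum()
    post_mean = sum(wi * a / (a + b) for wi, (a, b) in zip(w, post_params))
    print(f"posterior component weights {w}, posterior mean FDR {post_mean:.3f}")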

  16. Crowdsourcing prior information to improve study design and data analysis.

    Directory of Open Access Journals (Sweden)

    Jeffrey S Chrabaszcz

    Though Bayesian methods are being used more frequently, many still struggle with the best method for setting priors with novel measures or task environments. We propose a method for setting priors by eliciting continuous probability distributions from naive participants. This allows us to include any relevant information participants have for a given effect. Even when prior means are near zero, this method provides a principled way to estimate dispersion and produce shrinkage, reducing the occurrence of overestimated effect sizes. We demonstrate this method with a number of published studies and compare the effect of different prior estimation and aggregation methods.
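
    One simple aggregation rule consistent with this idea (the paper compares several; the numbers and the equal-weight linear opinion pool below are assumptions) is to average the elicited densities and summarize the mixture by its moments:

    import numpy as np
    from scipy import stats

    # Hypothetical elicited beliefs about an effect size: one (mean, sd) pair
    # per participant, each treated as a normal density.
    elicited = [(-0.1, 0.6), (0.2, 0.4), (0.0, 0.5), (0.3, 0.8)]
    means = np.array([m for m, _ in elicited])
    sds = np.array([s for _, s in elicited])

    # Equal-weight linear opinion pool, summarized by its first two moments.
    pool_mean = means.mean()
    pool_var = np.mean(sds**2 + means**2) - pool_mean**2   # mixture variance

    prior = stats.norm(pool_mean, np.sqrt(pool_var))
    print(f"pooled prior: N({pool_mean:.3f}, {pool_var:.3f})")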

  17. Analysis of information security reliability: A tutorial

    International Nuclear Information System (INIS)

    Kondakci, Suleyman

    2015-01-01

    This article presents a concise reliability analysis of network security abstracted from stochastic modeling, reliability, and queuing theories. Network security analysis is composed of threats, their impacts, and recovery of the failed systems. A unique framework with a collection of the key reliability models is presented here to guide the determination of system reliability based on the strength of malicious acts and the performance of the recovery processes. A unique model, called the Attack-obstacle model, is also proposed here for analyzing systems with immunity growth features. Most computer science curricula do not contain courses in reliability modeling applicable to the different areas of computer engineering. Hence, the topic of reliability analysis is often too diffuse for most computer engineers and researchers dealing with network security. This work is thus aimed at shedding some light on this issue, which can be useful in identifying models, their assumptions and practical parameters for estimating the reliability of threatened systems and for assessing the performance of recovery facilities. It can also be useful for the classification of processes and states regarding the reliability of information systems. Systems with stochastic behaviors undergoing queue operations and random state transitions can also benefit from the approaches presented here. - Highlights: • A concise survey and tutorial in model-based reliability analysis applicable to information security. • A framework of key modeling approaches for assessing the reliability of networked systems. • The framework facilitates quantitative risk assessment tasks guided by stochastic modeling and queuing theory. • Evaluation of approaches and models for modeling threats, failures, impacts, and recovery analysis of information systems.

  18. Calculation of noninformative prior of reliability parameter and initiating event frequency with Jeffreys method

    International Nuclear Information System (INIS)

    He Jie; Zhang Binbin

    2013-01-01

    In the probabilistic safety assessment (PSA) of nuclear power plants, there are few historical records on some initiating event frequencies or component failures in industry. In order to determine the noninformative priors of such reliability parameters and initiating event frequencies, the Jeffreys method in Bayesian statistics was employed. The mathematical mechanism of the Jeffreys prior and the simplified constrained noninformative distribution (SCNID) were elaborated in this paper. The Jeffreys noninformative formulas and the credible intervals of the Gamma-Poisson and Beta-Binomial models were introduced. As an example, the small break loss-of-coolant accident (SLOCA) was employed to show the application of the Jeffreys prior in determining an initiating event frequency. The result shows that the Jeffreys method is an effective method for noninformative prior calculation. (authors)
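
    For the Gamma-Poisson case mentioned above, the Jeffreys prior is proportional to lambda^(-1/2), i.e. an improper Gamma(0.5, 0), so after observing n events in exposure time T the posterior is Gamma(n + 0.5, T). A short worked example with hypothetical numbers:

    from scipy import stats

    # Jeffreys prior for a Poisson rate: posterior after n events in exposure T
    # is Gamma(n + 0.5, T).
    n_events, T = 1, 5000.0            # hypothetical: one initiator in 5000 reactor-years

    posterior = stats.gamma(a=n_events + 0.5, scale=1.0 / T)
    print(f"posterior mean frequency: {posterior.mean():.2e} per year")
    print(f"90% credible interval: [{posterior.ppf(0.05):.2e}, {posterior.ppf(0.95):.2e}]")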

  19. Finding A Minimally Informative Dirichlet Prior Using Least Squares

    International Nuclear Information System (INIS)

    Kelly, Dana

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in the form of a standard distribution (e.g., beta, gamma), and so a beta distribution is used as an approximation in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.
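
    The exact least-squares objective is specific to the paper; the sketch below captures only the general shape of the approach under stated assumptions: fix the Dirichlet mean at the target alpha-factor means, and choose the total concentration by least squares against hypothetical elicited upper percentiles of each marginal.

    import numpy as np
    from scipy import optimize, stats

    # Hypothetical target means for three alpha-factors, and assumed upper
    # (95th) percentiles for each marginal; the paper's actual objective differs.
    means = np.array([0.95, 0.04, 0.01])
    target_p95 = np.array([0.995, 0.12, 0.05])

    def objective(log_n):
        n = np.exp(log_n[0])           # total Dirichlet concentration
        alphas = n * means             # fixes the Dirichlet mean at `means`
        # Marginal of component i of Dirichlet(alphas) is Beta(alpha_i, n - alpha_i).
        p95 = np.array([stats.beta(a, n - a).ppf(0.95) for a in alphas])
        return np.sum((p95 - target_p95) ** 2)

    res = optimize.minimize(objective, x0=[np.log(10.0)], method="Nelder-Mead")
    n = np.exp(res.x[0])
    print(f"concentration {n:.2f}, Dirichlet parameters {n * means}")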

  20. Finding a minimally informative Dirichlet prior distribution using least squares

    International Nuclear Information System (INIS)

    Kelly, Dana; Atwood, Corwin

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in the form of a standard distribution (e.g., beta, gamma), and so a beta distribution is used as an approximation in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.

  1. Finding a Minimally Informative Dirichlet Prior Distribution Using Least Squares

    International Nuclear Information System (INIS)

    Kelly, Dana; Atwood, Corwin

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in closed form, and so an approximate beta distribution is used in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial aleatory model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.

  2. Reliability of "Google" for obtaining medical information

    Directory of Open Access Journals (Sweden)

    Mihir Kothari

    2015-01-01

    The Internet is used by many patients to obtain relevant medical information. We assessed the impact of a "Google" search on the knowledge of the parents whose ward suffered from squint. In 21 consecutive patients, the "Google" search improved the mean score of correct answers from 47% to 62%. We found that the "Google" search was a useful and reliable source of information for the patients with regard to the disease etiopathogenesis and the problems caused by the disease. The internet-based information, however, was incomplete and not reliable with regard to the disease treatment.

  3. Compressive Online Robust Principal Component Analysis with Multiple Prior Information

    DEFF Research Database (Denmark)

    Van Luong, Huynh; Deligiannis, Nikos; Seiler, Jürgen

    ...low-rank components. Unlike conventional batch RPCA, which processes all the data directly, our method considers a small set of measurements taken per data vector (frame). Moreover, our method incorporates multiple prior information signals, namely previously reconstructed frames, to improve the separation ... and thereafter update the prior information for the next frame. Using experiments on synthetic data, we evaluate the separation performance of the proposed algorithm. In addition, we apply the proposed algorithm to online video foreground and background separation from compressive measurements. The results show...

  4. Superposing pure quantum states with partial prior information

    Science.gov (United States)

    Dogra, Shruti; Thomas, George; Ghosh, Sibasish; Suter, Dieter

    2018-05-01

    The principle of superposition is an intriguing feature of quantum mechanics, which is regularly exploited in many different circumstances. A recent work [M. Oszmaniec et al., Phys. Rev. Lett. 116, 110403 (2016), 10.1103/PhysRevLett.116.110403] shows that the fundamentals of quantum mechanics restrict the process of superimposing two unknown pure states, even though it is possible to superimpose two quantum states with partial prior knowledge. The prior knowledge imposes geometrical constraints on the choice of input states. We discuss an experimentally feasible protocol to superimpose multiple pure states of a d -dimensional quantum system and carry out an explicit experimental realization for two single-qubit pure states with partial prior information on a two-qubit NMR quantum information processor.

  5. Remote patient monitoring: Information reliability challenges

    NARCIS (Netherlands)

    Petkovic, M.

    2009-01-01

    An increasing number of extramural applications in the personal healthcare domain pose new challenges regarding the security of medical data. In this paper, we focus on remote patient monitoring systems and the issues around information reliability. In these systems medical data is not collected by

  6. Rapid sampling of molecular motions with prior information constraints.

    Science.gov (United States)

    Raveh, Barak; Enosh, Angela; Schueler-Furman, Ora; Halperin, Dan

    2009-02-01

    Proteins are active, flexible machines that perform a range of different functions. Innovative experimental approaches may now provide limited partial information about conformational changes along motion pathways of proteins. There is therefore a need for computational approaches that can efficiently incorporate prior information into motion prediction schemes. In this paper, we present PathRover, a general setup designed for the integration of prior information into the motion planning algorithm of rapidly exploring random trees (RRT). Each suggested motion pathway comprises a sequence of low-energy clash-free conformations that satisfy an arbitrary number of prior information constraints. These constraints can be derived from experimental data or from expert intuition about the motion. The incorporation of prior information is very straightforward and significantly narrows down the vast search in the typically high-dimensional conformational space, leading to dramatic reduction in running time. To allow the use of state-of-the-art energy functions and conformational sampling, we have integrated this framework into Rosetta, an accurate protocol for diverse types of structural modeling. The suggested framework can serve as an effective complementary tool for molecular dynamics, Normal Mode Analysis, and other prevalent techniques for predicting motion in proteins. We applied our framework to three different model systems. We show that a limited set of experimentally motivated constraints may effectively bias the simulations toward diverse predicates in an outright fashion, from distance constraints to enforcement of loop closure. In particular, our analysis sheds light on mechanisms of protein domain swapping and on the role of different residues in the motion.
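
    To make the RRT-with-constraints idea concrete, here is a toy sketch in a 2-D stand-in for conformational space: candidate tree extensions are rejected unless they satisfy a prior-information predicate (an arbitrary corridor constraint here, standing in for clash-free, low-energy, constraint-satisfying conformations). This illustrates the generic algorithm, not PathRover itself.

    import numpy as np

    rng = np.random.default_rng(1)

    def satisfies_constraints(q):
        # Hypothetical prior-information predicate: stay inside a corridor.
        return abs(q[1] - 0.5 * q[0]) < 0.4

    def rrt(start, goal, step=0.1, n_iter=5000):
        goal = np.asarray(goal, dtype=float)
        tree, parents = [np.asarray(start, dtype=float)], [0]
        for _ in range(n_iter):
            # Goal bias: occasionally grow toward the goal, otherwise randomly.
            target = goal if rng.random() < 0.1 else rng.uniform(-2.0, 2.0, size=2)
            i = min(range(len(tree)), key=lambda k: np.linalg.norm(tree[k] - target))
            d = target - tree[i]
            q_new = tree[i] + step * d / (np.linalg.norm(d) + 1e-12)
            if satisfies_constraints(q_new):   # reject constraint-violating moves
                tree.append(q_new)
                parents.append(i)
                if np.linalg.norm(q_new - goal) < step:
                    break                      # pathway to the goal found
        return tree, parents

    tree, parents = rrt(start=(-1.5, -0.75), goal=(1.5, 0.75))
    print(f"tree size: {len(tree)}")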

  7. Rapid sampling of molecular motions with prior information constraints.

    Directory of Open Access Journals (Sweden)

    Barak Raveh

    2009-02-01

    Proteins are active, flexible machines that perform a range of different functions. Innovative experimental approaches may now provide limited partial information about conformational changes along motion pathways of proteins. There is therefore a need for computational approaches that can efficiently incorporate prior information into motion prediction schemes. In this paper, we present PathRover, a general setup designed for the integration of prior information into the motion planning algorithm of rapidly exploring random trees (RRT). Each suggested motion pathway comprises a sequence of low-energy clash-free conformations that satisfy an arbitrary number of prior information constraints. These constraints can be derived from experimental data or from expert intuition about the motion. The incorporation of prior information is very straightforward and significantly narrows down the vast search in the typically high-dimensional conformational space, leading to dramatic reduction in running time. To allow the use of state-of-the-art energy functions and conformational sampling, we have integrated this framework into Rosetta, an accurate protocol for diverse types of structural modeling. The suggested framework can serve as an effective complementary tool for molecular dynamics, Normal Mode Analysis, and other prevalent techniques for predicting motion in proteins. We applied our framework to three different model systems. We show that a limited set of experimentally motivated constraints may effectively bias the simulations toward diverse predicates in an outright fashion, from distance constraints to enforcement of loop closure. In particular, our analysis sheds light on mechanisms of protein domain swapping and on the role of different residues in the motion.

  8. Reliability of dynamic systems under limited information.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr. (.,; .); Grigoriu, Mircea

    2006-09-01

    A method is developed for reliability analysis of dynamic systems under limited information. The available information includes one or more samples of the system output; any known information on features of the output can be used if available. The method is based on the theory of non-Gaussian translation processes and is shown to be particularly suitable for problems of practical interest. For illustration, we apply the proposed method to a series of simple example problems and compare with results given by traditional statistical estimators in order to establish the accuracy of the method. It is demonstrated that the method delivers accurate results for the case of linear and nonlinear dynamic systems, and can be applied to analyze experimental data and/or mathematical model outputs. Two complex applications of direct interest to Sandia are also considered. First, we apply the proposed method to assess design reliability of a MEMS inertial switch. Second, we consider re-entry body (RB) component vibration response during normal re-entry, where the objective is to estimate the time-dependent probability of component failure. This last application is directly relevant to re-entry random vibration analysis at Sandia, and may provide insights on test-based and/or model-based qualification of weapon components for random vibration environments.

  9. Incorporating prior belief in the general path model: A comparison of information sources

    International Nuclear Information System (INIS)

    Coble, Jamie; Hines, Wesley

    2014-01-01

    The general path model (GPM) is one approach for performing degradation-based, or Type III, prognostics. The GPM fits a parametric function to the collected observations of a prognostic parameter and extrapolates the fit to a failure threshold. This approach has been successfully applied to a variety of systems when a sufficient number of prognostic parameter observations are available. However, the parametric fit can suffer significantly when few data are available or the data are very noisy. In these instances, it is beneficial to include additional information to influence the fit to conform to a prior belief about the evolution of system degradation. Bayesian statistical approaches have been proposed to include prior information in the form of distributions of expected model parameters. This requires a number of run-to-failure cases with tracked prognostic parameters; these data may not be readily available for many systems. Reliability information and stressor-based (Type I and Type II, respectively) prognostic estimates can provide the necessary prior belief for the GPM. This article presents the Bayesian updating framework to include prior information in the GPM and compares the efficacy of including different information sources on two data sets.
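
    A minimal sketch of the updating framework's core step, under assumed numbers: a conjugate Bayesian linear regression in which the prior mean and covariance of the degradation-model parameters come from Type I/II estimates, and the posterior fit is extrapolated to a failure threshold.

    import numpy as np

    # Noisy observations of a prognostic parameter for one unit (hypothetical).
    t = np.array([0.0, 10.0, 20.0, 30.0])
    y = np.array([0.02, 0.11, 0.24, 0.30])
    X = np.column_stack([np.ones_like(t), t])   # linear path model y = b0 + b1 * t

    # Prior belief from fleet reliability / stressor-based estimates (hypothetical).
    mu0 = np.array([0.0, 0.012])                # expected intercept and wear rate
    S0 = np.diag([0.05**2, 0.004**2])           # prior covariance (belief strength)
    sigma2 = 0.02**2                            # observation noise variance

    # Conjugate Bayesian linear-regression update of the model parameters.
    S0_inv = np.linalg.inv(S0)
    Sn = np.linalg.inv(S0_inv + X.T @ X / sigma2)
    mn = Sn @ (S0_inv @ mu0 + X.T @ y / sigma2)

    # Extrapolate the posterior-mean fit to a failure threshold.
    threshold = 0.5
    rul = (threshold - mn[0]) / mn[1] - t[-1]
    print(f"fit: y = {mn[0]:.3f} + {mn[1]:.4f} t; remaining useful life ~ {rul:.1f}")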

  10. Bayesian Analysis of two Censored Shifted Gompertz Mixture Distributions using Informative and Noninformative Priors

    Directory of Open Access Journals (Sweden)

    Tabassum Naz Sindhu

    2017-03-01

    This study deals with the Bayesian analysis of a shifted Gompertz mixture model under type-I censored samples, assuming both informative and noninformative priors. We discuss the Bayesian estimation of the parameters of the shifted Gompertz mixture model under uniform and gamma priors, assuming three loss functions. Further, some properties of the model are discussed, with some graphs of the mixture density. These properties include Bayes estimators, posterior risks and the reliability function under a simulation scheme. Bayes estimates are obtained considering two cases: (a) when the shape parameter is known and (b) when all parameters are unknown. We analyzed some simulated data sets in order to investigate the effect of prior belief, loss functions, and the performance of the proposed set of estimators of the mixture model parameters.

  11. Reliability of the exercise ECG in detecting silent ischemia in patients with prior myocardial infarction

    International Nuclear Information System (INIS)

    Yamagishi, Takashi; Matsuda, Yasuo; Satoh, Akira

    1991-01-01

    To assess the reliability of the exercise ECG in detecting silent ischemia, ECG results were compared with those of stress-redistribution thallium-201 single-photon emission computed tomography (SPECT) in 116 patients with prior myocardial infarction and in 20 normal subjects used as a control. The left ventricle (LV) was divided into 20 segmental images, which were scored blindly on a 5-point scale. The redistribution score was defined as the thallium defect score of the exercise image minus that of the redistribution image, and was used as a measure of the amount of ischemic but viable myocardium. The upper limit of the normal redistribution score (4.32) was defined as the mean plus two standard deviations derived from the 20 normal subjects. Of the 116 patients, 47 had a redistribution score above the normal range. Twenty-five (53%) of the 47 patients showed a positive ECG response. Fourteen (20%) of the 69 patients who had a normal redistribution score showed a positive ECG response. Thus, the ECG response had a sensitivity of 53% and a specificity of 80% in detecting transient ischemia. Furthermore, the 116 patients were subdivided into 4 groups according to the presence or absence of chest pain and ECG change during exercise. Fourteen patients showed both chest pain and ECG change, and all these patients had a redistribution score above the normal range. Twenty-five patients showed ECG change without chest pain, and 11 (44%) of the 25 patients had abnormal redistribution. Three (43%) of 7 patients who showed chest pain without ECG change had an abnormal redistribution score. Of the 70 patients who had neither chest pain nor ECG change, 19 (27%) had a redistribution score above the normal range. Thus, limitations exist in detecting silent ischemia by ECG in patients with a prior myocardial infarction, because the ECG response to the exercise test may have low sensitivity and a high rate of false positive and false negative results in detecting silent ischemia. (author)

  12. Patient Self-Report of Prior Laser Treatment Reliably Indicates Presence of Severe Diabetic Retinopathy

    Science.gov (United States)

    GRASSI, MICHAEL A.; MAZZULLA, D. ANTHONY; KNUDTSON, MICHAEL D.; HUANG, WENDY W.; LEE, KRISTINE E.; KLEIN, BARBARA E.; NICOLAE, DAN L.; KLEIN, RONALD

    2009-01-01

    PURPOSE To determine whether patient self-report of prior laser treatment can be used as a reliable tool for assessing the presence of severe diabetic retinopathy. DESIGN This was a retrospective study on two groups of diabetic subjects. METHODS One hundred patients with diabetes were recruited from the general eye and retina clinics at the University of Chicago Hospitals. The patients were asked, “Have you ever received laser treatment for your diabetic eye disease (DED)?” A chart review was then conducted noting if the patient had received either focal laser treatment for diabetic macular edema or panretinal photocoagulation for proliferative diabetic retinopathy. Data from the Wisconsin Epidemiological Study of Diabetic Retinopathy (WESDR) were also analyzed. Participant responses to the question “Have you had laser photocoagulation treatment for your eyes?” were analyzed with documentation of photocoagulation scars determined by grading seven-standard field color fundus photographs. RESULTS In the University of Chicago group, 96 of 100 (96%) of patients were accurate in reporting whether they had received previous laser treatment for DED (sensitivity 95.8%, specificity 96.1%, and positive predictive value 88.5%). In the WESDR analysis, 2,329 of 2,348 (99%) of participants were accurate in reporting whether they had prior laser treatment for DED (sensitivity 96.0%, specificity 99.5%, and positive predictive value 95.6%). CONCLUSIONS The high sensitivity and specificity of our results validate the use of patient self-report as a useful tool in assessing past laser treatment for severe diabetic retinopathy. Patient self-report may be a useful surrogate to clinical examination or medical record review to determine the presence of severe diabetic retinopathy. PMID:19054495
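
    For reference, the reported University of Chicago figures are consistent with the following 2x2 table; the counts are inferred from the percentages (a reconstruction, not data taken from the paper):

    # Counts inferred from the reported percentages (TP=23, FN=1, FP=3, TN=73):
    # 96/100 accurate, sensitivity 95.8%, specificity 96.1%, PPV 88.5%.
    tp, fn, fp, tn = 23, 1, 3, 73

    sensitivity = tp / (tp + fn)                  # 23/24 = 0.958
    specificity = tn / (tn + fp)                  # 73/76 = 0.961
    ppv = tp / (tp + fp)                          # 23/26 = 0.885
    accuracy = (tp + tn) / (tp + fn + fp + tn)    # 96/100 = 0.96
    print(sensitivity, specificity, ppv, accuracy)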

  13. Iterative CT shading correction with no prior information

    Science.gov (United States)

    Wu, Pengwei; Sun, Xiaonan; Hu, Hongjie; Mao, Tingyu; Zhao, Wei; Sheng, Ke; Cheung, Alice A.; Niu, Tianye

    2015-11-01

    Shading artifacts in CT images are caused by scatter contamination, the beam-hardening effect and other non-ideal imaging conditions. The purpose of this study is to propose a novel and general correction framework to eliminate low-frequency shading artifacts in CT images (e.g. cone-beam CT, low-kVp CT) without relying on prior information. The method is based on the general knowledge of the relatively uniform CT number distribution within one tissue component. The CT image is first segmented to construct a template image in which each structure is filled with the same CT number for a specific tissue type. Then, by subtracting the ideal template from the CT image, the residual image from various error sources is generated. Since forward projection is an integration process, non-continuous shading artifacts in the image become continuous signals in a line integral. Thus, the residual image is forward projected and its line integral is low-pass filtered in order to estimate the error that causes shading artifacts. A compensation map is reconstructed from the filtered line integral error using a standard FDK algorithm and added back to the original image for shading correction. As the segmented image does not accurately depict a shaded CT image, the proposed scheme is iterated until the variation of the residual image is minimized. The proposed method is evaluated using cone-beam CT images of a Catphan©600 phantom and a pelvis patient, and low-kVp CT angiography images for carotid artery assessment. Compared with the CT image without correction, the proposed method reduces the overall CT number error from over 200 HU to less than 30 HU and increases the spatial uniformity by a factor of 1.5. Low-contrast objects are faithfully retained after the proposed correction. An effective iterative algorithm for shading correction in CT imaging is proposed that is assisted only by general anatomical information, without relying on prior knowledge. The proposed method is thus practical.
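
    Read as pseudocode, the described loop is compact. The sketch below restates it in Python; the segmentation, forward projector, low-pass filter, and FDK reconstructor are placeholder callables (hypothetical helpers), not the authors' implementation.

    import numpy as np

    def shading_correction(image, segment_to_template, forward_project,
                           low_pass, fdk_reconstruct, n_iter=10, tol=1e-3):
        """Iterative shading correction; all four callables are placeholders."""
        corrected = image.copy()
        prev = np.inf
        for _ in range(n_iter):
            template = segment_to_template(corrected)   # one CT number per tissue
            residual = template - corrected             # shading estimate (plus edges)
            sino = forward_project(residual)            # shading is smooth here
            compensation = fdk_reconstruct(low_pass(sino))
            corrected = corrected + compensation        # add the map back
            var = np.var(residual)
            if abs(prev - var) <= tol * max(var, 1e-12):
                break                                   # residual variation stabilized
            prev = var
        return corrected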

  14. Information about robustness, reliability and safety in early design phases

    DEFF Research Database (Denmark)

    Marini, Vinicius Kaster

    ...methods, and an industrial case to assess how the use of information about robustness, reliability and safety as practised by current methods influences concept development. Current methods cannot be used in early design phases due to their dependence on detailed design information for the identification ... alternatives. This prompts designers to reuse working principles that are inherently flawed, as they are liable to disturbances, failures and hazards. To address this issue, an approach based upon individual records of early design issues consists of comparing failures and benefits from prior working ... principles, before making a decision, and improving the more suitable alternatives through this feedback. Workshops were conducted with design practitioners to evaluate the potential of the approach and to simulate decision-making and gain feedback on a proof-of-concept basis. The evaluation has demonstrated ...

  15. An Ensemble Approach to Building Mercer Kernels with Prior Information

    Science.gov (United States)

    Srivastava, Ashok N.; Schumann, Johann; Fischer, Bernd

    2005-01-01

    This paper presents a new methodology for automatic knowledge-driven data mining based on the theory of Mercer kernels, which are highly nonlinear symmetric positive definite mappings from the original image space to a very high, possibly infinite, dimensional feature space. We describe a new method called Mixture Density Mercer Kernels to learn the kernel function directly from data, rather than using pre-defined kernels. These data-adaptive kernels can encode prior knowledge in the kernel using a Bayesian formulation, thus allowing for physical information to be encoded in the model. Specifically, we demonstrate the use of the algorithm in situations with extremely small samples of data. We compare the results with existing algorithms on data from the Sloan Digital Sky Survey (SDSS) and demonstrate the method's superior performance against standard methods. The code for these experiments has been generated with the AUTOBAYES tool, which automatically generates efficient and documented C/C++ code from abstract statistical model specifications. The core of the system is a schema library which contains templates for learning and knowledge discovery algorithms like different versions of EM, or numeric optimization methods like conjugate gradient methods. The template instantiation is supported by symbolic-algebraic computations, which allows AUTOBAYES to find closed-form solutions and, where possible, to integrate them into the code.

  16. Source-specific Informative Prior for i-Vector Extraction

    DEFF Research Database (Denmark)

    Shepstone, Sven Ewan; Lee, Kong Aik; Li, Haizhou

    2015-01-01

    An i-vector is a low-dimensional fixed-length representation of a variable-length speech utterance, and is defined as the posterior mean of a latent variable conditioned on the observed feature sequence of an utterance. The assumption is that the prior for the latent variable is non...

  17. Estimating security betas using prior information based on firm fundamentals

    NARCIS (Netherlands)

    Cosemans, Mathijs; Frehen, Rik; Schotman, Peter; Bauer, Rob

    We propose a hybrid approach for estimating beta that shrinks rolling window estimates towards firm-specific priors motivated by economic theory. Our method yields superior forecasts of beta that have important practical implications. First, hybrid betas carry a significant price of risk in the

  18. Estimating Security Betas Using Prior Information Based on Firm Fundamentals

    NARCIS (Netherlands)

    Cosemans, Mathijs; Frehen, Rik; Schotman, Peter; Bauer, Rob

    2016-01-01

    We propose a hybrid approach for estimating beta that shrinks rolling window estimates toward firm-specific priors motivated by economic theory. Our method yields superior forecasts of beta that have important practical implications. First, unlike standard rolling window betas, hybrid betas carry a

  19. The reliability and usability of district health information software ...

    African Journals Online (AJOL)

    The reliability and usability of district health information software: case studies from Tanzania. ... The District Health Information System (DHIS) software from the Health Information System ...

  20. Prior and present evidence: how prior experience interacts with present information in a perceptual decision making task.

    Directory of Open Access Journals (Sweden)

    Muhsin Karim

    Vibrotactile discrimination tasks have been used to examine decision making processes in the presence of perceptual uncertainty, induced by barely discernible frequency differences between paired stimuli or by the presence of embedded noise. One lesser known property of such tasks is that decisions made on a single trial may be biased by information from prior trials. An example is the time-order effect, whereby the presentation order of paired stimuli may introduce differences in accuracy. Subjects perform better when the first stimulus lies between the second stimulus and the global mean of all stimuli on the judged dimension ("preferred" time-orders) compared to the alternative presentation order ("nonpreferred" time-orders). This has been conceptualised as a "drift" of the first stimulus representation towards the global mean of the stimulus-set (an internal standard). We describe the influence of prior information in relation to the more traditionally studied factors of interest in a classic discrimination task. Sixty subjects performed a vibrotactile discrimination task with different levels of uncertainty parametrically induced by increasing task difficulty, aperiodic stimulus noise, and changing the task instructions whilst maintaining identical stimulus properties (the "context"). The time-order effect had a greater influence on task performance than two of the explicit factors, task difficulty and noise, but not context. The influence of prior information increased with the distance of the first stimulus from the global mean, suggesting that the "drift" velocity of the first stimulus towards the global mean representation was greater for these trials. Awareness of the time-order effect and prior information in general is essential when studying perceptual decision making tasks. Implicit mechanisms may have a greater influence than the explicit factors under study. It also affords valuable insights into basic mechanisms of information ...

  1. Prior elicitation: Interactive spreadsheet graphics with sliders can be fun, and informative

    OpenAIRE

    Jones, G; Johnson, WO

    2014-01-01

    There are several approaches to setting priors in Bayesian data analysis. Some attempt to minimize the impact of the prior on the posterior, allowing the data to "speak for themselves," or to provide Bayesian inferences that have good frequentist properties. In contrast, this note focuses on priors where scientific knowledge is used, possibly partially informative. There are many articles on the use of such subjective information. We focus on using standard software for eliciting priors from ...

  2. 76 FR 82315 - Agency Information Collection Activities: Prior Disclosure

    Science.gov (United States)

    2011-12-30

    ...: Extension (without change). Affected Public: Businesses. Estimated Number of Respondents: 3,500. Estimated... collection be extended with no change to the burden hours or to the information collected. This document is... appropriate automated, electronic, mechanical, or other technological techniques or other forms of information...

  3. Management Control for Reliable Financial Information

    Directory of Open Access Journals (Sweden)

    Victoria María Antonieta Martín Granados

    2010-06-01

    Full Text Available Financial information is the document that the administration of a legal entity issues to report its financial situation. Financial information is useful and reliable for its users when it has been prepared under conditions of certainty. This certainty is provided by the administration when it establishes internal control policies and procedures, together with surveillance of compliance with that internal control. Such control bears directly on the financial information, since it is inherent in the operating flow, and it extends to producing information that is relevant, truthful, and comparable. This matters to users of the financial information because it allows them to make timely and objective decisions.

  4. 76 FR 66741 - Agency Information Collection Activities: Prior Disclosure

    Science.gov (United States)

    2011-10-27

    ... forms of information technology; and (e) the annual cost burden to respondents or record keepers from... Responses: 3,500. Estimated Time per Response: 1 hour. Estimated Total Annual Burden Hours: 3,500. Dated...

  5. ASTROPHYSICAL PRIOR INFORMATION AND GRAVITATIONAL-WAVE PARAMETER ESTIMATION

    International Nuclear Information System (INIS)

    Pankow, Chris; Sampson, Laura; Perri, Leah; Chase, Eve; Coughlin, Scott; Zevin, Michael; Kalogera, Vassiliki

    2017-01-01

    The detection of electromagnetic counterparts to gravitational waves (GWs) has great promise for the investigation of many scientific questions. While it is well known that certain orientation parameters can reduce uncertainty in other related parameters, it was also hoped that the detection of an electromagnetic signal in conjunction with a GW could augment the measurement precision of the mass and spin from the gravitational signal itself. That is, knowledge of the sky location, inclination, and redshift of a binary could break degeneracies between these extrinsic, coordinate-dependent parameters and the physical parameters that are intrinsic to the binary. In this paper, we investigate this issue by assuming perfect knowledge of extrinsic parameters, and assessing the maximal impact of this knowledge on our ability to extract intrinsic parameters. We recover similar gains in extrinsic recovery to earlier work; however, we find only modest improvements in a few intrinsic parameters—namely the primary component’s spin. We thus conclude that, even in the best case, the use of additional information from electromagnetic observations does not improve the measurement of the intrinsic parameters significantly.

  6. ASTROPHYSICAL PRIOR INFORMATION AND GRAVITATIONAL-WAVE PARAMETER ESTIMATION

    Energy Technology Data Exchange (ETDEWEB)

    Pankow, Chris; Sampson, Laura; Perri, Leah; Chase, Eve; Coughlin, Scott; Zevin, Michael; Kalogera, Vassiliki [Center for Interdisciplinary Exploration and Research in Astrophysics (CIERA) and Department of Physics and Astronomy, Northwestern University, 2145 Sheridan Road, Evanston, IL 60208 (United States)

    2017-01-10

    The detection of electromagnetic counterparts to gravitational waves (GWs) has great promise for the investigation of many scientific questions. While it is well known that certain orientation parameters can reduce uncertainty in other related parameters, it was also hoped that the detection of an electromagnetic signal in conjunction with a GW could augment the measurement precision of the mass and spin from the gravitational signal itself. That is, knowledge of the sky location, inclination, and redshift of a binary could break degeneracies between these extrinsic, coordinate-dependent parameters and the physical parameters that are intrinsic to the binary. In this paper, we investigate this issue by assuming perfect knowledge of extrinsic parameters, and assessing the maximal impact of this knowledge on our ability to extract intrinsic parameters. We recover similar gains in extrinsic recovery to earlier work; however, we find only modest improvements in a few intrinsic parameters—namely the primary component’s spin. We thus conclude that, even in the best case, the use of additional information from electromagnetic observations does not improve the measurement of the intrinsic parameters significantly.

  7. Effects of prior information on decoding degraded speech: an fMRI study.

    Science.gov (United States)

    Clos, Mareike; Langner, Robert; Meyer, Martin; Oechslin, Mathias S; Zilles, Karl; Eickhoff, Simon B

    2014-01-01

    Expectations and prior knowledge are thought to support the perceptual analysis of incoming sensory stimuli, as proposed by the predictive-coding framework. The current fMRI study investigated the effect of prior information on brain activity during the decoding of degraded speech stimuli. When prior information enabled the comprehension of the degraded sentences, the left middle temporal gyrus and the left angular gyrus were activated, highlighting a role of these areas in meaning extraction. In contrast, the activation of the left inferior frontal gyrus (area 44/45) appeared to reflect the search for meaningful information in degraded speech material that could not be decoded because of mismatches with the prior information. Our results show that degraded sentences evoke instantaneously different percepts and activation patterns depending on the type of prior information, in line with prediction-based accounts of perception.

  8. Parameter estimation of multivariate multiple regression model using bayesian with non-informative Jeffreys’ prior distribution

    Science.gov (United States)

    Saputro, D. R. S.; Amalia, F.; Widyaningsih, P.; Affan, R. C.

    2018-05-01

    The Bayesian method can be used to estimate the parameters of a multivariate multiple regression model. It involves two distributions: the prior and the posterior. The posterior distribution is influenced by the choice of prior distribution. Jeffreys' prior is a non-informative prior distribution, used when no information about the parameters is available. The non-informative Jeffreys' prior is combined with the sample information to produce the posterior distribution, which is then used to estimate the parameters. The purpose of this research is to estimate the parameters of a multivariate regression model using the Bayesian method with a non-informative Jeffreys' prior. Based on the results and discussion, the parameter estimates of β and Σ are obtained as the expected values of the marginal posterior distribution functions. The marginal posterior distributions for β and Σ are multivariate normal and inverse Wishart, respectively. However, calculating these expected values involves integrals that are difficult to evaluate in closed form. Therefore, random samples are generated according to the posterior distribution of each parameter using the Markov chain Monte Carlo (MCMC) Gibbs sampling algorithm.
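
    A minimal sketch of the setup described above (our illustration, not the paper's code): under the Jeffreys prior, the Gibbs sampler alternates a matrix-normal draw of B given Σ with an inverse-Wishart draw of Σ given B. All data, dimensions, and variable names below are simulated assumptions.

    ```python
    # Gibbs sampler for Y = X B + E under the Jeffreys prior p(B, Sigma) ~ |Sigma|^-(p+1)/2.
    import numpy as np
    from scipy.stats import invwishart

    rng = np.random.default_rng(0)

    # Simulated data: n observations, k predictors, p responses (all illustrative).
    n, k, p = 200, 3, 2
    X = rng.normal(size=(n, k))
    B_true = np.array([[1.0, -0.5], [0.3, 0.8], [-1.2, 0.4]])
    Sigma_true = np.array([[1.0, 0.3], [0.3, 0.5]])
    Y = X @ B_true + rng.multivariate_normal(np.zeros(p), Sigma_true, size=n)

    XtX_inv = np.linalg.inv(X.T @ X)
    B_hat = XtX_inv @ X.T @ Y            # OLS estimate = conditional posterior mean
    L = np.linalg.cholesky(XtX_inv)

    Sigma = np.cov(Y - X @ B_hat, rowvar=False)   # starting value
    draws_B, draws_Sigma = [], []
    for it in range(2000):
        # B | Sigma, Y ~ matrix normal MN(B_hat, (X'X)^-1, Sigma)
        R = np.linalg.cholesky(Sigma)
        B = B_hat + L @ rng.normal(size=(k, p)) @ R.T
        # Sigma | B, Y ~ inverse Wishart(n, S) with S the residual cross-product
        resid = Y - X @ B
        Sigma = invwishart.rvs(df=n, scale=resid.T @ resid, random_state=rng)
        if it >= 500:                    # discard burn-in
            draws_B.append(B)
            draws_Sigma.append(Sigma)

    print("posterior mean of B:\n", np.mean(draws_B, axis=0))
    print("posterior mean of Sigma:\n", np.mean(draws_Sigma, axis=0))
    ```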

  9. Is free, prior and informed consent a form of corporate social responsibility?

    NARCIS (Netherlands)

    Rodhouse, Toyah; Vanclay, Frank

    2016-01-01

    International organizations are increasingly including Indigenous peoples' rights and the concept of Free, Prior and Informed Consent (FPIC) in their guidance documents, codes of conduct, and performance standards. Leading companies are adjusting their Corporate Social Responsibility (CSR) and

  10. Performance of informative priors skeptical of large treatment effects in clinical trials: A simulation study.

    Science.gov (United States)

    Pedroza, Claudia; Han, Weilu; Thanh Truong, Van Thi; Green, Charles; Tyson, Jon E

    2018-01-01

    One of the main advantages of Bayesian analyses of clinical trials is their ability to formally incorporate skepticism about large treatment effects through the use of informative priors. We conducted a simulation study to assess the performance of informative normal, Student-t, and beta distributions in estimating relative risk (RR) or odds ratio (OR) for binary outcomes. Simulation scenarios varied the prior standard deviation (SD; level of skepticism of large treatment effects), outcome rate in the control group, true treatment effect, and sample size. We compared the priors with regard to bias, mean squared error (MSE), and coverage of 95% credible intervals. Simulation results show that the prior SD influenced the posterior to a greater degree than the particular distributional form of the prior. For RR, priors with a 95% interval of 0.50-2.0 performed well in terms of bias, MSE, and coverage under most scenarios. For OR, priors with a wider 95% interval of 0.23-4.35 had good performance. We recommend the use of informative priors that exclude implausibly large treatment effects in analyses of clinical trials, particularly for major outcomes such as mortality.
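
    To make the construction concrete, here is a small hedged sketch (ours, not the study's simulation code) of how a skeptical 95% prior interval of 0.50-2.0 for the RR translates into a normal prior on log(RR) and is combined with a normal approximation to the likelihood; the trial counts are hypothetical.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Skeptical prior: 95% of mass for RR in [0.5, 2.0] on the log scale.
    prior_mean = 0.0
    prior_sd = np.log(2.0) / norm.ppf(0.975)      # ~0.354

    # Hypothetical trial: 30/200 events (treatment) vs 45/200 (control).
    a, n1, b, n2 = 30, 200, 45, 200
    log_rr_hat = np.log((a / n1) / (b / n2))
    se = np.sqrt(1/a - 1/n1 + 1/b - 1/n2)         # delta-method SE of log(RR)

    # Conjugate normal-normal update by precision weighting.
    post_prec = 1/prior_sd**2 + 1/se**2
    post_mean = (prior_mean/prior_sd**2 + log_rr_hat/se**2) / post_prec
    post_sd = np.sqrt(1/post_prec)

    lo, hi = np.exp(post_mean + np.array([-1.96, 1.96]) * post_sd)
    print(f"posterior RR {np.exp(post_mean):.2f} (95% CrI {lo:.2f}-{hi:.2f})")
    ```

    The prior pulls the estimated RR toward 1, with the amount of shrinkage governed by the prior SD, which matches the paper's finding that the SD matters more than the distributional form.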

  11. The Effect of Prior Knowledge on Price Acceptability and the Type of Information Examined.

    OpenAIRE

    Rao, Akshay R; Sieben, Wanda A

    1992-01-01

    This article assesses whether differences in prior knowledge result in differences in (1) price acceptability and (2) the extent to which different types of information are examined. Using a personal computer-based methodology, subjects who varied in their prior product knowledge provided price responses, and the time they spent examining various kinds of information was measured. Acceptable price-range and points (price limits) were found to be lowest for low-knowledge subjects. Further, the...

  12. Under Construction: Reviewing and Producing Information Reliability on the Web

    NARCIS (Netherlands)

    S.A. Adams (Samantha)

    2006-01-01

    Since 1995, medical professionals, governments and independent organizations have been developing special tools to help lay-persons find websites that are guaranteed to give only reliable medical or health-related information. However, as these different actors also recognize, such a

  13. Evaluation of Information Requirements of Reliability Methods in Engineering Design

    DEFF Research Database (Denmark)

    Marini, Vinicius Kaster; Restrepo-Giraldo, John Dairo; Ahmed-Kristensen, Saeema

    2010-01-01

    This paper aims to characterize the information needed to perform methods for robustness and reliability, and to verify their applicability to early design stages. Several methods were evaluated on their support for synthesis in engineering design. Of those methods, FMEA, FTA and HAZOP were selected...

  14. Reliability and Validity of Curriculum-Based Informal Reading Inventories.

    Science.gov (United States)

    Fuchs, Lynn; And Others

    A study was conducted to explore the reliability and validity of three prominent procedures used in informal reading inventories (IRIs): (1) choosing a 95% word recognition accuracy standard for determining student instructional level, (2) arbitrarily selecting a passage to represent the difficulty level of a basal reader, and (3) employing…

  15. Gene regulatory network inference by point-based Gaussian approximation filters incorporating the prior information.

    Science.gov (United States)

    Jia, Bin; Wang, Xiaodong

    2013-12-17

    The extended Kalman filter (EKF) has been applied to inferring gene regulatory networks. However, it is well known that the EKF becomes less accurate when the system exhibits high nonlinearity. In addition, certain prior information about the gene regulatory network exists in practice, and no systematic approach has been developed to incorporate such prior information into the Kalman-type filter for inferring the structure of the gene regulatory network. In this paper, an inference framework based on point-based Gaussian approximation filters that can exploit the prior information is developed to solve the gene regulatory network inference problem. Different point-based Gaussian approximation filters, including the unscented Kalman filter (UKF), the third-degree cubature Kalman filter (CKF3), and the fifth-degree cubature Kalman filter (CKF5) are employed. Several types of network prior information, including the existing network structure information, sparsity assumption, and the range constraint of parameters, are considered, and the corresponding filters incorporating the prior information are developed. Experiments on a synthetic network of eight genes and the yeast protein synthesis network of five genes are carried out to demonstrate the performance of the proposed framework. The results show that the proposed methods provide more accurate inference results than existing methods, such as the EKF and the traditional UKF.

  16. The dependence of human reliability upon task information content

    International Nuclear Information System (INIS)

    Hermanson, E.M.; Golay, M.W.

    1994-09-01

    The role of human error in safety mishaps is an important factor in system design. As systems become increasingly complex the capacity of the human to deal with the added complexity is diminished. It is therefore crucial to understand the relationship between system complexity and human reliability so that systems may be built in such a way as to minimize human error. One way of understanding this relationship is to quantify system complexity and then measure the human reaction in response to situations of varying complexity. The quantification of system complexity may be performed by determining the information content present in the tasks that the human must execute. The purpose of this work is therefore to build and perform a consistent experiment which will determine the extent to which human reliability depends upon task information content. Two main conclusions may be drawn from this work. The first is that human reliability depends upon task information content. Specifically, as the information content contained in a task increases, the capacity of a human to deal successfully with the task decreases monotonically. Here the definition of total success is the ability to complete the task at hand fully and correctly. Furthermore, there exists a value of information content below which a human can deal with the task successfully, but above which the success of an individual decreases monotonically with increasing information. These ideas should be generalizable to any model where system complexity can be clearly and consistently defined
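
    As a toy illustration of quantifying task information content (our own example; the study's actual metric may differ), Shannon entropy over a task's possible states can serve as the measure:

    ```python
    import math

    def task_information_content(state_probabilities):
        """Shannon entropy in bits of the distribution over task states."""
        return -sum(p * math.log2(p) for p in state_probabilities if p > 0)

    # A task with four equally likely alternatives carries 2 bits;
    # skewing the distribution toward one common state lowers the content.
    print(task_information_content([0.25] * 4))                    # 2.0 bits
    print(task_information_content([0.5, 0.2, 0.2, 0.05, 0.05]))   # ~1.86 bits
    ```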

  17. On the use of prior information in modelling metabolic utilization of energy in growing pigs

    DEFF Research Database (Denmark)

    Strathe, Anders Bjerring; Jørgensen, Henry; Fernández, José Adalberto

    2011-01-01

    Construction of models that provide a realistic representation of metabolic utilization of energy in growing animals tend to be over-parameterized because data generated from individual metabolic studies are often sparse. In the Bayesian framework prior information can enter the data analysis......, PD and LD) made on a given pig at a given time followed a multivariate normal distribution. Two different equation systems were adopted from Strathe et al. (2010), generating the expected values in the multivariate normal distribution. Non-informative prior distributions were assigned for all model......, kp and kf, respectively. Utilizing both sets of priors showed that the maintenance component was sensitive to the statement of prior belief and, hence, that the estimate of 0.91 MJkg0.60d1 (95% CI: 0.78; 1.09) should be interpreted with caution. It was shown that boars were superior in depositing...

  18. Laboratory Information Management System Chain of Custody: Reliability and Security

    Science.gov (United States)

    Tomlinson, J. J.; Elliott-Smith, W.; Radosta, T.

    2006-01-01

    A chain of custody (COC) is required in many laboratories that handle forensics, drugs of abuse, environmental, clinical, and DNA testing, as well as other laboratories that want to assure reliability of reported results. Maintaining a dependable COC can be laborious, but with the recent establishment of the criteria for electronic records and signatures by US regulatory agencies, laboratory information management systems (LIMSs) are now being developed to fully automate COCs. The extent of automation and of data reliability can vary, and FDA- and EPA-compliant electronic signatures and system security are rare. PMID:17671623

  19. A Bayesian reliability evaluation method with integrated accelerated degradation testing and field information

    International Nuclear Information System (INIS)

    Wang, Lizhi; Pan, Rong; Li, Xiaoyang; Jiang, Tongmin

    2013-01-01

    Accelerated degradation testing (ADT) is a common approach in reliability prediction, especially for products with high reliability. However, the laboratory conditions of ADT often differ from field conditions; thus, to predict field failure, one needs to calibrate the prediction made using ADT data. In this paper a Bayesian evaluation method is proposed to integrate ADT data from the laboratory with failure data from the field. Calibration factors are introduced to calibrate the difference between the lab and field conditions so as to predict a product's actual field reliability more accurately. The information fusion and statistical inference procedure are carried out through a Bayesian approach and Markov chain Monte Carlo methods. The proposed method is demonstrated by two examples, together with a sensitivity analysis of the prior distribution assumptions.

  20. Laboratory Information Management System Chain of Custody: Reliability and Security

    OpenAIRE

    Tomlinson, J. J.; Elliott-Smith, W.; Radosta, T.

    2006-01-01

    A chain of custody (COC) is required in many laboratories that handle forensics, drugs of abuse, environmental, clinical, and DNA testing, as well as other laboratories that want to assure reliability of reported results. Maintaining a dependable COC can be laborious, but with the recent establishment of the criteria for electronic records and signatures by US regulatory agencies, laboratory information management systems (LIMSs) are now being developed to fully automate COCs. The extent of a...

  1. An information system supporting design for reliability and maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Rit, J.F.; Beraud, M.T

    1997-12-31

    EDF is currently developing a methodology to integrate availability, operating experience and maintenance in the design of power plants. This involves studies that depend closely on one another's results and assumptions about the reliability and operation of the plant. Therefore a support information system must be carefully designed. Concurrently with the development of the methodology, a research-oriented information system was designed and built. It is based on the database model of a logistic support repository that we tailored to our needs. (K.A.) 10 refs.

  2. An information system supporting design for reliability and maintenance

    International Nuclear Information System (INIS)

    Rit, J.F.; Beraud, M.T.

    1997-01-01

    EDF is currently developing a methodology to integrate availability, operating experience and maintenance in the design of power plants. This involves studies that depend closely on one another's results and assumptions about the reliability and operation of the plant. Therefore a support information system must be carefully designed. Concurrently with the development of the methodology, a research-oriented information system was designed and built. It is based on the database model of a logistic support repository that we tailored to our needs. (K.A.)

  3. Spontaneous trait inference and spontaneous trait transference are both unaffected by prior evaluations of informants.

    Science.gov (United States)

    Zengel, Bettina; Ambler, James K; McCarthy, Randy J; Skowronski, John J

    2017-01-01

    This article reports results from a study in which participants encountered either (a) previously known informants who were positive (e.g. Abraham Lincoln), neutral (e.g., Jay Leno), or negative (e.g., Adolf Hitler), or (b) previously unknown informants. The informants ostensibly described either a trait-implicative positive behavior, a trait-implicative negative behavior, or a neutral behavior. These descriptions were framed as either the behavior of the informant or the behavior of another person. Results yielded evidence of informant-trait linkages for both self-informants and for informants who described another person. These effects were not moderated by informant type, behavior valence, or the congruency or incongruency between the prior knowledge of the informant and the behavior valence. Results are discussed in terms of theories of Spontaneous Trait Inference and Spontaneous Trait Transference.

  4. 21 CFR 1.281 - What information must be in a prior notice?

    Science.gov (United States)

    2010-04-01

    ... by truck, bus, or rail, the trip number; (v) For food arriving as containerized cargo by water, air... arrived by truck, bus, or rail, the trip number; (v) For food that arrived as containerized cargo by water... 21 Food and Drugs 1 2010-04-01 2010-04-01 false What information must be in a prior notice? 1.281...

  5. 40 CFR 60.2953 - What information must I submit prior to initial startup?

    Science.gov (United States)

    2010-07-01

    ... initial startup? 60.2953 Section 60.2953 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... initial startup? You must submit the information specified in paragraphs (a) through (e) of this section prior to initial startup. (a) The type(s) of waste to be burned. (b) The maximum design waste burning...

  6. 40 CFR 60.2195 - What information must I submit prior to initial startup?

    Science.gov (United States)

    2010-07-01

    ... initial startup? 60.2195 Section 60.2195 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY..., 2001 Recordkeeping and Reporting § 60.2195 What information must I submit prior to initial startup? You... startup. (a) The type(s) of waste to be burned. (b) The maximum design waste burning capacity. (c) The...

  7. Integrating conflicting information from multiple texts: Effects of prior attitudes and text format

    NARCIS (Netherlands)

    Van Strien, Johan; Brand-Gruwel, Saskia; Boshuizen, Els

    2011-01-01

    Van Strien, J. L. H., Brand-Gruwel, S., & Boshuizen, H. P. A. (2011, August). Integrating conflicting information from multiple texts: Effects of prior attitudes and text format. Round table session presented at the Junior Researchers pre-conference of the biannual meeting of the European

  8. Influence of prior information on pain involves biased perceptual decision-making.

    Science.gov (United States)

    Wiech, Katja; Vandekerckhove, Joachim; Zaman, Jonas; Tuerlinckx, Francis; Vlaeyen, Johan W S; Tracey, Irene

    2014-08-04

    Prior information about features of a stimulus is a strong modulator of perception. For instance, the prospect of more intense pain leads to an increased perception of pain, whereas the expectation of analgesia reduces pain, as shown in placebo analgesia and expectancy modulations during drug administration. This influence is commonly assumed to be rooted in altered sensory processing, and expectancy-related modulations in the spinal cord are often taken as evidence for this notion. Contemporary models of perception, however, suggest that prior information can also modulate perception by biasing perceptual decision-making: the inferential process underlying perception in which prior information is used to interpret sensory information. In this type of bias, the information is already present in the system before the stimulus is observed. Computational models can distinguish between changes in sensory processing and altered decision-making because the two result in different response times for incorrect choices in a perceptual decision-making task (Figure S1A,B). Using a drift-diffusion model, we investigated the influence of both processes in two independent experiments. The results of both experiments strongly suggest that these changes in pain perception are predominantly based on altered perceptual decision-making.
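
    The following is a minimal simulation sketch (ours, not the authors' model code) of the diagnostic idea: in a two-bound drift-diffusion process, prior information entering as a starting-point bias versus as a drift-rate change produces different response-time patterns for incorrect choices. All parameter values are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def simulate_ddm(drift, start, n_trials=2000, bound=1.0, dt=0.005, sigma=1.0):
        """Simulate a two-bound diffusion; return choice (1 = upper bound) and RT arrays."""
        choices = np.empty(n_trials, dtype=int)
        rts = np.empty(n_trials)
        for i in range(n_trials):
            x, t = start, 0.0
            while abs(x) < bound:
                x += drift * dt + sigma * np.sqrt(dt) * rng.normal()
                t += dt
            choices[i] = 1 if x >= bound else 0
            rts[i] = t
        return choices, rts

    # Prior information as a starting-point bias vs. as a drift-rate change.
    for label, drift, start in [("starting-point bias", 0.8, 0.2),
                                ("drift-rate change  ", 1.2, 0.0)]:
        c, rt = simulate_ddm(drift, start)
        err = (c == 0)
        print(f"{label}: error rate {err.mean():.2f}, "
              f"mean error RT {rt[err].mean():.2f}s, correct RT {rt[~err].mean():.2f}s")
    ```

    Comparing the two printed conditions shows how fitting such a model to error response times can separate a decision bias from a sensory (drift) change.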

  9. Algorithms for biomagnetic source imaging with prior anatomical and physiological information

    Energy Technology Data Exchange (ETDEWEB)

    Hughett, Paul William [Univ. of California, Berkeley, CA (United States). Dept. of Electrical Engineering and Computer Sciences

    1995-12-01

    This dissertation derives a new method for estimating current source amplitudes in the brain and heart from external magnetic field measurements and prior knowledge about the probable source positions and amplitudes. The minimum mean square error estimator for the linear inverse problem with statistical prior information was derived and is called the optimal constrained linear inverse method (OCLIM). OCLIM includes as special cases the Shim-Cho weighted pseudoinverse and Wiener estimators but allows more general priors and thus reduces the reconstruction error. Efficient algorithms were developed to compute the OCLIM estimate for instantaneous or time series data. The method was tested in a simulated neuromagnetic imaging problem with five simultaneously active sources on a grid of 387 possible source locations; all five sources were resolved, even though the true sources were not exactly at the modeled source positions and the true source statistics differed from the assumed statistics.
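
    The core estimator has a compact closed form; a hedged numpy sketch with illustrative dimensions and covariances (not the dissertation's OCLIM implementation) is:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    m, n = 30, 100                       # measurements, candidate source amplitudes
    A = rng.normal(size=(m, n))          # forward (lead-field) matrix
    mu_x = np.zeros(n)                   # prior mean of source amplitudes
    C_x = 0.5 * np.eye(n)                # prior source covariance
    C_n = 0.1 * np.eye(m)                # measurement-noise covariance

    x_true = rng.multivariate_normal(mu_x, C_x)
    y = A @ x_true + rng.multivariate_normal(np.zeros(m), C_n)

    # MMSE estimate: x_hat = mu + C_x A^T (A C_x A^T + C_n)^-1 (y - A mu)
    K = C_x @ A.T @ np.linalg.inv(A @ C_x @ A.T + C_n)
    x_hat = mu_x + K @ (y - A @ mu_x)

    print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
    ```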

  10. Matrix-Inversion-Free Compressed Sensing With Variable Orthogonal Multi-Matching Pursuit Based on Prior Information for ECG Signals.

    Science.gov (United States)

    Cheng, Yih-Chun; Tsai, Pei-Yun; Huang, Ming-Hao

    2016-05-19

    Low-complexity compressed sensing (CS) techniques for monitoring electrocardiogram (ECG) signals in wireless body sensor networks (WBSNs) are presented. The prior probability of ECG sparsity in the wavelet domain is first exploited. Then, a variable orthogonal multi-matching pursuit (vOMMP) algorithm that consists of two phases is proposed. In the first phase, the orthogonal matching pursuit (OMP) algorithm is adopted to effectively augment the support set with reliable indices, and in the second phase, orthogonal multi-matching pursuit (OMMP) is employed to rescue the missing indices. The reconstruction performance is thus enhanced with the prior information and the vOMMP algorithm. Furthermore, the computation-intensive pseudo-inverse operation is simplified by a matrix-inversion-free (MIF) technique based on QR decomposition. The vOMMP-MIF CS decoder is then implemented in 90 nm CMOS technology. The QR decomposition is accomplished by two systolic arrays working in parallel. The implementation supports three settings for obtaining 40, 44, and 48 coefficients in the sparse vector. From the measurement results, the power consumption is 11.7 mW at 0.9 V and 12 MHz. Compared to prior chip implementations, our design shows good hardware efficiency and is suitable for low-energy applications.
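
    For orientation, a generic textbook orthogonal matching pursuit, the greedy building block that the vOMMP decoder extends, can be sketched as follows (our illustration, not the paper's hardware algorithm; all sizes are arbitrary):

    ```python
    import numpy as np

    def omp(Phi, y, sparsity):
        """Recover a `sparsity`-sparse x from y = Phi @ x by greedy pursuit."""
        residual, support = y.copy(), []
        for _ in range(sparsity):
            # Pick the column most correlated with the current residual.
            idx = int(np.argmax(np.abs(Phi.T @ residual)))
            if idx not in support:
                support.append(idx)
            # Least-squares fit on the augmented support, then update the residual.
            coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
            residual = y - Phi[:, support] @ coef
        x_hat = np.zeros(Phi.shape[1])
        x_hat[support] = coef
        return x_hat

    rng = np.random.default_rng(3)
    n, m, k = 128, 48, 5                       # signal length, measurements, sparsity
    Phi = rng.normal(size=(m, n)) / np.sqrt(m)
    x = np.zeros(n)
    x[rng.choice(n, k, replace=False)] = rng.normal(size=k)
    x_hat = omp(Phi, Phi @ x, k)
    print("support recovered:", set(np.nonzero(x_hat)[0]) == set(np.nonzero(x)[0]))
    ```

    The paper's contribution lies in seeding the support with prior (wavelet-sparsity) information and avoiding the explicit pseudo-inverse; the sketch above shows only the baseline pursuit loop.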

  11. [Toxoplasmosis and Pregnancy: Reliability of Internet Sources of Information].

    Science.gov (United States)

    Bobić, Branko; Štajner, Tijana; Nikolić, Aleksandra; Klun, Ivana; Srbljanović, Jelena; Djurković-Djaković, Olgica

    2015-01-01

    Health education of women of childbearing age has been shown to be an acceptable approach to the prevention of toxoplasmosis, the most frequent congenitally transmitted parasitic infection. The aim of this study was to evaluate the Internet as a source of health education on toxoplasmosis in pregnancy. A group of 100 pregnant women examined in the National Reference Laboratory for Toxoplasmosis was surveyed by questionnaire on the source of their information on toxoplasmosis. We also analyzed the information offered by websites in the Serbian and Croatian languages through the Google search engine, using "toxoplasmosis" as a keyword. The 23 top websites were evaluated for comprehensiveness and accuracy of information on the impact of toxoplasmosis on the course of pregnancy, diagnosis and prevention. Knowledge of toxoplasmosis was confirmed by 64 (64.0%) of the women surveyed, 40.6% (26/64) of whom had learned about toxoplasmosis through the Internet, 48.4% from physicians, and 10.9% from friends. A higher level of education was associated with a greater probability that pregnant women would be informed via the Internet (RR=3.15, 95% CI=1.27-7.82, p=0.013). Analysis of the four interactive websites (those allowing users to ask questions) showed that routes of infection were the most common concern, particularly the risk presented by pet cats and dogs, followed by the diagnosis of infection (who should be tested and when, and how the results should be interpreted). Of the 20 sites containing educational articles, only seven were authorized and two listed sources. Evaluation confirmed that the information relevant to pregnant women was significantly more accurate than comprehensive, but no site gave both comprehensive and completely accurate information. Only four sites (20%) were good sources of information for pregnant women. The Internet has proved itself an important source of information. However, despite numerous websites, only a few offer reliable information to the

  12. Pre-Proposal Assessment of Reliability for Spacecraft Docking with Limited Information

    Science.gov (United States)

    Brall, Aron

    2013-01-01

    This paper addresses the problem of estimating the reliability of a critical system function, as well as its impact on overall system reliability, when limited information is available. The approach addresses the basic function reliability first, and then the impact of multiple attempts to accomplish the function. The dependence of subsequent attempts on prior failures to accomplish the function is also addressed. The autonomous docking of two spacecraft was the specific example that generated the inquiry, and the resulting impact on total reliability generated substantial interest in presenting the results, owing to the relative insensitivity of overall performance to the basic function reliability and to moderate degradation, given sufficient attempts to accomplish the required goal. The application of the methodology allows proper emphasis on the characteristics that can be estimated with some knowledge, and insulates the integrity of the design from those characteristics that cannot be estimated with any rational value of uncertainty. The nature of NASA's missions contains a great deal of uncertainty due to the pursuit of new science or operations. This approach can be applied to any function where multiple attempts at success, with or without degradation, are allowed.
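
    A back-of-the-envelope sketch of the multiple-attempt effect described above (our own illustration, with an assumed per-attempt degradation factor, not the paper's model) shows the insensitivity of overall reliability to the basic function reliability:

    ```python
    def mission_reliability(p_first, n_attempts, degradation=0.9):
        """P(success within n attempts) when each retry after a failure is degraded."""
        p_fail_all = 1.0
        p = p_first
        for _ in range(n_attempts):
            p_fail_all *= (1.0 - p)
            p *= degradation          # assumed: each failed attempt degrades the next try
        return 1.0 - p_fail_all

    # Even a modest single-attempt reliability yields high overall reliability
    # given enough attempts.
    for p0 in (0.70, 0.80, 0.90):
        print(p0, [round(mission_reliability(p0, n), 4) for n in (1, 3, 5)])
    ```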

  13. A Frequency Matching Method: Solving Inverse Problems by Use of Geologically Realistic Prior Information

    DEFF Research Database (Denmark)

    Lange, Katrine; Frydendall, Jan; Cordua, Knud Skou

    2012-01-01

    The frequency matching method defines a closed form expression for a complex prior that quantifies the higher order statistics of a proposed solution model to an inverse problem. While existing solution methods to inverse problems are capable of sampling the solution space while taking into account arbitrarily complex a priori information defined by sample algorithms, it is not possible to directly compute the maximum a posteriori model, as the prior probability of a solution model cannot be expressed. We demonstrate how the frequency matching method enables us to compute the maximum a posteriori solution model to an inverse problem by using a priori information based on multiple point statistics learned from training images. We demonstrate the applicability of the suggested method on a synthetic tomographic crosshole inverse problem.

  14. Clustering and Bayesian hierarchical modeling for the definition of informative prior distributions in hydrogeology

    Science.gov (United States)

    Cucchi, K.; Kawa, N.; Hesse, F.; Rubin, Y.

    2017-12-01

    In order to reduce uncertainty in the prediction of subsurface flow and transport processes, practitioners should use all data available. However, classic inverse modeling frameworks typically only make use of information contained in in-situ field measurements to provide estimates of hydrogeological parameters. Such hydrogeological information about an aquifer is difficult and costly to acquire. In this data-scarce context, the transfer of ex-situ information coming from previously investigated sites can be critical for improving predictions by better constraining the estimation procedure. Bayesian inverse modeling provides a coherent framework to represent such ex-situ information by virtue of the prior distribution and combine them with in-situ information from the target site. In this study, we present an innovative data-driven approach for defining such informative priors for hydrogeological parameters at the target site. Our approach consists in two steps, both relying on statistical and machine learning methods. The first step is data selection; it consists in selecting sites similar to the target site. We use clustering methods for selecting similar sites based on observable hydrogeological features. The second step is data assimilation; it consists in assimilating data from the selected similar sites into the informative prior. We use a Bayesian hierarchical model to account for inter-site variability and to allow for the assimilation of multiple types of site-specific data. We present the application and validation of the presented methods on an established database of hydrogeological parameters. Data and methods are implemented in the form of an open-source R-package and therefore facilitate easy use by other practitioners.

  15. Limits on reliable information flows through stochastic populations.

    Science.gov (United States)

    Boczkowski, Lucas; Natale, Emanuele; Feinerman, Ofer; Korman, Amos

    2018-06-06

    Biological systems can share and collectively process information to yield emergent effects, despite inherent noise in communication. While man-made systems often employ intricate structural solutions to overcome noise, the structure of many biological systems is more amorphous. It is not well understood how communication noise may affect the computational repertoire of such groups. To approach this question we consider the basic collective task of rumor spreading, in which information from few knowledgeable sources must reliably flow into the rest of the population. We study the effect of communication noise on the ability of groups that lack stable structures to efficiently solve this task. We present an impossibility result which strongly restricts reliable rumor spreading in such groups. Namely, we prove that, in the presence of even moderate levels of noise that affect all facets of the communication, no scheme can significantly outperform the trivial one in which agents have to wait until directly interacting with the sources, a process which requires linear time in the population size. Our results imply that in order to achieve efficient rumor spread a system must exhibit either some degree of structural stability or, alternatively, some facet of the communication which is immune to noise. We then corroborate this claim by providing new analyses of experimental data regarding recruitment in Cataglyphis niger desert ants. Finally, in light of our theoretical results, we discuss strategies to overcome noise in other biological systems.

  16. Incorporating prior information into differential network analysis using non-paranormal graphical models.

    Science.gov (United States)

    Zhang, Xiao-Fei; Ou-Yang, Le; Yan, Hong

    2017-08-15

    Understanding how gene regulatory networks change under different cellular states is important for revealing insights into network dynamics. Gaussian graphical models, which assume that the data follow a joint normal distribution, have been used recently to infer differential networks. However, the distributions of the omics data are non-normal in general. Furthermore, although much biological knowledge (or prior information) has been accumulated, most existing methods ignore the valuable prior information. Therefore, new statistical methods are needed to relax the normality assumption and make full use of prior information. We propose a new differential network analysis method to address the above challenges. Instead of using Gaussian graphical models, we employ a non-paranormal graphical model that can relax the normality assumption. We develop a principled model to take into account the following prior information: (i) a differential edge less likely exists between two genes that do not participate together in the same pathway; (ii) changes in the networks are driven by certain regulator genes that are perturbed across different cellular states and (iii) the differential networks estimated from multi-view gene expression data likely share common structures. Simulation studies demonstrate that our method outperforms other graphical model-based algorithms. We apply our method to identify the differential networks between platinum-sensitive and platinum-resistant ovarian tumors, and the differential networks between the proneural and mesenchymal subtypes of glioblastoma. Hub nodes in the estimated differential networks rediscover known cancer-related regulator genes and contain interesting predictions. The source code is at https://github.com/Zhangxf-ccnu/pDNA.

  17. Stress affects the neural ensemble for integrating new information and prior knowledge.

    Science.gov (United States)

    Vogel, Susanne; Kluen, Lisa Marieke; Fernández, Guillén; Schwabe, Lars

    2018-06-01

    Prior knowledge, represented as a schema, facilitates memory encoding. This schema-related learning is assumed to rely on the medial prefrontal cortex (mPFC) that rapidly integrates new information into the schema, whereas schema-incongruent or novel information is encoded by the hippocampus. Stress is a powerful modulator of prefrontal and hippocampal functioning and first studies suggest a stress-induced deficit of schema-related learning. However, the underlying neural mechanism is currently unknown. To investigate the neural basis of a stress-induced schema-related learning impairment, participants first acquired a schema. One day later, they underwent a stress induction or a control procedure before learning schema-related and novel information in the MRI scanner. In line with previous studies, learning schema-related compared to novel information activated the mPFC, angular gyrus, and precuneus. Stress, however, affected the neural ensemble activated during learning. Whereas the control group distinguished between sets of brain regions for related and novel information, stressed individuals engaged the hippocampus even when a relevant schema was present. Additionally, stressed participants displayed aberrant functional connectivity between brain regions involved in schema processing when encoding novel information. The failure to segregate functional connectivity patterns depending on the presence of prior knowledge was linked to impaired performance after stress. Our results show that stress affects the neural ensemble underlying the efficient use of schemas during learning. These findings may have relevant implications for clinical and educational settings.

  18. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.

  19. Association of eHealth literacy with cancer information seeking and prior experience with cancer screening.

    Science.gov (United States)

    Park, Hyejin; Moon, Mikyung; Baeg, Jung Hoon

    2014-09-01

    Cancer is a critical disease with a high mortality rate in the US. Although useful information exists on the Internet, many people experience difficulty finding information about cancer prevention because they have limited eHealth literacy. This study aimed to identify relationships between the level of eHealth literacy and cancer information seeking experience or prior experience with cancer screening tests. A total of 108 adults participated in this study through questionnaires. Data covering demographics, eHealth literacy, cancer information seeking experience, educational needs for cancer information searching, and previous cancer screening tests were obtained. Study findings show that the level of eHealth literacy influences cancer information seeking. Individuals with low eHealth literacy are likely to be less confident about finding cancer information. In addition, people who have a low level of eHealth literacy need more education about seeking information than do those with a higher level of eHealth literacy. However, there is no significant relationship between eHealth literacy and cancer screening tests. More people today are using the Internet for access to information to maintain good health. It is therefore critical to educate those with low eHealth literacy so they can better self-manage their health.

  20. Correction of projective distortion in long-image-sequence mosaics without prior information

    Science.gov (United States)

    Yang, Chenhui; Mao, Hongwei; Abousleman, Glen; Si, Jennie

    2010-04-01

    Image mosaicking is the process of piecing together multiple video frames or still images from a moving camera to form a wide-area or panoramic view of the scene being imaged. Mosaics have widespread applications in many areas such as security surveillance, remote sensing, geographical exploration, agricultural field surveillance, virtual reality, digital video, and medical image analysis, among others. When mosaicking a large number of still images or video frames, the quality of the resulting mosaic is compromised by projective distortion. That is, during the mosaicking process, the image frames that are transformed and pasted to the mosaic become significantly scaled down and appear out of proportion with respect to the mosaic. As more frames continue to be transformed, important target information in the frames can be lost since the transformed frames become too small, which eventually leads to the inability to continue further. Some projective distortion correction techniques make use of prior information such as GPS information embedded within the image, or camera internal and external parameters. Alternatively, this paper proposes a new algorithm to reduce the projective distortion without using any prior information whatsoever. Based on the analysis of the projective distortion, we approximate the projective matrix that describes the transformation between image frames using an affine model. Using singular value decomposition, we can deduce the affine model scaling factor that is usually very close to 1. By resetting the image scale of the affine model to 1, the transformed image size remains unchanged. Even though the proposed correction introduces some error in the image matching, this error is typically acceptable and more importantly, the final mosaic preserves the original image size after transformation. We demonstrate the effectiveness of this new correction algorithm on two real-world unmanned air vehicle (UAV) sequences. The proposed method is
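
    A hedged sketch of the scale-normalization step as we read it from the abstract (not the authors' exact code): take the 2×2 linear part of a frame-to-mosaic affine transform, estimate its overall scale from the singular values, and rescale to unit scale. The transform below is hypothetical.

    ```python
    import numpy as np

    # Hypothetical cumulative affine transform mapping a frame into the mosaic.
    A = np.array([[0.93,  0.08, 12.4],
                  [-0.07, 0.91, -3.2],
                  [0.0,   0.0,   1.0]])

    _, s, _ = np.linalg.svd(A[:2, :2])
    scale = np.sqrt(s[0] * s[1])       # overall scale: geometric mean of singular values
    A[:2, :2] /= scale                 # reset the scale factor to 1, keep rotation/shear

    _, s_after, _ = np.linalg.svd(A[:2, :2])
    print("scale before:", scale, "after:", np.sqrt(s_after[0] * s_after[1]))  # -> ~1.0
    ```

    Rescaling this way keeps the pasted frames at their original size, at the cost of the small matching error the abstract describes as acceptable.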

  1. Information flow and data bank preparation in a nuclear power plant reliability information system

    International Nuclear Information System (INIS)

    Kolesa, K.; Vejvodova, I.

    1983-01-01

    In 1981 the reliability information system for nuclear power plants (ISS-JE) was established. The objective of the system is to provide a statistical evaluation of nuclear power plant operation, to obtain information on the reliability of nuclear power plant equipment, and to transmit this information to manufacturers with the aim of inducing them to take corrective measures. The HP 1000 computer with the IMAGE 100 database system is used, which allows both single queries and periodic outputs to be processed. The content of the periodic outputs designed for various groups of subcontractors is briefly described, and trends in the further development of the system are indicated. (Ha)

  2. Incorporation of stochastic engineering models as prior information in Bayesian medical device trials.

    Science.gov (United States)

    Haddad, Tarek; Himes, Adam; Thompson, Laura; Irony, Telba; Nair, Rajesh

    2017-01-01

    Evaluation of medical devices via clinical trial is often a necessary step in the process of bringing a new product to market. In recent years, device manufacturers are increasingly using stochastic engineering models during the product development process. These models have the capability to simulate virtual patient outcomes. This article presents a novel method based on the power prior for augmenting a clinical trial using virtual patient data. To properly inform clinical evaluation, the virtual patient model must simulate the clinical outcome of interest, incorporating patient variability, as well as the uncertainty in the engineering model and in its input parameters. The number of virtual patients is controlled by a discount function which uses the similarity between modeled and observed data. This method is illustrated by a case study of cardiac lead fracture. Different discount functions are used to cover a wide range of scenarios in which the type I error rates and power vary for the same number of enrolled patients. Incorporation of engineering models as prior knowledge in a Bayesian clinical trial design can provide benefits of decreased sample size and trial length while still controlling type I error rate and power.
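
    A minimal power-prior sketch (the generic construction, not the authors' implementation): virtual-patient outcomes are discounted by a factor a0 in [0, 1] before being combined with observed trial data, here in a conjugate beta-binomial model with hypothetical counts.

    ```python
    from scipy.stats import beta

    # Hypothetical counts: virtual (model-simulated) and enrolled patients.
    virt_events, virt_n = 12, 300       # virtual-patient outcomes
    obs_events, obs_n = 4, 80           # observed trial outcomes
    a0 = 0.4                            # discount reflecting model/data similarity

    # Power prior: raise the virtual-data likelihood to a0, i.e. scale its counts.
    a_post = 1 + a0 * virt_events + obs_events
    b_post = 1 + a0 * (virt_n - virt_events) + (obs_n - obs_events)

    post = beta(a_post, b_post)
    print(f"posterior event rate: {post.mean():.3f} "
          f"(95% CrI {post.ppf(0.025):.3f}-{post.ppf(0.975):.3f})")
    ```

    In the paper the discount is not fixed but given by a function of the agreement between modeled and observed data; the fixed a0 above is purely illustrative.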

  3. Oil exploitation, development justice and the utility of free, prior and informed consent in northwest Kenya

    DEFF Research Database (Denmark)

    Owiso, Michael

    2018-01-01

    … manipulation, internal weaknesses of the state in Africa, illiteracy among the local populations and extreme levels of poverty, among others, as significantly contributing to resource conflicts in the region. This contribution is anchored on the free, prior and informed consent framework as a remedy for the resource curse. The framework must have a regional, national as well as local strategic and systematic focus and must bring together key actors, amongst them public and private actors and particularly the communities, in thinking and planning for the future. … Policy addressing the governance of natural resources and issues of development justice, though present, is either in its nascent stages or is weak in addressing the challenges that accrue from this relationship within states in Africa and the world in general. This complicates and antagonises...

  4. Improving Metrological Reliability of Information-Measuring Systems Using Mathematical Modeling of Their Metrological Characteristics

    Science.gov (United States)

    Kurnosov, R. Yu; Chernyshova, T. I.; Chernyshov, V. N.

    2018-05-01

    Algorithms for improving the metrological reliability of the analogue blocks of measuring channels in information-measuring systems are developed. The proposed algorithms ensure optimum values of the metrological reliability indices for a given analogue circuit design.

  5. The Effect of Information Access Strategy on Power Consumption and Reliability in Wireless Sensor Network

    DEFF Research Database (Denmark)

    Tobgay, Sonam; Olsen, Rasmus Løvenstein; Prasad, Ramjee

    2013-01-01

    This paper examines the effect of different information access strategies on power consumption and information reliability, considering a wireless sensor network as the source of information. The paper explores three different access strategies, namely reactive, periodic, and hybrid...

  6. Testing the reliability of information extracted from ancient zircon

    Science.gov (United States)

    Kielman, Ross; Whitehouse, Martin; Nemchin, Alexander

    2015-04-01

    Studies combining zircon U-Pb chronology, trace element distributions, and O and Hf isotope systematics are a powerful way to gain understanding of the processes shaping Earth's evolution, especially in detrital populations where constraints from the original host are missing. Such studies of the Hadean detrital zircon population abundant in sedimentary rocks in Western Australia have involved analysis of an unusually large number of individual grains, but have also highlighted potential problems with the approach that only become apparent when multiple analyses are obtained from individual grains. A common feature of the Hadean as well as many early Archaean zircon populations is their apparent inhomogeneity, which reduces confidence in conclusions based on studies combining the chemistry and isotopic characteristics of zircon. In order to test the reliability of information extracted from early Earth zircon, we report results from one of the first in-depth multi-method studies of zircon from a relatively simple early Archaean magmatic rock, used as an analogue to ancient detrital zircon. The approach involves making multiple SIMS analyses in individual grains in order to be comparable to the most advanced studies of detrital zircon populations. The investigated sample is a relatively undeformed, non-migmatitic ca. 3.8 Ga tonalite collected a few km south of the Isua Greenstone Belt, southwest Greenland. Extracted zircon grains can be divided into three groups based on the behavior of their U-Pb systems: (i) grains that show internally consistent and concordant ages and define an average age of 3805±15 Ma, taken to be the age of the rock; (ii) grains that are distributed close to the concordia line, but with significant variability between multiple analyses, suggesting ancient Pb loss; and (iii) grains that have multiple analyses distributed along a discordia pointing towards a zero intercept, indicating geologically recent Pb loss. This overall behavior has

  7. Design for reliability information and computer-based systems

    CERN Document Server

    Bauer, Eric

    2010-01-01

    "System reliability, availability and robustness are often not well understood by system architects, engineers and developers. They often don't understand what drives customer's availability expectations, how to frame verifiable availability/robustness requirements, how to manage and budget availability/robustness, how to methodically architect and design systems that meet robustness requirements, and so on. The book takes a very pragmatic approach of framing reliability and robustness as a functional aspect of a system so that architects, designers, developers and testers can address it as a concrete, functional attribute of a system, rather than an abstract, non-functional notion"--Provided by publisher.

  8. Technical information report: Plasma melter operation, reliability, and maintenance analysis

    International Nuclear Information System (INIS)

    Hendrickson, D.W.

    1995-01-01

    This document provides a technical report on the operability, reliability, and maintenance of a plasma melter for low-level waste vitrification, in support of the Hanford Tank Waste Remediation System (TWRS) Low-Level Waste (LLW) Vitrification Program. A process description is provided that minimizes maintenance and downtime and includes material and energy balances, equipment sizes and arrangement, startup/operation/maintenance/shutdown cycle descriptions, and the basis for scale-up to a 200 metric ton/day production facility. Operational requirements are provided including utilities, feeds, labor, and maintenance. Equipment reliability estimates and maintenance requirements are provided, including a list of failure modes, responses, and consequences.

  9. SU-D-206-04: Iterative CBCT Scatter Shading Correction Without Prior Information

    International Nuclear Information System (INIS)

    Bai, Y; Wu, P; Mao, T; Gong, S; Wang, J; Niu, T; Sheng, K; Xie, Y

    2016-01-01

    Purpose: To estimate and remove the scatter contamination in the acquired projection of cone-beam CT (CBCT), to suppress the shading artifacts and improve the image quality without prior information. Methods: The uncorrected CBCT images containing shading artifacts are reconstructed by applying the standard FDK algorithm on CBCT raw projections. The uncorrected image is then segmented to generate an initial template image. To estimate the scatter signal, the differences are calculated by subtracting the simulated projections of the template image from the raw projections. Since scatter signals are dominantly continuous and low-frequency in the projection domain, they are estimated by low-pass filtering the difference signals and subtracted from the raw CBCT projections to achieve the scatter correction. Finally, the corrected CBCT image is reconstructed from the corrected projection data. Since an accurate template image is not readily segmented from the uncorrected CBCT image, the proposed scheme is iterated until the produced template is no longer altered. Results: The proposed scheme is evaluated on the Catphan©600 phantom data and CBCT images acquired from a pelvis patient. The results show that shading artifacts have been effectively suppressed by the proposed method. Using multi-detector CT (MDCT) images as reference, quantitative analysis is performed to measure the quality of corrected images. Compared to images without correction, the proposed method reduces the overall CT number error from over 200 HU to less than 50 HU and increases the spatial uniformity. Conclusion: An iterative strategy that does not rely on prior information is proposed in this work to remove the shading artifacts due to scatter contamination in the projection domain. The method is evaluated in phantom and patient studies and the results show that the image quality is remarkably improved. The proposed method is efficient and practical to address the poor image quality issue of CBCT

  10. SU-D-206-04: Iterative CBCT Scatter Shading Correction Without Prior Information

    Energy Technology Data Exchange (ETDEWEB)

    Bai, Y; Wu, P; Mao, T; Gong, S; Wang, J; Niu, T [Sir Run Run Shaw Hospital, Zhejiang University School of Medicine, Institute of Translational Medicine, Zhejiang University, Hangzhou, Zhejiang (China); Sheng, K [Department of Radiation Oncology, University of California, Los Angeles, School of Medicine, Los Angeles, CA (United States); Xie, Y [Institute of Biomedical and Health Engineering, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, Guangdong (China)

    2016-06-15

    Purpose: To estimate and remove the scatter contamination in the acquired projection of cone-beam CT (CBCT), to suppress the shading artifacts and improve the image quality without prior information. Methods: The uncorrected CBCT images containing shading artifacts are reconstructed by applying the standard FDK algorithm on CBCT raw projections. The uncorrected image is then segmented to generate an initial template image. To estimate the scatter signal, the differences are calculated by subtracting the simulated projections of the template image from the raw projections. Since scatter signals are dominantly continuous and low-frequency in the projection domain, they are estimated by low-pass filtering the difference signals and subtracted from the raw CBCT projections to achieve the scatter correction. Finally, the corrected CBCT image is reconstructed from the corrected projection data. Since an accurate template image is not readily segmented from the uncorrected CBCT image, the proposed scheme is iterated until the produced template is no longer altered. Results: The proposed scheme is evaluated on the Catphan©600 phantom data and CBCT images acquired from a pelvis patient. The results show that shading artifacts have been effectively suppressed by the proposed method. Using multi-detector CT (MDCT) images as reference, quantitative analysis is performed to measure the quality of corrected images. Compared to images without correction, the proposed method reduces the overall CT number error from over 200 HU to less than 50 HU and increases the spatial uniformity. Conclusion: An iterative strategy that does not rely on prior information is proposed in this work to remove the shading artifacts due to scatter contamination in the projection domain. The method is evaluated in phantom and patient studies and the results show that the image quality is remarkably improved. The proposed method is efficient and practical to address the poor image quality issue of CBCT
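
    A 1D toy sketch of the correction loop described above (our illustration, not the authors' code): scatter is modeled as a smooth additive signal in the projection and estimated by low-pass filtering the difference between the raw projection and the projection simulated from a template. All signals below are synthetic.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter1d

    rng = np.random.default_rng(4)
    x = np.linspace(0, 1, 512)

    primary = np.exp(-((x - 0.5) / 0.12) ** 2) * 5.0      # "true" projection
    scatter = 0.6 + 0.4 * np.sin(2 * np.pi * x)           # smooth contamination
    raw = primary + scatter + 0.02 * rng.normal(size=x.size)

    # Imperfect template projection, standing in for the segmented image's forward projection.
    template = primary * (1 + 0.05 * rng.normal(size=x.size))

    # Low-pass filter the difference: high-frequency anatomy largely cancels,
    # while the smooth scatter component survives.
    scatter_est = gaussian_filter1d(raw - template, sigma=40)
    corrected = raw - scatter_est

    print("RMS error before:", np.sqrt(np.mean((raw - primary) ** 2)))
    print("RMS error after: ", np.sqrt(np.mean((corrected - primary) ** 2)))
    ```

    In the actual method this estimate feeds back into reconstruction and re-segmentation until the template stabilizes; the sketch shows only a single pass of the scatter-estimation step.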

  11. Models of Information Security of Highly Reliable Computing Systems

    Directory of Open Access Journals (Sweden)

    Vsevolod Ozirisovich Chukanov

    2016-03-01

    Full Text Available Methods of combined redundancy are considered. Reliability models of systems that account for the restoration and preventive-maintenance parameters of the system's blocks are described. Expressions are given for the average number of preventive-maintenance actions and for the availability coefficient of the system's blocks.

  12. Brain Stroke Detection by Microwaves Using Prior Information from Clinical Databases

    Directory of Open Access Journals (Sweden)

    Natalia Irishina

    2013-01-01

    Full Text Available Microwave tomographic imaging is an inexpensive, noninvasive modality for reconstructing the dielectric properties of media, which can be utilized as a screening method in clinical applications such as breast cancer and brain stroke detection. For breast cancer detection, the iterative algorithm of structural inversion with level sets provides well-defined boundaries and incorporates an intrinsic regularization, which permits the discovery of small lesions. However, in the case of brain lesions, the inverse problem is much more difficult due to the skull, which causes low microwave penetration and highly noisy data. In addition, cerebrospinal fluid has dielectric properties similar to those of blood, which makes the inversion more complicated. Nevertheless, the contrast in the conductivity and permittivity values in this situation is significant due to the high dielectric values of blood compared to those of the surrounding grey and white matter tissues. We show that using brain MRI images as prior information about the brain's configuration, along with known brain dielectric properties and the intrinsic regularization by structural inversion, allows successful and rapid stroke detection even in difficult cases. The method has been applied to 2D slices created from a database of 3D real MRI phantom images to effectively detect lesions larger than 2.5 × 10⁻² m in diameter.

  13. Forecasting Long-Term Crude Oil Prices Using a Bayesian Model with Informative Priors

    Directory of Open Access Journals (Sweden)

    Chul-Yong Lee

    2017-01-01

    Full Text Available In the long term, crude oil prices may impact the economic stability and sustainability of many countries, especially those depending on oil imports. This study thus suggests an alternative model for accurately forecasting oil prices while reflecting structural changes in the oil market by using a Bayesian approach. The prior information is derived from the recent and expected structure of the oil market, using a subjective approach, and then updated with available market data. The model includes as independent variables factors affecting oil prices, such as world oil demand and supply, the financial situation, upstream costs, and geopolitical events. To test the model's forecasting performance, it is compared with other models, including a linear ordinary least squares model and a neural network model. The proposed model outperforms the others on the forecasting performance test, even though the neural network model shows the best results on a goodness-of-fit test. The results show that the crude oil price is estimated to increase to $169.3/Bbl by 2040.
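
    The updating of a subjective prior with market data follows the standard conjugate scheme for normal linear regression, sketched below as a generic Python illustration rather than the paper's model; the rows of X are assumed to hold driver variables such as demand, supply and upstream costs, and m_prior encodes the expected market structure:

        import numpy as np

        def posterior_coefficients(X, y, m_prior, V_prior_inv, noise_var):
            """Conjugate update for y = X w + noise with prior w ~ N(m_prior, V_prior)."""
            A = V_prior_inv + X.T @ X / noise_var
            b = V_prior_inv @ m_prior + X.T @ y / noise_var
            m_post = np.linalg.solve(A, b)
            return m_post, np.linalg.inv(A)   # posterior mean and covariance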

  14. Our Commitment to Reliable Health and Medical Information

    Science.gov (United States)

    ... the intent of a website to publish transparent information. The transparency of the website will improve the usefulness and objectivity of the information and the publication of correct data. The HONcode ...

  15. Improving Reliability of Information Leakage Detection and Prevention Systems

    Directory of Open Access Journals (Sweden)

    A. V. Mamaev

    2011-03-01

    Full Text Available The problem of protection against deliberate information leaks is one of the most difficult. Integrated systems for protecting information against insider threats have a serious drawback: exploiting it, an offender can steal information from a workstation without authorization.

  16. A New Method of Reliability Evaluation Based on Wavelet Information Entropy for Equipment Condition Identification

    International Nuclear Information System (INIS)

    He, Z J; Zhang, X L; Chen, X F

    2012-01-01

    Aiming at the reliability evaluation of condition identification of mechanical equipment, it is necessary to analyze condition monitoring information. A new method of reliability evaluation based on wavelet information entropy extracted from vibration signals of mechanical equipment is proposed. The method is quite different from traditional reliability evaluation models, which depend on probability-statistics analysis of large samples of data. The vibration signals of mechanical equipment were analyzed by means of the second generation wavelet package (SGWP). We take the relative energy in each frequency band of the decomposed signal, which equals a percentage of the whole signal energy, as a probability. A normalized information entropy (IE) is obtained from these relative energies to describe the uncertainty of a system instead of a probability. The reliability degree is then transformed from the normalized wavelet information entropy. A successful application has been achieved in evaluating the assembly quality reliability for a kind of dismountable disk-drum aero-engine. The reliability degree indicates the assembly quality satisfactorily.
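
    The entropy construction lends itself to a short sketch. The Python code below is illustrative only, substituting standard wavelet packets from PyWavelets for the second generation wavelet package used by the authors; band energies are normalized into a probability vector and the Shannon entropy is scaled to [0, 1]:

        import numpy as np
        import pywt

        def normalized_wavelet_entropy(signal, wavelet="db4", level=3):
            wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
            bands = [node.data for node in wp.get_level(level, order="freq")]
            energies = np.array([np.sum(np.square(b)) for b in bands])
            p = energies / energies.sum()        # relative band energies as probabilities
            p = p[p > 0]                         # guard against log(0)
            entropy = -np.sum(p * np.log(p))
            return entropy / np.log(len(bands))  # normalize by the maximum entropy

        # One plausible reliability degree: low spectral uncertainty -> high reliability.
        # reliability = 1.0 - normalized_wavelet_entropy(vibration_signal)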

  17. Reliability, Readability and Quality of Online Information about Femoroacetabular Impingement

    Directory of Open Access Journals (Sweden)

    Fatih Küçükdurmaz

    2015-07-01

    Conclusion: According to our results, the websites intended to attract patients searching for information regarding femoroacetabular impingement provide a highly accessible, readable information source, but do not appear to apply rigor comparable to that of the scientific literature or healthcare practitioner websites in matters such as citing sources for information, supplying methodology and including a publication date. This indicates that while these resources are easily accessed by patients, there is potential for them to be a source of misinformation.

  18. A novel technique to incorporate structural prior information into multi-modal tomographic reconstruction

    International Nuclear Information System (INIS)

    Kazantsev, Daniil; Dobson, Katherine J; Withers, Philip J; Lee, Peter D; Ourselin, Sébastien; Arridge, Simon R; Hutton, Brian F; Kaestner, Anders P; Lionheart, William R B

    2014-01-01

    There has been a rapid expansion of multi-modal imaging techniques in tomography. In biomedical imaging, patients are now regularly imaged using both single photon emission computed tomography (SPECT) and x-ray computed tomography (CT), or using both positron emission tomography and magnetic resonance imaging (MRI). In non-destructive testing of materials, both neutron CT (NCT) and x-ray CT are widely applied to investigate the inner structure of materials or to track the dynamics of physical processes. The potential benefits from combining modalities have led to increased interest in iterative reconstruction algorithms that can utilize the data from more than one imaging mode simultaneously. We present a new regularization term in iterative reconstruction that enables information from one imaging modality to be used as a structural prior to improve resolution of the second modality. The regularization term is based on a modified anisotropic tensor diffusion filter that has shape-adapted smoothing properties. By considering the underlying orientations of normal and tangential vector fields for two co-registered images, the diffusion flux is rotated and scaled adaptively to image features. The images can have different greyscale values and different spatial resolutions. The proposed approach is particularly good at isolating oriented features in images, which are important for medical and materials science applications. By enhancing the edges it enables both easy identification and volume fraction measurements, aiding the segmentation algorithms used for quantification. The approach is tested on a standard denoising and deblurring image recovery problem, and then applied to 2D and 3D reconstruction problems, thereby highlighting the capabilities of the algorithm. Using synthetic data from SPECT co-registered with MRI, and real NCT data co-registered with x-ray CT, we show how the method can be used across a range of imaging modalities. (paper)
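
    The shape-adapted smoothing can be made concrete with a small sketch. The following Python fragment illustrates the general idea rather than the authors' exact filter: per pixel, a 2x2 diffusion tensor is assembled from the unit gradient of the co-registered prior image, so that flux across strong prior edges is attenuated while smoothing along them is preserved (gamma and the edge-strength weighting are assumed choices):

        import numpy as np

        def structural_diffusion_tensor(prior, gamma=0.9, eps=1e-6):
            gy, gx = np.gradient(prior.astype(float))
            mag = np.hypot(gx, gy) + eps
            nx, ny = gx / mag, gy / mag                  # unit normal to prior edges
            w = mag**2 / (mag**2 + np.median(mag)**2)    # edge strength in [0, 1)
            # D = I - gamma * w * n n^T: reduce diffusion across prior edges,
            # leave diffusion along them (tangential direction) untouched
            D = np.empty(prior.shape + (2, 2))
            D[..., 0, 0] = 1.0 - gamma * w * nx * nx
            D[..., 0, 1] = D[..., 1, 0] = -gamma * w * nx * ny
            D[..., 1, 1] = 1.0 - gamma * w * ny * ny
            return D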

  19. A reliable information management for real-time systems

    International Nuclear Information System (INIS)

    Nishihara, Takuo; Tomita, Seiji

    1995-01-01

    In this paper, we propose a system configuration suitable for hard real-time systems in which the integrity and durability of information are important. On most hard real-time systems, where response-time constraints are critical, the data which programs access are volatile and may be lost in case the system goes down. But for some real-time systems, e.g., value-added intelligent network (IN) systems, the integrity and durability of the stored data are very important. We propose a distributed system configuration for such hard real-time systems, comprised of service control modules and data management modules. The service control modules process transactions and responses based on deadline control, and the data management modules handle the stored data based on information recovery schemes well-established in fault-tolerant real-time systems. (author)

  20. Fault-tolerant search algorithms reliable computation with unreliable information

    CERN Document Server

    Cicalese, Ferdinando

    2013-01-01

    Why a book on fault-tolerant search algorithms? Searching is one of the fundamental problems in computer science. Time and again algorithmic and combinatorial issues originally studied in the context of search find application in the most diverse areas of computer science and discrete mathematics. On the other hand, fault-tolerance is a necessary ingredient of computing. Due to their inherent complexity, information systems are naturally prone to errors, which may appear at any level - as imprecisions in the data, bugs in the software, or transient or permanent hardware failures. This book pr

  1. BAYESIAN ANALYSIS FOR THE PAIRED COMPARISON MODEL WITH ORDER EFFECTS (USING NON-INFORMATIVE PRIORS)

    Directory of Open Access Journals (Sweden)

    Ghausia Masood Gilani

    2008-07-01

    Full Text Available Sometimes it may be difficult for a panelist to rank or compare more than two objects or treatments at the same time. For this reason, the paired comparison method is used. In this study, the Davidson and Beaver (1977) model for paired comparisons with order effects is analyzed through the Bayesian approach. For this purpose, the posterior means and the posterior modes are compared using noninformative priors.

  2. Should a reliable information processor be chaotic (brain models)

    Energy Technology Data Exchange (ETDEWEB)

    Nicolis, J S

    1982-01-01

    Brain-like structures have evolved performing signal processing, initially by minimizing tracking errors on a competitive basis. Such systems are highly complex and at the same time notoriously disordered. The functional trace of the cerebral cortex of the human brain is a good example. The electroencephalogram (EEG) appears particularly fragmented during the execution of mental tasks, as well as during the recurrent episodes of REM sleep. A stochastically regular or a highly synchronized EEG, on the other hand, characterises a drowsy (relaxing) or epileptic subject, respectively, and indicates, in both cases, a very incompetent information processor. The author suggests that such behavioral changeovers are produced via bifurcations which trigger the thalamocortical nonlinear pacemaking oscillator to switch from an unstable limit cycle to a strange attractor regime (i.e. to chaos), or vice versa. This analysis aims to show that the EEG's characteristics are not accidental but inevitable and even necessary and, therefore, functionally significant. 25 references.

  3. Extracting information from an ensemble of GCMs to reliably assess future global runoff change

    NARCIS (Netherlands)

    Sperna Weiland, F.C.; Beek, L.P.H. van; Weerts, A.H.; Bierkens, M.F.P.

    2011-01-01

    Future runoff projections derived from different global climate models (GCMs) show large differences. Therefore, within this study, the information from multiple GCMs has been combined to better assess hydrological changes. For projections of precipitation and temperature the Reliability ensemble

  4. Effectiveness of different approaches to disseminating traveler information on travel time reliability. [supporting datasets

    Science.gov (United States)

    2013-11-30

    Travel time reliability information includes static data about traffic speeds or trip times that capture historic variations from day to day, and it can help individuals understand the level of variation in traffic. Unlike real-time travel time infor...

  5. The utilisation of virtual images in patient information giving sessions for prostate cancer patients prior to radiotherapy

    International Nuclear Information System (INIS)

    Stewart-Lord, A.; Brown, M.; Noor, S.; Cook, J.; Jallow, O.

    2016-01-01

    The aim of the study was to explore prostate patients' perceptions of a Virtual Environment for Radiotherapy Training (VERT) as an information-giving resource prior to radiotherapy delivery. A survey design was used to determine the level of knowledge of those patients who attended VERT for a pre-treatment talk and to identify the benefits and limitations of using VERT as a pre-treatment information-giving resource. Participants were invited to attend a VERT patient information session four weeks prior to their planning CT scan, and then to complete a questionnaire two weeks after the start of radiotherapy treatment. A sample of n = 38 patients was recruited over a five-month data collection period. Results showed that patients' perceptions of the use of VERT as an information-giving tool prior to radiotherapy treatment were very positive. The sessions enabled patients to understand the potential impact on treatment volumes if the internal organ shape and location differed from that originally planned, enabling them to comply with radiotherapy treatment instructions. Additional key findings demonstrated excellent levels of communication associated with the use of VERT, emphasising the need for future patient preparation strategies to consider the use of virtual technology. - Highlights: • VERT pre-treatment information sessions were very helpful to patients. • Patients had a better understanding of what to expect during treatment. • The importance of following bowel and bladder treatment preparation was made clear. • The session helped to reduce patient anxiety and stress associated with treatment.

  6. Germany: INIS — 45 years of Reliable Nuclear Energy Information

    International Nuclear Information System (INIS)

    Rehme, Silke; Eck, Sabrina; Mutschelknauss, Michael

    2015-01-01

    other European input centers, has contributed to the comprehensiveness of the INIS database. The INIS Working Groups, dealing with topics related to the database and its scientific, technical, and strategic enhancements, allow members the most direct influence on INIS developments. From the very start, Germany has been an active member of the Working Group establishing the Guidelines for Standardized Entry of Corporate Bodies, published under this title as Issue 21 of the INIS Reference Series. Among others, FIZ Karlsruhe was also a committed member of the Working Group responsible for merging the thesaurus and the classification of the two information systems, INIS and ETDE, and contributed to the realization of the Multilingual Thesaurus

  7. A Reliable Measure of Information Security Awareness and the Identification of Bias in Responses

    Directory of Open Access Journals (Sweden)

    Agata McCormac

    2017-11-01

    Full Text Available The Human Aspects of Information Security Questionnaire (HAIS-Q) is designed to measure Information Security Awareness. More specifically, the tool measures an individual's knowledge, attitude, and self-reported behaviour relating to information security in the workplace. This paper reports on the reliability of the HAIS-Q, including test-retest reliability and internal consistency. The paper also assesses the reliability of three preliminary over-claiming items, designed specifically to complement the HAIS-Q and identify those individuals who provide socially desirable responses. A total of 197 working Australians completed two iterations of the HAIS-Q and the over-claiming items, approximately 4 weeks apart. Results of the analysis showed that the HAIS-Q was externally reliable and internally consistent. Therefore, the HAIS-Q can be used to reliably measure information security awareness. Reliability testing on the preliminary over-claiming items was not as robust, and further development is required and recommended. These findings mean that organisations can confidently use the HAIS-Q not only to measure the current state of employee information security awareness within their organisation, but also to measure the effectiveness and impacts of training interventions, information security awareness programs and campaigns. The influence of cultural changes and the effect of security incidents can also be assessed.

  8. Complex method to calculate objective assessments of information systems protection to improve expert assessments reliability

    Science.gov (United States)

    Abdenov, A. Zh; Trushin, V. A.; Abdenova, G. A.

    2018-01-01

    The paper considers how to populate the relevant SIEM nodes with calculated objective assessments in order to improve the reliability of subjective expert assessments. The proposed methodology is necessary for the most accurate security risk assessment of information systems. This technique is also intended for establishing real-time operational information protection in enterprise information systems. Risk calculations are based on objective estimates of the probabilities that adverse events occur and on predictions of the magnitude of damage from information security violations. Calculations of objective assessments are necessary to increase the reliability of the proposed expert assessments.

  9. 78 FR 65670 - Agency Information Collection Activities; Proposed Collection; Comment Request; Prior Notice of...

    Science.gov (United States)

    2013-11-01

    ... information required in a request for review. In the event that we place an article of food under hold under... of Dockets Management (HFA- 305), Food and Drug Administration, 5630 Fishers Lane, Rm. 1061... approval from the Office of Management and Budget (OMB) for each collection of information they conduct or...

  10. Path Planning and Navigation for Mobile Robots in a Hybrid Sensor Network without Prior Location Information

    Directory of Open Access Journals (Sweden)

    Zheng Zhang

    2013-03-01

    Full Text Available In a hybrid wireless sensor network with mobile and static nodes, which have no prior geographical knowledge, successful navigation for mobile robots is one of the main challenges. In this paper, we propose two novel navigation algorithms for outdoor environments, namely the RAC and the IMAP algorithms, which permit robots to travel from one static node to another along a planned path in the sensor field. Using these, the robot can navigate without the help of a map, GPS or extra sensor modules, using only the received signal strength indication (RSSI) and odometry. Therefore, our algorithms have the advantage of being cost-effective. In addition, a path planning algorithm to schedule mobile robots' travelling paths is presented, which focuses on shorter distances and robust paths for robots by considering the RSSI-distance characteristics. The simulations and experiments conducted with an autonomous mobile robot show the effectiveness of the proposed algorithms in an outdoor environment.
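
    Since navigation relies only on RSSI and odometry, the RSSI-distance characteristics are central. A common way to model them, shown here as an illustrative Python sketch rather than the paper's calibration, is the log-distance path-loss model; the reference RSSI at 1 m and the path-loss exponent are assumed values that would be fitted in the field:

        def rssi_to_distance(rssi_dbm, rssi_at_1m=-40.0, path_loss_exp=2.5):
            """Invert RSSI(d) = RSSI(1 m) - 10 * n * log10(d) for the distance d."""
            return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * path_loss_exp))

    A planner can then prefer hops between nodes whose estimated distances stay in the range where this inversion is least noisy, which is one reading of the "robust paths" criterion mentioned above.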

  11. Informed consent prior to coronary angiography in a real world scenario: what do patients remember?

    Directory of Open Access Journals (Sweden)

    Aslihan Eran

    Full Text Available BACKGROUND: Patients' informed consent is legally essential before elective invasive cardiac angiography (CA) and subsequent intervention can be done. It is unknown to what extent patients can remember previous detailed information given by a specially trained doctor in an optimal scenario as compared to standard care. METHODOLOGY/PRINCIPAL FINDINGS: In this prospective cohort study 150 consecutive in-patients and 50 out-patients were included before elective CA was initiated. The informed consent was provided and documented for in-patients by trained and instructed physicians the day before CA. In contrast, out-patients received standard information from different, untrained physicians, who did not know about this investigation. All patients had to sign a form stating that enough information had been given and all questions had been answered sufficiently. One hour before CA, an assessment of the patients' knowledge about CA was performed by another independent physician using a standard point-by-point questionnaire. The supplied information was composed of 12 potential complications, 3 general, 4 periprocedural and 4 procedural aspects. 95% of the patients felt that they had been well and sufficiently informed. Less than half of the potential complications could be remembered by the patients, and more patients could remember less serious than life-threatening complications (27.9±8.8% vs. 47.1±11.0%; p<0.001). Even obvious complications like local bleeding could not be remembered by 35% of in-patients and 36% of out-patients (p = 0.87). Surprisingly, there were only a few knowledge differences between in- and out-patients. CONCLUSIONS: The knowledge about CA of patients is vague when they give their informed consent. Even structured information given by a specially trained physician did not increase this knowledge.

  12. Adaptive information-theoretic bounded rational decision-making with parametric priors

    OpenAIRE

    Grau-Moya, Jordi; Braun, Daniel A.

    2015-01-01

    Deviations from rational decision-making due to limited computational resources have been studied in the field of bounded rationality, originally proposed by Herbert Simon. There have been a number of different approaches to model bounded rationality ranging from optimality principles to heuristics. Here we take an information-theoretic approach to bounded rationality, where information-processing costs are measured by the relative entropy between a posterior decision strategy and a given fix...

  13. Social Information Is Integrated into Value and Confidence Judgments According to Its Reliability.

    Science.gov (United States)

    De Martino, Benedetto; Bobadilla-Suarez, Sebastian; Nouguchi, Takao; Sharot, Tali; Love, Bradley C

    2017-06-21

    How much we like something, whether it be a bottle of wine or a new film, is affected by the opinions of others. However, the social information that we receive can be contradictory and vary in its reliability. Here, we tested whether the brain incorporates these statistics when judging value and confidence. Participants provided value judgments about consumer goods in the presence of online reviews. We found that participants updated their initial value and confidence judgments in a Bayesian fashion, taking into account both the uncertainty of their initial beliefs and the reliability of the social information. Activity in dorsomedial prefrontal cortex tracked the degree of belief update. Analogous to how lower-level perceptual information is integrated, we found that the human brain integrates social information according to its reliability when judging value and confidence. SIGNIFICANCE STATEMENT The field of perceptual decision making has shown that the sensory system integrates different sources of information according to their respective reliability, as predicted by a Bayesian inference scheme. In this work, we hypothesized that a similar coding scheme is implemented by the human brain to process social signals and guide complex, value-based decisions. We provide experimental evidence that the human prefrontal cortex's activity is consistent with a Bayesian computation that integrates social information that differs in reliability and that this integration affects the neural representation of value and confidence. Copyright © 2017 De Martino et al.
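
    The reliability-weighted updating reported here follows the standard Bayesian cue-combination scheme for Gaussian beliefs, which can be written in a few lines. This is a generic Python illustration of the computation, not the authors' analysis code; variances play the role of inverse reliability:

        def integrate_social_info(mu0, var0, mu_s, var_s):
            """Combine an initial value judgment (mu0, var0) with social
            information (mu_s, var_s); the less variable source gets more weight."""
            precision = 1.0 / var0 + 1.0 / var_s
            mu_post = (mu0 / var0 + mu_s / var_s) / precision
            var_post = 1.0 / precision   # lower posterior variance = higher confidence
            return mu_post, var_post

        # e.g. a reliable review (small var_s) pulls the value estimate strongly:
        # integrate_social_info(mu0=5.0, var0=4.0, mu_s=8.0, var_s=1.0) -> (7.4, 0.8)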

  14. An evaluation of internet use by neurosurgery patients prior to lumbar disc surgery and of information available on internet.

    Science.gov (United States)

    Atci, Ibrahim Burak; Yilmaz, Hakan; Kocaman, Umit; Samanci, Mustafa Yavuz

    2017-07-01

    The aim of this study was to evaluate the Internet use of a group of lumbar disc surgery candidates in order to determine the rate of Internet searches by the patients on their disorders and, more importantly, the reliability of the accessed websites. Fifty patients who were scheduled for lumbar disc surgery were divided into 2 groups, namely patients who accepted the surgery at the first offer and those who wanted to think it over. Educational level information was obtained and patients were asked whether they had searched their disorder and the offered surgery on the Internet. Then, a questionnaire was administered and the reliability of the websites was evaluated. The first 30 websites on the first 3 pages of the Google® search engine, the most commonly used search engine in Turkey, were evaluated with the DISCERN® instrument. Of 50 patients, 33 (66%) had conducted a search for the surgery on the Internet. All university graduates, 88.2% of high school graduates, and 18.7% of primary-secondary school graduates had conducted an Internet search. The quality and reliability of the information was high (4.5 points) for 2 (7.1%) websites, moderate (2.3 points) for 6 websites (21.4%) and poor (1 point) for 20 websites (71.4%) as scored with the DISCERN® instrument. The mean DISCERN® score was 1.1 for websites of health-related institutions or healthcare news, 2.75 for personal websites of physicians and 2.5 for personal websites of non-physicians. The mean DISCERN® score of all websites was 1.5. Most of the patients undergoing lumbar disc surgery at our clinic had searched for information about the surgical procedure on the Internet. We found that 92.9% of the websites evaluated with the DISCERN® instrument had inadequate information, suggesting low-level reliability. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. 75 FR 12549 - Agency Information Collection Activities; Proposed Collection; Comment Request; Prior Notice of...

    Science.gov (United States)

    2010-03-16

    ... submission for review (Sec. 1.282(a)(1)(i), (a)(1)(ii), and (a)(1)(iii)). In the event that an article of... information required to be included in a request for review. In the event that an article of food has been... of Dockets Management (HFA-305), Food and Drug Administration, 5630 Fishers Lane, rm. 1061, Rockville...

  16. SIPPI: A Matlab toolbox for sampling the solution to inverse problems with complex prior information

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Cordua, Knud Skou; Caroline Looms, Majken

    2013-01-01

    on the solution. The combined state of information (i.e. the solution to the inverse problem) is a probability density function, typically referred to as the a posteriori probability density function. We present a generic toolbox for Matlab and Gnu Octave called SIPPI that implements a number of methods

  17. Improving biological understanding and complex trait prediction by integrating prior information in genomic feature models

    DEFF Research Database (Denmark)

    Edwards, Stefan McKinnon

    externally founded information, such as KEGG pathways, Gene Ontology gene sets, or genomic features, and estimate the joint contribution of the genetic variants within these sets to complex trait phenotypes. The analysis of complex trait phenotypes is hampered by the myriad of genes that control the trait...

  18. Investigation of reliability indicators of information analysis systems based on Markov’s absorbing chain model

    Science.gov (United States)

    Gilmanshin, I. R.; Kirpichnikov, A. P.

    2017-09-01

    As a result of studying the functioning algorithm of the module for early detection of excessive losses, it is proven that the module can be modeled using absorbing Markov chains. Of particular interest is the study of the probabilistic characteristics of this algorithm, in order to identify the relationship between the reliability indicators of individual elements, or the probabilities of occurrence of certain events, and the likelihood of transmitting reliable information. The relations identified during the analysis allow thresholds to be set for the reliability characteristics of the system components.
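
    For an absorbing Markov chain, the probability characteristics referred to above follow from the fundamental matrix. A minimal numerical sketch in Python, with hypothetical transition probabilities purely for illustration:

        import numpy as np

        # Canonical form: Q holds transient-to-transient transition probabilities,
        # R holds transient-to-absorbing ones (e.g. "reliable information delivered"
        # vs. "information lost").
        Q = np.array([[0.2, 0.6],
                      [0.1, 0.3]])
        R = np.array([[0.2, 0.0],
                      [0.3, 0.3]])

        N = np.linalg.inv(np.eye(Q.shape[0]) - Q)  # fundamental matrix: expected visits
        B = N @ R                                  # absorption probabilities by start state
        t = N.sum(axis=1)                          # expected steps before absorption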

  19. Modeling reliability measurement of interface on information system: Towards the forensic of rules

    Science.gov (United States)

    Nasution, M. K. M.; Sitompul, Darwin; Harahap, Marwan

    2018-02-01

    Today almost all machines depend on software, and a software and hardware system also depends on the rules that constitute the procedures for its use. If a procedure or program can be reliably characterized using the concepts of graphs, logic, and probability, then the strength of its rules can also be measured accordingly. Therefore, this paper initiates an enumeration model to measure the reliability of interfaces, based on the case of information systems governed by rules of use issued by the relevant agencies. The enumeration model is obtained from a software reliability calculation.

  20. Reliability assessment based on subjective inferences

    International Nuclear Information System (INIS)

    Ma Zhibo; Zhu Jianshi; Xu Naixin

    2003-01-01

    The reliability information which comes from subjective analysis is often an incomplete prior. This information can generally be assumed to exist in the form of either a stated prior mean of R (reliability) or a stated prior credibility interval on R. An efficient approach is developed to determine a complete beta prior distribution from the subjective information according to the principle of maximum entropy, and the reliability of a survival/failure product is assessed via Bayes' theorem. Numerical examples are presented to illustrate the methods
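
    A sketch of the workflow helps fix ideas. The Python fragment below is one simple reading, not the paper's exact maximum-entropy construction: a Beta(a, b) prior is pinned down from a stated prior mean and a stated lower credibility bound on R, then updated conjugately with observed survival/failure counts (all numbers are hypothetical):

        from scipy.optimize import brentq
        from scipy.stats import beta

        def beta_prior_from_mean_and_bound(mean, lower, cred=0.90):
            """Find Beta(a, b) with the stated prior mean whose lower one-sided
            `cred` credibility bound on R equals `lower`."""
            def gap(a):
                b_ = a * (1.0 - mean) / mean        # enforce the stated prior mean
                return beta.ppf(1.0 - cred, a, b_) - lower
            a = brentq(gap, 1e-3, 1e3)
            return a, a * (1.0 - mean) / mean

        a, b = beta_prior_from_mean_and_bound(mean=0.95, lower=0.85)
        s, f = 18, 2                                 # observed successes / failures
        posterior = beta(a + s, b + f)               # conjugate Bayes update
        print(posterior.mean(), posterior.ppf(0.05)) # point estimate and lower bound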

  1. Metal Artifact Reduction in X-ray Computed Tomography Using Computer-Aided Design Data of Implants as Prior Information.

    Science.gov (United States)

    Ruth, Veikko; Kolditz, Daniel; Steiding, Christian; Kalender, Willi A

    2017-06-01

    The performance of metal artifact reduction (MAR) methods in x-ray computed tomography (CT) suffers from incorrect identification of metallic implants in the artifact-affected volumetric images. The aim of this study was to investigate potential improvements of state-of-the-art MAR methods by using prior information on geometry and material of the implant. The influence of a novel prior knowledge-based segmentation (PS) compared with threshold-based segmentation (TS) on 2 MAR methods (linear interpolation [LI] and normalized-MAR [NORMAR]) was investigated. The segmentation is the initial step of both MAR methods. Prior knowledge-based segmentation uses 3-dimensional registered computer-aided design (CAD) data as prior knowledge to estimate the correct position and orientation of the metallic objects. Threshold-based segmentation uses an adaptive threshold to identify metal. Subsequently, for LI and NORMAR, the selected voxels are projected into the raw data domain to mark metal areas. Attenuation values in these areas are replaced by different interpolation schemes followed by a second reconstruction. Finally, the previously selected metal voxels are replaced by the metal voxels determined by PS or TS in the initial reconstruction. First, we investigated in an elaborate phantom study if the knowledge of the exact implant shape extracted from the CAD data provided by the manufacturer of the implant can improve the MAR result. Second, the leg of a human cadaver was scanned using a clinical CT system before and after the implantation of an artificial knee joint. The results were compared regarding segmentation accuracy, CT number accuracy, and the restoration of distorted structures. The use of PS improved the efficacy of LI and NORMAR compared with TS. Artifacts caused by insufficient segmentation were reduced, and additional information was made available within the projection data. The estimation of the implant shape was more exact and not dependent on a threshold
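
    The interpolation step that both LI and NORMAR build on is compact enough to sketch. The following Python fragment illustrates the classic linear-interpolation replacement in the raw data domain; metal_mask would come from forward-projecting the voxels selected by PS or TS, and this is a generic illustration rather than the study's implementation:

        import numpy as np

        def linear_interpolation_mar(sinogram, metal_mask):
            """Replace metal-trace bins in each projection row by linear
            interpolation between the neighboring unaffected detector bins."""
            corrected = sinogram.copy()
            bins = np.arange(sinogram.shape[1])
            for i in range(sinogram.shape[0]):       # loop over projection angles
                m = metal_mask[i]
                if m.any() and not m.all():
                    corrected[i, m] = np.interp(bins[m], bins[~m], sinogram[i, ~m])
            return corrected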

  2. Monte Carlo reservoir analysis combining seismic reflection data and informed priors

    DEFF Research Database (Denmark)

    Zunino, Andrea; Mosegaard, Klaus; Lange, Katrine

    2015-01-01

    Determination of a petroleum reservoir structure and rock bulk properties relies extensively on inference from reflection seismology. However, classic deterministic methods to invert seismic data for reservoir properties suffer from some limitations, among which are the difficulty of handling...... with the goal to directly infer the rock facies and porosity of a target reservoir zone. We thus combined a rock-physics model with seismic data in a single inversion algorithm. For large data sets, theMcMC method may become computationally impractical, so we relied on multiple-point-based a priori information...... to quantify geologically plausible models. We tested this methodology on a synthetic reservoir model. The solution of the inverse problem was then represented by a collection of facies and porosity reservoir models, which were samples of the posterior distribution. The final product included probability maps...

  3. Resolution improvement of brain PET images using prior information from MRI: clinical application on refractory epilepsy

    International Nuclear Information System (INIS)

    Silva-Rodríguez, Jesus; Tsoumpas, Charalampos; Aguiar, Pablo; Cortes, Julia; Urdaneta, Jesus Lopez

    2015-01-01

    An important drawback of clinical Positron Emission Tomography (PET) for the early diagnosis of neurological diseases is its low resolution. This is particularly important when evaluating diseases related to small hypometabolisms, such as epilepsy. In recent years, new hybrid systems combining PET with Magnetic Resonance (MR) have been increasingly used for several different clinical applications. One of the advantages of MR is the production of high spatial resolution images, and a potential application of PET-MR imaging is the improvement of PET resolution using MR information. A potential advantage of resolution recovery of PET images is the enhancement of contrast, delivering at the same time better detectability of small lesions or hypometabolic areas and more accurate quantification over these areas. Recently, Shidahara et al (2009) proposed a new method using wavelet transforms in order to produce PET images with higher resolution. We optimised Shidahara's method (SFS-RR) to take into account possible shortcomings of the particular clinical datasets, and applied it to a group of patients diagnosed with refractory epilepsy. FDG-PET and MRI images were acquired sequentially and then co-registered using software tools. A complete evaluation of the PET/MR images was performed before and after the correction, including different parameters related to PET quantification, such as atlas-based metabolism asymmetry coefficients and Statistical Parametric Mapping results compared against a database of 87 healthy subjects. Furthermore, an experienced physician analyzed the results of non-corrected and corrected images in order to evaluate improvements in detectability on visual inspection. Clinical outcome was used as a gold standard. SFS-RR demonstrated a positive impact on the clinical diagnosis of small hypometabolisms. New lesions were detected, providing additional clinically relevant information on visual inspection. SPM sensitivity for the detection of small

  4. Resolution improvement of brain PET images using prior information from MRI: clinical application on refractory epilepsy

    Energy Technology Data Exchange (ETDEWEB)

    Silva-Rodríguez, Jesus [Instituto de Investigaciones Sanitarias (IDIS), Santiago de Compostela (Spain); Tsoumpas, Charalampos [University of Leeds, Leeds (United Kingdom); Aguiar, Pablo; Cortes, Julia [Nuclear Medicine Department, University Hospital (CHUS), Santiago de Compostela (Spain); Urdaneta, Jesus Lopez [Instituto de Investigaciones Sanitarias (IDIS), Santiago de Compostela (Spain)

    2015-05-18

    An important drawback of clinical Positron Emission Tomography (PET) for the early diagnosis of neurological diseases is its low resolution. This is particularly important when evaluating diseases related to small hypometabolisms, such as epilepsy. In recent years, new hybrid systems combining PET with Magnetic Resonance (MR) have been increasingly used for several different clinical applications. One of the advantages of MR is the production of high spatial resolution images, and a potential application of PET-MR imaging is the improvement of PET resolution using MR information. A potential advantage of resolution recovery of PET images is the enhancement of contrast, delivering at the same time better detectability of small lesions or hypometabolic areas and more accurate quantification over these areas. Recently, Shidahara et al (2009) proposed a new method using wavelet transforms in order to produce PET images with higher resolution. We optimised Shidahara's method (SFS-RR) to take into account possible shortcomings of the particular clinical datasets, and applied it to a group of patients diagnosed with refractory epilepsy. FDG-PET and MRI images were acquired sequentially and then co-registered using software tools. A complete evaluation of the PET/MR images was performed before and after the correction, including different parameters related to PET quantification, such as atlas-based metabolism asymmetry coefficients and Statistical Parametric Mapping results compared against a database of 87 healthy subjects. Furthermore, an experienced physician analyzed the results of non-corrected and corrected images in order to evaluate improvements in detectability on visual inspection. Clinical outcome was used as a gold standard. SFS-RR demonstrated a positive impact on the clinical diagnosis of small hypometabolisms. New lesions were detected, providing additional clinically relevant information on visual inspection. SPM sensitivity for the detection of small

  5. The Impact of the Reliability of Teleinformation Systems on the Quality of Transmitted Information

    Directory of Open Access Journals (Sweden)

    Stawowy Marek

    2016-10-01

    Full Text Available The work describes the impact of reliability on information quality (IQ) for information and communication systems. One of the components of IQ is reliability, with properties such as relevance, accuracy, timeliness, completeness, consistency, adequacy, accessibility, credibility, and congruence. Each of these components of IQ is independent, and to properly estimate the value of IQ, one of the methods of modeling uncertainty must be used. In this article, we used a hybrid method that was developed jointly by one of the authors. This method is based on the mathematical theory of evidence known as Dempster-Shafer (DS) theory and on serial links of dependent elements in a hybrid measure named IQ(hyb).

  6. Distributed Information and Control system reliability enhancement by fog-computing concept application

    Science.gov (United States)

    Melnik, E. V.; Klimenko, A. B.; Ivanov, D. Ya

    2018-03-01

    The paper focuses on the information and control system reliability issue. The authors of the current paper propose a new complex approach to information and control system reliability enhancement through the application of fog-computing concept elements. The proposed approach consists of a complex of optimization problems to be solved. These problems are: estimation of the computational complexity which can be shifted to the edge of the network and the fog-layer, distribution of computations among the data processing elements, and distribution of computations among the sensors. The problems, as well as some simulation results and discussion, are formulated and presented within this paper.

  7. Using Information From Prior Satellite Scans to Improve Cloud Detection Near the Day-Night Terminator

    Science.gov (United States)

    Yost, Christopher R.; Minnis, Patrick; Trepte, Qing Z.; Palikonda, Rabindra; Ayers, Jeffrey K.; Spangenberg, Doulas A.

    2012-01-01

    With geostationary satellite data it is possible to have a continuous record of diurnal cycles of cloud properties for a large portion of the globe. Daytime cloud property retrieval algorithms are typically superior to nighttime algorithms because daytime methods utilize measurements of reflected solar radiation. However, reflected solar radiation is difficult to accurately model for high solar zenith angles, where the amount of incident radiation is small. Clear and cloudy scenes can exhibit very small differences in reflected radiation, and threshold-based cloud detection methods have more difficulty setting the proper thresholds for accurate cloud detection. Because top-of-atmosphere radiances are typically more accurately modeled outside the terminator region, information from previous scans can help guide cloud detection near the terminator. This paper presents an algorithm that uses cloud fraction and clear and cloudy infrared brightness temperatures from previous satellite scan times to improve the performance of a threshold-based cloud mask near the terminator. Comparisons of daytime, nighttime, and terminator cloud fraction derived from Geostationary Operational Environmental Satellite (GOES) radiance measurements show that the algorithm greatly reduces the number of false cloud detections and smooths the transition from the daytime to the nighttime cloud detection algorithm. Comparisons with the Geoscience Laser Altimeter System (GLAS) data show that using this algorithm decreases the number of false detections by approximately 20 percentage points.
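
    The use of remembered statistics can be illustrated with a toy decision rule. The Python sketch below is a simplified stand-in for the GOES algorithm, not its actual logic: a pixel's infrared brightness temperature is compared against the clear and cloudy values remembered from earlier, better-lit scans, with the prior cloud fraction breaking ties:

        def terminator_cloud_test(bt, prior_clear_bt, prior_cloudy_bt, prior_cloud_frac):
            """Classify a terminator pixel by nearness of its brightness
            temperature to the remembered clear/cloudy values."""
            d_clear = abs(bt - prior_clear_bt)
            d_cloudy = abs(bt - prior_cloudy_bt)
            if d_clear == d_cloudy:
                return prior_cloud_frac > 0.5   # fall back on the prior cloud fraction
            return d_cloudy < d_clear           # True means "cloudy"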

  8. ACCURACY AND RELIABILITY AS CRITERIA OF INFORMATIVENESS IN THE NEWS STORY

    Directory of Open Access Journals (Sweden)

    Melnikova Ekaterina Aleksandrovna

    2014-12-01

    Full Text Available The article clarifies the meaning of the terms accuracy and reliability as applied to the news story and offers a research approach for obtaining objective data that helps to verify the linguistic means by which accuracy and reliability are present in the informative structure of the text. The accuracy of the news story is defined as a high degree of relevance in reflecting an event through the language representation of its constituents; reliability is viewed as news story authenticity, evidenced by introducing citations and sources of information considered trustworthy into the text content. Basing the research on an event nominative density identification method, the author composed nominative charts of 115 news story texts collected at the web sites of the BBC and CNN media corporations; distinguished qualitative and quantitative markers of accuracy and reliability in the news story text; and confirmed that the accuracy of a news story is achieved through terminological clearness in nominating event constituents in the text, thematic binding between words, and the presence of onyms that help identify characteristics of the referent event in depth. The reliability of the text is established through eyewitness accounts, quotations, and references to sources considered trustworthy. A careful revision of the associations between accuracy, reliability, and informing strategies in digital news networks allowed the author to identify two variants of information delivery that differ in their communicative and pragmatic functions: developed (informing about the major and minor details of an event) and truncated (giving some details, thus raising interest in the event and urging the reader to open the full story).

  9. Bayesian Estimation of Two-Parameter Weibull Distribution Using Extension of Jeffreys' Prior Information with Three Loss Functions

    Directory of Open Access Journals (Sweden)

    Chris Bambey Guure

    2012-01-01

    Full Text Available The Weibull distribution has been observed to be one of the most useful distributions for modelling and analysing lifetime data in engineering, biology, and other fields. Studies have been carried out vigorously in the literature to determine the best method of estimating its parameters. Recently, much attention has been given to the Bayesian estimation approach for parameter estimation, which is in contention with other estimation methods. In this paper, we examine the performance of the maximum likelihood estimator and Bayesian estimators using an extension of Jeffreys' prior information with three loss functions, namely, the linear exponential (LINEX) loss, the general entropy loss, and the squared error loss function, for estimating the two-parameter Weibull failure time distribution. These methods are compared using the mean squared error through a simulation study with varying sample sizes. The results show that the Bayesian estimator using the extension of Jeffreys' prior under the linear exponential loss function in most cases gives the smallest mean squared error and absolute bias for both the scale parameter α and the shape parameter β for the given values of the extension of Jeffreys' prior.
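
    Given posterior draws of a parameter, the Bayes estimators under the three loss functions have closed forms, evaluated by the following Python sketch; the posterior samples themselves (e.g. from MCMC under the extended Jeffreys prior) are assumed to be available, and c and q are the usual loss-function constants:

        import numpy as np

        def bayes_estimates(theta_samples, c=1.0, q=1.0):
            """Bayes estimators from posterior samples of a parameter:
               squared error   -> posterior mean
               LINEX           -> -(1/c) * log E[exp(-c * theta)]
               general entropy -> (E[theta**(-q)])**(-1/q)"""
            se = theta_samples.mean()
            linex = -np.log(np.mean(np.exp(-c * theta_samples))) / c
            entropy = np.mean(theta_samples ** (-q)) ** (-1.0 / q)
            return se, linex, entropy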

  10. A Method to Increase Drivers' Trust in Collision Warning Systems Based on Reliability Information of Sensor

    Science.gov (United States)

    Tsutsumi, Shigeyoshi; Wada, Takahiro; Akita, Tokihiko; Doi, Shun'ichi

    Drivers' workload tends to increase when driving in complicated traffic environments, such as during a lane change. In such cases, a rear collision warning is effective in reducing cognitive workload. On the other hand, it has been pointed out that false or missing alarms caused by sensor errors decrease drivers' trust in the warning system, which can result in low efficiency of the system. Suppose that reliability information for the sensor is provided in real time. In this paper, we propose a new warning method to increase drivers' trust in the system, even with low sensor reliability, by utilizing the sensor reliability information. The effectiveness of the warning method is shown by driving simulator experiments.

  11. Epistemic Trust and Education: Effects of Informant Reliability on Student Learning of Decimal Concepts

    Science.gov (United States)

    Durkin, Kelley; Shafto, Patrick

    2016-01-01

    The epistemic trust literature emphasizes that children's evaluations of informants' trustworthiness affect learning, but there is no evidence that epistemic trust affects learning in academic domains. The current study investigated how reliability affects decimal learning. Fourth and fifth graders (N = 122; mean age = 10.1 years)…

  12. INNOVATIVE METHODS TO EVALUATE THE RELIABILITY OF INFORMATION CONSOLIDATED FINANCIAL STATEMENTS

    Directory of Open Access Journals (Sweden)

    Irina P. Kurochkina

    2014-01-01

    Full Text Available The article explores the possibility of using foreign innovative methods to assess the reliability of information in the consolidated financial statements of Russian companies. Recommendations are made for their adaptation and application in commercial organizations. Banish method indicators are implemented in one of the world's largest vertically integrated steel and mining companies. It is proposed that audit firms use these methods of assessing the reliability of information in the practical application of ISA.

  13. reliability reliability

    African Journals Online (AJOL)

    eobe

    RELIABILITY .... V , , given by the code of practice. However, checks must .... an optimization procedure over the failure domain F corresponding .... of Concrete Members based on Utility Theory, Technical ...

  14. Evaluating the use of prior information under different pacing conditions on aircraft inspection performance: The use of virtual reality technology

    Science.gov (United States)

    Bowling, Shannon Raye

    The aircraft maintenance industry is a complex system consisting of human and machine components; because of this, much emphasis has been placed on improving aircraft-inspection performance. One proven technique for improving inspection performance is the use of training. There are several strategies that have been implemented for training, one of which is feedforward information. The use of prior information (feedforward) is known to positively affect inspection performance. This information can consist of knowledge about defect characteristics (types, severity/criticality, and location) and the probability of occurrence. Although several studies have been conducted that demonstrate the usefulness of feedforward as a training strategy, there are certain research issues that need to be addressed. This study evaluates the effect of feedforward information in a simulated 3-dimensional environment through the use of virtual reality. A controlled study was conducted to evaluate the effectiveness of feedforward information in a simulated aircraft inspection environment. The study was conducted in two phases. The first phase evaluated the difference between general and detailed inspection at different pacing levels. The second phase evaluated the effect of feedforward information pertaining to severity, probability and location. Analyses of the results showed that subjects performed significantly better during detailed inspection than during general inspection. Pacing also had the effect of reducing performance for both general and detailed inspection. The study also found that as the level of feedforward information increases, performance also increases. In addition to evaluating performance measures, the study also evaluated process and subjective measures. It was found that process measures such as the number of fixation points, fixation groups, mean fixation duration, and percent area covered were all affected by the treatment levels. Analyses of the subjective

  15. Fog-computing concept usage as means to enhance information and control system reliability

    Science.gov (United States)

    Melnik, E. V.; Klimenko, A. B.; Ivanov, D. Ya

    2018-05-01

    This paper focuses on the reliability issue of information and control systems (ICS). The authors propose using elements of the fog-computing concept to enhance the reliability function. The key idea of fog-computing is to shift computations to the fog-layer of the network, and thus to decrease the workload of the communication environment and data processing components. As for ICS, the workload can also be distributed among sensors, actuators and network infrastructure facilities near the sources of data. The authors simulated typical workload distribution situations for the "traditional" ICS architecture and for one using fog-computing concept elements. The paper contains some models, selected simulation results and a conclusion about the prospects of fog-computing as a means to enhance ICS reliability.

  16. The Outcome and Assessment Information Set (OASIS): A Review of Validity and Reliability

    Science.gov (United States)

    O’CONNOR, MELISSA; DAVITT, JOAN K.

    2015-01-01

    The Outcome and Assessment Information Set (OASIS) is the patient-specific, standardized assessment used in Medicare home health care to plan care, determine reimbursement, and measure quality. Since its inception in 1999, there has been debate over the reliability and validity of the OASIS as a research tool and outcome measure. A systematic literature review of English-language articles identified 12 studies published in the last 10 years examining the validity and reliability of the OASIS. Empirical findings indicate the validity and reliability of the OASIS range from low to moderate but vary depending on the item studied. Limitations in the existing research include: nonrepresentative samples; inconsistencies in methods used, items tested, measurement, and statistical procedures; and the changes to the OASIS itself over time. The inconsistencies suggest that these results are tentative at best; additional research is needed to confirm the value of the OASIS for measuring patient outcomes, research, and quality improvement. PMID:23216513

  17. Creation of reliable relevance judgments in information retrieval systems evaluation experimentation through crowdsourcing: a review.

    Science.gov (United States)

    Samimi, Parnia; Ravana, Sri Devi

    2014-01-01

    Test collections are used to evaluate information retrieval systems in laboratory-based evaluation experimentation. In a classic setting, generating relevance judgments involves human assessors and is a costly and time consuming task. Researchers and practitioners are still challenged to perform reliable and low-cost evaluations of retrieval systems. Crowdsourcing, as a novel method of data acquisition, is broadly used in many research fields. It has been proven that crowdsourcing is an inexpensive and quick solution, as well as a reliable alternative, for creating relevance judgments. One of the crowdsourcing applications in IR is judging the relevancy of query-document pairs. In order to have a successful crowdsourcing experiment, the relevance judgment tasks should be designed precisely, with an emphasis on quality control. This paper explores the different factors that influence the accuracy of relevance judgments accomplished by workers and how to enhance the reliability of judgments in crowdsourcing experiments.

  18. Characterizing Volumetric Strain at Brady Hot Springs, Nevada, USA Using Geodetic Data, Numerical Models, and Prior Information

    Science.gov (United States)

    Reinisch, E. C.; Feigl, K. L.; Cardiff, M. A.; Morency, C.; Kreemer, C.; Akerley, J.

    2017-12-01

    Time-dependent deformation has been observed at Brady Hot Springs using data from the Global Positioning System (GPS) and interferometric synthetic aperture radar (InSAR) [e.g., Ali et al. 2016, http://dx.doi.org/10.1016/j.geothermics.2016.01.008]. We seek to determine the geophysical process governing the observed subsidence. As two end-member hypotheses, we consider thermal contraction and a decrease in pore fluid pressure. A decrease in temperature would cause contraction in the subsurface and subsidence at the surface. A decrease in pore fluid pressure would allow the volume of pores to shrink and also produce subsidence. To simulate these processes, we use a dislocation model that assumes uniform elastic properties in a half space [Okada, 1985]. The parameterization consists of many cubic volume elements (voxels), each of which contracts by closing its three mutually orthogonal bisecting square surfaces. Then we use linear inversion to solve for volumetric strain in each voxel given a measurement of range change. To differentiate between the two possible hypotheses, we use a Bayesian framework with geostatistical prior information. We perform inversion using each prior to decide if one leads to a more geophysically reasonable interpretation than the other. This work is part of a project entitled "Poroelastic Tomography by Adjoint Inverse Modeling of Data from Seismology, Geodesy, and Hydrology" and is supported by the Geothermal Technology Office of the U.S. Department of Energy [DE-EE0006760].
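
    The linear inversion with a geostatistical prior admits a compact closed form. The Python sketch below is a generic statement of the Gaussian maximum a posteriori estimate, not the project's production code; G maps voxel volumetric strains to range change, and the prior covariance carries the geostatistical assumptions:

        import numpy as np

        def map_linear_inversion(G, d, Cd_inv, Cm_inv, m_prior):
            """MAP solution of d = G m with Gaussian data errors (inverse
            covariance Cd_inv) and a Gaussian prior N(m_prior, Cm)."""
            A = G.T @ Cd_inv @ G + Cm_inv
            b = G.T @ Cd_inv @ (d - G @ m_prior)
            return m_prior + np.linalg.solve(A, b)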

  19. Use and perceptions of information among family physicians: sources considered accessible, relevant, and reliable.

    Science.gov (United States)

    Kosteniuk, Julie G; Morgan, Debra G; D'Arcy, Carl K

    2013-01-01

    The research determined (1) the information sources that family physicians (FPs) most commonly use to update their general medical knowledge and to make specific clinical decisions, and (2) the information sources FPs found to be most physically accessible, intellectually accessible (easy to understand), reliable (trustworthy), and relevant to their needs. A cross-sectional postal survey of 792 FPs and locum tenens, in full-time or part-time medical practice, currently practicing or on leave of absence in the Canadian province of Saskatchewan was conducted during the period of January to April 2008. Of 666 eligible physicians, 331 completed and returned surveys, resulting in a response rate of 49.7% (331/666). Medical textbooks and colleagues in the main patient care setting were the top 2 sources for the purpose of making specific clinical decisions. Medical textbooks were most frequently considered by FPs to be reliable (trustworthy), and colleagues in the main patient care setting were most physically accessible (easy to access). When making specific clinical decisions, FPs were most likely to use information from sources that they considered to be reliable and generally physically accessible, suggesting that FPs can best be supported by facilitating easy and convenient access to high-quality information.

  20. Interventions to assist health consumers to find reliable online health information: a comprehensive review.

    Directory of Open Access Journals (Sweden)

    Kenneth Lee

    BACKGROUND: Health information on the Internet is ubiquitous, and its use by health consumers prevalent. Finding and understanding relevant online health information, and determining content reliability, pose real challenges for many health consumers. PURPOSE: To identify the types of interventions that have been implemented to assist health consumers to find reliable online health information, and where possible, describe and compare the types of outcomes studied. DATA SOURCES: PubMed, PsycINFO, CINAHL Plus and Cochrane Library databases; WorldCat and Scirus 'gray literature' search engines; and manual review of reference lists of selected publications. STUDY SELECTION: Publications were selected by screening first the title and abstract, then the full text. DATA EXTRACTION: Seven publications met the inclusion criteria, and were summarized in a data extraction form. The form incorporated the PICOS (Population, Intervention, Comparators, Outcomes, and Study Design) Model. Two eligible gray literature papers were also reported. DATA SYNTHESIS: Relevant data from included studies were tabulated to enable descriptive comparison. A brief critique of each study was included in the tables. This review was unable to follow systematic review methods due to the paucity of research and humanistic interventions reported. LIMITATIONS: While extensive, the gray literature search may have had limited reach in some countries. The paucity of research on this topic limits conclusions that may be drawn. CONCLUSIONS: The few eligible studies predominantly adopted a didactic approach to assisting health consumers, whereby consumers were either taught how to find credible websites, or how to use the Internet. Common types of outcomes studied include knowledge and skills pertaining to Internet use and searching for reliable health information. These outcomes were predominantly self-assessed by participants. There is potential for further research to explore other avenues for

  1. Interventions to assist health consumers to find reliable online health information: a comprehensive review.

    Science.gov (United States)

    Lee, Kenneth; Hoti, Kreshnik; Hughes, Jeffery D; Emmerton, Lynne M

    2014-01-01

    Health information on the Internet is ubiquitous, and its use by health consumers prevalent. Finding and understanding relevant online health information, and determining content reliability, pose real challenges for many health consumers. To identify the types of interventions that have been implemented to assist health consumers to find reliable online health information, and where possible, describe and compare the types of outcomes studied. PubMed, PsycINFO, CINAHL Plus and Cochrane Library databases; WorldCat and Scirus 'gray literature' search engines; and manual review of reference lists of selected publications. Publications were selected by screening first the title and abstract, then the full text. Seven publications met the inclusion criteria, and were summarized in a data extraction form. The form incorporated the PICOS (Population, Intervention, Comparators, Outcomes, and Study Design) Model. Two eligible gray literature papers were also reported. Relevant data from included studies were tabulated to enable descriptive comparison. A brief critique of each study was included in the tables. This review was unable to follow systematic review methods due to the paucity of research and humanistic interventions reported. While extensive, the gray literature search may have had limited reach in some countries. The paucity of research on this topic limits conclusions that may be drawn. The few eligible studies predominantly adopted a didactic approach to assisting health consumers, whereby consumers were either taught how to find credible websites, or how to use the Internet. Common types of outcomes studied include knowledge and skills pertaining to Internet use and searching for reliable health information. These outcomes were predominantly self-assessed by participants. There is potential for further research to explore other avenues for assisting health consumers to find reliable online health information, and to assess outcomes via objective measures.

  2. Joint optimization of MIMO radar waveform and biased estimator with prior information in the presence of clutter

    Directory of Open Access Journals (Sweden)

    Liu Hongwei

    2011-01-01

    In this article, we consider the problem of jointly optimizing the multi-input multi-output (MIMO) radar waveform and a biased estimator with prior information on targets of interest in the presence of signal-dependent noise. A novel constrained biased Cramer-Rao bound (CRB)-based method is proposed to optimize the waveform covariance matrix (WCM) and the biased estimator such that the performance of parameter estimation can be improved. Under a simplifying assumption, the resulting nonlinear optimization problem is solved by resorting to a convex relaxation that belongs to the semidefinite programming (SDP) class. An optimal solution of the initial problem is then constructed through a suitable approximation to an optimal solution of the relaxed one (in a least squares (LS) sense). Numerical results show that the performance of parameter estimation can be improved considerably by the proposed method compared to uncorrelated waveforms.
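
    The snippet below is a generic sketch of the SDP class that such waveform-covariance problems relax into, not the paper's constrained biased-CRB program: it maximizes a linear proxy of Fisher information over the waveform covariance matrix R under a positive-semidefiniteness constraint and a total-power budget. The matrix A, the power budget, and the objective are assumptions.

```python
import cvxpy as cp
import numpy as np

n = 4
rng = np.random.default_rng(0)
M = rng.normal(size=(n, n))
A = M @ M.T + n * np.eye(n)            # stand-in Fisher-information weighting
R = cp.Variable((n, n), PSD=True)      # waveform covariance matrix (WCM)
P = 1.0                                # assumed total transmit power budget

# Illustrative SDP: maximize a linear proxy of Fisher information subject to
# positive semidefiniteness and a power constraint on the WCM.
problem = cp.Problem(cp.Maximize(cp.trace(A @ R)), [cp.trace(R) <= P])
problem.solve()
print(R.value)
```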

  3. Adapting Free, Prior, and Informed Consent (FPIC) to Local Contexts in REDD+: Lessons from Three Experiments in Vietnam

    Directory of Open Access Journals (Sweden)

    Thuy Thu Pham

    2015-07-01

    Free, prior, and informed consent (FPIC) is a means of ensuring that people's rights are respected when projects for reducing emissions from deforestation and forest degradation and enhancing forest carbon stocks (REDD+) are established in developing countries. This paper examines how FPIC has been applied in three projects in Vietnam and highlights two key lessons learnt. First, as human rights and democracy are seen as politically sensitive issues in Vietnam, FPIC is likely to be more accepted by the government if it is built upon the national legal framework on citizen rights. Applying FPIC in this context can ensure that both the government's and citizens' interests are achieved within the permitted political space. Second, FPIC activities should be seen as a learning process and designed based on local needs and preferences, with the accountability of facilitators, two-way and multiple communication strategies, flexibility, and collective action in mind.

  4. Toddlers favor communicatively presented information over statistical reliability in learning about artifacts.

    Directory of Open Access Journals (Sweden)

    Hanna Marno

    Observed associations between events can be validated by statistical information of reliability or by testament of communicative sources. We tested whether toddlers learn from their own observation of efficiency, assessed by statistical information on the reliability of interventions, or from communicatively presented demonstration, when these two potential types of evidence of the validity of interventions on a novel artifact are contrasted with each other. Eighteen-month-old infants observed two adults, one operating the artifact by a method that was more efficient (2/3 probability of success) than that of the other (1/3 probability of success). Compared to the Baseline condition, in which communicative signals were not employed, infants tended to choose the less reliable method to operate the artifact when this method was demonstrated in a communicative manner in the Experimental condition. This finding demonstrates that, in certain circumstances, communicative sanctioning of reliability may override statistical evidence for young learners. Such a bias can serve fast and efficient transmission of knowledge between generations.

  5. Supporting patients in obtaining and oncologists in providing evidence-based health-related quality of life information prior to and after esophageal cancer surgery

    NARCIS (Netherlands)

    Jacobs, M.

    2015-01-01

    The overall aim of this thesis was to support patients in obtaining and oncologists in providing evidence-based HRQL data prior to and following esophageal cancer surgery. This thesis is divided in two parts. In Part I, we addressed the information needs of esophageal cancer patients prior to and

  6. Reliability, Validity, Comparability and Practical Utility of Cybercrime-Related Data, Metrics, and Information

    OpenAIRE

    Nir Kshetri

    2013-01-01

    With an increasing pervasiveness, prevalence and severity of cybercrimes, various metrics, measures and statistics have been developed and used to measure various aspects of this phenomenon. Cybercrime-related data, metrics, and information, however, pose important and difficult dilemmas regarding the issues of reliability, validity, comparability and practical utility. While many of the issues of the cybercrime economy are similar to other underground and underworld industries, this economy ...

  7. Solving crystal structures from powder data. I. The role of the prior information in the two-stage method

    International Nuclear Information System (INIS)

    Altomare, A.; Carrozzini, B.; Giacovazzo, C.; Guagliardi, A.; Moliterni, A.G.G.; Rizzi, R.

    1996-01-01

    For part II see ibid., pp. 674-681, 1996. The principal limitation of diffraction methods for crystal structure analysis from powder data originates from the collapse of the three-dimensional reciprocal space into the one dimension of the powder diffraction pattern. This degradation of information can make even the solution of small crystal structures difficult and can generate inefficiencies in the least-squares methods devoted to crystal structure refinement. In this paper, the current two-stage procedures, the first stage dedicated to powder-pattern decomposition and the second to direct phasing of powder data, are analysed. It is shown that in the first stage such procedures disregard a large amount of information that can become available during the process of crystal structure solution and analysis. The use of such information is essential for making direct-methods procedures more robust and for improving the accuracy of the least-squares techniques. The performances of EXTRA [Altomare, Burla, Cascarano, Giacovazzo, Guagliardi, Moliterni and Polidori (1995). J. Appl. Cryst. 28, 842-846], a program for full-pattern decomposition based on the Le Bail algorithm, and of SIRPOW.92 [Altomare, Burla, Cascarano, Giacovazzo, Guagliardi, Polidori and Camalli (1994). J. Appl. Cryst. 27, 435-436], a direct-methods program optimized for powder data, are discussed in order to offer the reader a logical pathway through the analysis of the traditional techniques and towards the proposal of a new approach. It is shown that pattern-decomposition programs based on the Le Bail algorithm are able to exploit prior information more effectively than Pawley-method-based decomposition programs. (orig.)

  8. Participation with a Punch: Community Referenda on Dam Projects and the Right to Free, Prior, and Informed Consent to Development

    Directory of Open Access Journals (Sweden)

    Brant McGee

    2010-06-01

    The 2000 Report of the World Commission on Dams (WCD) found that dams can threaten the resources that provide the basis for indigenous and other peoples' culture, religion, subsistence, social and family structure – and their very existence, through forced relocation – and lead to ecosystem impacts harmful to agriculture, animals and fish. The WCD recommended the effective participation of potentially impacted local people in decisions regarding dam construction. The international right to free, prior, and informed consent (FPIC) accorded to indigenous peoples promises not only the opportunity to participate in decisions affecting their lands and livelihoods but to stop unwanted development by refusing consent as well. The newly developed concept of community referenda, held in areas potentially impacted by development projects, provides an accurate measure of the position of local voters on the proposed project through a democratic process that discourages violence, promotes fair and informed debate, and provides an avenue for communities to express their consent or refusal of a specific project. The legal basis, practical and political implications, and Latin American examples of community referenda are explored as a means of implementing the critical goal of the principle of FPIC, the expression of community will and its conclusive impact on development decision-making.

  9. A Calibrated Power Prior Approach to Borrow Information from Historical Data with Application to Biosimilar Clinical Trials.

    Science.gov (United States)

    Pan, Haitao; Yuan, Ying; Xia, Jielai

    2017-11-01

    A biosimilar refers to a follow-on biologic intended to be approved for marketing based on biosimilarity to an existing patented biological product (i.e., the reference product). To develop a biosimilar product, it is essential to demonstrate biosimilarity between the follow-on biologic and the reference product, typically through two-arm randomized trials. We propose a Bayesian adaptive design for trials to evaluate biosimilar products. To take advantage of the abundant historical data on the efficacy of the reference product that is typically available at the time a biosimilar product is developed, we propose the calibrated power prior, which allows our design to adaptively borrow information from the historical data according to the congruence between the historical data and the new data collected from the current trial. We propose a new measure, the Bayesian biosimilarity index, to measure the similarity between the biosimilar and the reference product. During the trial, we evaluate the Bayesian biosimilarity index in a group sequential fashion based on the accumulating interim data, and stop the trial early once there is enough information to conclude or reject the similarity. Extensive simulation studies show that the proposed design has higher power than traditional designs. We applied the proposed design to a biosimilar trial for treating rheumatoid arthritis.
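
    To make the borrowing mechanism concrete, here is a minimal beta-binomial sketch of a power prior: the historical likelihood is raised to a discounting power a0 in [0, 1]. The calibration rule shown (discounting by the gap between historical and current response rates) is a crude stand-in for the paper's congruence-based calibration, and all counts are hypothetical.

```python
from scipy import stats

# Hypothetical counts: x0 responders of n0 historical patients; x of n current
x0, n0, x, n = 120, 200, 55, 100

# Crude congruence-based discounting a0 in [0, 1] (a stand-in for the paper's
# calibration rule): full borrowing when rates agree, none when they differ a lot
a0 = max(0.0, 1.0 - abs(x0 / n0 - x / n) / 0.2)

# Power prior on a Beta(1, 1) base: historical likelihood raised to a0
posterior = stats.beta(1 + a0 * x0 + x, 1 + a0 * (n0 - x0) + (n - x))
print(a0, posterior.mean(), posterior.interval(0.95))
```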

  10. A Semi-Discrete Landweber-Kaczmarz Method for Cone Beam Tomography and Laminography Exploiting Geometric Prior Information

    Science.gov (United States)

    Vogelgesang, Jonas; Schorr, Christian

    2016-12-01

    We present a semi-discrete Landweber-Kaczmarz method for solving linear ill-posed problems and its application to Cone Beam tomography and laminography. Using a basis function-type discretization in the image domain, we derive a semi-discrete model of the underlying scanning system. Based on this model, the proposed method provides an approximate solution of the reconstruction problem, i.e. reconstructing the density function of a given object from its projections, in suitable subspaces equipped with basis function-dependent weights. This approach intuitively allows the incorporation of additional information about the inspected object leading to a more accurate model of the X-rays through the object. Also, physical conditions of the scanning geometry, like flat detectors in computerized tomography as used in non-destructive testing applications as well as non-regular scanning curves e.g. appearing in computed laminography (CL) applications, are directly taken into account during the modeling process. Finally, numerical experiments of a typical CL application in three dimensions are provided to verify the proposed method. The introduction of geometric prior information leads to a significantly increased image quality and superior reconstructions compared to standard iterative methods.
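
    For orientation, the classical fully discrete Landweber iteration that the semi-discrete method generalizes looks as follows; the operator, data, and step size here are synthetic stand-ins (the Kaczmarz variant would additionally sweep over subsets of projections).

```python
import numpy as np

def landweber(A, b, step, iters):
    """Classical Landweber iteration x <- x + step * A^T (b - A x); the
    Kaczmarz variant would additionally sweep over subsets of projections."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = x + step * A.T @ (b - A @ x)
    return x

rng = np.random.default_rng(1)
A = rng.normal(size=(200, 100))                 # stand-in projection operator
b = A @ rng.normal(size=100)                    # synthetic projection data
step = 1.0 / np.linalg.norm(A, 2) ** 2          # below 2/||A||^2 for convergence
x_rec = landweber(A, b, step, 500)
print(np.linalg.norm(b - A @ x_rec))            # residual after 500 sweeps
```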

  11. The Data Reliability of Volunteered Geographic Information with Using Traffic Accident Data

    Science.gov (United States)

    Sevinç, H. K.; Karaş, I. R.

    2017-11-01

    The development of mobile technologies plays an important role in people's lives; mobile devices occupy such a large part of daily life that checking their smartphone is the first thing many people do when they wake up. Users may share their positions via the GNSS sensors in mobile devices, or they can add information about their positions in mobile applications. Through this sharing, users contribute to geographic information systems. These users are ordinary citizens living in the relevant geographic area, not GIS specialists. The creation, collection, sharing, and dissemination of geographic data provided by voluntary individuals constitute Volunteered Geographic Information (VGI). Because the data in a VGI system come from amateur users, the question arises: "How reliable, in scientific terms, are data received from amateur users instead of specialists in the field?" In this study, the reliability of the data received from voluntary users through VGI is assessed against real data, which consist of traffic accident coordinates. The user data consist of speed values at the relevant coordinates and users' markings of possible accident points on the map.

  12. Supporting patients in obtaining and oncologists in providing evidence-based health-related quality of life information prior to and after esophageal cancer surgery

    OpenAIRE

    Jacobs, M.

    2015-01-01

    The overall aim of this thesis was to support patients in obtaining and oncologists in providing evidence-based HRQL data prior to and following esophageal cancer surgery. This thesis is divided in two parts. In Part I, we addressed the information needs of esophageal cancer patients prior to and following esophageal surgery, the barriers and facilitators patients experienced when discussing their information needs with their oncologist, and the development of a web-based question prompt shee...

  13. Online patient information on Vagus Nerve Stimulation: How reliable is it for facilitating shared decision making?

    Science.gov (United States)

    Ved, Ronak; Cobbold, Naomi; Igbagiri, Kueni; Willis, Mark; Leach, Paul; Zaben, Malik

    2017-08-01

    This study evaluates the quality of information available on the internet for carers of children with epilepsy considering treatment with Vagus Nerve Stimulation (VNS). Selected key phrases were entered into two popular search engines (Google™, Yahoo™). These phrases were: "Vagus nerve stimulator", alone and in combination with "childhood epilepsy", "paediatric epilepsy" and "epilepsy in childhood"; "VNS", and "VNS epilepsy". The first 50 hits per search were then screened. Of 600 identified sites, duplicated (262), irrelevant (230) and inaccessible (15) results were excluded. 93 websites were identified for evaluation using the DISCERN instrument, an online validation tool for patient information websites. The mean DISCERN score of all analysed websites was 39/80 (49%; SD 13.5). This equates to Fair to borderline Poor global quality (Excellent = 80-63; Good = 62-51; Fair = 50-39; Poor = 38-27; Very poor = 26-15). None of the analysed sites obtained an Excellent quality rating. 13% (12) obtained a Good score, 40% (37) obtained an Average score, 35% (33) obtained a Poor score, and 12% (11) obtained a Very poor score. The cohort of websites scored particularly poorly on whether reliable, holistic information was presented, for instance the provision of reliable sources (28%, SD 18) and the discussion of alternative treatments (30%, SD 14). To facilitate patient-centred shared decision-making, high-quality information needs to be available for patients and families considering VNS. This study identifies that such information is difficult to locate on the internet. There is a need to develop focussed and reliable online patient resources for VNS. Copyright © 2017 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.

  14. METHODS OF IMPROVING THE RELIABILITY OF THE CONTROL SYSTEM TRACTION POWER SUPPLY OF ELECTRIC TRANSPORT BASED ON AN EXPERT INFORMATION

    Directory of Open Access Journals (Sweden)

    O. O. Matusevych

    2009-03-01

    The author proposes numerous methods for solving the multi-criterion task of increasing the reliability of a control system on the basis of expert information. Information that allows a well-founded choice of the method for increasing the reliability of an electric transport control system is considered.

  15. GP preferences for information systems: conjoint analysis of speed, reliability, access and users.

    Science.gov (United States)

    Wyatt, Jeremy C; Batley, Richard P; Keen, Justin

    2010-10-01

    To elicit the preferences and trade-offs of UK general practitioners about key features of health information systems, to help inform the design of such systems in future. A stated choice study to uncover implicit preferences based on a binary choice between scenarios presented in random order. Participants were all 303 general practice members of the UK Internet service provider Medix, who were approached by email to participate. The main outcome measure was the number of seconds of delay in system response that general practitioners were willing to trade off for each key system feature: the reliability of the system, the sites from which the system could be accessed, and which staff are able to view patient data. Doctors valued speed of response most in information systems but would be prepared to wait 28 seconds to access a system in exchange for improved reliability from 95% to 99%, a further 2 seconds for an improvement to 99.9%, and 27 seconds for access to data from anywhere, including their own home, compared with one place in a single health care premises. However, they would require a system that was 14 seconds faster to compensate for allowing social care as well as National Health Service staff to read patient data. These results provide important new evidence about which system characteristics doctors value highly, and hence which characteristics designers need to focus on when large-scale health information systems are planned. © 2010 Blackwell Publishing Ltd.
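
    In a stated-choice model, such trade-offs fall out as ratios of part-worth utilities: the seconds of delay worth one unit of a feature equal that feature's coefficient divided by the per-second delay coefficient. The coefficients below are hypothetical values chosen only to reproduce the reported 28- and 27-second trade-offs.

```python
# Willingness-to-wait as a ratio of part-worth utilities: the seconds of delay
# worth one unit of a feature equal beta_feature / |beta_speed|. These
# coefficients are hypothetical, chosen only to reproduce the reported numbers.
beta_speed = -0.10            # utility per second of added delay (assumed)
beta_rel_95_to_99 = 2.8       # utility of reliability rising 95% -> 99% (assumed)
beta_access_anywhere = 2.7    # utility of access from anywhere (assumed)

print(beta_rel_95_to_99 / abs(beta_speed))     # ~28 seconds
print(beta_access_anywhere / abs(beta_speed))  # ~27 seconds
```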

  16. Reliability Assessment of Cloud Computing Platform Based on Semiquantitative Information and Evidential Reasoning

    Directory of Open Access Journals (Sweden)

    Hang Wei

    2016-01-01

    A reliability assessment method based on the evidential reasoning (ER) rule and semiquantitative information is proposed in this paper, in which a new reliability assessment architecture covering four aspects, with both quantitative data and qualitative knowledge, is established. This assessment architecture is more objective in describing a complex, dynamic cloud computing environment than that of traditional methods. In addition, the ER rule, which performs well on multiple-attribute decision making problems, is employed to integrate the different types of attributes in the assessment architecture, yielding more accurate assessment results. The assessment results of a case study on an actual cloud computing platform verify the effectiveness and the advantages of the proposed method.
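
    As a flavor of evidence combination in this family of methods, the sketch below implements Dempster's rule, a simpler relative of the ER rule used in the paper, over two assumed attribute assessments of platform reliability; the attributes, grades, and masses are illustrative.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule over mass functions keyed by frozenset focal elements;
    a simpler relative of the ER rule, shown only to illustrate the idea of
    fusing attribute-level evidence into one assessment."""
    combined, conflict = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            combined[inter] = combined.get(inter, 0.0) + a * b
        else:
            conflict += a * b
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two assumed attribute assessments of platform reliability, grades {good, poor}
m_compute = {frozenset({"good"}): 0.7, frozenset({"good", "poor"}): 0.3}
m_network = {frozenset({"good"}): 0.5, frozenset({"poor"}): 0.2,
             frozenset({"good", "poor"}): 0.3}
print(dempster_combine(m_compute, m_network))
```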

  17. Can Internet information on vertebroplasty be a reliable means of patient self-education?

    Science.gov (United States)

    Sullivan, T Barrett; Anderson, Joshua T; Ahn, Uri M; Ahn, Nicholas U

    2014-05-01

    Internet information regarding vertebroplasty is not only inadequate for proper patient education, but also potentially misleading, as sites are more likely to present the benefits of the procedure than its risks. Although academic sites might be expected to offer higher-quality information than private, industry, or other sites, our data suggest that they do not. HONCode certification cannot be used reliably as a means of qualifying website information quality. Academic sites should be expected to set a high standard and alter their Internet presence to distribute adequate information. Certification bodies also should alter their standards to necessitate provision of complete information in addition to emphasizing accurate information. Treating physicians may want to counsel their patients regarding the limitations of information present on the Internet and the pitfalls of current certification systems. Level IV, economic and decision analyses. See the Instructions for Authors for a complete description of levels of evidence.

  18. Proponent-Indigenous agreements and the implementation of the right to free, prior, and informed consent in Canada

    Energy Technology Data Exchange (ETDEWEB)

    Papillon, Martin, E-mail: martin.papillon@umontreal.ca [Département de science politique, Université de Montréal, Pavillon Lionel-Groulx, C. P. 6128, succ. Centre-ville, Montréal, Québec H3C 3J7 (Canada); Rodon, Thierry, E-mail: thierry.rodon@pol.ulaval.ca [Département de science politique, Pavillon Charles-De Koninck, 1030, avenue des Sciences-Humaines, local 4433, Université Laval, Québec, Québec G1V 0A6 (Canada)

    2017-01-15

    Indigenous peoples have gained considerable agency in shaping decisions regarding resource development on their traditional lands. This growing agency is reflected in the emergence of the right to free, prior, and informed consent (FPIC) when Indigenous rights may be adversely affected by major resource development projects. While many governments remain non-committal toward FPIC, corporate actors are more proactive at engaging with Indigenous peoples in seeking their consent to resource extraction projects through negotiated Impact and Benefit Agreements. Focusing on the Canadian context, this article discusses the roots and implications of a proponent-driven model for seeking Indigenous consent to natural resource extraction on their traditional lands. Building on two case studies, the paper argues that negotiated consent through IBAs offers a truncated version of FPIC from the perspective of the communities involved. The deliberative ethic at the core of FPIC is often undermined in the negotiation process associated with proponent-led IBAs. - Highlights: • FPIC is becoming a norm for resource extraction projects on Indigenous lands. • Proponent-led IBAs have become the main instrument to establish FPIC in Canada. • Case studies show elite-driven IBA negotiations do not always create the conditions for FPIC. • We need to pay attention to community deliberations as an inherent aspect of FPIC.

  19. Proponent-Indigenous agreements and the implementation of the right to free, prior, and informed consent in Canada

    International Nuclear Information System (INIS)

    Papillon, Martin; Rodon, Thierry

    2017-01-01

    Indigenous peoples have gained considerable agency in shaping decisions regarding resource development on their traditional lands. This growing agency is reflected in the emergence of the right to free, prior, and informed consent (FPIC) when Indigenous rights may be adversely affected by major resource development projects. While many governments remain non-committal toward FPIC, corporate actors are more proactive at engaging with Indigenous peoples in seeking their consent to resource extraction projects through negotiated Impact and Benefit Agreements. Focusing on the Canadian context, this article discusses the roots and implications of a proponent-driven model for seeking Indigenous consent to natural resource extraction on their traditional lands. Building on two case studies, the paper argues that negotiated consent through IBAs offers a truncated version of FPIC from the perspective of the communities involved. The deliberative ethic at the core of FPIC is often undermined in the negotiation process associated with proponent-led IBAs. - Highlights: • FPIC is becoming a norm for resource extraction projects on Indigenous lands. • Proponent-led IBAs have become the main instrument to establish FPIC in Canada. • Case studies show elite-driven IBA negotiations do not always create the conditions for FPIC. • We need to pay attention to community deliberations as an inherent aspect of FPIC.

  20. The influence of baseline marijuana use on treatment of cocaine dependence: application of an informative-priors Bayesian approach.

    Directory of Open Access Journals (Sweden)

    Charles Green

    2012-10-01

    Background: Marijuana use is prevalent among patients with cocaine dependence and often non-exclusionary in clinical trials of potential cocaine medications. The dual focus of this study was to (1) examine the moderating effect of baseline marijuana use on response to treatment with levodopa/carbidopa for cocaine dependence; and (2) apply an informative-priors Bayesian approach for estimating the probability of a subgroup-by-treatment interaction effect. Method: A secondary data analysis of two previously published, double-blind, randomized controlled trials provided samples for the historical dataset (Study 1: N = 64 complete observations) and current dataset (Study 2: N = 113 complete observations). Negative binomial regression evaluated Treatment Effectiveness Scores (TES) as a function of medication condition (levodopa/carbidopa vs. placebo), baseline marijuana use (days in past 30), and their interaction. Results: Bayesian analysis indicated that there was a 96% chance that baseline marijuana use predicts differential response to treatment with levodopa/carbidopa. Simple effects indicated that among participants receiving levodopa/carbidopa the probability that baseline marijuana use confers harm in terms of reducing TES was 0.981, whereas the probability that marijuana use confers harm within the placebo condition was 0.163. For every additional day of marijuana use reported at baseline, participants in the levodopa/carbidopa condition demonstrated a 5.4% decrease in TES, while participants in the placebo condition demonstrated a 4.9% increase in TES. Conclusion: The potential moderating effect of marijuana on cocaine treatment response should be considered in future trial designs. Applying Bayesian subgroup analysis proved informative in characterizing this patient-treatment interaction effect.
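
    A minimal sketch of the model family described above: negative binomial regression of TES on treatment, baseline marijuana days, and their interaction, with Gaussian informative priors on the coefficients maximized to a MAP estimate. The data are simulated and the prior means and SDs are placeholders, not values derived from Study 1.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def neg_log_posterior(theta, X, y, prior_mean, prior_sd):
    """Negative binomial log-likelihood plus Gaussian (informative) log-prior."""
    beta, r = theta[:-1], np.exp(theta[-1])       # r: NB dispersion
    mu = np.exp(X @ beta)
    ll = (gammaln(y + r) - gammaln(r) - gammaln(y + 1)
          + r * np.log(r / (r + mu)) + y * np.log(mu / (r + mu))).sum()
    lp = -0.5 * (((beta - prior_mean) / prior_sd) ** 2).sum()
    return -(ll + lp)

rng = np.random.default_rng(2)
n = 113
treat = rng.integers(0, 2, n)                     # levodopa/carbidopa vs placebo
mj = rng.integers(0, 31, n)                       # baseline marijuana days
X = np.column_stack([np.ones(n), treat, mj, treat * mj])
y = rng.poisson(5, n).astype(float)               # stand-in TES counts
prior_mean, prior_sd = np.zeros(4), np.ones(4)    # would come from Study 1
fit = minimize(neg_log_posterior, np.zeros(5), method="Nelder-Mead",
               args=(X, y, prior_mean, prior_sd))
print(np.exp(fit.x[3]))    # interaction rate ratio per marijuana day
```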

  1. Is the information about dengue available on Brazilian websites of quality and reliable?

    Directory of Open Access Journals (Sweden)

    Thiago Henrique de Lima

    2016-12-01

    The objective of the present study was to identify and evaluate the content of information about dengue available on Brazilian websites. Thirty-two websites were selected for the analysis. For the evaluation of the content of information about dengue, a form was prepared with 16 topics grouped into six information blocks: etiology/transmission, vector, control and prevention, disease/diagnosis, treatment, and epidemiology. The websites were also evaluated according to the following criteria: authorship, updating, language, interactivity, scientific basis, and graphic elements. The results showed a predominant lack of information on the topics analysed in each information block. Regarding the technical quality of the websites, only 28.1% showed some indication of a scientific basis and 34.3% contained the date of publication or of the last update. These results attest to the low reliability of the selected websites. Given that the internet is an efficient mechanism for disseminating information on health topics, we conclude that mechanisms to disseminate correct and comprehensive information about dengue are needed so that this useful tool can be applied in the prevention and control of the disease in Brazil.

  2. The role of quality tools in assessing reliability of the internet for health information.

    Science.gov (United States)

    Hanif, Faisal; Read, Janet C; Goodacre, John A; Chaudhry, Afzal; Gibbs, Paul

    2009-12-01

    The Internet has made it possible for patients and their families to access vast quantities of information that previously would have been difficult for anyone but a physician or librarian to obtain. Health information websites, however, are recognised to differ widely in the quality and reliability of their content. This has led to the development of various codes of conduct or quality rating tools for assessing the quality of health websites. However, the validity and reliability of these quality tools, and their applicability to different health websites, also vary. In principle, rating tools should be available to consumers, require a limited number of elements to be assessed, be assessable in all elements, be readable, and be able to gauge the readability and consistency of the information provided from a patient's viewpoint. This article reviews the literature on trends in Internet use for health and analyses the various codes of conduct/ethics or 'quality tools' available for monitoring the quality of health websites from a patient perspective.

  3. The European reliability data system. An organized information exchange on the operation of European nuclear reactors

    International Nuclear Information System (INIS)

    Mancini, G.; Amesz, J.; Bastianini, P.; Capobianchi, S.

    1983-01-01

    The paper reviews the aims and objectives of the European Reliability Data System (ERDS), a centralized system collecting and organizing, at the European level, information related to the operation of LWRs. The ERDS project was started in 1977 and, after a preliminary feasibility study that ended in 1979, is now proceeding towards the final design and implementation stages. ERDS exploits information collected in national data systems and information deriving from single reactor sources. The paper first describes the development of the four data banks constituting the system: the Component Event Data Bank (CEDB), the Abnormal Occurrences Reporting System (AORS), the Operating Unit Status Report (OUSR), and the Generic Reliability Parameter Data Bank (GRPDB). Several typical aspects of the project are then outlined, from the need for homogenization of data, and therefore for setting up reference classifications, to the problem of data transcoding and input into the system. Furthermore, the need to involve nuclear power plant operators much more deeply in the process of data acquisition, by providing them with useful feedback from the data analysis, is stressed. (author)

  4. Methods for Calculating Frequency of Maintenance of Complex Information Security System Based on Dynamics of Its Reliability

    Science.gov (United States)

    Varlataya, S. K.; Evdokimov, V. E.; Urzov, A. Y.

    2017-11-01

    This article describes the process of calculating the reliability of a certain complex information security system (CISS), using the example of a technospheric security management model, as well as the ability to determine the frequency of its maintenance from the system reliability parameter, which makes it possible to assess man-made risks and to forecast natural and man-made emergencies. The relevance of this article is explained by the fact that CISS reliability is closely related to information security (IS) risks. Since reliability (or resiliency) is a probabilistic characteristic of the system showing the possibility of its failure (and, as a consequence, the emergence of threats to the protected information assets), it is seen as a component of the overall IS risk in the system. As is known, there is a certain acceptable level of IS risk assigned by experts for a particular information system; when reliability is a risk-forming factor, maintaining an acceptable risk level should be carried out through routine analysis of the condition of the CISS and its elements and their timely service. The article presents a reliability parameter calculation for a CISS with a mixed type of element connection, and a formula for the dynamics of such a system's reliability is written. The chart of CISS reliability change is an S-shaped curve that can be divided into three periods: an almost invariably high level of reliability, a uniform reliability reduction, and an almost invariably low level of reliability. Setting the minimum acceptable level of reliability, the graph (or formula) can be used to determine the period of time during which the system will meet requirements; ideally, this period should not be longer than the first period of the graph. Thus, the proposed method of calculating the CISS maintenance frequency helps to solve the voluminous and critical task of information asset risk management.
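
    As a sketch of how a maintenance deadline can be read off such an S-shaped reliability curve, the snippet below assumes a logistic decay form for R(t) (the paper's exact formula is not given here; the functional form and all parameters are illustrative) and solves R(t) = R_min numerically.

```python
import numpy as np
from scipy.optimize import brentq

# Assumed logistic S-curve for CISS reliability decay (illustrative form and
# parameters): R(t) = R_low + (R_high - R_low) / (1 + exp(k * (t - t0)))
R_high, R_low, k, t0 = 0.99, 0.60, 0.05, 120.0

def reliability(t):
    return R_low + (R_high - R_low) / (1.0 + np.exp(k * (t - t0)))

R_min = 0.95                                       # acceptable reliability level
t_service = brentq(lambda t: reliability(t) - R_min, 0.0, t0)
print(t_service)                                   # latest maintenance time
```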

  5. [The analytical reliability of clinical laboratory information and role of the standards in its support].

    Science.gov (United States)

    Men'shikov, V V

    2012-12-01

    The article deals with the factors affecting the reliability of clinical laboratory information. The differences in quality among laboratory analysis tools produced by various manufacturers are discussed; these differences are the cause of discrepancies between the results of laboratory analyses of the same analyte. The role of the reference system in supporting the comparability of laboratory analysis results is demonstrated. A draft national standard is presented that regulates the requirements for standards and calibrators used in the analysis of qualitative and non-metrical characteristics of components of biomaterials.

  6. Based on Weibull Information Fusion Analysis Semiconductors Quality the Key Technology of Manufacturing Execution Systems Reliability

    Science.gov (United States)

    Huang, Zhi-Hui; Tang, Ying-Chun; Dai, Kai

    2016-05-01

    The qualified rate of semiconductor materials and products is directly related to manufacturing costs and the survival of the enterprise. A dynamic reliability growth analysis method is applied to study manufacturing execution system (MES) reliability growth and thereby improve product quality. Drawing on the classical Duane model assumptions and the tracking growth forecast (TGP) programming model, a Weibull distribution model is established from the failure data. Combined with the median-rank (average-rank) method, Weibull information fusion reliability growth curves are fitted by linear regression and least squares estimation. This model overcomes a weakness of the Duane model, namely the low accuracy of MTBF point estimation; analysis of the failure data shows that the method and the test-and-evaluation modeling process are essentially consistent. The median rank is used in statistics to estimate the distribution function of a random variable, which is well suited to problems such as the limited sample sizes of complex systems. Therefore this method has great engineering application value.
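
    A minimal sketch of the median-rank regression step described above: Bernard's approximation supplies median ranks for the ordered failure times, the Weibull CDF is linearized, and least squares yields the shape and scale parameters. The failure times are made-up illustrative data, not the paper's MES records.

```python
import numpy as np

# Median-rank (Bernard) regression for Weibull shape/scale parameters
t = np.sort(np.array([55., 187., 216., 240., 244., 335., 361., 373., 375., 386.]))
n = len(t)
F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)       # Bernard's median ranks

# Linearize the Weibull CDF: ln(-ln(1 - F)) = beta * ln(t) - beta * ln(eta)
x = np.log(t)
y = np.log(-np.log(1.0 - F))
beta, intercept = np.polyfit(x, y, 1)             # slope = shape parameter
eta = np.exp(-intercept / beta)                   # scale parameter
print(beta, eta)
```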

  7. Threshold Estimation of Generalized Pareto Distribution Based on Akaike Information Criterion for Accurate Reliability Analysis

    International Nuclear Information System (INIS)

    Kang, Seunghoon; Lim, Woochul; Cho, Su-gil; Park, Sanghyun; Lee, Tae Hee; Lee, Minuk; Choi, Jong-su; Hong, Sup

    2015-01-01

    In order to perform estimations with high reliability, it is necessary to deal with the tail part of the cumulative distribution function (CDF) in greater detail compared to an overall CDF. The use of a generalized Pareto distribution (GPD) to model the tail part of a CDF is receiving more research attention with the goal of performing estimations with high reliability. Current studies on GPDs focus on ways to determine the appropriate number of sample points and their parameters. However, even if a proper estimation is made, it can be inaccurate as a result of an incorrect threshold value. Therefore, in this paper, a GPD based on the Akaike information criterion (AIC) is proposed to improve the accuracy of the tail model. The proposed method determines an accurate threshold value using the AIC with the overall samples before estimating the GPD over the threshold. To validate the accuracy of the method, its reliability is compared with that obtained using a general GPD model with an empirical CDF

  8. Threshold Estimation of Generalized Pareto Distribution Based on Akaike Information Criterion for Accurate Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Seunghoon; Lim, Woochul; Cho, Su-gil; Park, Sanghyun; Lee, Tae Hee [Hanyang University, Seoul (Korea, Republic of); Lee, Minuk; Choi, Jong-su; Hong, Sup [Korea Research Insitute of Ships and Ocean Engineering, Daejeon (Korea, Republic of)

    2015-02-15

    In order to perform estimations with high reliability, it is necessary to deal with the tail part of the cumulative distribution function (CDF) in greater detail compared to an overall CDF. The use of a generalized Pareto distribution (GPD) to model the tail part of a CDF is receiving more research attention with the goal of performing estimations with high reliability. Current studies on GPDs focus on ways to determine the appropriate number of sample points and their parameters. However, even if a proper estimation is made, it can be inaccurate as a result of an incorrect threshold value. Therefore, in this paper, a GPD based on the Akaike information criterion (AIC) is proposed to improve the accuracy of the tail model. The proposed method determines an accurate threshold value using the AIC with the overall samples before estimating the GPD over the threshold. To validate the accuracy of the method, its reliability is compared with that obtained using a general GPD model with an empirical CDF.
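
    One way to realize an AIC-guided threshold choice is sketched below: candidate thresholds are scanned, a two-parameter GPD is fitted to the excesses over each, and the threshold minimizing the AIC is kept. This is a simplified stand-in (it compares AICs across different excess samples) rather than the paper's exact criterion, and the data are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
data = rng.weibull(1.5, 5000)                     # synthetic response samples

def gpd_aic(excesses):
    # Fit a two-parameter GPD (loc fixed at 0) to the excesses; AIC = 2k - 2 lnL
    c, loc, scale = stats.genpareto.fit(excesses, floc=0.0)
    ll = stats.genpareto.logpdf(excesses, c, loc=0.0, scale=scale).sum()
    return 2 * 2 - 2 * ll

thresholds = np.quantile(data, np.linspace(0.80, 0.98, 10))
aics = [gpd_aic(data[data > u] - u) for u in thresholds]
u_best = thresholds[int(np.argmin(aics))]         # AIC-guided threshold
print(u_best)
```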

  9. Structured information analysis for human reliability analysis of emergency tasks in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kim, Jae Whan; Park, Jin Kyun; Ha, Jae Joo [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-02-01

    More than twenty HRA (Human Reliability Analysis) methodologies have been developed and used for the safety analysis in nuclear field during the past two decades. However, no methodology appears to have universally been accepted, as various limitations have been raised for more widely used ones. One of the most important limitations of conventional HRA is insufficient analysis of the task structure and problem space. To resolve this problem, we suggest SIA (Structured Information Analysis) for HRA. The proposed SIA consists of three parts. The first part is the scenario analysis that investigates the contextual information related to the given task on the basis of selected scenarios. The second is the goals-means analysis to define the relations between the cognitive goal and task steps. The third is the cognitive function analysis module that identifies the cognitive patterns and information flows involved in the task. Through the three-part analysis, systematic investigation is made possible from the macroscopic information on the tasks to the microscopic information on the specific cognitive processes. It is expected that analysts can attain a structured set of information that helps to predict the types and possibility of human error in the given task. 48 refs., 12 figs., 11 tabs. (Author)

  10. From Dams to Development Justice: Progress with 'Free, Prior and Informed Consent' Since the World Commission on Dams

    Directory of Open Access Journals (Sweden)

    Joji Cariño

    2010-06-01

    The World Commission on Dams (WCD) helped establish as development best practice the requirement to respect the right of indigenous peoples to give or withhold their 'free, prior and informed consent' (FPIC) to development projects that will affect them. Recognition of this right helps redress the unequal power relations between indigenous peoples and others seeking access to their lands and resources. In this Viewpoint, we examine the evolution of policy in the ten years since the publication of the WCD Report, and how FPIC has been affirmed as a right of indigenous peoples under international human rights law and as industry best practice for extractive industries, logging, forestry plantations, palm oil, protected areas and, most recently, for projects to reduce greenhouse gas (GHG) emissions from deforestation and forest degradation. To date, relatively few national legal frameworks explicitly require respect for this right and World Bank standards have yet to be revised in line with these advances in international law. We analyse how international law also needs to clarify how the right to FPIC relates to the State’s power to impose resource exploitation in the 'national interest' and whether 'local communities' more broadly also enjoy the right to FPIC. In practice, as documented in this Viewpoint and in the cases we review, the right to FPIC is widely abused by corporations and State agencies. A growing tendency to reduce implementation of FPIC to a simplified check list of actions for outsiders to follow risks again removing control over decisions from indigenous peoples. For FPIC to be effective it must respect indigenous peoples’ rights to control their customary lands, represent themselves through their own institutions and make decisions according to procedures and rhythms of their choosing.

  11. The Trafigura Case and the System of Prior Informed Consent Under the Basel Convention – A Broken System?

    Directory of Open Access Journals (Sweden)

    Gary Cox

    2010-12-01

    The much publicised Trafigura case of the illegal dumping of hazardous petrochemical waste in and around Abidjan in Côte d’Ivoire has reignited the debate about the international trade in hazardous wastes as well as issues of international corporate social responsibility. The incident, which took place in August 2006, highlights major flaws in the existing international regulatory system, particularly around the prior informed consent (PIC) procedure. PIC forms the keystone of the 1989 Basel Convention on the Transboundary Movement of Hazardous Wastes. This article focuses on the effectiveness of the PIC procedures under the Basel Convention in the light of the response to the Trafigura incident. The incident exemplifies the failures of the PIC system under the Basel Convention. It reveals confusion on the part of regulatory authorities, failure to take prompt and appropriate action by the authorities involved, a lack of proactive supervisory intervention on the part of the Basel Secretariat, and a more far-reaching lack of developing country support for capacity building and technical assistance. There is a need for a more thorough-going approach to the assessment of environmentally sound management in developing countries. More fundamentally, meaningful consent encompasses the human rights dimension of hazardous wastes on local communities. Efforts aimed at increasing co-operation between the Basel, Rotterdam, Stockholm and MARPOL Conventions should be fully supported but they should be rapidly complemented by addressing deficiencies at ‘the sharp end’ around compliance and the effectiveness of the current system of PIC. A more integrated multilateral environmental regime dealing with all aspects of hazardous chemicals and wastes is warranted based on a wider focus on common concern for the global environment.

  12. Reliability, Validity, Comparability and Practical Utility of Cybercrime-Related Data, Metrics, and Information

    Directory of Open Access Journals (Sweden)

    Nir Kshetri

    2013-02-01

    With an increasing pervasiveness, prevalence and severity of cybercrimes, various metrics, measures and statistics have been developed and used to measure various aspects of this phenomenon. Cybercrime-related data, metrics, and information, however, pose important and difficult dilemmas regarding the issues of reliability, validity, comparability and practical utility. While many of the issues of the cybercrime economy are similar to other underground and underworld industries, this economy also has various unique aspects. For one thing, this industry also suffers from a problem partly rooted in the incredibly broad definition of the term “cybercrime”. This article seeks to provide insights and analysis into this phenomenon, which is expected to advance our understanding of cybercrime-related information.

  13. [The external evaluation of study quality: the role in maintaining the reliability of laboratory information].

    Science.gov (United States)

    Men'shikov, V V

    2013-08-01

    External quality evaluation of clinical laboratory examinations was gradually introduced in USSR medical laboratories starting in the 1970s. In Russia, in the mid-1990s, a unified national system of external quality evaluation was organized, known as the Federal Center of External Quality Evaluation, on the basis of a laboratory of the State Research Center of Preventive Medicine. The main policy positions in this area were neatly formulated in the guidance documents of the Ministry of Health. Nowadays, the center offers more than 100 types of control studies and continually extends their spectrum in line with the interests of the different disciplines of clinical medicine. Consistent participation of laboratories in the cycles of external quality evaluation intrinsically promotes improvement in the trueness and precision of analysis results and increases the reliability of laboratory information. However, a significant percentage of laboratories either do not participate in external quality evaluation at all or take part in the control process irregularly and for a limited number of tests. The managers of a number of medical organizations disregard the opportunities offered for increasing the reliability of laboratory information and limit the financing of quality control studies. The article proposes adopting a national standard on the basis of ISO 17043 covering conformity assessment and the general requirements for proficiency testing.

  14. A data-informed PIF hierarchy for model-based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Groth, Katrina M.; Mosleh, Ali

    2012-01-01

    This paper addresses three problems associated with the use of Performance Shaping Factors in Human Reliability Analysis. (1) There are more than a dozen Human Reliability Analysis (HRA) methods that use Performance Influencing Factors (PIFs) or Performance Shaping Factors (PSFs) to model human performance, but there is not a standard set of PIFs used among the methods, nor is there a framework available to compare the PIFs used in various methods. (2) The PIFs currently in use are not defined specifically enough to ensure consistent interpretation of similar PIFs across methods. (3) There are few rules governing the creation, definition, and usage of PIF sets. This paper introduces a hierarchical set of PIFs that can be used for both qualitative and quantitative HRA. The proposed PIF set is arranged in a hierarchy that can be collapsed or expanded to meet multiple objectives. The PIF hierarchy has been developed with respect to a set of fundamental principles necessary for PIF sets, which are also introduced in this paper. This paper includes definitions of the PIFs to allow analysts to map the proposed PIFs onto current and future HRA methods. The standardized PIF hierarchy will allow analysts to combine different types of data and will therefore make the best use of the limited data in HRA. The collapsible hierarchy provides the structure necessary to combine multiple types of information without reducing the quality of the information.
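
    A collapsible hierarchy of the kind described can be represented as a simple tree whose visible PIFs depend on the chosen depth; the node names below are illustrative placeholders, not the paper's actual PIF set.

```python
from dataclasses import dataclass, field

@dataclass
class PIFNode:
    """A node in a collapsible PIF hierarchy (names are placeholders)."""
    name: str
    children: list = field(default_factory=list)

    def visible_at(self, depth):
        """Collapse the hierarchy at `depth`: list the PIFs visible there."""
        if depth == 0 or not self.children:
            return [self.name]
        return [n for child in self.children for n in child.visible_at(depth - 1)]

root = PIFNode("PIF", [
    PIFNode("Organization", [PIFNode("Training"), PIFNode("Procedures")]),
    PIFNode("Personnel", [PIFNode("Fatigue"), PIFNode("Knowledge")]),
])
print(root.visible_at(1))   # coarse, collapsed view
print(root.visible_at(2))   # fully expanded view
```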

  15. The availability of reliable information about medicines in Serbia for health professionals summary of product characteristics

    Directory of Open Access Journals (Sweden)

    Đukić Ljiljana C.

    2015-01-01

    Introduction: Today, there are many drugs for the treatment of a large number of indication areas, and significant financial resources are invested in research with the aim of introducing reliable therapeutics into therapy. It is therefore necessary to provide health care professionals with exact information about new therapies; the exchange of scientific data, ideas, and information is made possible by numerous modern IT communication tools. Methodology: According to the law, key information on a registered drug is included in the Summary of Product Characteristics (SPC) for health professionals, which is harmonized with EU directives and regulations (SmPC). The content and structure of the information provided in the SPC are determined in EU guidelines; therefore, a unique set of data is established for all drugs registered in Serbia. Topic: This paper presents the key segments of the SPC, with special reference to the regulations governing data related to indications, mechanism of action, dosage, contraindications, side effects, interactions, and other important information regarding the profile of the drug, which are standardized and harmonized with the structure of the identical documents in force at the EU level (EMEA). Conclusions: The SPC is the regulatory technical document on medicinal products in the Republic of Serbia in which scientifically proven clinical and pharmacological data and information on the profile of the drug are listed; these are essential for health professionals - doctors and pharmacists - in implementing pharmacotherapy in our society. This document is the starting point for the development of applied pharmacoinformatics; it supports a range of activities important for the development of appropriate manuals and makes data and information available for monitoring indicators of the national drug policy and of modern, effective drug treatment.

  16. Waste container weighing data processing to create reliable information of household waste generation.

    Science.gov (United States)

    Korhonen, Pirjo; Kaila, Juha

    2015-05-01

    Household mixed waste container weighing data was processed by knowledge discovery and data mining techniques to create reliable information of household waste generation. The final data set included 27,865 weight measurements covering the whole year 2013 and it was selected from a database of Helsinki Region Environmental Services Authority, Finland. The data set contains mixed household waste arising in 6 m³ containers and it was processed identifying missing values and inconsistently low and high values as errors. The share of missing values and errors in the data set was 0.6%. This provides evidence that the waste weighing data gives reliable information of mixed waste generation at collection point level. Characteristic of mixed household waste arising at the waste collection point level is a wide variation between pickups. The seasonal variation pattern as a result of collective similarities in behaviour of households was clearly detected by smoothed medians of waste weight time series. The evaluation of the collection time series against the defined distribution range of pickup weights on the waste collection point level shows that 65% of the pickups were from collection points with optimally dimensioned container capacity and the collection points with over- and under-dimensioned container capacities were noted in 9.5% and 3.4% of all pickups, respectively. Occasional extra waste in containers occurred in 21.2% of the pickups indicating the irregular behaviour of individual households. The results of this analysis show that processing waste weighing data using knowledge discovery and data mining techniques provides trustworthy information of household waste generation and its variations. Copyright © 2015 Elsevier Ltd. All rights reserved.
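
    The screening-and-smoothing pipeline described above might look like the following pandas sketch; the column names, plausibility bounds, and injected values are assumptions, not the actual database schema.

```python
import numpy as np
import pandas as pd

# Screening and smoothing on container weighings; column names, bounds, and
# injected values are assumptions for illustration only.
rng = np.random.default_rng(4)
df = pd.DataFrame({
    "pickup_date": pd.date_range("2013-01-07", periods=52, freq="W"),
    "weight_kg": rng.normal(320, 60, 52).round(1),
})
df.loc[[5, 20], "weight_kg"] = [np.nan, 4000.0]   # a missing and an error value

low, high = 20.0, 2000.0                          # plausibility bounds (assumed)
df["error"] = df["weight_kg"].isna() | ~df["weight_kg"].between(low, high)
print(df["error"].mean())                         # share of missing/error records

clean = df.loc[~df["error"]].set_index("pickup_date").copy()
# Smoothed median to expose the seasonal pattern between pickups
clean["smoothed"] = clean["weight_kg"].rolling(9, center=True, min_periods=3).median()
```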

  17. Key attributes of the SAPHIRE risk and reliability analysis software for risk-informed probabilistic applications

    International Nuclear Information System (INIS)

    Smith, Curtis; Knudsen, James; Kvarfordt, Kellie; Wood, Ted

    2008-01-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state-of-the-practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis methods and solution methods have built upon pioneering work done 30-40 years ago. We contrast this work with the current capabilities in the SAPHIRE analysis package. As part of this discussion, we provide information for both the typical features and special analysis capabilities, which are available. We also present the application and results typically found with state-of-the-practice PRRA models. By providing both a high-level and detailed look at the SAPHIRE software, we give a snapshot in time for the current use of software tools in a risk-informed decision arena.

  18. Visual reliability and information rate in the retina of a nocturnal bee.

    Science.gov (United States)

    Frederiksen, Rikard; Wcislo, William T; Warrant, Eric J

    2008-03-11

    Nocturnal animals relying on vision typically have eyes that are optically and morphologically adapted for both increased sensitivity and greater information capacity in dim light. Here, we investigate whether adaptations for increased sensitivity also are found in their photoreceptors by using closely related and fast-flying nocturnal and diurnal bees as model animals. The nocturnal bee Megalopta genalis is capable of foraging and homing by using visually discriminated landmarks at starlight intensities. Megalopta's near relative, Lasioglossum leucozonium, performs these tasks only in bright sunshine. By recording intracellular responses to Gaussian white-noise stimuli, we show that photoreceptors in Megalopta actually code less information at most light levels than those in Lasioglossum. However, as in several other nocturnal arthropods, Megalopta's photoreceptors possess a much greater gain of transduction, indicating that nocturnal photoreceptors trade information capacity for sensitivity. By sacrificing photoreceptor signal-to-noise ratio and information capacity in dim light for an increased gain and, thus, an increased sensitivity, this strategy can benefit nocturnal insects that use neural summation to improve visual reliability at night.

  19. A Novel Evaluation Method for Building Construction Project Based on Integrated Information Entropy with Reliability Theory

    Directory of Open Access Journals (Sweden)

    Xiao-ping Bai

    2013-01-01

    Full Text Available Selecting construction schemes of the building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be selected to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses the quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computation in building construction projects.

  20. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    Science.gov (United States)

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes of the building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be selected to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses the quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for the progress, quality, and safety indexes, and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction projects. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making the analysis and decision. The presented method can offer valuable references for risk computation in building construction projects.
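
    The entropy-based weighting at the core of such a method can be illustrated with a small sketch: low-entropy (more discriminating) indexes receive higher weights, and a synthesis score ranks the schemes. The matrix below is invented (three candidate schemes scored on the four first-order indexes); the paper's actual index values and normalization details may differ.

    import numpy as np

    X = np.array([            # rows: candidate schemes; cols: cost, progress, quality, safety scores
        [0.82, 0.70, 0.90, 0.75],
        [0.76, 0.85, 0.80, 0.80],
        [0.90, 0.60, 0.70, 0.85],
    ])

    P = X / X.sum(axis=0)                  # share of each scheme within an index
    k = 1.0 / np.log(X.shape[0])
    E = -k * (P * np.log(P)).sum(axis=0)   # entropy of each index, in [0, 1]
    w = (1.0 - E) / (1.0 - E).sum()        # entropy weights
    scores = X @ w                         # synthesis score per scheme
    ranking = np.argsort(scores)[::-1]
    print("weights:", w.round(3))
    print("best scheme:", ranking[0], "scores:", scores.round(3))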

  1. Application of Vibration and Oil Analysis for Reliability Information on Helicopter Main Rotor Gearbox

    Science.gov (United States)

    Murrad, Muhamad; Leong, M. Salman

    Based on the experiences of the Malaysian Armed Forces (MAF), failure of the main rotor gearbox (MRGB) was one of the major contributing factors to helicopter breakdowns. Even though vibration and oil analysis are effective techniques for monitoring the health of helicopter components, these two techniques were rarely combined to form an effective assessment tool in the MAF. Results of the oil analysis were often used only for the oil changing schedule, while assessments of MRGB condition were mainly based on overall vibration readings. A study group was formed and given a mandate to improve the maintenance strategy of the S61-A4 helicopter fleet in the MAF. The improvement consisted of a structured approach to the reassessment/redefinition of suitable maintenance actions that should be taken for the MRGB. Basic and enhanced tools for condition monitoring (CM) are investigated to address the predominant failures of the MRGB. Quantitative accelerated life testing (QALT) was considered in this work with the intent of obtaining the required reliability information in a shorter time than with tests under normal stress conditions. Such tests, when performed correctly, can provide valuable information about MRGB performance under normal operating conditions, enabling maintenance personnel to make decisions more quickly, accurately and economically. The time-to-failure and probability-of-failure information for the MRGB were generated by applying QALT analysis principles. This study is anticipated to bring a dramatic change in the MAF's approach to CM, with significant savings and various benefits.

  2. Improving preimplantation genetic diagnosis (PGD) reliability by selection of sperm donor with the most informative haplotype.

    Science.gov (United States)

    Malcov, Mira; Gold, Veronica; Peleg, Sagit; Frumkin, Tsvia; Azem, Foad; Amit, Ami; Ben-Yosef, Dalit; Yaron, Yuval; Reches, Adi; Barda, Shimi; Kleiman, Sandra E; Yogev, Leah; Hauser, Ron

    2017-04-26

    The study aims to describe a novel strategy that increases the accuracy and reliability of PGD in patients using sperm donation by pre-selecting the donor whose haplotype does not overlap the carrier's. A panel of 4-9 informative polymorphic markers, flanking the mutation in carriers of autosomal dominant/X-linked disorders, was tested in the DNA of sperm donors before PGD. Whenever the lengths of donors' repeats overlapped those of the women, additional donors' DNA samples were analyzed. The donor that demonstrated minimal overlap with the patient was selected for IVF. In 8 out of 17 carriers the markers of the initially chosen donors overlapped the patients' alleles, and 2-8 additional sperm donors for each patient were haplotyped. The selection of additional sperm donors increased the number of informative markers and reduced the misdiagnosis risk from 6.00% ± 7.48% to 0.48% ± 0.68%. The PGD results were confirmed and no misdiagnosis was detected. Our study demonstrates that pre-selecting a sperm donor whose haplotype has minimal overlap with the female's haplotype is critical for reducing the misdiagnosis risk and ensuring reliable PGD. This strategy may help prevent the transfer of affected IVF-PGD embryos using a simple and economical procedure. All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. DNA testing of donors was approved by the institutional Helsinki committee (registration number 319-08TLV, 2008). The present study was approved by the institutional Helsinki committee (registration number 0385-13TLV, 2013).
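
    The donor pre-selection logic lends itself to a simple sketch: for each candidate donor, count the markers whose repeat lengths share no allele with the carrier, and select the donor with the most informative (non-overlapping) markers. Marker names and repeat lengths below are hypothetical, invented for illustration.

    carrier = {"D1": {15, 17}, "D2": {22, 24}, "D3": {9, 12}, "D4": {30, 31}}

    donors = {
        "donor_A": {"D1": {15, 18}, "D2": {25, 27}, "D3": {10, 11}, "D4": {30, 33}},
        "donor_B": {"D1": {19, 20}, "D2": {26, 28}, "D3": {10, 14}, "D4": {33, 35}},
    }

    def informative_markers(carrier_alleles, donor_alleles):
        """Markers where the donor shares no allele length with the carrier."""
        return [m for m in carrier_alleles
                if carrier_alleles[m].isdisjoint(donor_alleles[m])]

    best = max(donors, key=lambda d: len(informative_markers(carrier, donors[d])))
    for d in donors:
        print(d, informative_markers(carrier, donors[d]))
    print("selected donor:", best)  # donor_B: all four markers are informative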

  3. Effectiveness of different approaches to disseminating traveler information on travel time reliability.

    Science.gov (United States)

    2014-01-01

    The second Strategic Highway Research Program (SHRP 2) Reliability program aims to improve trip time reliability by reducing the frequency and effects of events that cause travel times to fluctuate unpredictably. Congestion caused by unreliable, or n...

  4. Development of E-Info geneca: a website providing computer-tailored information and question prompt prior to breast cancer genetic counseling.

    NARCIS (Netherlands)

    Albada, A.; Dulmen, S. van; Otten, R.; Bensing, J.M.; Ausems, M.G.E.M.

    2009-01-01

    This article describes the stepwise development of the website ‘E-info geneca’. The website provides counselees in breast cancer genetic counseling with computer-tailored information and a question prompt prior to their first consultation. Counselees generally do not know what to expect from genetic

  5. Reliability and Validity of the Clinical Dementia Rating for Community-Living Elderly Subjects without an Informant

    Directory of Open Access Journals (Sweden)

    Ma Shwe Zin Nyunt

    2013-10-01

    Full Text Available Background: The Clinical Dementia Rating (CDR scale is widely used to assess cognitive impairment in Alzheimer's disease. It requires collateral information from a reliable informant who is not available in many instances. We adapted the original CDR scale for use with elderly subjects without an informant (CDR-NI and evaluated its reliability and validity for assessing mild cognitive impairment (MCI and dementia among community-dwelling elderly subjects. Method: At two consecutive visits 1 week apart, nurses trained in CDR assessment interviewed, observed and rated cognitive and functional performance according to a protocol in 90 elderly subjects with suboptimal cognitive performance [Mini-Mental State Examination (MMSE Results: The CDR-NI scores (0, 0.5, 1 showed good internal consistency (Crohnbach's a 0.83-0.84, inter-rater reliability (κ 0.77-1.00 for six domains and 0.95 for global rating and test-retest reliability (κ 0.75-1.00 for six domains and 0.80 for global rating, good agreement (κ 0.79 with the clinical assessment status of MCI (n = 37 and dementia (n = 4 and significant differences in the mean scores for MMSE, MOCA and Instrumental Activities of Daily Living (ANOVA global p Conclusion: Owing to the protocol of the interviews, assessments and structured observations gathered during the two visits, CDR-NI provides valid and reliable assessment of MCI and dementia in community-living elderly subjects without an informant.

  6. Reliability and Validity of the Clinical Dementia Rating for Community-Living Elderly Subjects without an Informant.

    Science.gov (United States)

    Nyunt, Ma Shwe Zin; Chong, Mei Sian; Lim, Wee Shiong; Lee, Tih Shih; Yap, Philip; Ng, Tze Pin

    2013-01-01

    The Clinical Dementia Rating (CDR) scale is widely used to assess cognitive impairment in Alzheimer's disease. It requires collateral information from a reliable informant who is not available in many instances. We adapted the original CDR scale for use with elderly subjects without an informant (CDR-NI) and evaluated its reliability and validity for assessing mild cognitive impairment (MCI) and dementia among community-dwelling elderly subjects. At two consecutive visits 1 week apart, nurses trained in CDR assessment interviewed, observed and rated cognitive and functional performance according to a protocol in 90 elderly subjects with suboptimal cognitive performance [Mini-Mental State Examination (MMSE) reliability (κ 0.77-1.00 for six domains and 0.95 for global rating) and test-retest reliability (κ 0.75-1.00 for six domains and 0.80 for global rating), good agreement (κ 0.79) with the clinical assessment status of MCI (n = 37) and dementia (n = 4) and significant differences in the mean scores for MMSE, MOCA and Instrumental Activities of Daily Living (ANOVA global p reliable assessment of MCI and dementia in community-living elderly subjects without an informant.

  7. The Effect of Incorrect Reliability Information on Expectations, Perceptions, and Use of Automation.

    Science.gov (United States)

    Barg-Walkow, Laura H; Rogers, Wendy A

    2016-03-01

    We examined how providing artificially high or low statements about automation reliability affected expectations, perceptions, and use of automation over time. One common method of introducing automation is providing explicit statements about the automation's capabilities. Research is needed to understand how expectations from such introductions affect perceptions and use of automation. Explicit-statement introductions were manipulated to set higher-than (90%), same-as (75%), or lower-than (60%) levels of expectations in a dual-task scenario with 75% reliable automation. Two experiments were conducted to assess expectations, perceptions, compliance, reliance, and task performance over (a) 2 days and (b) 4 days. The baseline assessments showed initial expectations of automation reliability matched introduced levels of expectation. For the duration of each experiment, the lower-than groups' perceptions were lower than the actual automation reliability. However, the higher-than groups' perceptions were no different from actual automation reliability after Day 1 in either study. There were few differences between groups for automation use, which generally stayed the same or increased with experience using the system. Introductory statements describing artificially low automation reliability have a long-lasting impact on perceptions about automation performance. Statements including incorrect automation reliability do not appear to affect use of automation. Introductions should be designed according to desired outcomes for expectations, perceptions, and use of the automation. Low expectations have long-lasting effects. © 2015, Human Factors and Ergonomics Society.

  8. Independent component analysis using prior information for signal detection in a functional imaging system of the retina

    NARCIS (Netherlands)

    Barriga, E. Simon; Pattichis, Marios; Ts’o, Dan; Abramoff, Michael; Kardon, Randy; Kwon, Young; Soliz, Peter

    2011-01-01

    Independent component analysis (ICA) is a statistical technique that estimates a set of sources mixed by an unknown mixing matrix using only a set of observations. For this purpose, the only assumption is that the sources are statistically independent. In many applications, some information about
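
    Although the record is truncated, the core ICA idea it describes can be sketched with synthetic signals; scikit-learn's FastICA stands in here for whatever estimator the authors used, and the signals are toy waveforms rather than retinal data.

    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 8, 2000)
    s1 = np.sin(2 * t)                        # source 1: sinusoid
    s2 = np.sign(np.sin(3 * t))               # source 2: square wave
    S = np.c_[s1, s2] + 0.05 * rng.standard_normal((2000, 2))

    A = np.array([[1.0, 0.5], [0.4, 1.0]])    # mixing matrix (unknown in practice)
    X = S @ A.T                               # observations: rows are samples

    ica = FastICA(n_components=2, random_state=0)
    S_est = ica.fit_transform(X)              # estimated independent sources
    print("estimated mixing matrix:\n", ica.mixing_)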

  9. Reliability Information Analysis Center 1st Quarter 2007, Technical Area Task (TAT) Report

    Science.gov (United States)

    2007-02-05

    Analysis of collected reliability data (217Plus) has discovered that, even with sparse data, the data shows clustering of reliability data by equipment type relative to the intended search target. Conceptually, clustering discovered data allows more detailed analysis by equipment type. For example, it may be useful to

  10. Completeness and reliability of mortality data in Viet Nam: Implications for the national routine health management information system.

    Science.gov (United States)

    Hong, Tran Thi; Phuong Hoa, Nguyen; Walker, Sue M; Hill, Peter S; Rao, Chalapati

    2018-01-01

    Mortality statistics form a crucial component of national Health Management Information Systems (HMIS). However, there are limitations in the availability and quality of mortality data at the national level in Viet Nam. This study assessed the completeness of recorded deaths and the reliability of recorded causes of death (COD) in the A6 death registers in the national routine HMIS in Viet Nam. A total of 1477 identified deaths in 2014 were reviewed in two provinces. A capture-recapture method was applied to assess the completeness of the A6 death registers. 1365 household verbal autopsy (VA) interviews were successfully conducted, and these were reviewed by physicians who assigned multiple and underlying causes of death (UCOD). These UCODs from VA were then compared with the CODs recorded in the A6 death registers, using kappa scores to assess the reliability of the A6 death register diagnoses. The overall completeness of the A6 death registers in the two provinces was 89.3% (95%CI: 87.8-90.8). No COD recorded in the A6 death registers demonstrated good reliability. There is very low reliability in the recording of cardiovascular deaths (kappa for stroke = 0.47 and kappa for ischaemic heart diseases = 0.42) and diabetes (kappa = 0.33). The reporting of deaths due to road traffic accidents, HIV and some cancers is at a moderate level of reliability, with kappa scores ranging between 0.57 and 0.69. These findings have implications for strengthening mortality reporting in the national routine HMIS in Viet Nam.
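
    Two of the techniques named above are easy to sketch: a two-source capture-recapture estimate of registration completeness, and Cohen's kappa for agreement between register CODs and VA-derived UCODs. All counts and labels below are toy values, not the study's data.

    from sklearn.metrics import cohen_kappa_score

    # Two-source capture-recapture (Chandra Sekar-Deming style), toy counts
    n1 = 1320        # deaths found in the A6 death registers
    n2 = 1250        # deaths found by an independent household survey
    m = 1180         # deaths captured by both sources
    N_hat = n1 * n2 / m                 # estimated true number of deaths
    completeness = n1 / N_hat
    print(f"estimated completeness of the register: {completeness:.1%}")

    # Reliability of cause-of-death assignment (toy labels)
    register_cod = ["stroke", "ihd", "diabetes", "stroke", "cancer", "rta"]
    va_ucod      = ["stroke", "cancer", "diabetes", "ihd",   "cancer", "rta"]
    print("kappa:", round(cohen_kappa_score(register_cod, va_ucod), 2))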

  11. Prior Learning of Relevant Nonaversive Information Is a Boundary Condition for Avoidance Memory Reconsolidation in the Rat Hippocampus.

    Science.gov (United States)

    Radiske, Andressa; Gonzalez, Maria Carolina; Conde-Ocazionez, Sergio A; Feitosa, Anatildes; Köhler, Cristiano A; Bevilaqua, Lia R; Cammarota, Martín

    2017-10-04

    Reactivated memories can be modified during reconsolidation, making this process a potential therapeutic target for posttraumatic stress disorder (PTSD), a mental illness characterized by the recurring avoidance of situations that evoke trauma-related fears. However, avoidance memory reconsolidation depends on a set of still loosely defined boundary conditions, limiting the translational value of basic research. In particular, the involvement of the hippocampus in fear-motivated avoidance memory reconsolidation remains controversial. Combining behavioral and electrophysiological analyses in male Wistar rats, we found that previous learning of relevant nonaversive information is essential to elicit the participation of the hippocampus in avoidance memory reconsolidation, which is associated with an increase in theta- and gamma-oscillation power and cross-frequency coupling in dorsal CA1 during reactivation of the avoidance response. Our results indicate that the hippocampus is involved in memory reconsolidation only when reactivation results in contradictory representations regarding the consequences of avoidance and suggest that robust nesting of hippocampal theta-gamma rhythms at the time of retrieval is a specific reconsolidation marker. SIGNIFICANCE STATEMENT Posttraumatic stress disorder (PTSD) is characterized by maladaptive avoidance responses to stimuli or behaviors that represent or bear resemblance to some aspect of a traumatic experience. Disruption of reconsolidation, the process by which reactivated memories become susceptible to modifications, is a promising approach for treating PTSD patients. However, much of what is known about fear-motivated avoidance memory reconsolidation derives from studies based on fear conditioning instead of avoidance-learning paradigms. Using a step-down inhibitory avoidance task in rats, we found that the hippocampus is involved in memory reconsolidation only when the animals acquired the avoidance response in an

  12. Normalization to specific gravity prior to analysis improves information recovery from high resolution mass spectrometry metabolomic profiles of human urine.

    Science.gov (United States)

    Edmands, William M B; Ferrari, Pietro; Scalbert, Augustin

    2014-11-04

    Extraction of meaningful biological information from urinary metabolomic profiles obtained by liquid chromatography coupled to mass spectrometry (MS) necessitates the control of unwanted sources of variability associated with large differences in urine sample concentrations. Different methods of normalization, either before analysis (preacquisition normalization) through dilution of urine samples to the lowest specific gravity measured by refractometry, or after analysis (postacquisition normalization) to urine volume, specific gravity and median fold change, are compared for their capacity to recover lead metabolites for potential future use as dietary biomarkers. Twenty-four urine samples of 19 subjects from the European Prospective Investigation into Cancer and Nutrition (EPIC) cohort were selected based on their high and low/nonconsumption of six polyphenol-rich foods as assessed with a 24 h dietary recall. MS features selected on the basis of minimum discriminant selection criteria were related to each dietary item by means of orthogonal partial least-squares discriminant analysis models. Normalization methods ranked in the following decreasing order when comparing the number of total discriminant MS features recovered to that obtained in the absence of normalization: preacquisition normalization to specific gravity (4.2-fold), postacquisition normalization to specific gravity (2.3-fold), postacquisition median fold change normalization (1.8-fold), postacquisition normalization to urinary volume (0.79-fold). A preventative preacquisition normalization based on urine specific gravity was found to be superior to all curative postacquisition normalization methods tested for discovery of MS features discriminant of dietary intake in these urinary metabolomic datasets.
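
    Postacquisition median fold change normalization can be sketched as follows: scale each sample by the median of its feature-wise fold changes relative to a reference profile (a probabilistic-quotient-style scheme). The feature matrix below is synthetic; real inputs would be LC-MS feature intensities.

    import numpy as np

    rng = np.random.default_rng(1)
    true_profile = rng.uniform(1, 100, size=200)       # shared metabolite profile
    dilutions = np.array([0.4, 1.0, 2.5, 0.8])         # per-sample urine concentration
    X = np.outer(dilutions, true_profile) * rng.normal(1, 0.05, (4, 200))

    reference = np.median(X, axis=0)                   # reference profile
    quotients = X / reference                          # feature-wise fold changes
    factors = np.median(quotients, axis=1)             # one scaling factor per sample
    X_norm = X / factors[:, None]

    # The factors track the true dilutions up to a common scale
    print("estimated dilution factors:", factors.round(2))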

  13. Health Information Needs and Reliability of Sources Among Nondegree Health Sciences Students: A Prerequisite for Designing eHealth Literacy.

    Science.gov (United States)

    Haruna, Hussein; Tshuma, Ndumiso; Hu, Xiao

    Understanding health information needs and health-seeking behavior is a prerequisite for developing an electronic health information literacy (EHIL) or eHealth literacy program for nondegree health sciences students. At present, interest in researching health information needs and reliable-sources paradigms has gained momentum in many countries. However, most studies focus on health professionals and students in higher education institutions. The present study was aimed at providing new insight and filling the existing gap by examining health information needs and the reliability of sources among nondegree health sciences students in Tanzania. A cross-sectional study was conducted in 15 conveniently selected health training institutions, in which 403 health sciences students participated. Thirty health sciences students were both purposively and conveniently chosen from each health training institution. The selected students were pursuing nursing and midwifery, clinical medicine, dentistry, environmental health sciences, pharmacy, and medical laboratory sciences courses. The students involved were in their first, second, or third year of study. Health sciences students' health information needs focus on their educational requirements, clinical practice, and personal information. They use print, human, and electronic health information sources. They lack eHealth research skills for navigating health information resources and face insufficient facilities for accessing eHealth information, a lack of health information specialists, high costs of subscription electronic information, and unawareness of the availability of free Internet resources and other online health-related databases. This study found that nondegree health sciences students have limited skills in EHIL. Thus, designing and incorporating EHIL skills programs into the curriculum of nondegree health sciences students is vital. EHIL is a requirement common to all health settings, learning environments, and

  14. SU-F-J-23: Field-Of-View Expansion in Cone-Beam CT Reconstruction by Use of Prior Information

    Energy Technology Data Exchange (ETDEWEB)

    Haga, A; Magome, T; Nakano, M; Nakagawa, K [University of Tokyo Hospital, Tokyo (Japan); Kotoku, J [Teikyo University, Tokyo (Japan)

    2016-06-15

    Purpose: Cone-beam CT (CBCT) has become an integral part of online patient setup in image-guided radiation therapy (IGRT). In addition, the utility of CBCT for dose calculation has actively been investigated. However, the limited size of the field-of-view (FOV), and the resulting CBCT image lacking the peripheral area of the patient body, limits the reliability of dose calculation. In this study, we aim to develop an FOV-expanded CBCT in an IGRT system to allow dose calculation. Methods: Three lung cancer patients were selected in this study. We collected the cone-beam projection images in the CBCT-based IGRT system (X-ray volume imaging unit, ELEKTA), where the FOV size of the CBCT provided with these projections was 410 × 410 mm² (normal FOV). Using these projections, CBCT with a size of 728 × 728 mm² was reconstructed by a posteriori estimation algorithm including prior image constrained compressed sensing (PICCS). The treatment planning CT was used as the prior image. To assess the effectiveness of FOV expansion, a dose calculation was performed on the expanded CBCT image with a region-of-interest (ROI) density mapping method, and it was compared with that of the treatment planning CT as well as that of CBCT reconstructed by the filtered back projection (FBP) algorithm. Results: The a posteriori estimation algorithm with PICCS clearly visualized the area outside the normal FOV, whereas the FBP algorithm yielded severe streak artifacts outside the normal FOV due to under-sampling. The dose calculation result using the expanded CBCT agreed with that using the treatment planning CT very well; the maximum dose difference was 1.3% for gross tumor volumes. Conclusion: With a posteriori estimation algorithm, the FOV in CBCT can be expanded. Dose comparison results suggested that the use of expanded CBCTs is acceptable for dose calculation in adaptive radiation therapy. This study has been supported by KAKENHI (15K08691).
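
    For orientation, the constrained PICCS objective that a posteriori reconstructions of this kind typically build on can be written as follows; this is a sketch following the commonly cited formulation, not necessarily the exact objective used in this work:

    \min_{x} \; \alpha \,\bigl\lVert \Psi_1 (x - x_{\mathrm{p}}) \bigr\rVert_1 + (1-\alpha)\,\bigl\lVert \Psi_2\, x \bigr\rVert_1 \quad \text{subject to} \quad A x = y,

    where x is the reconstructed volume, x_p the prior image (here the treatment planning CT), Psi_1 and Psi_2 sparsifying transforms, A the cone-beam forward projection operator, y the measured projections, and alpha the weight balancing fidelity to the prior against sparsity of the image itself.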

  15. Identification of Classified Information in Unclassified DoD Systems During the Audit of Internal Controls and Data Reliability in the Deployable Disbursing System

    Science.gov (United States)

    2009-02-17

    Identification of Classified Information in Unclassified DoD Systems During the Audit of Internal Controls and Data Reliability in the Deployable Disbursing System (Report No. D-2009-054). We are providing this

  16. Practical tool to assess reliability of web-based medicines information.

    Science.gov (United States)

    Lebanova, Hristina; Getov, Ilko; Grigorov, Evgeni

    2014-02-01

    Information disseminated by medicines information systems is not always easy to apply. Nowadays the internet provides access to an enormous volume and range of health information that was previously inaccessible both to medical specialists and to consumers. The aim of this study is to assess the internet as a source of drug and health related information and to create a test methodology to evaluate the 10 most visited health-related websites in Bulgaria. Using existing scientific methodologies for the evaluation of web sources, a new three-step algorithm was created, consisting of a score-card validation of the drug-related information in the 10 most visited Bulgarian websites. In many cases the drug information on these websites contained errors and discrepancies. Some of the published materials were not validated; they were out of date and could cause confusion for consumers. The quality of online health information is a cause of considerable information noise and a threat to patients' safety and rational drug use. There is a need to monitor the drug information available online in order to prevent patient misinformation and confusion that could lead to medication errors and abuse.

  17. The Prior-project

    DEFF Research Database (Denmark)

    Engerer, Volkmar Paul; Roued-Cunliffe, Henriette; Albretsen, Jørgen

    In this paper, we present a DH research infrastructure which relies heavily on a combination of domain knowledge with information technology. The general goal is to develop tools to aid scholars in their interpretations and understanding of temporal logic. This in turn is based on an extensive digitisation of Arthur Prior's Nachlass kept in the Bodleian Library, Oxford. The DH infrastructure in question is the Prior Virtual Lab (PVL). PVL was established in 2011 in order to provide researchers in the field of temporal logic easy access to the papers of Arthur Norman Prior (1914-1969), and officially

  18. The reliability of manual reporting of clinical events in an anesthesia information management system (AIMS).

    Science.gov (United States)

    Simpao, Allan F; Pruitt, Eric Y; Cook-Sather, Scott D; Gurnaney, Harshad G; Rehman, Mohamed A

    2012-12-01

    Manual incident reports significantly under-report adverse clinical events when compared with automated recordings of intraoperative data. Our goal was to determine the reliability of AIMS and CQI reports of adverse clinical events that had been witnessed and recorded by research assistants. The AIMS and CQI records of 995 patients aged 2-12 years were analyzed to determine if anesthesia providers had properly documented the emesis events that were observed and recorded by research assistants who were present in the operating room at the time of induction. Research assistants recorded eight cases of emesis during induction that were confirmed with the attending anesthesiologist at the time of induction. AIMS yielded a sensitivity of 38% (95% confidence interval [CI] 8.5-75.5%), while the sensitivity of CQI reporting was 13% (95% CI 0.3-52.7%). The low sensitivities of the AIMS and CQI reports suggest that user-reported AIMS and CQI data do not reliably include significant clinical events.
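
    The confidence intervals quoted above are consistent with exact (Clopper-Pearson) binomial intervals for 3 and 1 detected events out of 8; the short check below assumes those counts.

    from scipy.stats import beta

    def clopper_pearson(k: int, n: int, alpha: float = 0.05):
        """Exact binomial confidence interval for k successes out of n."""
        lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
        hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
        return lo, hi

    for k, label in [(3, "AIMS"), (1, "CQI")]:
        lo, hi = clopper_pearson(k, 8)
        print(f"{label}: sensitivity {k/8:.0%}, 95% CI ({lo:.1%}, {hi:.1%})")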

  19. Accommodating Uncertainty in Prior Distributions

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-01-19

    A fundamental premise of Bayesian methodology is that a priori information is accurately summarized by a single, precisely defined prior distribution. In many cases, especially involving informative priors, this premise is false, and the (mis)application of Bayes methods produces posterior quantities whose apparent precisions are highly misleading. We examine the implications of uncertainty in prior distributions, and present graphical methods for dealing with them.
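
    A toy illustration of the point (not the report's method): combining the same small data set with several defensible informative priors shows how strongly posterior summaries can depend on the prior choice.

    from scipy.stats import beta

    successes, trials = 3, 10   # assumed reliability test data

    # A family of Beta priors an engineer might plausibly defend
    priors = [(2, 8), (5, 5), (8, 2), (20, 20)]
    for a, b in priors:
        post = beta(a + successes, b + trials - successes)
        print(f"Beta({a},{b}) prior -> posterior mean {post.mean():.3f}, "
              f"95% interval ({post.ppf(0.025):.3f}, {post.ppf(0.975):.3f})")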

  20. Frontiers of reliability

    CERN Document Server

    Basu, Asit P; Basu, Sujit K

    1998-01-01

    This volume presents recent results in reliability theory by leading experts from around the world. It will prove valuable for researchers and users of reliability theory. It consists of refereed invited papers on a broad spectrum of topics in reliability. The subjects covered include Bayesian reliability, Bayesian reliability modeling, confounding in a series system, DF tests, Edgeworth approximation to reliability, estimation under random censoring, fault tree reduction for reliability, inference about changes in hazard rates, information theory and reliability, mixture experiment, mixture of Weibul

  1. Trust in Testimony about Strangers: Young Children Prefer Reliable Informants Who Make Positive Attributions

    Science.gov (United States)

    Boseovski, Janet J.

    2012-01-01

    Young children have been described as critical consumers of information, particularly in the domain of language learning. Indeed, children are more likely to learn novel words from people with accurate histories of object labeling than with inaccurate ones. But what happens when informant testimony conflicts with a tendency to see the world in a…

  2. Validation of Land Cover Products Using Reliability Evaluation Methods

    Directory of Open Access Journals (Sweden)

    Wenzhong Shi

    2015-06-01

    Full Text Available Validation of land cover products is a fundamental task prior to data applications. Current validation schemes and methods are, however, suited only for assessing classification accuracy and disregard the reliability of land cover products. The reliability evaluation of land cover products should be undertaken to provide reliable land cover information. In addition, the lack of high-quality reference data often constrains validation and affects the reliability results of land cover products. This study proposes a validation schema to evaluate the reliability of land cover products, including two methods, namely, result reliability evaluation and process reliability evaluation. Result reliability evaluation computes the reliability of land cover products using seven reliability indicators. Process reliability evaluation analyzes the reliability propagation in the data production process to obtain the reliability of land cover products. Fuzzy fault tree analysis is introduced and improved in the reliability analysis of a data production process. Research results show that the proposed reliability evaluation scheme is reasonable and can be applied to validate land cover products. Through the analysis of the seven indicators of result reliability evaluation, more information on land cover can be obtained for strategic decision-making and planning, compared with traditional accuracy assessment methods. Process reliability evaluation without the need for reference data can facilitate the validation and reflect the change trends of reliabilities to some extent.

  3. Development of a Tablet-based symbol digit modalities test for reliably assessing information processing speed in patients with stroke.

    Science.gov (United States)

    Tung, Li-Chen; Yu, Wan-Hui; Lin, Gong-Hong; Yu, Tzu-Ying; Wu, Chien-Te; Tsai, Chia-Yin; Chou, Willy; Chen, Mei-Hsiang; Hsieh, Ching-Lin

    2016-09-01

    To develop a Tablet-based Symbol Digit Modalities Test (T-SDMT) and to examine the test-retest reliability and concurrent validity of the T-SDMT in patients with stroke. The study had two phases. In the first phase, six experts, nine college students and five outpatients participated in the development and testing of the T-SDMT. In the second phase, 52 outpatients were evaluated twice (2 weeks apart) with the T-SDMT and SDMT to examine the test-retest reliability and concurrent validity of the T-SDMT. The T-SDMT was developed via expert input and college student/patient feedback. Regarding test-retest reliability, the practice effects of the T-SDMT and SDMT were both trivial (d = 0.12) but significant (p ≤ 0.015). The improvement in the T-SDMT (4.7%) was smaller than that in the SDMT (5.6%). The minimal detectable changes (MDC%) of the T-SDMT and SDMT were 6.7 (22.8%) and 10.3 (32.8%), respectively. The T-SDMT and SDMT were highly correlated with each other at the two time points (Pearson's r = 0.90-0.91). The T-SDMT demonstrated good concurrent validity with the SDMT. Because the T-SDMT had a smaller practice effect and less random measurement error (superior test-retest reliability), it is recommended over the SDMT for assessing information processing speed in patients with stroke. Implications for Rehabilitation The Symbol Digit Modalities Test (SDMT), a common measure of information processing speed, showed a substantial practice effect and considerable random measurement error in patients with stroke. The Tablet-based SDMT (T-SDMT) has been developed to reduce the practice effect and random measurement error of the SDMT in patients with stroke. The T-SDMT had a smaller practice effect and less random measurement error than the SDMT, and so can provide more reliable assessments of information processing speed.
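
    MDC values like those reported above follow from the standard test-retest formula MDC95 = 1.96 · sqrt(2) · SEM, with SEM = SD · sqrt(1 - r). The sketch below uses placeholder values for the baseline SD and reliability coefficient, not the study's numbers.

    import math

    def mdc95(sd: float, reliability: float) -> float:
        sem = sd * math.sqrt(1.0 - reliability)   # standard error of measurement
        return 1.96 * math.sqrt(2.0) * sem        # minimal detectable change (95%)

    sd, icc = 8.0, 0.91                           # assumed baseline SD and ICC
    change = mdc95(sd, icc)
    print(f"MDC95 = {change:.1f} points")         # changes beyond this exceed measurement noise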

  4. Validity and reliability testing of two instruments to measure breast cancer patients' concerns and information needs relating to radiation therapy

    Directory of Open Access Journals (Sweden)

    Kristjanson Linda J

    2007-11-01

    Full Text Available Abstract Background It is difficult to determine the most effective approach to patient education or to tailor education interventions for patients in radiotherapy without tools that assess patients' specific radiation therapy information needs and concerns. Therefore, the aim of this study was to develop psychometrically sound tools to adequately determine the concerns and information needs of cancer patients during radiation therapy. Patients and Methods Two tools were developed to (1) determine patients' concerns about radiation therapy (RT Concerns Scale) and (2) ascertain patients' information needs at different time points during their radiation therapy (RT Information Needs Scale). The tools were based on previous research by the authors, published literature on breast cancer and radiation therapy, and information behaviour research. Thirty-one breast cancer patients completed the questionnaire on one occasion and thirty participants completed the questionnaire on a second occasion to facilitate test-retest reliability. One participant's responses were removed from the analysis. Results were analysed for content validity, internal consistency and stability over time. Results Both tools demonstrated high internal consistency and adequate stability over time. The nine items in the RT Concerns Scale were retained because they met all pre-set psychometric criteria. Two items were deleted from the RT Information Needs Scale because they did not meet content validity criteria and did not achieve pre-specified criteria for internal consistency. This tool now contains 22 items. Conclusion This paper provides preliminary data suggesting that the two tools presented are reliable and valid and would be suitable for use in trials or in the clinical setting.

  5. Minimal information: an urgent need to assess the functional reliability of recombinant proteins used in biological experiments

    Directory of Open Access Journals (Sweden)

    de Marco Ario

    2008-07-01

    Full Text Available Abstract Structural characterization of proteins used in biological experiments is largely neglected. In most publications, the information available is totally insufficient to judge the functionality of the proteins used and, therefore, the significance of identified protein-protein interactions (was the interaction specific or due to unspecific binding of misfolded protein regions?) or the reliability of kinetic and thermodynamic data (how much protein was in its native form?). As a consequence, the results of single experiments might not only become questionable, but the whole reliability of systems biology, built on these foundations, would be weakened. The introduction of Minimal Information concerning purified proteins, added as metadata to the main body of a manuscript, would make it straightforward to assess their functional and structural qualities and, consequently, the results obtained using these proteins. Furthermore, accepted standards for protein annotation would simplify data comparison and exchange. This article is envisaged as a proposal for bringing together scientists who share the opinion that the scientific community needs a platform for Minimum Information for Protein Functionality Evaluation (MIPFE).

  6. Impact of Transport Layer Protocols on Reliable Information Access in Smart Grids

    DEFF Research Database (Denmark)

    Shahid, Kamal; Saeed, Aamir; Kristensen, Thomas le Fevre

    2017-01-01

    Time is critical for certain types of dynamic information (e.g. frequency control) in a smart grid scenario. The usefulness of such information depends upon its arrival within a specific frame of time; otherwise it may not serve its purpose and will affect the controller's performance. The question is addressed by analyzing the performance of UDP and TCP over imperfect network conditions to show how the selection of the transport layer protocol can dramatically affect the controller's performance. This analysis is based on a quality metric called mismatch probability that considers occurrence

  7. Treatment of Passive Component Reliability in Risk-Informed Safety Margin Characterization FY 2010 Report

    Energy Technology Data Exchange (ETDEWEB)

    Robert W Youngblood

    2010-09-01

    The Risk-Informed Safety Margin Characterization (RISMC) pathway is a set of activities defined under the U.S. Department of Energy (DOE) Light Water Reactor Sustainability Program. The overarching objective of RISMC is to support plant life-extension decision-making by providing a state-of-knowledge characterization of safety margins in key systems, structures, and components (SSCs). A technical challenge at the core of this effort is to establish the conceptual and technical feasibility of analyzing safety margin in a risk-informed way, which, unlike conventionally defined deterministic margin analysis, is founded on probabilistic characterizations of SSC performance.

  8. Process-aware information system development for the healthcare domain : consistency, reliability and effectiveness

    NARCIS (Netherlands)

    Mans, R.S.; Aalst, van der W.M.P.; Russell, N.C.; Bakker, P.J.M.; Moleman, A.J.; Rinderle-Ma, S.; Sadiq, S.; Leymann, F.

    2010-01-01

    Optimal support for complex healthcare processes cannot be provided by a single out-of-the-box Process-Aware Information System and necessitates the construction of customized applications based on these systems. In order to allow for the seamless integration of the new technology into the existing

  9. Is anterior N2 enhancement a reliable electrophysiological index of concealed information?

    Science.gov (United States)

    Ganis, Giorgio; Bridges, David; Hsu, Chun-Wei; Schendan, Haline E

    2016-12-01

    Concealed information tests (CITs) are used to determine whether an individual possesses information about an item of interest. Event-related potential (ERP) measures in CITs have focused almost exclusively on the P3b component, showing that this component is larger when lying about the item of interest (probe) than telling the truth about control items (irrelevants). Recent studies have begun to examine other ERP components, such as the anterior N2, with mixed results. A seminal CIT study found that visual probes elicit a larger anterior N2 than irrelevants (Gamer and Berti, 2010) and suggested that this component indexes cognitive control processes engaged when lying about probes. However, this study did not control for potential intrinsic differences among the stimuli: the same probe and irrelevants were used for all participants, and there was no control condition composed of uninformed participants. Here, first we show that the N2 effect found in the study by Gamer and Berti (2010) was in large part due to stimulus differences, as the effect observed in a concealed information condition was comparable to that found in two matched control conditions without any concealed information (Experiments 1 and 2). Next, we addressed the issue of the generality of the N2 findings by counterbalancing a new set of stimuli across participants and by using a control condition with uninformed participants (Experiment 3). Results show that the probe did not elicit a larger anterior N2 than the irrelevants under these controlled conditions. These findings suggest that caution should be taken in using the N2 as an index of concealed information in CITs. Furthermore, they are a reminder that results of CIT studies (not only with ERPs) performed without stimulus counterbalancing and suitable control conditions may be confounded by differential intrinsic properties of the stimuli employed. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Improvement of Steam Turbine Operational Performance and Reliability with using Modern Information Technologies

    Science.gov (United States)

    Brezgin, V. I.; Brodov, Yu M.; Kultishev, A. Yu

    2017-11-01

    The report presents a review of improvement methods in the design and operation of steam turbine units based on the application of modern information technologies. In accordance with the life cycle (LC) support methodology, a conceptual model of the information support system for the main LC stages of a steam turbine unit is suggested. A classification system, which ensures the creation of sustainable information links between the engineering team (manufacturer's plant) and customer organizations (power plants), is proposed. Within the report, the principle of extending parameterization beyond geometric constructions in the design and improvement of steam turbine unit equipment is proposed, studied and justified. The report presents a steam turbine unit equipment design methodology based on a brand new oil-cooler design system that has been developed and implemented by the authors. This design system combines a construction subsystem, which is characterized by extensive usage of family tables and templates, and a computation subsystem, which includes a methodology for thermal-hydraulic zone-by-zone oil-cooler design calculations. The report also presents data about the developed software for operational monitoring and assessment of equipment parameters, as well as its implementation at five power plants.

  11. Reliability Engineering

    CERN Document Server

    Lazzaroni, Massimo

    2012-01-01

    This book gives a practical guide for designers and users in the Information and Communication Technology context. In particular, in the first section, definitions of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing laboratory tests, highlights the reliability concept from the experimental point of view. In the ICT context, the failure rate for a given system can be

  12. Reliability training

    Science.gov (United States)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  13. Diverse Data Sets Can Yield Reliable Information through Mechanistic Modeling: Salicylic Acid Clearance.

    Science.gov (United States)

    Raymond, G M; Bassingthwaighte, J B

    This is a practical example of a powerful research strategy: putting together data from studies covering a diversity of conditions can yield a scientifically sound grasp of the phenomenon when the individual observations failed to provide definitive understanding. The rationale is that defining a realistic, quantitative, explanatory hypothesis for the whole set of studies brings about a "consilience" of the often competing hypotheses considered for individual data sets. An internally consistent conjecture linking multiple data sets simultaneously provides stronger evidence on the characteristics of a system than does analysis of individual data sets limited to narrow ranges of conditions. Our example examines three very different data sets on the clearance of salicylic acid from humans: a high-concentration set from aspirin overdoses; a set with medium concentrations from a research study on the influences of the route of administration and of sex on the clearance kinetics; and a set on low-dose aspirin for cardiovascular health. Three models were tested: (1) a first-order reaction, (2) a Michaelis-Menten (M-M) approach, and (3) an enzyme kinetic model with forward and backward reactions. The reaction rates found from model 1 were distinctly different for the three data sets, having no commonality. The M-M model 2 fitted each of the three data sets but gave a reliable estimate of the Michaelis constant only for the medium-level data (K m = 24±5.4 mg/L); analyzing the three data sets together with model 2 gave K m = 18±2.6 mg/L. (Estimating parameters using larger numbers of data points in an optimization increases the degrees of freedom, constraining the range of the estimates.) Using the enzyme kinetic model (3) increased the number of free parameters but nevertheless improved the goodness of fit to the combined data sets, giving tighter constraints and a lower estimated K m = 14.6±2.9 mg/L, demonstrating that fitting diverse data sets with a single model
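
    Model 2 (Michaelis-Menten elimination) can be sketched as an ODE fit: dC/dt = -Vmax · C / (Km + C), with Vmax, Km and the initial concentration estimated from pooled concentration-time data. The data points below are synthetic stand-ins, not the study's measurements.

    import numpy as np
    from scipy.integrate import odeint
    from scipy.optimize import curve_fit

    def mm_conc(t, vmax, km, c0):
        """Concentration-time curve under Michaelis-Menten elimination."""
        def rhs(c, _t):
            return -vmax * c / (km + c)
        return odeint(rhs, c0, t).ravel()

    t_obs = np.array([0.0, 2, 4, 8, 12, 24, 36, 48])                 # hours
    c_obs = mm_conc(t_obs, vmax=6.0, km=15.0, c0=300.0)              # "true" curve, mg/L
    c_obs = c_obs * np.random.default_rng(2).normal(1, 0.05, t_obs.size)

    popt, pcov = curve_fit(mm_conc, t_obs, c_obs,
                           p0=[5.0, 20.0, 250.0], bounds=(0, np.inf))
    print("Vmax, Km, C0 estimates:", popt.round(1))
    print("std errors:", np.sqrt(np.diag(pcov)).round(1))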

  14. A Framework for Reliable Reception of Wireless Metering Data using Protocol Side Information

    DEFF Research Database (Denmark)

    Melchior Jacobsen, Rasmus; Popovski, Petar

    2013-01-01

    The framework exploits the deterministic protocol structure to obtain side information and group the packets from the same meter. We derive the probability of falsely pairing packets from different senders in the simple case of no channel errors, and show through simulation and data from an experimental deployment the probability of false pairing with channel errors. The pairing is an essential step towards recovery of metering data from as many meters as possible under harsh channel conditions. From the experiment we find that more than 15% of all conducted pairings are between two erroneous packets, which sets an upper bound

  15. [Are newspapers a reliable source of information about doping in sports?].

    Science.gov (United States)

    Durrieu, Geneviève; Gorsse, Elisabeth; Montastruc, Jean-Louis

    2004-01-01

    To study the coverage by French newspapers of doping in sports, we performed a systematic review of articles appearing between January and March 2003 on the following French websites: L'Equipe, Le Monde, Le Figaro, Libération, La Dépêche du Midi and Agence France-Presse (AFP). We recorded a total of 58 articles about doping. Among them, 48 (83%) were collected from the AFP news. L'Equipe, a French sports newspaper, published seven articles (12%). Most of the recorded data reported results of worldwide antidoping control (71%). No information about new drugs was found. The analysis of the selected articles pointed out the following: (i) the seriousness of observations related to doping since, during this 3-month period, we noted two deaths of athletes; (ii) the risks associated with the use of dietary supplements, particularly products including amphetamine derivatives; (iii) the interest in judicial investigation as an information source about doping in sports (investigation of suspicious deaths of Italian football players); and (iv) identification of the sports involved in doping (cycling, but also athletics, football, rugby). Systematic analysis of newspaper reports can be considered as a relevant method for monitoring the pharmacovigilance and pharmacoepidemiology of doping in sports.

  16. Reliable structural information from multiscale decomposition with the Mellor-Brady filter

    Science.gov (United States)

    Szilágyi, Tünde; Brady, Michael

    2009-08-01

    Image-based medical diagnosis typically relies on the (poorly reproducible) subjective classification of textures in order to differentiate between diseased and healthy pathology. Clinicians claim that significant benefits would arise from quantitative measures to inform clinical decision making. The first step in generating such measures is to extract local image descriptors, from noise-corrupted and often spatially and temporally coarse-resolution medical signals, that are invariant to illumination, translation, scale and rotation of the features. The Dual-Tree Complex Wavelet Transform (DT-CWT) provides a wavelet multiresolution analysis (WMRA) tool, e.g. in 2D, with good properties, but has limited rotational selectivity. Also, it requires computationally intensive steering due to the inherently 1D operations performed. The monogenic signal, which is defined in n ≥ 2 dimensions via the Riesz transform, gives excellent orientation information without the need for steering. Recent work has suggested the Monogenic Riesz-Laplace wavelet transform as a possible tool for integrating these two concepts into a coherent mathematical framework. We have found that the proposed construction suffers from a lack of rotational invariance and is not optimal for retrieving local image descriptors. In this paper we show: 1. Local frequency and local phase from the monogenic signal are not equivalent, especially in the phase congruency model of a "feature", and so they are not interchangeable for medical image applications. 2. The accuracy of local phase computation may be improved by estimating the denoising parameters while maximizing a new measure of "featureness".

  17. Challenges and opportunities for informational societies from the present to become reliable learning and knowledge societies

    Directory of Open Access Journals (Sweden)

    Eduardo ROMERO SÁNCHEZ

    2013-12-01

    Full Text Available This article aims to describe the principal social trends and cultural features that prevail today, to examine the philosophical foundations of thinking, feeling and living, and to give an educational response appropriate to the present axiological and cultural reality. In modern western societies great paradoxes and contradictions coexist: economic growth, technological development and greater dimensions of freedom, but also mass consumption, cultural deterioration, technological dependence and uniform thinking. Given this, we discuss the great possibilities and, at the same time, the serious threats present in modern information societies. In order to become acquainted with this reality, we have focused the analysis on three key aspects: the impact of the digital revolution, the condition of culture in contemporary society, and the need for a "new education".

  18. Data integration and knowledge discovery in biomedical databases. Reliable information from unreliable sources

    Directory of Open Access Journals (Sweden)

    A Mitnitski

    2003-01-01

    Full Text Available To better understand information about human health from databases, we analyzed three datasets collected for different purposes in Canada: a biomedical database of older adults, a large population survey across all adult ages, and vital statistics. Redundancy in the variables was established, and this led us to derive a generalized (macroscopic) state variable, namely a fitness/frailty index that reflects both individual and group health status. Evaluation of the relationship between fitness/frailty and the mortality rate revealed that the latter could be expressed in terms of variables generally available from any cross-sectional database. In practical terms, this means that the risk of mortality might readily be assessed from standard biomedical appraisals collected for other purposes.

  19. Reliable Design Versus Trust

    Science.gov (United States)

    Berg, Melanie; LaBel, Kenneth A.

    2016-01-01

    This presentation focuses on reliability and trust for the user's portion of the FPGA design flow. It is assumed that the manufacturer tests FPGA internal components prior to hand-off to the user. The objective is to present the challenges of creating reliable and trusted designs. The following will be addressed: What makes a design vulnerable to functional flaws (reliability) or attackers (trust)? What are the challenges for verifying a reliable design versus a trusted design?

  20. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

    This book begins by asking what reliability is, covering the origin of reliability problems, the definition of reliability, and the uses of reliability. It also deals with probability and the calculation of reliability, the reliability function and failure rate, probability distributions in reliability, estimation of MTBF, processing of probability distributions, downtime, maintainability and availability, breakdown maintenance and preventive maintenance, reliability design, reliability design for prediction and statistics, reliability testing, reliability data, and the design and management of reliability.

  1. How do nurses and teachers perform breast self-examination: are they reliable sources of information?

    Directory of Open Access Journals (Sweden)

    Ozvurmaz Safiye

    2007-06-01

    Full Text Available Abstract Background Breast cancer is the most common cause of cancer-related deaths among women worldwide. The aim of the present study was to determine and compare knowledge, behavior and attitudes among female nurses and teachers concerning breast self-examination (BSE). Methods Two-hundred and eighty-nine women working in Aydin, Turkey (125 nurses and 164 teachers) were included in the study. The data were collected using a questionnaire designed to measure the knowledge, attitudes and behavior of the groups. Analysis involved percentiles, χ2 tests, t tests and factor analysis. Results The knowledge of nurses about BSE was higher than that of teachers (81.5% versus 45.1%; p < 0.05), whereas skills in performing self-examination were higher in nurses. Conclusion We conclude that nurses and teachers should be supported with information enabling them to accomplish their roles in the community. To improve BSE practice, it is crucial to coordinate continuous and planned education.

  2. The reliability of financial information of charitable organizations: an exploratory study based on the Benford’s Law

    Directory of Open Access Journals (Sweden)

    Marco Antonio Figueiredo Milani Filho

    2013-08-01

    Full Text Available Benford's Law (BL) is a logarithmic distribution which is useful for detecting abnormal digit patterns in number sets. It is often used as a primary data-auditing method for detecting traces of errors, illegal practices or undesired occurrences, such as fraud and earnings management. In this descriptive study, I analyzed the financial information (revenue and expenditure) of the registered charitable hospitals located in Ontario and Quebec, which hold the majority (71.4%) of these organizations within Canada. The aim of this study was to verify the reliability of the financial data of the respective hospitals, using the probability distribution predicted by Benford’s Law as a proxy of reliability. The sample was composed of 1,334 observations related to 339 entities operating in the tax year 2009 and 328 entities in 2010, gathered from the Canada Revenue Agency’s database. To analyze the discrepancies between the actual and expected frequencies of the first significant digit, two statistics were calculated: the Z-test and Pearson’s chi-square test. The results show that, with a confidence level of 95%, the data sets of the organizations located in Ontario and Quebec have distributions similar to the BL, suggesting that, in a preliminary analysis, their financial data are free from bias.
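
    The expected Benford frequency of leading digit d is log10(1 + 1/d). A minimal sketch of the first-digit chi-square check used in audits of this kind (the sample values are made up):

    ```python
    import math
    from collections import Counter

    def benford_chi_square(values):
        """Pearson chi-square of observed first significant digits
        against the Benford expectation P(d) = log10(1 + 1/d)."""
        digits = [int(str(abs(v)).lstrip("0.")[0]) for v in values if v != 0]
        n = len(digits)
        observed = Counter(digits)
        chi2 = 0.0
        for d in range(1, 10):
            expected = n * math.log10(1 + 1 / d)
            chi2 += (observed.get(d, 0) - expected) ** 2 / expected
        return chi2  # compare against chi-square with 8 degrees of freedom

    revenues = [1834, 1120, 2950, 1402, 3011, 1287, 9428, 1703, 2216, 4480]
    print(benford_chi_square(revenues))
    ```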

  3. The Role of Web-Based Health Information in Help-Seeking Behavior Prior to a Diagnosis of Lung Cancer: A Mixed-Methods Study.

    Science.gov (United States)

    Mueller, Julia; Jay, Caroline; Harper, Simon; Todd, Chris

    2017-06-08

    Delays to diagnosis in lung cancer can lead to reduced chance of survival, and patients often wait for several months before presenting symptoms. The time between first symptom recognition until diagnosis has been theorized into three intervals: symptom appraisal, help-seeking, and diagnostic interval (here: "pathway to diagnosis"). Interventions are needed to reduce delays to diagnosis in lung cancer. The Web has become an important lay health information source and could potentially play a role in this pathway to diagnosis. Our overall aim was to gain a preliminary insight into whether Web-based information plays a role in the pathway to diagnosis in lung cancer in order to assess whether it may be possible to leverage this information source to reduce delays to diagnosis. Patients diagnosed with lung cancer in the 6 months before study entry completed a survey about whether (and how, if yes) they had used the Web to appraise their condition prior to diagnosis. Based on survey responses, we purposively sampled patients and their next-of-kin for semistructured interviews (24 interviews; 33 participants). Interview data were analyzed qualitatively using Framework Analysis in the context of the pathway to diagnosis model. A total of 113 patients completed the survey (age: mean 67.0, SD 8.8 years). In all, 20.4% (23/113) reported they or next-of-kin had researched their condition online before the diagnosis. The majority of searches (20/23, 87.0%) were conducted by or with the help of next-of-kin. Interview results suggest that patients and next-of-kin perceived an impact of the information found online on all three intervals in the time to diagnosis. In the appraisal interval, participants used online information to evaluate symptoms and possible causes. In the help-seeking interval, the Web was used to inform the decision of whether to present to health services. In the diagnostic interval, it was used to evaluate health care professionals' advice, to support

  4. Impact of Advanced Alarm Systems and Information Displays on Human Reliability in the Digital Control Room of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Jong Hyun; Dang, Vinh N

    2011-01-01

    This paper discusses the potential impacts of two advanced features of digital control rooms, alarm systems and information display systems, on Human Reliability Analysis (HRA) in nuclear power plants. Although these features have already been implemented in new or upgraded nuclear power plants, HRAs have so far not taken much credit for them. In this context, this paper aims at examining the potential effects of these features on human performance and discussing how these effects can be addressed with existing HRA methods. A conclusion derivable from past experimental studies is that these features are supportive in severe conditions such as complex scenarios and knowledge-based work. However, in less complex scenarios and rule-based work, they may make no difference to, or sometimes even have negative impacts on, operator performance. The discussion of the impact on HRA is provided on the basis of the THERP method.

  5. The Healthcare Improvement Scotland evidence note rapid review process: providing timely, reliable evidence to inform imperative decisions on healthcare.

    Science.gov (United States)

    McIntosh, Heather M; Calvert, Julie; Macpherson, Karen J; Thompson, Lorna

    2016-06-01

    Rapid review has become widely adopted by health technology assessment agencies in response to demand for evidence-based information to support imperative decisions. Concern about the credibility of rapid reviews and the reliability of their findings has prompted a call for wider publication of their methods. In publishing this overview of the accredited rapid review process developed by Healthcare Improvement Scotland, we aim to raise awareness of our methods and advance the discourse on best practice. Healthcare Improvement Scotland produces rapid reviews called evidence notes using a process that has achieved external accreditation through the National Institute for Health and Care Excellence. Key components include a structured approach to topic selection, initial scoping, considered stakeholder involvement, streamlined systematic review, internal quality assurance, external peer review and updating. The process was introduced in 2010 and continues to be refined over time in response to user feedback and operational experience. Decision-makers value the responsiveness of the process and perceive it as being a credible source of unbiased evidence-based information supporting advice for NHSScotland. Many agencies undertaking rapid reviews are striving to balance efficiency with methodological rigour. We agree that there is a need for methodological guidance and that it should be informed by better understanding of current approaches and the consequences of different approaches to streamlining systematic review methods. Greater transparency in the reporting of rapid review methods is essential to enable that to happen.

  6. Experimentally derived δ¹³C and δ¹⁵N discrimination factors for gray wolves and the impact of prior information in Bayesian mixing models.

    Directory of Open Access Journals (Sweden)

    Jonathan J Derbridge

    Full Text Available Stable isotope analysis of diet has become a common tool in conservation research. However, the multiple sources of uncertainty inherent in this analysis framework involve consequences that have not been thoroughly addressed. Uncertainty arises from the choice of trophic discrimination factors, and for Bayesian stable isotope mixing models (SIMMs), the specification of prior information; the combined effect of these aspects has not been explicitly tested. We used a captive feeding study of gray wolves (Canis lupus) to determine the first experimentally-derived trophic discrimination factors of C and N for this large carnivore of broad conservation interest. Using the estimated diet in our controlled system and data from a published study on wild wolves and their prey in Montana, USA, we then investigated the simultaneous effect of discrimination factors and prior information on diet reconstruction with Bayesian SIMMs. Discrimination factors for gray wolves and their prey were 1.97‰ for δ¹³C and 3.04‰ for δ¹⁵N. Specifying wolf discrimination factors, as opposed to the commonly used red fox (Vulpes vulpes) factors, made little practical difference to estimates of wolf diet, but prior information had a strong effect on bias, precision, and accuracy of posterior estimates. Without specifying prior information in our Bayesian SIMM, it was not possible to produce SIMM posteriors statistically similar to the estimated diet in our controlled study or the diet of wild wolves. Our study demonstrates the critical effect of prior information on estimates of animal diets using Bayesian SIMMs, and suggests species-specific trophic discrimination factors are of secondary importance. When using stable isotope analysis to inform conservation decisions researchers should understand the limits of their data. It may be difficult to obtain useful information from SIMMs if informative priors are omitted and species-specific discrimination factors are unavailable.
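
    Full Bayesian SIMMs add priors and error structure, but the role of discrimination factors can be seen in a deterministic two-source mixing sketch (prey and wolf isotope values are made up; the discrimination factors are the ones reported above):

    ```python
    import numpy as np

    # Two-source mixing: a consumer's signature is a convex combination of
    # the sources' signatures after each source is shifted by the trophic
    # discrimination factors (d13C, d15N).
    delta = np.array([1.97, 3.04])            # wolf discrimination factors
    deer  = np.array([-25.0, 4.0]) + delta    # hypothetical prey 1
    moose = np.array([-27.5, 2.0]) + delta    # hypothetical prey 2
    wolf  = np.array([-24.0, 6.3])            # hypothetical consumer tissue

    # Solve wolf = f*deer + (1-f)*moose for f by least squares.
    A = (deer - moose).reshape(-1, 1)
    b = wolf - moose
    f = np.linalg.lstsq(A, b, rcond=None)[0][0]
    f = min(max(float(f), 0.0), 1.0)
    print(f"estimated diet fraction deer: {f:.2f}")  # ~0.62 here
    ```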

  7. Reliability of the information about the history of diagnosis and treatment of hypertension. Differences in regard to sex, age, and educational level. The pró-saúde study

    Directory of Open Access Journals (Sweden)

    Faerstein Eduardo

    2001-01-01

    Full Text Available OBJECTIVE: To assess the intraobserver reliability of the information about the history of diagnosis and treatment of hypertension. METHODS: A multidimensional health questionnaire, which was filled out by the interviewees, was applied twice with an interval of 2 weeks, in July '99, to 192 employees of the University of the State of Rio de Janeiro (UERJ), stratified by sex, age, and educational level. The intraobserver reliability of the answers provided was estimated by the kappa statistic and by the coefficient of intraclass correlation (CICC). RESULTS: The general kappa (k) statistic was 0.75 (95% CI=0.73-0.77). Reliability was higher among females (k=0.88; 95% CI=0.85-0.91) than among males (k=0.62; 95% CI=0.59-0.65). The reliability was higher among individuals 40 years of age or older (k=0.79; 95% CI=0.73-0.84) than among those from 18 to 39 years (k=0.52; 95% CI=0.45-0.57). Finally, the kappa statistic was higher among individuals with a university educational level (k=0.86; 95% CI=0.81-0.91) than among those with a high school educational level (k=0.61; 95% CI=0.53-0.70) or those with a middle school educational level (k=0.68; 95% CI=0.64-0.72). The coefficient of intraclass correlation estimated by the intraobserver agreement in regard to age at the time of the diagnosis of hypertension was 0.74. A perfect agreement between the 2 answers (k=1.00) was observed for 22 interviewees who reported prior prescription of antihypertensive medication. CONCLUSION: In the population studied, estimates of the reliability of the history of medical diagnosis of hypertension and its treatment ranged from substantial to almost perfect reliability.
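
    The kappa statistic used throughout this record corrects observed agreement for chance agreement: k = (p_o - p_e) / (1 - p_e). A minimal sketch for one twice-asked yes/no item (the answers are made up):

    ```python
    from collections import Counter

    def cohens_kappa(r1, r2):
        """Cohen's kappa for two paired ratings of the same subjects:
        (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
        p_e is the agreement expected by chance from the marginals."""
        n = len(r1)
        p_o = sum(a == b for a, b in zip(r1, r2)) / n
        c1, c2 = Counter(r1), Counter(r2)
        p_e = sum(c1[k] * c2[k] for k in c1.keys() & c2.keys()) / n**2
        return (p_o - p_e) / (1 - p_e)

    # Test-retest: the same question answered twice, two weeks apart.
    week0 = ["yes", "yes", "no", "no", "yes", "no", "no", "yes"]
    week2 = ["yes", "yes", "no", "yes", "yes", "no", "no", "no"]
    print(round(cohens_kappa(week0, week2), 2))  # 0.5
    ```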

  8. An audit of the reliability of influenza vaccination and medical information extracted from eHealth records in general practice.

    Science.gov (United States)

    Regan, Annette K; Gibbs, Robyn A; Effler, Paul V

    2018-05-31

    To evaluate the reliability of information in general practice (GP) electronic health records (EHRs), 2100 adult patients were randomly selected for interview regarding the presence of specific medical conditions and recent influenza vaccination. Agreement between self-report and data extracted from EHRs was compared using Cohen's kappa coefficient (k) and interpreted in accordance with Altman's Kappa Benchmarking criteria; 377 (18%) patients declined participation, and 608 (29%) could not be contacted. Of 1115 (53%) remaining, 856 (77%) were active patients (≥3 visits to the GP practice in the last two years) who provided complete information for analysis. Although a higher proportion of patients self-reported being vaccinated or having a medical condition compared to the EHR (50.7% vs 36.9%, and 39.4% vs 30.3%, respectively), there was "good" agreement between self-report and EHR for both vaccination status (κ = 0.67) and medical conditions (κ = 0.66). These findings suggest EHR may be useful for public health surveillance.

  9. Elapsed decision time affects the weighting of prior probability in a perceptual decision task

    Science.gov (United States)

    Hanks, Timothy D.; Mazurek, Mark E.; Kiani, Roozbeh; Hopp, Elizabeth; Shadlen, Michael N.

    2012-01-01

    Decisions are often based on a combination of new evidence with prior knowledge of the probable best choice. Optimal combination requires knowledge about the reliability of evidence, but in many realistic situations, this is unknown. Here we propose and test a novel theory: the brain exploits elapsed time during decision formation to combine sensory evidence with prior probability. Elapsed time is useful because (i) decisions that linger tend to arise from less reliable evidence, and (ii) the expected accuracy at a given decision time depends on the reliability of the evidence gathered up to that point. These regularities allow the brain to combine prior information with sensory evidence by weighting the latter in accordance with reliability. To test this theory, we manipulated the prior probability of the rewarded choice while subjects performed a reaction-time discrimination of motion direction using a range of stimulus reliabilities that varied from trial to trial. The theory explains the effect of prior probability on choice and reaction time over a wide range of stimulus strengths. We found that prior probability was incorporated into the decision process as a dynamic bias signal that increases as a function of decision time. This bias signal depends on the speed-accuracy setting of human subjects, and it is reflected in the firing rates of neurons in the lateral intraparietal cortex (LIP) of rhesus monkeys performing this task. PMID:21525274

  10. Estimating 4D-CBCT from prior information and extremely limited angle projections using structural PCA and weighted free-form deformation for lung radiotherapy.

    Science.gov (United States)

    Harris, Wendy; Zhang, You; Yin, Fang-Fang; Ren, Lei

    2017-03-01

    To investigate the feasibility of using structural-based principal component analysis (PCA) motion-modeling and weighted free-form deformation to estimate on-board 4D-CBCT using prior information and extremely limited angle projections for potential 4D target verification of lung radiotherapy. A technique for lung 4D-CBCT reconstruction has been previously developed using a deformation field map (DFM)-based strategy. In the previous method, each phase of the 4D-CBCT was generated by deforming a prior CT volume. The DFM was solved by a motion model extracted by a global PCA and free-form deformation (GMM-FD) technique, using a data fidelity constraint and deformation energy minimization. In this study, a new structural PCA method was developed to build a structural motion model (SMM) by accounting for potential relative motion pattern changes between different anatomical structures from simulation to treatment. The motion model extracted from the planning 4D-CT was divided into two structures, the tumor and the body excluding the tumor, and the parameters of both structures were optimized together. Weighted free-form deformation (WFD) was employed afterwards to introduce flexibility in adjusting the weightings of different structures in the data fidelity constraint based on clinical interests. An XCAT (computerized patient model) simulation with a 30 mm diameter lesion was performed with various anatomical and respiratory changes from the planning 4D-CT to the on-board volume to evaluate the method. The estimation accuracy was evaluated by the volume percent difference (VPD)/center-of-mass-shift (COMS) between lesions in the estimated and "ground-truth" on-board 4D-CBCT. Different on-board projection acquisition scenarios and projection noise levels were simulated to investigate their effects on the estimation accuracy. The method was also evaluated against three lung patients. The SMM-WFD method achieved substantially better accuracy than the GMM-FD method for CBCT estimation using extremely
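
    The PCA motion-model idea - representing each deformation field map (DFM) as a mean plus a few principal components, so that only the low-dimensional component scores must be solved for - can be sketched as follows (synthetic arrays, not the authors' pipeline):

    ```python
    import numpy as np

    # Each row: one respiratory phase's deformation field map, flattened.
    rng = np.random.default_rng(0)
    dfms = rng.normal(size=(10, 3000))           # synthetic training DFMs

    mean = dfms.mean(axis=0)
    # Principal components of the motion via SVD of the centered data.
    _, s, vt = np.linalg.svd(dfms - mean, full_matrices=False)
    k = 3                                        # keep a few dominant modes
    components = vt[:k]                          # shape (k, 3000)

    # Any DFM is modeled as mean + sum_i w_i * component_i; the weights w
    # are the unknowns optimized against the data fidelity constraint.
    new_dfm = dfms[0]
    w = components @ (new_dfm - mean)
    approx = mean + w @ components
    print(np.linalg.norm(new_dfm - approx) / np.linalg.norm(new_dfm))
    ```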

  11. Human reliability

    International Nuclear Information System (INIS)

    Bubb, H.

    1992-01-01

    This book resulted from the activity of Task Force 4.2 - 'Human Reliability'. This group was established on February 27th, 1986, at the plenary meeting of the Technical Reliability Committee of VDI, within the framework of the joint committee of VDI on industrial systems technology - GIS. It is composed of representatives of industry, representatives of research institutes, of technical control boards and universities, whose job it is to study how man fits into the technical side of the world of work and to optimize this interaction. In a total of 17 sessions, information from the part of ergonomy dealing with human reliability in using technical systems at work was exchanged, and different methods for its evaluation were examined and analyzed. The outcome of this work was systematized and compiled in this book. (orig.) [de

  12. Methods of Estimation the Reliability and Increasing the Informativeness of the Laboratory Results (Analysis of the Laboratory Case of Measurement the Indicators of Thyroid Function)

    OpenAIRE

    N A Kovyazina; N A Alhutova; N N Zybina; N M Kalinina

    2014-01-01

    The goal of the study was to demonstrate the multilevel laboratory quality management system and point at the methods of estimating the reliability and increasing the amount of information content of the laboratory results (on the example of the laboratory case). Results. The article examines the stages of laboratory quality management which has helped to estimate the reliability of the results of determining Free T3, Free T4 and TSH. The measurement results are presented by the expanded unce...

  13. The reliability and validity of the informant AD8 by comparison with a series of cognitive assessment tools in primary healthcare.

    Science.gov (United States)

    Shaik, Muhammad Amin; Xu, Xin; Chan, Qun Lin; Hui, Richard Jor Yeong; Chong, Steven Shih Tsze; Chen, Christopher Li-Hsian; Dong, YanHong

    2016-03-01

    The validity and reliability of the informant AD8 in primary healthcare has not been established. Therefore, the present study examined the validity and reliability of the informant AD8 in government-subsidized primary healthcare centers in Singapore. Eligible patients (≥60 years old) were recruited from primary healthcare centers and their informants received the AD8. Patient-informant dyads who agreed to further cognitive assessments received the Mini-Mental State Examination (MMSE), Montreal Cognitive Assessment (MoCA), Clinical Dementia Rating (CDR), and a locally validated formal neuropsychological battery at a research center in a tertiary hospital. 1,082 informants completed AD8 assessment at two primary healthcare centers. Of these, 309 patient-informant dyads were further assessed, of whom 243 (78.6%) were CDR = 0; 22 (7.1%) were CDR = 0.5; and 44 (14.2%) were CDR ≥ 1. The mean administration time of the informant AD8 was 2.3 ± 1.0 minutes. The informant AD8 demonstrated good internal consistency (Cronbach's α = 0.85); inter-rater reliability (Intraclass Correlation Coefficient (ICC) = 0.85); and test-retest reliability (weighted κ = 0.80). Concurrent validity, as measured by the correlation between total AD8 scores and CDR global scores (R = 0.65), and construct validity, as measured by convergent validity (R ≥ 0.4) between individual items of AD8 with CDR and neuropsychological domains, were acceptable. The informant AD8 demonstrated good concurrent and construct validity and is a reliable measure to detect cognitive dysfunction in primary healthcare.
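
    Internal consistency as reported here is Cronbach's alpha: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch (the responses are made up):

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_subjects, k_items) score matrix:
        k/(k-1) * (1 - sum of item variances / variance of row totals)."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    # Hypothetical 0/1 responses of six informants to a 4-item screen.
    scores = [[1, 1, 1, 0],
              [0, 0, 1, 0],
              [1, 1, 1, 1],
              [0, 0, 0, 0],
              [1, 0, 1, 1],
              [0, 0, 0, 1]]
    print(round(cronbach_alpha(scores), 2))
    ```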

  14. PREP KITT, System Reliability by Fault Tree Analysis. PREP, Min Path Set and Min Cut Set for Fault Tree Analysis, Monte-Carlo Method. KITT, Component and System Reliability Information from Kinetic Fault Tree Theory

    International Nuclear Information System (INIS)

    Vesely, W.E.; Narum, R.E.

    1997-01-01

    1 - Description of problem or function: The PREP/KITT computer program package obtains system reliability information from a system fault tree. The PREP program finds the minimal cut sets and/or the minimal path sets of the system fault tree. (A minimal cut set is a smallest set of components such that if all the components are simultaneously failed the system is failed. A minimal path set is a smallest set of components such that if all of the components are simultaneously functioning the system is functioning.) The KITT programs determine reliability information for the components of each minimal cut or path set, for each minimal cut or path set, and for the system. Exact, time-dependent reliability information is determined for each component and for each minimal cut set or path set. For the system, reliability results are obtained by upper bound approximations or by a bracketing procedure in which various upper and lower bounds may be obtained as close to one another as desired. The KITT programs can handle independent components which are non-repairable or which have a constant repair time. Any assortment of non-repairable components and components having constant repair times can be considered. Any inhibit conditions having constant probabilities of occurrence can be handled. The failure intensity of each component is assumed to be constant with respect to time. The KITT2 program can also handle components which during different time intervals, called phases, may have different reliability properties. 2 - Method of solution: The PREP program obtains minimal cut sets by either direct deterministic testing or by an efficient Monte Carlo algorithm. The minimal path sets are obtained using the Monte Carlo algorithm. The reliability information is obtained by the KITT programs from numerical solution of the simple integral balance equations of kinetic tree theory. 3 - Restrictions on the complexity of the problem: The PREP program will obtain the minimal cut and
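
    Once the minimal cut sets are in hand, the standard upper bound that codes of this kind report can be written directly. A minimal sketch assuming independent components (the cut sets and unavailabilities are made up):

    ```python
    from math import prod

    def system_unavailability_upper_bound(cut_sets, q):
        """Min-cut upper bound: Q_sys <= 1 - prod_j (1 - Q_j), where Q_j
        multiplies the unavailabilities of the components in minimal cut
        set j (components assumed independent)."""
        q_cut = [prod(q[c] for c in cs) for cs in cut_sets]
        return 1 - prod(1 - qj for qj in q_cut)

    # Hypothetical fault tree: system fails if A fails, or both B and C fail.
    cut_sets = [{"A"}, {"B", "C"}]
    q = {"A": 1e-3, "B": 5e-2, "C": 2e-2}
    print(system_unavailability_upper_bound(cut_sets, q))  # ~2.0e-3
    ```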

  15. Reliability assessment based on small samples of normal distribution

    International Nuclear Information System (INIS)

    Ma Zhibo; Zhu Jianshi; Xu Naixin

    2003-01-01

    When the pertinent parameter involved in a reliability definition complies with the normal distribution, the conjugate prior of its distribution parameters (μ, h) is the normal-gamma distribution. With the help of the maximum entropy and moments-equivalence principles, the subjective information about the parameter and the sampling data of its independent variables are transformed into a Bayesian prior for (μ, h). The desired estimates are obtained from either the prior or the posterior, which is formed by combining the prior and sampling data. Computing methods are described and examples are presented to give demonstrations.
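
    The conjugate normal-gamma update mentioned here has a standard closed form. A minimal sketch (the hyperparameter and data values are made up):

    ```python
    import statistics

    def normal_gamma_update(mu0, kappa0, alpha0, beta0, data):
        """Conjugate update for N(mu, 1/h) data under a normal-gamma prior
        (mu, h) ~ NG(mu0, kappa0, alpha0, beta0); returns the posterior
        hyperparameters via the standard closed-form update."""
        n = len(data)
        xbar = statistics.fmean(data)
        ss = sum((x - xbar) ** 2 for x in data)
        kappa_n = kappa0 + n
        mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
        alpha_n = alpha0 + n / 2
        beta_n = beta0 + ss / 2 + kappa0 * n * (xbar - mu0) ** 2 / (2 * kappa_n)
        return mu_n, kappa_n, alpha_n, beta_n

    # Hypothetical prior encoding expert judgement, updated with 5 samples.
    print(normal_gamma_update(10.0, 2.0, 3.0, 4.0, [9.8, 10.4, 10.1, 9.6, 10.2]))
    ```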

  16. 78 FR 41339 - Electric Reliability Organization Proposal To Retire Requirements in Reliability Standards

    Science.gov (United States)

    2013-07-10

    ...] Electric Reliability Organization Proposal To Retire Requirements in Reliability Standards AGENCY: Federal... Reliability Standards identified by the North American Electric Reliability Corporation (NERC), the Commission-certified Electric Reliability Organization. FOR FURTHER INFORMATION CONTACT: Kevin Ryan (Legal Information...

  17. The Accelerator Reliability Forum

    CERN Document Server

    Lüdeke, Andreas; Giachino, R

    2014-01-01

    A high reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with a high reliability. In order to optimize the overall reliability of an accelerator one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution describes the forum and advertises its use in the community.

  18. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

    This book is about reliability engineering. It covers the definition and importance of reliability; the development of reliability engineering; failure rate and the failure probability density function and their types; CFR and the exponential distribution; IFR, the normal distribution and the Weibull distribution; maintainability and availability; reliability testing and reliability estimation under the exponential, normal and Weibull distribution models; reliability sampling tests; system reliability; design for reliability; and functional failure analysis by FTA.

  19. Methods of Estimation the Reliability and Increasing the Informativeness of the Laboratory Results (Analysis of the Laboratory Case of Measurement the Indicators of Thyroid Function

    Directory of Open Access Journals (Sweden)

    N A Kovyazina

    2014-06-01

    Full Text Available The goal of the study was to demonstrate the multilevel laboratory quality management system and to point out methods of estimating the reliability and increasing the informativeness of laboratory results (using a laboratory case as an example). Results. The article examines the stages of laboratory quality management which have helped to estimate the reliability of the results of determining Free T3, Free T4 and TSH. The measurement results are presented with the expanded uncertainty and an evaluation of their dynamics. Conclusion. Compliance with mandatory measures of a laboratory quality management system enables laboratories to obtain reliable results and to calculate parameters that increase the informativeness of laboratory tests in clinical decision making.
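
    The record reports results "with the expanded uncertainty"; in GUM-style laboratory practice that is U = k*u_c, where u_c combines the standard-uncertainty components in quadrature and k of about 2 gives roughly 95% coverage. A minimal sketch (the component values are made up):

    ```python
    from math import sqrt

    def expanded_uncertainty(components, k=2.0):
        """GUM-style combination: u_c is the root sum of squares of the
        standard uncertainty components; U = k * u_c (k = 2 corresponds to
        roughly 95% coverage for an approximately normal result)."""
        u_c = sqrt(sum(u ** 2 for u in components))
        return k * u_c

    # Hypothetical TSH components: repeatability, calibrator, bias (mIU/L).
    print(expanded_uncertainty([0.04, 0.03, 0.02]))  # ~0.108 mIU/L
    ```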

  20. Redefining reliability

    International Nuclear Information System (INIS)

    Paulson, S.L.

    1995-01-01

    Want to buy some reliability? The question would have been unthinkable in some markets served by the natural gas business even a few years ago, but in the new gas marketplace, industrial, commercial and even some residential customers have the opportunity to choose from among an array of options about the kind of natural gas service they need--and are willing to pay for. The complexities of this brave new world of restructuring and competition have sent the industry scrambling to find ways to educate and inform its customers about the increased responsibility they will have in determining the level of gas reliability they choose. This article discusses the new options and the new responsibilities of customers, the need for continuous education, and MidAmerican Energy Company's experiment in direct marketing of natural gas

  1. Constrained noninformative priors

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-10-01

    The Jeffreys noninformative prior distribution for a single unknown parameter is the distribution corresponding to a uniform distribution in the transformed model where the unknown parameter is approximately a location parameter. To obtain a prior distribution with a specified mean but with diffusion reflecting great uncertainty, a natural generalization of the noninformative prior is the distribution corresponding to the constrained maximum entropy distribution in the transformed model. Examples are given
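
    In symbols, a sketch of the construction described above (generic notation, not the report's):

    ```latex
    % Jeffreys prior: uniform in the transformed, location-like scale \phi(\theta)
    \pi_J(\theta) \;\propto\; \sqrt{I(\theta)}
    % Constrained noninformative prior: in the same transformed scale, solve
    \max_{\pi}\; -\!\int \pi(\phi)\,\ln \pi(\phi)\,\mathrm{d}\phi
    \quad\text{subject to}\quad \int \theta(\phi)\,\pi(\phi)\,\mathrm{d}\phi = m,
    % whose maximum-entropy solution has the exponential-family form
    \pi(\phi) \;\propto\; \exp\{\lambda\,\theta(\phi)\},
    \qquad \lambda \text{ chosen so that the mean constraint holds.}
    ```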

  2. Using prior information from the medical literature in GWAS of oral cancer identifies novel susceptibility variant on chromosome 4--the AdAPT method.

    LENUS (Irish Health Repository)

    Johansson, Mattias

    2012-01-01

    Genome-wide association studies (GWAS) require large sample sizes to obtain adequate statistical power, but it may be possible to increase the power by incorporating complementary data. In this study we investigated the feasibility of automatically retrieving information from the medical literature and leveraging this information in GWAS.

  3. Application of Reliability in Breakwater Design

    DEFF Research Database (Denmark)

    Christiani, Erik

    methods to design certain types of breakwaters. Reliability analyses of the main armour and toe berm interaction is exemplified to show the effect of a multiple set of failure mechanisms. First the limit state equations of the main armour and toe interaction are derived from laboratory tests performed...... response, but in one area information has been lacking; bearing capacity has not been treated in depth in a probabilistic manner for breakwaters. Reliability analysis of conventional rubble mound breakwaters and conventional vertical breakwaters is exemplified for the purpose of establishing new ways...... by Bologna University. Thereafter a multiple system of failure for the interaction is established. Relevant stochastic parameters are characterized prior to the reliability evaluation. Application of reliability in crown wall design is illustrated by deriving relevant single foundation failure modes...

  4. Reliability data banks

    International Nuclear Information System (INIS)

    Cannon, A.G.; Bendell, A.

    1991-01-01

    Following an introductory chapter on reliability - what it is, why it is needed, and how it is achieved and measured - the principles of reliability data bases and analysis methodologies are the subject of the next two chapters. Achievements due to the development of data banks are mentioned for different industries in the next chapter. FACTS, a comprehensive information system for industrial safety and reliability data collection in process plants, is covered next. CREDO, the Central Reliability Data Organization, is described in the next chapter and is indexed separately, as is the chapter on DANTE, the fabrication reliability data analysis system. Reliability data banks at Electricite de France and the IAEA's experience in compiling a generic component reliability data base are also separately indexed. The European reliability data system, ERDS, and the development of a large data bank come next. The last three chapters look at 'Reliability data banks - friend, foe or a waste of time?' and future developments. (UK)

  5. Cue reliability and a landmark stability heuristic determine relative weighting between egocentric and allocentric visual information in memory-guided reach.

    Science.gov (United States)

    Byrne, Patrick A; Crawford, J Douglas

    2010-06-01

    It is not known how egocentric visual information (location of a target relative to the self) and allocentric visual information (location of a target relative to external landmarks) are integrated to form reach plans. Based on behavioral data from rodents and humans we hypothesized that the degree of stability in visual landmarks would influence the relative weighting. Furthermore, based on numerous cue-combination studies we hypothesized that the reach system would act like a maximum-likelihood estimator (MLE), where the reliability of both cues determines their relative weighting. To predict how these factors might interact we developed an MLE model that weighs egocentric and allocentric information based on their respective reliabilities, and also on an additional stability heuristic. We tested the predictions of this model in 10 human subjects by manipulating landmark stability and reliability (via variable amplitude vibration of the landmarks and variable amplitude gaze shifts) in three reach-to-touch tasks: an egocentric control (reaching without landmarks), an allocentric control (reaching relative to landmarks), and a cue-conflict task (involving a subtle landmark "shift" during the memory interval). Variability from all three experiments was used to derive parameters for the MLE model, which was then used to simulate egocentric-allocentric weighting in the cue-conflict experiment. As predicted by the model, landmark vibration--despite its lack of influence on pointing variability (and thus allocentric reliability) in the control experiment--had a strong influence on egocentric-allocentric weighting. A reduced model without the stability heuristic was unable to reproduce this effect. These results suggest heuristics for extrinsic cue stability are at least as important as reliability for determining cue weighting in memory-guided reaching.
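
    The MLE (inverse-variance) weighting at the core of this model, with the stability heuristic rendered loosely as an extra down-weighting factor on the allocentric cue, can be sketched as follows (the numbers are made up):

    ```python
    def mle_weighting(var_ego, var_allo, stability=1.0):
        """Inverse-variance (maximum-likelihood) cue weights. The allocentric
        reliability 1/var_allo is scaled by a stability factor in [0, 1],
        a loose stand-in for the landmark-stability heuristic: vibrating
        landmarks get stability < 1 even if pointing variability is unchanged."""
        r_ego = 1.0 / var_ego
        r_allo = stability / var_allo
        w_allo = r_allo / (r_ego + r_allo)
        return 1.0 - w_allo, w_allo

    def combine(x_ego, x_allo, w_ego, w_allo):
        return w_ego * x_ego + w_allo * x_allo

    w_ego, w_allo = mle_weighting(var_ego=4.0, var_allo=4.0, stability=0.5)
    print(round(w_allo, 2))  # 0.33: equal variances, but unstable landmarks
    ```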

  6. Development and pilot testing of an informed consent video for patients with limb trauma prior to debridement surgery using a modified Delphi technique.

    Science.gov (United States)

    Lin, Yen-Ko; Chen, Chao-Wen; Lee, Wei-Che; Lin, Tsung-Ying; Kuo, Liang-Chi; Lin, Chia-Ju; Shi, Leiyu; Tien, Yin-Chun; Cheng, Yuan-Chia

    2017-11-29

    Ensuring adequate informed consent for surgery in a trauma setting is challenging. We developed and pilot tested an educational video containing information regarding the informed consent process for surgery in trauma patients and a knowledge measure instrument and evaluated whether the audiovisual presentation improved the patients' knowledge regarding their procedure and aftercare and their satisfaction with the informed consent process. A modified Delphi technique in which a panel of experts participated in successive rounds of shared scoring of items to forecast outcomes was applied to reach a consensus among the experts. The resulting consensus was used to develop the video content and questions for measuring the understanding of the informed consent for debridement surgery in limb trauma patients. The expert panel included experienced patients. The participants in this pilot study were enrolled as a convenience sample of adult trauma patients scheduled to receive surgery. The modified Delphi technique comprised three rounds over a 4-month period. The items given higher scores by the experts in several categories were chosen for the subsequent rounds until consensus was reached. The experts reached a consensus on each item after the three-round process. The final knowledge measure comprising 10 questions was developed and validated. Thirty eligible trauma patients presenting to the Emergency Department (ED) were approached and completed the questionnaires in this pilot study. The participants exhibited significantly higher mean knowledge and satisfaction scores after watching the educational video than before watching the video. Our process is promising for developing procedure-specific informed consent and audiovisual aids in medical and surgical specialties. The educational video was developed using a scientific method that integrated the opinions of different stakeholders, particularly patients. This video is a useful tool for improving the knowledge and

  7. Reliability demonstration test planning: A three dimensional consideration

    International Nuclear Information System (INIS)

    Yadav, Om Prakash; Singh, Nanua; Goel, Parveen S.

    2006-01-01

    Increasing customer demand for reliability, fierce market competition on time-to-market and cost, and highly reliable products are making reliability testing a more challenging task. This paper presents a systematic approach for identifying critical elements (subsystems and components) of the system and deciding the types of test to be performed to demonstrate reliability. It decomposes the system into three dimensions (i.e. physical, functional and time) and identifies critical elements in the design by allocating system-level reliability to each candidate. The decomposition of system-level reliability is achieved by using a criticality index. The numerical value of the criticality index for each candidate is derived from the information available in the failure mode and effects analysis (FMEA) document or warranty data from a prior system. It makes use of this information to develop a reliability demonstration test plan for the identified (critical) failure mechanisms and physical elements. It also highlights the benefits of using prior information to locate critical spots in the design and in the subsequent development of test plans. A case example is presented to demonstrate the proposed approach.
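
    The paper's criticality index is its own construction, but the general idea of apportioning a system-level reliability target by such weights can be illustrated with a standard ARINC-style allocation for a series system with exponential lifetimes (the weights and target are made up):

    ```python
    from math import exp, log

    def apportion_reliability(r_system, weights, mission_time=1.0):
        """ARINC-style apportionment: split the allowed system failure rate
        lambda_sys = -ln(R_sys)/t among elements in proportion to their
        normalized criticality weights (series system, exponential lives)."""
        lam_sys = -log(r_system) / mission_time
        total = sum(weights.values())
        lam = {k: lam_sys * w / total for k, w in weights.items()}
        return {k: exp(-l * mission_time) for k, l in lam.items()}

    # Hypothetical criticality indices from an FMEA of three subsystems.
    targets = apportion_reliability(0.95, {"pump": 0.5, "valve": 0.3, "sensor": 0.2})
    print(targets)  # the product of the allocated reliabilities is ~0.95
    ```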

  8. Quantum steganography using prior entanglement

    International Nuclear Information System (INIS)

    Mihara, Takashi

    2015-01-01

    Steganography is the hiding of secret information within innocent-looking information (e.g., text, audio, image, video, etc.). A quantum version of steganography is a method based on quantum physics. In this paper, we propose quantum steganography by combining quantum error-correcting codes with prior entanglement. In many steganographic techniques, embedding secret messages in error-correcting codes may cause damage to them if the embedded part is corrupted. However, our proposed steganography can separately create secret messages and the content of cover messages. The intrinsic form of the cover message does not have to be modified for embedding secret messages. - Highlights: • Our steganography combines quantum error-correcting codes with prior entanglement. • Our steganography can separately create secret messages and the content of cover messages. • Errors in cover messages do not affect the recovery of secret messages. • We embed a secret message in the Steane code as an example of our steganography

  9. Quantum steganography using prior entanglement

    Energy Technology Data Exchange (ETDEWEB)

    Mihara, Takashi, E-mail: mihara@toyo.jp

    2015-06-05

    Steganography is the hiding of secret information within innocent-looking information (e.g., text, audio, image, video, etc.). A quantum version of steganography is a method based on quantum physics. In this paper, we propose quantum steganography by combining quantum error-correcting codes with prior entanglement. In many steganographic techniques, embedding secret messages in error-correcting codes may cause damage to them if the embedded part is corrupted. However, our proposed steganography can separately create secret messages and the content of cover messages. The intrinsic form of the cover message does not have to be modified for embedding secret messages. - Highlights: • Our steganography combines quantum error-correcting codes with prior entanglement. • Our steganography can separately create secret messages and the content of cover messages. • Errors in cover messages do not affect the recovery of secret messages. • We embed a secret message in the Steane code as an example of our steganography.

  10. Bayesian methods in reliability

    Science.gov (United States)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompass Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.

  11. Revisiting the online health information reliability debate in the wake of "web 2.0": an inter-disciplinary literature and website review.

    Science.gov (United States)

    Adams, Samantha A

    2010-06-01

    The purpose of this inter-disciplinary literature review was to explore renewed concerns about the reliability of online health information in light of the increasing popularity of web applications that enable more end-user-generated content ("web 2.0"). The findings are based on a literature and web review. Literature was collected at four different points between October 2006 and October 2008 and included 56 sources from 10 academic disciplines. The web review consisted of following 6 blogs (including both new and archived posts, with comments) and one wiki for a period of 1.5 months and assessing the content for relevancy on six points, for a total of 63 sources. The reliability issues that are identified with respect to "web 2.0" reiterate more general concerns expressed about the web over the last 15 years. The difference, however, lies in the scope and scale of potential problems. Social scientists have also pointed to new issues that can be especially relevant for use of web 2.0 applications in health care. Specific points of renewed concern include: disclosure of authorship and information quality, anonymity and privacy, and the ability of individuals to apply information to their personal situation. Whether or not end-users understand what social scientists call "negative network externalities" is a new concern. Finally, not all reliability issues are negative: social networking and the shift from text-based information to symbolic information, images or interactive information are considered to enhance patient education and to provide opportunities to reach diverse groups of patients. Interactive and collaborative web applications undeniably offer new opportunities for reaching patients and other health care consumers by facilitating lay information creation, sharing and retrieval. However, researchers must be careful and critical when incorporating applications or practices from other fields in health care. We must not easily dismiss concerns about

  12. Human factor reliability program

    International Nuclear Information System (INIS)

    Knoblochova, L.

    2017-01-01

    The human factor reliability program was introduced at Slovenske elektrarne, a.s. (SE) nuclear power plants as one of the components of the Initiatives of Excellent Performance in 2011. The initiative's goal was to increase the reliability of both people and facilities, in response to three major areas of improvement: the need to improve results, troubleshooting support, and supporting the achievement of the company's goals. In practice, the human factor reliability program included: tools to prevent human error; managerial observation and coaching; human factor analysis; quick information about events involving the human factor; human reliability timelines and performance indicators; and basic, periodic and extraordinary training in human factor reliability (authors)

  13. Prospective regularization design in prior-image-based reconstruction

    International Nuclear Information System (INIS)

    Dang, Hao; Siewerdsen, Jeffrey H; Stayman, J Webster

    2015-01-01

    Prior-image-based reconstruction (PIBR) methods leveraging patient-specific anatomical information from previous imaging studies and/or sequences have demonstrated dramatic improvements in dose utilization and image quality for low-fidelity data. However, a proper balance of information from the prior images and information from the measurements is required (e.g. through careful tuning of regularization parameters). Inappropriate selection of reconstruction parameters can lead to detrimental effects including false structures and failure to improve image quality. Traditional methods based on heuristics are subject to error and sub-optimal solutions, while exhaustive searches require a large number of computationally intensive image reconstructions. In this work, we propose a novel method that prospectively estimates the optimal amount of prior image information for accurate admission of specific anatomical changes in PIBR without performing full image reconstructions. This method leverages an analytical approximation to the implicitly defined PIBR estimator, and introduces a predictive performance metric leveraging this analytical form and knowledge of a particular presumed anatomical change whose accurate reconstruction is sought. Additionally, since model-based PIBR approaches tend to be space-variant, a spatially varying prior image strength map is proposed to optimally admit changes everywhere in the image (eliminating the need to know change locations a priori). Studies were conducted in both an ellipse phantom and a realistic thorax phantom emulating a lung nodule surveillance scenario. The proposed method demonstrated accurate estimation of the optimal prior image strength while achieving a substantial computational speedup (about a factor of 20) compared to traditional exhaustive search. Moreover, the use of the proposed prior strength map in PIBR demonstrated accurate reconstruction of anatomical changes without foreknowledge of change locations in
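
    PIBR objectives of this kind are commonly written as a penalized data-fidelity problem; a generic form (illustrative symbols, not the authors' exact notation) is:

    ```latex
    \hat{x} \;=\; \arg\min_{x}\;
    \underbrace{\lVert A x - y \rVert_{W}^{2}}_{\text{data fidelity}}
    \;+\; \beta_{P}\,\lVert \Psi_{P}\,(x - x_{P}) \rVert_{1}
    \;+\; \beta_{R}\,\lVert \Psi_{R}\, x \rVert_{1}
    ```

    where x_P is the prior image and beta_P is the prior image strength whose optimal, spatially varying value the work above seeks to predict prospectively.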

  14. Validity in assessment of prior learning

    DEFF Research Database (Denmark)

    Wahlgren, Bjarne; Aarkrog, Vibe

    2015-01-01

    , the article discusses the need for specific criteria for assessment. The reliability and validity of the assessment procedures depend on whether the competences are well-defined, and whether the teachers are adequately trained for the assessment procedures. Keywords: assessment, prior learning, adult...... education, vocational training, lifelong learning, validity...

  15. Arthur Prior and 'Now'

    DEFF Research Database (Denmark)

    Blackburn, Patrick Rowan; Jørgensen, Klaus Frovin

    2016-01-01

    ’s search led him through the work of Castañeda, and back to his own work on hybrid logic: the first made temporal reference philosophically respectable, the second made it technically feasible in a modal framework. With the aid of hybrid logic, Prior built a bridge from a two-dimensional UT calculus...

  16. Prior Knowledge Assessment Guide

    Science.gov (United States)

    2014-12-01

    assessment in a reasonable amount of time. Hands-on assessments can be extremely diverse in makeup and administration depending on the subject matter...

  17. Validity and reliability testing of two instruments to measure breast cancer patients' concerns and information needs relating to radiation therapy

    International Nuclear Information System (INIS)

    Halkett, Georgia KB; Kristjanson, Linda J

    2007-01-01

    It is difficult to determine the most effective approach to patient education or tailor education interventions for patients in radiotherapy without tools that assess patients' specific radiation therapy information needs and concerns. Therefore, the aim of this study was to develop psychometrically sound tools to adequately determine the concerns and information needs of cancer patients during radiation therapy. Two tools were developed to (1) determine patients' concerns about radiation therapy (RT Concerns Scale) and (2) ascertain patients' information needs at different time points during their radiation therapy (RT Information Needs Scale). Tools were based on previous research by the authors, published literature on breast cancer and radiation therapy, and information behaviour research. Thirty-one breast cancer patients completed the questionnaire on one occasion and thirty participants completed the questionnaire on a second occasion to facilitate test-retest reliability. One participant's responses were removed from the analysis. Results were analysed for content validity, internal consistency and stability over time. Both tools demonstrated high internal consistency and adequate stability over time. The nine items in the RT Concerns Scale were retained because they met all pre-set psychometric criteria. Two items were deleted from the RT Information Needs Scale because they did not meet content validity criteria and did not achieve pre-specified criteria for internal consistency. This tool now contains 22 items. This paper provides preliminary data suggesting that the two tools presented are reliable and valid and would be suitable for use in trials or in the clinical setting

  18. Validation of the translation of an instrument to measure reliability of written information on treatment choices: a study on attention deficit/hyperactivity disorder (ADHD).

    Science.gov (United States)

    Montoya, A; Llopis, N; Gilaberte, I

    2011-12-01

    DISCERN is an instrument designed to help patients assess the reliability of written information on treatment choices. The instrument was originally created in English, and no validated Spanish version existed. This study seeks to validate the Spanish translation of the DISCERN instrument used as a primary measure in a multicenter study aimed at assessing the reliability of web-based information on treatment choices for attention deficit/hyperactivity disorder (ADHD). We used a modified version of a method for validating translated instruments in which the original source-language version is formally compared with the back-translated source-language version. Each item was ranked in terms of comparability of language, similarity of interpretability, and degree of understandability. Responses used Likert scales ranging from 1 to 7, where 1 indicates the best interpretability, language and understandability, and 7 indicates the worst. Assessments were performed by 20 raters fluent in the source language. The Spanish translation of DISCERN, based on ratings of comparability, interpretability and degree of understandability (mean score (SD): 1.8 (1.1), 1.4 (0.9) and 1.6 (1.1), respectively), was considered extremely comparable. All items received a score of less than three, therefore no further revision of the translation was needed. The validation process showed that the quality of the DISCERN translation was high, validating the comparable language of the tool translated on assessing written information on treatment choices for ADHD.

  19. Reliability demonstration test planning using bayesian analysis

    International Nuclear Information System (INIS)

    Chandran, Senthil Kumar; Arul, John A.

    2003-01-01

    In Nuclear Power Plants, the reliability of all the safety systems is critical from the safety viewpoint, and it is essential that the required reliability targets be met while satisfying the design constraints. From practical experience, it is found that the reliability of complex systems such as the Safety Rod Drive Mechanism is of the order of 10⁻⁴ with an uncertainty factor of 10. To demonstrate the reliability of such systems is prohibitive in terms of cost and time, as the number of tests needed is very large. The purpose of this paper is to develop a Bayesian reliability demonstration testing procedure for exponentially distributed failure times with a gamma prior distribution on the failure rate, which can be easily and effectively used to demonstrate component/subsystem/system reliability conformance to stated requirements. The important questions addressed in this paper are: with zero failures, how long should one perform the tests, and how many components are required, to conclude with a given degree of confidence that the component under test meets the reliability requirement. The procedure is explained with an example. It can also be extended to demonstration with a larger number of failures. The approach presented is applicable for deriving test plans for demonstrating component failure rates of nuclear power plants, as failure data for similar components are becoming available from existing plants elsewhere. The advantages of this procedure are that the criterion upon which the procedure is based is simple and pertinent; that the fitting of the prior distribution is an integral part of the procedure and is based on the use of information regarding two percentiles of this distribution; and finally, that the procedure is straightforward and easy to apply in practice. (author)
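
    For the zero-failure case described, the posterior after total test time T with no failures is gamma(a, b + T), and the required T is the smallest value for which P(lambda <= lambda_req | data) reaches the desired confidence. A minimal sketch, assuming SciPy is available (the prior and requirement values are made up):

    ```python
    from scipy.stats import gamma

    def zero_failure_test_time(a, b, lam_req, confidence=0.90):
        """Exponential failure times with a gamma(a, b) prior on the failure
        rate (b in unit-hours). After T unit-hours with zero failures the
        posterior is gamma(a, b + T); return the smallest T (by bisection)
        such that P(lambda <= lam_req | data) >= confidence."""
        def post_prob(t):
            return gamma.cdf(lam_req, a, scale=1.0 / (b + t))
        lo, hi = 0.0, 1.0
        while post_prob(hi) < confidence:
            hi *= 2            # grow until the target confidence is bracketed
        for _ in range(60):    # bisection on [lo, hi]
            mid = (lo + hi) / 2
            lo, hi = (mid, hi) if post_prob(mid) < confidence else (lo, mid)
        return hi

    # Hypothetical prior (mean 1e-4/h, wide spread) and requirement.
    print(zero_failure_test_time(a=0.5, b=5000.0, lam_req=1e-4))  # ~8500 h
    ```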

  20. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  1. Reliability, Validity, and Utility of Instruments for Self-Report and Informant Report Concerning Symptoms of ADHD in Adult Patients

    Science.gov (United States)

    Kooij, J. J. Sandra; Boonstra, A. Marije; Swinkels, S. H. N.; Bekker, Evelijne M.; de Noord, Ineke; Buitelaar, Jan K.

    2008-01-01

    Objective: To study the correlation between symptoms of ADHD in adults, obtained with different methods and from different sources. Method: Information was obtained from 120 adults with ADHD, their partners, and their parents, using the ADHD Rating Scale, the Conners' Adult ADHD Rating Scales (CAARS), the Brown Attention-Deficit Disorder Scale…

  2. Reliability, validity, and utility of instruments for self-report and informant report concerning symptoms of ADHD in adult patients.

    NARCIS (Netherlands)

    Kooij, J.J.S.; Boonstra, A.M.; Swinkels, S.H.N.; Bekker, E.M.; Noord, I. de; Buitelaar, J.K.

    2008-01-01

    OBJECTIVE: To study the correlation between symptoms of ADHD in adults, obtained with different methods and from different sources. METHOD: Information was obtained from 120 adults with ADHD, their partners, and their parents, using the ADHD Rating Scale, the Conners' Adult ADHD Rating Scales

  3. Reliability of neural encoding

    DEFF Research Database (Denmark)

    Alstrøm, Preben; Beierholm, Ulrik; Nielsen, Carsten Dahl

    2002-01-01

    The reliability with which a neuron is able to create the same firing pattern when presented with the same stimulus is of critical importance to the understanding of neuronal information processing. We show that reliability is closely related to the process of phaselocking. Experimental results f...

  4. A reliable user authentication and key agreement scheme for Web-based Hospital-acquired Infection Surveillance Information System.

    Science.gov (United States)

    Wu, Zhen-Yu; Tseng, Yi-Ju; Chung, Yufang; Chen, Yee-Chun; Lai, Feipei

    2012-08-01

    With the rapid development of the Internet, both digitization and electronic orientation are required in various applications of daily life. For hospital-acquired infection control, a Web-based Hospital-acquired Infection Surveillance System was implemented. Clinical data from different hospitals and systems were collected and analyzed. The hospital-acquired infection screening rules in this system utilized this information to detect different patterns of defined hospital-acquired infections. Moreover, these data were integrated into a user interface with a single entry point to assist physicians and healthcare providers in making decisions. Based on Service-Oriented Architecture, web-service techniques, which are suitable for integrating heterogeneous platforms, protocols, and applications, were used. In summary, this system simplifies the workflow of hospital infection control and improves healthcare quality. However, it is possible for attackers to intercept the data transmission process or gain access to the user interface. To tackle illegal access and prevent information from being stolen during transmission over the insecure Internet, a password-based user authentication scheme is proposed to protect information integrity.
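
    The record does not spell out the paper's concrete scheme; as a generic illustration of the password-based idea, here is a salted slow-hash verification using only Python's standard library:

    ```python
    import hashlib, hmac, os

    def make_record(password: str):
        """Store a salted, slow hash of the password, never the password."""
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
        return salt, digest

    def verify(password: str, salt: bytes, digest: bytes) -> bool:
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
        return hmac.compare_digest(candidate, digest)  # constant-time compare

    salt, digest = make_record("correct horse battery staple")
    print(verify("correct horse battery staple", salt, digest))  # True
    print(verify("guess", salt, digest))                          # False
    ```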

  5. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. For this, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment, the nature of human error, classification of errors in man-machine systems, practical aspects, human reliability modelling in complex situations, quantification and examination of human reliability, judgement based approaches, holistic techniques and decision analytic approaches. (UK)

  6. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety...... and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic...... approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very...

  7. Reliability and safety engineering

    CERN Document Server

    Verma, Ajit Kumar; Karanki, Durga Rao

    2016-01-01

    Reliability and safety are core issues that must be addressed throughout the life cycle of engineering systems. Reliability and Safety Engineering presents an overview of the basic concepts, together with simple and practical illustrations. The authors present reliability terminology in various engineering fields, viz., electronics engineering, software engineering, mechanical engineering, structural engineering and power systems engineering. The book describes the latest applications in the area of probabilistic safety assessment, such as technical specification optimization, risk monitoring and risk informed in-service inspection. Reliability and safety studies must, inevitably, deal with uncertainty, so the book includes uncertainty propagation methods: Monte Carlo simulation, fuzzy arithmetic, Dempster-Shafer theory and probability bounds. Reliability and Safety Engineering also highlights advances in system reliability and safety assessment including dynamic system modeling and uncertainty management. Cas...

  8. Sets of priors reflecting prior-data conflict and agreement

    NARCIS (Netherlands)

    Walter, G.M.; Coolen, F.P.A.; Carvalho, J.P.; Lesot, M.-J.; Kaymak, U.; Vieira, S.; Bouchon-Meunier, B.; Yager, R.R.

    2016-01-01

    Bayesian inference enables combination of observations with prior knowledge in the reasoning process. The choice of a particular prior distribution to represent the available prior knowledge is, however, often debatable, especially when prior knowledge is limited or data are scarce, as then

  9. The minimum information required for a glycomics experiment (MIRAGE) project: sample preparation guidelines for reliable reporting of glycomics datasets.

    Science.gov (United States)

    Struwe, Weston B; Agravat, Sanjay; Aoki-Kinoshita, Kiyoko F; Campbell, Matthew P; Costello, Catherine E; Dell, Anne; Feizi, Ten; Haslam, Stuart M; Karlsson, Niclas G; Khoo, Kay-Hooi; Kolarich, Daniel; Liu, Yan; McBride, Ryan; Novotny, Milos V; Packer, Nicolle H; Paulson, James C; Rapp, Erdmann; Ranzinger, Rene; Rudd, Pauline M; Smith, David F; Tiemeyer, Michael; Wells, Lance; York, William S; Zaia, Joseph; Kettner, Carsten

    2016-09-01

    The minimum information required for a glycomics experiment (MIRAGE) project was established in 2011 to provide guidelines to aid in data reporting from all types of experiments in glycomics research including mass spectrometry (MS), liquid chromatography, glycan arrays, data handling and sample preparation. MIRAGE is a concerted effort of the wider glycomics community that considers the adaptation of reporting guidelines as an important step towards critical evaluation and dissemination of datasets as well as broadening of experimental techniques worldwide. The MIRAGE Commission published reporting guidelines for MS data and here we outline guidelines for sample preparation. The sample preparation guidelines include all aspects of sample generation, purification and modification from biological and/or synthetic carbohydrate material. The application of MIRAGE sample preparation guidelines will lead to improved recording of experimental protocols and reporting of understandable and reproducible glycomics datasets.

  10. Cognitive Behavioral Therapy for Psychosis (CBT-p) Delivered in a Community Mental Health Setting: A Case Comparison of Clients Receiving CBT Informed Strategies by Case Managers Prior to Therapy.

    Science.gov (United States)

    Sivec, Harry J; Montesano, Vicki L; Skubby, David; Knepp, Kristen A; Munetz, Mark R

    2017-02-01

    This exploratory case comparison examines the influence of case management activities on engagement and progress in psychotherapy for clients with schizophrenia. Six clients were recruited to participate in ten sessions of Cognitive Behavioral Therapy for psychosis (CBT-p). Three clients who had received Cognitive Behavioral techniques for psychosis (CBt-p, a low-intensity case management intervention) prior to receiving therapy were selected from referrals. A comparison group of three clients who had received standard case management services was selected from referrals. Cases within and across groups were compared on outcome measures, and observations from case review were offered to inform future research. Delivering CBT-p services on a continuum from low- to high-intensity is discussed.

  11. Prior indigenous technological species

    Science.gov (United States)

    Wright, Jason T.

    2018-01-01

    One of the primary open questions of astrobiology is whether there is extant or extinct life elsewhere in the solar system. Implicit in much of this work is that we are looking for microbial or, at best, unintelligent life, even though technological artefacts might be much easier to find. Search for Extraterrestrial Intelligence (SETI) work on searches for alien artefacts in the solar system typically presumes that such artefacts would be of extrasolar origin, even though life is known to have existed in the solar system, on Earth, for eons. But if a prior technological, perhaps spacefaring, species ever arose in the solar system, it might have produced artefacts or other technosignatures that have survived to the present day, meaning solar system artefact SETI provides a potential path to resolving astrobiology's question. Here, I discuss the origins and possible locations for technosignatures of such a prior indigenous technological species, which might have arisen on ancient Earth or another body, such as a pre-greenhouse Venus or a wet Mars. In the case of Venus, the arrival of its global greenhouse and potential resurfacing might have erased all evidence of its existence on the Venusian surface. In the case of Earth, erosion and, ultimately, plate tectonics may have erased most such evidence if the species lived Gyr ago. Remaining indigenous technosignatures might be expected to be extremely old, limiting the places they might still be found to beneath the surfaces of Mars and the Moon, or in the outer solar system.

  12. Population distribution of flexible molecules from maximum entropy analysis using different priors as background information: application to the Φ, Ψ-conformational space of the α-(1→2)-linked mannose disaccharide present in N- and O-linked glycoproteins.

    Science.gov (United States)

    Säwén, Elin; Massad, Tariq; Landersjö, Clas; Damberg, Peter; Widmalm, Göran

    2010-08-21

    The conformational space available to the flexible molecule α-D-Manp-(1→2)-α-D-Manp-OMe, a model for the α-(1→2)-linked mannose disaccharide in N- or O-linked glycoproteins, is determined using experimental data and molecular simulation combined with a maximum entropy approach that leads to a converged population distribution utilizing different input information. A database survey of the Protein Data Bank, in which structures having the constituent disaccharide were retrieved, resulted in an ensemble of >200 structures. Subsequent filtering removed erroneous structures and gave the database (DB) ensemble, comprising three classes of mannose-containing compounds, viz., N- and O-linked structures, and ligands to proteins. A molecular dynamics (MD) simulation of the disaccharide revealed a two-state equilibrium with a major and a minor conformational state, i.e., the MD ensemble. These two conformational ensembles of the disaccharide were compared to experimental spectroscopic data measured for the molecule in water solution. However, neither of the two populations was compatible with experimental data from optical rotation, NMR ¹H,¹H cross-relaxation rates, or homo- and heteronuclear ³J couplings. The conformational distributions were subsequently used as background information to generate priors that were used in a maximum entropy analysis. The resulting posteriors, i.e., the population distributions after the application of the maximum entropy analysis, still showed notable deviations that were not anticipated based on the prior information. Therefore, reparameterization of homo- and heteronuclear Karplus relationships for the glycosidic torsion angles Φ and Ψ was carried out, in which the importance of electronegative substituents on the coupling pathway was deemed essential, resulting in four derived equations: two ³J(COCC) and two ³J(COCH), being different for the Φ and Ψ torsions, respectively. These Karplus relationships are denoted

  13. Heuristics as Bayesian inference under extreme priors.

    Science.gov (United States)

    Parpart, Paula; Jones, Matt; Love, Bradley C

    2018-05-01

    Simple heuristics are often regarded as tractable decision strategies because they ignore a great deal of information in the input data. One puzzle is why heuristics can outperform full-information models, such as linear regression, which make full use of the available information. These "less-is-more" effects, in which a relatively simpler model outperforms a more complex model, are prevalent throughout cognitive science, and are frequently argued to demonstrate an inherent advantage of simplifying computation or ignoring information. In contrast, we show at the computational level (where algorithmic restrictions are set aside) that it is never optimal to discard information. Through a formal Bayesian analysis, we prove that popular heuristics, such as tallying and take-the-best, are formally equivalent to Bayesian inference under the limit of infinitely strong priors. Varying the strength of the prior yields a continuum of Bayesian models with the heuristics at one end and ordinary regression at the other. Critically, intermediate models perform better across all our simulations, suggesting that down-weighting information with the appropriate prior is preferable to entirely ignoring it. Rather than because of their simplicity, our analyses suggest heuristics perform well because they implement strong priors that approximate the actual structure of the environment. We end by considering how new heuristics could be derived by infinitely strengthening the priors of other Bayesian models. These formal results have implications for work in psychology, machine learning and economics.
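
    The continuum this record describes can be illustrated with a short numpy sketch (not the authors' code; data and names are invented). The estimator below shrinks regression weights toward an equal-weights vector w0: at lam = 0 it is ordinary least squares, and as lam grows it collapses to w0, i.e., a tallying-style heuristic.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 4
X = rng.normal(size=(n, d))
true_w = np.array([0.9, 0.6, 0.3, 0.1])   # graded cue validities
y = X @ true_w + 0.5 * rng.normal(size=n)

w0 = np.ones(d)  # equal-weights ("tallying") prior mean

def shrunk_ols(X, y, lam, w0):
    """argmin_w ||y - Xw||^2 + lam * ||w - w0||^2 (ridge toward w0)."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]),
                           X.T @ y + lam * w0)

for lam in [0.0, 1.0, 10.0, 1e6]:
    print(f"lam={lam:>9}: w = {np.round(shrunk_ols(X, y, lam, w0), 3)}")
# lam = 0     -> ordinary least squares (the full-information model)
# lam -> inf  -> weights collapse to w0, the tallying heuristic
```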

  14. Possibility of obtaining reliable information on component safety by means of large-scale tensile samples with Orowan-Soete flaws

    International Nuclear Information System (INIS)

    Aurich, D.; Wobst, K.; Kafka, H.

    1984-01-01

    The aim of the paper is to review present knowledge regarding the ability of wide-plate tensile specimens with saw-cut through-center flaws to provide accurate information on component reliability; it points out the advantages and disadvantages of this specimen geometry. The effects of temperature, specimen geometry, ligament size and notch radii are discussed in comparison with other specimen geometries. This is followed by a comparison of the results of such tests with tests on internally pressurized tanks. Conclusions: wide-plate tensile specimens are generally appropriate for assessing welded joints. However, compared with the results obtained with three-point bending samples, they lead to a more favourable evaluation of low-toughness steels with respect to crack growth than of high-toughness and soft steels under stresses with incipient cracks. (orig.) [de]

  15. [Parenting stress and the reliability of parental information in the diagnostics of children and adolescents with symptoms of psychiatric and behavioral disorders].

    Science.gov (United States)

    Irlbauer-Müller, Viktoria; Eichler, Anna; Stemmler, Mark; Moll, Gunther H; Kratz, Oliver

    2017-07-01

    Information from parents is regularly used in the diagnostic process for children and adolescents with psychiatric symptoms. But the reliability of this information is debatable, because the parents' own stress can distort their perceptions of the child's symptoms. For each of N = 68 children and adolescents (11–18 years) who were using mental health services for the first time, we evaluated the ratings of a parent and a professional clinician (internalizing symptoms, externalizing symptoms, total-problem score). In addition, parenting stress was scored on the Eltern-Belastungs-Inventar (EBI; Tröster, 2011), which measures child-related stress and parent-related stress as well as total stress. The ratings of highly stressed parents differed more from the clinicians' ratings than did the ratings of less stressed parents. Additionally, correlations showed that higher parenting stress was associated with larger differences between the parent's and the clinician's assessments. Multiple regressions demonstrated the predictive value of child-related parenting stress for these differences. These results apply to internalizing symptoms, externalizing symptoms, and the total-problem score. Parenting stress should be evaluated systematically in order to carefully assess the value of the information from parents and to determine how it should be included in diagnostic and therapeutic decisions.

  16. Generalized Bayesian inference with sets of conjugate priors for dealing with prior-data conflict : course at Lund University

    NARCIS (Netherlands)

    Walter, G.

    2015-01-01

    In the Bayesian approach to statistical inference, possibly subjective knowledge on model parameters can be expressed by so-called prior distributions. A prior distribution is updated, via Bayes’ Rule, to the so-called posterior distribution, which combines prior information and information from

  17. PET-MR image fusion in soft tissue sarcoma: accuracy, reliability and practicality of interactive point-based and automated mutual information techniques

    International Nuclear Information System (INIS)

    Somer, Edward J.R.; Marsden, Paul K.; Benatar, Nigel A.; O'Doherty, Michael J.; Goodey, Joanne; Smith, Michael A.

    2003-01-01

    The fusion of functional positron emission tomography (PET) data with anatomical magnetic resonance (MR) or computed tomography images, using a variety of interactive and automated techniques, is becoming commonplace, with the technique of choice dependent on the specific application. The case of PET-MR image fusion in soft tissue is complicated by a lack of conspicuous anatomical features and deviation from the rigid-body model. Here we compare a point-based external marker technique with an automated mutual information algorithm and discuss the practicality, reliability and accuracy of each when applied to the study of soft tissue sarcoma. Ten subjects with suspected sarcoma in the knee, thigh, groin, flank or back underwent MR and PET scanning after the attachment of nine external fiducial markers. In the assessment of the point-based technique, three error measures were considered: fiducial localisation error (FLE), fiducial registration error (FRE) and target registration error (TRE). FLE, which represents the accuracy with which the fiducial points can be located, is related to the FRE minimised by the registration algorithm. The registration accuracy is best characterised by the TRE, which is the distance between corresponding points in each image space after registration. In the absence of salient features within the target volume, the TRE can be measured at fiducials excluded from the registration process. To assess the mutual information technique, PET data, acquired after physically removing the markers, were reconstructed in a variety of ways and registered with MR. Having applied the transform suggested by the algorithm to the PET scan acquired before the markers were removed, the residual distance between PET and MR marker-pairs could be measured. The manual point-based technique yielded the best results (RMS TRE = 8.3 mm, max = 22.4 mm, min = 1.7 mm), performing better than the automated algorithm (RMS TRE = 20.0 mm, max = 30.5 mm, min = 7.7 mm) when

  18. Reliability calculations

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1986-03-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very complex systems. In order to increase the applicability of the programs variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for implementation of importance sampling are suggested. (author)

  19. Systems reliability/structural reliability

    International Nuclear Information System (INIS)

    Green, A.E.

    1980-01-01

    The question of reliability technology using quantified techniques is considered for systems and structures. Systems reliability analysis has progressed to a viable and proven methodology whereas this has yet to be fully achieved for large scale structures. Structural loading variants over the half-time of the plant are considered to be more difficult to analyse than for systems, even though a relatively crude model may be a necessary starting point. Various reliability characteristics and environmental conditions are considered which enter this problem. The rare event situation is briefly mentioned together with aspects of proof testing and normal and upset loading conditions. (orig.)

  20. Finding Reliable Health Information Online

    Science.gov (United States)

    ... online resources. Online Medical Dictionaries and Encyclopedias Complex medical terminology can be difficult to understand. The Web provides some excellent resources to aid in our understanding of these terms and medical jargon. NHGRI Talking Glossary of Genetic Terms www. ...

  1. 2017 NREL Photovoltaic Reliability Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-15

    NREL's Photovoltaic (PV) Reliability Workshop (PVRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology -- both critical goals for moving PV technologies deeper into the electricity marketplace.

  2. Reliability data book

    International Nuclear Information System (INIS)

    Bento, J.P.; Boerje, S.; Ericsson, G.; Hasler, A.; Lyden, C.O.; Wallin, L.; Poern, K.; Aakerlund, O.

    1985-01-01

    The main objective of the report is to improve failure data for reliability calculations as part of safety analyses for Swedish nuclear power plants. The work is based primarily on evaluations of failure reports as well as information provided by the operation and maintenance staff of each plant. The report presents charts of reliability data for: pumps, valves, control rods/rod drives, electrical components, and instruments. (L.E.)

  3. Invasive urodynamic testing prior to surgical treatment for stress urinary incontinence in women: cost-effectiveness and value of information analyses in the context of a mixed methods feasibility study.

    Science.gov (United States)

    Homer, Tara; Shen, Jing; Vale, Luke; McColl, Elaine; Tincello, Douglas G; Hilton, Paul

    2018-01-01

    INVESTIGATE-I (INVasive Evaluation before Surgical Treatment of Incontinence Gives Added Therapeutic Effect?) was a mixed methods study to assess the feasibility of a future randomised controlled trial of invasive urodynamic testing (IUT) prior to surgery for stress urinary incontinence (SUI) in women. Here we report one of the study's five components, with the specific objectives of (i) exploring the cost-effectiveness of IUT compared with clinical assessment plus non-invasive tests (henceforth described as 'IUT' and 'no IUT' respectively) in women with SUI or stress-predominant mixed urinary incontinence (MUI) prior to surgery, and (ii) determining the expected net gain (ENG) from additional research. Study participants were women with SUI or stress-predominant MUI who had failed to respond to conservative treatments, recruited from seven UK urogynaecology and female urology units. They were randomised to receive either 'IUT' or 'no IUT' before undergoing further treatment. Data from 218 women were used in the economic analysis. Cost utility, net benefit and value of information (VoI) analyses were performed within a randomised controlled pilot trial. Costs and quality-adjusted life years (QALYs) were estimated over 6 months to determine the incremental cost per QALY of 'IUT' compared to 'no IUT'. Net monetary benefit informed the VoI analysis. The VoI estimated the ENG and optimal sample size for a future definitive trial. At 6 months, the mean difference in total average cost was £138 (p = 0.071) in favour of 'IUT'; there was no difference in QALYs estimated from the SF-12 (difference 0.004; p = 0.425) and EQ-5D-3L (difference −0.004; p = 0.725); therefore, the probability of IUT being cost-effective remains uncertain. The estimated ENG was positive for further research to address this uncertainty with an optimal sample size of 404 women. This is the largest economic evaluation of IUT. On average, up to 6 months after treatment, 'IUT' may

  4. External Prior Guided Internal Prior Learning for Real-World Noisy Image Denoising

    Science.gov (United States)

    Xu, Jun; Zhang, Lei; Zhang, David

    2018-06-01

    Most existing image denoising methods learn image priors from either external data or the noisy image itself to remove noise. However, priors learned from external data may not be adaptive to the image to be denoised, while priors learned from the given noisy image may not be accurate due to the interference of corrupted noise. Meanwhile, the noise in real-world noisy images is very complex and hard to describe by simple distributions such as the Gaussian, making real noisy image denoising a very challenging problem. We propose to exploit the information in both external data and the given noisy image, and develop an external prior guided internal prior learning method for real noisy image denoising. We first learn external priors from an independent set of clean natural images. With the aid of the learned external priors, we then learn internal priors from the given noisy image to refine the prior model. The external and internal priors are formulated as a set of orthogonal dictionaries to efficiently reconstruct the desired image. Extensive experiments are performed on several real noisy image datasets. The proposed method demonstrates highly competitive denoising performance, outperforming state-of-the-art denoising methods including those designed for real noisy images.
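
    As a rough illustration of the external-prior half of such a scheme (the internal refinement and the coupled orthogonal dictionaries of the paper are omitted), the sketch below learns a PCA patch dictionary from a clean image and denoises a noisy copy by projecting its patches onto the leading atoms. All names, sizes and data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

def patches(img, p=8):
    """All non-overlapping p x p patches, flattened to rows."""
    h, w = img.shape
    return np.array([img[i:i+p, j:j+p].ravel()
                     for i in range(0, h - p + 1, p)
                     for j in range(0, w - p + 1, p)])

# "External prior": an orthogonal (PCA) dictionary learned from clean data.
clean = np.sin(np.linspace(0, 20, 128))[:, None] * np.cos(np.linspace(0, 20, 128))
Pc = patches(clean)
Pc -= Pc.mean(axis=1, keepdims=True)
_, _, Vt = np.linalg.svd(Pc, full_matrices=False)
dictionary = Vt[:8]                      # leading principal patch atoms

# Denoise a noisy copy by projecting its patches onto the external dictionary.
noisy = clean + 0.3 * rng.normal(size=clean.shape)
Pn = patches(noisy)
means = Pn.mean(axis=1, keepdims=True)
rec = (Pn - means) @ dictionary.T @ dictionary + means

print("patch MSE, noisy   :", round(float(((Pn - patches(clean)) ** 2).mean()), 4))
print("patch MSE, denoised:", round(float(((rec - patches(clean)) ** 2).mean()), 4))
```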

  5. Prior Elicitation, Assessment and Inference with a Dirichlet Prior

    Directory of Open Access Journals (Sweden)

    Michael Evans

    2017-10-01

    Methods are developed for eliciting a Dirichlet prior based upon stating bounds on the individual probabilities that hold with high prior probability. This approach to selecting a prior is applied to a contingency table problem where it is demonstrated how to assess the prior with respect to the bias it induces as well as how to check for prior-data conflict. It is shown that the assessment of a hypothesis via relative belief can easily take into account what it means for the falsity of the hypothesis to correspond to a difference of practical importance and provide evidence in favor of a hypothesis.
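
    To make the elicitation idea concrete, here is a hedged sketch (not the authors' algorithm): the Dirichlet mean is set from the interval midpoints, and the concentration is increased until each probability lies within its stated bounds with the desired marginal prior probability, using the fact that the marginals of a Dirichlet are Beta distributions. The bounds and the 0.95 level are made-up examples.

```python
import numpy as np
from scipy.stats import beta

# Hypothetical elicited bounds: each true probability p_i should lie in
# [lo_i, hi_i] with high prior probability (0.95 assumed here).
lo = np.array([0.2, 0.1, 0.3])
hi = np.array([0.5, 0.4, 0.6])
target = 0.95

mean = ((lo + hi) / 2) / ((lo + hi) / 2).sum()   # normalized interval midpoints

def coverage(s):
    """Smallest marginal probability that p_i falls inside its bounds for
    Dirichlet(s * mean); the i-th marginal is Beta(s*m_i, s*(1 - m_i))."""
    return min(beta.cdf(h, s * m, s * (1 - m)) - beta.cdf(l, s * m, s * (1 - m))
               for l, h, m in zip(lo, hi, mean))

# Grow the concentration s until every bound holds with probability >= target.
s = 1.0
while coverage(s) < target:
    s *= 1.1
print("alpha =", np.round(s * mean, 2), " min coverage =", round(coverage(s), 3))
```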

  6. Microelectronics Reliability

    Science.gov (United States)

    2017-01-17

    [Only fragments of this report are recoverable: a figure of inverters connected in a chain, and a typical graph showing frequency versus the square root of… The surviving text describes developing an experimental reliability-estimating methodology that could illuminate the lifetime reliability of advanced devices and circuits, yielding an accurate estimate of the device lifetime and thus of the reliability (FIT) of the device.]

  7. Adaptive local thresholding for robust nucleus segmentation utilizing shape priors

    Science.gov (United States)

    Wang, Xiuzhong; Srinivas, Chukka

    2016-03-01

    This paper describes a novel local thresholding method for foreground detection. First, a Canny edge detection method is used for initial edge detection. Then, tensor voting is applied on the initial edge pixels, using a nonsymmetric tensor field tailored to encode prior information about nucleus size, shape, and intensity spatial distribution. Tensor analysis is then performed to generate the saliency image and, based on that, the refined edge. Next, the image domain is divided into blocks. In each block, at least one foreground and one background pixel are sampled for each refined edge pixel. The saliency weighted foreground histogram and background histogram are then created. These two histograms are used to calculate a threshold by minimizing the background and foreground pixel classification error. The block-wise thresholds are then used to generate the threshold for each pixel via interpolation. Finally, the foreground is obtained by comparing the original image with the threshold image. The effective use of prior information, combined with robust techniques, results in far more reliable foreground detection, which leads to robust nucleus segmentation.
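
    A stripped-down sketch of the block-wise stage is given below. It omits the Canny/tensor-voting steps and the saliency-weighted histograms, substituting Otsu's criterion per block, but it shows the pattern the abstract describes: compute a threshold per block, then interpolate the coarse grid into a per-pixel threshold map. Block counts, data and function names are illustrative.

```python
import numpy as np
from scipy.ndimage import zoom
from skimage.filters import threshold_otsu

def local_threshold(img, n_blocks=2):
    """Block-wise thresholds (Otsu as a stand-in for the saliency-weighted
    histogram criterion), interpolated into a per-pixel threshold map."""
    h, w = img.shape
    bh, bw = h // n_blocks, w // n_blocks
    t = np.empty((n_blocks, n_blocks))
    for i in range(n_blocks):
        for j in range(n_blocks):
            t[i, j] = threshold_otsu(img[i*bh:(i+1)*bh, j*bw:(j+1)*bw])
    t_map = zoom(t, (h / n_blocks, w / n_blocks), order=1)  # bilinear upsample
    return img > t_map                                      # foreground mask

# Synthetic test: bright "nuclei" on a background with an intensity gradient,
# one per block so each block histogram is roughly bimodal.
rng = np.random.default_rng(1)
img = np.linspace(0.0, 0.4, 256)[None, :] + 0.05 * rng.normal(size=(256, 256))
for (r, c, s) in [(30, 30, 20), (40, 170, 25), (180, 60, 22), (190, 160, 28)]:
    img[r:r+s, c:c+s] += 0.8
mask = local_threshold(img)
print("foreground fraction:", round(float(mask.mean()), 4))
```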

  8. WE-EF-BRD-03: I Want It Now!: Advances in MRI Acquisition, Reconstruction and the Use of Priors to Enable Fast Anatomic and Physiologic Imaging to Inform Guidance and Adaptation Decisions

    Energy Technology Data Exchange (ETDEWEB)

    Hu, Y. [Mayo Clinic Arizona (United States)

    2015-06-15

    MRI-guided treatment is a growing area of medicine, particularly in radiotherapy and surgery. The exquisite soft tissue anatomic contrast offered by MRI, along with functional imaging, makes the use of MRI during therapeutic procedures very attractive. Challenging the utility of MRI in the therapy room are many issues including the physics of MRI and its impact on the environment and therapeutic instruments, the impact of the room and instruments on the MRI; safety, space, design and cost. In this session, the applications and challenges of MRI-guided treatment will be described. The session format is: "Past, present and future: MRI-guided radiotherapy from 2005 to 2025" (Jan Lagendijk); "Battling Maxwell's equations: Physics challenges and solutions for hybrid MRI systems" (Paul Keall); "I want it now!: Advances in MRI acquisition, reconstruction and the use of priors to enable fast anatomic and physiologic imaging to inform guidance and adaptation decisions" (Yanle Hu); "MR in the OR: The growth and applications of MRI for interventional radiology and surgery" (Rebecca Fahrig). Learning objectives: to understand the history and trajectory of MRI-guided radiotherapy; to understand the challenges of integrating MR imaging systems with linear accelerators; to understand the latest in fast MRI methods to enable the visualisation of anatomy and physiology on radiotherapy treatment timescales; to understand the growing role and challenges of MRI for image-guided surgical procedures. My disclosures are publicly available and updated at: http://sydney.edu.au/medicine/radiation-physics/about-us/disclosures.php.

  9. SU-G-JeP3-04: Estimating 4D CBCT from Prior Information and Extremely Limited Angle Projections Using Structural PCA and Weighted Free-Form Deformation

    International Nuclear Information System (INIS)

    Harris, W; Yin, F; Zhang, Y; Ren, L

    2016-01-01

    Purpose: To investigate the feasibility of using structure-based principal component analysis (PCA) motion-modeling and weighted free-form deformation to estimate on-board 4D-CBCT using prior information and extremely limited angle projections for potential 4D target verification of lung radiotherapy. Methods: A technique for lung 4D-CBCT reconstruction has been previously developed using a deformation field map (DFM)-based strategy. In the previous method, each phase of the 4D-CBCT was generated by deforming a prior CT volume. The DFM was solved by a motion-model extracted by global PCA and a free-form deformation (GMM-FD) technique, using data fidelity constraint and the deformation energy minimization. In this study, a new structural-PCA method was developed to build a structural motion-model (SMM) by accounting for potential relative motion pattern changes between different anatomical structures from simulation to treatment. The motion model extracted from planning 4DCT was divided into two structures: tumor and body excluding tumor, and the parameters of both structures were optimized together. Weighted free-form deformation (WFD) was employed afterwards to introduce flexibility in adjusting the weightings of different structures in the data fidelity constraint based on clinical interests. XCAT (computerized patient model) simulation with a 30 mm diameter lesion was simulated with various anatomical and respirational changes from planning 4D-CT to onboard volume. The estimation accuracy was evaluated by the Volume-Percent-Difference (VPD)/Center-of-Mass-Shift (COMS) between lesions in the estimated and “ground-truth” on board 4D-CBCT. Results: Among 6 different XCAT scenarios corresponding to respirational and anatomical changes from planning CT to on-board using single 30° on-board projections, the VPD/COMS for SMM-WFD was reduced to 10.64±3.04%/1.20±0.45mm from 21.72±9.24%/1.80±0.53mm for GMM-FD. Using 15° orthogonal projections, the VPD/COMS was

  10. SU-G-JeP3-04: Estimating 4D CBCT from Prior Information and Extremely Limited Angle Projections Using Structural PCA and Weighted Free-Form Deformation

    Energy Technology Data Exchange (ETDEWEB)

    Harris, W; Yin, F; Zhang, Y; Ren, L [Duke University Medical Center, Durham, NC (United States)

    2016-06-15

    Purpose: To investigate the feasibility of using structure-based principal component analysis (PCA) motion-modeling and weighted free-form deformation to estimate on-board 4D-CBCT using prior information and extremely limited angle projections for potential 4D target verification of lung radiotherapy. Methods: A technique for lung 4D-CBCT reconstruction has been previously developed using a deformation field map (DFM)-based strategy. In the previous method, each phase of the 4D-CBCT was generated by deforming a prior CT volume. The DFM was solved by a motion-model extracted by global PCA and a free-form deformation (GMM-FD) technique, using data fidelity constraint and the deformation energy minimization. In this study, a new structural-PCA method was developed to build a structural motion-model (SMM) by accounting for potential relative motion pattern changes between different anatomical structures from simulation to treatment. The motion model extracted from planning 4DCT was divided into two structures: tumor and body excluding tumor, and the parameters of both structures were optimized together. Weighted free-form deformation (WFD) was employed afterwards to introduce flexibility in adjusting the weightings of different structures in the data fidelity constraint based on clinical interests. XCAT (computerized patient model) simulation with a 30 mm diameter lesion was simulated with various anatomical and respirational changes from planning 4D-CT to onboard volume. The estimation accuracy was evaluated by the Volume-Percent-Difference (VPD)/Center-of-Mass-Shift (COMS) between lesions in the estimated and “ground-truth” on board 4D-CBCT. Results: Among 6 different XCAT scenarios corresponding to respirational and anatomical changes from planning CT to on-board using single 30° on-board projections, the VPD/COMS for SMM-WFD was reduced to 10.64±3.04%/1.20±0.45mm from 21.72±9.24%/1.80±0.53mm for GMM-FD. Using 15° orthogonal projections, the VPD/COMS was
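
    The following toy sketch (1-D, invented data; not the authors' implementation) illustrates the two ingredients of the abstracts above: a PCA motion model built from planning phases, and a weighted data-fidelity fit that emphasizes a chosen structure, standing in for the weighted free-form deformation step.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy 1-D stand-in: "deformation fields" over 100 voxels, with a PCA motion
# model built from 10 planning-4DCT phases.
n_vox, n_phase = 100, 10
basis_true = np.stack([np.sin(np.linspace(0, np.pi, n_vox)),
                       np.linspace(-1, 1, n_vox)])
training = rng.normal(size=(n_phase, 2)) @ basis_true
mean_dfm = training.mean(axis=0)
_, _, Vt = np.linalg.svd(training - mean_dfm, full_matrices=False)
pcs = Vt[:2]                                   # PCA motion model

# On-board data: deformation observed only at a few voxels (a crude stand-in
# for extremely limited-angle projections), with per-structure weights.
true_w = np.array([1.5, -0.8])
dfm_true = mean_dfm + true_w @ pcs
obs_idx = rng.choice(n_vox, size=12, replace=False)
tumor = (obs_idx > 40) & (obs_idx < 60)        # hypothetical tumor region
weights = np.where(tumor, 5.0, 1.0)            # emphasize tumor fidelity

# Weighted least squares for the PCA coefficients.
A = pcs[:, obs_idx].T * weights[:, None]
b = (dfm_true[obs_idx] - mean_dfm[obs_idx]) * weights
w_hat = np.linalg.lstsq(A, b, rcond=None)[0]
print("true coefficients:", true_w, " estimated:", np.round(w_hat, 2))
```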

  11. CISN Display Progress to Date - Reliable Delivery of Real-Time Earthquake Information, and ShakeMap to Critical End Users

    Science.gov (United States)

    Rico, H.; Hauksson, E.; Thomas, E.; Friberg, P.; Frechette, K.; Given, D.

    2003-12-01

    The California Integrated Seismic Network (CISN) has collaborated to develop a next-generation earthquake notification system that is nearing its first operations-ready release. The CISN Display actively alerts users to seismic data and vital earthquake hazards information following a significant event. It will primarily replace the Caltech/USGS Broadcast of Earthquakes (CUBE) and Rapid Earthquake Data Integration (REDI) Display as the principal means of delivering geographical seismic data to emergency operations centers, utility companies and media outlets. A subsequent goal is to provide automated access to the many Web products produced by regional seismic networks after an earthquake. Another aim is to create a highly configurable client, allowing user organizations to overlay infrastructure data critical to their roles as first-responders or lifeline operators. And the final goal is to integrate these requirements into a package offering several layers of reliability to ensure delivery of services. Central to the CISN Display's role as a gateway to Web-based earthquake products is its comprehensive XML-messaging schema. The message model uses many of the same attributes as the CUBE format, but extends the old standard by provisioning additional elements for products currently available, and others yet to be considered. The client consumes these XML-messages, sorts them through a resident Quake Data Merge filter, and posts updates that also include hyperlinks associated with specific event IDs on the display map. Earthquake products available for delivery to the CISN Display are ShakeMap, focal mechanisms, waveform data, felt reports, aftershock forecasts and earthquake commentaries. By design the XML-message schema can evolve as products and information needs change, without breaking existing applications that rely on it. The latest version of the CISN Display can also automatically download ShakeMaps and display shaking intensity within the GIS system. This

  12. A NOVEL TECHNIQUE TO IMPROVE PHOTOMETRY IN CONFUSED IMAGES USING GRAPHS AND BAYESIAN PRIORS

    International Nuclear Information System (INIS)

    Safarzadeh, Mohammadtaher; Ferguson, Henry C.; Lu, Yu; Inami, Hanae; Somerville, Rachel S.

    2015-01-01

    We present a new technique for overcoming confusion noise in deep far-infrared Herschel space telescope images making use of prior information from shorter λ < 2 μm wavelengths. For the deepest images obtained by Herschel, the flux limit due to source confusion is about a factor of three brighter than the flux limit due to instrumental noise and (smooth) sky background. We have investigated the possibility of de-confusing simulated Herschel PACS 160 μm images by using strong Bayesian priors on the positions and weak priors on the flux of sources. We find the blended sources and group them together and simultaneously fit their fluxes. We derive the posterior probability distribution function of fluxes subject to these priors through Monte Carlo Markov Chain (MCMC) sampling by fitting the image. Assuming we can predict the FIR flux of sources based on the ultraviolet-optical part of their SEDs to within an order of magnitude, the simulations show that we can obtain reliable fluxes and uncertainties at least a factor of three fainter than the confusion noise limit of 3σ_c = 2.7 mJy in our simulated PACS-160 image. This technique could in principle be used to mitigate the effects of source confusion in any situation where one has prior information of positions and plausible fluxes of blended sources. For Herschel, application of this technique will improve our ability to constrain the dust content in normal galaxies at high redshift
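
    The de-blending idea can be shown in miniature. The paper samples the posterior with MCMC; in the sketch below the PSF model is linear in the fluxes and the priors are Gaussian, so the posterior is available in closed form, which is enough to see how positional priors let two heavily blended sources be separated. All numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy 1-D "image": two heavily blended Gaussian PSFs at known prior positions.
x = np.arange(100)
pos = [45.0, 52.0]                       # positions from short-wavelength priors
sigma_psf, sigma_noise = 6.0, 0.05
A = np.stack([np.exp(-0.5 * ((x - p) / sigma_psf) ** 2) for p in pos], axis=1)

true_flux = np.array([1.0, 0.6])
y = A @ true_flux + sigma_noise * rng.normal(size=x.size)

# Weak Gaussian flux prior (a stand-in for order-of-magnitude flux priors).
mu0 = np.array([0.8, 0.8])
prior_prec = np.linalg.inv(np.diag([1.0, 1.0]))

# Gaussian posterior for a linear model: closed form instead of MCMC.
P = np.linalg.inv(A.T @ A / sigma_noise**2 + prior_prec)
post_mean = P @ (A.T @ y / sigma_noise**2 + prior_prec @ mu0)
post_sd = np.sqrt(np.diag(P))
print("fluxes:", np.round(post_mean, 3), "+/-", np.round(post_sd, 3))
```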

  13. The Prior Can Often Only Be Understood in the Context of the Likelihood

    Directory of Open Access Journals (Sweden)

    Andrew Gelman

    2017-10-01

    A key sticking point of Bayesian analysis is the choice of prior distribution, and there is a vast literature on potential defaults including uniform priors, Jeffreys’ priors, reference priors, maximum entropy priors, and weakly informative priors. These methods, however, often manifest a key conceptual tension in prior modeling: a model encoding true prior information should be chosen without reference to the model of the measurement process, but almost all common prior modeling techniques are implicitly motivated by a reference likelihood. In this paper we resolve this apparent paradox by placing the choice of prior into the context of the entire Bayesian analysis, from inference to prediction to model evaluation.

  14. Bayesian reliability demonstration for failure-free periods

    International Nuclear Information System (INIS)

    Coolen, F.P.A.; Coolen-Schrijner, P.; Rahrouh, M.

    2005-01-01

    We study sample sizes for testing as required for Bayesian reliability demonstration in terms of failure-free periods after testing, under the assumption that tests lead to zero failures. For the process after testing, we consider both deterministic and random numbers of tasks, including tasks arriving as Poisson processes. It turns out that the deterministic case is worst in the sense that it requires most tasks to be tested. We consider such reliability demonstration for a single type of task, as well as for multiple types of tasks to be performed by one system. We also consider the situation, where tests of different types of tasks may have different costs, aiming at minimal expected total costs, assuming that failure in the process would be catastrophic, in the sense that the process would be discontinued. Generally, these inferences are very sensitive to the choice of prior distribution, so one must be very careful with interpretation of non-informativeness of priors
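
    For the simplest case in this record, a conjugate calculation makes the required test effort, and its sensitivity to the prior, explicit. With a Beta(a, b) prior on the per-task failure probability and n zero-failure tests, the probability that the next m tasks are failure-free is B(a, b+n+m)/B(a, b+n). The sketch below (illustrative only, not the paper's cost model) searches for the smallest n meeting a 0.95 target under several "non-informative" priors.

```python
from math import lgamma, exp

def prob_failure_free(a, b, n, m):
    """P(next m tasks succeed | Beta(a, b) prior, n zero-failure tests)
       = B(a, b + n + m) / B(a, b + n)."""
    lB = lambda x, y: lgamma(x) + lgamma(y) - lgamma(x + y)
    return exp(lB(a, b + n + m) - lB(a, b + n))

def required_tests(a, b, m, target=0.95):
    """Smallest number of zero-failure tests meeting the target."""
    n = 0
    while prob_failure_free(a, b, n, m) < target:
        n += 1
    return n

# Sensitivity to the choice of "non-informative" prior (the point made above):
for a, b in [(1, 1), (0.5, 0.5), (1, 10)]:
    print(f"Beta({a},{b}) prior -> n =", required_tests(a, b, m=100))
```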

  15. Bayesian reliability demonstration for failure-free periods

    Energy Technology Data Exchange (ETDEWEB)

    Coolen, F.P.A. [Department of Mathematical Sciences, Science Laboratories, University of Durham, South Road, Durham, DH1 3LE (United Kingdom)]. E-mail: frank.coolen@durham.ac.uk; Coolen-Schrijner, P. [Department of Mathematical Sciences, Science Laboratories, University of Durham, South Road, Durham, DH1 3LE (United Kingdom); Rahrouh, M. [Department of Mathematical Sciences, Science Laboratories, University of Durham, South Road, Durham, DH1 3LE (United Kingdom)

    2005-04-01

    We study sample sizes for testing as required for Bayesian reliability demonstration in terms of failure-free periods after testing, under the assumption that tests lead to zero failures. For the process after testing, we consider both deterministic and random numbers of tasks, including tasks arriving as Poisson processes. It turns out that the deterministic case is worst in the sense that it requires most tasks to be tested. We consider such reliability demonstration for a single type of task, as well as for multiple types of tasks to be performed by one system. We also consider the situation, where tests of different types of tasks may have different costs, aiming at minimal expected total costs, assuming that failure in the process would be catastrophic, in the sense that the process would be discontinued. Generally, these inferences are very sensitive to the choice of prior distribution, so one must be very careful with interpretation of non-informativeness of priors.

  16. Twenty-fifth water reactor safety information meeting: Proceedings. Volume 2: Human reliability analysis and human performance evaluation; Technical issues related to rulemakings; Risk-informed, performance-based initiatives; High burn-up fuel research

    International Nuclear Information System (INIS)

    Monteleone, S.

    1998-03-01

    This three-volume report contains papers presented at the conference. The papers are printed in the order of their presentation in each session and describe progress and results of programs in nuclear safety research conducted in this country and abroad. Foreign participation in the meeting included papers presented by researchers from France, Japan, Norway, and Russia. The titles of the papers and the names of the authors have been updated and may differ from those that appeared in the final program of the meeting. This volume contains the following: (1) human reliability analysis and human performance evaluation; (2) technical issues related to rulemakings; (3) risk-informed, performance-based initiatives; and (4) high burn-up fuel research

  17. Twenty-fifth water reactor safety information meeting: Proceedings. Volume 2: Human reliability analysis and human performance evaluation; Technical issues related to rulemakings; Risk-informed, performance-based initiatives; High burn-up fuel research

    Energy Technology Data Exchange (ETDEWEB)

    Monteleone, S. [comp.] [Brookhaven National Lab., Upton, NY (United States)

    1998-03-01

    This three-volume report contains papers presented at the conference. The papers are printed in the order of their presentation in each session and describe progress and results of programs in nuclear safety research conducted in this country and abroad. Foreign participation in the meeting included papers presented by researchers from France, Japan, Norway, and Russia. The titles of the papers and the names of the authors have been updated and may differ from those that appeared in the final program of the meeting. This volume contains the following: (1) human reliability analysis and human performance evaluation; (2) technical issues related to rulemakings; (3) risk-informed, performance-based initiatives; and (4) high burn-up fuel research. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  18. Mathematical reliability an expository perspective

    CERN Document Server

    Mazzuchi, Thomas; Singpurwalla, Nozer

    2004-01-01

    In this volume, consideration was given to more advanced theoretical approaches and novel applications of reliability to ensure that topics having a futuristic impact were specifically included. Topics like finance, forensics, information, and orthopedics, as well as the more traditional reliability topics, were purposefully undertaken to make this collection different from the existing books in reliability. The entries have been categorized into seven parts, each emphasizing a theme that seems poised for the future development of reliability as an academic discipline with relevance. The seven parts are networks and systems; recurrent events; information and design; failure rate function and burn-in; software reliability and random environments; reliability in composites and orthopedics, and reliability in finance and forensics. Embedded within the above are some of the other currently active topics such as causality, cascading, exchangeability, expert testimony, hierarchical modeling, optimization and survival...

  19. The Influence of Prior Knowledge on the Retrieval-Directed Function of Note Taking in Prior Knowledge Activation

    Science.gov (United States)

    Wetzels, Sandra A. J.; Kester, Liesbeth; van Merrienboer, Jeroen J. G.; Broers, Nick J.

    2011-01-01

    Background: Prior knowledge activation facilitates learning. Note taking during prior knowledge activation (i.e., note taking directed at retrieving information from memory) might facilitate the activation process by enabling learners to build an external representation of their prior knowledge. However, taking notes might be less effective in…

  20. Action priors for learning domain invariances

    CSIR Research Space (South Africa)

    Rosman, Benjamin S

    2015-04-01

    behavioural invariances in the domain, by identifying actions to be prioritised in local contexts, invariant to task details. This information has the effect of greatly increasing the speed of solving new problems. We formalise this notion as action priors...

  1. Integration of prior knowledge into dense image matching for video surveillance

    Science.gov (United States)

    Menze, M.; Heipke, C.

    2014-08-01

    Three-dimensional information from dense image matching is a valuable input for a broad range of vision applications. While reliable approaches exist for dedicated stereo setups, they do not easily generalize to more challenging camera configurations. In the context of video surveillance, the typically large spatial extent of the region of interest and repetitive structures in the scene render the application of dense image matching a challenging task. In this paper we present an approach that derives strong prior knowledge from a planar approximation of the scene. This information is integrated into a graph-cut based image matching framework that treats the assignment of optimal disparity values as a labelling task. Introducing the planar prior greatly reduces both the ambiguities and the search space, and increases computational efficiency. The results provide a proof of concept of the proposed approach: it allows the reconstruction of dense point clouds in more general surveillance camera setups with wider stereo baselines.
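
    A minimal numerical sketch of why a planar prior helps is given below. It replaces the paper's graph-cut optimization with a per-pixel argmin over a synthetic, deliberately ambiguous cost volume, so all data, sizes and weights are invented; the point is only that biasing the labelling toward plane-predicted disparities suppresses ambiguous matches.

```python
import numpy as np

rng = np.random.default_rng(5)
H, W, D = 40, 40, 32                    # image size and disparity labels

# Stand-in matching cost volume: ambiguous (repetitive-structure-like) noise
# everywhere, with only a weak minimum at the true disparity.
true_disp = np.fromfunction(lambda y, x: 8 + 0.3 * y, (H, W))
cost = rng.uniform(0, 1, size=(H, W, D))
for d in range(D):
    cost[..., d] += 0.3 * np.abs(d - true_disp)   # weak data term

# Planar prior: disparity predicted by an (imperfect) plane fit to the scene.
plane_disp = 8.5 + 0.3 * np.fromfunction(lambda y, x: y, (H, W))
lam = 0.2
d_grid = np.arange(D)
prior_cost = lam * (d_grid[None, None, :] - plane_disp[..., None]) ** 2

# The paper optimizes with graph cuts; a per-pixel argmin of the combined
# energy already shows how the prior disambiguates repetitive costs.
wta_plain = cost.argmin(axis=2)
wta_prior = (cost + prior_cost).argmin(axis=2)
err = lambda d: float(np.abs(d - true_disp).mean())
print("mean |error| without prior:", round(err(wta_plain), 2),
      " with prior:", round(err(wta_prior), 2))
```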

  2. Extraction of microseismic waveforms characteristics prior to rock burst using Hilbert-Huang transform

    Science.gov (United States)

    Li, Xuelong; Li, Zhonghui; Wang, Enyuan; Feng, Junjun; Chen, Liang; Li, Nan; Kong, Xiangguo

    2016-09-01

    This study provides a new research idea concerning rock burst prediction. The characteristics of microseismic (MS) waveforms prior to and during the rock burst were studied through the Hilbert-Huang transform (HHT). In order to demonstrate the advantage of MS feature extraction based on HHT, the conventional analysis method (Fourier transform) was also used for comparison. The results show that HHT is simple and reliable, and can extract in-depth information about the characteristics of MS waveforms. About 10 days prior to the rock burst, the main frequency of MS waveforms shifts from high frequency to low frequency. Moreover, the waveform energy also shows an accumulation characteristic. Based on our study results, it can be concluded that MS signal analysis through HHT can provide valuable information about coal or rock deformation and fracture.
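
    The Hilbert half of the HHT pipeline is easy to sketch (a full HHT would first split the waveform into intrinsic mode functions by empirical mode decomposition, which is omitted here). The code below builds a synthetic waveform whose main frequency drops, loosely mimicking the high-to-low shift reported before rock bursts, and recovers the instantaneous frequency and energy envelope from the analytic signal; all parameters are invented.

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
# Synthetic "precursor" signal: the main frequency drops from 80 Hz to 20 Hz.
f_inst_true = np.where(t < 1.0, 80.0, 20.0)
phase = 2 * np.pi * np.cumsum(f_inst_true) / fs
sig = np.sin(phase) + 0.1 * np.random.default_rng(3).normal(size=t.size)

# Hilbert spectral analysis: analytic signal -> envelope and instantaneous
# frequency (in a full HHT this is applied to each IMF separately).
analytic = hilbert(sig)
envelope = np.abs(analytic)                       # instantaneous energy envelope
inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)

print("mean f, first second :", round(float(inst_freq[:1000].mean()), 1), "Hz")
print("mean f, second second:", round(float(inst_freq[1000:].mean()), 1), "Hz")
```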

  3. Being an Informed Consumer of Health Information and Assessment of Electronic Health Literacy in a National Sample of Internet Users: Validity and Reliability of the e-HLS Instrument.

    Science.gov (United States)

    Seçkin, Gül; Yeatts, Dale; Hughes, Susan; Hudson, Cassie; Bell, Valarie

    2016-07-11

    The Internet, with its capacity to provide information that transcends time and space barriers, continues to transform how people find and apply information to their own lives. With the current explosion in electronic sources of health information, including thousands of websites and hundreds of mobile phone health apps, electronic health literacy is gaining increasing prominence in health and medical research. An important dimension of electronic health literacy is the ability to appraise the quality of information that will facilitate everyday health care decisions. Health information seekers explore their care options by gathering information from health websites, blogs, Web-based forums, social networking websites, and advertisements, despite the fact that information quality on the Internet varies greatly. Nonetheless, research has lagged behind in establishing multidimensional instruments, in part due to the evolving construct of health literacy itself. The purpose of this study was to examine the psychometric properties of a new electronic health literacy (ehealth literacy) measure in a national sample of Internet users, with specific attention to older users. Our paper is motivated by the fact that ehealth literacy is an underinvestigated area of inquiry. Our sample was drawn from a panel of more than 55,000 participants maintained by Knowledge Networks, the largest national probability-based research panel for Web-based surveys. We examined the factor structure of a 19-item electronic Health Literacy Scale (e-HLS) through exploratory factor analysis (EFA) and confirmatory factor analysis, internal consistency reliability, and construct validity on a sample of adults (n=710) and a subsample of older adults (n=194). The AMOS graphics program 21.0 was used to construct a measurement model, linking latent factors obtained from EFA with 19 indicators to determine whether this factor structure achieved a good fit with our entire sample and the subsample (age ≥ 60

  4. The Prior Internet Resources 2017

    DEFF Research Database (Denmark)

    Engerer, Volkmar Paul; Albretsen, Jørgen

    2017-01-01

    The Prior Internet Resources (PIR) are presented. Prior's unpublished scientific manuscripts and his vast letter correspondence with fellow researchers of the time, his Nachlass, are now being transcribed by Prior researchers worldwide and form an integral part of PIR. It is demonstrated

  5. The Importance of Prior Knowledge.

    Science.gov (United States)

    Cleary, Linda Miller

    1989-01-01

    Recounts a college English teacher's experience of reading and rereading Noam Chomsky, building up a greater store of prior knowledge. Argues that Frank Smith provides a theory for the importance of prior knowledge and Chomsky's work provided a personal example with which to interpret and integrate that theory. (RS)

  6. 76 FR 23171 - Electric Reliability Organization Interpretations of Interconnection Reliability Operations and...

    Science.gov (United States)

    2011-04-26

    ... Reliability Standards for the Bulk-Power System, Order No. 693, FERC Stats. & Regs. ¶ 31,242, order on reh'g...-Power System reliability may request an interpretation of a Reliability Standard. The ERO's standards... information in its reliability assessments. The Reliability Coordinator must monitor Bulk Electric System...

  7. Proposal of resolution to create an inquiry commission on the French nuclear power plants reliability in case of earthquakes and on the safety, information and warning procedures in case of incidents

    International Nuclear Information System (INIS)

    2003-01-01

    This short paper presents the reasons for creating a parliamentary inquiry commission of 30 members on the reliability of nuclear power plants in France in case of earthquakes, and on the safety, information and warning procedures in case of accidents. (A.L.B.)

  8. CISN Display - Reliable Delivery of Real-time Earthquake Information, Including Rapid Notification and ShakeMap to Critical End Users

    Science.gov (United States)

    Rico, H.; Hauksson, E.; Thomas, E.; Friberg, P.; Given, D.

    2002-12-01

    earthquake information on the Web. The links are automatically created when product generators deliver CUBE formatted packets to a Quake Data Distribution System (QDDS) hub (new distribution methods may be used later). The "feeder" modules tap into the QDDS hub and convert the packets into XML-messages. These messages are forwarded to message queues, and then distributed to clients where URLs are dynamically created for these products and linked to events on the CISN Display map. The products may be downloaded out-of-band, and with the inclusion of a GIS mapping tool users can plot organizational assets on the CISN Display map and overlay them against key spectral data, such as ground accelerations. This gives Emergency Response Managers information useful in allocating limited personnel and resources after a major event. At the heart of the system's robustness is a well-established and reliable set of communication protocols for best-effort delivery of data. For critical users a Common Object Request Broker Architecture (CORBA) stateful connection is used via a dedicated signaling channel. The system employs several CORBA methods that alert users to changes in the link status. Loss of connectivity triggers a strategy that attempts to reconnect through various physical and logical paths. Thus, by building on past application successes and proven Internet advances the CISN Display targets a specific audience by providing enhancements previously not available from other applications.

  9. Nudging toward Inquiry: Awakening and Building upon Prior Knowledge

    Science.gov (United States)

    Fontichiaro, Kristin, Comp.

    2010-01-01

    "Prior knowledge" (sometimes called schema or background knowledge) is information one already knows that helps him/her make sense of new information. New learning builds on existing prior knowledge. In traditional reporting-style research projects, students bypass this crucial step and plow right into answer-finding. It's no wonder that many…

  10. Bayesian Prior Probability Distributions for Internal Dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Miller, G.; Inkret, W.C.; Little, T.T.; Martz, H.F.; Schillaci, M.E

    2001-07-01

    The problem of choosing a prior distribution for the Bayesian interpretation of measurements (specifically internal dosimetry measurements) is considered using a theoretical analysis and by examining historical tritium and plutonium urine bioassay data from Los Alamos. Two models for the prior probability distribution are proposed: (1) the log-normal distribution, when there is some additional information to determine the scale of the true result, and (2) the 'alpha' distribution (a simplified variant of the gamma distribution) when there is not. These models have been incorporated into version 3 of the Bayesian internal dosimetric code in use at Los Alamos (downloadable from our web site). Plutonium internal dosimetry at Los Alamos is now being done using prior probability distribution parameters determined self-consistently from population averages of Los Alamos data. (author)
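
    A minimal sketch of model (1) above, with invented numbers: a measurement with Gaussian error is combined with a log-normal prior for the true (positive) quantity on a grid, mirroring how a Bayesian bioassay interpretation weighs prior scale information against the measurement.

```python
import numpy as np

# Hypothetical numbers throughout: a bioassay result y with Gaussian
# measurement error, and a log-normal prior for the true amount.
y, sigma_meas = 1.2, 0.5      # measured value and its 1-sigma uncertainty
mu_ln, sigma_ln = 0.0, 1.0    # prior: ln(true) ~ Normal(mu_ln, sigma_ln)

true = np.linspace(1e-3, 8.0, 4000)   # grid over the true (positive) quantity
dx = true[1] - true[0]

prior = np.exp(-0.5 * ((np.log(true) - mu_ln) / sigma_ln) ** 2) / true
likelihood = np.exp(-0.5 * ((y - true) / sigma_meas) ** 2)

posterior = prior * likelihood
posterior /= posterior.sum() * dx     # normalize on the grid

post_mean = (true * posterior).sum() * dx
post_sd = np.sqrt(((true - post_mean) ** 2 * posterior).sum() * dx)
print(f"posterior mean = {post_mean:.3f}, sd = {post_sd:.3f}")
```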

  11. Can natural selection encode Bayesian priors?

    Science.gov (United States)

    Ramírez, Juan Camilo; Marshall, James A R

    2017-08-07

    The evolutionary success of many organisms depends on their ability to make decisions based on estimates of the state of their environment (e.g., predation risk) from uncertain information. These decision problems have optimal solutions and individuals in nature are expected to evolve the behavioural mechanisms to make decisions as if using the optimal solutions. Bayesian inference is the optimal method to produce estimates from uncertain data, thus natural selection is expected to favour individuals with the behavioural mechanisms to make decisions as if they were computing Bayesian estimates in typically-experienced environments, although this does not necessarily imply that favoured decision-makers do perform Bayesian computations exactly. Each individual should evolve to behave as if updating a prior estimate of the unknown environment variable to a posterior estimate as it collects evidence. The prior estimate represents the decision-maker's default belief regarding the environment variable, i.e., the individual's default 'worldview' of the environment. This default belief has been hypothesised to be shaped by natural selection and represent the environment experienced by the individual's ancestors. We present an evolutionary model to explore how accurately Bayesian prior estimates can be encoded genetically and shaped by natural selection when decision-makers learn from uncertain information. The model simulates the evolution of a population of individuals that are required to estimate the probability of an event. Every individual has a prior estimate of this probability and collects noisy cues from the environment in order to update its prior belief to a Bayesian posterior estimate with the evidence gained. The prior is inherited and passed on to offspring. Fitness increases with the accuracy of the posterior estimates produced. Simulations show that prior estimates become accurate over evolutionary time. In addition to these 'Bayesian' individuals, we also
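
    The core of the described model fits in a few lines. Below is a toy rendition, not the authors' simulation: inherited priors are Beta distributions, cues are binomial draws, fitness rewards accurate posterior means, and all constants are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    P_TRUE, N_CUES, POP, GENS, MUT = 0.2, 5, 200, 300, 0.05

    a = rng.uniform(0.5, 5.0, POP)   # inherited Beta prior pseudo-counts
    b = rng.uniform(0.5, 5.0, POP)

    for _ in range(GENS):
        cues = rng.binomial(N_CUES, P_TRUE, POP)       # noisy evidence per individual
        post_mean = (a + cues) / (a + b + N_CUES)      # Bayesian posterior mean
        fitness = -(post_mean - P_TRUE) ** 2           # accuracy-based fitness
        w = fitness - fitness.min() + 1e-9             # shift to positive weights
        parents = rng.choice(POP, POP, p=w / w.sum())  # fitness-proportional selection
        a = np.abs(a[parents] + rng.normal(0, MUT, POP))  # inheritance with mutation
        b = np.abs(b[parents] + rng.normal(0, MUT, POP))

    # The mean inherited prior estimate should drift toward the true probability 0.2.
    print("mean inherited prior estimate:", (a / (a + b)).mean())
    ```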

  12. An Introduction To Reliability

    International Nuclear Information System (INIS)

    Park, Kyoung Su

    1993-08-01

    This book introduces reliability, covering the definition of reliability, reliability requirements, the system life cycle and reliability, and reliability and failure rate (overview, reliability characteristics, chance failures, failure rates that change over time, failure modes, and replacement), as well as reliability in engineering design, reliability testing under failure-rate assumptions, plotting of reliability data, prediction of system reliability, system maintenance, failure (overview and failure relays), and analysis of system safety.

  13. Recruiting for Prior Service Market

    Science.gov (United States)

    2008-06-01

    …perceptions, expectations and issues for re-enlistment • Develop potential marketing and advertising tactics and strategies targeted to the defined…

  14. Total Variability Modeling using Source-specific Priors

    DEFF Research Database (Denmark)

    Shepstone, Sven Ewan; Lee, Kong Aik; Li, Haizhou

    2016-01-01

    …sequence of an utterance. In both cases the prior for the latent variable is assumed to be non-informative, since for homogeneous datasets there is no gain in generality in using an informative prior. This work shows, in the heterogeneous case, that using informative priors for computing the posterior can lead to favorable results. We focus on modeling the priors using a minimum divergence criterion or factor analysis techniques. Tests on the NIST 2008 and 2010 Speaker Recognition Evaluation (SRE) datasets show that our proposed method beats four baselines: for i-vector extraction using an already trained matrix, for the short2-short3 task in SRE'08, five out of eight female and four out of eight male common conditions were improved. For the core-extended task in SRE'10, four out of nine female and six out of nine male common conditions were improved. When incorporating prior information…

  15. Assessment of Prior Learning in Adult Vocational Education and Training

    Directory of Open Access Journals (Sweden)

    Vibe Aarkrog

    2015-04-01

    Full Text Available The article presents the results of a study of school-based Assessment of Prior Learning of adults who have enrolled as students at a VET college in order to qualify as skilled workers. Based on examples of VET teachers' methods for assessing students' prior learning in the programmes for gastronomes and child care assistants, respectively, the article discusses two issues in relation to Assessment of Prior Learning: the encounter between practical experience and school-based knowledge, and the validity and reliability of the assessment procedures. Through focusing on the students' knowing-that and knowing-why, the assessment is based on a scholastic perception of the students' needs for training, reflecting one of the most important challenges in Assessment of Prior Learning: how can practical experience be transformed into credit for the knowledge parts of the programs? The study shows that by combining several Assessment of Prior Learning methods and comparing the teachers' assessments, the teachers respond to the issues of validity and reliability. However, validity and reliability might be strengthened even further if the competencies are well defined, if the education system is aware of securing a reasonable balance between knowing-how, knowing-that, and knowing-why, and if the teachers are adequately trained for the assessment procedures.

  16. PET reconstruction via nonlocal means induced prior.

    Science.gov (United States)

    Hou, Qingfeng; Huang, Jing; Bian, Zhaoying; Chen, Wufan; Ma, Jianhua

    2015-01-01

    The traditional Bayesian priors for maximum a posteriori (MAP) reconstruction methods usually incorporate local neighborhood interactions that penalize large deviations in parameter estimates for adjacent pixels; therefore, only local pixel differences are utilized. This limits their ability to penalize image roughness. To achieve high-quality PET image reconstruction, this study investigates a MAP reconstruction strategy incorporating a nonlocal means induced (NLMi) prior (NLMi-MAP), which utilizes the global similarity information of the image. The present NLMi prior approximates the derivative of the Gibbs energy function by an NLM filtering process. Specifically, the NLMi prior is obtained by subtracting the current image estimate from its NLM-filtered version and feeding the residual error back to the reconstruction filter to yield the new image estimate. We tested the present NLMi-MAP method with simulated and real PET datasets. Comparison studies with conventional filtered backprojection (FBP) and several iterative reconstruction methods clearly demonstrate that the present NLMi-MAP method performs better in lowering noise, preserving image edges, and achieving a higher signal-to-noise ratio (SNR). Extensive experimental results show that the NLMi-MAP method outperforms the existing methods in terms of cross profile, noise reduction, SNR, root mean square error (RMSE) and correlation coefficient (CORR).
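
    The feedback structure of the NLMi prior is simple enough to sketch. The following illustrates that structure only and is not the authors' implementation: scikit-image's NLM filter stands in for the paper's filtering process, and `em_update` is a hypothetical placeholder for a data-fidelity step such as one MLEM sub-iteration.

    ```python
    import numpy as np
    from skimage.restoration import denoise_nl_means

    def nlmi_prior_gradient(estimate, h=0.05):
        # Residual between the current estimate and its NLM-filtered version,
        # used as a surrogate for the derivative of the Gibbs energy function.
        filtered = denoise_nl_means(estimate, patch_size=3, patch_distance=5, h=h)
        return estimate - filtered

    def map_iteration(estimate, em_update, beta=0.3):
        # One illustrative MAP-style sweep: data-fidelity update, then feedback
        # of the prior residual with strength beta.
        estimate = em_update(estimate)   # e.g. one MLEM sub-iteration
        return estimate - beta * nlmi_prior_gradient(estimate)

    # Toy usage: the identity stands in for the (hypothetical) MLEM update.
    image = np.random.rand(64, 64)
    image = map_iteration(image, em_update=lambda x: x)
    ```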

  17. Prior knowledge in recalling arguments in bioethical dilemmas

    Directory of Open Access Journals (Sweden)

    Hiemke Katharina Schmidt

    2015-09-01

    Full Text Available Prior knowledge is known to facilitate learning new information. Normally in studies confirming this outcome the relationship between prior knowledge and the topic to be learned is obvious: the information to be acquired is part of the domain or topic to which the prior knowledge belongs. This raises the question as to whether prior knowledge of various domains facilitates recalling information. In this study 79 eleventh-grade students completed a questionnaire on their prior knowledge of seven different domains related to the bioethical dilemma of prenatal diagnostics. The students read a text containing arguments for and arguments against prenatal diagnostics. After one week and again 12 weeks later they were asked to write down all the arguments they remembered. Prior knowledge helped them recall the arguments one week (r = .350 and 12 weeks (r = .316 later. Prior knowledge of three of the seven domains significantly helped them recall the arguments one week later (correlations between r = .194 to r = .394. Partial correlations with interest as a control item revealed that interest did not explain the relationship between prior knowledge and recall. Prior knowledge of different domains jointly supports the recall of arguments related to bioethical topics.

  18. Prior Sensitivity Analysis in Default Bayesian Structural Equation Modeling.

    Science.gov (United States)

    van Erp, Sara; Mulder, Joris; Oberski, Daniel L

    2017-11-27

    Bayesian structural equation modeling (BSEM) has recently gained popularity because it enables researchers to fit complex models and solve some of the issues often encountered in classical maximum likelihood estimation, such as nonconvergence and inadmissible solutions. An important component of any Bayesian analysis is the prior distribution of the unknown model parameters. Often, researchers rely on default priors, which are constructed in an automatic fashion without requiring substantive prior information. However, the prior can have a serious influence on the estimation of the model parameters, which affects the mean squared error, bias, coverage rates, and quantiles of the estimates. In this article, we investigate the performance of three different default priors: noninformative improper priors, vague proper priors, and empirical Bayes priors, with the latter being novel in the BSEM literature. Based on a simulation study, we find that these three default BSEM methods may perform very differently, especially with small samples. A careful prior sensitivity analysis is therefore needed when performing a default BSEM analysis. For this purpose, we provide a practical step-by-step guide for practitioners on conducting a prior sensitivity analysis in default BSEM. Our recommendations are illustrated using a well-known case study from the structural equation modeling literature, and all code for conducting the prior sensitivity analysis is available in the online supplemental materials. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
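
    The flavor of such a sensitivity check can be conveyed outside the SEM setting with a conjugate toy model (not the authors' code; all numbers invented): refit under increasingly vague priors and watch how far the estimate moves when the sample is small.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n, true_beta, sigma = 20, 0.7, 1.0
    x = rng.normal(size=n)
    y = true_beta * x + rng.normal(0, sigma, n)

    for prior_sd in [0.1, 1.0, 10.0, 100.0]:
        # Conjugate posterior for a single coefficient with prior N(0, prior_sd^2)
        # and known error sd: precision-weighted combination of prior and data.
        prec = (x @ x) / sigma**2 + 1.0 / prior_sd**2
        post_mean = (x @ y / sigma**2) / prec
        print(f"prior sd {prior_sd:6.1f} -> posterior mean {post_mean:.3f}")
    ```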

  19. Predictive values derived from lower wisdom teeth developmental stages on orthopantomograms to calculate the chronological age in adolescence and young adults as a prerequisite to obtain age-adjusted informed patient consent prior to elective surgical procedures in young patients with incomplete or mismatched personal data.

    Science.gov (United States)

    Friedrich, Reinhard E; Schmidt, Kirsten; Treszl, András; Kersten, Jan F

    2016-01-01

    Introduction: Surgical procedures require informed patient consent, which is mandatory prior to any procedure. These requirements apply in particular to elective surgical procedures. The communication with the patient about the procedure has to be comprehensive and based on mutual understanding. Furthermore, the informed consent has to take into account whether a patient is of legal age. As a result of large-scale migration, there are occasionally patients scheduled for medical procedures whose chronological age cannot be assessed reliably by physical inspection alone. Age determination based on assessing wisdom tooth developmental stages can help determine whether individuals involved in medical procedures are of legal age, i.e., responsible and accountable. At present, the assessment of wisdom tooth developmental stages allows only a crude estimate of an individual's age. This study explores possibilities for more precise predictions of the age of individuals, with emphasis on the legal age threshold of 18 years. Material and Methods: 1,900 dental orthopantomograms (female 938, male 962, age: 15-24 years), taken between the years 2000 and 2013 for diagnosis and treatment of diseases of the jaws, were evaluated. 1,895 orthopantomograms (female 935, male 960) of 1,804 patients (female 872, male 932) met the inclusion criteria. The archives of the Department of Diagnostic Radiology in Dentistry, University Medical Center Hamburg-Eppendorf, and of an oral and maxillofacial office in Rostock, Germany, were used to collect a sufficient number of radiographs. An effort was made to achieve an almost equal distribution of age categories in this study group; age was calculated to the exact day. The radiological criteria for lower third molar investigation were: presence and extension of the periodontal space, alveolar bone loss, emergence of the tooth, and stage of tooth mineralization (according to Demirjian). Univariate and multivariate general linear models were calculated

  20. Training shortest-path tractography: Automatic learning of spatial priors

    DEFF Research Database (Denmark)

    Kasenburg, Niklas; Liptrot, Matthew George; Reislev, Nina Linde

    2016-01-01

    Tractography is the standard tool for automatic delineation of white matter tracts from diffusion weighted images. However, the output of tractography often requires post-processing to remove false positives and ensure a robust delineation of the studied tract, and this demands expert prior knowledge. Here we demonstrate how such prior knowledge, or indeed any prior spatial information, can be automatically incorporated into a shortest-path tractography approach to produce more robust results. We describe how such a prior can be automatically generated (learned) from a population, and we…

  1. Waste package reliability analysis

    International Nuclear Information System (INIS)

    Pescatore, C.; Sastre, C.

    1983-01-01

    Proof of future performance of a complex system such as a high-level nuclear waste package over a period of hundreds to thousands of years cannot be had in the ordinary sense of the word. The general method of probabilistic reliability analysis could provide an acceptable framework to identify, organize, and convey the information necessary to satisfy the criterion of reasonable assurance of waste package performance according to the regulatory requirements set forth in 10 CFR 60. General principles which may be used to evaluate the qualitative and quantitative reliability of a waste package design are indicated and illustrated with a sample calculation of a repository concept in basalt. 8 references, 1 table

  2. Integration of Human Reliability Analysis Models into the Simulation-Based Framework for the Risk-Informed Safety Margin Characterization Toolkit

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Rasmussen, Martin; Ulrich, Thomas; Groth, Katrina; Smith, Curtis

    2016-01-01

    This report presents an application of a computation-based human reliability analysis (HRA) framework called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER). HUNTER has been developed not as a standalone HRA method but rather as a framework that ties together different HRA methods to model the dynamic risk of human activities as part of an overall probabilistic risk assessment (PRA). While we have adopted particular methods to build an initial model, the HUNTER framework is meant to be intrinsically flexible to new pieces that achieve particular modeling goals. In the present report, the HUNTER implementation has the following goals:
    • Integration with a high-fidelity thermal-hydraulic model capable of modeling nuclear power plant behaviors and transients
    • Consideration of a PRA context
    • Incorporation of a solid psychological basis for operator performance
    • Demonstration of a functional dynamic model of a plant upset condition and appropriate operator response
    This report outlines these efforts and presents the case study of a station blackout scenario to demonstrate the various modules developed to date under the HUNTER research umbrella.

  3. Integration of Human Reliability Analysis Models into the Simulation-Based Framework for the Risk-Informed Safety Margin Characterization Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rasmussen, Martin [Norwegian Univ. of Science and Technology, Trondheim (Norway). Social Research; Herberger, Sarah [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ulrich, Thomas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-06-01

    This report presents an application of a computation-based human reliability analysis (HRA) framework called the Human Unimodel for Nuclear Technology to Enhance Reliability (HUNTER). HUNTER has been developed not as a standalone HRA method but rather as a framework that ties together different HRA methods to model the dynamic risk of human activities as part of an overall probabilistic risk assessment (PRA). While we have adopted particular methods to build an initial model, the HUNTER framework is meant to be intrinsically flexible to new pieces that achieve particular modeling goals. In the present report, the HUNTER implementation has the following goals:
    • Integration with a high-fidelity thermal-hydraulic model capable of modeling nuclear power plant behaviors and transients
    • Consideration of a PRA context
    • Incorporation of a solid psychological basis for operator performance
    • Demonstration of a functional dynamic model of a plant upset condition and appropriate operator response
    This report outlines these efforts and presents the case study of a station blackout scenario to demonstrate the various modules developed to date under the HUNTER research umbrella.

  4. System Reliability Engineering

    International Nuclear Information System (INIS)

    Lim, Tae Jin

    2005-02-01

    This book covers reliability engineering, including quality and reliability, reliability data, the importance of reliability engineering, reliability measures, the Poisson process (goodness-of-fit tests and the Poisson arrival model), reliability estimation (e.g., for the exponential distribution), reliability of systems, availability, preventive maintenance (replacement policies, minimal-repair policy, shock models, spares, group maintenance and periodic inspection), analysis of common cause failures, and models of repair effectiveness.
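
    Two of the staples such a text covers, constant-failure-rate ("chance failure") estimation and steady-state availability, fit in a few lines; the numbers below are invented for illustration.

    ```python
    import numpy as np

    lifetimes = np.array([120.0, 340.0, 95.0, 410.0, 230.0])  # hours to failure
    lam = len(lifetimes) / lifetimes.sum()                    # exponential-rate MLE

    def reliability(t, lam=lam):
        return np.exp(-lam * t)        # R(t) for a constant-failure-rate unit

    mtbf, mttr = 1.0 / lam, 8.0        # repair time assumed for illustration
    availability = mtbf / (mtbf + mttr)
    print(f"lambda={lam:.4f}/h  R(100 h)={reliability(100.0):.3f}  A={availability:.3f}")
    ```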

  5. Discussion on an informative system set-up for the registration and processing of reliability data on FBR components in view of its application to design and safety studies and plant exploitation improvement

    International Nuclear Information System (INIS)

    Righini, R.; Sola, P.G.; Zappellini, G.

    1990-01-01

    This report describes the set-up and management activities carried out by ENEA-VEL, in collaboration with NIER, in the development of a reliability data bank on fast reactor components. The data bank consists of an informative system implemented on the IBM 3090 computer of the ENEA centre in Bologna, starting from the software of the CEDB set up by CCR Euratom of Ispra for the registration of reliability data on thermal reactor components. The report contains a detailed description of all the modules (engineering, operating, etc.) provided in the informative system and of the modifications introduced by ENEA to adapt them to the peculiarities of fast reactors and to increase flexibility; a short description of the available data processing methods is also included. This is followed by a comparison between the results obtained by applying the classical methods and the particular ones set up by ENEA; the comparison demonstrates the importance of the method applied in obtaining significant processed reliability data. The report also shows the value of the data bank in improving component design and plant safety and exploitation, with particular reference to the identification of critical areas and to the definition of the best inspection and maintenance programs.

  6. Photovoltaic performance and reliability workshop

    Energy Technology Data Exchange (ETDEWEB)

    Mrig, L. [ed.

    1993-12-01

    This workshop was the sixth in a series of workshops sponsored by NREL/DOE under the general subject of photovoltaic testing and reliability during the period 1986-1993. PV performance and PV reliability are at least as important as PV cost, if not more so. In the US, PV manufacturers, DOE laboratories, electric utilities, and others are engaged in photovoltaic reliability research and testing. This group of researchers and others interested in the field was brought together to exchange technical knowledge and field experience related to current information in this evolving field of PV reliability. The papers presented here reflect this effort since the last workshop, held in September 1992. The topics covered include: cell and module characterization, module and system testing, durability and reliability, system field experience, and standards and codes.

  7. Identification of subsurface structures using electromagnetic data and shape priors

    Energy Technology Data Exchange (ETDEWEB)

    Tveit, Svenn, E-mail: svenn.tveit@uni.no [Uni CIPR, Uni Research, Bergen 5020 (Norway); Department of Mathematics, University of Bergen, Bergen 5020 (Norway); Bakr, Shaaban A., E-mail: shaaban.bakr1@gmail.com [Department of Mathematics, Faculty of Science, Assiut University, Assiut 71516 (Egypt); Uni CIPR, Uni Research, Bergen 5020 (Norway); Lien, Martha, E-mail: martha.lien@octio.com [Uni CIPR, Uni Research, Bergen 5020 (Norway); Octio AS, Bøhmergaten 44, Bergen 5057 (Norway); Mannseth, Trond, E-mail: trond.mannseth@uni.no [Uni CIPR, Uni Research, Bergen 5020 (Norway); Department of Mathematics, University of Bergen, Bergen 5020 (Norway)

    2015-03-01

    We consider the inverse problem of identifying large-scale subsurface structures using the controlled source electromagnetic method. To identify structures in the subsurface where the contrast in electric conductivity can be small, regularization is needed to bias the solution towards preserving structural information. We propose to combine two approaches for regularization of the inverse problem. In the first approach we utilize a model-based, reduced, composite representation of the electric conductivity that is highly flexible, even for a moderate number of degrees of freedom. With a low number of parameters, the inverse problem is efficiently solved using a standard, second-order gradient-based optimization algorithm. Further regularization is obtained using structural prior information, available, e.g., from interpreted seismic data. The reduced conductivity representation is suitable for incorporation of structural prior information. Such prior information cannot, however, be accurately modeled with a Gaussian distribution. To alleviate this, we incorporate the structural information using shape priors. The shape prior technique requires the choice of a kernel function, which is application dependent. We argue for using the conditionally positive definite kernel, which is shown to have computational advantages over the commonly applied Gaussian kernel for our problem. Numerical experiments on various test cases show that the methodology is able to identify fairly complex subsurface electric conductivity distributions while preserving structural prior information during the inversion.

  8. Web measurement - a tool to achieve reliable information on custody transfer measurement systems in pipeline operations; Web medicao - uma ferramenta de consolidacao de informacoes sobre sistemas de medicao para transferencia de custodia

    Energy Technology Data Exchange (ETDEWEB)

    Cid, Eliane Areas; Freitas, Surama de Oliveitra [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil). Centro de Pesquisas; Ferreira, Ana Luisa Auler da Silva; Dias, Gerson Vieira [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2003-07-01

    The prompt and organized availability of information about measurement systems is essential for custody transfer in the pipeline transportation business. This organized information can serve as a basis for operational, maintenance and commercial groups in pipeline transportation companies, and can also help management in planning future improvements in hardware for custody transfer measurement. In nation-wide companies like TRANSPETRO, information about custody transfer measurement systems, if not organized, will be scattered geographically and organizationally. In organizing this kind of information, distributed systems have a big advantage, with information maintained by operational groups and centralized at the headquarters of the company. This paper describes the implementation of a system for consolidating and updating company information about custody transfer measurement systems for liquids and gas. The system has been implemented on the intranet, allowing initial data entry in a distributed way and centralized validation by the headquarters engineering group. The new methodology has sharply increased the reliability of the information on custody transfer measurement systems in the company. (author)

  9. Models for reliability and management of NDT data

    International Nuclear Information System (INIS)

    Simola, K.

    1997-01-01

    In this paper the reliability of NDT measurements is approached from three directions: we model the flaw sizing performance and the probability of flaw detection, and develop models to update the knowledge of the true flaw size based on sequential measurement results and the flaw sizing reliability model. In the models discussed, the measured flaw characteristics (depth, length) are assumed to be simple functions of the true characteristics plus random noise corresponding to measurement errors, and the models are based on logarithmic transforms. Models for Bayesian updating of the flaw size distributions were developed. Using these models, it is possible to take into account prior information on the flaw size and combine it with the measured results. A Bayesian approach could contribute, e.g., to the definition of an appropriate combination of practical assessments and technical justifications in NDT system qualifications, as expressed by the European regulatory bodies.
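
    With a normal prior on log flaw depth, the logarithmic-transform updating described here reduces to a closed-form normal-normal update after each inspection. A minimal sketch under that assumption follows (illustrative values, not the paper's code).

    ```python
    import numpy as np

    mu, var = np.log(2.0), 0.5**2    # prior on log flaw depth (mm), illustrative
    meas_var = 0.3**2                # sizing-error variance on the log scale

    for measured in [2.6, 2.2, 2.9]:                   # sequential sizing results (mm)
        m = np.log(measured)
        post_var = 1.0 / (1.0 / var + 1.0 / meas_var)
        mu = post_var * (mu / var + m / meas_var)      # precision-weighted update
        var = post_var

    print(f"posterior median depth: {np.exp(mu):.2f} mm")
    ```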

  10. On Bayesian System Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen Ringi, M.

    1995-05-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so-called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non-identical environments. 85 refs.

  11. On Bayesian System Reliability Analysis

    International Nuclear Information System (INIS)

    Soerensen Ringi, M.

    1995-01-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so called frequentistic school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non identical environments. 85 refs

  12. 2015 NREL Photovoltaic Module Reliability Workshops

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-14

    NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology--both critical goals for moving PV technologies deeper into the electricity marketplace.

  13. 2016 NREL Photovoltaic Module Reliability Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, Sarah [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-07

    NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology - both critical goals for moving PV technologies deeper into the electricity marketplace.

  14. Application of reliability methods in Ontario Hydro

    International Nuclear Information System (INIS)

    Jeppesen, R.; Ravishankar, T.J.

    1985-01-01

    Ontario Hydro has established a reliability program in support of its substantial nuclear program. Application of the reliability program to achieve both production and safety goals is described. The value of such a reliability program is evident in the record of Ontario Hydro's operating nuclear stations. The factors which have contributed to the success of the reliability program are identified as: line management's commitment to reliability; selective and judicious application of reliability methods; establishing performance goals and monitoring in-service performance; and the collection, distribution, review and utilization of performance information to facilitate cost-effective achievement of goals and improvements. (orig.)

  15. The effect of factors related to prior schooling on student persistence ...

    African Journals Online (AJOL)

    ... of factors related to prior schooling on student persistence in higher education. ... Once reliable profiles of these students have been established and related to ... school counsellors, teachers in both schooling and higher education, and for ...

  16. AMSAA Reliability Growth Guide

    National Research Council Canada - National Science Library

    Broemm, William

    2000-01-01

    ... has developed reliability growth methodology for all phases of the process, from planning to tracking to projection. The report presents this methodology and associated reliability growth concepts.
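
    Much reliability growth tracking in this tradition rests on the Crow/AMSAA power-law (NHPP) model, whose maximum-likelihood fit is short enough to show; the failure times below are invented.

    ```python
    import numpy as np

    t = np.array([4.0, 15.0, 40.0, 90.0, 180.0, 310.0, 500.0])  # cumulative hours at failure
    T, n = t[-1], len(t)                       # failure-truncated test

    beta = n / np.log(T / t).sum()             # shape MLE; beta < 1 indicates growth
    lam = n / T**beta                          # scale MLE
    mtbf_now = 1.0 / (lam * beta * T**(beta - 1.0))  # instantaneous MTBF at time T
    print(f"beta={beta:.2f}  current MTBF ~ {mtbf_now:.0f} h")
    ```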

  17. Information

    International Nuclear Information System (INIS)

    Boyard, Pierre.

    1981-01-01

    The fear of nuclear energy, and more particularly of radioactive wastes, is analyzed in its sociological context. Everybody agrees on the need for information; information is available, but there is a problem with its dissemination. Reactions of the public are analyzed, and journalists, scientists and teachers all have a role to play [fr

  18. Optimization of Water Chemistry to Ensure Reliable Water Reactor Fuel Performance at High Burnup and in Ageing Plant (FUWAC). Additional Information

    International Nuclear Information System (INIS)

    2011-10-01

    This report presents the results of the Coordinated Research Project (CRP) on Optimization of Water Chemistry to Ensure Reliable Water Reactor Fuel Performance at High Burnup and in Ageing Plants (FUWAC, 2006-2009). It provides an overview of the results of investigations into the current state of water chemistry practice and concerns in the primary circuit of water cooled power reactors, including: corrosion of primary circuit materials; deposit composition and thickness on the fuel; crud-induced power shift (CIPS); fuel oxide growth and thickness; and radioactivity buildup in the reactor coolant system (RCS). The FUWAC CRP is a follow-up to the DAWAC CRP (Data Processing Technologies and Diagnostics for Water Chemistry and Corrosion Control in Nuclear Power Plants, 2001-2005). The DAWAC project improved the data processing technologies and diagnostics for water chemistry and corrosion control in nuclear power plants (NPPs). With the improved methods for controlling and monitoring water chemistry now available, it was felt that a review of the principles of water chemistry management should be undertaken in the light of new materials, more onerous operating conditions, emergent issues such as CIPS, also known as axial offset anomaly (AOA), and the ageing of operating power plants. In the framework of this CRP, water chemistry specialists from 16 nuclear utilities and research organizations, representing 15 countries, exchanged experimental and operational data, models and insights into water chemistry management. The CD-ROM attached to the printed IAEA-TECDOC includes the report itself, detailed progress reports of three Research Coordination Meetings (RCMs) (Annexes I-III), and the reports and presentations made during the project by the participants.

  19. Example-driven manifold priors for image deconvolution.

    Science.gov (United States)

    Ni, Jie; Turaga, Pavan; Patel, Vishal M; Chellappa, Rama

    2011-11-01

    Image restoration methods that exploit prior information about images to be estimated have been extensively studied, typically using the Bayesian framework. In this paper, we consider the role of prior knowledge of the object class in the form of a patch manifold to address the deconvolution problem. Specifically, we incorporate unlabeled image data of the object class, say natural images, in the form of a patch-manifold prior for the object class. The manifold prior is implicitly estimated from the given unlabeled data. We show how the patch-manifold prior effectively exploits the available sample class data for regularizing the deblurring problem. Furthermore, we derive a generalized cross-validation (GCV) function to automatically determine the regularization parameter at each iteration without explicitly knowing the noise variance. Extensive experiments show that this method performs better than many competitive image deconvolution methods.

  20. A reliability simulation language for reliability analysis

    International Nuclear Information System (INIS)

    Deans, N.D.; Miller, A.J.; Mann, D.P.

    1986-01-01

    The results of work being undertaken to develop a Reliability Description Language (RDL) which will enable reliability analysts to describe complex reliability problems in a simple, clear and unambiguous way are described. Component and system features can be stated in a formal manner and subsequently used, along with control statements to form a structured program. The program can be compiled and executed on a general-purpose computer or special-purpose simulator. (DG)

  1. Calculating system reliability with SRFYDO

    Energy Technology Data Exchange (ETDEWEB)

    Morzinski, Jerome [Los Alamos National Laboratory; Anderson - Cook, Christine M [Los Alamos National Laboratory; Klamann, Richard M [Los Alamos National Laboratory

    2010-01-01

    SRFYDO is a process for estimating reliability of complex systems. Using information from all applicable sources, including full-system (flight) data, component test data, and expert (engineering) judgment, SRFYDO produces reliability estimates and predictions. It is appropriate for series systems with possibly several versions of the system which share some common components. It models reliability as a function of age and up to 2 other lifecycle (usage) covariates. Initial output from its Exploratory Data Analysis mode consists of plots and numerical summaries so that the user can check data entry and model assumptions, and help determine a final form for the system model. The System Reliability mode runs a complete reliability calculation using Bayesian methodology. This mode produces results that estimate reliability at the component, sub-system, and system level. The results include estimates of uncertainty, and can predict reliability at some not-too-distant time in the future. This paper presents an overview of the underlying statistical model for the analysis, discusses model assumptions, and demonstrates usage of SRFYDO.
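
    SRFYDO itself is not reproduced here, but the series-system roll-up it performs can be sketched in the same Bayesian spirit: Beta posteriors per component from pass/fail test data, propagated by Monte Carlo to a system-level estimate with uncertainty. Component names and counts below are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    tests = {"comp_A": (48, 50), "comp_B": (29, 30), "comp_C": (97, 100)}  # (successes, trials)

    draws = np.ones(10_000)
    for succ, trials in tests.values():
        # Beta(1, 1) prior per component; a series system multiplies reliabilities.
        draws *= rng.beta(1 + succ, 1 + trials - succ, 10_000)

    lo, med, hi = np.percentile(draws, [5, 50, 95])
    print(f"system reliability: median {med:.3f}, 90% interval ({lo:.3f}, {hi:.3f})")
    ```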

  2. Component reliability data for use in probabilistic safety assessment

    International Nuclear Information System (INIS)

    1988-10-01

    Generic component reliability data are indispensable in any probabilistic safety analysis (PSA). It is not realistic to assume that all possible component failures and failure modes modeled in a PSA would be available from the operating experience of a specific plant in a statistically meaningful way. The degree to which generic data are used in PSAs varies from case to case. Some studies are based entirely on generic data, while others use generic data as prior information to be specialized with plant-specific data. Most studies, however, use a combination where data for certain components come from generic data sources and others from Bayesian updating. The IAEA effort to compile a generic component reliability database aimed at facilitating the use of data available in the literature and at highlighting pitfalls which deserve special consideration. It was also intended to complement the fault tree and event tree package (PSAPACK) and to facilitate its use. Moreover, it should be noted that the IAEA has recently initiated a Coordinated Research Program in Reliability Data Collection, Retrieval and Analysis, in whose framework the issues identified as most affecting the quality of existing databases will be addressed. This report presents the results of a compilation made from the specialized literature and includes reliability data for components usually considered in PSA.
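
    The generic-prior-plus-plant-data combination mentioned above is most commonly the conjugate gamma-Poisson update: a generic failure rate expressed as a gamma prior is specialized with plant-specific failure counts and exposure time. A minimal sketch with invented numbers:

    ```python
    import numpy as np

    alpha, beta = 0.5, 1.0e5      # generic prior: mean failure rate 5e-6 per hour
    n_fail, hours = 2, 3.5e5      # plant-specific operating experience

    alpha_post, beta_post = alpha + n_fail, beta + hours   # conjugate update
    print(f"generic mean rate : {alpha / beta:.2e} /h")
    print(f"updated mean rate : {alpha_post / beta_post:.2e} /h")

    rng = np.random.default_rng(3)
    sample = rng.gamma(alpha_post, 1.0 / beta_post, 100_000)
    lo, hi = np.quantile(sample, [0.05, 0.95])
    print(f"90% interval      : ({lo:.2e}, {hi:.2e}) /h")
    ```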

  3. Reliability issues at the LHC

    CERN Multimedia

    CERN. Geneva. Audiovisual Unit; Gillies, James D

    2002-01-01

    The lectures on reliability issues at the LHC will focus on five main modules over five days. Module 1: Basic Elements in Reliability Engineering. Some basic terms, definitions and methods, from components up to the system and the plant, common cause failures and human factor issues. Module 2: Interrelations of Reliability & Safety (R&S). Reliability and the risk-informed approach, living models, risk monitoring. Module 3: The Ideal R&S Process for Large Scale Systems. From R&S goals via the implementation into the system to the proof of compliance. Module 4: Some Applications of R&S at the LHC. Master logic, anatomy of risk, cause-consequence diagram, decomposition and aggregation of the system. Module 5: Lessons Learned from R&S Application in Various Technologies. Success stories, pitfalls, constraints in data and methods, limitations per se, as experienced in aviation, space, process, nuclear, offshore and transport systems and plants. The lectures will reflect in summary the compromise in...

  4. Reliability analysis in intelligent machines

    Science.gov (United States)

    Mcinroy, John E.; Saridis, George N.

    1990-01-01

    Given an explicit task to be executed, an intelligent machine must be able to find the probability of success, or reliability, of alternative control and sensing strategies. By using concepts from information theory and reliability theory, new techniques for finding the reliability corresponding to alternative subsets of control and sensing strategies are proposed, such that a desired set of specifications can be satisfied. The analysis is straightforward, provided that a set of Gaussian random state variables is available. An example problem illustrates the technique, and general reliability results are presented for visual servoing with a computed torque-control algorithm. Moreover, the example illustrates the principle of increasing precision with decreasing intelligence at the execution level of an intelligent machine.

  5. Reliability analysis and operator modelling

    International Nuclear Information System (INIS)

    Hollnagel, Erik

    1996-01-01

    The paper considers the state of operator modelling in reliability analysis. Operator models are needed in reliability analysis because operators are needed in process control systems. HRA methods must therefore be able to account both for human performance variability and for the dynamics of the interaction. A selected set of first generation HRA approaches is briefly described in terms of the operator model they use, their classification principle, and the actual method they propose. In addition, two examples of second generation methods are also considered. It is concluded that first generation HRA methods generally have very simplistic operator models, either referring to the time-reliability relationship or to elementary information processing concepts. It is argued that second generation HRA methods must recognise that cognition is embedded in a context, and be able to account for that in the way human reliability is analysed and assessed

  6. Reliability and Maintainability (RAM) Training

    Science.gov (United States)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Packard, Michael H. (Editor)

    2000-01-01

    The theme of this manual is failure physics: the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low-cost reliable products. In a broader sense the manual should do more. It should underscore the urgent need for mature attitudes toward reliability. Five of the chapters were originally presented as a classroom course to over 1000 Martin Marietta engineers and technicians. Another four chapters and three appendixes have been added. We begin with a view of reliability from the years 1940 to 2000. Chapter 2 starts the training material with a review of mathematics and a description of what elements contribute to product failures. The remaining chapters elucidate basic reliability theory and the disciplines that allow us to control and eliminate failures.

  7. Estimating Functions with Prior Knowledge, (EFPK) for diffusions

    DEFF Research Database (Denmark)

    Nolsøe, Kim; Kessler, Mathieu; Madsen, Henrik

    2003-01-01

    In this paper a method is formulated in an estimating function setting for parameter estimation, which allows the use of prior information. The main idea is to use prior knowledge of the parameters, either specified as moment restrictions or as a distribution, and use it in the construction of an estimating function. The method may be useful when a full Bayesian analysis is difficult to carry out for computational reasons. This is almost always the case for diffusions, which are the focus of this paper, though the method applies in other settings.

  8. Learning priors for Bayesian computations in the nervous system.

    Directory of Open Access Journals (Sweden)

    Max Berniker

    Full Text Available Our nervous system continuously combines new information from our senses with information it has acquired throughout life. Numerous studies have found that human subjects manage this by integrating their observations with their previous experience (priors in a way that is close to the statistical optimum. However, little is known about the way the nervous system acquires or learns priors. Here we present results from experiments where the underlying distribution of target locations in an estimation task was switched, manipulating the prior subjects should use. Our experimental design allowed us to measure a subject's evolving prior while they learned. We confirm that through extensive practice subjects learn the correct prior for the task. We found that subjects can rapidly learn the mean of a new prior while the variance is learned more slowly and with a variable learning rate. In addition, we found that a Bayesian inference model could predict the time course of the observed learning while offering an intuitive explanation for the findings. The evidence suggests the nervous system continuously updates its priors to enable efficient behavior.
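
    The integration rule the subjects approximate, a precision-weighted blend of prior and observation with a prior that is itself updated trial by trial, can be caricatured in a few lines. This is a toy model with invented constants, not the authors' inference model; for brevity only the prior mean is learned, the component the paper finds is acquired quickly (the variance, learned more slowly in the study, is kept fixed here).

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    true_mean, true_sd, obs_sd = 10.0, 2.0, 1.5   # target distribution, sensory noise
    prior_mean, prior_sd, lr = 0.0, 2.0, 0.05     # initially wrong prior; learning rate

    for _ in range(500):
        target = rng.normal(true_mean, true_sd)
        obs = rng.normal(target, obs_sd)
        w = prior_sd**2 / (prior_sd**2 + obs_sd**2)   # weight on the observation
        estimate = (1 - w) * prior_mean + w * obs     # Bayes-optimal point estimate
        prior_mean += lr * (obs - prior_mean)         # slow updating of the prior mean

    print(f"learned prior mean: {prior_mean:.2f} (true {true_mean})")
    ```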

  9. Implicit Priors in Galaxy Cluster Mass and Scaling Relation Determinations

    Science.gov (United States)

    Mantz, A.; Allen, S. W.

    2011-01-01

    Deriving the total masses of galaxy clusters from observations of the intracluster medium (ICM) generally requires some prior information, in addition to the assumptions of hydrostatic equilibrium and spherical symmetry. Often, this information takes the form of particular parametrized functions used to describe the cluster gas density and temperature profiles. In this paper, we investigate the implicit priors on hydrostatic masses that result from this fully parametric approach, and the implications of such priors for scaling relations formed from those masses. We show that the application of such fully parametric models of the ICM naturally imposes a prior on the slopes of the derived scaling relations, favoring the self-similar model, and argue that this prior may be influential in practice. In contrast, this bias does not exist for techniques which adopt an explicit prior on the form of the mass profile but describe the ICM non-parametrically. Constraints on the slope of the cluster mass-temperature relation in the literature show a separation based on the approach employed, with the results from fully parametric ICM modeling clustering nearer the self-similar value. Given that a primary goal of scaling relation analyses is to test the self-similar model, the application of methods subject to strong, implicit priors should be avoided. Alternative methods and best practices are discussed.
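
    For reference, the hydrostatic mass estimate these analyses start from takes the standard form below (gas density rho_g, temperature T, mean molecular weight mu); the implicit prior arises through the parametrized forms chosen for the two logarithmic derivatives.

    ```latex
    % Hydrostatic mass from ICM density and temperature profiles (standard form)
    M(<r) = -\,\frac{k_{\mathrm{B}}\, T(r)\, r}{G\, \mu m_{\mathrm{p}}}
            \left( \frac{d\ln \rho_{\mathrm{g}}}{d\ln r} + \frac{d\ln T}{d\ln r} \right)
    ```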

  10. A risk-informed approach of quantification of epistemic uncertainty for the long-term radioactive waste disposal. Improving reliability of expert judgements with an advanced elicitation procedure

    International Nuclear Information System (INIS)

    Sugiyama, Daisuke; Chida, Taiji; Fujita, Tomonari; Tsukamoto, Masaki

    2011-01-01

    A methodology for quantifying epistemic uncertainty by expert judgement, based on a risk-informed approach, is developed to assess the inevitable uncertainty in the long-term safety assessment of radioactive waste disposal. The proposed method employs logic trees, by which options of models and/or scenarios are identified, and Evidential Support Logic (ESL), by which the possibility of each option is quantified. In this report, the effect of a feedback process of discussion between experts and input of state-of-the-art knowledge is discussed, in order to estimate changes in the distribution of expert judgements, which is one of the factors causing uncertainty. In a preliminary experiment quantifying the uncertainty of degradation of the engineered barrier materials in a tentative sub-surface disposal facility using the proposed methodology, experts themselves modified questions appropriately to facilitate sound judgements and to correlate them clearly with scientific evidence. The result suggests that the method effectively improves the confidence of expert judgement. The degree of consensus among expert judgements also improved in some cases, as scientific knowledge and expert judgements from other fields became common understanding. It is suggested that the proposed method could facilitate consensus on uncertainty between interested parties. (author)

  11. Psychometric properties of the 25-item Work Limitations Questionnaire in Japan: factor structure, validity, and reliability in information and communication technology company employees.

    Science.gov (United States)

    Kono, Yuko; Matsushima, Eisuke; Uji, Masayo

    2014-02-01

    The 25-item Work Limitations Questionnaire (WLQ-25) measures presenteeism but has not been sufficiently validated in a Japanese population. A total of 451 employees from four information technology companies in Tokyo completed the WLQ-25 and questionnaires of other variables on two occasions, 2 weeks apart. The WLQ-25 yielded a two-factor structure: Cognitive Demand and Physical Demand. These subscales showed good internal consistency, and both were associated with adverse working conditions, greater perceived job strain, lower skill use, poorer workplace social support, and less satisfactory psychological adjustment. Intraclass correlation coefficients of the two WLQ-25 subscales between time 1 and time 2 were 0.78 and 0.55, respectively. This study suggests acceptable psychometric properties of the WLQ-25 in Japan.

  12. Informe

    Directory of Open Access Journals (Sweden)

    Egon Lichetenberger

    1950-10-01

    Full Text Available Report by Dr. Egon Lichetenberger to the Governing Council of the Faculty on the specialization course in Pathological Anatomy sponsored by the Kellogg Foundation (Department of Pathology).

  13. Suncor maintenance and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Little, S. [Suncor Energy, Calgary, AB (Canada)

    2006-07-01

    Fleet maintenance and reliability at Suncor Energy was discussed in this presentation, with reference to Suncor Energy's primary and support equipment fleets. This paper also discussed Suncor Energy's maintenance and reliability standard involving people, processes and technology. An organizational maturity chart that graphed organizational learning against organizational performance was illustrated. The presentation also reviewed the maintenance and reliability framework; maintenance reliability model; the process overview of the maintenance and reliability standard; a process flow chart of maintenance strategies and programs; and an asset reliability improvement process flow chart. An example of an improvement initiative was included, with reference to a shovel reliability review; a dipper trip reliability investigation; bucket related failures by type and frequency; root cause analysis of the reliability process; and additional actions taken. Last, the presentation provided a graph of the results of the improvement initiative and presented the key lessons learned. tabs., figs.

  14. Of plants and reliability

    International Nuclear Information System (INIS)

    Schneider Horst

    2009-01-01

    Behind the political statements made about the transformer event at the Kruemmel nuclear power station (KKK) in the summer of 2009 there are fundamental issues of atomic law. Pursuant to Articles 20 and 28 of its Basic Law, Germany is a state in which the rule of law applies. Consequently, the aspects of atomic law associated with the incident merit a closer look, all the more so as the items concerned have been known for many years. Important aspects in the debate about the Kruemmel nuclear power plant are, on the one hand, the fact that the transformer is considered part of the nuclear power station under atomic law and thus a "plant" subject to surveillance by the nuclear regulatory agencies and, on the other hand, the reliability under atomic law of the operator and the responsible executive personnel. Both "plant" and "reliability" are terms focusing on nuclear safety. Hence the question to what extent safety was affected in the Kruemmel incident. The classification of the event as 0 (no or only a very slight safety impact) on the International Nuclear Event Scale (INES) should not be used to put the safety issue aside once and for all. Points of fact and their technical significance must be considered prior to any legal assessment. Legal assessments and regulations are tied to facts and circumstances. Any legal examination is based on the facts as determined and elucidated. Any other procedure would be tantamount to an inadmissible advance legal conviction. Now, what is the position of political statements, i.e. political assessments and political responsibility? If everything is done the correct way, they come at the end, after exploration of the facts and evaluation under applicable law. Sometimes things are handled differently, with consequences which are not very helpful. In the light of the provisions about the rule of law laid down in the Basic Law, the new federal government should be made to observe the proper sequence of

  15. Fuel reliability experience in Finland

    International Nuclear Information System (INIS)

    Kekkonen, L.

    2015-01-01

    Four nuclear reactors have now operated in Finland for 35-38 years. The two VVER-440 units at the Loviisa Nuclear Power Plant are operated by Fortum, and the two BWRs at Olkiluoto are operated by Teollisuuden Voima Oyj (TVO). The fuel reliability experience of the four reactors currently operating in Finland has been very good, and fuel failure rates have been very low. Systematic inspection of spent fuel assemblies, and especially of all failed assemblies, is a good practice employed in Finland to improve fuel reliability and operational safety. Investigation of the root cause of fuel failures is important for developing ways to prevent similar failures in the future. The operational and fuel reliability experience at the Loviisa Nuclear Power Plant has also been reported earlier in the international seminars on WWER Fuel Performance, Modelling and Experimental Support. In this paper the information on fuel reliability experience at Loviisa NPP is updated, and a short summary of the fuel reliability experience at Olkiluoto NPP is also given. Keywords: VVER-440, fuel reliability, operational experience, poolside inspections, fuel failure identification. (author)

  16. Inverse problems with non-trivial priors: efficient solution through sequential Gibbs sampling

    DEFF Research Database (Denmark)

    Hansen, Thomas Mejer; Cordua, Knud Skou; Mosegaard, Klaus

    2012-01-01

    Markov chain Monte Carlo methods such as the Gibbs sampler and the Metropolis algorithm can be used to sample solutions to non-linear inverse problems. In principle, these methods allow incorporation of prior information of arbitrary complexity. If an analytical closed-form description of the prior is available, which is the case when the prior can be described by a multidimensional Gaussian distribution, such prior information can easily be considered. In reality, prior information is often more complex than can be described by the Gaussian model, and no closed-form expression of the prior can be given. We propose an algorithm, called sequential Gibbs sampling, allowing the Metropolis algorithm to efficiently incorporate complex priors into the solution of an inverse problem, also for the case where no closed-form description of the prior exists. First, we lay out the theoretical background…
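
    Whatever the prior sampler, the key mechanism can be sketched generically: if proposals are drawn so that the prior is preserved (as sequential Gibbs resimulation does), the Metropolis acceptance test reduces to a likelihood ratio, and the prior never has to be written in closed form. A sketch under that assumption, not the authors' code:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def extended_metropolis(m0, perturb_from_prior, log_like, n_iter=5000):
        # perturb_from_prior(m) must return a model consistent with the prior,
        # e.g. by resimulating part of m conditional on the rest; acceptance
        # then needs only the likelihood ratio.
        m, ll = m0, log_like(m0)
        chain = [m0]
        for _ in range(n_iter):
            m_new = perturb_from_prior(m)
            ll_new = log_like(m_new)
            if np.log(rng.uniform()) < ll_new - ll:   # likelihood ratio only
                m, ll = m_new, ll_new
            chain.append(m)
        return chain
    ```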

  17. CHANGES IN PRIOR PERIOD ERRORS

    OpenAIRE

    Alina Pietraru; Dorina Luţă

    2007-01-01

    In 2007, Romania continued the gradual implementation of the International Financial Reporting Standards, which include the IFRS, the IAS and their Interpretations as approved by the European Union, translated and published in Romanian. The financial statements must achieve one major purpose, namely to provide information about the financial position, financial performance and changes in financial position of the entity. In this context, the accounting policy elaborated and assumed by the m...

  18. The relationship between cost estimates reliability and BIM adoption: SEM analysis

    Science.gov (United States)

    Ismail, N. A. A.; Idris, N. H.; Ramli, H.; Rooshdi, R. R. Raja Muhammad; Sahamir, S. R.

    2018-02-01

    This paper presents the usage of the Structural Equation Modelling (SEM) approach in analysing the effects of Building Information Modelling (BIM) technology adoption in improving the reliability of cost estimates. Based on the questionnaire survey results, SEM analysis using the SPSS-AMOS application examined the relationships between BIM-improved information and cost estimates reliability factors, leading to BIM technology adoption. Six hypotheses were established prior to SEM analysis, employing two types of SEM models, namely the Confirmatory Factor Analysis (CFA) model and the full structural model. The SEM models were then validated through assessment of their uni-dimensionality, validity, reliability, and fitness index, in line with the hypotheses tested. The final SEM model fit measures are: P-value=0.000, RMSEA=0.079, TLI=0.956>0.90, NFI=0.935>0.90 and ChiSq/df=2.259, indicating that the overall index values achieved the required level of model fitness. The model supports all the hypotheses evaluated, confirming that all relationships among the constructs are positive and significant. Ultimately, the analysis verified that most of the respondents foresee better understanding of project input information through BIM visualization, its reliable database and coordinated data, in developing more reliable cost estimates. They also expect BIM adoption to accelerate their cost estimating tasks.

  19. Divergent Priors and well Behaved Bayes Factors

    NARCIS (Netherlands)

    R.W. Strachan (Rodney); H.K. van Dijk (Herman)

    2011-01-01

    Divergent priors are improper when defined on unbounded supports. Bartlett's paradox has been taken to imply that using improper priors results in ill-defined Bayes factors, preventing model comparison by posterior probabilities. However, many improper priors have attractive properties…

  20. Generalized multiple kernel learning with data-dependent priors.

    Science.gov (United States)

    Mao, Qi; Tsang, Ivor W; Gao, Shenghua; Wang, Li

    2015-06-01

    Multiple kernel learning (MKL) and classifier ensemble are two mainstream methods for solving learning problems in which some sets of features/views are more informative than others, or the features/views within a given set are inconsistent. In this paper, we first present a novel probabilistic interpretation of MKL such that maximum entropy discrimination with a noninformative prior over multiple views is equivalent to the formulation of MKL. Instead of using the noninformative prior, we introduce a novel data-dependent prior based on an ensemble of kernel predictors, which enhances the prediction performance of MKL by leveraging the merits of the classifier ensemble. With the proposed probabilistic framework of MKL, we propose a hierarchical Bayesian model to learn the proposed data-dependent prior and classification model simultaneously. The resultant problem is convex and other information (e.g., instances with either missing views or missing labels) can be seamlessly incorporated into the data-dependent priors. Furthermore, a variety of existing MKL models can be recovered under the proposed MKL framework and can be readily extended to incorporate these priors. Extensive experiments demonstrate the benefits of our proposed framework in supervised and semisupervised settings, as well as in tasks with partial correspondence among multiple views.
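    The object at the heart of any MKL formulation is a nonnegative weighted sum of base kernels. A minimal numpy sketch of that combination, with RBF base kernels and fixed illustrative weights (learning the weights, which the paper does through its hierarchical Bayesian model, is not shown):

    ```python
    import numpy as np

    def rbf_kernel(X, Y, gamma):
        """K_k(x, y) = exp(-gamma * ||x - y||^2) for all pairs of rows."""
        d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-gamma * d2)

    def combined_kernel(X, Y, gammas, betas):
        """MKL-style combination K = sum_k beta_k * K_k with beta_k >= 0."""
        K = np.zeros((X.shape[0], Y.shape[0]))
        for gamma, beta in zip(gammas, betas):
            K += beta * rbf_kernel(X, Y, gamma)
        return K

    X = np.random.default_rng(0).standard_normal((5, 3))
    print(combined_kernel(X, X, gammas=[0.1, 1.0], betas=[0.7, 0.3]).shape)
    ```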

  1. Human Reliability Program Overview

    Energy Technology Data Exchange (ETDEWEB)

    Bodin, Michael

    2012-09-25

    This presentation covers the high points of the Human Reliability Program, including certification/decertification, critical positions, due process, organizational structure, program components, personnel security, an overview of the US DOE reliability program, retirees and academia, and security program integration.

  2. Power electronics reliability analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Mark A.; Atcitty, Stanley

    2009-12-01

    This report provides the DOE and industry with a general process for analyzing power electronics reliability. The analysis can help with understanding the main causes of failures, downtime, and cost and how to reduce them. One approach is to collect field maintenance data and use it directly to calculate reliability metrics related to each cause. Another approach is to model the functional structure of the equipment using a fault tree to derive system reliability from component reliability. Analysis of a fictitious device demonstrates the latter process. Optimization can use the resulting baseline model to decide how to improve reliability and/or lower costs. It is recommended that both electric utilities and equipment manufacturers make provisions to collect and share data in order to lay the groundwork for improving reliability into the future. Reliability analysis helps guide reliability improvements in hardware and software technology including condition monitoring and prognostics and health management.
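    As a toy version of the fault-tree derivation described above, system reliability can be composed from component reliabilities with series and parallel blocks, assuming independent failures; the component values below are made up:

    ```python
    def series(*r):
        """All components must work: R = product of component reliabilities."""
        out = 1.0
        for ri in r:
            out *= ri
        return out

    def parallel(*r):
        """System works if any redundant component works: R = 1 - prod(1 - r_i)."""
        q = 1.0
        for ri in r:
            q *= 1.0 - ri
        return 1.0 - q

    # fictitious converter: two redundant switching legs feeding one controller
    print(series(parallel(0.95, 0.95), 0.99))  # -> 0.987525
    ```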

  3. Reliability of software

    International Nuclear Information System (INIS)

    Kopetz, H.

    1980-01-01

    Common factors and differences in the reliability of hardware and software; reliability increase by means of methods of software redundancy. Maintenance of software for long term operating behavior. (HP) [de

  4. Pocket Handbook on Reliability

    Science.gov (United States)

    1975-09-01

    exponential distributions, Weibull distribution, estimating reliability, confidence intervals, reliability growth, O.C. curves, Bayesian analysis. … An introduction for those not familiar with reliability and a good refresher for those who are currently working in the area. … includes one or both of the following objectives: a) prediction of the current system reliability, b) projection of the system reliability for some future…

  5. FRELIB, Failure Reliability Index Calculation

    International Nuclear Information System (INIS)

    Parkinson, D.B.; Oestergaard, C.

    1984-01-01

    1 - Description of problem or function: Calculation of the reliability index given the failure boundary. A linearization point (design point) is found on the failure boundary for a stationary reliability index (min) and a stationary failure probability density function along the failure boundary, provided that the basic variables are normally distributed. 2 - Method of solution: Iteration along the failure boundary which must be specified - together with its partial derivatives with respect to the basic variables - by the user in a subroutine FSUR. 3 - Restrictions on the complexity of the problem: No distribution information included (first-order-second-moment-method). 20 basic variables (could be extended)
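    For the simplest case such a code handles, a linear failure boundary g = R - S with independent normally distributed basic variables, the reliability index and failure probability have closed forms; a short sketch with illustrative numbers (FRELIB itself iterates on a user-supplied boundary, which is not reproduced here):

    ```python
    import math

    def beta_linear(mu_r, sigma_r, mu_s, sigma_s):
        """First-order reliability index for g = R - S, R and S independent normal."""
        return (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)

    def failure_probability(beta):
        """P_f = Phi(-beta) under the normality assumption."""
        return 0.5 * math.erfc(beta / math.sqrt(2.0))

    b = beta_linear(mu_r=100.0, sigma_r=10.0, mu_s=60.0, sigma_s=15.0)
    print(b, failure_probability(b))  # beta ~ 2.22, P_f ~ 0.013
    ```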

  6. Principles of Bridge Reliability

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Nowak, Andrzej S.

    The paper gives a brief introduction to the basic principles of structural reliability theory and its application to bridge engineering. Fundamental concepts like failure probability and reliability index are introduced. Ultimate as well as serviceability limit states for bridges are formulated, and as an example the reliability profile and a sensitivity analysis for a corroded reinforced concrete bridge are shown.

  7. Reliability in engineering '87

    International Nuclear Information System (INIS)

    Tuma, M.

    1987-01-01

    The participants heard 51 papers dealing with the reliability of engineering products. Two of the papers were incorporated in INIS, namely ''Reliability comparison of two designs of low pressure regeneration of the 1000 MW unit at the Temelin nuclear power plant'' and ''Use of probability analysis of reliability in designing nuclear power facilities.''(J.B.)

  8. Iterated random walks with shape prior

    DEFF Research Database (Denmark)

    Pujadas, Esmeralda Ruiz; Kjer, Hans Martin; Piella, Gemma

    2016-01-01

    We propose a new framework for image segmentation using random walks where a distance shape prior is combined with a region term. The shape prior is weighted by a confidence map to reduce the influence of the prior in high-gradient areas, and the region term is computed with k-means to estimate the parametric probability density function. Then, random walks is performed iteratively, aligning the prior with the current segmentation in every iteration. We tested the proposed approach with natural and medical images and compared it with the latest techniques with random walks and shape priors. The experiments suggest that this method gives promising results for medical and natural images.

  9. Reliability and optimization of structural systems

    International Nuclear Information System (INIS)

    Thoft-Christensen, P.

    1987-01-01

    The proceedings contain 28 papers presented at the 1st working conference. The working conference was organized by the IFIP Working Group 7.5. The proceedings also include 4 papers which were submitted, but for various reasons not presented at the working conference. The working conference was attended by 50 participants from 18 countries. The conference was the first scientific meeting of the new IFIP Working Group 7.5 on 'Reliability and Optimization of Structural Systems'. The purpose of the Working Group 7.5 is to promote modern structural system optimization and reliability theory, to advance international cooperation in the field, to stimulate research, development and application of structural system optimization and reliability theory, to further the dissemination and exchange of information on the subject, and to encourage education in structural system optimization and reliability theory. (orig./HP)

  10. A Noninformative Prior on a Space of Distribution Functions

    Directory of Open Access Journals (Sweden)

    Alexander Terenin

    2017-07-01

    Full Text Available In a given problem, the Bayesian statistical paradigm requires the specification of a prior distribution that quantifies relevant information about the unknowns of main interest external to the data. In cases where little such information is available, the problem under study may possess an invariance under a transformation group that encodes a lack of information, leading to a unique prior—this idea was explored at length by E.T. Jaynes. Previous successful examples have included location-scale invariance under linear transformation, multiplicative invariance of the rate at which events in a counting process are observed, and the derivation of the Haldane prior for a Bernoulli success probability. In this paper we show that this method can be extended, by generalizing Jaynes, in two ways: (1) to yield families of approximately invariant priors; and (2) to the infinite-dimensional setting, yielding families of priors on spaces of distribution functions. Our results can be used to describe conditions under which a particular Dirichlet Process posterior arises from an optimal Bayesian analysis, in the sense that invariances in the prior and likelihood lead to one and only one posterior distribution.

  11. Problematics of Reliability of Road Rollers

    Science.gov (United States)

    Stawowiak, Michał; Kuczaj, Mariusz

    2018-06-01

    This article refers to the reliability of road rollers used in a selected roadworks company. Information on the method of servicing road rollers and on how that service affects their reliability is presented. Attention was paid to the implemented maintenance plan with regard to the machines' operational time. The reliability of the road rollers was analysed by determining and interpreting readiness coefficients.

  12. 78 FR 56242 - Agency Information Collection Activities: Prior Disclosure

    Science.gov (United States)

    2013-09-12

    ... technology; and (e) the annual cost burden to respondents or record keepers from the collection of... Number of Respondents: 3,500. Estimated Number of Annual Responses: 3,500. Estimated Time per Response: 1...

  13. Partitioning of genomic variance using prior biological information

    DEFF Research Database (Denmark)

    Edwards, Stefan McKinnon; Janss, Luc; Madsen, Per

    2013-01-01

    …variants influence complex diseases. Despite the successes, the variants identified as being statistically significant have generally explained only a small fraction of the heritable component of the trait, the so-called problem of missing heritability. Insufficient modelling of the underlying genetic architecture may in part explain this missing heritability. Evidence collected across genome-wide association studies in human provides insight into the genetic architecture of complex traits. Although many genetic variants with small or moderate effects contribute to the overall genetic variation, it appears that the associated genetic variants are enriched for genes that are connected in biological pathways or for likely functional effects on genes. These biological findings provide valuable insight for developing better genomic models. These are statistical models for predicting complex trait phenotypes on the basis…

  14. increased relevance and influence of free prior informed consent

    African Journals Online (AJOL)


  15. Reliable computer systems.

    Science.gov (United States)

    Wear, L L; Pinkert, J R

    1993-11-01

    In this article, we looked at some decisions that apply to the design of reliable computer systems. We began with a discussion of several terms such as testability, then described some systems that call for highly reliable hardware and software. The article concluded with a discussion of methods that can be used to achieve higher reliability in computer systems. Reliability and fault tolerance in computers probably will continue to grow in importance. As more and more systems are computerized, people will want assurances about the reliability of these systems, and their ability to work properly even when sub-systems fail.

  16. Novel approach for evaluation of service reliability for electricity customers

    Institute of Scientific and Technical Information of China (English)

    Jiang, John N.

    2009-01-01

    Understanding the value of reliability for electricity customers is important to market-based reliability management. This paper proposes a novel approach to evaluating the reliability for electricity customers by using an indifference curve between economic compensation for power interruption and service reliability of electricity. The indifference curve is formed by calculating different planning schemes of network expansion for different reliability requirements of customers, which reveals the economic value of different reliability levels for electricity customers, so that reliability based on a market supply-demand mechanism can be established and economic signals can be provided for reliability management and enhancement.

  17. Reliability Characteristics of Power Plants

    Directory of Open Access Journals (Sweden)

    Zbynek Martinek

    2017-01-01

    Full Text Available This paper describes the phenomenon of reliability of power plants. It explains the terms connected with this topic, as their proper understanding is important for understanding the relations and equations which model possible real situations. The reliability phenomenon is analysed using both the exponential distribution and the Weibull distribution. The results of our analysis are specific equations giving information about the characteristics of the power plants, the mean time of operation and the probability of failure-free operation. Equations solved for the Weibull distribution take into account both the failures and the actual operating hours. Thanks to our results, we are able to create a model of dynamic reliability for prediction of future states. It can be useful for improving the current situation of the unit as well as for creating an optimal maintenance plan, and thus have an impact on the overall economics of the operation of these power plants.
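    For reference, the two characteristics the abstract mentions, the probability of failure-free operation and the mean time to failure, follow directly from the Weibull model; a brief sketch with illustrative shape and scale values:

    ```python
    import math

    def weibull_reliability(t, eta, beta):
        """Probability of failure-free operation to time t: R(t) = exp(-(t/eta)^beta)."""
        return math.exp(-((t / eta) ** beta))

    def weibull_mttf(eta, beta):
        """Mean time to failure: MTTF = eta * Gamma(1 + 1/beta)."""
        return eta * math.gamma(1.0 + 1.0 / beta)

    print(weibull_reliability(5000.0, eta=10000.0, beta=1.5))  # ~0.70
    print(weibull_mttf(eta=10000.0, beta=1.5))                 # ~9027 hours
    ```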

  18. Gearbox Reliability Collaborative High Speed Shaft Tapered Roller Bearing Calibration

    Energy Technology Data Exchange (ETDEWEB)

    Keller, J.; Guo, Y.; McNiff, B.

    2013-10-01

    The National Renewable Energy Laboratory (NREL) Gearbox Reliability Collaborative (GRC) is a project investigating gearbox reliability primarily through testing and modeling. Previous dynamometer testing focused upon acquiring measurements in the planetary section of the test gearbox. Prior to these tests, the strain gages installed on the planetary bearings were calibrated in a load frame.

  19. Statistical Bayesian method for reliability evaluation based on ADT data

    Science.gov (United States)

    Lu, Dawei; Wang, Lizhi; Sun, Yusheng; Wang, Xiaohong

    2018-05-01

    Accelerated degradation testing (ADT) is frequently conducted in the laboratory to predict products’ reliability under normal operating conditions. Two kinds of methods, degradation path models and stochastic process models, are utilized to analyze degradation data, and the latter is the more popular. However, some limitations, such as an imprecise solution process and estimation of the degradation ratio, still exist, which may affect the accuracy of the acceleration model and the extrapolated value. Moreover, the usual solution to this problem, the Bayesian method, loses key information when unifying the degradation data. In this paper, a new data processing and parameter inference method based on the Bayesian method is proposed to handle degradation data and solve the problems above. First, a Wiener process and an acceleration model are chosen; second, the initial values of the degradation model and the parameters of the prior and posterior distributions under each stress level are calculated, with updating and iteration of the estimates; third, the lifetime and reliability values are estimated on the basis of the estimated parameters; finally, a case study is provided to demonstrate the validity of the proposed method. The results illustrate that the proposed method is quite effective and accurate in estimating the lifetime and reliability of a product.
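    To make the Wiener-process setting concrete, the sketch below (a hedged illustration, not the paper's algorithm) simulates a linear-drift degradation path and applies a conjugate normal update to the drift parameter, assuming the diffusion coefficient is known:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def simulate_wiener(mu, sigma, dt, n_steps):
        """Degradation path X(t) = mu*t + sigma*B(t), sampled at steps of dt."""
        increments = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_steps)
        return np.cumsum(increments)

    def posterior_drift(increments, dt, sigma, mu0, tau0):
        """Conjugate normal update for the drift mu, with known sigma and
        prior mu ~ N(mu0, tau0**2); increments are X(t+dt) - X(t)."""
        n = len(increments)
        prec = 1.0 / tau0 ** 2 + n * dt / sigma ** 2
        mean = (mu0 / tau0 ** 2 + increments.sum() / sigma ** 2) / prec
        return mean, 1.0 / np.sqrt(prec)

    path = simulate_wiener(mu=0.5, sigma=0.2, dt=1.0, n_steps=100)
    inc = np.diff(np.concatenate([[0.0], path]))
    print(posterior_drift(inc, dt=1.0, sigma=0.2, mu0=0.0, tau0=1.0))
    ```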

  20. Gearbox Reliability Collaborative Phase 3 Gearbox 2 Test Plan

    Energy Technology Data Exchange (ETDEWEB)

    Link, H.; Keller, J.; Guo, Y.; McNiff, B.

    2013-04-01

    Gearboxes in wind turbines have not been achieving their expected design life even though they commonly meet or exceed the design criteria specified in current design standards. One of the basic premises of the National Renewable Energy Laboratory (NREL) Gearbox Reliability Collaborative (GRC) is that the low gearbox reliability results from the absence of critical elements in the design process or insufficient design tools. Key goals of the GRC are to improve design approaches and analysis tools and to recommend practices and test methods resulting in improved design standards for wind turbine gearboxes that lower the cost of energy (COE) through improved reliability. The GRC uses a combined gearbox testing, modeling and analysis approach, along with a database of information from gearbox failures collected from overhauls and investigation of gearbox condition monitoring techniques, to improve wind turbine operations and maintenance practices. Gearbox 2 (GB2) will be tested using the two-speed turbine controller that has been used in prior testing. This test series will investigate non-torque loads, high-speed shaft misalignment, and reproduction of field conditions in the dynamometer. It will also include vibration testing using an eddy-current brake on the gearbox's high-speed shaft.

  1. An architectural model for software reliability quantification: sources of data

    International Nuclear Information System (INIS)

    Smidts, C.; Sova, D.

    1999-01-01

    Software reliability assessment models in use today treat software as a monolithic block. An aversion towards 'atomic' models seems to exist. These models appear to add complexity to the modeling and to the data collection, and seem intrinsically difficult to generalize. In 1997, we introduced an architecturally based software reliability model called FASRE. The model is based on an architecture derived from the requirements which captures both functional and nonfunctional requirements and on a generic classification of functions, attributes and failure modes. The model focuses on evaluation of failure mode probabilities and uses a Bayesian quantification framework. Failure mode probabilities of functions and attributes are propagated to the system level using fault trees. It can incorporate any type of prior information, such as results of developers' testing or historical information on a specific functionality and its attributes, and is ideally suited for reusable software. By building an architecture and deriving its potential failure modes, the model forces early appraisal and understanding of the weaknesses of the software, allows reliability analysis of the structure of the system, and provides assessments at a functional level as well as at the system level. In order to quantify the probability of failure (or the probability of success) of a specific element of our architecture, data are needed. The term element of the architecture is used here in its broadest sense to mean a single failure mode or a higher level of abstraction such as a function. The paper surveys the potential sources of software reliability data available during software development. Next, the mechanisms for incorporating these sources of relevant data into the FASRE model are identified

  2. Effects of regularisation priors on dynamic PET Data

    International Nuclear Information System (INIS)

    Caldeira, Liliana; Scheins, Juergen; Silva, Nuno da; Gaens, Michaela; Shah, N Jon

    2014-01-01

    Dynamic PET provides temporal information about tracer uptake. However, each PET frame has usually low statistics, resulting in noisy images. The goal is to study effects of prior regularisation on dynamic PET data. Quantification and noise in image-domain and time-domain as well as impact on parametric images is assessed.

  3. The reliability of the information provided by the individuals about their habitual tongue position

    Directory of Open Access Journals (Sweden)

    Ana Fernanda Rodrigues Cardoso

    2011-04-01

    Full Text Available PURPOSE: to check the reliability of the information provided by adults and children about their habitual tongue position. METHODS: 30 children and 30 adults were investigated at two moments, separated by a minimum of seven and a maximum of twenty-one days. Initially, the habitual tongue position was observed. Next, the participant was questioned about his or her habitual position. After the answer, the tongue was stimulated with a wooden spatula in order to increase perception, and the individual was then questioned again. The participant was then instructed to observe where the tongue habitually rests in the oral cavity until the second assessment, at which point the participant was again questioned about his or her habitual tongue position. The data were analysed with the Kappa statistic. RESULTS: it was not possible to visualise the habitual tongue position in 100% of the sample. The overall reliability of the answers was classified as between slight and fair. The children gave inconsistent and quite diverse answers; among the adults, some gave correct answers at the first questioning, while others gave reliable answers only after stimulation of intra-oral perception. CONCLUSIONS: the reliability of the information provided by the individuals in the sample about their habitual tongue position varies between slight and fair and is therefore low, in children as well as in adults. A possible strategy for clinical speech-language practice is to question the patient about tongue position after a certain period of observation.

  4. Basics of Bayesian reliability estimation from attribute test data

    International Nuclear Information System (INIS)

    Martz, H.F. Jr.; Waller, R.A.

    1975-10-01

    The basic notions of Bayesian reliability estimation from attribute lifetest data are presented in an introductory and expository manner. Both Bayesian point and interval estimates of the probability of surviving the lifetest, the reliability, are discussed. The necessary formulas are simply stated, and examples are given to illustrate their use. In particular, a binomial model in conjunction with a beta prior model is considered. Particular attention is given to the procedure for selecting an appropriate prior model in practice. Empirical Bayes point and interval estimates of reliability are discussed and examples are given. 7 figures, 2 tables
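    The binomial-likelihood, beta-prior pairing the abstract describes is conjugate, so the posterior is available in closed form; a brief sketch with made-up lifetest numbers:

    ```python
    from scipy import stats

    a, b = 4.0, 1.0      # illustrative Beta(a, b) prior on the reliability R
    n, s = 20, 18        # attribute test: s survivors out of n units

    posterior = stats.beta(a + s, b + (n - s))   # Beta(a + s, b + n - s)
    print(posterior.mean())                      # Bayes point estimate of R
    print(posterior.interval(0.90))              # 90% credible interval
    ```

    The posterior mean (a+s)/(a+b+n) shrinks the observed survival fraction toward the prior mean, which is exactly how prior engineering judgement enters the estimate.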

  5. Source Localization by Entropic Inference and Backward Renormalization Group Priors

    Directory of Open Access Journals (Sweden)

    Nestor Caticha

    2015-04-01

    Full Text Available A systematic method of transferring information from coarser to finer resolution based on renormalization group (RG) transformations is introduced. It permits building informative priors in finer scales from posteriors in coarser scales since, under some conditions, RG transformations in the space of hyperparameters can be inverted. These priors are updated using renormalized data into posteriors by Maximum Entropy. The resulting inference method, backward RG (BRG) priors, is tested by doing simulations of a functional magnetic resonance imaging (fMRI) experiment. Its results are compared with a Bayesian approach working in the finest available resolution. Using BRG priors, sources can be partially identified even when signal-to-noise ratio levels are down to ~ -25 dB, improving vastly on the single-step Bayesian approach. For low levels of noise the BRG prior is not an improvement over the single-scale Bayesian method. Analysis of the histograms of hyperparameters can show how to distinguish whether the method is failing, due to very high levels of noise, or whether the identification of the sources is, at least partially, possible.

  6. Valid MR imaging predictors of prior knee arthroscopy

    International Nuclear Information System (INIS)

    Discepola, Federico; Le, Huy B.Q.; Park, John S.; Clopton, Paul; Knoll, Andrew N.; Austin, Matthew J.; Resnick, Donald L.

    2012-01-01

    To determine whether fibrosis of the medial patellar reticulum (MPR), lateral patellar reticulum (LPR), deep medial aspect of Hoffa's fat pad (MDH), or deep lateral aspect of Hoffa's fat pad (LDH) is a valid predictor of prior knee arthroscopy. Institutional review board approval and waiver of informed consent were obtained for this HIPAA-compliant study. Initially, fibrosis of the MPR, LPR, MDH, or LDH in MR imaging studies of 50 patients with prior knee arthroscopy and 100 patients without was recorded. Subsequently, two additional radiologists, blinded to clinical data, retrospectively and independently recorded the presence of fibrosis of the MPR in 50 patients with prior knee arthroscopy and 50 without. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy for detecting the presence of fibrosis in the MPR were calculated. κ statistics were used to analyze inter-observer agreement. Fibrosis of each of the regions examined during the first portion of the study showed a significant association with prior knee arthroscopy (p < 0.005 for each). A patient with fibrosis of the MPR, LDH, or LPR was 45.5, 9, or 3.7 times more likely, respectively, to have had a prior knee arthroscopy. Logistic regression analysis indicated that fibrosis of the MPR supplanted the diagnostic utility of identifying fibrosis of the LPR, LDH, or MDH, or combinations of these (p ≥ 0.09 for all combinations). In the second portion of the study, fibrosis of the MPR demonstrated a mean sensitivity of 82%, specificity of 72%, PPV of 75%, NPV of 81%, and accuracy of 77% for predicting prior knee arthroscopy. Analysis of MR images can be used to determine if a patient has had prior knee arthroscopy by identifying fibrosis of the MPR, LPR, MDH, or LDH. Fibrosis of the MPR was the strongest predictor of prior knee arthroscopy. (orig.)
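    The reported figures can be approximately reproduced from a 2x2 table; the counts below are one table consistent with the stated 50/50 split and the published sensitivity and specificity (they are inferred for illustration, not taken from the paper):

    ```python
    def diagnostic_metrics(tp, fp, fn, tn):
        """Standard 2x2-table metrics for a binary predictor of a condition."""
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
            "accuracy": (tp + tn) / (tp + fp + fn + tn),
        }

    # MPR fibrosis vs. prior arthroscopy, counts inferred from the abstract
    print(diagnostic_metrics(tp=41, fp=14, fn=9, tn=36))
    # -> sensitivity 0.82, specificity 0.72, ppv ~0.75, npv ~0.80, accuracy 0.77
    ```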

  7. Valid MR imaging predictors of prior knee arthroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Discepola, Federico; Le, Huy B.Q. [McGill University Health Center, Jewsih General Hospital, Division of Musculoskeletal Radiology, Montreal, Quebec (Canada); Park, John S. [Annapolis Radiology Associates, Division of Musculoskeletal Radiology, Annapolis, MD (United States); Clopton, Paul; Knoll, Andrew N.; Austin, Matthew J.; Resnick, Donald L. [University of California San Diego (UCSD), Division of Musculoskeletal Radiology, San Diego, CA (United States)

    2012-01-15

    To determine whether fibrosis of the medial patellar reticulum (MPR), lateral patellar reticulum (LPR), deep medial aspect of Hoffa's fat pad (MDH), or deep lateral aspect of Hoffa's fat pad (LDH) is a valid predictor of prior knee arthroscopy. Institutional review board approval and waiver of informed consent were obtained for this HIPPA-compliant study. Initially, fibrosis of the MPR, LPR, MDH, or LDH in MR imaging studies of 50 patients with prior knee arthroscopy and 100 patients without was recorded. Subsequently, two additional radiologists, blinded to clinical data, retrospectively and independently recorded the presence of fibrosis of the MPR in 50 patients with prior knee arthroscopy and 50 without. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and accuracy for detecting the presence of fibrosis in the MPR were calculated. {kappa} statistics were used to analyze inter-observer agreement. Fibrosis of each of the regions examined during the first portion of the study showed a significant association with prior knee arthroscopy (p < 0.005 for each). A patient with fibrosis of the MPR, LDH, or LPR was 45.5, 9, or 3.7 times more likely, respectively, to have had a prior knee arthroscopy. Logistic regression analysis indicated that fibrosis of the MPR supplanted the diagnostic utility of identifying fibrosis of the LPR, LDH, or MDH, or combinations of these (p {>=} 0.09 for all combinations). In the second portion of the study, fibrosis of the MPR demonstrated a mean sensitivity of 82%, specificity of 72%, PPV of 75%, NPV of 81%, and accuracy of 77% for predicting prior knee arthroscopy. Analysis of MR images can be used to determine if a patient has had prior knee arthroscopy by identifying fibrosis of the MPR, LPR, MDH, or LDH. Fibrosis of the MPR was the strongest predictor of prior knee arthroscopy. (orig.)

  8. Penalised Complexity Priors for Stationary Autoregressive Processes

    KAUST Repository

    Sørbye, Sigrunn Holbek; Rue, Haavard

    2017-01-01

    The autoregressive (AR) process of order p (AR(p)) is a central model in time series analysis. A Bayesian approach requires the user to define a prior distribution for the coefficients of the AR(p) model. Although it is easy to write down some prior, it is not at all obvious how to understand and interpret the prior distribution, to ensure that it behaves according to the users' prior knowledge. In this article, we approach this problem using the recently developed ideas of penalised complexity (PC) priors. These priors have important properties like robustness and invariance to reparameterisations, as well as a clear interpretation. A PC prior is computed based on specific principles, where model component complexity is penalised in terms of deviation from simple base model formulations. In the AR(1) case, we discuss two natural base model choices, corresponding to either independence in time or no change in time. The latter case is illustrated in a survival model with possible time-dependent frailty. For higher-order processes, we propose a sequential approach, where the base model for AR(p) is the corresponding AR(p-1) model expressed using the partial autocorrelations. The properties of the new prior distribution are compared with the reference prior in a simulation study.

  9. Penalised Complexity Priors for Stationary Autoregressive Processes

    KAUST Repository

    Sørbye, Sigrunn Holbek

    2017-05-25

    The autoregressive (AR) process of order p (AR(p)) is a central model in time series analysis. A Bayesian approach requires the user to define a prior distribution for the coefficients of the AR(p) model. Although it is easy to write down some prior, it is not at all obvious how to understand and interpret the prior distribution, to ensure that it behaves according to the users' prior knowledge. In this article, we approach this problem using the recently developed ideas of penalised complexity (PC) priors. These priors have important properties like robustness and invariance to reparameterisations, as well as a clear interpretation. A PC prior is computed based on specific principles, where model component complexity is penalised in terms of deviation from simple base model formulations. In the AR(1) case, we discuss two natural base model choices, corresponding to either independence in time or no change in time. The latter case is illustrated in a survival model with possible time-dependent frailty. For higher-order processes, we propose a sequential approach, where the base model for AR(p) is the corresponding AR(p-1) model expressed using the partial autocorrelations. The properties of the new prior distribution are compared with the reference prior in a simulation study.

  10. Reliability of information on varicella history in preschool children

    Directory of Open Access Journals (Sweden)

    Lúcia Ferro Bricks

    2007-01-01

    Full Text Available OBJECTIVE: To verify how reliable the information provided by parents about the history of varicella in their children is. METHODS: 204 parents of previously healthy children attending two municipal day-care centers of São Paulo city were interviewed between August 2003 and September 2005. A standardized form was filled out with information regarding age, sex, history of varicella and other diseases, drug use and antecedent of immunization. After medical history, physical examination and checking of immunization records, 5 ml of blood were collected for ELISA (in-house varicella test). Exclusion criteria were: age less than 1 year or more than 60 months, previous immunization against chickenpox, presence of co-morbidities or recent use of immunosuppressive drugs. Data were filed in a data bank using the Microsoft Office Excel 2003 program and stored on a PC. Fisher's exact test was employed to calculate the sensitivity, specificity, and positive and negative predictive values of the history of varicella informed by the children's parents. RESULTS: The age of the children varied from 12 to 54 months (median, 26 months); 49 (24%) children had a positive history of varicella, 155 (76%) a negative or doubtful history. The positive and negative predictive values of the information were 90% and 93%, respectively (p = 0.0001). CONCLUSIONS: The degree of reliability of the information about history of varicella provided by parents of children attending day-care centers was high and useful for establishing recommendations on varicella blocking immunization in day-care centers.

  11. Human reliability analysis

    International Nuclear Information System (INIS)

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

    The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory. Draws upon reliability analysis, psychology, human factors engineering, and statistics, integrating elements of these fields within a systems framework. Provides a history of human reliability analysis, and includes examples of the application of the systems approach

  12. Reliability of electronic systems

    International Nuclear Information System (INIS)

    Roca, Jose L.

    2001-01-01

    Reliability techniques have been developed over time to meet the needs of the various engineering disciplines, although some would argue that much work on reliability was done before the word itself was used in its current sense. The military, space and nuclear industries were the first to become involved in this topic, but this small great revolution in favour of increased product reliability has not remained confined to those environments; rather, it has extended to industry as a whole. Mass production, characteristic of modern industry, led four decades ago to a fall in the reliability of its products, on the one hand because of mass production itself and, on the other, because of recently introduced and not yet stabilized industrial techniques. Industry had to change in response to these two new requirements, creating products of medium complexity while assuring a reliability appropriate to production costs and controls. Reliability became an integral part of the manufactured product. Following this philosophy, the book describes reliability techniques applied to electronic systems and provides a coherent and rigorous framework for these diverse activities, offering a unifying scientific basis for the entire subject. It consists of eight chapters plus numerous statistical tables and an extensive annotated bibliography. The chapters cover the following topics: 1- Introduction to Reliability; 2- Basic Mathematical Concepts; 3- Catastrophic Failure Models; 4- Parametric Failure Models; 5- Systems Reliability; 6- Reliability in Design and Project; 7- Reliability Tests; 8- Software Reliability. The book is in Spanish and has a potentially diverse audience, serving as a textbook for courses from academic to industrial. (author)

  13. Operational safety reliability research

    International Nuclear Information System (INIS)

    Hall, R.E.; Boccio, J.L.

    1986-01-01

    Operating reactor events such as the TMI accident and the Salem automatic-trip failures raised the concern that during a plant's operating lifetime the reliability of systems could degrade from the design level that was considered in the licensing process. To address this concern, NRC is sponsoring the Operational Safety Reliability Research project. The objectives of this project are to identify the essential tasks of a reliability program and to evaluate the effectiveness and attributes of such a reliability program applicable to maintaining an acceptable level of safety during the operating lifetime at the plant

  14. Circuit design for reliability

    CERN Document Server

    Cao, Yu; Wirth, Gilson

    2015-01-01

    This book presents physical understanding, modeling and simulation, on-chip characterization, layout solutions, and design techniques that are effective to enhance the reliability of various circuit units.  The authors provide readers with techniques for state of the art and future technologies, ranging from technology modeling, fault detection and analysis, circuit hardening, and reliability management. Provides comprehensive review on various reliability mechanisms at sub-45nm nodes; Describes practical modeling and characterization techniques for reliability; Includes thorough presentation of robust design techniques for major VLSI design units; Promotes physical understanding with first-principle simulations.

  15. Putting Priors in Mixture Density Mercer Kernels

    Science.gov (United States)

    Srivastava, Ashok N.; Schumann, Johann; Fischer, Bernd

    2004-01-01

    This paper presents a new methodology for automatic knowledge-driven data mining based on the theory of Mercer Kernels, which are highly nonlinear symmetric positive definite mappings from the original image space to a very high, possibly infinite-dimensional feature space. We describe a new method called Mixture Density Mercer Kernels to learn kernel functions directly from data, rather than using predefined kernels. These data-adaptive kernels can encode prior knowledge in the kernel using a Bayesian formulation, thus allowing physical information to be encoded in the model. We compare the results with existing algorithms on data from the Sloan Digital Sky Survey (SDSS). The code for these experiments has been generated with the AUTOBAYES tool, which automatically generates efficient and documented C/C++ code from abstract statistical model specifications. The core of the system is a schema library which contains templates for learning and knowledge discovery algorithms like different versions of EM, or numeric optimization methods like conjugate gradient methods. The template instantiation is supported by symbolic-algebraic computations, which allows AUTOBAYES to find closed-form solutions and, where possible, to integrate them into the code. The results show that the Mixture Density Mercer Kernel described here outperforms tree-based classification in distinguishing high-redshift galaxies from low-redshift galaxies by approximately 16% on test data, bagged trees by approximately 7%, and bagged trees built on a much larger sample of data by approximately 2%.

  16. PV Systems Reliability Final Technical Report.

    Energy Technology Data Exchange (ETDEWEB)

    Lavrova, Olga [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Flicker, Jack David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Johnson, Jay [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Armijo, Kenneth Miguel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gonzalez, Sigifredo [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schindelholz, Eric John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sorensen, Neil R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Yang, Benjamin Bing-Yeh [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    The continued exponential growth of photovoltaic technologies paves a path to a solar-powered world, but requires continued progress toward low-cost, high-reliability, high-performance photovoltaic (PV) systems. High reliability is an essential element in achieving low-cost solar electricity by reducing operation and maintenance (O&M) costs and extending system lifetime and availability, but these attributes are difficult to verify at the time of installation. Utilities, financiers, homeowners, and planners are demanding this information in order to evaluate their financial risk as a prerequisite to large investments. Reliability research and development (R&D) is needed to build market confidence by improving product reliability and by improving predictions of system availability, O&M cost, and lifetime. This project is focused on understanding, predicting, and improving the reliability of PV systems. The two areas being pursued include PV arc-fault and ground fault issues, and inverter reliability.

  17. Reliable design of electronic equipment an engineering guide

    CERN Document Server

    Natarajan, Dhanasekharan

    2014-01-01

    This book explains reliability techniques with examples from electronics design for the benefit of engineers. It presents the application of de-rating, FMEA, overstress analyses and reliability improvement tests for designing reliable electronic equipment. Adequate information is provided for designing computerized reliability database system to support the application of the techniques by designers. Pedantic terms and the associated mathematics of reliability engineering discipline are excluded for the benefit of comprehensiveness and practical applications. This book offers excellent support

  18. Reproducing kernel Hilbert spaces of Gaussian priors

    NARCIS (Netherlands)

    Vaart, van der A.W.; Zanten, van J.H.; Clarke, B.; Ghosal, S.

    2008-01-01

    We review definitions and properties of reproducing kernel Hilbert spaces attached to Gaussian variables and processes, with a view to applications in nonparametric Bayesian statistics using Gaussian priors. The rate of contraction of posterior distributions based on Gaussian priors can be described

  19. Improving Open Access through Prior Learning Assessment

    Science.gov (United States)

    Yin, Shuangxu; Kawachi, Paul

    2013-01-01

    This paper explores and presents new data on how to improve open access in distance education through using prior learning assessments. Broadly there are three types of prior learning assessment (PLAR): Type-1 for prospective students to be allowed to register for a course; Type-2 for current students to avoid duplicating work-load to gain…

  20. Quantitative Evidence Synthesis with Power Priors

    NARCIS (Netherlands)

    Rietbergen, C.|info:eu-repo/dai/nl/322847796

    2016-01-01

    The aim of this thesis is to provide the applied researcher with a practical approach for quantitative evidence synthesis using the conditional power prior that allows for subjective input and thereby provides an alternative to deal with the difficulties associated with the joint power prior

  1. Competing risk models in reliability systems, a Weibull distribution model with Bayesian analysis approach

    International Nuclear Information System (INIS)

    Iskandar, Ismed; Gondokaryono, Yudi Satria

    2016-01-01

    In reliability theory, the most important problem is to determine the reliability of a complex system from the reliability of its components. The weakness of most reliability theories is that systems are described and explained as simply functioning or failed. In many real situations, failures may arise from many causes, depending on the age and the environment of the system and its components. Another problem in reliability theory is estimating the parameters of the assumed failure models. The estimation may be based on data collected over censored or uncensored life tests. In many reliability problems, the failure data are simply quantitatively inadequate, especially in engineering design and maintenance systems. Bayesian analyses are more beneficial than classical ones in such cases. Bayesian estimation allows us to combine past knowledge or experience, in the form of an a priori distribution, with life test data to make inferences about the parameter of interest. In this paper, we have investigated the application of Bayesian estimation to competing risk systems. The cases are limited to models with independent causes of failure, using the Weibull distribution as our model. A simulation is conducted for this distribution with the objectives of verifying the models and the estimators and investigating the performance of the estimators for varying sample sizes. The simulation data are analyzed using Bayesian and maximum likelihood analyses. The simulation results show that a change in the true value of one parameter relative to another changes the standard deviation in the opposite direction. Given perfect information on the prior distribution, the Bayesian estimates are better than the maximum likelihood ones. The sensitivity analyses show some amount of sensitivity to shifts of the prior locations. They also show the robustness of the Bayesian analysis within the range
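    A small simulation of the competing-risk setup is easy to write down: each unit fails from whichever of several independent Weibull causes strikes first, and the data record the minimum time together with the cause index. A hedged sketch (parameters are invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def simulate_competing_risks(n, params):
        """params: list of (shape, scale) pairs, one independent Weibull cause each.
        Returns the observed lifetime (minimum over causes) and the failing cause."""
        times = np.column_stack(
            [scale * rng.weibull(shape, n) for shape, scale in params]
        )
        return times.min(axis=1), times.argmin(axis=1)

    t, cause = simulate_competing_risks(10_000, [(1.5, 100.0), (3.0, 150.0)])
    print(t.mean())                          # mean observed lifetime
    print(np.bincount(cause) / cause.size)   # share of failures per cause
    ```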

  2. Least-cost failure diagnosis in uncertain reliability systems

    International Nuclear Information System (INIS)

    Cox, Louis Anthony; Chiu, Steve Y.; Sun Xiaorong

    1996-01-01

    In many textbook solutions for systems failure diagnosis problems studied using reliability theory and artificial intelligence, the prior probabilities of different failure states can be estimated and used to guide the sequential search for failed components after the whole system fails. In practice, however, both the component failure probabilities and the structure function of the system being examined--i.e., the mapping between the states of its components and the state of the system--may not be known with certainty. At best, the probabilities of different hypothesized system descriptions, each specifying the component failure probabilities and the system's structure function, may be known to a useful approximation, perhaps based on sample data and previous experience. Cost-effective diagnosis of the system's failure state is then a challenging problem. Although the probabilities of component failures are aleatory, uncertainties about these probabilities and about the system structure function are epistemic. This paper examines how to make the best use of both epistemic prior probabilities for system descriptions and the information gleaned from costly inspections of component states after the system fails, to minimize the average cost of identifying the failure state. Two approaches are introduced for systems dominated by aleatory uncertainties, one motivated by information theory and the other based on the idea of trying to prove a hypothesis about the identity of the failure state as efficiently as possible. While the general problem of cost-effective failure diagnosis is computationally intractable (NP-hard), both heuristics provide useful approximations on small to moderately sized problems and optimal results for certain common types of reliability systems, including series, parallel, parallel-series, and k-out-of-n systems. A hybrid heuristic that adaptively chooses which heuristic to apply next after any sequence of observations (component test results
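    For the purely aleatory series-system case, a classical greedy rule applies: inspect components in decreasing ratio of failure probability to inspection cost. A sketch under the assumption that exactly one component has failed (component data are invented):

    ```python
    def diagnosis_order(components):
        """Greedy order for locating the failed component in a series system:
        inspect in decreasing ratio of failure probability to inspection cost.
        components: list of (name, p_fail, cost)."""
        return sorted(components, key=lambda c: c[1] / c[2], reverse=True)

    def expected_cost(order):
        """Expected inspection cost until the failed part is found, given
        exactly one component has failed (probabilities renormalised)."""
        total_p = sum(p for _, p, _ in order)
        spent, expected = 0.0, 0.0
        for _, p, c in order:
            spent += c
            expected += (p / total_p) * spent
        return expected

    comps = [("relay", 0.02, 1.0), ("board", 0.05, 4.0), ("psu", 0.10, 2.0)]
    order = diagnosis_order(comps)
    print([name for name, _, _ in order], expected_cost(order))
    ```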

  3. Hawaii Electric System Reliability

    Energy Technology Data Exchange (ETDEWEB)

    Loose, Verne William [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva Monroy, Cesar Augusto [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2012-08-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers’ views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers’ views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  4. Hawaii electric system reliability.

    Energy Technology Data Exchange (ETDEWEB)

    Silva Monroy, Cesar Augusto; Loose, Verne William

    2012-09-01

    This report addresses Hawaii electric system reliability issues; greater emphasis is placed on short-term reliability but resource adequacy is reviewed in reference to electric consumers' views of reliability “worth” and the reserve capacity required to deliver that value. The report begins with a description of the Hawaii electric system to the extent permitted by publicly available data. Electrical engineering literature in the area of electric reliability is researched and briefly reviewed. North American Electric Reliability Corporation standards and measures for generation and transmission are reviewed and identified as to their appropriateness for various portions of the electric grid and for application in Hawaii. Analysis of frequency data supplied by the State of Hawaii Public Utilities Commission is presented together with comparison and contrast of performance of each of the systems for two years, 2010 and 2011. Literature tracing the development of reliability economics is reviewed and referenced. A method is explained for integrating system cost with outage cost to determine the optimal resource adequacy given customers' views of the value contributed by reliable electric supply. The report concludes with findings and recommendations for reliability in the State of Hawaii.

  5. Improving machinery reliability

    CERN Document Server

    Bloch, Heinz P

    1998-01-01

    This totally revised, updated and expanded edition provides proven techniques and procedures that extend machinery life, reduce maintenance costs, and achieve optimum machinery reliability. This essential text clearly describes the reliability improvement and failure avoidance steps practiced by best-of-class process plants in the U.S. and Europe.

  6. LED system reliability

    NARCIS (Netherlands)

    Driel, W.D. van; Yuan, C.A.; Koh, S.; Zhang, G.Q.

    2011-01-01

    This paper presents our effort to predict the system reliability of Solid State Lighting (SSL) applications. A SSL system is composed of a LED engine with micro-electronic driver(s) that supplies power to the optic design. Knowledge of system level reliability is not only a challenging scientific

  7. Integrated system reliability analysis

    DEFF Research Database (Denmark)

    Gintautas, Tomas; Sørensen, John Dalsgaard

    Specific targets: 1) The report shall describe the state of the art of reliability and risk-based assessment of wind turbine components. 2) Development of methodology for reliability and risk-based assessment of the wind turbine at system level. 3) Describe quantitative and qualitative measures...

  8. Using Priors to Compensate Geometrical Problems in Head-Mounted Eye Trackers

    DEFF Research Database (Denmark)

    Batista Narcizo, Fabricio; Ahmed, Zaheer; Hansen, Dan Witzner

    The use of additional information (a.k.a. priors) to help the eye tracking process is presented as an alternative to compensate classical geometrical problems in head-mounted eye trackers. Priors can be obtained from several distinct sources, such as sensors to collect information related … estimation, especially for uncalibrated head-mounted setups.

  9. Design reliability engineering

    International Nuclear Information System (INIS)

    Buden, D.; Hunt, R.N.M.

    1989-01-01

    Improved design techniques are needed to achieve high reliability at minimum cost. This is especially true of space systems where lifetimes of many years without maintenance are needed and severe mass limitations exist. Reliability must be designed into these systems from the start. Techniques are now being explored to structure a formal design process that will be more complete and less expensive. The intent is to integrate the best features of design, reliability analysis, and expert systems to design highly reliable systems to meet stressing needs. Taken into account are the large uncertainties that exist in materials, design models, and fabrication techniques. Expert systems are a convenient method to integrate into the design process a complete definition of all elements that should be considered and an opportunity to integrate the design process with reliability, safety, test engineering, maintenance and operator training. 1 fig

  10. Uncertainty plus prior equals rational bias: an intuitive Bayesian probability weighting function.

    Science.gov (United States)

    Fennell, John; Baddeley, Roland

    2012-10-01

    Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several nonexpected utility theories, including rank-dependent models and prospect theory; here, we propose a Bayesian approach to the probability weighting function and, with it, a psychological rationale. In the real world, uncertainty is ubiquitous and, accordingly, the optimal strategy is to combine probability statements with prior information using Bayes' rule. First, we show that any reasonable prior on probabilities leads to 2 of the observed effects; overweighting of low probabilities and underweighting of high probabilities. We then investigate 2 plausible kinds of priors: informative priors based on previous experience and uninformative priors of ignorance. Individually, these priors potentially lead to large problems of bias and inefficiency, respectively; however, when combined using Bayesian model comparison methods, both forms of prior can be applied adaptively, gaining the efficiency of empirical priors and the robustness of ignorance priors. We illustrate this for the simple case of generic good and bad options, using Internet blogs to estimate the relevant priors of inference. Given this combined ignorant/informative prior, the Bayesian probability weighting function is not only robust and efficient but also matches all of the major characteristics of the distortions found in empirical research. PsycINFO Database Record (c) 2012 APA, all rights reserved.
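
    As an illustration of the mechanism this abstract describes, here is a toy Python sketch (the pseudo-count n and the Beta(a, b) prior are assumptions for illustration, not the authors' exact model): combining a stated probability with a prior by Bayes' rule shrinks it toward the prior mean, overweighting small probabilities and underweighting large ones.

```python
# Toy model (assumed parameters, not the paper's exact formulation):
# treat a stated probability p as if it were estimated from n Bernoulli
# trials, then combine that evidence with a Beta(a, b) prior. The
# posterior mean shrinks p toward the prior mean a / (a + b).
def weighted_probability(p, n=10.0, a=1.0, b=1.0):
    pseudo_successes = n * p                     # evidence implied by p
    return (pseudo_successes + a) / (n + a + b)  # Beta posterior mean

for p in (0.01, 0.10, 0.50, 0.90, 0.99):
    print(f"p = {p:.2f} -> w(p) = {weighted_probability(p):.3f}")
# Small p are weighted upward (w(0.01) ~ 0.09) and large p downward
# (w(0.99) ~ 0.91), the two effects the abstract reports.
```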

  11. Lung cancer gene expression database analysis incorporating prior knowledge with support vector machine-based classification method

    Directory of Open Access Journals (Sweden)

    Huang Desheng

    2009-07-01

    Background: A reliable and precise classification is essential for successful diagnosis and treatment of cancer. Gene expression microarrays have provided a high-throughput platform to discover genomic biomarkers for cancer diagnosis and prognosis. Rational use of the available bioinformation can not only effectively remove or suppress noise in gene chips, but also avoid one-sided results from separate experiments. However, only some studies have recognized the importance of prior information in cancer classification. Methods: Using a support vector machine as the discriminant approach, we proposed a modified method that incorporates prior knowledge into cancer classification based on gene expression data to improve accuracy. A well-known public dataset, the malignant pleural mesothelioma and lung adenocarcinoma gene expression database, was used in this study. Prior knowledge is viewed here as a means of directing the classifier using known lung adenocarcinoma related genes. The procedures were performed with the software R 2.80. Results: The modified method performed better after incorporating prior knowledge. Accuracy improved from 98.86% to 100% in the training set and from 98.51% to 99.06% in the test set. The standard deviations of the modified method decreased from 0.26% to 0 in the training set and from 3.04% to 2.10% in the test set. Conclusion: The method that incorporates prior knowledge into discriminant analysis could effectively improve classification capacity and reduce the impact of noise. This idea may have a good future, not only in practice but also in methodology.
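
    A minimal sketch of one way such prior knowledge can be injected into a classifier (toy data; the gene indices, weight value, and use of scikit-learn are assumptions for illustration, since the study itself worked in R):

```python
# Up-weight known disease-related genes before training a linear SVM
# (illustrative only; the paper's actual procedure may differ).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((40, 500))            # 40 samples x 500 genes (toy data)
y = np.repeat([0, 1], 20)            # two tumour classes
prior_genes = [3, 17, 42]            # hypothetical known related genes

weights = np.ones(X.shape[1])
weights[prior_genes] = 3.0           # emphasize prior-knowledge genes
clf = SVC(kernel="linear").fit(X * weights, y)
print(clf.score(X * weights, y))     # training accuracy on the toy data
```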

  12. Terminology for pregnancy loss prior to viability

    DEFF Research Database (Denmark)

    Kolte, A M; Bernardi, L A; Christiansen, O B

    2015-01-01

    Pregnancy loss prior to viability is common and research in the field is extensive. Unfortunately, terminology in the literature is inconsistent. The lack of consensus regarding nomenclature and classification of pregnancy loss prior to viability makes it difficult to compare study results from...... different centres. In our opinion, terminology and definitions should be based on clinical findings, and when possible, transvaginal ultrasound. With this Early Pregnancy Consensus Statement, it is our goal to provide clear and consistent terminology for pregnancy loss prior to viability....

  13. Uncertainties and reliability theories for reactor safety

    International Nuclear Information System (INIS)

    Veneziano, D.

    1975-01-01

    What makes the safety problem of nuclear reactors particularly challenging is the demand for high levels of reliability and the limitation of statistical information. The latter is an unfortunate circumstance, which forces deductive theories of reliability to use models and parameter values with weak factual support. The uncertainty about probabilistic models and parameters which are inferred from limited statistical evidence can be quantified and incorporated rationally into inductive theories of reliability. In such theories, the starting point is the information actually available, as opposed to an estimated probabilistic model. But, while the necessity of introducing inductive uncertainty into reliability theories has been recognized by many authors, no satisfactory inductive theory is presently available. The paper presents: a classification of uncertainties and of reliability models for reactor safety; a general methodology to include these uncertainties into reliability analysis; a discussion about the relative advantages and the limitations of various reliability theories (specifically, of inductive and deductive, parametric and nonparametric, second-moment and full-distribution theories). For example, it is shown that second-moment theories, which were originally suggested to cope with the scarcity of data, and which have been proposed recently for the safety analysis of secondary containment vessels, are the least capable of incorporating statistical uncertainty. The focus is on reliability models for external threats (seismic accelerations and tornadoes). As an application example, the effect of statistical uncertainty on seismic risk is studied using parametric full-distribution models

  14. Reliability of construction materials

    International Nuclear Information System (INIS)

    Merz, H.

    1976-01-01

    One can also speak of reliability with respect to materials. While for the reliability of components the MTBF (mean time between failures) is regarded as the main criterion, for materials it is replaced by possible failure mechanisms such as physical/chemical reaction mechanisms, disturbances of physical or chemical equilibrium, or other interactions or changes of the system. The main tasks of the reliability analysis of materials are therefore the prediction of the various causes of failure, the identification of interactions, and the development of nondestructive testing methods. (RW) [de

  15. Structural Reliability Methods

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Madsen, H. O.

    The structural reliability methods quantitatively treat the uncertainty of predicting the behaviour and properties of a structure given the uncertain properties of its geometry, materials, and the actions it is supposed to withstand. This book addresses the probabilistic methods for evaluation...... of structural reliability, including the theoretical basis for these methods. Partial safety factor codes under current practice are briefly introduced and discussed. A probabilistic code format for obtaining a formal reliability evaluation system that catches the most essential features of the nature...... of the uncertainties and their interplay is the developed, step-by-step. The concepts presented are illustrated by numerous examples throughout the text....

  16. Reliability and mechanical design

    International Nuclear Information System (INIS)

    Lemaire, Maurice

    1997-01-01

    Many results in mechanical design are obtained from a model of physical reality and from a numerical solution, leading to an evaluation of needs and resources. The goal of reliability analysis is to evaluate the confidence that can be granted to the chosen design through the calculation of a probability of failure linked to the retained scenario. Two types of analysis are proposed: sensitivity analysis and reliability analysis. Approximate methods are applicable to problems related to reliability, availability, maintainability and safety (RAMS)
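
    For background, the probability of failure for a retained scenario is conventionally written as a limit-state integral; a standard form (general notation, not this paper's own symbols) is:

```latex
% Limit state g = R - S between resistance R and load effect S;
% failure occurs when g is non-positive.
\[
p_f \;=\; \Pr\left[\, g(\mathbf{X}) \le 0 \,\right]
    \;=\; \int_{g(\mathbf{x}) \le 0} f_{\mathbf{X}}(\mathbf{x}) \,\mathrm{d}\mathbf{x},
\qquad
g = R - S \;\Longrightarrow\;
p_f = \int_{-\infty}^{\infty} F_R(s)\, f_S(s) \,\mathrm{d}s .
\]
```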

  17. RTE - 2013 Reliability Report

    International Nuclear Information System (INIS)

    Denis, Anne-Marie

    2014-01-01

    RTE publishes a yearly reliability report based on a standard model to facilitate comparisons and highlight long-term trends. The 2013 report does not merely state the facts of the Significant System Events (ESS); it also underlines the main elements affecting the reliability of the electrical power system. It highlights the various elements which contribute to present and future reliability and provides an overview of the interaction between the various stakeholders of the Electrical Power System on the scale of the European Interconnected Network. (author)

  18. A Simulation of Pell Grant Awards and Costs Using Prior-Prior Year Financial Data

    Science.gov (United States)

    Kelchen, Robert; Jones, Gigi

    2015-01-01

    We examine the likely implications of switching from a prior year (PY) financial aid system, the current practice in which students file the Free Application for Federal Student Aid (FAFSA) using income data from the previous tax year, to prior-prior year (PPY), in which data from two years before enrollment is used. While PPY allows students to…

  19. An automated method for estimating reliability of grid systems using Bayesian networks

    International Nuclear Information System (INIS)

    Doguc, Ozge; Ramirez-Marquez, Jose Emmanuel

    2012-01-01

    Grid computing has become relevant due to its applications to large-scale resource sharing, wide-area information transfer, and multi-institutional collaboration. In general, in grid computing a service requests the use of a set of resources, available in a grid, to complete certain tasks. Although analysis tools and techniques for these types of systems have been studied, grid reliability analysis is generally computation-intensive due to the complexity of the system. Moreover, conventional reliability models make some common assumptions that cannot be applied to grid systems. Therefore, new analytical methods are needed for effective and accurate assessment of grid reliability. This study presents a new method for estimating grid service reliability which, unlike previous studies, does not require prior knowledge about the grid system structure. Moreover, the proposed method does not rely on any assumptions about the link and node failure rates. The approach uses a data-mining algorithm, K2, to discover the grid system structure from raw historical system data and to find minimum resource spanning trees (MRST) within the grid, and then uses Bayesian networks (BN) to model the MRST and estimate grid service reliability.
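
    A toy sketch of the final estimation step (element reliabilities and MRSTs are assumed numbers for illustration; the paper's K2/BN procedure is more involved):

```python
# Monte Carlo estimate of service reliability: the service succeeds if
# at least one minimum resource spanning tree (MRST) has every node and
# link operational (independence of failures assumed here).
import random

element_rel = {"n1": 0.99, "n2": 0.98, "n3": 0.99, "l12": 0.95, "l13": 0.97}
mrsts = [{"n1", "n2", "l12"}, {"n1", "n3", "l13"}]  # hypothetical MRSTs

def service_reliability(trials=100_000):
    ok = 0
    for _ in range(trials):
        up = {e for e, r in element_rel.items() if random.random() < r}
        ok += any(tree <= up for tree in mrsts)    # any MRST fully up?
    return ok / trials

print(service_reliability())  # ~0.987 for these assumed numbers
```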

  20. Prior Authorization of PMDs Demonstration - Status Update

    Data.gov (United States)

    U.S. Department of Health & Human Services — CMS implemented a Prior Authorization process for scooters and power wheelchairs for people with Fee-For-Service Medicare who reside in seven states with high...

  1. Short Report Biochemical derangements prior to emergency ...

    African Journals Online (AJOL)

    Biochemical derangements prior to emergency laparotomy at QECH. ... Venepuncture was performed preoperatively for urgent cases, defined as those requiring.

  2. Recommendations for certification or measurement of reliability for reliable digital archival repositories with emphasis on access

    Directory of Open Access Journals (Sweden)

    Paula Regina Ventura Amorim Gonçalez

    2017-04-01

    Introduction: Considering the guidelines of ISO 16363:2012 (Space data and information transfer systems -- Audit and certification of trustworthy digital repositories) and the text of CONARQ Resolution 39 for certification of Reliable Digital Archival Repositories (RDC-Arq), the study verifies which technical recommendations should be used as the basis for a digital archival repository to be considered reliable. Objective: To identify requirements for the creation of Reliable Digital Archival Repositories, with emphasis on access to information, from ISO 16363:2012 and CONARQ Resolution 39. Methodology: The study is an exploratory, descriptive and documentary theoretical investigation, since it is based on ISO 16363:2012 and CONARQ Resolution 39. From the perspective of the problem approach, the study is qualitative and quantitative, since the data were collected, tabulated, and analyzed through interpretation of their contents. Results: We present a checklist of recommendations for reliability measurement and/or certification of an RDC-Arq, focused on the identification of requirements with emphasis on access to information. Conclusions: The right to information, as well as access to reliable information, is a premise for Digital Archival Repositories; the set of recommendations is therefore directed to archivists who work in digital repositories and wish to verify the requirements necessary to evaluate the reliability of a digital repository, or to guide information professionals in collecting requirements for repository reliability certification.

  3. Approach to reliability assessment

    International Nuclear Information System (INIS)

    Green, A.E.; Bourne, A.J.

    1975-01-01

    Experience has shown that reliability assessments can play an important role in the early design and subsequent operation of technological systems where reliability is at a premium. The approaches and techniques for such assessments, which are outlined in the paper, have been successfully applied in a variety of applications ranging from individual equipment to large and complex systems. The general approach involves the logical and systematic establishment of the purpose, performance requirements and reliability criteria of systems. This is followed by an appraisal of likely system achievement based on an understanding of different types of variational behavior. A fundamental reliability model emerges from the correlation between the appropriate Q and H functions for performance requirement and achievement. This model may cover the complete spectrum of performance behavior in all the system dimensions

  4. Attentional and Contextual Priors in Sound Perception.

    Science.gov (United States)

    Wolmetz, Michael; Elhilali, Mounya

    2016-01-01

    Behavioral and neural studies of selective attention have consistently demonstrated that explicit attentional cues to particular perceptual features profoundly alter perception and performance. The statistics of the sensory environment can also provide cues about what perceptual features to expect, but the extent to which these more implicit contextual cues impact perception and performance, as well as their relationship to explicit attentional cues, is not well understood. In this study, the explicit cues, or attentional prior probabilities, and the implicit cues, or contextual prior probabilities, associated with different acoustic frequencies in a detection task were simultaneously manipulated. Both attentional and contextual priors had similarly large but independent impacts on sound detectability, with evidence that listeners tracked and used contextual priors for a variety of sound classes (pure tones, harmonic complexes, and vowels). Further analyses showed that listeners updated their contextual priors rapidly and optimally, given the changing acoustic frequency statistics inherent in the paradigm. A Bayesian Observer model accounted for both attentional and contextual adaptations found with listeners. These results bolster the interpretation of perception as Bayesian inference, and suggest that some effects attributed to selective attention may be a special case of contextual prior integration along a feature axis.

  5. Modeling and validating Bayesian accrual models on clinical data and simulations using adaptive priors.

    Science.gov (United States)

    Jiang, Yu; Simon, Steve; Mayo, Matthew S; Gajewski, Byron J

    2015-02-20

    Slow recruitment in clinical trials leads to increased costs and resource utilization, which includes both the clinic staff and patient volunteers. Careful planning and monitoring of the accrual process can prevent the unnecessary loss of these resources. We propose two hierarchical extensions to the existing Bayesian constant accrual model: the accelerated prior and the hedging prior. The new proposed priors are able to adaptively utilize the researcher's previous experience and current accrual data to produce the estimation of trial completion time. The performance of these models, including prediction precision, coverage probability, and correct decision-making ability, is evaluated using actual studies from our cancer center and simulation. The results showed that a constant accrual model with strongly informative priors is very accurate when accrual is on target or slightly off, producing smaller mean squared error, high percentage of coverage, and a high number of correct decisions as to whether or not continue the trial, but it is strongly biased when off target. Flat or weakly informative priors provide protection against an off target prior but are less efficient when the accrual is on target. The accelerated prior performs similar to a strong prior. The hedging prior performs much like the weak priors when the accrual is extremely off target but closer to the strong priors when the accrual is on target or only slightly off target. We suggest improvements in these models and propose new models for future research. Copyright © 2014 John Wiley & Sons, Ltd.
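
    A minimal simulation sketch in the spirit of a constant accrual model (a Gamma-Poisson stand-in with assumed prior parameters, not the authors' hierarchical accelerated or hedging priors):

```python
# Given m subjects enrolled in t months and a Gamma(a0, b0) prior on
# the accrual rate, simulate the time needed to reach N subjects.
import numpy as np

rng = np.random.default_rng(1)

def completion_time_quantiles(m, t, N, a0, b0, draws=10_000):
    rates = rng.gamma(a0 + m, 1.0 / (b0 + t), size=draws)  # posterior draws
    remaining = rng.gamma(N - m, 1.0 / rates)              # N - m waiting gaps
    return t + np.percentile(remaining, [2.5, 50.0, 97.5])

# e.g. 30 of 100 subjects after 12 months; weak prior of 5 subjects / 2 months
print(completion_time_quantiles(m=30, t=12.0, N=100, a0=5.0, b0=2.0))
```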

  6. The rating reliability calculator

    Directory of Open Access Journals (Sweden)

    Solomon David J

    2004-04-01

    Background: Rating scales form an important means of gathering evaluation data. Since important decisions are often based on these evaluations, determining the reliability of rating data can be critical. Most commonly used methods of estimating reliability require a complete set of ratings, i.e. every subject being rated must be rated by each judge. Over fifty years ago Ebel described an algorithm for estimating the reliability of ratings based on incomplete data. While his article has been widely cited over the years, software based on the algorithm is not readily available. This paper describes an easy-to-use Web-based utility for estimating the reliability of ratings based on incomplete data using Ebel's algorithm. Methods: The program is available for public use on our server and the source code is freely available under the GNU General Public License. The utility is written in PHP, a common open-source embedded scripting language. The rating data can be entered in a convenient format on the user's personal computer, and the program uploads them to the server to calculate the reliability and other statistics describing the ratings. Results: When the program is run it displays the reliability, the number of subjects rated, the harmonic mean number of judges rating each subject, and the mean and standard deviation of the averaged ratings per subject. The program also displays the mean, standard deviation and number of ratings for each subject rated. Additionally, the program will estimate the reliability of an average of a number of ratings for each subject via the Spearman-Brown prophecy formula. Conclusion: This simple web-based program provides a convenient means of estimating the reliability of rating data without the need to conduct special studies to obtain complete rating data. I would welcome other researchers revising and enhancing the program.
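
    For reference, the Spearman-Brown prophecy formula mentioned in the abstract predicts the reliability of an average of k ratings from the reliability r of a single rating; a minimal sketch:

```python
def spearman_brown(r, k):
    """Predicted reliability of the mean of k ratings, given the
    reliability r of one rating (standard Spearman-Brown formula)."""
    return (k * r) / (1.0 + (k - 1.0) * r)

print(spearman_brown(0.6, 3))  # averaging 3 judges: ~0.818
```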

  7. Structural systems reliability analysis

    International Nuclear Information System (INIS)

    Frangopol, D.

    1975-01-01

    For an exact evaluation of the reliability of a structure it is necessary to determine the distribution densities of the loads and resistances and to calculate the correlation coefficients between loads and between resistances. These statistical characteristics can be obtained only on the basis of a long period of activity. Where such studies are missing, the statistical properties formulated here give upper and lower bounds on the reliability. (orig./HP) [de
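
    The kind of bounds alluded to can be illustrated with the classic simple bounds for a series system (general background, given here as an example):

```latex
% With failure events E_i of probability p_i and unknown correlations,
\[
\max_i \, p_i \;\le\; P\!\Big(\bigcup_i E_i\Big) \;\le\; \min\Big(1, \; \sum_i p_i\Big),
\]
% the lower bound attained under full dependence and the upper bound
% following from Boole's inequality.
```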

  8. Reliability and maintainability

    International Nuclear Information System (INIS)

    1994-01-01

    Several communications in this conference are concerned with nuclear plant reliability and maintainability; their titles are: maintenance optimization of stand-by Diesels of 900 MW nuclear power plants; CLAIRE: an event-based simulation tool for software testing; reliability as one important issue within the periodic safety review of nuclear power plants; design of nuclear building ventilation by the means of functional analysis; operation characteristic analysis for a power industry plant park, as a function of influence parameters

  9. Multidisciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines such as heat transfer, fluid mechanics, electrical circuits etc., without considerable programming effort specific to each discipline. In this study, the mechanical equivalence between system behavior models in different disciplines are investigated to achieve this objective. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by utilizing the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code, to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated, through a numerical example of a heat exchanger system involving failure modes in structural, heat transfer and fluid flow disciplines.

  10. Extended likelihood inference in reliability

    International Nuclear Information System (INIS)

    Martz, H.F. Jr.; Beckman, R.J.; Waller, R.A.

    1978-10-01

    Extended likelihood methods of inference are developed in which subjective information in the form of a prior distribution is combined with sampling results by means of an extended likelihood function. The extended likelihood function is standardized for use in obtaining extended likelihood intervals. Extended likelihood intervals are derived for the mean of a normal distribution with known variance, the failure-rate of an exponential distribution, and the parameter of a binomial distribution. Extended second-order likelihood methods are developed and used to solve several prediction problems associated with the exponential and binomial distributions. In particular, such quantities as the next failure-time, the number of failures in a given time period, and the time required to observe a given number of failures are predicted for the exponential model with a gamma prior distribution on the failure-rate. In addition, six types of life testing experiments are considered. For the binomial model with a beta prior distribution on the probability of nonsurvival, methods are obtained for predicting the number of nonsurvivors in a given sample size and for predicting the required sample size for observing a specified number of nonsurvivors. Examples illustrate each of the methods developed. Finally, comparisons are made with Bayesian intervals in those cases where these are known to exist
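
    For context, the conjugate machinery behind the exponential/gamma prediction problems described here is standard; a worked form (textbook background, not a quotation from the report):

```latex
% With observed failure times t_1, ..., t_n and an exponential model,
\[
\lambda \sim \mathrm{Gamma}(\alpha, \beta), \qquad
\lambda \mid \text{data} \sim \mathrm{Gamma}\Big(\alpha + n, \; \beta + \sum_{i=1}^{n} t_i\Big),
\]
% so, writing alpha' = alpha + n and beta' = beta + sum_i t_i, the
% predictive density of the next failure time is Lomax (Pareto II):
\[
f(t \mid \text{data}) \;=\; \frac{\alpha' \, {\beta'}^{\alpha'}}{(\beta' + t)^{\alpha' + 1}}, \qquad t > 0 .
\]
```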

  11. Health Information on the Web: Finding Reliable Information

    Science.gov (United States)


  12. Nuclear plant reliability data system. 1979 annual reports of cumulative system and component reliability

    International Nuclear Information System (INIS)

    1979-01-01

    The primary purposes of the information in these reports are the following: to provide operating statistics of safety-related systems within a unit which may be used to compare and evaluate reliability performance and to provide failure mode and failure rate statistics on components which may be used in failure mode effects analysis, fault hazard analysis, probabilistic reliability analysis, and so forth

  13. Analysis and Application of Reliability

    International Nuclear Information System (INIS)

    Jeong, Hae Seong; Park, Dong Ho; Kim, Jae Ju

    1999-05-01

    This book covers the analysis and application of reliability, including the definition, importance and historical background of reliability; the reliability function and failure rate; life distributions and reliability assumptions; reliability of non-repairable systems; reliability of repairable systems; reliability sampling tests; failure analysis, such as analysis by FMEA and FTA, with cases; accelerated life testing, including basic concepts, acceleration and acceleration factors, and analysis of accelerated life testing data; and maintenance policies such as replacement and inspection.

  14. Morbidity in early Parkinson's disease and prior to diagnosis

    DEFF Research Database (Denmark)

    Frandsen, Rune; Kjellberg, Jakob; Ibsen, Rikke

    2014-01-01

    BACKGROUND: Nonmotor symptoms are probably present prior to, early on, and following a diagnosis of Parkinson's disease. Nonmotor symptoms may hold important information about the progression of Parkinson's disease. OBJECTIVE: To evaluate the total early and prediagnostic morbidities in the 3......, poisoning and certain other external causes, and other factors influencing health status and contact with health services. It was negatively associated with neoplasm, cardiovascular, and respiratory diseases. CONCLUSIONS: Patients with a diagnosis of Parkinson's disease present significant differences

  15. Mission Reliability Estimation for Repairable Robot Teams

    Science.gov (United States)

    Trebi-Ollennu, Ashitey; Dolan, John; Stancliff, Stephen

    2010-01-01

    A mission reliability estimation method has been designed to translate mission requirements into choices of robot modules in order to configure a multi-robot team to have high reliability at minimal cost. In order to build cost-effective robot teams for long-term missions, one must be able to compare alternative design paradigms in a principled way by comparing the reliability of different robot models and robot team configurations. Core modules have been created including: a probabilistic module with reliability-cost characteristics, a method for combining the characteristics of multiple modules to determine an overall reliability-cost characteristic, and a method for the generation of legitimate module combinations based on mission specifications and the selection of the best of the resulting combinations from a cost-reliability standpoint. The developed methodology can be used to predict the probability of a mission being completed, given information about the components used to build the robots, as well as information about the mission tasks. In the research for this innovation, sample robot missions were examined and compared to the performance of robot teams with different numbers of robots and different numbers of spare components. Data that a mission designer would need was factored in, such as whether it would be better to have a spare robot versus an equivalent number of spare parts, or if mission cost can be reduced while maintaining reliability using spares. This analytical model was applied to an example robot mission, examining the cost-reliability tradeoffs among different team configurations. Particularly scrutinized were teams using either redundancy (spare robots) or repairability (spare components). Using conservative estimates of the cost-reliability relationship, results show that it is possible to significantly reduce the cost of a robotic mission by using cheaper, lower-reliability components and providing spares. This suggests that the

  16. MOV reliability evaluation and periodic verification scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Bunte, B.D.

    1996-12-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long-term reliability of gate or globe motor operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOV's design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety-related MOVs.
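
    One common way to formalize comparing the nominal margin to its uncertainty is a normal-margin model (an assumption made here for illustration, not necessarily the paper's exact procedure):

```latex
% With design margin M ~ N(mu_M, sigma_M^2), the valve's reliability is
% the probability that the margin stays positive:
\[
R \;=\; \Pr[M > 0] \;=\; \Phi\!\left(\frac{\mu_M}{\sigma_M}\right),
\]
% where Phi is the standard normal CDF: the larger the nominal margin
% relative to its uncertainty, the higher the estimated reliability.
```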

  17. MOV reliability evaluation and periodic verification scheduling

    International Nuclear Information System (INIS)

    Bunte, B.D.

    1996-01-01

    The purpose of this paper is to establish a periodic verification testing schedule based on the expected long-term reliability of gate or globe motor operated valves (MOVs). The methodology in this position paper determines the nominal (best estimate) design margin for any MOV based on the best available information pertaining to the MOV's design requirements, design parameters, existing hardware design, and present setup. The uncertainty in this margin is then determined using statistical means. By comparing the nominal margin to the uncertainty, the reliability of the MOV is estimated. The methodology is appropriate for evaluating the reliability of MOVs in the GL 89-10 program. It may be used following periodic testing to evaluate and trend MOV performance and reliability. It may also be used to evaluate the impact of proposed modifications and maintenance activities such as packing adjustments. In addition, it may be used to assess the impact of new information of a generic nature which impacts safety-related MOVs

  18. Component reliability analysis for development of component reliability DB of Korean standard NPPs

    International Nuclear Information System (INIS)

    Choi, S. Y.; Han, S. H.; Kim, S. H.

    2002-01-01

    The reliability data of Korean NPPs that reflect plant-specific characteristics are necessary for PSA and risk-informed applications. We have performed a project to develop the component reliability DB and calculate component reliability measures such as failure rate and unavailability. We have collected the component operation data and failure/repair data of Korean standard NPPs. We have analyzed the failure data by developing a data analysis method which incorporates the domestic data situation, and we have then compared the reliability results with generic data for foreign NPPs

  19. Reliability of windstorm predictions in the ECMWF ensemble prediction system

    Science.gov (United States)

    Becker, Nico; Ulbrich, Uwe

    2016-04-01

    Windstorms caused by extratropical cyclones are one of the most dangerous natural hazards in the European region. Therefore, reliable predictions of such storm events are needed. Case studies have shown that ensemble prediction systems (EPS) are able to provide useful information about windstorms between two and five days prior to the event. In this work, ensemble predictions with the European Centre for Medium-Range Weather Forecasts (ECMWF) EPS are evaluated in a four year period. Within the 50 ensemble members, which are initialized every 12 hours and are run for 10 days, windstorms are identified and tracked in time and space. By using a clustering approach, different predictions of the same storm are identified in the different ensemble members and compared to reanalysis data. The occurrence probability of the predicted storms is estimated by fitting a bivariate normal distribution to the storm track positions. Our results show, for example, that predicted storm clusters with occurrence probabilities of more than 50% have a matching observed storm in 80% of all cases at a lead time of two days. The predicted occurrence probabilities are reliable up to 3 days lead time. At longer lead times the occurrence probabilities are overestimated by the EPS.

  20. Photovoltaic Module Reliability Workshop 2010: February 18-19, 2010

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, J.

    2013-11-01

    NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology--both critical goals for moving PV technologies deeper into the electricity marketplace.

  1. Photovoltaic Module Reliability Workshop 2012: February 28 - March 1, 2012

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.

    2013-11-01

    NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology--both critical goals for moving PV technologies deeper into the electricity marketplace.

  2. Reliability of fossil-fuel and nuclear power installations

    International Nuclear Information System (INIS)

    1983-01-01

    The conference heard a total of 37 papers, of which 24 were entered into INIS. The subject areas were mainly the use of reliability information systems and the production of data banks for these systems, the application of reliability theory, and the reliability analysis of equipment and systems of nuclear power plants. (J.P.)

  3. Photovoltaic Module Reliability Workshop 2011: February 16-17, 2011

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.

    2013-11-01

    NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology--both critical goals for moving PV technologies deeper into the electricity marketplace.

  4. Photovoltaic Module Reliability Workshop 2013: February 26-27, 2013

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.

    2013-10-01

    NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology--both critical goals for moving PV technologies deeper into the electricity marketplace.

  5. Photovoltaic Module Reliability Workshop 2014: February 25-26, 2014

    Energy Technology Data Exchange (ETDEWEB)

    Kurtz, S.

    2014-02-01

    NREL's Photovoltaic (PV) Module Reliability Workshop (PVMRW) brings together PV reliability experts to share information, leading to the improvement of PV module reliability. Such improvement reduces the cost of solar electricity and promotes investor confidence in the technology--both critical goals for moving PV technologies deeper into the electricity marketplace.

  6. Reliable computer systems design and evaluation

    CERN Document Server

    Siewiorek, Daniel

    2014-01-01

    Enhance your hardware/software reliability. Enhancement of system reliability has been a major concern of computer users and designers, and this major revision of the 1982 classic meets users' continuing need for practical information on this pressing topic. Included are case studies of reliable systems from manufacturers such as Tandem, Stratus, IBM, and Digital, as well as coverage of special systems such as the Galileo Orbiter fault protection system and AT&T telephone switching processors.

  7. Structural reliability of atomic power plant

    International Nuclear Information System (INIS)

    Klemin, A.I.; Polyakov, E.F.

    1980-01-01

    In 1978 the first specialized technical manual ''Technique of Calculating the Structural Reliability of an Atomic Power Plant and Its Systems in the Design Stage'' was developed. The present article contains information about the main characteristics and capabilities of the manual. The manual gives recommendations concerning the calculations of the reliability of such specific systems as the reactor control and safety system, the system of instrumentation and automatic control, and safety systems. 2 refs

  8. Safety and reliability criteria

    International Nuclear Information System (INIS)

    O'Neil, R.

    1978-01-01

    Nuclear power plants and, in particular, reactor pressure boundary components have unique reliability requirements, in that usually no significant redundancy is possible, and a single failure can give rise to possible widespread core damage and fission product release. Reliability may be required for availability or safety reasons, but in the case of the pressure boundary and certain other systems safety may dominate. Possible Safety and Reliability (S and R) criteria are proposed which would produce acceptable reactor design. Without some S and R requirement the designer has no way of knowing how far he must go in analysing his system or component, or whether his proposed solution is likely to gain acceptance. The paper shows how reliability targets for given components and systems can be individually considered against the derived S and R criteria at the design and construction stage. Since in the case of nuclear pressure boundary components there is often very little direct experience on which to base reliability studies, relevant non-nuclear experience is examined. (author)

  9. Proposed reliability cost model

    Science.gov (United States)

    Delionback, L. M.

    1973-01-01

    The research investigations which were involved in the study include: cost analysis/allocation, reliability and product assurance, forecasting methodology, systems analysis, and model-building. This is a classic example of an interdisciplinary problem, since the model-building requirements include the need for understanding and communication between technical disciplines on one hand, and the financial/accounting skill categories on the other. The systems approach is utilized within this context to establish a clearer and more objective relationship between reliability assurance and the subcategories (or subelements) that provide, or reinforce, the reliability assurance for a system. Subcategories are further subdivided as illustrated by a tree diagram. The reliability assurance elements can be seen to be potential alternative strategies, or approaches, depending on the specific goals/objectives of the trade studies. The scope was limited to the establishment of a proposed reliability cost-model format. The model format/approach is dependent upon the use of a series of subsystem-oriented CER's and sometimes possible CTR's, in devising a suitable cost-effective policy.

  10. Operator reliability assessment system (OPERAS)

    International Nuclear Information System (INIS)

    Singh, A.; Spurgin, A.J.; Martin, T.; Welsch, J.; Hallam, J.W.

    1991-01-01

    OPERAS is personal-computer (PC) based software for collecting and processing simulator data on control-room operators' responses during requalification training scenarios. The data collection scheme is based upon an approach developed earlier during the EPRI Operator Reliability Experiments project. The software allows automated data collection from the simulator, thus minimizing the simulator staff time and resources needed to collect, maintain and process data that can be useful in monitoring, assessing and enhancing the progress of crew reliability and effectiveness. The system is designed to provide the data and output information in the form of user-friendly charts, tables and figures for use by plant staff. OPERAS prototype software has been implemented at the Diablo Canyon (PWR) and Millstone (BWR) plants and is currently being used to collect operator response data. Data collected from the simulator include plant-state variables such as reactor pressure and temperature, malfunctions, times at which annunciators are activated, operator actions, and observations of crew behavior by training staff. The data and systematic analytical results provided by the OPERAS system can help utility probabilistic risk analysis (PRA) and training staff increase objectivity in monitoring and assessing the reliability of their crews

  11. Preparing learners with partly incorrect intuitive prior knowledge for learning

    Directory of Open Access Journals (Sweden)

    Andrea Ohst

    2014-07-01

    Learners sometimes have incoherent and fragmented intuitive prior knowledge that is (partly) “incompatible” with the to-be-learned contents. Such knowledge in pieces can cause conceptual disorientation and cognitive overload while learning. We hypothesized that a pre-training intervention providing a generalized schema as a structuring framework for such knowledge in pieces would support (re)organizing-processes of prior knowledge and thus reduce unnecessary cognitive load during subsequent learning. Fifty-six student teachers participated in the experiment. A framework group underwent a pre-training intervention providing a generalized, categorical schema for categorizing primary learning strategies and related but different strategies as a cognitive framework for (re-)organizing their prior knowledge. Our control group received comparable factual information but no framework. Afterwards, all participants learned about primary learning strategies. The framework group claimed to possess higher levels of interest and self-efficacy, achieved higher learning outcomes, and learned more efficiently. Hence, providing a categorical framework can help overcome the barrier of incorrect prior knowledge in pieces.

  12. Preparing learners with partly incorrect intuitive prior knowledge for learning

    Science.gov (United States)

    Ohst, Andrea; Fondu, Béatrice M. E.; Glogger, Inga; Nückles, Matthias; Renkl, Alexander

    2014-01-01

    Learners sometimes have incoherent and fragmented intuitive prior knowledge that is (partly) “incompatible” with the to-be-learned contents. Such knowledge in pieces can cause conceptual disorientation and cognitive overload while learning. We hypothesized that a pre-training intervention providing a generalized schema as a structuring framework for such knowledge in pieces would support (re)organizing-processes of prior knowledge and thus reduce unnecessary cognitive load during subsequent learning. Fifty-six student teachers participated in the experiment. A framework group underwent a pre-training intervention providing a generalized, categorical schema for categorizing primary learning strategies and related but different strategies as a cognitive framework for (re-)organizing their prior knowledge. Our control group received comparable factual information but no framework. Afterwards, all participants learned about primary learning strategies. The framework group claimed to possess higher levels of interest and self-efficacy, achieved higher learning outcomes, and learned more efficiently. Hence, providing a categorical framework can help overcome the barrier of incorrect prior knowledge in pieces. PMID:25071638

  13. Offending prior to first psychiatric contact

    DEFF Research Database (Denmark)

    Stevens, H; Agerbo, E; Dean, K

    2012-01-01

    There is a well-established association between psychotic disorders and subsequent offending, but the extent to which those who develop psychosis might have a prior history of offending is less clear. Little is known about whether the association between illness and offending exists in non-psychotic disorders. The aim of this study was to determine whether the association between mental disorder and offending is present prior to illness onset in psychotic and non-psychotic disorders.

  14. GENERAL ASPECTS REGARDING THE PRIOR DISCIPLINARY RESEARCH

    Directory of Open Access Journals (Sweden)

    ANDRA PURAN (DASCĂLU)

    2012-05-01

    Disciplinary research is the first phase of a disciplinary action. According to art. 251 paragraph 1 of the Labour Code, no disciplinary sanction may be ordered before performing the prior disciplinary research. These regulations provide one exception: the sanction of written warning. The current regulations, kept from the old regulation, protect employees against abuses by employers, since sanctions affect the salary or the position held, or even the continuation of the individual employment contract. Thus, prior research into the act that constitutes misconduct, before a disciplinary sanction is applied, is an essential condition for the validity of the measure ordered. Through this study we try to highlight some general issues concerning the characteristics, procedures and effects of prior disciplinary research.

  15. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  16. Issues in cognitive reliability

    International Nuclear Information System (INIS)

    Woods, D.D.; Hitchler, M.J.; Rumancik, J.A.

    1984-01-01

    This chapter examines some problems in current methods of assessing reactor operator reliability at cognitive tasks and discusses new approaches to solving these problems. The two types of human failure are errors in the execution of an intention and errors in the formation/selection of an intention. Topics considered include types of description, error correction, cognitive performance and response time, the speed-accuracy tradeoff function, function-based task analysis, and cognitive task analysis. One problem of human reliability analysis (HRA) techniques in general is the question of what the units of behavior are whose reliability is to be determined. A second problem for HRA is that people often detect and correct their errors. The use of function-based analysis, which maps the problem space for plant control, is recommended

  17. Visibility Restoration for Single Hazy Image Using Dual Prior Knowledge

    Directory of Open Access Journals (Sweden)

    Mingye Ju

    2017-01-01

    Single image haze removal has been a challenging task due to its super ill-posed nature. In this paper, we propose a novel single image algorithm that improves the detail and color of such degraded images. More concretely, we redefine a more reliable atmospheric scattering model (ASM) based on our previous work and the atmospheric point spread function (APSF). Further, by taking the haze density spatial feature into consideration, we design a scene-wise APSF kernel prediction mechanism to eliminate the multiple-scattering effect. With the redefined ASM and designed APSF, combined with the existing prior knowledge, the complex dehazing problem can be subtly converted into a one-dimensional searching problem, which allows us to directly obtain the scene transmission and thereby recover visually realistic results via the proposed ASM. Experimental results verify that our algorithm outperforms several state-of-the-art dehazing techniques in terms of robustness, effectiveness, and efficiency.

  18. Reliability issues in PACS

    Science.gov (United States)

    Taira, Ricky K.; Chan, Kelby K.; Stewart, Brent K.; Weinberg, Wolfram S.

    1991-07-01

    Reliability is an increasing concern when moving PACS from the experimental laboratory to the clinical environment. Any system downtime may seriously affect patient care. The authors report on the several classes of errors encountered during the pre-clinical release of the PACS during the past several months and present the solutions implemented to handle them. The reliability issues discussed include: (1) environmental precautions, (2) database backups, (3) monitor routines of critical resources and processes, (4) hardware redundancy (networks, archives), and (5) development of a PACS quality control program.

  19. Reliability Parts Derating Guidelines

    Science.gov (United States)

    1982-06-01

    26. "Reliability of GaAs Injection Lasers", DeLoach, B. C., Jr., 1973 IEEE/OSA Conference on Laser Engineering and ..., Vol. R-23, No. 4, 226-30, October 1974. ... operation at ... deg C, mounted on a 4-inch square, 0.250-inch-thick alloy aluminum panel. This mounting technique should be taken into consideration.

  20. Guidelines for Reporting Reliability and Agreement Studies (GRRAS) were proposed

    DEFF Research Database (Denmark)

    Kottner, Jan; Audigé, Laurent; Brorson, Stig

    2011-01-01

    Results of reliability and agreement studies are intended to provide information about the amount of error inherent in any diagnosis, score, or measurement. The level of reliability and agreement among users of scales, instruments, or classifications is widely unknown. Therefore, there is a need ......, standards, or guidelines for reporting reliability and agreement in the health care and medical field are lacking. The objective was to develop guidelines for reporting reliability and agreement studies....

  1. Reliability databases: State-of-the-art and perspectives

    DEFF Research Database (Denmark)

    Akhmedjanov, Farit

    2001-01-01

    The report gives a history of development and an overview of the existing reliability databases. This overview also describes some other sources of reliability and failure information (besides computer databases), e.g. reliability handbooks, but the main attention is paid to standard models...... and software packages containing the data mentioned. The standards corresponding to collection and exchange of reliability data are observed too. Finally, perspective directions in the development of such data sources are shown....

  2. Effects of the Informed Health Choices primary school intervention on the ability of children in Uganda to assess the reliability of claims about treatment effects: a cluster-randomised controlled trial.

    Science.gov (United States)

    Nsangi, Allen; Semakula, Daniel; Oxman, Andrew D; Austvoll-Dahlgren, Astrid; Oxman, Matt; Rosenbaum, Sarah; Morelli, Angela; Glenton, Claire; Lewin, Simon; Kaseje, Margaret; Chalmers, Iain; Fretheim, Atle; Ding, Yunpeng; Sewankambo, Nelson K

    2017-07-22

    Claims about what improves or harms our health are ubiquitous. People need to be able to assess the reliability of these claims. We aimed to evaluate an intervention designed to teach primary school children to assess claims about the effects of treatments (ie, any action intended to maintain or improve health). In this cluster-randomised controlled trial, we included primary schools in the central region of Uganda that taught year-5 children (aged 10-12 years). We excluded international schools, special needs schools for children with auditory and visual impairments, schools that had participated in user-testing and piloting of the resources, infant and nursery schools, adult education schools, and schools that were difficult for us to access in terms of travel time. We randomly allocated a representative sample of eligible schools to either an intervention or control group. Intervention schools received the Informed Health Choices primary school resources (textbooks, exercise books, and a teachers' guide). Teachers attended a 2 day introductory workshop and gave nine 80 min lessons during one school term. The lessons addressed 12 concepts essential to assessing claims about treatment effects and making informed health choices. We did not intervene in the control schools. The primary outcome, measured at the end of the school term, was the mean score on a test with two multiple-choice questions for each of the 12 concepts and the proportion of children with passing scores on the same test. This trial is registered with the Pan African Clinical Trial Registry, number PACTR201606001679337. Between April 11, 2016, and June 8, 2016, 2960 schools were assessed for eligibility; 2029 were eligible, and a random sample of 170 were invited to recruitment meetings. After recruitment meetings, 120 eligible schools consented and were randomly assigned to either the intervention group (n=60, 76 teachers and 6383 children) or control group (n=60, 67 teachers and 4430 children

  3. Information and Informality

    DEFF Research Database (Denmark)

    Larsson, Magnus; Segerstéen, Solveig; Svensson, Cathrin

    2011-01-01

    leaders on the basis of their possession of reliable knowledge in technical as well as organizational domains. The informal leaders engaged in interpretation and brokering of information and knowledge, as well as in mediating strategic values and priorities on both formal and informal arenas. Informal...... leaders were thus seen to function on the level of the organization as a whole, and in cooperation with formal leaders. Drawing on existing theory of leadership in creative and professional contexts, this cooperation can be specified to concern task structuring. The informal leaders in our study...... contributed to task structuring through sensemaking activities, while formal leaders focused on aspects such as clarifying output expectations, providing feedback, project structure, and diversity....

  4. Feedback reliability calculation for an iterative block decision feedback equalizer

    OpenAIRE

    Huang, G; Nix, AR; Armour, SMD

    2009-01-01

    A new class of iterative block decision feedback equalizer (IB-DFE) was pioneered by Chan and Benvenuto. Unlike the conventional DFE, the IB-DFE is optimized according to the reliability of the feedback (FB) symbols. Since the use of the training sequence (TS) for feedback reliability (FBR) estimation lowers the bandwidth efficiency, FBR estimation without the need for additional TS is of considerable interest. However, prior FBR estimation is limited in the literature to uncoded M-ary phases...

  5. Prior Knowledge Facilitates Mutual Gaze Convergence and Head Nodding Synchrony in Face-to-face Communication.

    Science.gov (United States)

    Thepsoonthorn, C; Yokozuka, T; Miura, S; Ogawa, K; Miyake, Y

    2016-12-02

    As prior knowledge is claimed to be an essential key to achieving effective education, we are interested in exploring whether prior knowledge enhances communication effectiveness. To demonstrate the effects of prior knowledge, mutual gaze convergence and head nodding synchrony are observed as indicators of communication effectiveness. We conducted an experiment on a lecture task between a lecturer and a student under two conditions: prior knowledge and non-prior knowledge. The students in the prior knowledge condition were provided with basic information about the lecture content and had their understanding assessed by the experimenter before the lecture started, while the students in the non-prior knowledge condition received none. The result shows that the interaction in the prior knowledge condition establishes significantly higher mutual gaze convergence (t(15.03) = 6.72, p < 0.0001; α = 0.05, n = 20) and head nodding synchrony (t(16.67) = 1.83, p = 0.04; α = 0.05, n = 19) compared to the non-prior knowledge condition. This study reveals that prior knowledge facilitates mutual gaze convergence and head nodding synchrony. Furthermore, the interaction with and without prior knowledge can be evaluated by measuring or observing mutual gaze convergence and head nodding synchrony.
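
    The fractional degrees of freedom in the reported statistics (e.g. t(15.03)) are characteristic of a Welch-type test for unequal variances. A minimal sketch, using simulated data rather than the study's own, of how such a comparison is computed:

    ```python
    # Sketch: Welch's t-test (unequal variances), the kind of comparison
    # that yields fractional degrees of freedom such as t(15.03).
    # The two samples below are simulated, not the study's measurements.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    prior = rng.normal(loc=0.7, scale=0.10, size=10)     # hypothetical scores
    no_prior = rng.normal(loc=0.4, scale=0.15, size=10)

    t, p = stats.ttest_ind(prior, no_prior, equal_var=False)  # Welch's test
    print(f"t = {t:.2f}, p = {p:.4f}")
    ```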

  6. PIRPLE: a penalized-likelihood framework for incorporation of prior images in CT reconstruction

    International Nuclear Information System (INIS)

    Stayman, J Webster; Dang, Hao; Ding, Yifu; Siewerdsen, Jeffrey H

    2013-01-01

    Over the course of diagnosis and treatment, it is common for a number of imaging studies to be acquired. Such imaging sequences can provide substantial patient-specific prior knowledge about the anatomy that can be incorporated into a prior-image-based tomographic reconstruction for improved image quality and better dose utilization. We present a general methodology using a model-based reconstruction approach including formulations of the measurement noise that also integrates prior images. This penalized-likelihood technique adopts a sparsity enforcing penalty that incorporates prior information yet allows for change between the current reconstruction and the prior image. Moreover, since prior images are generally not registered with the current image volume, we present a modified model-based approach that seeks a joint registration of the prior image in addition to the reconstruction of projection data. We demonstrate that the combined prior-image- and model-based technique outperforms methods that ignore the prior data or lack a noise model. Moreover, we demonstrate the importance of registration for prior-image-based reconstruction methods and show that the prior-image-registered penalized-likelihood estimation (PIRPLE) approach can maintain a high level of image quality in the presence of noisy and undersampled projection data. (paper)
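
    Schematically, and in our own notation rather than the paper's, a joint registration-reconstruction objective of the kind described combines a measurement log-likelihood with a sparsity penalty on the difference from the registered prior image:

    ```latex
    % Schematic PIRPLE-style objective; notation ours, for illustration only.
    (\hat{\mu}, \hat{\Lambda}) = \arg\max_{\mu,\,\Lambda}\;
        L(y; \mu)
        - \beta_P \,\big\| \Psi\big(\mu - W(\Lambda)\,\mu_{\mathrm{prior}}\big) \big\|_1
        - \beta_R \,\| \Psi \mu \|_1
    ```

    Here L is the log-likelihood of the projection data y, W(Λ) is a registration transform whose parameters Λ are estimated jointly with the image μ, Ψ is a sparsifying transform, and β_P, β_R are the penalty weights.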

  7. The reliability of commonly used electrophysiology measures.

    Science.gov (United States)

    Brown, K E; Lohse, K R; Mayer, I M S; Strigaro, G; Desikan, M; Casula, E P; Meunier, S; Popa, T; Lamy, J-C; Odish, O; Leavitt, B R; Durr, A; Roos, R A C; Tabrizi, S J; Rothwell, J C; Boyd, L A; Orth, M

    Electrophysiological measures can help understand brain function both in healthy individuals and in the context of a disease. Given the amount of information that can be extracted from these measures and their frequent use, it is essential to know more about their inherent reliability. To understand the reliability of electrophysiology measures in healthy individuals. We hypothesized that measures of threshold and latency would be the most reliable and least susceptible to methodological differences between study sites. Somatosensory evoked potentials from 112 control participants; long-latency reflexes, transcranial magnetic stimulation with resting and active motor thresholds, motor evoked potential latencies, input/output curves, and short-latency sensory afferent inhibition and facilitation from 84 controls were collected at 3 visits over 24 months at 4 Track-On HD study sites. Reliability was assessed using intra-class correlation coefficients for absolute agreement, and the effects of reliability on statistical power are demonstrated for different sample sizes and study designs. Measures quantifying latencies, thresholds, and evoked responses at high stimulator intensities had the highest reliability, and required the smallest sample sizes to adequately power a study. Very few between-site differences were detected. Reliability and susceptibility to between-site differences should be evaluated for electrophysiological measures before including them in study designs. Levels of reliability vary substantially across electrophysiological measures, though there are few between-site differences. To address this, reliability should be used in conjunction with theoretical calculations to inform sample size and ensure studies are adequately powered to detect true change in measures of interest. Copyright © 2017 Elsevier Inc. All rights reserved.
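
    A minimal sketch of the agreement statistic used here, the intraclass correlation coefficient ICC(2,1) (two-way random effects, absolute agreement, single measurement), applied to a hypothetical subjects-by-visits matrix:

    ```python
    # Sketch: ICC(2,1) for absolute agreement across repeated visits.
    # The data matrix is hypothetical, not from the Track-On HD study.
    import numpy as np

    def icc_2_1(X):
        """ICC(2,1): two-way random effects, absolute agreement, single rater."""
        n, k = X.shape
        grand = X.mean()
        SSR = k * ((X.mean(axis=1) - grand) ** 2).sum()   # between-subject
        SSC = n * ((X.mean(axis=0) - grand) ** 2).sum()   # between-visit
        SSE = ((X - grand) ** 2).sum() - SSR - SSC        # residual
        MSR = SSR / (n - 1)
        MSC = SSC / (k - 1)
        MSE = SSE / ((n - 1) * (k - 1))
        return (MSR - MSE) / (MSR + (k - 1) * MSE + k * (MSC - MSE) / n)

    # Hypothetical resting motor thresholds (% stimulator output),
    # 5 participants measured at 3 visits.
    X = np.array([[41., 43., 42.],
                  [55., 54., 57.],
                  [38., 37., 39.],
                  [62., 60., 61.],
                  [47., 49., 48.]])
    print(round(icc_2_1(X), 3))
    ```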

  8. Recognition of Prior Learning: The Participants' Perspective

    Science.gov (United States)

    Miguel, Marta C.; Ornelas, José H.; Maroco, João P.

    2016-01-01

    The current narrative on lifelong learning goes beyond formal education and training, including learning at work, in the family and in the community. Recognition of prior learning is a process of evaluation of those skills and knowledge acquired through life experience, allowing them to be formally recognized by the qualification systems. It is a…

  9. Prior learning assessment and quality assurance practice ...

    African Journals Online (AJOL)

    The use of RPL (Recognition of Prior Learning) in higher education to assess RPL candidates for admission into programmes of study met with a lot of criticism from faculty academics. Lecturers viewed the possibility of admitting large numbers of under-qualified adult learners, as a threat to the institution's reputation, or an ...

  10. Deriving proper uniform priors for regression coefficients, Parts I, II, and III

    NARCIS (Netherlands)

    van Erp, H.R.N.; Linger, R.O.; van Gelder, P.H.A.J.M.

    2017-01-01

    It is a relatively well-known fact that in problems of Bayesian model selection, improper priors should, in general, be avoided. In this paper we will derive and discuss a collection of four proper uniform priors which lie on an ascending scale of informativeness. It will turn out that these

  11. 3D reconstruction for partial data electrical impedance tomography using a sparsity prior

    DEFF Research Database (Denmark)

    Garde, Henrik; Knudsen, Kim

    2015-01-01

    of the conductivity is used to improve reconstructions for the partial data problem with Cauchy data measured only on a subset of the boundary. A sparsity prior is enforced using the ℓ1 norm in the penalty term of a Tikhonov functional, and spatial prior information is incorporated by applying a spatially distributed...

  12. Columbus safety and reliability

    Science.gov (United States)

    Longhurst, F.; Wessels, H.

    1988-10-01

    Analyses carried out to ensure Columbus reliability, availability, and maintainability, and operational and design safety are summarized. Failure modes/effects/criticality is the main qualitative tool used. The main aspects studied are fault tolerance, hazard consequence control, risk minimization, human error effects, restorability, and safe-life design.

  13. Reliability versus reproducibility

    International Nuclear Information System (INIS)

    Lautzenheiser, C.E.

    1976-01-01

    Defect detection and reproducibility of results are two separate but closely related subjects. It is axiomatic that a defect must be detected consistently from examination to examination, or else reproducibility of results is very poor. On the other hand, a defect can be detected on each subsequent examination, giving higher reliability, and yet still have poor reproducibility of results

  14. Power transformer reliability modelling

    NARCIS (Netherlands)

    Schijndel, van A.

    2010-01-01

    Problem description Electrical power grids serve to transport and distribute electrical power with high reliability and availability at acceptable costs and risks. These grids play a crucial though preferably invisible role in supplying sufficient power in a convenient form. Today’s society has

  15. Designing reliability into accelerators

    International Nuclear Information System (INIS)

    Hutton, A.

    1992-08-01

    For the next generation of high performance, high average luminosity colliders, the "factories", reliability engineering must be introduced right at the inception of the project and maintained as a central theme throughout the project. There are several aspects which will be addressed separately: concept, design, motivation, management techniques, and fault diagnosis

  16. Proof tests on reliability

    International Nuclear Information System (INIS)

    Mishima, Yoshitsugu

    1983-01-01

    In order to gain public understanding of nuclear power plants, tests should be carried out to prove the reliability and safety of present LWR plants. For example, the aseismicity of nuclear power plants must be verified using a large-scale earthquake simulator. Reliability testing began in fiscal 1975, and the proof tests on steam generators and on PWR support and flexure pins against stress corrosion cracking have already been completed; the results have been highly appreciated internationally. The capacity factor of nuclear power plant operation in Japan rose to 80% in the summer of 1983, which, allowing for the periods of regular inspection, means operation at almost full capacity. Japanese LWR technology has now risen to the top place in the world after having overcome its early defects. The significance of reliability testing is to secure functioning until the age limit is reached, to confirm correct forecasting of deterioration processes, to confirm the effectiveness of remedies to defects, and to confirm the accuracy of predictions of facility behaviour. The reliability of nuclear valves, fuel assemblies, heat-affected zones in welding, reactor cooling pumps, and electric instruments has been tested or is being tested. (Kako, I.)

  17. Reliability and code level

    NARCIS (Netherlands)

    Kasperski, M.; Geurts, C.P.W.

    2005-01-01

    The paper describes the work of the IAWE Working Group WBG - Reliability and Code Level, one of the International Codification Working Groups set up at ICWE10 in Copenhagen. The following topics are covered: sources of uncertainties in the design wind load, appropriate design target values for the

  18. Reliability of Plastic Slabs

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    1989-01-01

    In the paper it is shown how upper and lower bounds for the reliability of plastic slabs can be determined. For the fundamental case it is shown that optimal bounds of a deterministic and a stochastic analysis are obtained on the basis of the same failure mechanisms and the same stress fields....

  19. Reliability based structural design

    NARCIS (Netherlands)

    Vrouwenvelder, A.C.W.M.

    2014-01-01

    According to ISO 2394, structures shall be designed, constructed and maintained in such a way that they are suited for their use during the design working life in an economic way. To fulfil this requirement one needs insight into the risk and reliability under expected and non-expected actions. A

  20. Travel time reliability modeling.

    Science.gov (United States)

    2011-07-01

    This report includes three papers as follows: : 1. Guo F., Rakha H., and Park S. (2010), "A Multi-state Travel Time Reliability Model," : Transportation Research Record: Journal of the Transportation Research Board, n 2188, : pp. 46-54. : 2. Park S.,...

  1. Reliability and Model Fit

    Science.gov (United States)

    Stanley, Leanne M.; Edwards, Michael C.

    2016-01-01

    The purpose of this article is to highlight the distinction between the reliability of test scores and the fit of psychometric measurement models, reminding readers why it is important to consider both when evaluating whether test scores are valid for a proposed interpretation and/or use. It is often the case that an investigator judges both the…

  2. Parametric Mass Reliability Study

    Science.gov (United States)

    Holt, James P.

    2014-01-01

    The International Space Station (ISS) systems are designed based upon having redundant systems with replaceable orbital replacement units (ORUs). These ORUs are designed to be swapped out fairly quickly, but some are very large, and some are made up of many components. When an ORU fails, it is replaced on orbit with a spare; the failed unit is sometimes returned to Earth to be serviced and re-launched. Such a system is not feasible for a 500+ day long-duration mission beyond low Earth orbit. The components that make up these ORUs have mixed reliabilities. Components that make up the most mass, such as computer housings, pump casings, and the silicon boards of PCBs, typically are the most reliable. Meanwhile, components that tend to fail the earliest, such as seals or gaskets, typically have a small mass. To better understand the problem, my project is to create a parametric model that relates the mass of ORUs to reliability, as well as the mass of ORU subcomponents to reliability.
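
    One plausible shape for such a parametric model, offered purely as an illustration and not as the project's actual model, is a power law λ(m) = a·m^b (with b < 0) linking subcomponent mass to failure rate, fitted on log-log axes:

    ```python
    # Hypothetical illustration of a parametric mass-vs-reliability relation:
    # assume failure rate falls with mass as lambda(m) = a * m**b (b < 0)
    # and fit a, b to (mass, failure-rate) pairs on log-log axes.
    import numpy as np

    mass = np.array([0.1, 0.5, 2.0, 10.0, 50.0])             # kg (hypothetical)
    fail_rate = np.array([8e-4, 3e-4, 1e-4, 4e-5, 1.5e-5])   # per hour (hypothetical)

    b, log_a = np.polyfit(np.log(mass), np.log(fail_rate), 1)
    a = np.exp(log_a)
    print(f"lambda(m) ~= {a:.2e} * m**({b:.2f})")

    # Reliability of a 5 kg component over a 500-day mission, assuming an
    # exponential life model R(t) = exp(-lambda(m) * t).
    m, t = 5.0, 500 * 24.0
    print(f"R = {np.exp(-a * m**b * t):.3f}")
    ```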

  3. Reliability Approach of a Compressor System using Reliability Block ...

    African Journals Online (AJOL)


    2018-03-05

    This paper presents a reliability analysis of a compressor system using reliability block diagrams (RBD). Keywords: compressor system, reliability, reliability block diagram, RBD. ... the same structure has been kept with the three subsystems: air flow, oil flow and ....
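
    Under the common assumption that the subsystems named above are arranged in series, evaluating such a block diagram reduces to simple products; the sketch below uses hypothetical subsystem reliabilities and a hypothetical redundant branch:

    ```python
    # Sketch of reliability block diagram (RBD) algebra. The subsystem
    # values and the series structure are assumptions for illustration only.
    from math import prod

    def series(*blocks):      # all blocks must work
        return prod(blocks)

    def parallel(*blocks):    # at least one block must work
        return 1 - prod(1 - r for r in blocks)

    r_air, r_oil, r_third = 0.95, 0.97, 0.99   # hypothetical reliabilities
    print(series(r_air, r_oil, r_third))       # simple series system
    print(series(parallel(0.9, 0.9), r_oil))   # redundant air-flow branch
    ```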

  4. Prior processes and their applications nonparametric Bayesian estimation

    CERN Document Server

    Phadia, Eswar G

    2016-01-01

    This book presents a systematic and comprehensive treatment of various prior processes that have been developed over the past four decades for dealing with Bayesian approach to solving selected nonparametric inference problems. This revised edition has been substantially expanded to reflect the current interest in this area. After an overview of different prior processes, it examines the now pre-eminent Dirichlet process and its variants including hierarchical processes, then addresses new processes such as dependent Dirichlet, local Dirichlet, time-varying and spatial processes, all of which exploit the countable mixture representation of the Dirichlet process. It subsequently discusses various neutral to right type processes, including gamma and extended gamma, beta and beta-Stacy processes, and then describes the Chinese Restaurant, Indian Buffet and infinite gamma-Poisson processes, which prove to be very useful in areas such as machine learning, information retrieval and featural modeling. Tailfree and P...
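
    Among the processes listed, the Chinese Restaurant process has a particularly compact generative description; below is a minimal sampler of the standard construction (our own sketch, not code from the book):

    ```python
    # Minimal Chinese Restaurant Process sampler: customer i joins an
    # occupied table with probability proportional to its count, or opens
    # a new table with probability proportional to the concentration alpha.
    import random

    def crp(n_customers, alpha, seed=42):
        rng = random.Random(seed)
        table_counts = []
        seating = []
        for _ in range(n_customers):
            weights = table_counts + [alpha]
            table = rng.choices(range(len(weights)), weights=weights)[0]
            if table == len(table_counts):
                table_counts.append(1)   # open a new table
            else:
                table_counts[table] += 1
            seating.append(table)
        return seating, table_counts

    seats, counts = crp(20, alpha=1.0)
    print(counts)  # cluster sizes drawn from the CRP prior
    ```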

  5. PET image reconstruction using multi-parametric anato-functional priors

    Science.gov (United States)

    Mehranian, Abolfazl; Belzunce, Martin A.; Niccolini, Flavia; Politis, Marios; Prieto, Claudia; Turkheimer, Federico; Hammers, Alexander; Reader, Andrew J.

    2017-08-01

    In this study, we investigate the application of multi-parametric anato-functional (MR-PET) priors for the maximum a posteriori (MAP) reconstruction of brain PET data in order to address the limitations of the conventional anatomical priors in the presence of PET-MR mismatches. In addition to partial volume correction benefits, the suitability of these priors for reconstruction of low-count PET data is also introduced and demonstrated, comparing to standard maximum-likelihood (ML) reconstruction of high-count data. The conventional local Tikhonov and total variation (TV) priors and current state-of-the-art anatomical priors including the Kaipio, non-local Tikhonov prior with Bowsher and Gaussian similarity kernels are investigated and presented in a unified framework. The Gaussian kernels are calculated using both voxel- and patch-based feature vectors. To cope with PET and MR mismatches, the Bowsher and Gaussian priors are extended to multi-parametric priors. In addition, we propose a modified joint Burg entropy prior that by definition exploits all parametric information in the MAP reconstruction of PET data. The performance of the priors was extensively evaluated using 3D simulations and two clinical brain datasets of [18F]florbetaben and [18F]FDG radiotracers. For simulations, several anato-functional mismatches were intentionally introduced between the PET and MR images, and furthermore, for the FDG clinical dataset, two PET-unique active tumours were embedded in the PET data. Our simulation results showed that the joint Burg entropy prior far outperformed the conventional anatomical priors in terms of preserving PET unique lesions, while still reconstructing functional boundaries with corresponding MR boundaries. In addition, the multi-parametric extension of the Gaussian and Bowsher priors led to enhanced preservation of edge and PET unique features and also an improved bias-variance performance. In agreement with the simulation results, the clinical results
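
    All of the priors surveyed enter the same maximum a posteriori objective; schematically, in our notation rather than the paper's:

    ```latex
    % Generic MAP reconstruction objective; notation ours, for illustration.
    \hat{\lambda} = \arg\max_{\lambda \ge 0}\;
        L(y \mid \lambda) - \beta\, R(\lambda; v_{\mathrm{MR}})
    ```

    Here L is the Poisson log-likelihood of the PET data y, R is the prior (local Tikhonov, TV, Bowsher, Gaussian-kernel, or joint Burg entropy), which may depend on MR-derived feature vectors v_MR, and β controls its strength.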

  6. Science Literacy and Prior Knowledge of Astronomy MOOC Students

    Science.gov (United States)

    Impey, Chris David; Buxner, Sanlyn; Wenger, Matthew; Formanek, Martin

    2018-01-01

    Many of the science classes offered on Coursera fall into the category of general education or general interest classes for lifelong learners, including our own, Astronomy: Exploring Time and Space. Very little is known about the backgrounds and prior knowledge of these students. In this talk we present the results of a survey of our Astronomy MOOC students. We also compare these results to our previous work on undergraduate students in introductory astronomy courses. Survey questions examined student demographics and motivations as well as their science and information literacy (including basic science knowledge, interest, attitudes and beliefs, and where they get their information about science). We found that our MOOC students differ from the undergraduate students in more ways than demographics. Many MOOC students demonstrated high levels of science and information literacy. With a more comprehensive understanding of our students’ motivations and prior knowledge about science and how they get their information about science, we will be able to develop more tailored learning experiences for these lifelong learners.

  7. Reliability of operating WWER monitoring systems

    International Nuclear Information System (INIS)

    Yastrebenetsky, M.A.; Goldrin, V.M.; Garagulya, A.V.

    1996-01-01

    The elaboration of reliability measures for WWER monitoring systems is described in this paper. The evaluation is based on statistical data about failures collected at the operating Ukrainian nuclear power plants (NPPs). The main attention is devoted to the radiation safety monitoring system and the unit information computer system, which collects information from different sensors and systems of the unit. Reliability measures were used for solving problems connected with life extension of the instruments, and for other purposes. (author). 6 refs, 6 figs

  8. Reliability of operating WWER monitoring systems

    Energy Technology Data Exchange (ETDEWEB)

    Yastrebenetsky, M A; Goldrin, V M; Garagulya, A V [Ukrainian State Scientific Technical Center of Nuclear and Radiation Safety, Kharkov (Ukraine). Instrumentation and Control Systems Dept.

    1997-12-31

    The elaboration of reliability measures for WWER monitoring systems is described in this paper. The evaluation is based on statistical data about failures collected at the operating Ukrainian nuclear power plants (NPPs). The main attention is devoted to the radiation safety monitoring system and the unit information computer system, which collects information from different sensors and systems of the unit. Reliability measures were used for solving problems connected with life extension of the instruments, and for other purposes. (author). 6 refs, 6 figs.

  9. Maximum entropy reconstruction of spin densities involving non uniform prior

    International Nuclear Information System (INIS)

    Schweizer, J.; Ressouche, E.; Papoular, R.J.; Zheludev, A.I.

    1997-01-01

    Diffraction experiments give microscopic information on structures in crystals. A method which uses the concept of maximum entropy (MaxEnt) appears to be a formidable improvement in the treatment of diffraction data. This method is based on a Bayesian approach: among all the maps compatible with the experimental data, it selects the one which has the highest prior (intrinsic) probability. Considering that all the points of the map are equally probable, this probability (flat prior) is expressed via the Boltzmann entropy of the distribution. This method has been used for the reconstruction of charge densities from X-ray data, for maps of nuclear densities from unpolarized neutron data as well as for distributions of spin density. The density maps obtained by this method, as compared to those resulting from the usual inverse Fourier transformation, are tremendously improved. In particular, any substantial deviation from the background is really contained in the data, as it costs entropy compared to a map that would ignore such features. However, in most cases, before the measurements are performed, some knowledge exists about the distribution which is investigated. It can range from simple information on the type of scattering electrons to an elaborate theoretical model. In these cases, the uniform prior, which considers all the different pixels as equally likely, is too weak a requirement and has to be replaced. In a rigorous Bayesian analysis, Skilling has shown that prior knowledge can be encoded into the Maximum Entropy formalism through a model m(r), via a new definition for the entropy given in this paper. In the absence of any data, the maximum of the entropy functional is reached for ρ(r) = m(r). Any substantial departure from the model, observed in the final map, is really contained in the data as, with the new definition, it costs entropy. This paper presents illustrations of model testing
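
    The entropy relative to a non-uniform model m(r) referred to here is commonly written in Skilling's form (we quote the generic expression from the MaxEnt literature, not the paper's own equation):

    ```latex
    % Skilling's entropy relative to a model m(r); maximized when rho = m.
    S[\rho, m] = \int \left[ \rho(\mathbf{r}) - m(\mathbf{r})
        - \rho(\mathbf{r}) \ln\frac{\rho(\mathbf{r})}{m(\mathbf{r})} \right] \mathrm{d}\mathbf{r}
    ```

    Setting the functional derivative to zero gives ln(ρ/m) = 0, i.e. ρ(r) = m(r) in the absence of data, consistent with the statement above.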

  10. Hysteresis as an Implicit Prior in Tactile Spatial Decision Making

    Science.gov (United States)

    Thiel, Sabrina D.; Bitzer, Sebastian; Nierhaus, Till; Kalberlah, Christian; Preusser, Sven; Neumann, Jane; Nikulin, Vadim V.; van der Meer, Elke; Villringer, Arno; Pleger, Burkhard

    2014-01-01

    Perceptual decisions not only depend on the incoming information from sensory systems but constitute a combination of current sensory evidence and internally accumulated information from past encounters. Although recent evidence emphasizes the fundamental role of prior knowledge for perceptual decision making, only a few studies have quantified the relevance of such priors on perceptual decisions and examined their interplay with other decision-relevant factors, such as the stimulus properties. In the present study we asked whether hysteresis, describing the stability of a percept despite a change in stimulus property and known to occur at perceptual thresholds, also acts as a form of an implicit prior in tactile spatial decision making, supporting the stability of a decision across successively presented random stimuli (i.e., decision hysteresis). We applied a variant of the classical 2-point discrimination task and found that hysteresis influenced perceptual decision making: Participants were more likely to decide ‘same’ rather than ‘different’ on successively presented pin distances. In a direct comparison between the influence of applied pin distances (explicit stimulus property) and hysteresis, we found that on average, stimulus property explained significantly more variance of participants’ decisions than hysteresis. However, when focusing on pin distances at threshold, we found a trend for hysteresis to explain more variance. Furthermore, the less variance was explained by the pin distance on a given decision, the more variance was explained by hysteresis, and vice versa. Our findings suggest that hysteresis acts as an implicit prior in tactile spatial decision making that becomes increasingly important when explicit stimulus properties provide decreasing evidence. PMID:24587045
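
    A comparison of this kind can be set up as a regression with the stimulus property and the previous decision as competing predictors. The sketch below uses simulated data and a plain logistic regression; it is not the study's analysis code:

    ```python
    # Schematic: compare how much the current pin distance (stimulus
    # property) and the previous decision (hysteresis) each predict the
    # current decision. Simulated data; not the study's data or analysis.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    X, y = [], []
    prev_dec = 0
    for d in rng.uniform(0.5, 3.0, 500):          # pin distances in cm
        # "different" (1) is driven by distance plus a pull toward
        # repeating the previous decision -- the hysteresis component.
        logit = 2.0 * (d - 1.75) + 0.8 * (2 * prev_dec - 1)
        dec = int(rng.random() < 1.0 / (1.0 + np.exp(-logit)))
        X.append([d, prev_dec])
        y.append(dec)
        prev_dec = dec

    model = LogisticRegression().fit(X, y)
    print(model.coef_)   # weight on distance vs. weight on previous decision
    ```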

  11. Routine saline infusion sonohysterography prior to assisted ...

    African Journals Online (AJOL)

    53.85%), 8 (30.77%) and 4 (15.38%) respectively. The average duration of the procedure was 6 minutes with a range of 4-9 minutes. Saline infusion sonohysterography is a reliable, cost effective and safe diagnostic tool in the evaluation of the ...

  12. Reliability in the utility computing era: Towards reliable Fog computing

    DEFF Research Database (Denmark)

    Madsen, Henrik; Burtschy, Bernard; Albeanu, G.

    2013-01-01

    This paper considers current paradigms in computing and outlines the most important aspects concerning their reliability. The Fog computing paradigm as a non-trivial extension of the Cloud is considered, and the reliability of the networks of smart devices is discussed. Combining the reliability...... requirements of grid and cloud paradigms with the reliability requirements of networks of sensors and actuators, it follows that designing a reliable Fog computing platform is feasible....

  13. RTE - Reliability report 2016

    International Nuclear Information System (INIS)

    2017-06-01

    Every year, RTE produces a reliability report for the past year. This document lays out the main factors that affected the electrical power system's operational reliability in 2016 and the initiatives currently under way intended to ensure its reliability in the future. Within the context of the energy transition, changes to the European interconnected network mean that RTE has to adapt on an on-going basis. These changes include the increase in the share of renewables injecting an intermittent power supply into networks, resulting in a need for flexibility, as well as a diversification of the stakeholders operating in the energy sector and changes in the ways in which they behave. These changes are dramatically changing the structure of the power system of tomorrow and the way in which it will operate, particularly the way in which voltage and frequency are controlled, as well as the distribution of flows, the power system's stability, the level of reserves needed to ensure supply-demand balance, network studies, assets' operating and control rules, the tools used and the expertise of operators. The results obtained in 2016 are evidence of an overall satisfactory level of reliability for RTE's operations in somewhat demanding circumstances: more complex supply-demand balance management, cross-border schedules at interconnections indicating operation that is closer to its limits and, most noteworthy, having to manage a cold spell just as several nuclear power plants had been shut down. In a drive to keep pace with the changes expected to occur in these circumstances, RTE implemented numerous initiatives to ensure high levels of reliability: maintaining investment levels of euro 1.5 billion per year; increasing cross-zonal capacity at borders with our neighbouring countries, thus bolstering the security of our electricity supply; implementing new mechanisms (demand response, capacity mechanism, interruptibility, etc.); involvement in tests or projects

  14. Airline experience with reliability-centered maintenance

    International Nuclear Information System (INIS)

    Matteson, T.D.

    1985-01-01

    Reliability-Centered Maintenance is a process for developing preventive maintenance programs. Its concepts evolved from the post WWII experience of the airline community. Its genesis was in a paper by F. Stanley Nowlan and Thomas D. Matteson of United Airlines for the American Institute of Aeronautics and Astronautics in 1967. Its first application was to the Boeing 747. It has subsequently been adopted by the FAA and the Department of Defense and applied to many new transport and military aircraft. Its objective is applicable and effective preventive maintenance and it has proven to be a highly effective replacement for the prior intuitive processes for selective preventive maintenance tasks. It focuses on system functions, functional failures, then dominant failure modes and effects. It then uses a decision tree to classify failure criticality and identify applicable and effective tasks. The result is a program focused on maintaining inherent safety and reliability at minimum cost. (orig.)

  15. Airline experience with reliability-centered maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Matteson, T.D.

    1985-11-01

    Reliability-Centered Maintenance is a process for developing preventive maintenance programs. Its concepts evolved from the post WWII experience of the airline community. Its genesis was in a paper by F. Stanley Nowlan and Thomas D. Matteson of United Airlines for the American Institute of Aeronautics and Astronautics in 1967. Its first application was to the Boeing 747. It has subsequently been adopted by the FAA and the Department of Defense and applied to many new transport and military aircraft. Its objective is applicable and effective preventive maintenance and it has proven to be a highly effective replacement for the prior intuitive processes for selective preventive maintenance tasks. It focuses on system functions, functional failures, then dominant failure modes and effects. It then uses a decision tree to classify failure criticality and identify applicable and effective tasks. The result is a program focused on maintaining inherent safety and reliability at minimum cost. (orig.).

  16. SLAC modulator system improvements and reliability results

    International Nuclear Information System (INIS)

    Donaldson, A.R.

    1998-06-01

    In 1995, an improvement project was completed on the 244 klystron modulators in the linear accelerator. The modulator system has been previously described. This article offers project details and their resulting effect on modulator and component reliability. Prior to the project, the authors had collected four operating cycles (1991 through 1995) of MTTF data. In this discussion, the '91 data will be excluded since the modulators then operated at 60 Hz. The five periods following the '91 run were reviewed due to the common repetition rate of 120 Hz.

  17. Accelerator reliability workshop

    Energy Technology Data Exchange (ETDEWEB)

    Hardy, L; Duru, Ph; Koch, J M; Revol, J L; Van Vaerenbergh, P; Volpe, A M; Clugnet, K; Dely, A; Goodhew, D

    2002-07-01

    About 80 experts attended this workshop, which brought together all accelerator communities: accelerator-driven systems, X-ray sources, medical and industrial accelerators, spallation source projects (American and European), nuclear physics, etc. With newly proposed accelerator applications such as nuclear waste transmutation and the replacement of nuclear power plants, reliability has now become a number-one priority for accelerator designers. Every part of an accelerator facility, from cryogenic systems to RF systems to data storage, is concerned by reliability. This aspect is now taken into account in the design/budget phase, especially for projects whose goal is to reach no more than 10 interruptions per year. This document gathers the slides but not the proceedings of the workshop.

  18. Human Reliability Program Workshop

    Energy Technology Data Exchange (ETDEWEB)

    Landers, John; Rogers, Erin; Gerke, Gretchen

    2014-05-18

    A Human Reliability Program (HRP) is designed to protect national security as well as worker and public safety by continuously evaluating the reliability of those who have access to sensitive materials, facilities, and programs. Some elements of a site HRP include systematic (1) supervisory reviews, (2) medical and psychological assessments, (3) management evaluations, (4) personnel security reviews, and (5) training of HRP staff and critical positions. Over the years of implementing an HRP, the Department of Energy (DOE) has faced various challenges and overcome obstacles. During this 4-day activity, participants will examine programs that mitigate threats to nuclear security and the insider threat, including HRP, Nuclear Security Culture (NSC) Enhancement, and Employee Assistance Programs. The focus will be to develop an understanding of the need for a systematic HRP and to discuss challenges and best practices associated with mitigating the insider threat.

  19. Reliability and construction control

    Directory of Open Access Journals (Sweden)

    Sherif S. AbdelSalam

    2016-06-01

    The goal of this study was to determine the most reliable and efficient combination of design and construction methods required for vibro piles. For a wide range of static and dynamic formulas, the reliability-based resistance factors were calculated using the EGYPT database, which houses load test results for 318 piles. The analysis was extended to introduce a construction control factor that determines the variation between the pile nominal capacities calculated using static versus dynamic formulae. Among the major outcomes, the lowest coefficient of variation is associated with Davisson’s criterion, and the resistance factors calculated for the AASHTO method are relatively high compared with other methods. Additionally, the CPT-Nottingham and Schmertmann method provided the most economic design. Recommendations related to a pile construction control factor were also presented, and it was found that utilizing the factor can significantly reduce variations between calculated and actual capacities.
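
    Calibration of this kind starts from the bias statistics of measured-to-predicted pile capacities. The sketch below uses hypothetical numbers and a deliberately simplified FOSM-style closed form (lognormal resistance, deterministic load), not the study's actual procedure:

    ```python
    # Sketch: bias statistics from a pile load-test database and a
    # simplified FOSM-style resistance factor. All numbers hypothetical.
    import numpy as np

    measured = np.array([820., 640., 1100., 930., 760.])    # kN, load tests
    predicted = np.array([700., 690., 1000., 850., 900.])   # kN, static formula

    bias = measured / predicted
    lam = bias.mean()                 # mean bias (resistance bias factor)
    cov = bias.std(ddof=1) / lam      # coefficient of variation
    print(f"bias = {lam:.2f}, COV = {cov:.2f}")

    beta = 3.0  # target reliability index
    # Simplified form assuming lognormal resistance, deterministic load:
    phi = lam * np.exp(-beta * cov)
    print(f"resistance factor ~ {phi:.2f}")
    ```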

  20. Scyllac equipment reliability analysis

    International Nuclear Information System (INIS)

    Gutscher, W.D.; Johnson, K.J.

    1975-01-01

    Most of the failures in Scyllac can be related to crowbar trigger cable faults. A new cable has been designed and procured, and is currently undergoing evaluation. When the new cable has been proven, it will be worked into the system as quickly as possible without causing too much additional down time. The cable-tip problem may not be easy, or even desirable, to solve. A tightly fastened permanent connection that maximizes contact area would be more reliable than the plug-in type of connection in use now, but it would make system changes and repairs much more difficult. The remaining failures have such a low occurrence rate that they do not cause much down time, and no major effort is underway to eliminate them. Even though Scyllac was built as an experimental system and has many thousands of components, its reliability is very good. Because of this, the experiment has been able to progress at a reasonable pace