WorldWideScience

Sample records for pre-test probability settings

  1. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  2. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions… and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate… multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions)…

  3. Statistical tests for whether a given set of independent, identically distributed draws comes from a specified probability density.

    Science.gov (United States)

    Tygert, Mark

    2010-09-21

    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
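The cumulative-distribution comparison described above is easy to sketch. The following stdlib Python snippet (a hypothetical illustration of the classical Kolmogorov-Smirnov statistic, not the tests proposed in this paper) computes the largest gap between an empirical CDF and a specified CDF, and shows it separating matched from mismatched draws:

```python
import random

def ks_statistic(draws, cdf):
    """Kolmogorov-Smirnov statistic: the largest gap between the
    empirical CDF of the draws and the specified CDF."""
    xs = sorted(draws)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        d = max(d, abs((i + 1) / n - f), abs(i / n - f))
    return d

random.seed(42)
uniform_draws = [random.random() for _ in range(2000)]
skewed_draws = [u ** 2 for u in uniform_draws]  # true CDF is sqrt(x), not x

d_match = ks_statistic(uniform_draws, lambda x: x)     # small: same distribution
d_mismatch = ks_statistic(skewed_draws, lambda x: x)   # large: wrong distribution
```

As the abstract notes, this statistic can still miss discrepancies confined to regions where the density is small, which is the deficiency the paper's tests address.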

  4. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  5. A Cryogenic Test Set-Up for the Qualification of Pre-Series Test Cells for the LHC Cryogenic Distribution Line

    CERN Document Server

    Livran, J; Parente, C; Riddone, G; Rybkowski, D; Veillet, N

    2000-01-01

    Three pre-series Test Cells of the LHC Cryogenic Distribution Line (QRL) [1], manufactured by three European industrial companies, will be tested in the year 2000 to qualify the design chosen and verify the thermal and mechanical performances. A dedicated test stand (170 m x 13 m) has been built for extensive testing and performance assessment of the pre-series units in parallel. They will be fed with saturated liquid helium at 4.2 K supplied by a mobile helium dewar. In addition, LN2 cooled helium will be used for cool-down and thermal shielding. For each of the three pre-series units, a set of end boxes has been designed and manufactured at CERN. This paper presents the layout of the cryogenic system for the pre-series units, the calorimetric methods as well as the results of the thermal calculation of the end box test.

  6. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how… the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work… is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions…

  7. Frequency formats, probability formats, or problem structure? A test of the nested-sets hypothesis in an extensional reasoning task

    Directory of Open Access Journals (Sweden)

    William P. Neace

    2008-02-01

Five experiments addressed a controversy in the probability judgment literature that centers on the efficacy of framing probabilities as frequencies. The natural frequency view predicts that frequency formats attenuate errors, while the nested-sets view predicts that highlighting the set-subset structure of the problem reduces error, regardless of problem format. This study tested these predictions using a conjunction task. Previous studies reporting that frequency formats reduced conjunction errors confounded reference class with problem format. After controlling this confound, the present study's findings show that conjunction errors can be reduced using either a probability or a frequency format, that frequency effects depend upon the presence of a reference class, and that frequency formats do not promote better statistical reasoning than probability formats.

  8. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test. The method minimizes the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, subject to the restriction that the sum of the error probabilities is a pre-assigned constant, in order to find the optimal sample size. Finally, this optimal sample size is compared with that obtained from the fixed-sample-size procedure. The results are applied to the cases where the random variate follows a normal law as well as a Bernoullian law.
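For context, the classical Wald sequential probability ratio test that the abstract's method modifies can be sketched in a few lines. The Bernoulli hypotheses, error targets, and data stream below are illustrative assumptions, not values from the paper:

```python
import math

def sprt_bernoulli(samples, p0, p1, alpha, beta):
    """Wald's SPRT for H0: p = p0 vs H1: p = p1 on a Bernoulli stream.
    alpha / beta are the target type I / type II error probabilities."""
    upper = math.log((1 - beta) / alpha)   # cross above: accept H1
    lower = math.log(beta / (1 - alpha))   # cross below: accept H0
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "no decision", len(samples)

# a fixed stream with 80% successes (illustrative data)
stream = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1] * 5
decision, n_used = sprt_bernoulli(stream, p0=0.5, p1=0.7, alpha=0.05, beta=0.05)
```

The log-likelihood ratio drifts upward under data favoring H1 and terminates as soon as it crosses a Wald boundary, typically well before a fixed-sample test of the same error rates would.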

  9. "I Don't Really Understand Probability at All": Final Year Pre-Service Teachers' Understanding of Probability

    Science.gov (United States)

    Maher, Nicole; Muir, Tracey

    2014-01-01

    This paper reports on one aspect of a wider study that investigated a selection of final year pre-service primary teachers' responses to four probability tasks. The tasks focused on foundational ideas of probability including sample space, independence, variation and expectation. Responses suggested that strongly held intuitions appeared to…

  10. Application of tests of goodness of fit in determining the probability density function for spacing of steel sets in tunnel support system

    Directory of Open Access Journals (Sweden)

    Farnoosh Basaligheh

    2015-12-01

One of the conventional methods for temporary support of tunnels is to use steel sets with shotcrete. The nature of a temporary support system demands a quick installation of its structures. As a result, the spacing between steel sets is not a fixed amount and it can be considered as a random variable. Hence, in the reliability analysis of these types of structures, the selection of an appropriate probability distribution function of spacing of steel sets is essential. In the present paper, the distances between steel sets are collected from an under-construction tunnel and the collected data is used to suggest a proper Probability Distribution Function (PDF) for the spacing of steel sets. The tunnel has two different excavation sections. In this regard, different distribution functions were investigated and three common tests of goodness of fit were used for evaluation of each function for each excavation section. Results from all three methods indicate that the Wakeby distribution function can be suggested as the proper PDF for spacing between the steel sets. It is also noted that, although the probability distribution function for two different tunnel sections is the same, the parameters of PDF for the individual sections are different from each other.

  11. Comparative analysis through probability distributions of a data set

    Science.gov (United States)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, demography etc. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. There are a number of statistical methods available which can help us select the best-fitting model. Some of the graphs display both input data and fitted distributions at the same time, as probability density and cumulative distribution. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution, and to compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to order the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large set of data is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
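As a minimal illustration of the "distance"-based ranking of fitted models described above, the sketch below (hypothetical data and candidate models, stdlib Python only) computes Pearson's chi-square statistic for binned draws under two candidate distributions and orders the fits:

```python
import math
import random

def chi_square_stat(observed, expected):
    """Pearson chi-square 'distance' between observed and expected bin counts."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

random.seed(7)
data = [random.gauss(0.0, 1.0) for _ in range(10_000)]
edges = [-float("inf"), -1.0, 0.0, 1.0, float("inf")]
observed = [sum(lo < x <= hi for x in data) for lo, hi in zip(edges, edges[1:])]

n = len(data)
expected_normal = [n * (norm_cdf(hi) - norm_cdf(lo)) for lo, hi in zip(edges, edges[1:])]
expected_uniform = [n / 4.0] * 4  # equal mass per bin: a deliberately poor model

fits = sorted([("normal", chi_square_stat(observed, expected_normal)),
               ("uniform", chi_square_stat(observed, expected_uniform))],
              key=lambda f: f[1])  # smaller distance = better fit
```

Ranking fitted models by such a statistic is exactly the comparison feature the abstract highlights; in practice the statistic is also compared against the critical value of its reference distribution.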

  12. Pre-test probability risk scores and their use in contemporary management of patients with chest pain: One year stress echo cohort study

    Science.gov (United States)

    Demarco, Daniela Cassar; Papachristidis, Alexandros; Roper, Damian; Tsironis, Ioannis; Byrne, Jonathan; Monaghan, Mark

    2015-01-01

Objectives To compare how patients with chest pain would be investigated based on the two guidelines available to UK cardiologists for the management of patients with stable chest pain: the UK National Institute of Clinical Excellence (NICE) guideline published in 2010 and the European Society of Cardiology (ESC) guideline published in 2013. Both guidelines use pre-test probability risk scores to guide the choice of investigation. Design We undertook a large retrospective study to investigate the outcomes of stress echocardiography. Setting A large tertiary centre in the UK in contemporary clinical practice. Participants Two thirds of the patients in the cohort were referred from our rapid access chest pain clinics. Results We found that the NICE risk score overestimates risk by 20% compared to the ESC risk score. We also found that, based on the NICE guidelines, 44% of the patients presenting with chest pain in this cohort would have been investigated invasively with diagnostic coronary angiography. Using the ESC guidelines, only 0.3% of the patients would be investigated invasively. Conclusion The large discrepancy between the two guidelines could easily be reduced if NICE adopted the ESC risk score. PMID:26673458

  13. DESIGN OF STRUCTURAL ELEMENTS IN THE EVENT OF THE PRE-SET RELIABILITY, REGULAR LOAD AND BEARING CAPACITY DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    Tamrazyan Ashot Georgievich

    2012-10-01

Accurate and adequate description of external influences and of the bearing capacity of the structural material requires the employment of probability theory methods. In this regard, a characteristic that describes the probability of failure-free operation is required. This reliability characteristic means that the maximum stress caused by the action of the load will not exceed the bearing capacity. In this paper, the author presents a solution to the problem of structural design, namely, the identification of the reliability of pre-set design parameters, in particular, cross-sectional dimensions. If the load distribution pattern is available, employment of the regularities of the distribution functions makes it possible to find the pattern of distribution of maximum stresses over the structure. Similarly, we can proceed to the design of structures of pre-set rigidity, reliability and stability in the case of regular load distribution. We consider a design element (a monolithic concrete slab) whose maximum stress S depends linearly on the load q. Within a pre-set period of time, exceedances of the pre-set values follow the Poisson law. The analysis demonstrates that the variability of the bearing capacity produces a stronger effect on the relative cross-sectional dimensions of a slab than the variability of loads. It is therefore particularly important to reduce the coefficient of variation of the load capacity. One of the methods contemplates truncating the bearing capacity distribution by pre-culling the construction material.

  14. Test the Overall Significance of p-values by Using Joint Tail Probability of Ordered p-values as Test Statistic

    OpenAIRE

    Fang, Yongxiang; Wit, Ernst

    2008-01-01

Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, it is obvious that Fisher’s statistic is more sensitive to smaller p-values than to larger ones, and a single small p-value may overrule the other p-values and decide the test result. This is, in some cases, viewed as a flaw. In order to overcome this flaw and improve the power of the test, the joint tail probability of a set of p-values is proposed as a ...
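Fisher's combined statistic referred to above is X = -2 Σ ln p_i, which follows a chi-square distribution with 2k degrees of freedom under the global null. A minimal sketch (the example p-values are hypothetical) that also reproduces the sensitivity flaw the abstract describes:

```python
import math

def fisher_combined(pvalues):
    """Fisher's method: X = -2 * sum(ln p_i) ~ chi-square with 2k df
    under the global null; returns (statistic, combined p-value)."""
    k = len(pvalues)
    stat = -2.0 * sum(math.log(p) for p in pvalues)
    # chi-square survival function has a closed form for even df = 2k:
    # P(X > x) = exp(-x/2) * sum_{j=0}^{k-1} (x/2)^j / j!
    half = stat / 2.0
    term, total = 1.0, 1.0
    for j in range(1, k):
        term *= half / j
        total += term
    return stat, math.exp(-half) * total

stat, combined_p = fisher_combined([0.08, 0.12, 0.30])  # moderate joint evidence
_, dominated_p = fisher_combined([1e-6, 0.9, 0.9])      # one tiny p dominates
```

The second call shows the flaw: despite two p-values near 1, the single tiny p-value drives the combined result to be highly significant, which is what motivates statistics based on the joint tail probability of the ordered p-values.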

  15. Set-up of a pre-test mock-up experiment in preparation for the HCPB Breeder Unit mock-up experimental campaign

    Energy Technology Data Exchange (ETDEWEB)

    Hernández, F., E-mail: francisco.hernandez@kit.edu [Karlsruhe Institute of Technology (KIT), Institute for Neutron Physics and Reactor Technology (INR) (Germany); Kolb, M. [Karlsruhe Institute of Technology (KIT), Institute for Applied Materials (IAM-WPT) (Germany); Ilić, M.; Kunze, A. [Karlsruhe Institute of Technology (KIT), Institute for Neutron Physics and Reactor Technology (INR) (Germany); Németh, J. [KFKI Research Institute for Particle and Nuclear Physics (Hungary); Weth, A. von der [Karlsruhe Institute of Technology (KIT), Institute for Neutron Physics and Reactor Technology (INR) (Germany)

    2013-10-15

Highlights: ► As preparation for the HCPB-TBM Breeder Unit out-of-pile testing campaign, a pre-test experiment (PREMUX) has been prepared and is described. ► A new heater system based on a wire heater matrix has been developed to imitate the neutronic volumetric heating, and it is compared with conventional plate heaters. ► The test section is described, and preliminary thermal results with the available models are presented, to be benchmarked against PREMUX. ► The integration of PREMUX in the air cooling loop L-STAR/LL at the Karlsruhe Institute of Technology is shown and future steps are discussed. -- Abstract: The complexity of the experimental set-up for testing a full-scale Breeder Unit (BU) mock-up for the European Helium Cooled Pebble Bed Test Blanket Module (HCPB-TBM) has motivated the construction of a pre-test mock-up experiment (PREMUX) consisting of a slice of the BU in the Li₄SiO₄ region. This pre-test aims at verifying the feasibility of the methods to be used for the subsequent testing of the full-scale BU mock-up. Key parameters needed for the modeling of the breeder material are also to be determined by the Hot Wire Method (HWM). The modeling tools for the thermo-mechanics of the pebble beds and for the mock-up structure are to be calibrated and validated as well. This paper presents the setting-up of PREMUX in the L-STAR/LL facility at the Karlsruhe Institute of Technology. A key requirement of the experiments is to mimic the neutronic volumetric heating. A new heater concept is discussed and compared to several conventional heater configurations with respect to the estimated temperature distribution in the pebble beds. The design and integration of the thermocouple system in the heater matrix and pebble beds is also described, as well as other key aspects of the mock-up (dimensions, layout, cooling system, purge gas line, boundary conditions and integration in the test facility). The adequacy of these methods for the full-scale BU

  16. Neutrosophic Probability, Set, And Logic (first version)

    OpenAIRE

    Smarandache, Florentin

    2000-01-01

This project is a part of a National Science Foundation interdisciplinary project proposal. Starting from a new viewpoint in philosophy, neutrosophy, one extends the classical "probability theory", "fuzzy set" and "fuzzy logic" to "neutrosophic probability", "neutrosophic set", and "neutrosophic logic", respectively. They are useful in artificial intelligence, neural networks, evolutionary programming, neutrosophic dynamic systems, and quantum mechanics.

  17. Interpreting results of cluster surveys in emergency settings: is the LQAS test the best option?

    Science.gov (United States)

    Bilukha, Oleg O; Blanton, Curtis

    2008-12-09

Cluster surveys are commonly used in humanitarian emergencies to measure health and nutrition indicators. Deitchler et al. have proposed to use Lot Quality Assurance Sampling (LQAS) hypothesis testing in cluster surveys to classify the prevalence of global acute malnutrition as exceeding or not exceeding the pre-established thresholds. Field practitioners and decision-makers must clearly understand the meaning and implications of using this test in interpreting survey results to make programmatic decisions. We demonstrate that the LQAS test, as proposed by Deitchler et al., is prone to producing false-positive results and thus is likely to suggest interventions in situations where interventions may not be needed. As an alternative, to provide more useful information for decision-making, we suggest reporting the probability of an indicator's exceeding the threshold as a direct measure of "risk". Such probability can be easily determined in field settings by using a simple spreadsheet calculator. The "risk" of exceeding the threshold can then be considered in the context of other aggravating and protective factors to make informed programmatic decisions.
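The spreadsheet-style "risk" calculation suggested above can be approximated in a few lines. The sketch below assumes a normal approximation to the survey estimate; the prevalence estimate, standard error, and threshold are hypothetical values, not figures from the paper:

```python
import math

def prob_exceeds_threshold(estimate, std_error, threshold):
    """Probability that the true prevalence exceeds the threshold,
    treating the survey estimate as normally distributed (an assumption)."""
    z = (threshold - estimate) / std_error
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# hypothetical survey: malnutrition estimated at 11% (SE 1.5%), threshold 10%
risk = prob_exceeds_threshold(estimate=0.11, std_error=0.015, threshold=0.10)
```

Reporting this probability directly, rather than a binary exceeds/does-not-exceed classification, lets decision-makers weigh the "risk" against other aggravating and protective factors, as the authors recommend.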

  18. On calculating the probability of a set of orthologous sequences

    Directory of Open Access Journals (Sweden)

    Junfeng Liu

    2009-02-01

Junfeng Liu, Liang Chen, Hongyu Zhao, Dirk F Moore, Yong Lin, Weichung Joe Shih (The Cancer Institute of New Jersey; University of Medicine and Dentistry of New Jersey; University of Southern California; Yale University School of Medicine). Abstract: Probabilistic DNA sequence models have been intensively applied to genome research. Within the evolutionary biology framework, this article investigates the feasibility of rigorously estimating the probability of a set of orthologous DNA sequences which evolve from a common progenitor. We propose Monte Carlo integration algorithms to sample the unknown ancestral and/or root sequences a posteriori, conditional on a reference sequence, and apply pairwise Needleman–Wunsch alignment between the sampled and nonreference species sequences to estimate the probability. We test our algorithms on both simulated and real sequences and compare the calculated probabilities from Monte Carlo integration to those induced by single multiple alignment. Keywords: evolution, Jukes–Cantor model, Monte Carlo integration, Needleman–Wunsch alignment, orthologous

  19. Test Review for Preschool-Wide Evaluation Tool (PreSET) Manual: Assessing Universal Program-Wide Positive Behavior Support in Early Childhood

    Science.gov (United States)

    Rodriguez, Billie Jo

    2013-01-01

    The Preschool-Wide Evaluation Tool (PreSET; Steed & Pomerleau, 2012) is published by Paul H. Brookes Publishing Company in Baltimore, MD. The PreSET purports to measure universal and program-wide features of early childhood programs' implementation fidelity of program-wide positive behavior intervention and support (PW-PBIS) and is,…

  20. Interpreting results of cluster surveys in emergency settings: is the LQAS test the best option?

    Directory of Open Access Journals (Sweden)

    Blanton Curtis

    2008-12-01

Abstract Cluster surveys are commonly used in humanitarian emergencies to measure health and nutrition indicators. Deitchler et al. have proposed to use Lot Quality Assurance Sampling (LQAS) hypothesis testing in cluster surveys to classify the prevalence of global acute malnutrition as exceeding or not exceeding the pre-established thresholds. Field practitioners and decision-makers must clearly understand the meaning and implications of using this test in interpreting survey results to make programmatic decisions. We demonstrate that the LQAS test, as proposed by Deitchler et al., is prone to producing false-positive results and thus is likely to suggest interventions in situations where interventions may not be needed. As an alternative, to provide more useful information for decision-making, we suggest reporting the probability of an indicator's exceeding the threshold as a direct measure of "risk". Such probability can be easily determined in field settings by using a simple spreadsheet calculator. The "risk" of exceeding the threshold can then be considered in the context of other aggravating and protective factors to make informed programmatic decisions.

  1. Generation, combination and extension of random set approximations to coherent lower and upper probabilities

    International Nuclear Information System (INIS)

    Hall, Jim W.; Lawry, Jonathan

    2004-01-01

    Random set theory provides a convenient mechanism for representing uncertain knowledge including probabilistic and set-based information, and extending it through a function. This paper focuses upon the situation when the available information is in terms of coherent lower and upper probabilities, which are encountered, for example, when a probability distribution is specified by interval parameters. We propose an Iterative Rescaling Method (IRM) for constructing a random set with corresponding belief and plausibility measures that are a close outer approximation to the lower and upper probabilities. The approach is compared with the discrete approximation method of Williamson and Downs (sometimes referred to as the p-box), which generates a closer approximation to lower and upper cumulative probability distributions but in most cases a less accurate approximation to the lower and upper probabilities on the remainder of the power set. Four combination methods are compared by application to example random sets generated using the IRM
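A minimal illustration of the p-box idea attributed above to Williamson and Downs: when a distribution parameter is only known to lie in an interval, pointwise lower and upper CDF bounds can be formed. The normal-with-interval-mean example below is a hypothetical sketch, not the paper's Iterative Rescaling Method:

```python
import math

def norm_cdf(x, mu, sigma):
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def p_box(x, mu_lo, mu_hi, sigma):
    """Pointwise lower/upper CDF bounds (a p-box) for a normal whose
    mean is only known to lie in the interval [mu_lo, mu_hi]."""
    lower = norm_cdf(x, mu_hi, sigma)  # the largest mean pushes the CDF down
    upper = norm_cdf(x, mu_lo, sigma)  # the smallest mean pushes the CDF up
    return lower, upper

lo, hi = p_box(0.5, mu_lo=0.0, mu_hi=1.0, sigma=1.0)
```

Every normal with mean in [mu_lo, mu_hi] has a CDF lying between these two bounds at each x, which is the sense in which a p-box outer-approximates the family of distributions consistent with the interval parameter.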

  2. ANALYSIS OF EFFECTIVENESS OF METHODOLOGICAL SYSTEM FOR PROBABILITY AND STOCHASTIC PROCESSES COMPUTER-BASED LEARNING FOR PRE-SERVICE ENGINEERS

    Directory of Open Access Journals (Sweden)

    E. Chumak

    2015-04-01

The author substantiates that only methodological systems for teaching mathematical disciplines that implement information and communication technologies (ICT) can meet the requirements of the modern educational paradigm and increase educational efficiency. This motivates the need, underlined in the paper, to develop a methodology for computer-based learning of the theory of probability and stochastic processes for pre-service engineers. The results of an experimental study analyzing the efficiency of this methodological system are shown. The analysis includes three main stages: ascertaining, searching and forming. The key criteria of the efficiency of the designed methodological system are the level of students' probabilistic and stochastic skills and their learning motivation. The effect of implementing the methodological system on the level of students' IT literacy is shown in the paper, and the expanding range of objectives for which students apply ICT is described. The level of formation of students' learning motivation at the ascertaining and forming stages of the experiment is analyzed, and the level of intrinsic learning motivation of pre-service engineers is determined at these stages. For this purpose, a methodology for testing students' learning motivation in the chosen specialty is presented. An increase in the intrinsic learning motivation of the experimental group students (E group) relative to the control group students (C group) is demonstrated.

  3. 40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.

    Science.gov (United States)

    2010-07-01

Title 40, Protection of Environment, § 1065.520 (2010-07-01): Pre-test verification procedures and pre-test data collection. "… corrective action does not resolve the deficiency, you may request to use the contaminated system as an…"

  4. Teaching Probability to Pre-Service Teachers with Argumentation Based Science Learning Approach

    Science.gov (United States)

    Can, Ömer Sinan; Isleyen, Tevfik

    2016-01-01

    The aim of this study is to explore the effects of the argumentation based science learning (ABSL) approach on the teaching probability to pre-service teachers. The sample of the study included 41 students studying at the Department of Elementary School Mathematics Education in a public university during the 2014-2015 academic years. The study is…

  5. Comparison of patient comprehension of rapid HIV pre-test fundamentals by information delivery format in an emergency department setting

    Directory of Open Access Journals (Sweden)

    Clark Melissa A

    2007-09-01

Abstract Background Two trials were conducted to compare emergency department patient comprehension of rapid HIV pre-test information using different methods to deliver this information. Methods Patients were enrolled for these two trials at a US emergency department between February 2005 and January 2006. In Trial One, patients were randomized to a no pre-test information arm or an in-person discussion arm. In Trial Two, a separate group of patients were randomized to an in-person discussion arm or a Tablet PC-based video arm. The video, "Do you know about rapid HIV testing?", and the in-person discussion contained identical Centers for Disease Control and Prevention-suggested pre-test information components as well as information on rapid HIV testing with OraQuick®. Participants were compared by information arm on their comprehension of the pre-test information by their score on a 26-item questionnaire using the Wilcoxon rank-sum test. Results In Trial One, 38 patients completed the no-information arm and 31 completed the in-person discussion arm. Of these 69 patients, 63.8% had twelve years or fewer of formal education and 66.7% had previously been tested for HIV. The mean score on the questionnaire for the in-person discussion arm was higher than for the no-information arm (18.7 vs. 13.3, p ≤ 0.0001). In Trial Two, 59 patients completed the in-person discussion arm and 55 completed the video arm. Of these 114 patients, 50.9% had twelve years or fewer of formal education and 68.4% had previously been tested for HIV. The mean score on the questionnaire for the video arm was similar to that for the in-person discussion arm (20.0 vs. 19.2; p ≤ 0.33). Conclusion The video "Do you know about rapid HIV testing?" appears to be an acceptable substitute for an in-person pre-test discussion on rapid HIV testing with OraQuick®. In terms of adequately informing ED patients about rapid HIV testing, either form of pre-test information is preferable…

  6. Test the Overall Significance of p-values by Using Joint Tail Probability of Ordered p-values as Test Statistic

    NARCIS (Netherlands)

    Fang, Yongxiang; Wit, Ernst

    2008-01-01

Fisher’s combined probability test is the most commonly used method to test the overall significance of a set of independent p-values. However, it is obvious that Fisher’s statistic is more sensitive to smaller p-values than to larger ones, and a single small p-value may overrule the other p-values

  7. Simplified Freeman-Tukey test statistics for testing probabilities in ...

    African Journals Online (AJOL)

This paper presents a simplified version of the Freeman-Tukey test statistic for testing hypotheses about multinomial probabilities in one-, two- and multi-dimensional contingency tables that does not require calculating the expected cell frequencies before the test of significance. The simplified method established new criteria of ...

  8. Testing the statistical compatibility of independent data sets

    International Nuclear Information System (INIS)

    Maltoni, M.; Schwetz, T.

    2003-01-01

We discuss a goodness-of-fit method which tests the compatibility between statistically independent data sets. The method gives sensible results even in cases where the χ² minima of the individual data sets are very low or when several parameters are fitted to a large number of data points. In particular, it avoids the problem that a possible disagreement between data sets becomes diluted by data points which are insensitive to the crucial parameters. A formal derivation of the probability distribution function for the proposed test statistic is given, based on standard theorems of statistics. The application of the method is illustrated on data from neutrino oscillation experiments, and its complementarity to the standard goodness-of-fit is discussed

  9. Testing for variation in taxonomic extinction probabilities: a suggested methodology and some results

    Science.gov (United States)

    Conroy, M.J.; Nichols, J.D.

    1984-01-01

    Several important questions in evolutionary biology and paleobiology involve sources of variation in extinction rates. In all cases of which we are aware, extinction rates have been estimated from data in which the probability that an observation (e.g., a fossil taxon) will occur is related both to extinction rates and to what we term encounter probabilities. Any statistical method for analyzing fossil data should at a minimum permit separate inferences on these two components. We develop a method for estimating taxonomic extinction rates from stratigraphic range data and for testing hypotheses about variability in these rates. We use this method to estimate extinction rates and to test the hypothesis of constant extinction rates for several sets of stratigraphic range data. The results of our tests support the hypothesis that extinction rates varied over the geologic time periods examined. We also present a test that can be used to identify periods of high or low extinction probabilities and provide an example using Phanerozoic invertebrate data. Extinction rates should be analyzed using stochastic models, in which it is recognized that stratigraphic samples are random variates and that sampling is imperfect

  10. The prediction and probability for successful completion in medical study based on tests and pre-admission grades

    Czech Academy of Sciences Publication Activity Database

    Štuka, Č.; Martinková, Patrícia; Zvára, Karel; Zvárová, Jana

    2012-01-01

    Roč. 28, č. 2 (2012), s. 138-152 ISSN 1732-6729 R&D Projects: GA MŠk(CZ) 1M06014 Institutional research plan: CEZ:AV0Z10300504 Keywords : education * admission criteria * pre-admission grades * admission test * medical study Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.149, year: 2012 http://www.educationalrev.us.edu.pl/volume28.htm

  11. Pre-Service Mathematics Teachers' Use of Probability Models in Making Informal Inferences about a Chance Game

    Science.gov (United States)

    Kazak, Sibel; Pratt, Dave

    2017-01-01

    This study considers probability models as tools for both making informal statistical inferences and building stronger conceptual connections between data and chance topics in teaching statistics. In this paper, we aim to explore pre-service mathematics teachers' use of probability models for a chance game, where the sum of two dice matters in…

  12. Wald Sequential Probability Ratio Test for Space Object Conjunction Assessment

    Science.gov (United States)

    Carpenter, James R.; Markley, F Landis

    2014-01-01

    This paper shows how satellite owner/operators may use sequential estimates of collision probability, along with a prior assessment of the base risk of collision, in a compound hypothesis ratio test to inform decisions concerning collision risk mitigation maneuvers. The compound hypothesis test reduces to a simple probability ratio test, which appears to be a novel result. The test satisfies tolerances related to targeted false alarm and missed detection rates. This result is independent of the method one uses to compute the probability density that one integrates to compute collision probability. A well-established test case from the literature shows that this test yields acceptable results within the constraints of a typical operational conjunction assessment decision timeline. Another example illustrates the use of the test in a practical conjunction assessment scenario based on operations of the International Space Station.
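The record does not include the authors' algorithm, but a generic Wald sequential probability ratio test, with decision thresholds derived from the targeted false-alarm (α) and missed-detection (β) rates, can be sketched as follows; the Bernoulli observations here are an illustrative stand-in for whatever evidence stream is being accumulated:

```python
import math

def wald_sprt(observations, p0, p1, alpha=0.05, beta=0.05):
    """Wald's SPRT for H0: p = p0 vs. H1: p = p1 on 0/1 observations
    (an assumed, illustrative setting; not the paper's collision model)."""
    upper = math.log((1.0 - beta) / alpha)   # crossing -> accept H1
    lower = math.log(beta / (1.0 - alpha))   # crossing -> accept H0
    llr = 0.0  # running log-likelihood ratio
    for x in observations:
        llr += math.log((p1 if x else 1.0 - p1) / (p0 if x else 1.0 - p0))
        if llr >= upper:
            return "accept H1"
        if llr <= lower:
            return "accept H0"
    return "continue sampling"
```

The thresholds log((1-β)/α) and log(β/(1-α)) are Wald's classical approximations, which is how the test "satisfies tolerances related to targeted false alarm and missed detection rates."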

  13. Approximation of Measurement Results of “Emergency” Signal Reception Probability

    Directory of Open Access Journals (Sweden)

    Gajda Stanisław

    2017-08-01

    Full Text Available The intended aim of this article is to present approximation results of the exemplary measurements of EMERGENCY signal reception probability. The probability is understood as a distance function between the aircraft and a ground-based system under established conditions. The measurements were approximated using the properties of logistic functions. This probability, as a distance function, enables determination of the range of the EMERGENCY signal for a pre-set confidence level.
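As a hedged illustration of the approach described, a decreasing logistic model of reception probability versus distance can be inverted to read off the signal range at a pre-set confidence level (the parameter names and values below are assumptions, not the measured ones):

```python
import math

def reception_probability(d, d50, k):
    """Decreasing logistic model of reception probability vs. distance d;
    equals 0.5 at distance d50, with steepness k (assumed parameters)."""
    return 1.0 / (1.0 + math.exp(k * (d - d50)))

def signal_range(confidence, d50, k):
    """Distance out to which reception probability stays >= confidence,
    obtained by inverting the logistic function."""
    return d50 + math.log(1.0 / confidence - 1.0) / k
```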

  14. Return to work after cancer and pre-cancer job dissatisfaction

    DEFF Research Database (Denmark)

    Heinesen, Eskil; Kolodziejczyk, Christophe; Ladenburg, Jacob

    2017-01-01

    We investigate the association between pre-cancer job dissatisfaction and return-to-work probability 3 years after a cancer diagnosis. We use a Danish data set combining administrative data and a survey of breast and colon cancer survivors. We find that the return-to-work probability is negatively correlated with pre-cancer job dissatisfaction with mental demands (where the correlation is driven by the high-educated) and with physical demands and the superior (where the correlation is driven by the low-educated). Educational gradients in the probability of returning to work after cancer are not significantly affected by controlling for pre-cancer job dissatisfaction and pre-cancer ability to work.

  15. Healthy incentive scheme in the Irish full-day-care pre-school setting.

    LENUS (Irish Health Repository)

    Molloy, C Johnston

    2013-12-16

    A pre-school offering a full-day-care service provides for children aged 0-5 years for more than 4 h/d. Researchers have called for studies that will provide an understanding of nutrition and physical activity practices in this setting. Obesity prevention in pre-schools, through the development of healthy associations with food and health-related practices, has been advocated. While guidelines for the promotion of best nutrition and health-related practice in the early years' setting exist in a number of jurisdictions, associated regulations have been noted to be poor, with the environment of the child-care facility mainly evaluated for safety. Much cross-sectional research outlines poor nutrition and physical activity practice in this setting. However, there are few published environmental and policy-level interventions targeting the child-care provider with, to our knowledge, no evidence of such interventions in Ireland. The aim of the present paper is to review international guidelines and recommendations relating to health promotion best practice in the pre-school setting: service and resource provision; food service and food availability; and the role and involvement of parents in pre-schools. Intervention programmes and assessment tools available to measure such practice are outlined; and insight is provided into an intervention scheme, formulated from available best practice, that was introduced into the Irish full-day-care pre-school setting.

  16. Probability and Statistics The Science of Uncertainty (Revised Edition)

    CERN Document Server

    Tabak, John

    2011-01-01

    Probability and Statistics, Revised Edition deals with the history of probability, describing the modern concept of randomness and examining "pre-probabilistic" ideas of what most people today would characterize as randomness. This revised book documents some historically important early uses of probability to illustrate some very important probabilistic questions. It goes on to explore statistics and the generations of mathematicians and non-mathematicians who began to address problems in statistical analysis, including the statistical structure of data sets as well as the theory of

  17. Pre-employment medical testing in Brazil: ethical challenges.

    Science.gov (United States)

    Palhares, Dario; Laurentino dos Santos, Ivone

    2012-01-01

    Pre-employment medical tests, considered to be a practice within the subspecialty of occupational medicine, are ordered by physicians on behalf of employers. Candidates for a job may be rejected if they are found to suffer from a condition that can be worsened by the job, or one that may put other workers at risk. As the physician who orders pre-employment tests is chosen by the employer, pre-employment tests can violate both the autonomy and the privacy of the individual. This paper discusses ethical conflicts inherent in pre-employment medical testing.

  18. Estimation of component failure probability from masked binomial system testing data

    International Nuclear Information System (INIS)

    Tan Zhibin

    2005-01-01

    The component failure probability estimates from analysis of binomial system testing data are very useful because they reflect the operational failure probability of components in the field, which is similar to the test environment. In practice, this type of analysis is often confounded by the problem of data masking: the status of tested components is unknown. Methods that account for this type of uncertainty are usually computationally intensive and not practical for complex systems. In this paper, we consider masked binomial system testing data and develop a probabilistic model to efficiently estimate component failure probabilities. In the model, all system tests are classified into test categories based on component coverage. Component coverage of test categories is modeled by a bipartite graph. Test category failure probabilities conditional on the status of covered components are defined. An EM algorithm to estimate component failure probabilities is developed based on a simple but powerful concept: equivalent failures and tests. By simulation we not only demonstrate the convergence and accuracy of the algorithm but also show that the probabilistic model is capable of analyzing systems in series, parallel and any other user-defined structures. A case study illustrates an application in test case prioritization

  19. The realistic performance achievable with mycobacterial automated culture systems in high and low prevalence settings

    Directory of Open Access Journals (Sweden)

    Klatser Paul R

    2010-04-01

    Full Text Available Abstract Background Diagnostic tests are generally used in situations with similar pre-test probability of disease to where they were developed. When these tests are applied in situations with very different pre-test probabilities of disease, it is informative to model the likely implications of known characteristics of test performance in the new situation. This is the case for automated Mycobacterium tuberculosis (MTB liquid culture systems for tuberculosis case detection which were developed and are widely used in low burden settings but are only beginning to be applied on a large scale in high burden settings. Methods Here we model the performance of MTB liquid culture systems in high and low tuberculosis (TB prevalence settings using detailed published data concentrating on the likely frequency of cross-contamination events. Results Our model predicts that as the TB prevalence in the suspect population increases there is an exponential increase in the risk of MTB cross-contamination events expected in otherwise negative samples, even with equivalent technical performance of the laboratories. Quality control and strict cross-contamination measures become increasingly critical as the burden of MTB infection among TB suspects increases. Even under optimal conditions the realistically achievable specificity of these systems in high burden settings will likely be significantly below that obtained in low TB burden laboratories. Conclusions Liquid culture systems can play a valuable role in TB case detection in laboratories in high burden settings, but laboratory workers, policy makers and clinicians should be aware of the increased risks, independent of laboratory proficiency, of cross-contamination events in high burden settings.

  20. Light Duty Utility Arm system pre-operational (cold test) test plan

    International Nuclear Information System (INIS)

    Bennett, K.L.

    1995-01-01

    The Light Duty Utility Arm (LDUA) Cold Test Facility, located in the Hanford 400 Area, will be used to support cold testing (pre-operational tests) of LDUA subsystems. Pre-operational testing is composed of subsystem development testing and rework activities, and integrated system qualification testing. Qualification testing will be conducted once development work is complete and documentation is under configuration control. Operational (hot) testing of the LDUA system will follow the testing covered in this plan and will be covered in a separate test plan

  1. Estimation of post-test probabilities by residents: Bayesian reasoning versus heuristics?

    Science.gov (United States)

    Hall, Stacey; Phang, Sen Han; Schaefer, Jeffrey P; Ghali, William; Wright, Bruce; McLaughlin, Kevin

    2014-08-01

    Although the process of diagnosing invariably begins with a heuristic, we encourage our learners to support their diagnoses by analytical cognitive processes, such as Bayesian reasoning, in an attempt to mitigate the effects of heuristics on diagnosing. There are, however, limited data on the use and impact of Bayesian reasoning on the accuracy of disease probability estimates. In this study our objective was to explore whether Internal Medicine residents use a Bayesian process to estimate disease probabilities by comparing their disease probability estimates to literature-derived Bayesian post-test probabilities. We gave 35 Internal Medicine residents four clinical vignettes in the form of a referral letter and asked them to estimate the post-test probability of the target condition in each case. We then compared these to literature-derived probabilities. For each vignette the estimated probability was significantly different from the literature-derived probability. For the two cases with low literature-derived probability our participants significantly overestimated the probability of these target conditions being the correct diagnosis, whereas for the two cases with high literature-derived probability the estimated probability was significantly lower than the calculated value. Our results suggest that residents generate inaccurate post-test probability estimates. Possible explanations for this include ineffective application of Bayesian reasoning, attribute substitution whereby a complex cognitive task is replaced by an easier one (e.g., a heuristic), or systematic rater bias, such as central tendency bias. Further studies are needed to identify the reasons for inaccuracy of disease probability estimates and to explore ways of improving accuracy.
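The literature-derived Bayesian calculation that the residents were compared against is mechanical: convert the pre-test probability to odds, multiply by the test's likelihood ratio, and convert back. A minimal sketch with illustrative numbers (not the vignettes' actual values):

```python
def positive_likelihood_ratio(sensitivity, specificity):
    """LR+ = sensitivity / (1 - specificity)."""
    return sensitivity / (1.0 - specificity)

def post_test_probability(pre_test_prob, likelihood_ratio):
    """Bayesian update via odds: post-test odds = pre-test odds * LR."""
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# Illustrative case: 20% pre-test probability, a test with 90%
# sensitivity and 90% specificity (LR+ = 9).
p = post_test_probability(0.20, positive_likelihood_ratio(0.90, 0.90))
```

Note how a positive result on a good test still leaves the post-test probability well below certainty when the pre-test probability is low, which is exactly the regime where the study's participants overestimated.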

  2. Extreme points of the convex set of joint probability distributions with ...

    Indian Academy of Sciences (India)

    Here we address the following problem: If G is a standard ... convex set of all joint probability distributions on the product Borel space (X1 × X2, F1 ⊗ F2) which ... cannot be identically zero when X and Y vary in A1 and u and v vary in H2. Thus.

  3. Concurrency meets probability: theory and practice (abstract)

    NARCIS (Netherlands)

    Katoen, Joost P.

    Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between

  4. Sequential Probability Ratio Tests: Conservative and Robust

    NARCIS (Netherlands)

    Kleijnen, J.P.C.; Shi, Wen

    2017-01-01

    In practice, most computers generate simulation outputs sequentially, so it is attractive to analyze these outputs through sequential statistical methods such as sequential probability ratio tests (SPRTs). We investigate several SPRTs for choosing between two hypothesized values for the mean output

  5. HIV pre-test information, discussion or counselling? A review of guidance relevant to the WHO European Region.

    Science.gov (United States)

    Bell, Stephen A; Delpech, Valerie; Raben, Dorthe; Casabona, Jordi; Tsereteli, Nino; de Wit, John

    2016-02-01

    In the context of a shift from exceptionalism to normalisation, this study examines recommendations/evidence in current pan-European/global guidelines regarding pre-test HIV testing and counselling practices in health care settings. It also reviews new research not yet included in guidelines. There is consensus that verbal informed consent must be gained prior to testing, individually, in private, confidentially, in the presence of a health care provider. All guidelines recommend pre-test information/discussion delivered verbally or via other methods (information sheet). There is agreement about a minimum standard of information to be provided before a test, but guidelines differ regarding discussion about issues encouraging patients to think about implications of the result. There is heavy reliance on expert consultation in guideline development. Referenced scientific evidence is often more than ten years old and based on US/UK research. Eight new papers are reviewed. Current HIV testing and counselling guidelines have inconsistencies regarding the extent and type of information that is recommended during pre-test discussions. The lack of new research underscores a need for new evidence from a range of European settings to support the process of expert consultation in guideline development. © The Author(s) 2015.

  6. Cost effectiveness of medical devices to diagnose pre-eclampsia in low-resource settings

    Directory of Open Access Journals (Sweden)

    Zoë M. McLaren

    Full Text Available Background: Maternal mortality remains a major health challenge facing developing countries, with pre-eclampsia accounting for up to 17% of maternal deaths. Diagnosis requires skilled health providers and devices that are appropriate for low-resource settings. This study presents the first cost-effectiveness analysis of multiple medical devices used to diagnose pre-eclampsia in low- and middle-income countries (LMICs). Methods: Blood pressure and proteinuria measurement devices, identified from compendia for LMICs, were included. We developed a decision tree framework to assess the cost-effectiveness of each device using parameter values that reflect the general standard of care based on a survey of relevant literature and expert opinion. We examined the sensitivity of our results using one-way and second-order probabilistic multivariate analyses. Results: Because the disability-adjusted life years (DALYs) averted for each device were very similar, the results were influenced by the per-use cost ranking. The most cost-effective device combination was a semi-automatic blood pressure measurement device and a visually read urine strip test, with the lowest combined per-use cost of $0.2004 and an incremental cost-effectiveness ratio of $93.6 per DALY gained relative to a baseline with no access to diagnostic devices. When access to treatment is limited, it is more cost-effective to improve access to treatment than to increase testing rates or diagnostic device sensitivity. Conclusions: Our findings were not sensitive to changes in device sensitivity; however, they were sensitive to changes in the testing rate and treatment rate. Furthermore, our results suggest that simple devices are more cost-effective than complex devices. The results underscore the desirability of two design features for LMICs: ease of use and accuracy without calibration. Our findings have important implications for policy makers, health economists, health care providers and
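The core arithmetic in such an analysis is the incremental cost-effectiveness ratio: extra cost divided by extra health effect relative to a baseline. A minimal sketch (the figures below are hypothetical, not the study's):

```python
def icer(cost_new, cost_base, effect_new, effect_base):
    """Incremental cost-effectiveness ratio: extra cost per extra
    unit of health effect (e.g. per DALY averted)."""
    return (cost_new - cost_base) / (effect_new - effect_base)

# Hypothetical figures: a device strategy costing $450 that averts
# 5.0 DALYs vs. a baseline costing $100 that averts 1.0 DALY.
ratio = icer(450.0, 100.0, 5.0, 1.0)  # $87.5 per DALY averted
```

A strategy is then judged by comparing this ratio against a willingness-to-pay threshold per DALY.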

  7. 40 CFR 89.406 - Pre-test procedures.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Pre-test procedures. 89.406 Section 89.406 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED... Procedures § 89.406 Pre-test procedures. (a) Allow a minimum of 30 minutes warmup in the standby or operating...

  8. Pre-set extrusion bioprinting for multiscale heterogeneous tissue structure fabrication.

    Science.gov (United States)

    Kang, Donggu; Ahn, Geunseon; Kim, Donghwan; Kang, Hyun-Wook; Yun, Seokhwan; Yun, Won-Soo; Shim, Jin-Hyung; Jin, Songwan

    2018-06-06

    Recent advances in three-dimensional bioprinting technology have led to various attempts in fabricating human tissue-like structures. However, current bioprinting technologies have limitations for creating native tissue-like structures. To resolve these issues, we developed a new pre-set extrusion bioprinting technique that can create heterogeneous, multicellular, and multimaterial structures simultaneously. The key to this ability lies in the use of a precursor cartridge that can stably preserve a multimaterial with a pre-defined configuration that can be simply embedded in a syringe-based printer head. The multimaterial can be printed and miniaturized through a micro-nozzle without conspicuous deformation according to the pre-defined configuration of the precursor cartridge. Using this system, we fabricated heterogeneous tissue-like structures such as spinal cords, hepatic lobule, blood vessels, and capillaries. We further obtained a heterogeneous patterned model that embeds HepG2 cells with endothelial cells in a hepatic lobule-like structure. In comparison with homogeneous and heterogeneous cell printing, the heterogeneous patterned model showed a well-organized hepatic lobule structure and higher enzyme activity of CYP3A4. Therefore, this pre-set extrusion bioprinting method could be widely used in the fabrication of a variety of artificial and functional tissues or organs.

  9. The realistic performance achievable with mycobacterial automated culture systems in high and low prevalence settings

    NARCIS (Netherlands)

    van Kampen, S.C.; Anthony, R.M.; Klatser, P.R.

    2010-01-01

    Background: Diagnostic tests are generally used in situations with similar pre-test probability of disease to where they were developed. When these tests are applied in situations with very different pre-test probabilities of disease, it is informative to model the likely implications of known

  10. The realistic performance achievable with mycobacterial automated culture systems in high and low prevalence settings

    NARCIS (Netherlands)

    van Kampen, Sanne C.; Anthony, Richard M.; Klatser, Paul R.

    2010-01-01

    Diagnostic tests are generally used in situations with similar pre-test probability of disease to where they were developed. When these tests are applied in situations with very different pre-test probabilities of disease, it is informative to model the likely implications of known characteristics

  11. Reconciliation of Decision-Making Heuristics Based on Decision Trees Topologies and Incomplete Fuzzy Probabilities Sets.

    Science.gov (United States)

    Doubravsky, Karel; Dohnal, Mirko

    2015-01-01

    Complex decision making tasks of different natures, e.g. economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision-making economists/engineers are usually not willing to invest too much time into the study of complex formal theories. They require decisions that can be (re)checked by human-like common-sense reasoning. One important problem related to realistic decision making tasks is incomplete data sets required by the chosen decision making algorithm. This paper presents a relatively simple algorithm by which some missing III (input information items) can be generated using mainly decision tree topologies and integrated into incomplete data sets. The algorithm is based on an easy-to-understand heuristic, e.g. a longer decision tree sub-path is less probable. This heuristic can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. In practice, however, isolated information items, e.g. some vaguely known probabilities (e.g. fuzzy probabilities), are usually available. It means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles topology-related heuristics and additional fuzzy sets using fuzzy linear programming. The case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail.
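The paper's heuristic ("a longer decision tree sub-path is less probable") can be encoded in many ways; the halving-per-level weighting below is purely an assumption chosen for illustration, not the authors' fuzzy-linear-programming formulation:

```python
def leaf_probabilities(depths):
    """Toy encoding of 'a longer sub-path is less probable': weight each
    leaf by 2**-depth (an assumed rule), then normalize to sum to 1."""
    weights = [2.0 ** -d for d in depths]
    total = sum(weights)
    return [w / total for w in weights]

# Leaves at depths 1, 2 and 2: the shallow leaf gets twice the mass
# of each deeper leaf.
probs = leaf_probabilities([1, 2, 2])
```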

  12. Reconciliation of Decision-Making Heuristics Based on Decision Trees Topologies and Incomplete Fuzzy Probabilities Sets.

    Directory of Open Access Journals (Sweden)

    Karel Doubravsky

    Full Text Available Complex decision making tasks of different natures, e.g. economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision-making economists/engineers are usually not willing to invest too much time into the study of complex formal theories. They require decisions that can be (re)checked by human-like common-sense reasoning. One important problem related to realistic decision making tasks is incomplete data sets required by the chosen decision making algorithm. This paper presents a relatively simple algorithm by which some missing III (input information items) can be generated using mainly decision tree topologies and integrated into incomplete data sets. The algorithm is based on an easy-to-understand heuristic, e.g. a longer decision tree sub-path is less probable. This heuristic can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. In practice, however, isolated information items, e.g. some vaguely known probabilities (e.g. fuzzy probabilities), are usually available. It means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles topology-related heuristics and additional fuzzy sets using fuzzy linear programming. The case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail.

  13. Estimation and prediction of maximum daily rainfall at Sagar Island using best fit probability models

    Science.gov (United States)

    Mandal, S.; Choudhury, B. U.

    2015-07-01

    Sagar Island, situated on the continental shelf of the Bay of Bengal, is one of the deltas most vulnerable to extreme rainfall-driven climatic hazards. Information on the probability of occurrence of maximum daily rainfall will be useful in devising risk management for sustaining the rainfed agrarian economy vis-a-vis food and livelihood security. Using six probability distribution models and long-term (1982-2010) daily rainfall data, we studied the probability of occurrence of annual, seasonal and monthly maximum daily rainfall (MDR) in the island. To select the best-fit distribution models for the annual, seasonal and monthly time series, based on maximum rank with minimum value of test statistics, three statistical goodness-of-fit tests were employed: the Kolmogorov-Smirnov test (K-S), the Anderson-Darling test (A²) and the Chi-square test (χ²). The best-fit probability distribution was then identified from the highest overall score obtained across the three goodness-of-fit tests. Results revealed that the normal probability distribution was best fitted for annual, post-monsoon and summer season MDR, while Lognormal, Weibull and Pearson 5 were best fitted for the pre-monsoon, monsoon and winter seasons, respectively. The estimated annual MDR were 50, 69, 86, 106 and 114 mm for return periods of 2, 5, 10, 20 and 25 years, respectively. The probabilities of an annual MDR of >50, >100, >150, >200 and >250 mm were estimated as 99, 85, 40, 12 and 3% levels of exceedance, respectively. The monsoon, summer and winter seasons exhibited comparatively higher probabilities (78 to 85%) for MDR of >100 mm and moderate probabilities (37 to 46%) for >150 mm. For different recurrence intervals, the percent probability of MDR varied widely across intra- and inter-annual periods. In the island, rainfall anomaly can pose a climatic threat to the sustainability of agricultural production and thus needs adequate adaptation and mitigation measures.
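For the normal-distribution case reported above, a return-period value is simply the quantile at non-exceedance probability 1 - 1/T. A sketch using Python's standard library (the μ and σ values in the tests are illustrative, not the fitted ones):

```python
from statistics import NormalDist

def return_period_value(mu, sigma, T):
    """Rainfall value expected to be exceeded once every T years under
    a fitted normal distribution (mu, sigma assumed for illustration)."""
    return NormalDist(mu, sigma).inv_cdf(1.0 - 1.0 / T)

def exceedance_probability(mu, sigma, threshold):
    """Probability that the annual maximum daily rainfall exceeds
    the given threshold."""
    return 1.0 - NormalDist(mu, sigma).cdf(threshold)
```

The T = 2 return value equals the median (here the mean, by symmetry of the normal), matching the pattern in the estimates above where longer return periods give larger rainfall values.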

  14. Error probabilities in default Bayesian hypothesis testing

    NARCIS (Netherlands)

    Gu, Xin; Hoijtink, Herbert; Mulder, J.

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for

  15. Pre-test analysis results of a PWR steel lined pre-stressed concrete containment model

    International Nuclear Information System (INIS)

    Basha, S.M.; Ghosh, Barnali; Patnaik, R.; Ramanujam, S.; Singh, R.K.; Kushwaha, H.S.; Venkat Raj, V.

    2000-02-01

    Pre-stressed concrete nuclear containment serves as the ultimate barrier against the release of radioactivity to the environment. This ultimate barrier must be checked for its ultimate load-carrying capacity. BARC participated in a Round Robin analysis activity, co-sponsored by Sandia National Laboratory, USA, and Nuclear Power Engineering Corporation, Japan, for the pre-test prediction of a 1:4 scale Pre-stressed Concrete Containment Vessel. The in-house finite element code ULCA was used to make the test predictions of displacements and strains at the standard output locations. The present report focuses on the important landmarks of the pre-test results, in sequential terms of first crack appearance, loss of pre-stress, first through-thickness crack, rebar and liner yielding, and finally liner tearing at the ultimate load. Global and local failure modes of the containment have been obtained from the analysis. Finally, the sensitivity of the numerical results to different types of liners and different constitutive models, in terms of bond strength between concrete and steel and tension-stiffening parameters, is examined. The report highlights the important features which could be observed during the test, and guidelines are given for improving the prediction in the post-test computation after the test data are available. (author)

  16. The predictive validity of the BioMedical Admissions Test for pre-clinical examination performance.

    Science.gov (United States)

    Emery, Joanne L; Bell, John F

    2009-06-01

    Some medical courses in the UK have many more applicants than places and almost all applicants have the highest possible previous and predicted examination grades. The BioMedical Admissions Test (BMAT) was designed to assist in the student selection process specifically for a number of 'traditional' medical courses with clear pre-clinical and clinical phases and a strong focus on science teaching in the early years. It is intended to supplement the information provided by examination results, interviews and personal statements. This paper reports on the predictive validity of the BMAT and its predecessor, the Medical and Veterinary Admissions Test. Results from the earliest 4 years of the test (2000-2003) were matched to the pre-clinical examination results of those accepted onto the medical course at the University of Cambridge. Correlation and logistic regression analyses were performed for each cohort. Section 2 of the test ('Scientific Knowledge') correlated more strongly with examination marks than did Section 1 ('Aptitude and Skills'). It also had a stronger relationship with the probability of achieving the highest examination class. The BMAT and its predecessor demonstrate predictive validity for the pre-clinical years of the medical course at the University of Cambridge. The test identifies important differences in skills and knowledge between candidates, not shown by their previous attainment, which predict their examination performance. It is thus a valid source of additional admissions information for medical courses with a strong scientific emphasis when previous attainment is very high.

  17. Return to work after cancer and pre-cancer job dissatisfaction

    OpenAIRE

    Heinesen, Eskil; Kolodziejczyk, Christophe; Ladenburg, Jacob; Andersen, Ingelise; Thielen, Karsten

    2017-01-01

    We investigate the association between pre-cancer job dissatisfaction and the probability of returning to work 3 years after a cancer diagnosis. We use a Danish data set combining administrative data and a survey of breast and colon cancer survivors. We find that the return-to-work probability has a negative correlation with pre-cancer job dissatisfaction with mental demands (where the correlation is driven by the high-educated) and with physical demands and the superior (where the correlation is driven...

  18. Conditional Probabilities in the Excursion Set Theory. Generic Barriers and non-Gaussian Initial Conditions

    CERN Document Server

    De Simone, Andrea; Riotto, Antonio

    2011-01-01

    The excursion set theory, where density perturbations evolve stochastically with the smoothing scale, provides a method for computing the dark matter halo mass function. The computation of the mass function is mapped into the so-called first-passage time problem in the presence of a moving barrier. The excursion set theory is also a powerful formalism to study other properties of dark matter halos such as halo bias, accretion rate, formation time, merging rate and the formation history of halos. This is achieved by computing conditional probabilities with non-trivial initial conditions, and the conditional two-barrier first-crossing rate. In this paper we use the recently-developed path integral formulation of the excursion set theory to calculate analytically these conditional probabilities in the presence of a generic moving barrier, including the one describing the ellipsoidal collapse, and for both Gaussian and non-Gaussian initial conditions. The non-Markovianity of the random walks induced by non-Gaussian...
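As a rough numerical illustration of the first-passage problem this record maps the mass function onto, the sketch below simulates Markovian random walks (Gaussian increments, as for a sharp-k filter) and estimates the fraction that first cross a constant barrier by a given smoothing variance. The barrier value, step size, and walk count are illustrative assumptions; the paper's moving-barrier and non-Gaussian cases are not treated.

```python
import random

def first_crossing_distribution(barrier=1.686, n_steps=400, dS=0.01,
                                n_walks=5000, seed=1):
    """Monte Carlo estimate of the smoothing variance S at which Markovian
    random walks delta(S) first cross a constant barrier. Each step adds an
    independent Gaussian increment of variance dS."""
    random.seed(seed)
    crossings = []
    for _ in range(n_walks):
        delta = 0.0
        for step in range(1, n_steps + 1):
            delta += random.gauss(0.0, dS ** 0.5)
            if delta >= barrier:
                crossings.append(step * dS)  # first-crossing "time" S
                break
    return crossings

crossings = first_crossing_distribution()
# fraction of walks that cross by S = n_steps * dS = 4
fraction_crossed = len(crossings) / 5000
```

For a constant barrier this fraction can be checked against the reflection-principle result 2[1 − Φ(barrier/√S)], roughly 0.40 for the values above.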

  19. Pre-test evaluation of LLTR series II Test A-7

    International Nuclear Information System (INIS)

    Knittle, D.

    1981-03-01

    The purpose of this report is to present pre-test predictions of pressure histories for the A-7 test to be conducted in the Large Leak Test Rig (LLTR) at the Energy Technology Engineering Center (ETEC) in April 1981.

  20. Wald Sequential Probability Ratio Test for Analysis of Orbital Conjunction Data

    Science.gov (United States)

    Carpenter, J. Russell; Markley, F. Landis; Gold, Dara

    2013-01-01

    We propose a Wald Sequential Probability Ratio Test for analysis of commonly available predictions associated with spacecraft conjunctions. Such predictions generally consist of a relative state and relative state error covariance at the time of closest approach, under the assumption that prediction errors are Gaussian. We show that under these circumstances, the likelihood ratio of the Wald test reduces to an especially simple form, involving the current best estimate of collision probability, and a similar estimate of collision probability that is based on prior assumptions about the likelihood of collision.

  1. 10 CFR 26.65 - Pre-access drug and alcohol testing.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Pre-access drug and alcohol testing. 26.65 Section 26.65... § 26.65 Pre-access drug and alcohol testing. (a) Purpose. This section contains pre-access testing... days. If an individual has negative results from drug and alcohol tests that were conducted under the...

  2. A new MCNP™ test set

    International Nuclear Information System (INIS)

    Brockhoff, R.C.; Hendricks, J.S.

    1994-09-01

    The MCNP test set is used to test the MCNP code after installation on various computer platforms. For MCNP4 and MCNP4A this test set included 25 test problems designed to test as many features of the MCNP code as possible. A new and better test set has been devised to increase coverage of the code from 85% to 97% with 28 problems. The new test set is as fast as and shorter than the MCNP4A test set. The authors describe the methodology for devising the new test set, the features that were not covered in the MCNP4A test set, and the changes in the MCNP4A test set that have been made for MCNP4B and its developmental versions. Finally, new bugs uncovered by the new test set and a compilation of all known MCNP4A bugs are presented

  3. Comments on the sequential probability ratio testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Racz, A. [Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics

    1996-07-01

    In this paper the classical sequential probability ratio testing method (SPRT) is reconsidered. Every individual boundary-crossing event of the SPRT is regarded as a new piece of evidence about the problem under hypothesis testing. The Bayes method is applied for belief updating, i.e. integrating these individual decisions. The procedure is recommended when the user (1) would like to be informed about the tested hypothesis continuously and (2) would like to reach the final conclusion with a high confidence level. (Author).

  4. Bayesian noninferiority test for 2 binomial probabilities as the extension of Fisher exact test.

    Science.gov (United States)

    Doi, Masaaki; Takahashi, Fumihiro; Kawasaki, Yohei

    2017-12-30

    Noninferiority trials have recently gained importance for the clinical trials of drugs and medical devices. In these trials, most statistical methods have been used from a frequentist perspective, and historical data have been used only for the specification of the noninferiority margin Δ > 0. In contrast, Bayesian methods, which have been studied recently, are advantageous in that they can use historical data to specify prior distributions and are expected to enable more efficient decision making than frequentist methods by borrowing information from historical trials. In the case of noninferiority trials for response probabilities π₁, π₂, Bayesian methods evaluate the posterior probability of H₁: π₁ > π₂ − Δ being true. To numerically calculate such posterior probability, a complicated Appell hypergeometric function or approximation methods are used. Further, the theoretical relationship between Bayesian and frequentist methods is unclear. In this work, we give the exact expression of the posterior probability of noninferiority under some mild conditions and propose a Bayesian noninferiority test framework that can flexibly incorporate historical data by using the conditional power prior. Further, we show the relationship between the Bayesian posterior probability and the P value of the Fisher exact test. From this relationship, our method can be interpreted as the Bayesian noninferiority extension of the Fisher exact test, and we can treat superiority and noninferiority in the same framework. Our method is illustrated through Monte Carlo simulations to evaluate the operating characteristics, an application to real HIV clinical trial data, and a sample size calculation using historical data. Copyright © 2017 John Wiley & Sons, Ltd.
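The posterior probability in question can be approximated without special functions by plain Monte Carlo. The sketch below uses independent uniform Beta(1,1) priors rather than the paper's conditional power prior, and the counts and margin are invented for illustration.

```python
import random

def posterior_prob_noninferior(x1, n1, x2, n2, delta=0.1,
                               n_draws=100000, seed=0):
    """Approximate P(pi1 > pi2 - delta | data) for two binomial samples
    (x1 successes of n1, x2 of n2) under independent Beta(1,1) priors,
    by sampling from the two Beta posteriors."""
    random.seed(seed)
    hits = 0
    for _ in range(n_draws):
        p1 = random.betavariate(1 + x1, 1 + n1 - x1)  # posterior of pi1
        p2 = random.betavariate(1 + x2, 1 + n2 - x2)  # posterior of pi2
        if p1 > p2 - delta:
            hits += 1
    return hits / n_draws

# hypothetical trial: 42/50 responders on test arm, 45/50 on control
prob = posterior_prob_noninferior(42, 50, 45, 50, delta=0.1)
```

A large value of `prob` (e.g. above 0.975) would support a claim of noninferiority under this simplified prior choice.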

  5. Impact of proof test interval and coverage on probability of failure of safety instrumented function

    International Nuclear Information System (INIS)

    Jin, Jianghong; Pang, Lei; Hu, Bin; Wang, Xiaodong

    2016-01-01

    Highlights: • Introduction of proof test coverage makes the calculation of the probability of failure for a SIF more accurate. • The probability of failure undetected by the proof test is independently defined as P_TIF and calculated. • P_TIF is quantified using a reliability block diagram and the simple formula for PFD_avg. • Improving proof test coverage and adopting a reasonable test period can reduce the probability of failure for a SIF. - Abstract: Imperfection of the proof test can result in safety function failure of a safety instrumented system (SIS) at any time in its life period. IEC 61508 and other references ignored or only elementarily analyzed the imperfection of the proof test. In order to further study the impact of proof test imperfection on the probability of failure for a safety instrumented function (SIF), the necessity of the proof test and the influence of its imperfection on system performance were first analyzed theoretically. The probability of failure for the safety instrumented function resulting from the imperfection of the proof test was defined as the probability of test-independent failures (P_TIF), and P_TIF was calculated separately by introducing proof test coverage and adopting a reliability block diagram, with reference to the simplified calculation formula for the average probability of failure on demand (PFD_avg). Research results show that a shorter proof test period and higher proof test coverage give a smaller probability of failure for the safety instrumented function. The probability of failure for the safety instrumented function calculated by introducing proof test coverage will be more accurate.
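The effect of proof test coverage can be illustrated with a commonly quoted simplification of the single-channel average PFD: failures the proof test can reveal accumulate only over the test interval, while the remainder (the P_TIF-like term) persists over the whole lifetime. This is a textbook approximation, not the paper's exact derivation, and the failure rate, coverage, and intervals below are illustrative assumptions.

```python
def pfd_avg(lambda_du, coverage, test_interval_h, lifetime_h):
    """Simplified average probability of failure on demand for a 1oo1 SIF.
    Failures detectable by the proof test contribute over the proof test
    interval; undetectable failures contribute over the whole lifetime."""
    detected = coverage * lambda_du * test_interval_h / 2.0
    undetected = (1.0 - coverage) * lambda_du * lifetime_h / 2.0
    return detected + undetected

# e.g. lambda_DU = 1e-6 /h, 90% coverage, yearly proof test, 20-year lifetime
pfd = pfd_avg(1e-6, 0.9, 8760, 20 * 8760)
```

Even 10% of failures escaping the proof test dominates the result here, which is the qualitative point of the record above.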

  6. Imperfection detection probability at ultrasonic testing of reactor vessels

    International Nuclear Information System (INIS)

    Kazinczy, F. de; Koernvik, L.Aa.

    1980-02-01

    The report is a lecture given at a symposium organized by the Swedish nuclear power inspectorate in February 1980. Equipment, calibration and testing procedures are reported. The estimation of defect detection probability for ultrasonic tests and the reliability of literature data are discussed. Practical testing of reactor vessels and welded joints is described. Swedish test procedures are compared with those of other countries. Series of test data for welded joints of the OKG-2 reactor are presented. Recommendations for future testing procedures are made. (GBn)

  7. Log-concave Probability Distributions: Theory and Statistical Testing

    DEFF Research Database (Denmark)

    An, Mark Yuing

    1996-01-01

    This paper studies the broad class of log-concave probability distributions that arise in the economics of uncertainty and information. For univariate, continuous, and log-concave random variables we prove useful properties without imposing the differentiability of density functions. Discrete and multivariate distributions are also discussed. We propose simple non-parametric testing procedures for log-concavity. The test statistics are constructed to test one of the two implications of log-concavity: increasing hazard rates and the new-is-better-than-used (NBU) property. The tests for increasing hazard rates are based on normalized spacings of the sample order statistics. The tests for the NBU property fall into the category of Hoeffding's U-statistics.

  8. Patient perspectives with abbreviated versus standard pre-test HIV counseling in the prenatal setting: a randomized-controlled, non-inferiority trial.

    Science.gov (United States)

    Cohan, Deborah; Gomez, Elvira; Greenberg, Mara; Washington, Sierra; Charlebois, Edwin D

    2009-01-01

    In the US, an unacceptably high percentage of pregnant women do not undergo prenatal HIV testing. Previous studies have found increased uptake of prenatal HIV testing with abbreviated pre-test counseling; however, little is known about patient decision making, testing satisfaction and knowledge in this setting. A randomized-controlled, non-inferiority trial was conducted from October 2006 through February 2008 at San Francisco General Hospital (SFGH), the public teaching hospital of the City and County of San Francisco. A total of 278 English- and Spanish-speaking pregnant women were randomized to receive either abbreviated or standard nurse-performed HIV test counseling at the initial prenatal visit. The patient decision-making experience was compared between the abbreviated and standard HIV counseling strategies among a sample of low-income, urban, ethnically diverse prenatal patients. The primary outcome was the decisional conflict score (DCS) using O'Connor's low-literacy scale; secondary outcomes included satisfaction with the test decision, basic HIV knowledge and HIV testing uptake. We conducted an intention-to-treat analysis of 278 women: 134 (48.2%) in the abbreviated arm (AA) and 144 (51.8%) in the standard arm (SA). There was no significant difference in the proportion of women with low decisional conflict (71.6% in AA vs. 76.4% in SA, p = .37), and the observed mean difference between the groups of 3.88 (95% CI: -0.65, 8.41) did not exceed the non-inferiority margin. HIV testing uptake was very high (97.8%) and did not differ significantly between the two groups (99.3% in AA vs. 96.5% in SA, p = .12). Likewise, there was no difference in satisfaction with the testing decision (97.8% in AA vs. 99.3% in SA, p = .36). However, women in AA had significantly lower mean HIV knowledge scores (78.4%) than women in SA (83.7%). The abbreviated counseling process, while associated with slightly lower knowledge, does not compromise patient decision making or satisfaction regarding HIV testing.

  9. Clock face drawing test performance in children with ADHD.

    Science.gov (United States)

    Ghanizadeh, Ahmad; Safavi, Salar; Berk, Michael

    2013-01-01

    The utility and discriminatory pattern of the clock face drawing test in ADHD is unclear. This study therefore compared Clock Face Drawing test performance in children with ADHD and controls. 95 school children with ADHD and 191 other children were matched for gender ratio and age. ADHD symptom severities were assessed using the DSM-IV ADHD checklist, and intellectual functioning was assessed. The participants completed three clock-drawing tasks, and the following four functions were assessed: Contour score, Numbers score, Hands setting score, and Center score. All the subscale scores of the three clock drawing tests were lower in the ADHD group than in the control group. In ADHD children, inattention and hyperactivity/impulsivity scores were not related to free-drawn clock test scores. When the pre-drawn contour test was performed, the inattentiveness score was statistically associated with the Numbers score, while none of the other variables of age, gender, intellectual functioning, and hand use preference were associated with that score. In the pre-drawn clock, no association of ADHD symptoms with any CDT subscale was significant. In addition, more errors were observed with the free-drawn clock and pre-drawn contour than with the pre-drawn clock. Numbers and Hands setting are more sensitive measures for screening ADHD than Contour and Center drawing. Test performance, except Hands setting, may have already reached a developmental plateau. It is probable that the Hands setting deficit in children with ADHD may not decrease from age 8 to 14 years. Performance of children with ADHD is associated with the complexity of the CDT.

  10. High heat flux tests of the WENDELSTEIN 7-X pre-series target elements

    International Nuclear Information System (INIS)

    Greuner, H.; Boeswirth, B.; Boscary, J.; Plankensteiner, A.; Schedler, B.

    2007-01-01

    The high heat flux (HHF) testing of WENDELSTEIN 7-X pre-series target elements is an indispensable step in the qualification of the manufacturing process. A set of 20 full-scale pre-series elements was manufactured by PLANSEE SE to validate the materials and manufacturing technologies prior to the start of series production. The HHF tests were performed in the ion beam test facility GLADIS. All actively water-cooled elements were tested for about 100 cycles at 10 MW/m² (10-15 s pulse duration). Several elements were loaded with even higher cycle numbers (up to 1000) and heat loads up to 24 MW/m². Hot spots were observed at the edges of several tiles during the HHF tests, indicating local bonding problems of the CFC. The thermo-mechanical behaviour under HHF loading has been evaluated and compared to the FEM predictions. The measured temperatures and strains confirm the chosen FEM approach. This allows component optimisation to achieve a successful series production of the W7-X divertor target elements.

  11. Monte Carlo simulation of the sequential probability ratio test for radiation monitoring

    International Nuclear Information System (INIS)

    Coop, K.L.

    1984-01-01

    A computer program simulates the Sequential Probability Ratio Test (SPRT) using Monte Carlo techniques. The program, SEQTEST, performs random-number sampling of either a Poisson or a normal distribution to simulate radiation monitoring data. The results are given in terms of the detection probabilities and the average time required for a trial. The computed SPRT results can be compared with tabulated single interval test (SIT) values to determine the better statistical test for particular monitoring applications. Use of the SPRT in a hand-and-foot alpha monitor shows that the SPRT provides better detection probabilities while generally requiring less counting time. Calculations are also performed for a monitor where the SPRT is not permitted to take longer than the single interval test. Although the performance of the SPRT is degraded by this restriction, the detection probabilities are still similar to the SIT values, and average counting times are always less than 75% of the SIT time. Some optimal conditions for use of the SPRT are described. The SPRT should be the test of choice in many radiation monitoring situations. 6 references, 8 figures, 1 table
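A Monte Carlo simulation in the spirit of SEQTEST can be sketched in a few lines: simulate unit-time Poisson counts, accumulate the Wald log-likelihood ratio, and stop at the boundaries derived from the chosen error rates. The background and source rates, error rates, and trial count below are illustrative assumptions, not values from the report.

```python
import math
import random

def sprt_poisson(mu0, mu1, true_mu, alpha=0.05, beta=0.05,
                 n_trials=2000, max_steps=1000, seed=7):
    """Simulate Wald's SPRT for H0: Poisson rate mu0 vs H1: rate mu1, on
    unit-time counts drawn at rate true_mu. Returns the fraction of trials
    deciding H1 (detection probability) and the mean trial length."""
    random.seed(seed)
    a = math.log((1 - beta) / alpha)   # accept-H1 boundary
    b = math.log(beta / (1 - alpha))   # accept-H0 boundary
    accept_h1, total_steps = 0, 0
    for _ in range(n_trials):
        llr, steps = 0.0, 0
        while b < llr < a and steps < max_steps:
            # draw one Poisson count by CDF inversion (fine for small rates)
            u, k, p = random.random(), 0, math.exp(-true_mu)
            cdf = p
            while u > cdf:
                k += 1
                p *= true_mu / k
                cdf += p
            llr += k * math.log(mu1 / mu0) - (mu1 - mu0)  # per-count LLR
            steps += 1
        total_steps += steps
        if llr >= a:
            accept_h1 += 1
    return accept_h1 / n_trials, total_steps / n_trials

# source present (true rate = mu1): detection probability and average length
det_prob, avg_len = sprt_poisson(mu0=2.0, mu1=4.0, true_mu=4.0)
```

With the nominal error rates above, the detection probability should come out near 1 − β = 0.95, with an average trial length of only a few counting intervals, which is the advantage over a fixed-length test that the record describes.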

  12. Application of the huff model of shopping probability in the selected stores in Prešov (Prešov, the Slovak Republic

    Directory of Open Access Journals (Sweden)

    Mitríková Jana

    2015-01-01

    The main objective of this article is the calculation of the Huff Model and the comparison of the obtained data with the questionnaire survey results. We then assess the validity of this theoretical model of shopping probability for practical use, as well as its overall objectivity. The Huff Model of shopping probability is calculated for nine large-scale retail stores in fifteen local regions within the area of the third largest town in the Slovak Republic, Prešov. These results are compared with the results of a questionnaire survey conducted from November 2013 until March 2014 on a sample of 1,096 respondents. The questionnaire addressed customers' preferences for various large-scale stores as well as their place of residence. Based on the calculation results and the results obtained from the questionnaire survey, cartographic outputs were created, pointing out overall consistency as well as differences in the results.
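The gravity formulation behind the Huff Model is simple to sketch: the probability that a consumer patronises store j is that store's attractiveness-over-distance utility divided by the sum over all stores. The sales areas, distances, and exponents below are invented for illustration, not the article's Prešov data.

```python
def huff_probabilities(attractiveness, distances, alpha=1.0, beta=2.0):
    """Huff model for one origin zone: P_j = (S_j^alpha / d_j^beta) /
    sum_k (S_k^alpha / d_k^beta), where S_j is store attractiveness
    (e.g. sales area) and d_j the travel distance to store j."""
    utilities = [(s ** alpha) / (d ** beta)
                 for s, d in zip(attractiveness, distances)]
    total = sum(utilities)
    return [u / total for u in utilities]

# three hypothetical stores: sales areas (m^2) and distances (km)
probs = huff_probabilities([5000, 2000, 8000], [2.0, 1.0, 4.0])
```

Note how the nearby mid-sized store wins under beta = 2: distance decay can outweigh raw floor space, which is exactly the kind of pattern the article compares against its questionnaire results.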

  13. Assessment of clinical utility of 18F-FDG PET in patients with head and neck cancer: a probability analysis

    International Nuclear Information System (INIS)

    Goerres, Gerhard W.; Mosna-Firlejczyk, Katarzyna; Schulthess, Gustav K. von; Steurer, Johann; Bachmann, Lucas M.

    2003-01-01

    The purpose of this study was to calculate disease probabilities based on data from patients with head and neck cancer in the register of our institution and to perform a systematic review of the available data on the accuracy of PET in the primary assessment and follow-up of patients with head and neck cancer. The pre-test probability of head and neck cancer among patients in our institutional data registry was assessed. Then the published literature was selected and appraised according to a standard protocol for systematic reviews. Two reviewers independently selected and extracted data on study characteristics, quality and accuracy. Accuracy data were used to form 2 x 2 contingency tables and were pooled to produce summary receiver operating characteristic (ROC) curves and summary likelihood ratios for positive and negative testing. Finally, post-test probabilities were calculated on the basis of the pre-test probabilities of this patient group. All patients had cytologically or histologically proven cancer. The prevalence of additional lymph node metastases on PET in staging examinations was 19.6% (11/56), and that of locoregional recurrence on restaging PET was 28.6% (12/42). In the primary assessment of patients, PET had positive and negative likelihood ratios of 3.9 (2.56-5.93) and 0.24 (0.14-0.41), respectively. Disease probabilities were therefore 49.4% for a positive test result and 5.7% for a negative test result. In the assessment of recurrence these values were 3.96 (2.8-5.6) and 0.16 (0.1-0.25), resulting in probabilities of 49.7% and 3.8%. PET evaluation for involvement of lymph nodes had positive and negative likelihood ratios of 17.26 (10.9-27.3) and 0.19 (0.13-0.27) for primary assessment and 11.0 (2.93-41.24) and 0.14 (0.01-1.88) for detection of recurrence. The probabilities were 81.2% and 4.5% for primary assessment and 73.3% and 3.4% for assessment of recurrence. It is concluded that in this clinical setting the main advantage of PET is the…
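The post-test probabilities in this record follow from Bayes' theorem in odds form: convert the pre-test probability to odds, multiply by the likelihood ratio, and convert back. The sketch below approximately reproduces the primary-assessment figures from a pre-test probability of 19.6% with LR+ 3.9 and LR− 0.24 (small differences from the reported 49.4% and 5.7% are rounding in the source).

```python
def post_test_probability(pre_test_prob, likelihood_ratio):
    """Bayes' theorem in odds form: post_odds = pre_odds * LR,
    then convert the posterior odds back to a probability."""
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

p_pos = post_test_probability(0.196, 3.9)   # positive PET, primary assessment
p_neg = post_test_probability(0.196, 0.24)  # negative PET, primary assessment
```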

  14. Binomial Test Method for Determining Probability of Detection Capability for Fracture Critical Applications

    Science.gov (United States)

    Generazio, Edward R.

    2011-01-01

    The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that, for a minimum flaw size and all greater flaw sizes, there is 0.90 probability of detection with 95% confidence (90/95 POD). Directed design of experiments for probability of detection (DOEPOD) has been developed to provide an efficient and accurate methodology that yields estimates of POD and confidence bounds for both hit-miss and signal-amplitude testing, where signal amplitudes are reduced to hit-miss by using a signal threshold. Directed DOEPOD uses a nonparametric approach for the analysis of inspection data that does not require any assumptions about the particular functional form of a POD function. The DOEPOD procedure identifies, for a given sample set, whether or not the minimum requirement of 0.90 probability of detection with 95% confidence is demonstrated for a minimum flaw size and for all greater flaw sizes (90/95 POD). The DOEPOD procedures are executed sequentially in order to minimize the number of samples needed to demonstrate that there is a 90/95 POD lower confidence bound at a given flaw size and that the POD is monotonic for flaw sizes exceeding that 90/95 POD flaw size. The conservativeness of the DOEPOD methodology results is discussed. Validated guidelines for binomial estimation of POD for fracture-critical inspection are established.
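The 90/95 POD requirement has a classic zero-miss binomial consequence that is easy to sketch: if the true POD were only 0.90, the chance of n consecutive hits is 0.90^n, so the smallest n with 0.90^n ≤ 0.05 demonstrates the 90/95 criterion. This is the textbook special case (the familiar 29-of-29 rule), not the DOEPOD procedure itself.

```python
def min_hits_for_pod(pod=0.90, confidence=0.95):
    """Smallest number n of consecutive hits (zero misses) such that
    observing n/n detections demonstrates the given POD at the given
    confidence, i.e. the smallest n with pod**n <= 1 - confidence."""
    alpha = 1.0 - confidence
    n = 1
    while pod ** n > alpha:
        n += 1
    return n

n_required = min_hits_for_pod()  # the familiar 29-of-29 criterion
```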

  15. Setting visual pre-placement testing in a technology manufacturing environment.

    Science.gov (United States)

    Gowan, Nancy J

    2014-01-01

    Every day we use our eyes to perform activities of daily living and work. Aging changes as well as health conditions can impact an individual's visual function, making it more difficult to accurately perform work activities. Occupational therapists work closely with optometrists and employers to develop ways to accommodate for these changes so that the employee can continue to perform the work tasks. This manuscript outlines a case study of systematically developing visual demands analyses and pre-placement vision screening assessment protocols for individuals completing quality inspection positions. When the vision screening was completed, it was discovered that over 20% of the employees had visual deficits that were correctable. This screening process yielded improved quality results but also identification of previously undetected visual deficits. Further development of vision screening in the workplace is supported.

  16. The importance of pre-planning for large hydrostatic test programs

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Andrew Keith [WorleyParsons Calgary, Calgary, AB (Canada); Wong, Everett Clementi [Enbridge Pipelines Inc., Edmonton, AB (Canada)

    2010-07-01

    During the design phase of a pipeline project, large hydrostatic test programs require water sources to be located and secured. Many companies complete hydrostatic test planning through high-level desktop analysis; however, this technique can result in significant unplanned costs and schedule delays. The aim of this paper is to weigh the cost benefits of pre-planning large hydrostatic test programs against the costs of unplanned delays in the execution of hydrostatic testing. This comparison was based on the successful application of pre-planning to 57 mainline hydrostatic tests in the construction of the Line 4 Extension and Alberta Clipper Expansion oil pipelines by Enbridge Pipelines Inc. Results showed that the costs of delays and uncertainty during construction far outweigh the costs of pre-planning. This study highlights that pre-planning for large hydrostatic test programs should be carried out in the execution of large pipeline projects to ensure success.

  17. Overview of the testing activities on ITER sub-scale pre-compression rings

    Energy Technology Data Exchange (ETDEWEB)

    Rossi, Paolo, E-mail: paolo.rossi@enea.it [Associazione EURATOM-ENEA sulla Fusione, C.R. Frascati, C.P. 65, 00044 Frascati, Rome (Italy); Capobianchi, Mario; Crescenzi, Fabio; Massimi, Alberto; Mugnaini, Giampiero; Pizzuto, Aldo [Associazione EURATOM-ENEA sulla Fusione, C.R. Frascati, C.P. 65, 00044 Frascati, Rome (Italy); Knaster, Juan [ITER Organisation, Route de Vinon sur Verdon, 13115, St. Paul lez Durance (France); Rajainmaki, Hannu [FUSION FOR ENERGY, Josep Pla no. 2, Torres Diagonal Litoral Edificio B3, 08019 Barcelona (Spain)

    2012-08-15

    Highlights: • ENEA developed a high-strength glass fiber-epoxy composite for ITER pre-compression rings. • High UTS values were obtained at RT on linear specimens (2200 MPa) and on scaled ring mock-ups (1550 MPa). • Creep tests showed very low creep strain and creep rates. • Long-term tests showed no significant stress relaxation on the ring mock-ups. - Abstract: After a first R&D and testing activity to develop and characterize, by tensile and creep tests, a high-strength glass fiber-epoxy composite as the reference material for the manufacture of ITER pre-compression rings, ENEA designed and manufactured a dedicated testing facility and different sub-scale composite ring mock-ups in order to characterize their mechanical properties. The paper reports the results of the overall testing activities performed during the last years on a total of eleven sub-scale pre-compression ring mock-ups manufactured by winding S2 glass fibers on a diameter of 1 m (1/5 of full scale), both by vacuum pressure epoxy impregnation (VPI) and by filament wet winding (WW) techniques. The first three rings were manufactured by ENEA Frascati using a particular VPI technique; one of them was used as the base composite material to manufacture different sets of specimens for shear, compression and non-destructive tests (NDT). Five other mock-ups were then manufactured following the ENEA VPI process, and three using the WW technique, by two different industrial companies. The rings were tested at ENEA Frascati in a dedicated hydraulic testing machine consisting of 18 radial actuators working in position control with a total load capability of 1000 tons. The complete testing campaign consisted of six ultimate tensile strength (UTS) tests and four stress relaxation (SR) tests. The tests demonstrated that the composite (S2 glass-epoxy) is a valid and viable solution for the ITER pre-compression rings.

  18. Clock Face Drawing Test Performance in Children with ADHD

    Directory of Open Access Journals (Sweden)

    Ahmad Ghanizadeh

    2013-01-01

    Introduction: The utility and discriminatory pattern of the clock face drawing test in ADHD is unclear. This study therefore compared Clock Face Drawing test performance in children with ADHD and controls. Material & methods: 95 children with ADHD and 191 school children were matched for gender ratio and age. ADHD symptom severities were assessed using the DSM-IV ADHD checklist, and intellectual functioning was assessed. The participants completed three clock-drawing tasks, and the following four functions were assessed: Contour score, Numbers score, Hands setting score, and Center score. Results: All the subscale scores of the three clock drawing tests were lower in the ADHD group than in the control group. In ADHD children, inattention and hyperactivity/impulsivity scores were not related to free-drawn clock test scores. When the pre-drawn contour test was performed, the inattentiveness score was statistically associated with the Numbers score. None of the other variables of age, gender, intellectual functioning, and hand use preference were associated with the Numbers score. In the pre-drawn clock, no association of ADHD symptoms with any CDT subscale was significant. In addition, more errors were observed with the free-drawn clock and pre-drawn contour than with the pre-drawn clock. Conclusion: Numbers and Hands setting are more sensitive measures for screening ADHD than Contour and Center drawing. Test performance, except Hands setting, may have already reached a developmental plateau. It is probable that the Hands setting deficit in children with ADHD may not decrease from age 8 to 14 years. Performance of children with ADHD is associated with the complexity of the CDT.

  19. Further comments on the sequential probability ratio testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Kulacsy, K. [Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics

    1997-05-23

    The Bayesian method for belief updating proposed in Racz (1996) is examined. The interpretation of the belief function introduced therein is found, and the method is compared to the classical binary Sequential Probability Ratio Testing method (SPRT). (author).

  20. Fasting capillary blood glucose: an appropriate measurement in screening for diabetes and pre-diabetes in low-resource rural settings.

    Science.gov (United States)

    Zhao, X; Zhao, W; Zhang, H; Li, J; Shu, Y; Li, S; Cai, L; Zhou, J; Li, Y; Hu, R

    2013-01-01

    To evaluate the efficiency of fasting capillary blood glucose (FCG) measurement as compared with fasting venous plasma glucose (FPG) measurement in screening for diabetes and pre-diabetes in low-resource rural settings. In 2010, 993 participants were randomly selected from 9 villages in Yunnan province using a cluster sampling method. Samples for the FCG and FPG tests were obtained after demographics and physical examination. The oral glucose tolerance test was performed in parallel as the gold standard for diagnosis. The diagnostic capacities of the FCG measurement in predicting undiagnosed diabetes and pre-diabetes were assessed, and the performance of the FCG and FPG tests was compared. Fifty-seven individuals with undiagnosed diabetes and 145 subjects with pre-diabetes were detected. The concordance between FCG and FPG levels was high (r = 0.75). The area under the receiver operating characteristic curve (AUC) for the FCG test in predicting diabetes was 0.88 [95% confidence interval (CI) 0.82-0.93], with an optimal cutoff value of 5.65 mmol/l, sensitivity of 84.2%, and specificity of 79.3%. The corresponding values for the FPG test were 0.92 (95% CI 0.88-0.97) (AUC), 6.51 mmol/l (optimal cutoff point), 82.5% (sensitivity) and 98.3% (specificity), respectively. No significant difference was found in the AUC for the two screening strategies. FCG measurement is considered to be a convenient, practicable screening method in low-resource rural communities with acceptable test properties.

  1. Pre-test evaluation of LLTR Series II Test A-6

    International Nuclear Information System (INIS)

    Knittle, D.

    1980-11-01

    The purpose of this report is to present pre-test predictions of pressure histories for the A6 test to be conducted in the Large Leak Test Facility (LLTF) at the Energy Technology Engineering Center. A6 is part of a test program being conducted to evaluate the effects of leaks produced by a double-ended guillotine rupture of a single tube, and will provide data on the performance of the CRBR prototypical double rupture disc.

  2. Safeguarding a Lunar Rover with Wald's Sequential Probability Ratio Test

    Science.gov (United States)

    Furlong, Michael; Dille, Michael; Wong, Uland; Nefian, Ara

    2016-01-01

    The virtual bumper is a safeguarding mechanism for autonomous and remotely operated robots. In this paper we take a new approach to the virtual bumper system by using an old statistical test. Using a modified version of Wald's sequential probability ratio test, we demonstrate that we can reduce the number of false positives reported by the virtual bumper, thereby saving valuable mission time. We use the sequential probability ratio to control vehicle speed in the presence of possible obstacles in order to increase certainty about whether or not obstacles are present. Our new algorithm reduces the chances of collision by approximately 98% relative to traditional virtual bumper safeguarding without speed control.
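The test named above can be sketched for a simple Bernoulli detector (each sensor reading either reports an obstacle or not). The hypothesis probabilities `p0`, `p1` and the error rates `alpha`, `beta` below are illustrative choices, not the paper's parameters; the thresholds follow Wald's standard approximations:

```python
import math

# Wald's sequential probability ratio test (SPRT) for two Bernoulli
# hypotheses: H0 "no obstacle" (detection rate p0) vs H1 "obstacle"
# (detection rate p1). Observations are processed one at a time and the
# test stops as soon as the log-likelihood ratio crosses a threshold.

def sprt(observations, p0, p1, alpha=0.01, beta=0.01):
    """Return 'H1', 'H0', or 'continue' after processing the observations."""
    a = math.log(beta / (1 - alpha))   # lower threshold: accept H0
    b = math.log((1 - beta) / alpha)   # upper threshold: accept H1
    llr = 0.0
    for x in observations:             # x is 1 (detection) or 0 (no detection)
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= b:
            return "H1"
        if llr <= a:
            return "H0"
    return "continue"                  # evidence so far is inconclusive
```

The "continue" outcome is what motivates the paper's speed control: while the test is undecided, the vehicle slows down and gathers more observations instead of braking for every spurious detection.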

  3. Deriving the probability of a linear opinion pooling method being superior to a set of alternatives

    International Nuclear Information System (INIS)

    Bolger, Donnacha; Houlding, Brett

    2017-01-01

    Linear opinion pools are a common method for combining a set of distinct opinions into a single succinct opinion, often to be used in a decision making task. In this paper we consider a method, termed the Plug-in approach, for determining the weights to be assigned in this linear pool, in a manner that can be deemed as rational in some sense, while incorporating multiple forms of learning over time into its process. The environment that we consider is one in which every source in the pool is herself a decision maker (DM), in contrast to the more common setting in which expert judgments are amalgamated for use by a single DM. We discuss a simulation study that was conducted to show the merits of our technique, and demonstrate how theoretical probabilistic arguments can be used to exactly quantify the probability of this technique being superior (in terms of a probability density metric) to a set of alternatives. Illustrations are given of simulated proportions converging to these true probabilities in a range of commonly used distributional cases. - Highlights: • A novel context for combination of expert opinion is provided. • A dynamic reliability assessment method is stated, justified by properties and a data study. • The theoretical grounding underlying the data-driven justification is explored. • We conclude with areas for expansion and further relevant research.

  4. Limited test data: The choice between confidence limits and inverse probability

    International Nuclear Information System (INIS)

    Nichols, P.

    1975-01-01

    For a unit which has been successfully designed to a high standard of reliability, any test programme of reasonable size will result in only a small number of failures. In these circumstances the failure rate estimated from the tests will depend on the statistical treatment applied. When a large number of units is to be manufactured, an unexpected high failure rate will certainly result in a large number of failures, so it is necessary to guard against optimistic unrepresentative test results by using a confidence limit approach. If only a small number of production units is involved, failures may not occur even with a higher than expected failure rate, and so one may be able to accept a method which allows for the possibility of either optimistic or pessimistic test results, and in this case an inverse probability approach, based on Bayes' theorem, might be used. The paper first draws attention to an apparently significant difference in the numerical results from the two methods, particularly for the overall probability of several units arranged in redundant logic. It then discusses a possible objection to the inverse method, followed by a demonstration that, for a large population and a very reasonable choice of prior probability, the inverse probability and confidence limit methods give the same numerical result. Finally, it is argued that a confidence limit approach is overpessimistic when a small number of production units is involved, and that both methods give the same answer for a large population. (author)
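The comparison the paper draws can be made concrete for the zero-failure case. The formulas below are the standard binomial confidence bound and the Beta-posterior bound under a uniform prior; this is a sketch of the generic result, not the paper's exact treatment:

```python
# Upper bounds on a unit's failure probability after n failure-free tests.

def classical_upper(n, conf=0.95):
    """One-sided upper confidence limit: solve (1 - p)**n = 1 - conf for p."""
    return 1 - (1 - conf) ** (1 / n)

def bayes_upper(n, conf=0.95):
    """Bayesian upper credible bound under a uniform Beta(1, 1) prior.

    The posterior after 0 failures in n tests is Beta(1, n + 1), whose CDF
    is 1 - (1 - p)**(n + 1), so the bound solves (1 - p)**(n + 1) = 1 - conf.
    """
    return 1 - (1 - conf) ** (1 / (n + 1))
```

The Bayesian bound is always slightly lower (its exponent is n + 1 rather than n), and the two converge as n grows, which echoes the paper's argument that both methods agree for a large population.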

  5. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which includes subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  6. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  7. Estimation of the common cause failure probabilities of the components under mixed testing schemes

    International Nuclear Information System (INIS)

    Kang, Dae Il; Hwang, Mee Jeong; Han, Sang Hoon

    2009-01-01

    For the case where trains or channels of standby safety systems consisting of more than two redundant components are tested in a staggered manner, the standby safety components within a train can be tested simultaneously or consecutively. In this case, mixed testing schemes, staggered and non-staggered testing schemes, are used for testing the components. Approximate formulas, based on the basic parameter method, were developed for the estimation of the common cause failure (CCF) probabilities of the components under mixed testing schemes. The developed formulas were applied to the four redundant check valves of the auxiliary feed water system as a demonstration study for their appropriateness. For a comparison, we estimated the CCF probabilities of the four redundant check valves for the mixed, staggered, and non-staggered testing schemes. The CCF probabilities of the four redundant check valves for the mixed testing schemes were estimated to be higher than those for the staggered testing scheme, and lower than those for the non-staggered testing scheme.

  8. Finding upper bounds for software failure probabilities - experiments and results

    International Nuclear Information System (INIS)

    Kristiansen, Monica; Winther, Rune

    2005-09-01

    This report looks into some aspects of using Bayesian hypothesis testing to find upper bounds for software failure probabilities. In the first part, the report evaluates the Bayesian hypothesis testing approach for finding upper bounds for failure probabilities of single software components. The report shows how different choices of prior probability distributions for a software component's failure probability influence the number of tests required to obtain adequate confidence in a software component. In the evaluation, both the effect of the shape of the prior distribution and one's prior confidence in the software component were investigated. In addition, different choices of prior probability distributions are discussed based on their relevance in a software context. In the second part, ideas on how the Bayesian hypothesis testing approach can be extended to assess systems consisting of multiple software components are given. One of the main challenges when assessing systems consisting of multiple software components is to include dependency aspects in the software reliability models. However, different types of failure dependencies between software components must be modelled differently. Identifying different types of failure dependencies is therefore an important condition for choosing a prior probability distribution which correctly reflects one's prior belief in the probability of software components failing dependently. In this report, software components include both general in-house software components, as well as pre-developed software components (e.g. COTS, SOUP, etc). (Author)

  9. The accuracy of clinical and biochemical estimates in defining the pre-test probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Garvie, N.W.; Salehzahi, F.; Kuitert, L.

    2002-01-01

    Full text: The PIOPED survey confirmed the significance of the high probability ventilation/perfusion scan (HP V/Q scan) in establishing the diagnosis of pulmonary embolism (PE). In an interesting sentence, however, the authors indicated that 'the clinicians' assessment of the likelihood of PE (prior probability)' can substantially increase the predictive value of the investigation. The criteria used for this assessment were not published, and this statement conflicts with the belief that the clinical diagnosis of pulmonary embolism is unreliable. A medical history was obtained from 668 patients undergoing V/Q lung scans for suspected PE, and certain clinical features linked to PE were, when present, documented. These included pleuritic chest pain, haemoptysis, dyspnoea, clinical evidence of DVT, recent surgery and a right ventricular strain pattern on ECG. D-Dimer levels and initial arterial oxygen saturation (PaO2) levels were also obtained. The prevalence of these clinical and biochemical criteria was then compared between HP (61) and normal (171) scans after exclusion of all equivocal or intermediate scan outcomes (436), where lung scintigraphy was unable to provide a definite diagnosis. D-Dimer and/or oxygen saturation levels were similarly compared in each group. A true positive result was scored for each clinical or biochemical criterion when linked with a high probability scan and, conversely, a false positive score when the scan outcome was normal. In this fashion, the positive predictive value (PPV) and, when appropriate, the negative predictive value (NPV) were obtained for each risk factor. In the context of PE, DVT and post-operative status proved the more reliable predictors of a high probability outcome. Where both features were present, the PPV rose to 0.57. A normal D-Dimer level was a better excluder of PE than a normal oxygen saturation level (NPV 0.78 v 0.44). Conversely, a raised D-Dimer or reduced oxygen saturation were both of little value in
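The scoring scheme described above reduces each risk factor to a 2×2 table of factor presence against scan outcome. A minimal sketch of the arithmetic, with hypothetical counts (not the study's data):

```python
# PPV and NPV from a 2x2 table: rows are risk factor present/absent,
# columns are high-probability (HP) scan vs normal scan.

def ppv_npv(tp, fp, fn, tn):
    """tp: factor present & HP scan; fp: factor present & normal scan;
    fn: factor absent & HP scan;  tn: factor absent & normal scan."""
    ppv = tp / (tp + fp)   # P(HP scan | factor present)
    npv = tn / (tn + fn)   # P(normal scan | factor absent)
    return ppv, npv

# Hypothetical counts for one risk factor across 61 HP and 171 normal scans.
ppv, npv = ppv_npv(tp=20, fp=15, fn=41, tn=156)
```

A factor is a useful rule-in test when its PPV is high and a useful excluder when its NPV is high, which is the sense in which the abstract compares D-Dimer against oxygen saturation.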

  10. Estimation of the common cause failure probabilities on the component group with mixed testing scheme

    International Nuclear Information System (INIS)

    Hwang, Meejeong; Kang, Dae Il

    2011-01-01

    Highlights: ► This paper presents a method to estimate the common cause failure probabilities on the common cause component group with mixed testing schemes. ► The CCF probabilities are dependent on the testing schemes such as staggered testing or non-staggered testing. ► There are many CCCGs with specific mixed testing schemes in real plant operation. ► Therefore, a general formula which is applicable to both alternate periodic testing scheme and train level mixed testing scheme was derived. - Abstract: This paper presents a method to estimate the common cause failure (CCF) probabilities on the common cause component group (CCCG) with mixed testing schemes such as the train level mixed testing scheme or the alternate periodic testing scheme. In the train level mixed testing scheme, the components are tested in a non-staggered way within the same train, but the components are tested in a staggered way between the trains. The alternate periodic testing scheme indicates that all components in the same CCCG are tested in a non-staggered way during the planned maintenance period, but they are tested in a staggered way during normal plant operation. Since the CCF probabilities are dependent on the testing schemes such as staggered testing or non-staggered testing, CCF estimators have two kinds of formulas in accordance with the testing schemes. Thus, there are general formulas to estimate the CCF probability on the staggered testing scheme and non-staggered testing scheme. However, in real plant operation, there are many CCCGs with specific mixed testing schemes. Recently, Barros () and Kang () proposed a CCF factor estimation method to reflect the alternate periodic testing scheme and the train level mixed testing scheme. In this paper, a general formula which is applicable to both the alternate periodic testing scheme and the train level mixed testing scheme was derived.

  11. Sequential Probability Ratio Test for Spacecraft Collision Avoidance Maneuver Decisions

    Science.gov (United States)

    Carpenter, J. Russell; Markley, F. Landis

    2013-01-01

    A document discusses sequential probability ratio tests that explicitly allow decision-makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well in a realistic example based on an upcoming, highly elliptical orbit formation flying mission.

  12. The universe of ANA testing: a case for point-of-care ANA testing.

    Science.gov (United States)

    Konstantinov, Konstantin N; Rubin, Robert L

    2017-12-01

    Testing for total antinuclear antibodies (ANA) is a critical tool for diagnosis and management of autoimmune diseases at both the primary care and subspecialty settings. Repurposing of ANA from a test for lupus to a test for any autoimmune condition has driven the increase in ANA requests. Changes in ANA referral patterns include early or subclinical autoimmune disease detection in patients with low pre-test probability and use of negative ANA results to rule out underlying autoimmune disease. A positive result can lead to further diagnostic considerations. Currently, ANA tests are performed in centralized laboratories; an alternative would be ANA testing at the clinical point-of-care (POC). By virtue of its near real-time data collection capability, low cost, and ease of use, we believe the POC ANA has the potential to enable a new paradigm shift in autoimmune serology testing.

  13. White blood cell and platelet count as adjuncts to standard clinical evaluation for risk assessment in patients at low probability of acute aortic syndrome.

    Science.gov (United States)

    Morello, Fulvio; Cavalot, Giulia; Giachino, Francesca; Tizzani, Maria; Nazerian, Peiman; Carbone, Federica; Pivetta, Emanuele; Mengozzi, Giulio; Moiraghi, Corrado; Lupia, Enrico

    2017-08-01

    Pre-test probability assessment is key in the approach to suspected acute aortic syndromes (AASs). However, most patients with AAS-compatible symptoms are classified at low probability, warranting further evaluation for a decision on aortic imaging. White blood cell count, platelet count and fibrinogen explore pathophysiological pathways mobilized in AASs and are routinely assayed in the workup of AASs. However, the diagnostic performance of these variables for AASs, alone and as a bundle, is unknown. We tested the hypothesis that white blood cell count, platelet count and/or fibrinogen at presentation may be applied as additional tools to standard clinical evaluation for pre-test risk assessment in patients at low probability of AAS. This was a retrospective observational study conducted on consecutive patients managed in our Emergency Department from 2009 to 2014 for suspected AAS. White blood cell count, platelet count and fibrinogen were assayed during evaluation in the Emergency Department. The final diagnosis was obtained by computed tomography angiography. The pre-test probability of AAS was defined according to guidelines. Of 1210 patients with suspected AAS, 1006 (83.1%) were classified at low probability, and 271 (22.4%) were diagnosed with AAS. Within patients at low probability, presence of at least one alteration among white blood cell count >9×10³/µl, platelet count probability, white blood cell count >9×10³/µl and platelet count probability, the estimated risk of AAS based on the number of alterations amongst white blood cell count >9×10³/µl and platelet count probability to fine-tune risk assessment of AAS.

  14. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  15. Over-pressure test on BARCOM pre-stressed concrete containment

    Energy Technology Data Exchange (ETDEWEB)

    Parmar, R.M.; Singh, Tarvinder; Thangamani, I.; Trivedi, Neha; Singh, Ram Kumar, E-mail: rksingh@barc.gov.in

    2014-04-01

    Bhabha Atomic Research Centre (BARC), Trombay has organized an International Round Robin Analysis program to carry out the ultimate load capacity assessment of the BARC Containment (BARCOM) test model. The test model, located in BARC's Tarapur facilities, is a 1:4-scale representation of the 540 MWe Pressurized Heavy Water Reactor (PHWR) pre-stressed concrete inner containment structure of Tarapur Atomic Power Station (TAPS) units 3 and 4. A large number of sensors are installed in BARCOM, including vibratory wire strain gauges of embedded and spot-welded type, surface-mounted electrical resistance strain gauges, dial gauges, earth pressure cells, tilt meters and high-resolution digital camera systems for structural response, crack monitoring and fracture parameter measurement to evaluate the local and global behavior of the containment test model. The model was tested pneumatically during the low pressure tests (LPTs), followed by a proof test (PT) and integrated leakage rate test (ILRT) during commissioning. Further, the over-pressure test (OPT) has been carried out to establish the failure mode of the BARCOM test model; it will be completed shortly to reach the functional failure of the test model. Pre-test evaluation of BARCOM was carried out with the results obtained from the registered international round robin participants in January 2009, followed by the post-test assessment in February 2011. The test results, along with the various failure modes related to the structural members (concrete, rebars and tendons) identified in terms of prescribed milestones, are presented in this paper, together with a comparison of the pre-test predictions submitted by the registered participants of the Round Robin Analysis for the BARCOM test model.

  16. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  17. A Monte Carlo calculation of the pionium break-up probability with different sets of pionium target cross sections

    International Nuclear Information System (INIS)

    Santamarina, C; Schumann, M; Afanasyev, L G; Heim, T

    2003-01-01

    Chiral perturbation theory predicts the lifetime of pionium, a hydrogen-like π⁺π⁻ atom, to better than 3% precision. The goal of the DIRAC experiment at CERN is to obtain and check this value experimentally by measuring the break-up probability of pionium in a target. In order to accurately measure the lifetime, one needs to know the relationship between the break-up probability and the lifetime to 1% accuracy. We have obtained this dependence by modelling the evolution of pionic atoms in the target using Monte Carlo methods. The model relies on the computation of the pionium-target-atom interaction cross sections. Three different sets of pionium-target cross sections with varying degrees of complexity were used: from the simplest first-order Born approximation involving only the electrostatic interaction to a more advanced approach taking into account multiphoton exchanges and relativistic effects. We conclude that, in order to obtain the pionium lifetime to 1% accuracy from the break-up probability, the pionium-target cross sections must be known with the same accuracy for the low excited bound states of the pionic atom. This result has been achieved, for low Z targets, with the two most precise cross section sets. For large Z targets, only the set accounting for multiphoton exchange satisfies the condition.

  18. Crop size influences pre-dispersal seed predation in the Brazilian Cerrado

    Directory of Open Access Journals (Sweden)

    Alexander V. Christianini

    2017-11-01

    Full Text Available ABSTRACT Many pre-dispersal seed predators are specialized insects that rely on seeds for larval development. These insects may respond to the amount of seeds produced by a plant (i.e. crop size), increasing the proportion of seeds damaged with increases in seed numbers. Large seeds have more resources and spend more time on the plant to complete their development, and are probably more prone to be preyed on by these insects than small seeds. Here I tested how crop size and seed mass influence pre-dispersal seed predation in plants from the Cerrado savannas of Brazil. I related plant crop size to pre-dispersal seed predation for Xylopia aromatica and Erythroxylum pelleterianum. A literature review was performed to test if seed mass may explain among-species differences in pre-dispersal seed predation. Pre-dispersal losses increased proportionally to crop size in the two species investigated, but some species in the literature show positive or no density-dependent seed predation, indicating that seed losses are not a simple function of crop size. Seed mass did not explain differences in pre-dispersal seed loss among the 14 species with data available. Pre-dispersal losses are often small and probably less important than seed dispersal and establishment limitation for plant recruitment in Cerrado savannas.

  19. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  20. Calculating Cumulative Binomial-Distribution Probabilities

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CUMBIN, one of set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557), used independently of one another. Reliabilities and availabilities of k-out-of-n systems analyzed. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts. Used for calculations of reliability and availability. Program written in C.
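What a CUMBIN-style routine computes for k-out-of-n reliability can be sketched in a few lines (in Python here, rather than the program's C; the specific k, n, and p values in the usage are illustrative):

```python
from math import comb

# Reliability of a k-out-of-n system: the probability that at least k of n
# independent components, each working with probability p, are functional.
# This is the upper tail of the binomial distribution.

def k_out_of_n(k, n, p):
    """P(at least k successes in n Bernoulli(p) trials)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
```

For example, a 2-out-of-3 voting system with component reliability 0.9 has system reliability 3(0.9)²(0.1) + (0.9)³ = 0.972, which the function reproduces. Direct summation is fine at this scale; CUMBIN-class programs exist partly to keep such tail sums numerically stable for large n.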

  1. Rotavirus vaccine effectiveness in low-income settings: An evaluation of the test-negative design

    OpenAIRE

    Schwartz, Lauren M.; Halloran, M. Elizabeth; Rowhani-Rahbar, Ali; Neuzil, Kathleen M.; Victor, John C.

    2017-01-01

    Background The test-negative design (TND), an epidemiologic method currently used to measure rotavirus vaccine (RV) effectiveness, compares the vaccination status of rotavirus-positive cases and rotavirus-negative controls meeting a pre-defined case definition for acute gastroenteritis. Despite the use of this study design in low-income settings, the TND has not been evaluated to measure rotavirus vaccine effectiveness. Methods This study builds upon prior methods to evaluate the use of the T...

  2. Pre-test habituation improves the reliability of a handheld test of mechanical nociceptive threshold in dairy cows

    DEFF Research Database (Denmark)

    Raundal, P. M.; Andersen, P. H.; Toft, Nils

    2015-01-01

    Mechanical nociceptive threshold (MNT) testing has been used to investigate aspects of painful states in bovine claws. We investigated a handheld tool, where the applied stimulation force was monitored continuously relative to a pre-encoded target force. The effect on MNT of two pre-testing... habituation procedures was examined in two different experiments comprising a total of 88 sound Holstein dairy cows kept either inside or outside their home environment. MNT testing was performed using five consecutive mechanical nociceptive stimulations per cow per test at a fixed pre-encoded target rate... of 2.1 N/s. The habituation procedure performed in dairy cows kept in their home environment led to a lowered intra-individual coefficient of variation of MNT (P test...

  3. Effects of concurrent caffeine and mobile phone exposure on local target probability processing in the human brain.

    Science.gov (United States)

    Trunk, Attila; Stefanics, Gábor; Zentai, Norbert; Bacskay, Ivett; Felinger, Attila; Thuróczy, György; Hernádi, István

    2015-09-23

    Millions of people use mobile phones (MP) while drinking coffee or other caffeine-containing beverages. Little is known about the potential combined effects of MP irradiation and caffeine on cognitive functions. Here we investigated whether caffeine intake and concurrent exposure to Universal Mobile Telecommunications System (UMTS) MP-like irradiation may interactively influence neuro-cognitive function in an active visual oddball paradigm. In a full factorial experimental design, 25 participants performed a simple visual target detection task while reaction time (RT) and electroencephalogram (EEG) were recorded. Target trials were divided into Low and High probability sets based on target-to-target distance. We analyzed single-trial RT and alpha-band power (amplitude) in the pre-target interval. We found that RT was shorter in High vs. Low local probability trials, and caffeine further shortened RT in High probability trials relative to the baseline condition, suggesting that caffeine improves the efficiency of implicit short-term memory. Caffeine also decreased pre-target alpha amplitude, resulting in a higher arousal level. Furthermore, pre-target gamma power positively correlated with RT, which may have facilitated target detection. However, in the present pharmacologically validated study, UMTS exposure either alone or in combination with caffeine did not alter RT or pre-stimulus oscillatory brain activity.

  4. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  5. Test set for initial value problem solvers

    NARCIS (Netherlands)

    W.M. Lioen (Walter); J.J.B. de Swart (Jacques)

    1998-01-01

    The CWI test set for IVP solvers presents a collection of Initial Value Problems to test solvers for implicit differential equations. This test set can both decrease the effort for the code developer to test his software in a reliable way, and cross the bridge between the application

  6. Gene set analysis using variance component tests.

    Science.gov (United States)

    Huang, Yen-Tsung; Lin, Xihong

    2013-06-28

    Gene set analyses have become increasingly important in genomic research, as many complex diseases are contributed jointly by alterations of numerous genes. Genes often coordinate together as a functional repertoire, e.g., a biological pathway/network and are highly correlated. However, most of the existing gene set analysis methods do not fully account for the correlation among the genes. Here we propose to tackle this important feature of a gene set to improve statistical power in gene set analyses. We propose to model the effects of an independent variable, e.g., exposure/biological status (yes/no), on multiple gene expression values in a gene set using a multivariate linear regression model, where the correlation among the genes is explicitly modeled using a working covariance matrix. We develop TEGS (Test for the Effect of a Gene Set), a variance component test for the gene set effects by assuming a common distribution for regression coefficients in multivariate linear regression models, and calculate the p-values using permutation and a scaled chi-square approximation. We show using simulations that type I error is protected under different choices of working covariance matrices and power is improved as the working covariance approaches the true covariance. The global test is a special case of TEGS when correlation among genes in a gene set is ignored. Using both simulation data and a published diabetes dataset, we show that our test outperforms the commonly used approaches, the global test and gene set enrichment analysis (GSEA). We develop a gene set analyses method (TEGS) under the multivariate regression framework, which directly models the interdependence of the expression values in a gene set using a working covariance. TEGS outperforms two widely used methods, GSEA and global test in both simulation and a diabetes microarray data.
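The permutation p-value idea used by TEGS can be illustrated with a toy statistic. The sketch below uses a simple absolute difference of group means rather than the TEGS variance-component statistic, and the data in the usage are synthetic; only the permutation logic, shuffling sample labels to build the null distribution, is the point:

```python
import random

# Permutation p-value for a two-group comparison: shuffle the group labels
# many times and count how often the shuffled statistic is at least as
# extreme as the observed one. The +1 terms give the standard
# never-exactly-zero permutation p-value.

def perm_pvalue(values, labels, n_perm=10000, seed=0):
    rng = random.Random(seed)

    def stat(lab):
        g1 = [v for v, l in zip(values, lab) if l]
        g0 = [v for v, l in zip(values, lab) if not l]
        return abs(sum(g1) / len(g1) - sum(g0) / len(g0))

    observed = stat(labels)
    lab = list(labels)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(lab)           # shuffling preserves the group sizes
        if stat(lab) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

# Synthetic expression values: a clearly elevated group vs a control group.
p = perm_pvalue([10, 11, 12, 13, 14, 0, 1, 2, 3, 4],
                [True] * 5 + [False] * 5, n_perm=2000)
```

Because the label shuffling preserves the correlation structure among genes within each sample, this style of permutation is what lets gene set tests account for gene-gene dependence without modeling it parametrically.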

  7. Online pre-race education improves test scores for volunteers at a marathon.

    Science.gov (United States)

    Maxwell, Shane; Renier, Colleen; Sikka, Robby; Widstrom, Luke; Paulson, William; Christensen, Trent; Olson, David; Nelson, Benjamin

    2017-09-01

    This study examined whether an online course would increase volunteers' knowledge of the medical issues encountered during a marathon. Health care professionals who volunteered to provide medical coverage for an annual marathon were eligible for the study. Demographic information about the medical volunteers, including profession, specialty, education level and number of marathons they had volunteered for, was collected. A 15-question test about the most commonly encountered medical issues was created by the authors and administered before and after the volunteers took the online educational course, and results were compared to a pilot study from the previous year. Seventy-four subjects completed the pre-test. Those who had participated in the previous year's pilot study (N = 15) had pre-test scores that averaged 2.4 points higher than those who had not (mean ranks: pilot study = 51.6 vs. non-pilot = 33.9, p = 0.004). Of the 74 subjects who completed the pre-test, 54 also completed the post-test. The overall post-pre mean score difference was 3.8 ± 2.7 (t = 10.5, df = 53). Online education demonstrated a long-term (one-year) increase in test scores. Testing also continued to show short-term improvement in post-course test scores compared to pre-course test scores. In general, marathon medical volunteers with no volunteer experience demonstrated greater improvement than those with prior volunteer experience.

  8. 40 CFR 90.408 - Pre-test procedures.

    Science.gov (United States)

    2010-07-01

    ....408 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED... during service accumulation is allowed only in accordance with § 90.118. (b) Engine pre-test preparation... by § 90.324(a). If necessary, allow the heated sample line, filters, and pumps to reach operating...

  9. Post-test probability for neonatal hyperbilirubinemia based on umbilical cord blood bilirubin, direct antiglobulin test, and ABO compatibility results.

    Science.gov (United States)

    Peeters, Bart; Geerts, Inge; Van Mullem, Mia; Micalessi, Isabel; Saegeman, Veroniek; Moerman, Jan

    2016-05-01

    Many hospitals opt for early postnatal discharge of newborns with a potential risk of readmission for neonatal hyperbilirubinemia. Assays/algorithms with the possibility to improve prediction of significant neonatal hyperbilirubinemia are needed to optimize screening protocols and safe discharge of neonates. This study investigated the predictive value of umbilical cord blood (UCB) testing for significant hyperbilirubinemia. Neonatal UCB bilirubin, UCB direct antiglobulin test (DAT), and blood group were determined, as well as the maternal blood group and the red blood cell antibody status. Moreover, in newborns with clinically apparent jaundice after visual assessment, plasma total bilirubin (TB) was measured. Clinical factors positively associated with UCB bilirubin were ABO incompatibility, positive DAT, presence of maternal red cell antibodies, alarming visual assessment and significant hyperbilirubinemia in the first 6 days of life. UCB bilirubin performed clinically well, with an area under the receiver-operating characteristic curve (AUC) of 0.82 (95% CI 0.80-0.84). The combined UCB bilirubin, DAT, and blood group analysis outperformed results of these parameters considered separately to detect significant hyperbilirubinemia and correlated exponentially with hyperbilirubinemia post-test probability. Post-test probabilities for neonatal hyperbilirubinemia can be calculated using exponential functions defined by UCB bilirubin, DAT, and ABO compatibility results. What is Known: • The diagnostic value of the triad umbilical cord blood bilirubin measurement, direct antiglobulin testing and blood group analysis for neonatal hyperbilirubinemia remains unclear in the literature. • Currently no guideline recommends screening for hyperbilirubinemia using umbilical cord blood. What is New: • Post-test probability for hyperbilirubinemia correlated exponentially with umbilical cord blood bilirubin in different risk groups defined by direct antiglobulin test and ABO blood group.
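
    In its generic form, updating a pre-test probability into a post-test probability is Bayes' rule on the odds scale. The following is a minimal sketch of that generic likelihood-ratio form, not the specific exponential functions fitted in this study:

```python
def post_test_probability(pre_test_prob, likelihood_ratio):
    """Bayes' rule on the odds scale:
    post-test odds = pre-test odds x likelihood ratio."""
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)
```

    For example, a pre-test probability of 10% combined with a test result carrying a likelihood ratio of 9 yields a post-test probability of 50%; a likelihood ratio of 1 leaves the probability unchanged.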

  10. Test set up description and performances for HAWAII-2RG detector characterization at ESTEC

    Science.gov (United States)

    Crouzet, P.-E.; ter Haar, J.; de Wit, F.; Beaufort, T.; Butler, B.; Smit, H.; van der Luijt, C.; Martin, D.

    2012-07-01

    In the framework of the European Space Agency's Cosmic Vision program, the Euclid mission has the objective to map the geometry of the Dark Universe. Galaxies and clusters of galaxies will be observed in the visible and near-infrared wavelengths by an imaging and a spectroscopic channel. For the Near Infrared Spectrometer instrument (NISP), the state-of-the-art HAWAII-2RG detectors will be used, together with the SIDECAR ASIC readout electronics, which will perform the image frame acquisitions. To characterize and validate the performance of these detectors, a test bench has been designed, tested and validated. This publication describes the pre-tests performed to build the set-up dedicated to dark current measurements and to tests requiring reasonably uniform light levels (such as conversion gain measurements). Successful cryogenic and vacuum tests on commercial LEDs and photodiodes are shown. An optimized feedthrough in stainless steel with a V-groove to pot the flex cable connecting the SIDECAR ASIC to the room-temperature board (JADE2) has been designed and tested. The test set-up for quantum efficiency measurements, consisting of a lamp, a monochromator, an integrating sphere and a set of cold filters, and currently under construction, will ensure uniform illumination across the detector with variations lower than 2%. A dedicated spot projector for intra-pixel measurements has been designed and built to reach a spot diameter of 5 μm at 920 nm with 2 nm of bandwidth [1].

  11. 40 CFR 91.408 - Pre-test procedures.

    Science.gov (United States)

    2010-07-01

    ....408 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED... accordance with § 91.117. (b) Engine pre-test preparation. (1) Drain and charge the fuel tank(s) with the..., including the sample probe, using mode 1 from Table 2 in appendix A of this subpart. The emission sampling...

  12. Choreographer Pre-Testing Code Analysis and Operational Testing.

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, David J. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Harrison, Christopher B. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Perr, C. W. [Sandia National Laboratories (SNL-CA), Livermore, CA (United States); Hurd, Steven A [Sandia National Laboratories (SNL-CA), Livermore, CA (United States)

    2014-07-01

    Choreographer is a "moving target defense system", designed to protect against attacks aimed at IP addresses without corresponding domain name system (DNS) lookups. It coordinates actions between a DNS server and a Network Address Translation (NAT) device to regularly change which publicly available IP addresses' traffic will be routed to the protected device versus routed to a honeypot. More details about how Choreographer operates can be found in Section 2: Introducing Choreographer. Operational considerations for the successful deployment of Choreographer can be found in Section 3. The Testing & Evaluation (T&E) for Choreographer involved three phases: Pre-testing, Code Analysis, and Operational Testing. Pre-testing, described in Section 4, involved installing and configuring an instance of Choreographer and verifying it would operate as expected for a simple use case. Our findings were that it was simple and straightforward to prepare a system for a Choreographer installation as well as to configure Choreographer to work in a representative environment. Code Analysis, described in Section 5, consisted of running a static code analyzer (HP Fortify) and conducting dynamic analysis tests using the Valgrind instrumentation framework. Choreographer performed well, such that only a few errors that might possibly be problematic in a given operating situation were identified. Operational Testing, described in Section 6, involved operating Choreographer in a representative environment created through Emulytics™. Depending upon the amount of server resources dedicated to Choreographer vis-à-vis the amount of client traffic handled, Choreographer had varying degrees of operational success. In an environment with a poorly resourced Choreographer server and as few as 50-100 clients, Choreographer failed to properly route traffic over half the time. Yet, with a well-resourced server, Choreographer handled over 1000 clients without misrouting.

  13. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  14. A well test analysis method accounting for pre-test operations

    International Nuclear Information System (INIS)

    Silin, D.B.; Tsang, C.-F.

    2003-01-01

    We propose to use regular monitoring data from a production or injection well for estimating the formation hydraulic properties in the vicinity of the wellbore without interrupting operations. In our approach, we select a portion of the pumping data over a certain time interval and then derive our conclusions from analysis of these data. A distinctive feature distinguishing the proposed approach from conventional methods is the introduction of an additional parameter, an effective pre-test pumping rate. This additional parameter is derived from a rigorous asymptotic analysis of the flow model. Thus, we account for the non-uniform pressure distribution at the beginning of the testing time interval caused by pre-test operations at the well. With synthetic and field examples, we demonstrate that deviation of the matching curve from the data, usually attributed to skin and wellbore storage effects, can also be interpreted through this new parameter. Moreover, with our method, the data curve is matched equally well and the results of the analysis remain stable when the analyzed data interval is perturbed, whereas traditional methods are sensitive to the choice of the data interval. A special efficient minimization procedure has been developed for searching for the best-fitting parameters. We enhance the analysis with a procedure for estimating ambient reservoir pressure and dimensionless wellbore radius. The methods reported here have been implemented in the code ODA (Operations Data Analysis). A beta version of the code is available for free testing and evaluation to interested parties.

  15. MASTER: a model to improve and standardize clinical breakpoints for antimicrobial susceptibility testing using forecast probabilities.

    Science.gov (United States)

    Blöchliger, Nicolas; Keller, Peter M; Böttger, Erik C; Hombach, Michael

    2017-09-01

    The procedure for setting clinical breakpoints (CBPs) for antimicrobial susceptibility has been poorly standardized with respect to population data, pharmacokinetic parameters and clinical outcome. Tools to standardize CBP setting could result in improved antibiogram forecast probabilities. We propose a model to estimate probabilities for methodological categorization errors and defined zones of methodological uncertainty (ZMUs), i.e. ranges of zone diameters that cannot reliably be classified. The impact of ZMUs on methodological error rates was used for CBP optimization. The model distinguishes theoretical true inhibition zone diameters from observed diameters, which suffer from methodological variation. True diameter distributions are described with a normal mixture model. The model was fitted to observed inhibition zone diameters of clinical Escherichia coli strains. Repeated measurements for a quality control strain were used to quantify methodological variation. For 9 of 13 antibiotics analysed, our model predicted error rates of <0.1%. Error rates were >0.1% for ampicillin, cefoxitin, cefuroxime and amoxicillin/clavulanic acid. Increasing the susceptible CBP (cefoxitin) and introducing ZMUs (ampicillin, cefuroxime, amoxicillin/clavulanic acid) decreased error rates to <0.1%. ZMUs contained low numbers of isolates for ampicillin and cefuroxime (3% and 6%), whereas the ZMU for amoxicillin/clavulanic acid contained 41% of all isolates and was considered not practical. We demonstrate that CBPs can be improved and standardized by minimizing methodological categorization error rates. ZMUs may be introduced if an intermediate zone is not appropriate for pharmacokinetic/pharmacodynamic or drug dosing reasons. Optimized CBPs will provide a standardized antibiotic susceptibility testing interpretation at a defined level of probability. © The Author 2017. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved.
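
    The core calculation behind such error-rate predictions can be illustrated with a single strain: if an observed zone diameter equals the true diameter plus normal methodological noise, the probability of landing on the wrong side of a breakpoint is a normal tail probability. This sketch uses hypothetical diameters and noise levels, not the fitted mixture model of the study:

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def misclassification_prob(true_diameter, breakpoint, sigma):
    """Probability that a strain whose true zone diameter lies above the
    breakpoint is observed below it, given methodological variation sigma
    (observed = true + Normal(0, sigma) noise; values are illustrative)."""
    return phi((breakpoint - true_diameter) / sigma)
```

    For a true diameter 2 standard deviations above the breakpoint, the misclassification probability is about 2.3%; averaging such probabilities over the fitted true-diameter distribution gives a population-level error rate of the kind the model reports.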

  16. Review of Pre-Analytical Errors in Oral Glucose Tolerance Testing in a Tertiary Care Hospital.

    Science.gov (United States)

    Nanda, Rachita; Patel, Suprava; Sahoo, Sibashish; Mohapatra, Eli

    2018-03-13

    The pre-pre-analytical and pre-analytical phases account for a major share of laboratory errors. This study took a very common procedure, the oral glucose tolerance test, to identify pre-pre-analytical errors. Quality indicators provide evidence of quality, support accountability and help in the decision making of laboratory personnel. The aim of this research is to evaluate the pre-analytical performance of the oral glucose tolerance test procedure. An observational study was conducted over a period of three months in the phlebotomy and accessioning unit of our laboratory, using a questionnaire that examined the pre-pre-analytical errors through a scoring system. The pre-analytical phase was analyzed for each sample collected as per seven quality indicators. About 25% of the population gave a wrong answer to the question that tested knowledge of patient preparation. QI-1, the appropriateness of the test result, had the most errors. Although QI-5, for sample collection, had a low error rate, it is a very important indicator, as any wrongly collected sample can alter the test result. Evaluating the pre-analytical and pre-pre-analytical phases is essential and must be conducted routinely on a yearly basis to identify errors, take corrective action and facilitate their gradual introduction into routine practice.

  17. Recommendations for the tuning of rare event probability estimators

    International Nuclear Information System (INIS)

    Balesdent, Mathieu; Morio, Jérôme; Marzat, Julien

    2015-01-01

    Being able to accurately estimate rare event probabilities is a challenging issue for improving the reliability of complex systems. Several powerful methods such as importance sampling, importance splitting or extreme value theory have been proposed to reduce the computational cost and improve the accuracy of extreme probability estimation. However, the performance of these methods depends strongly on the choice of tuning parameters, which are very difficult to determine. To identify recommended tunings for such methods, an empirical campaign of automatic tuning on a set of representative test cases is conducted for splitting methods. It provides a reduced set of tuning parameters that may lead to reliable estimation of rare event probabilities for various problems. The relevance of the obtained result is assessed on a series of real-world aerospace problems.

  18. A support vector machine based test for incongruence between sets of trees in tree space

    Science.gov (United States)

    2012-01-01

    Background The increased use of multi-locus data sets for phylogenetic reconstruction has increased the need to determine whether a set of gene trees significantly deviates from the phylogenetic patterns of other genes. Such unusual gene trees may have been influenced by other evolutionary processes such as selection, gene duplication, or horizontal gene transfer. Results Motivated by this problem, we propose a nonparametric goodness-of-fit test for two empirical distributions of gene trees, and we developed the software GeneOut to estimate a p-value for the test. Our approach maps trees into a multi-dimensional vector space and then applies support vector machines (SVMs) to measure the separation between two sets of pre-defined trees. We use a permutation test to assess the significance of the SVM separation. To demonstrate the performance of GeneOut, we applied it to the comparison of gene trees simulated within different species trees across a range of species tree depths. Applied directly to sets of simulated gene trees with large sample sizes, GeneOut was able to detect very small differences between two sets of gene trees generated under different species trees. Our statistical test can also incorporate tree reconstruction into its test framework through a variety of phylogenetic optimality criteria. When applied to DNA sequence data simulated from different sets of gene trees, results in the form of receiver operating characteristic (ROC) curves indicated that GeneOut performed well in detecting differences between sets of trees with different distributions in a multi-dimensional space. Furthermore, it controlled false positive and false negative rates very well, indicating a high degree of accuracy. Conclusions The non-parametric nature of our statistical test provides fast and efficient analyses, and makes it an applicable test for any scenario where evolutionary or other factors can lead to trees with different multi-dimensional distributions.

  19. Prevalence and Determinants of Pre-Hypertension among Omani Adults Attending Non-Communicable Disease Screening Program in Primary Care Setting in Sohar City

    Directory of Open Access Journals (Sweden)

    Ali Abdullah Al-Maqbali

    2013-09-01

    Objectives: To estimate the prevalence of pre-hypertension and its association with selected cardiovascular risk factors among the Omani adult population in the primary healthcare setting. Method: A cross-sectional study involving a sample taken from a National Screening Program for chronic non-communicable diseases in primary healthcare institutions, Sohar city, Sultanate of Oman (July 2006 - December 2007). Inclusion criteria were Omanis aged 40 years or above, residents of Sohar city attending primary healthcare institutions, not previously diagnosed with diabetes mellitus, hypertension, or chronic kidney disease. Descriptive statistics were used to describe the demographic, physical and metabolic characteristics. Univariate analysis was used to identify significant associations between these characteristics and normal blood pressure, pre-hypertension and hypertension. The chi-squared test was used for categorical variables and the independent t-test for continuous variables. To examine the strength of significant associations, multinomial logistic regression analysis was used. Results: There were 1498 participants, 41% male and 59% female. Overall, pre-hypertension was observed in 45% of the total study population (95% CI: 0.422 - 0.473). More males were affected than females (46% versus 44%). About 34% of the total study population was hypertensive. The multinomial logistic regression analysis revealed that an increase of one unit in age, body mass index, fasting blood glucose or total blood cholesterol was significantly associated with higher risk of both pre-hypertension and hypertension. The highest odds ratios for pre-hypertension and hypertension were found with total blood cholesterol. Conclusion: The prevalence of pre-hypertension was high among the Omani adult population. The determinants of pre-hypertension in this research were age, body mass index, fasting blood glucose and total blood cholesterol.

  20. Prediction of pre-eclampsia: a protocol for systematic reviews of test accuracy

    Directory of Open Access Journals (Sweden)

    Khan Khalid S

    2006-10-01

    Abstract Background Pre-eclampsia, a syndrome of hypertension and proteinuria, is a major cause of maternal and perinatal morbidity and mortality. Accurate prediction of pre-eclampsia is important, since high-risk women could benefit from intensive monitoring and preventive treatment. However, decision making is currently hampered by the lack of precise and up-to-date comprehensive evidence summaries on estimates of the risk of developing pre-eclampsia. Methods/Design A series of systematic reviews and meta-analyses will be undertaken to determine, among women in early pregnancy, the accuracy of various tests (history, examinations and investigations) for predicting pre-eclampsia. We will search Medline, Embase, the Cochrane Library, MEDION, and the citation lists of review articles and eligible primary articles, and will contact experts in the field. Reviewers working independently will select studies, extract data, and assess study validity according to established criteria. Language restrictions will not be applied. Bivariate meta-analysis of sensitivity and specificity will be considered for tests whose studies allow generation of 2 × 2 tables. Discussion The results of the test accuracy reviews will be integrated with the results of effectiveness reviews of preventive interventions to assess the impact of test-intervention combinations for the prevention of pre-eclampsia.

  1. Probable relationship between partitions of the set of codons and the origin of the genetic code.

    Science.gov (United States)

    Salinas, Dino G; Gallardo, Mauricio O; Osorio, Manuel I

    2014-03-01

    Here we study the distribution of randomly generated partitions of the set of amino acid-coding codons. Some results are an application of previous work on the Stirling numbers of the second kind and triplet codes, both to the case of triplet codes having four stop codons, as in the mammalian mitochondrial genetic code, and to hypothetical doublet codes. Extending previous results, we find that the most probable number of blocks of synonymous codons in a genetic code is similar to the number of amino acids when there are four stop codons, as it could also have been for a primigenious doublet code. We also study the integer partitions associated with patterns of synonymous codons and show, for the canonical code, that the standard deviation inside an integer partition is one of the most probable. We think that, in some early epoch, the genetic code might have had a maximum of disorder or entropy, independent of the assignment between codons and amino acids, reaching a state similar to the "code freeze" proposed by Francis Crick. In later stages, deterministic rules may have reassigned codons to amino acids, forming the natural codes, such as the canonical code, but keeping the numerical features describing the set partitions and the integer partitions, like "fossil numbers"; both kinds of partitions concern the set of amino acid-coding codons. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  2. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily Boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  3. Closed Form Aliasing Probability For Q-ary Symmetric Errors

    Directory of Open Access Journals (Sweden)

    Geetani Edirisooriya

    1996-01-01

    In Built-In Self-Test (BIST) techniques, test data reduction can be achieved using Linear Feedback Shift Registers (LFSRs). A faulty circuit may escape detection due to the loss of information inherent in data compaction schemes. This is referred to as aliasing. The probability of aliasing in Multiple-Input Shift-Registers (MISRs) has been studied under various bit error models. By modeling the signature analyzer as a Markov process, we show that the closed-form expression previously derived for the aliasing probability of MISRs with primitive polynomials under the q-ary symmetric error model holds for all MISRs irrespective of their feedback polynomials, and for group cellular automata signature analyzers as well. If the erroneous behaviour of a circuit can be modelled with q-ary symmetric errors, then the test circuit complexity and propagation delay associated with the signature analyzer can be minimized by using a set of m single-bit LFSRs without increasing the probability of aliasing.

  4. Level set segmentation of medical images based on local region statistics and maximum a posteriori probability.

    Science.gov (United States)

    Cui, Wenchao; Wang, Yi; Lei, Tao; Fan, Yangyu; Feng, Yan

    2013-01-01

    This paper presents a variational level set method for simultaneous segmentation and bias field estimation of medical images with intensity inhomogeneity. In our model, the statistics of image intensities belonging to each different tissue in local regions are characterized by Gaussian distributions with different means and variances. According to maximum a posteriori probability (MAP) and Bayes' rule, we first derive a local objective function for image intensities in a neighborhood around each pixel. This local objective function is then integrated with respect to the neighborhood center over the entire image domain to give a global criterion. In the level set framework, this global criterion defines an energy in terms of the level set functions that represent a partition of the image domain and a bias field that accounts for the intensity inhomogeneity of the image. Therefore, image segmentation and bias field estimation are simultaneously achieved via a level set evolution process. Experimental results for synthetic and real images show the desirable performance of our method.

  5. Test Program Set (TPS) Lab

    Data.gov (United States)

    Federal Laboratory Consortium — The ARDEC TPS Laboratory provides an organic Test Program Set (TPS) development, maintenance, and life cycle management capability for DoD LCMC materiel developers.

  6. Cognitive Laboratory Experiences : On Pre-testing Computerised Questionnaires

    NARCIS (Netherlands)

    Snijkers, G.J.M.E.

    2002-01-01

    In the literature on questionnaire design and survey methodology, pre-testing is mentioned as a way to evaluate questionnaires (i.e. investigate whether they work as intended) and control for measurement errors (i.e. assess data quality). As the American Statistical Association puts it (ASA, 1999,

  7. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment made by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
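
    As a concrete illustration of the maximum entropy assignment, the following sketch computes the maximum-entropy distribution over a finite outcome set subject to a single mean constraint. The exponential (Boltzmann-like) form of the solution is standard; the outcome values, the constraint, and the bisection search for the Lagrange multiplier are illustrative choices, not taken from the paper:

```python
import math

def maxent_with_mean(values, target_mean, lo=-50.0, hi=50.0, tol=1e-10):
    """Maximum-entropy pmf p_i proportional to exp(-lam * x_i) subject to
    sum_i p_i * x_i = target_mean; lam is found by bisection (the mean is a
    decreasing function of lam)."""
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in values]
        z = sum(w)
        return sum(wi * x for wi, x in zip(w, values)) / z
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if mean_for(mid) > target_mean:
            lo = mid  # mean too high: increase lam
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    w = [math.exp(-lam * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]
```

    With outcomes {0, 1, 2} and a mean constraint of 1.0 the result is the uniform distribution; tightening the mean to 0.5 skews the probabilities toward the smaller outcomes, exactly as the ensemble picture suggests.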

  8. Targeting as the basis for pre-test market of lithium-ion battery

    Science.gov (United States)

    Yuniaristanto, Zakaria, R.; Saputri, V. H. L.; Sutopo, W.; Kadir, E. A.

    2017-11-01

    This article discusses market segmentation and targeting as a first step in the pre-test market of a new technology. The benefit of targeting is that the pre-test market can focus on selected target markets, so that no bias arises during the pre-test market. Determining the target market requires surveys to identify the future state of the market, so that the marketing process is not misplaced. The lithium-ion battery, commercialized through a start-up company, is the case study. This start-up company must be able to respond to changes and to bring in customers as well as retain them, so that the company can survive and evolve to achieve its objectives. The research aims to determine market segments and the target market effectively. A marketing strategy (segmentation and targeting) is used to construct a questionnaire, and cluster analysis is used in data processing. Respondents were selected by purposive sampling, yielding 80 samples. As the results show, there are three segments for the lithium-ion battery, each with its own distinguishing characteristics, and two of these segments can be used as the target market for the company.

  9. Salmonella testing of pooled pre-enrichment broth cultures for screening multiple food samples.

    Science.gov (United States)

    Price, W R; Olsen, R A; Hunter, J E

    1972-04-01

    A method has been described for testing multiple food samples for Salmonella without loss in sensitivity. The method pools multiple pre-enrichment broth cultures into single enrichment broths. The subsequent stages of the Salmonella analysis are not altered. The method was found applicable to several dry food materials including nonfat dry milk, dried egg albumin, cocoa, cottonseed flour, wheat flour, and shredded coconut. As many as 25 pre-enrichment broth cultures were pooled without apparent loss in the sensitivity of Salmonella detection as compared to individual sample analysis. The procedure offers a simple, yet effective, way to increase sample capacity in the Salmonella testing of foods, particularly where a large proportion of samples ordinarily is negative. It also permits small portions of pre-enrichment broth cultures to be retained for subsequent individual analysis if positive tests are found. Salmonella testing of pooled pre-enrichment broths provides increased consumer protection for a given amount of analytical effort as compared to individual sample analysis.
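    The efficiency gain from pooling can be illustrated with a simple probability calculation. Under a Dorfman-style scheme (test the pooled broth once; retest members individually only if the pool is positive) and an assumed per-sample contamination prevalence, the expected number of tests per sample is easy to compute. This is an illustrative sketch assuming a perfectly sensitive test, not the exact protocol of the paper:

```python
def expected_tests_per_sample(pool_size, prevalence):
    """Dorfman pooled screening: one test for the whole pool, plus
    individual retests whenever the pool is positive (perfect test
    assumed, samples independent)."""
    p_pool_positive = 1.0 - (1.0 - prevalence) ** pool_size
    return 1.0 / pool_size + p_pool_positive

# With an assumed 1% prevalence, pooling 25 pre-enrichment broths needs
# roughly a quarter of the tests of individual analysis (1 test/sample).
e = expected_tests_per_sample(25, 0.01)
```

    The benefit grows as the proportion of negative samples rises, which matches the paper's observation that pooling pays off "where a large proportion of samples ordinarily is negative".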

  10. Use of a National Continuing Medical Education Meeting to Provide Simulation-Based Training in Temporary Hemodialysis Catheter Insertion Skills: A Pre-Test Post-Test Study

    Directory of Open Access Journals (Sweden)

    Edward G Clark

    2014-10-01

    Full Text Available Background: Simulation-based-mastery-learning (SBML is an effective method to train nephrology fellows to competently insert temporary, non-tunneled hemodialysis catheters (NTHCs. Previous studies of SBML for NTHC-insertion have been conducted at a local level. Objectives: Determine if SBML for NTHC-insertion can be effective when provided at a national continuing medical education (CME meeting. Describe the correlation of demographic factors, prior experience with NTHC-insertion and procedural self-confidence with simulated performance of the procedure. Design: Pre-test – post-test study. Setting: 2014 Canadian Society of Nephrology annual meeting. Participants: Nephrology fellows, internal medicine residents and medical students. Measurements: Participants were surveyed regarding demographics, prior NTHC-insertion experience, procedural self-confidence and attitudes regarding the training they received. NTHC-insertion skills were assessed using a 28-item checklist. Methods: Participants underwent a pre-test of their NTHC-insertion skills at the internal jugular site using a realistic patient simulator and ultrasound machine. Participants then had a training session that included a didactic presentation and 2 hours of deliberate practice using the simulator. On the following day, trainees completed a post-test of their NTHC-insertion skills. All participants were required to meet or exceed a minimum passing score (MPS previously set at 79%. Trainees who did not reach the MPS were required to perform more deliberate practice until the MPS was achieved. Results: Twenty-two individuals participated in SBML training. None met or exceeded the MPS at baseline with a median checklist score of 20 (IQR, 7.25 to 21. Seventeen of 22 participants (77% completed post-testing and improved their scores to a median of 27 (IQR, 26 to 28; p < 0.001. All met or exceeded the MPS on their first attempt. There were no significant correlations between demographics

  11. Pre- and post-test analyses of a KKL turbine trip test at 109% power using RETRAN-3D

    Energy Technology Data Exchange (ETDEWEB)

    Coddington, P

    2001-03-01

    As part of the PSI/HSK STARS project, pre-test calculations have been performed for a KKL turbine trip test at 109% power using the RETRAN-3D code. In this paper, we first present the results of these calculations, together with a description of the test and a comparison of the results with the measured plant data, and then discuss in more detail the differences between the pre-test results and the plant measurements, including the differences in the initial and boundary conditions, and how these differences influenced the calculated results. Finally, we comment on a series of post-test and sensitivity analyses, which were performed to resolve some of the discrepancies. The results of the pre-test (blind) calculations show good overall agreement with the experimental data, particularly for the maximum in the steam-line mass flow rate following the opening of the turbine bypass valves. This is of critical importance, since the steam-line flow has the least margin to the reactor scram limit. The agreement is especially good since the control rod banks used for the selected rod insertion were changed from those given in the test specification. Following a review of the comparison of the pre-test calculations with the measured data, several deficiencies in the RETRAN-3D model for KKL were identified and corrected as part of the post-test analysis. This allowed for both an improvement in the calculated results, and a deeper understanding of the behaviour of the turbine trip transient. (author)

  12. Pre- and post-test analyses of a KKL turbine trip test at 109% power using RETRAN-3D

    International Nuclear Information System (INIS)

    Coddington, P.

    2001-01-01

    As part of the PSI/HSK STARS project, pre-test calculations have been performed for a KKL turbine trip test at 109% power using the RETRAN-3D code. In this paper, we first present the results of these calculations, together with a description of the test and a comparison of the results with the measured plant data, and then discuss in more detail the differences between the pre-test results and the plant measurements, including the differences in the initial and boundary conditions, and how these differences influenced the calculated results. Finally, we comment on a series of post-test and sensitivity analyses, which were performed to resolve some of the discrepancies. The results of the pre-test (blind) calculations show good overall agreement with the experimental data, particularly for the maximum in the steam-line mass flow rate following the opening of the turbine bypass valves. This is of critical importance, since the steam-line flow has the least margin to the reactor scram limit. The agreement is especially good since the control rod banks used for the selected rod insertion were changed from those given in the test specification. Following a review of the comparison of the pre-test calculations with the measured data, several deficiencies in the RETRAN-3D model for KKL were identified and corrected as part of the post-test analysis. This allowed for both an improvement in the calculated results, and a deeper understanding of the behaviour of the turbine trip transient. (author)

  13. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to predict parameters associated with PE. A score ≤ 4 identified patients with low probability of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38 and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3
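    The pretest-to-posttest updating described above is Bayes' rule applied on the odds scale: post-test odds = pre-test odds × likelihood ratio of the objective test. A minimal sketch (the likelihood ratio of 8 is an illustrative value, not a figure from these studies):

```python
def post_test_probability(pre_test_prob, likelihood_ratio):
    """Bayes on the odds scale: convert probability to odds, multiply
    by the test's likelihood ratio, convert back to probability."""
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# e.g. a low clinical probability of PE (10%) followed by a positive
# objective test with an assumed LR of 8:
p = post_test_probability(0.10, 8.0)
```

    This is why the same positive result carries very different weight in the low-, intermediate-, and high-probability groups of the scores discussed above.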

  14. Opportunity to learn: Investigating possible predictors for pre-course Test Of Astronomy STandards TOAST scores

    Science.gov (United States)

    Berryhill, Katie J.

    As astronomy education researchers become more interested in experimentally testing innovative teaching strategies to enhance learning in introductory astronomy survey courses ("ASTRO 101"), scholars are paying increased attention to the factors impacting student gain scores on the widely used Test Of Astronomy STandards (TOAST). The TOAST is usually administered in a pre-test and post-test research design, and one might naturally assume that the pre-course differences observed between high- and low-scoring college students are due in large part to their pre-existing motivation, interest, experience in science, and attitudes about astronomy. To explore this notion, 11 non-science-majoring undergraduates taking ASTRO 101 at west coast community colleges were interviewed in the first few weeks of the course to better understand students' pre-existing affect toward learning astronomy, with an eye toward predicting student success. In answering this question, we hope to contribute to our understanding of the incoming knowledge of students taking undergraduate introductory astronomy classes, and also to gain insight into how faculty can best meet those students' needs and assist them in achieving success. Perhaps surprisingly, there was only a weak correlation between students' motivation toward learning astronomy and their pre-test scores. Instead, the most fruitful predictor of TOAST pre-test scores was the quantity of pre-existing, informal, self-directed astronomy learning experiences.

  15. Development and evaluation of probability density functions for a set of human exposure factors

    Energy Technology Data Exchange (ETDEWEB)

    Maddalena, R.L.; McKone, T.E.; Bodnar, A.; Jacobson, J.

    1999-06-01

    The purpose of this report is to describe efforts carried out during 1998 and 1999 at the Lawrence Berkeley National Laboratory to assist the U.S. EPA in developing and ranking the robustness of a set of default probability distributions for exposure assessment factors. Among the current needs of the exposure-assessment community is the need to provide data for linking exposure, dose, and health information in ways that improve environmental surveillance, improve predictive models, and enhance risk assessment and risk management (NAS, 1994). The U.S. Environmental Protection Agency (EPA) Office of Emergency and Remedial Response (OERR) plays a lead role in developing national guidance and planning future activities that support the EPA Superfund Program. OERR is in the process of updating its 1989 Risk Assessment Guidance for Superfund (RAGS) as part of the EPA Superfund reform activities. Volume III of RAGS, when completed in 1999 will provide guidance for conducting probabilistic risk assessments. This revised document will contain technical information including probability density functions (PDFs) and methods used to develop and evaluate these PDFs. The PDFs provided in this EPA document are limited to those relating to exposure factors.

  16. Development and evaluation of probability density functions for a set of human exposure factors

    International Nuclear Information System (INIS)

    Maddalena, R.L.; McKone, T.E.; Bodnar, A.; Jacobson, J.

    1999-01-01

    The purpose of this report is to describe efforts carried out during 1998 and 1999 at the Lawrence Berkeley National Laboratory to assist the U.S. EPA in developing and ranking the robustness of a set of default probability distributions for exposure assessment factors. Among the current needs of the exposure-assessment community is the need to provide data for linking exposure, dose, and health information in ways that improve environmental surveillance, improve predictive models, and enhance risk assessment and risk management (NAS, 1994). The U.S. Environmental Protection Agency (EPA) Office of Emergency and Remedial Response (OERR) plays a lead role in developing national guidance and planning future activities that support the EPA Superfund Program. OERR is in the process of updating its 1989 Risk Assessment Guidance for Superfund (RAGS) as part of the EPA Superfund reform activities. Volume III of RAGS, when completed in 1999 will provide guidance for conducting probabilistic risk assessments. This revised document will contain technical information including probability density functions (PDFs) and methods used to develop and evaluate these PDFs. The PDFs provided in this EPA document are limited to those relating to exposure factors

  17. EVOLVE : a Bridge between Probability, Set Oriented Numerics, and Evolutionary Computation II

    CERN Document Server

    Coello, Carlos; Tantar, Alexandru-Adrian; Tantar, Emilia; Bouvry, Pascal; Moral, Pierre; Legrand, Pierrick; EVOLVE 2012

    2013-01-01

    This book comprises a selection of papers from EVOLVE 2012, held in Mexico City, Mexico. The aim of EVOLVE is to build a bridge between probability, set-oriented numerics and evolutionary computing, so as to identify new common and challenging research aspects. The conference is also intended to foster a growing interest in robust and efficient methods with a sound theoretical background. EVOLVE is intended to unify theory-inspired methods and cutting-edge techniques ensuring performance-guarantee factors. By gathering researchers with different backgrounds, a unified view and vocabulary can emerge in which theoretical advancements may echo across different domains. In summary, EVOLVE focuses on challenging aspects arising at the passage from theory to new paradigms, and aims to provide a unified view while raising questions related to reliability, performance guarantees and modeling. The papers of EVOLVE 2012 contribute to this goal.

  18. Set-up for differential manometers testing

    International Nuclear Information System (INIS)

    Ratushnyj, M.I.; Galkin, Yu.V.; Nechaj, A.G.

    1985-01-01

    The set-up for checking and testing the metrological characteristics of TPP and NPP differential manometers with a maximum pressure drop of up to 250 kPa is briefly described. The set-up provides automatic and manual assignment of gauge air pressure values with errors of 0.1% and 0.25%, respectively, and is supplied with standard equipment to measure output signals. Power is drawn from a single-phase 220 V alternating-current circuit; air is supplied from a pneumatic system at a pressure of 0.4-0.6 MPa. Application of the set-up increases operating efficiency five-fold in the checking and tuning of differential manometers

  19. A low false negative filter for detecting rare bird species from short video segments using a probable observation data set-based EKF method.

    Science.gov (United States)

    Song, Dezhen; Xu, Yiliang

    2010-09-01

    We report a new filter to assist the search for rare bird species. Since a rare bird appears in front of a camera with very low occurrence (e.g., less than ten times per year) and for a very short duration (e.g., a fraction of a second), our algorithm must have a very low false negative rate. We verify the bird body-axis information against the known bird flying dynamics from the short video segment. Since a regular extended Kalman filter (EKF) cannot converge due to high measurement error and limited data, we develop a novel probable observation data set (PODS)-based EKF method. The new PODS-EKF searches the measurement error range for all probable observation data that ensure the convergence of the corresponding EKF within a short time frame. The algorithm has been extensively tested using both simulated inputs and real video data of four representative bird species. In the physical experiments, our algorithm has been tested on rock pigeons and red-tailed hawks with 119 motion sequences. The area under the ROC curve is 95.0%. During the one-year search of ivory-billed woodpeckers, the system reduces the raw video data of 29.41 TB to only 146.7 MB (reduction rate 99.9995%).
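    The EKF at the core of the PODS method reduces, in the linear case, to the ordinary Kalman predict/update cycle. Below is a minimal 1-D constant-velocity sketch of that building block (all noise parameters and measurements are illustrative assumptions, not values from the paper):

```python
def kf_step(x, P, z, dt=1.0, q=1e-3, r=0.5):
    """One Kalman predict/update cycle for state x = [position, velocity]
    with 2x2 covariance P (list of lists) and scalar position measurement z.
    q is process noise added to the diagonal, r is measurement variance."""
    # Predict with constant-velocity model F = [[1, dt], [0, 1]]
    x = [x[0] + dt * x[1], x[1]]
    P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
          P[0][1] + dt * P[1][1]],
         [P[1][0] + dt * P[1][1],
          P[1][1] + q]]
    # Update with observation matrix H = [1, 0]
    s = P[0][0] + r                      # innovation variance
    k = [P[0][0] / s, P[1][0] / s]       # Kalman gain
    y = z - x[0]                         # innovation
    x = [x[0] + k[0] * y, x[1] + k[1] * y]
    P = [[(1 - k[0]) * P[0][0], (1 - k[0]) * P[0][1]],
         [P[1][0] - k[1] * P[0][0], P[1][1] - k[1] * P[0][1]]]
    return x, P

# noisy positions along a roughly unit-slope track
x, P = [0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]]
for z in [1.0, 2.1, 2.9, 4.2, 5.0]:
    x, P = kf_step(x, P, z)
```

    The PODS idea then layers a search over the measurement-error range on top of this cycle, keeping only observation sets for which the filter converges quickly.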

  20. BNL NONLINEAR PRE TEST SEISMIC ANALYSIS FOR THE NUPEC ULTIMATE STRENGTH PIPING TEST PROGRAM

    International Nuclear Information System (INIS)

    DEGRASSI, G.; HOFMAYER, C.; MURPHY, C.; SUZUKI, K.; NAMITA, Y.

    2003-01-01

    The Nuclear Power Engineering Corporation (NUPEC) of Japan has been conducting a multi-year research program to investigate the behavior of nuclear power plant piping systems under large seismic loads. The objectives of the program are: to develop a better understanding of the elasto-plastic response and ultimate strength of nuclear piping; to ascertain the seismic safety margin of current piping design codes; and to assess new piping code allowable stress rules. Under this program, NUPEC has performed a large-scale seismic proving test of a representative nuclear power plant piping system. In support of the proving test, a series of materials tests, static and dynamic piping component tests, and seismic tests of simplified piping systems have also been performed. As part of collaborative efforts between the United States and Japan on seismic issues, the US Nuclear Regulatory Commission (USNRC) and its contractor, the Brookhaven National Laboratory (BNL), are participating in this research program by performing pre-test and post-test analyses, and by evaluating the significance of the program results with regard to safety margins. This paper describes BNL's pre-test analysis to predict the elasto-plastic response for one of NUPEC's simplified piping system seismic tests. The capability to simulate the anticipated ratcheting response of the system was of particular interest. Analyses were performed using classical bilinear and multilinear kinematic hardening models as well as a nonlinear kinematic hardening model. Comparisons of analysis results for each plasticity model against test results for a static cycling elbow component test and for a simplified piping system seismic test are presented in the paper

  1. A Teaching Method on Basic Chemistry for Freshman : Teaching Method with Pre-test and Post-test

    OpenAIRE

    立木, 次郎; 武井, 庚二

    2003-01-01

    This report deals with a teaching method on basic chemistry for freshman. This teaching method contains guidance and instruction to how to understand basic chemistry. Pre-test and post-test have been put into practice each time. Each test was returned to students at class in the following weeks.

  2. Prognostic value of stress echocardiography in women with high (⩾80%) probability of coronary artery disease

    OpenAIRE

    Davar, J; Roberts, E; Coghlan, J; Evans, T; Lipkin, D

    2001-01-01

    OBJECTIVE—To assess the prognostic significance of stress echocardiography in women with a high probability of coronary artery disease (CAD).
SETTING—Secondary and tertiary cardiology unit at a university teaching hospital.
PARTICIPANTS—A total of 135 women (mean (SD) age 63 (9) years) with pre-test probability of CAD ⩾80% were selected from a database of patients investigated by treadmill or dobutamine stress echocardiography between 1995 and 1998.
MAIN OUTCOME MEASURES—Patients were followe...

  3. Second-order asymptotics for quantum hypothesis testing in settings beyond i.i.d.—quantum lattice systems and more

    International Nuclear Information System (INIS)

    Datta, Nilanjana; Rouzé, Cambyse; Pautrat, Yan

    2016-01-01

    Quantum Stein’s lemma is a cornerstone of quantum statistics and concerns the problem of correctly identifying a quantum state, given the knowledge that it is one of two specific states (ρ or σ). It was originally derived in the asymptotic i.i.d. setting, in which arbitrarily many (say, n) identical copies of the state (ρ^⊗n or σ^⊗n) are considered to be available. In this setting, the lemma states that, for any given upper bound on the probability α_n of erroneously inferring the state to be σ, the probability β_n of erroneously inferring the state to be ρ decays exponentially in n, with the rate of decay converging to the relative entropy of the two states. The second order asymptotics for quantum hypothesis testing, which establishes the speed of convergence of this rate of decay to its limiting value, was derived in the i.i.d. setting independently by Tomamichel and Hayashi, and Li. We extend this result to settings beyond i.i.d. Examples of these include Gibbs states of quantum spin systems (with finite-range, translation-invariant interactions) at high temperatures, and quasi-free states of fermionic lattice gases.
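    The speed-of-convergence statement above can be written compactly. A sketch of the standard i.i.d. second-order expansion, with D(ρ‖σ) the quantum relative entropy, V(ρ‖σ) the quantum information variance, and Φ⁻¹ the inverse of the standard normal CDF:

```latex
% Optimal type-II error \beta_n at a fixed type-I error bound \alpha:
-\log \beta_n \;=\; n\, D(\rho\|\sigma)
              \;+\; \sqrt{n\, V(\rho\|\sigma)}\;\Phi^{-1}(\alpha)
              \;+\; O(\log n)
```

    The first term is quantum Stein's lemma itself; the √n term is the second-order correction whose extension beyond i.i.d. is the subject of this paper.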

  4. A Web Based Framework for Pre-release Testing of Mobile Applications

    Directory of Open Access Journals (Sweden)

    Hamdy Abeer

    2016-01-01

    Full Text Available Mobile applications are becoming an integral part of daily life and of business’s marketing plan. They are helpful in promoting the business, and in attracting and retaining customers. Software testing is vital to ensure the delivery of high-quality mobile applications that can be accessed across different platforms and meet business and technical requirements. This paper proposes a web-based tool, namely Pons, for the distribution of pre-release mobile applications for the purpose of manual testing. Pons facilitates building, running, and manually testing Android applications directly in the browser. It gets the developers and end users engaged in testing the applications in one place, alleviates the tester’s burden of installing and maintaining testing environments, and provides a platform for developers to rapidly iterate on the software and integrate changes over time. Thus, it speeds up the pre-release testing process, reduces its cost and increases customer satisfaction.

  5. Assessing the Probability that a Finding Is Genuine for Large-Scale Genetic Association Studies.

    Science.gov (United States)

    Kuo, Chia-Ling; Vsevolozhskaya, Olga A; Zaykin, Dmitri V

    2015-01-01

    Genetic association studies routinely involve massive numbers of statistical tests accompanied by P-values. Whole genome sequencing technologies increased the potential number of tested variants to tens of millions. The more tests are performed, the smaller the P-value required to be deemed significant. However, a small P-value is not equivalent to a small chance of a spurious finding, and significance thresholds may fail to serve as efficient filters against false results. While the Bayesian approach can provide a direct assessment of the probability that a finding is spurious, its adoption in association studies has been slow, due in part to the ubiquity of P-values and the automated way they are, as a rule, produced by software packages. Attempts to design simple ways to convert an association P-value into the probability that a finding is spurious have been met with difficulties. The False Positive Report Probability (FPRP) method has gained increasing popularity. However, FPRP is not designed to estimate the probability for a particular finding, because it is defined for an entire region of hypothetical findings with P-values at least as small as the one observed for that finding. Here we propose a method that lets researchers extract the probability that a finding is spurious directly from a P-value. Considering the counterpart of that probability, we term this method POFIG: the Probability that a Finding is Genuine. Our approach shares FPRP's simplicity, but gives a valid probability that a finding is spurious given a P-value. In addition to straightforward interpretation, POFIG has desirable statistical properties. The POFIG average across a set of tentative associations provides an estimated proportion of false discoveries in that set. POFIGs are easily combined across studies and are immune to multiple testing and selection bias. We illustrate an application of POFIG method via analysis of GWAS associations with Crohn's disease.
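    The FPRP quantity discussed above has a simple closed form: given a significance level, study power, and a prior probability that the association is genuine, it is the posterior probability that a "significant" finding is nonetheless null. A sketch with illustrative numbers (not from the Crohn's disease analysis):

```python
def fprp(alpha, power, prior):
    """False Positive Report Probability: the probability that the null
    is true given a result significant at level alpha, for a study with
    the given power and prior probability `prior` that the association
    is genuine."""
    return (alpha * (1.0 - prior)) / (alpha * (1.0 - prior) + power * prior)

# A P-value below 1e-4 with 80% power but a 1-in-10,000 prior on a
# genuine association is still more likely spurious than genuine:
f = fprp(1e-4, 0.8, 1e-4)
```

    This illustrates the abstract's point that a small P-value alone does not imply a small chance of a spurious finding; POFIG refines this by conditioning on the particular P-value rather than on a region of P-values.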

  6. Verbal versus Numerical Probabilities: Does Format Presentation of Probabilistic Information regarding Breast Cancer Screening Affect Women's Comprehension?

    Science.gov (United States)

    Vahabi, Mandana

    2010-01-01

    Objective: To test whether the format in which women receive probabilistic information about breast cancer and mammography affects their comprehension. Methods: A convenience sample of 180 women received pre-assembled randomized packages containing a breast health information brochure, with probabilities presented in either verbal or numeric…

  7. Pre-transplantation glucose testing for predicting new-onset diabetes mellitus after renal transplantation.

    Science.gov (United States)

    Ramesh Prasad, G V; Huang, M; Bandukwala, F; Nash, M M; Rapi, L; Montada-Atin, T; Meliton, G; Zaltzman, J S

    2009-02-01

    New-onset diabetes after renal transplantation (NODAT) adversely affects graft and patient survival. However, NODAT risk based on pre-transplant blood glucose (BG) levels has not been defined. Our goal was to identify the best pre-transplant testing method and cut-off values. We performed a case-control analysis of non-diabetic recipients who received a live donor allograft with at least 6 months post-transplant survival. Pre-transplant glucose abnormalities were excluded through 75 g oral glucose tolerance testing (OGTT) and random BG (RBG) measurement. NODAT was defined based on 2003 Canadian Diabetes Association criteria. Multivariate logistic and Cox regression analysis was performed to determine independent predictor variables for NODAT. Receiver-operating-characteristic (ROC) curves were constructed to determine threshold BG values for diabetes risk. 151 recipients met initial entry criteria. 12 had pre-transplant impaired fasting glucose and/or impaired glucose tolerance, among whom 7 (58%) developed NODAT. In the remaining 139, 24 (17%) developed NODAT. NODAT risk exceeded 25% for those with pre-transplant RBG > 6.0 mmol/l and 50% if > 7.2 mmol/l. Pre-transplant RBG provided the highest AUC (0.69, p = 0.002) by ROC analysis. Increasing age (p = 0.025), acute rejection (p = 0.011), and RBG > 6.0 mmol/l (p = 0.001) were independent predictors of NODAT. Pre-transplant glucose testing is a specific marker for NODAT. Patients can be counseled on their incremental risk even within the normal BG range if the OGTT is normal.
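    The ROC AUC reported above has a direct probabilistic reading (the Mann-Whitney interpretation): it is the probability that a randomly chosen NODAT case had a higher pre-transplant glucose value than a randomly chosen control. A small sketch with hypothetical glucose values, not the study data:

```python
def auc(scores_pos, scores_neg):
    """Empirical ROC AUC via the Mann-Whitney statistic: the fraction
    of (positive, negative) pairs in which the positive case scores
    higher, counting ties as half."""
    wins = ties = 0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1
            elif sp == sn:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))

# hypothetical pre-transplant random BG values (mmol/l):
# cases who later developed NODAT vs. controls who did not
a = auc([6.1, 7.3, 6.8, 5.9], [5.2, 6.0, 5.5, 6.4, 5.8])
```

    An AUC of 0.69, as in the study, means the ordering is informative but far from perfect, which is why the authors also report threshold-based risk strata.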

  8. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  9. A global, incremental development method for a web-based prostate cancer treatment decision aid and usability testing in a Dutch clinical setting.

    Science.gov (United States)

    Cuypers, Maarten; Lamers, Romy Ed; Kil, Paul Jm; The, Regina; Karssen, Klemens; van de Poll-Franse, Lonneke V; de Vries, Marieke

    2017-07-01

    Many new decision aids are developed while aspects of existing decision aids could also be useful, leading to a sub-optimal use of resources. To support treatment decision-making in prostate cancer patients, a pre-existing evidence-based Canadian decision aid was adapted to the Dutch clinical setting. After analyses of the original decision aid and routines in Dutch prostate cancer care, adjustments to the decision aid structure and content were made. Subsequent usability testing (N = 11) resulted in 212 comments. Care providers mainly provided feedback on medical content, and patients commented most on usability and summary layout. All participants reported that the decision aid was comprehensible and well-structured and would recommend decision aid use. After usability testing, final adjustments to the decision aid were made. The presented methods could be useful for the cultural adaptation of pre-existing tools into other languages and settings, ensuring optimal usage of previous scientific and practical efforts and allowing for a global, incremental decision aid development process.

  10. On pre-test sensitisation and peer assessment to enhance learning gain in science education

    NARCIS (Netherlands)

    Bos, Floor/Floris

    2009-01-01

    *The main part of this thesis focuses on designing, optimising, and studying the embedding of two types of interventions: pre-testing and peer assessment, both supported by or combined with ICT-tools. * Pre-test sensitisation is used intentionally to boost the learning gain of the main intervention,

  11. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  12. Detecting intrajudge inconsistency in standard setting using test items with a selected-response format

    NARCIS (Netherlands)

    van der Linden, Willem J.; Vos, Hendrik J.; Chang, Lei

    2002-01-01

    In judgmental standard setting experiments, it may be difficult to specify subjective probabilities that adequately take the properties of the items into account. As a result, these probabilities are not consistent with each other in the sense that they do not refer to the same borderline level of

  13. Finite test sets development method for test execution of safety critical software

    International Nuclear Information System (INIS)

    Shin, Sung Min; Kim, Hee Eun; Kang, Hyun Gook; Lee, Sung Jiun

    2014-01-01

    The V&V method has been utilized for safety-critical software, while software reliability growth models (SRGMs) face difficulties because of the lack of failure-occurrence data in the development phase. For safety-critical software, moreover, failure data cannot be gathered after installation in a real plant, given the severe consequences of failure. Therefore, to complement the V&V method, a test-based method needs to be developed. Some studies on test-based reliability quantification for safety-critical software have been conducted in the nuclear field. These studies provide useful guidance on generating test sets. An important concept of the guidance is that the test sets represent 'trajectories' (a series of successive values for the input variables of a program that occur during the operation of the software over time) in the input space of the software. In practice, the inputs to the software depend on the state of the plant at that time, and these inputs form a new internal state of the software by changing the values of some variables. In other words, the internal state of the software at a specific time depends on the history of past inputs. Here, the internal state of the software that can be changed by past inputs is named the Context of Software (CoS). In a certain CoS, a software failure occurs when a fault is triggered by some inputs. To cover the failure-occurrence mechanism of software, preceding researches insist that the inputs should be in trajectory form. However, this approach has two critical problems. One is the length of the trajectory input: the input trajectory should be long enough to cover the failure mechanism, but how long is enough is not clear. What is worse, to cover some accident scenarios, one input set should represent dozens of hours of successive values. The other problem is the number of tests needed: to satisfy a target reliability with a reasonable confidence level, a very large number of test sets is required. Development of this number of test sets is a herculean

  14. Probability in reasoning: a developmental test on conditionals.

    Science.gov (United States)

    Barrouillet, Pierre; Gauffroy, Caroline

    2015-04-01

    Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q)=P(q|p). Using the probabilistic truth-table task in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation across ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement only endorsed by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related to the probability task, which do not consequently support the probabilistic approach of human reasoning over alternative theories. Copyright © 2014 Elsevier B.V. All rights reserved.

  15. On the probability of cure for heavy-ion radiotherapy

    International Nuclear Information System (INIS)

    Hanin, Leonid; Zaider, Marco

    2014-01-01

    The probability of a cure in radiation therapy (RT)—viewed as the probability of eventual extinction of all cancer cells—is unobservable, and the only way to compute it is through modeling the dynamics of the cancer cell population during and post-treatment. The conundrum at the heart of biophysical models aimed at such prospective calculations is the absence of information on the initial size of the subpopulation of clonogenic cancer cells (also called stem-like cancer cells), which largely determines the outcome of RT in both individual and population settings. Other relevant parameters (e.g. potential doubling time, cell loss factor and survival probability as a function of dose) are, at least in principle, amenable to empirical determination. In this article we demonstrate that, for heavy-ion RT, microdosimetric considerations (justifiably ignored in conventional RT) combined with an expression for the clone extinction probability obtained from a mechanistic model of radiation cell survival lead to useful upper bounds on the size of the pre-treatment population of clonogenic cancer cells as well as upper and lower bounds on the cure probability. The main practical impact of these limiting values is the ability to make predictions about the probability of a cure for a given population of patients treated with newer, still unexplored treatment modalities from the empirically determined probability of a cure for the same or similar population resulting from conventional low linear energy transfer (typically photon/electron) RT. We also propose that the current trend to deliver a lower total dose in a smaller number of fractions with larger-than-conventional doses per fraction has physical limits that must be understood before embarking on a particular treatment schedule. (paper)
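
The extinction-probability reasoning can be illustrated with the simplest Poisson tumour-control-probability (TCP) model (a deliberate simplification of the record's microdosimetric model; the clonogen number and survival fraction below are hypothetical):

```python
import math

def tcp(n_clonogens, survival_fraction):
    """Poisson TCP: probability that every clonogenic cell is killed,
    i.e. the extinction probability of the clonogen population."""
    return math.exp(-n_clonogens * survival_fraction)

def clonogen_upper_bound(observed_tcp, survival_fraction):
    """Invert the TCP formula to bound the pre-treatment clonogen number
    consistent with an observed population cure rate."""
    return -math.log(observed_tcp) / survival_fraction

cure_prob = tcp(1e7, 1e-8)                        # 10^7 clonogens, survival 10^-8
bound = clonogen_upper_bound(cure_prob, 1e-8)     # recovers the clonogen number
```

The inversion in `clonogen_upper_bound` is the logical shape of the bounds discussed above: an empirically observed cure rate plus a survival estimate constrains the unobservable initial clonogen population.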

  16. Probability of background to produce a signal-like excess, for all Higgs masses tested.

    CERN Document Server

    ATLAS, collaboration

    2012-01-01

    The probability of background to produce a signal-like excess, for all the Higgs boson masses tested. At almost all masses, the probability (solid curve) is at least a few percent; however, at 126.5 GeV it dips to 3×10⁻⁷, or one chance in three million, the '5-sigma' gold-standard normally used for the discovery of a new particle. A Standard Model Higgs boson with that mass would produce a dip to 4.6 sigma.
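
The quoted numbers relate through the standard one-sided Gaussian tail convention used in particle physics; a minimal sketch of the p-value ↔ sigma conversion, using only the standard library:

```python
from statistics import NormalDist

def p_to_sigma(p_value):
    """One-sided background tail probability -> equivalent Gaussian significance."""
    return NormalDist().inv_cdf(1.0 - p_value)

def sigma_to_p(n_sigma):
    """Gaussian significance -> one-sided tail probability."""
    return 1.0 - NormalDist().cdf(n_sigma)

significance = p_to_sigma(3e-7)   # just under 5 sigma
```

A p-value of 3×10⁻⁷ corresponds to about 4.99 sigma, consistent with the '5-sigma' language above (the exact 5-sigma threshold is a one-sided p of about 2.87×10⁻⁷).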

  17. Fabrication and testing of W7-X pre-series target elements

    International Nuclear Information System (INIS)

    Boscary, J; Boeswirth, B; Greuner, H; Grigull, P; Missirlian, M; Plankensteiner, A; Schedler, B; Friedrich, T; Schlosser, J; Streibl, B; Traxler, H

    2007-01-01

    The assembly of the highly-loaded target plates of the WENDELSTEIN 7-X (W7-X) divertor requires the fabrication of 890 target elements (TEs). The plasma facing material is made of CFC NB31 flat tiles bonded to a CuCrZr copper alloy water-cooled heat sink. The elements are designed to remove a stationary heat flux and power up to 10 MW m⁻² and 100 kW, respectively. Before launching the serial fabrication, pre-series activities aimed at qualifying the design, the manufacturing route and the non-destructive examinations (NDEs). High heat flux (HHF) tests performed on full-scale pre-series TEs resulted in an improvement of the design of the bond between tiles and heat sink to reduce the stresses during operation. The consequence is the fabrication of additional pre-series TEs to be tested in the HHF facility GLADIS. NDEs of this bond based on thermography methods are developed to define the acceptance criteria suitable for serial fabrication.

  18. The impact of Nursing Rounds on the practice environment and nurse satisfaction in intensive care: pre-test post-test comparative study.

    Science.gov (United States)

    Aitken, Leanne M; Burmeister, Elizabeth; Clayton, Samantha; Dalais, Christine; Gardner, Glenn

    2011-08-01

    Factors previously shown to influence patient care include effective decision making, team work, evidence based practice, staffing and job satisfaction. Clinical rounds have the potential to optimise these factors and impact on patient outcomes, but use of this strategy by intensive care nurses has not been reported. To determine the effect of implementing Nursing Rounds in the intensive care environment on patient care planning and nurses' perceptions of the practice environment and work satisfaction. Pre-test post-test two-group comparative design. Two intensive care units in tertiary teaching hospitals in Australia. A convenience sample of registered nurses (n=244) working full time or part time in the participating intensive care units. Nurses in participating intensive care units were asked to complete the Practice Environment Scale-Nursing Work Index (PES-NWI) and the Nursing Worklife Satisfaction Scale (NWSS) prior to and after a 12 month period during which regular Nursing Rounds were conducted in the intervention unit. Issues raised during Nursing Rounds were described and categorised. The characteristics of the sample and scale scores were summarised with differences between pre and post scores analysed using t-tests for continuous variables and chi-square tests for categorical variables. Independent predictors of the PES-NWI were determined using multivariate linear regression. Nursing Rounds resulted in 577 changes being initiated for 171 patients reviewed; these changes related to the physical, psychological--individual, psychological--family, or professional practice aspects of care. Total PES-NWI and NWSS scores were similar before and after the study period in both participating units. The NWSS sub-scale of interaction between nurses improved in the intervention unit during the study period (pre: 4.85±0.93; post: 5.36±0.89; p=0.002) with no significant increase in the control group. Factors independently related to higher PES-NWI included

  19. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changes in the frequency of surveillance tests, preventive maintenance or parts replacement of safety related components may change component failure probability and, as a result, core damage probability. It is also anticipated that the change differs depending on the initiating event frequency and the component type. This study assessed the change of core damage probability using a simplified PSA model capable of calculating core damage probability in a short time period, developed by the US NRC to process accident sequence precursors, when the failure probability of various components is changed between 0 and 1, or when Japanese or American initiating event frequency data are used. As a result of the analysis: (1) It was clarified that the frequency of surveillance tests, preventive maintenance or parts replacement of motor driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability moves in the increasing direction. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor operated valves and the turbine driven auxiliary feedwater pump, since the change in core damage probability is small when their failure probability changes by about one order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generator, even if the failure probability changes by one order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large even for changes in the increasing direction. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency etc. (author)
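
The kind of sensitivity study described can be sketched with a toy fault-tree model (entirely illustrative; the actual NRC accident sequence precursor model and the component probabilities are far more detailed):

```python
def core_damage_freq(ie_freq, p_pump_train, p_valve):
    """Toy PSA: core damage if the initiating event occurs and either both
    redundant pump trains fail or an isolation valve fails (hypothetical
    structure and numbers, for sensitivity illustration only)."""
    p_pumps = p_pump_train ** 2                      # two redundant trains
    p_mitigation_fails = p_pumps + p_valve * (1.0 - p_pumps)
    return ie_freq * p_mitigation_fails

base      = core_damage_freq(1e-2, 1e-3, 1e-4)
pump_x10  = core_damage_freq(1e-2, 1e-2, 1e-4)   # pump probability x10
valve_x10 = core_damage_freq(1e-2, 1e-3, 1e-3)   # valve probability x10
# The same one-order-of-magnitude change applied to different components
# shifts the core damage frequency by very different amounts.
```

This is the qualitative point of the study: the sensitivity of core damage probability to a given failure-probability change depends strongly on where the component sits in the model.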

  20. Pre-Gas Drilling Drinking Water Testing--An Educational Opportunity for Extension

    Science.gov (United States)

    Swistock, Brian; Clark, James

    2015-01-01

    The increase in shale gas drilling in Pennsylvania has resulted in thousands of landowners receiving predrilling testing of their drinking water. Landowners often have difficulty understanding test reports resulting in low awareness of pre-existing problems. Extension and several partners developed a program to improve understanding of…

  1. Evaluation of the theory-based Quality Improvement in Physical Therapy (QUIP) programme: a one-group, pre-test post-test pilot study.

    Science.gov (United States)

    Rutten, Geert M; Harting, Janneke; Bartholomew, L Kay; Schlief, Angelique; Oostendorp, Rob A B; de Vries, Nanne K

    2013-05-25

    Guideline adherence in physical therapy is far from optimal, which has consequences for the effectiveness and efficiency of physical therapy care. Programmes to enhance guideline adherence have, so far, been relatively ineffective. We systematically developed a theory-based Quality Improvement in Physical Therapy (QUIP) programme aimed at the individual performance level (practicing physiotherapists; PTs) and the practice organization level (practice quality manager; PQM). The aim of the study was to pilot test the multilevel QUIP programme's effectiveness and the fidelity, acceptability and feasibility of its implementation. A one-group, pre-test, post-test pilot study (N = 8 practices; N = 32 PTs, 8 of whom were also PQMs) done between September and December 2009. Guideline adherence was measured using clinical vignettes that addressed 12 quality indicators reflecting the guidelines' main recommendations. Determinants of adherence were measured using quantitative methods (questionnaires). Delivery of the programme and management changes were assessed using qualitative methods (observations, group interviews, and document analyses). Changes in adherence and determinants were tested in the paired samples T-tests and expressed in effect sizes (Cohen's d). Overall adherence did not change (3.1%; p = .138). Adherence to three quality indicators improved (8%, 24%, 43%; .000 ≤ p ≤ .023). Adherence to one quality indicator decreased (-15.7%; p = .004). Scores on various determinants of individual performance improved and favourable changes at practice organizational level were observed. Improvements were associated with the programme's multilevel approach, collective goal setting, and the application of self-regulation; unfavourable findings with programme deficits. The one-group pre-test post-test design limits the internal validity of the study, the self-selected sample its external validity. The QUIP programme has the potential to change physical

  2. Efficiency of the pre-heater against flow rate on primary the beta test loop

    International Nuclear Information System (INIS)

    Edy Sumarno; Kiswanta; Bambang Heru; Ainur R; Joko P

    2013-01-01

    Calculation of the efficiency of the pre-heater against flow rate on the primary side of the BETA Test Loop has been carried out. The BETA test loop (UUB) is an experimental facility for studying thermal hydraulic phenomena, especially post-LOCA (Loss of Coolant Accident) thermal hydraulics. The BETA Test Loop contains a pre-heater that transfers heat from the primary side to the secondary side; its efficiency is determined by comparing the incoming heat energy with the energy removed by the secondary fluid. The characterization is intended to determine the performance of the pre-heater, to be used as a tool for analysis and as a reference for the design of experiments. The efficiency was calculated by operating the pre-heater with varied fluid flow rates on the primary side. The results show that the efficiency changes with flow rate: 71.26% at 163.50 ml/s and 60.65% at 850.90 ml/s. The efficiency could be even greater if the pre-heater tank were wrapped with thermal insulation so that there is no heat leakage. (author)
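
The efficiency definition used here — heat picked up by the secondary fluid divided by heat supplied on the primary side — can be sketched as follows (the flow rates and temperature differences below are hypothetical, not the measured BETA data):

```python
CP_WATER = 4186.0  # J/(kg K), approximate specific heat of water

def heat_rate(flow_ml_per_s, delta_t_kelvin):
    """Heat transport rate Q = m_dot * cp * dT, taking 1 ml of water ~ 1 g."""
    mass_flow_kg_s = flow_ml_per_s * 1e-3
    return mass_flow_kg_s * CP_WATER * delta_t_kelvin

def preheater_efficiency(primary_flow, primary_dt, secondary_flow, secondary_dt):
    """Ratio of heat picked up by the secondary side to heat given up
    by the primary side."""
    return heat_rate(secondary_flow, secondary_dt) / heat_rate(primary_flow, primary_dt)

# Hypothetical: equal flows, secondary temperature rise smaller than the
# primary temperature drop, reproducing an efficiency of ~71%.
eff = preheater_efficiency(163.5, 10.0, 163.5, 7.126)
```

Heat lost through the uninsulated tank wall shows up exactly as this ratio falling below one, which is why the abstract expects insulation to raise the measured efficiency.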

  3. Probability judgments under ambiguity and conflict.

    Science.gov (United States)

    Smithson, Michael

    2015-01-01

    Whether conflict and ambiguity are distinct kinds of uncertainty remains an open question, as does their joint impact on judgments of overall uncertainty. This paper reviews recent advances in our understanding of human judgment and decision making when both ambiguity and conflict are present, and presents two types of testable models of judgments under conflict and ambiguity. The first type concerns estimate-pooling to arrive at "best" probability estimates. The second type is models of subjective assessments of conflict and ambiguity. These models are developed for dealing with both described and experienced information. A framework for testing these models in the described-information setting is presented, including a reanalysis of a multi-nation data-set to test best-estimate models, and a study of participants' assessments of conflict, ambiguity, and overall uncertainty reported by Smithson (2013). A framework for research in the experienced-information setting is then developed, that differs substantially from extant paradigms in the literature. This framework yields new models of "best" estimates and perceived conflict. The paper concludes with specific suggestions for future research on judgment and decision making under conflict and ambiguity.

  4. Pre-symptomatic increase in urine-orosomucoid excretion in pre-eclamptic women

    DEFF Research Database (Denmark)

    Kronborg, Camilla Skovhus; Allen, Jim; Vittinghus, Erik

    2007-01-01

    , 32 women developed pre-eclampsia, and 5 controls for every case of pre-eclampsia were found. Blood samples were collected 4 times and urine samples 6 times from the 18/19th week and throughout pregnancy. Orosomucoid and albumin in plasma were analysed by standard methods, and in urine by sandwich...... in orosomucoid. In the plasma samples, orosomucoid was significantly higher late in pre-eclamptic pregnancies (>or=36th week, p=0.0275). CONCLUSIONS: Pre-eclampsia is associated with a pre-symptomatic increase in the urine excretion of orosomucoid, and orosomucoid excretion precedes that of albumin. Orosomucoid...... excretion can probably be used as a prognostic tool in combination with other screening methods, and seems to be a more sensitive marker for evolving pre-eclampsia than albumin. Plasma orosomucoid is significantly increased late in pre-eclampsia. Thus, the increased excretion of orosomucoid must primarily...

  5. Supervised pre-processing approaches in multiple class variables classification for fish recruitment forecasting

    KAUST Repository

    Fernandes, José Antonio

    2013-02-01

    A multi-species approach to fisheries management requires taking into account the interactions between species in order to improve recruitment forecasting of the fish species. Recent advances in Bayesian networks allow the learning of models in which several interrelated variables are forecasted simultaneously. These models are known as multi-dimensional Bayesian network classifiers (MDBNs). Pre-processing steps are critical for the posterior learning of the model in these kinds of domains. Therefore, in the present study, a set of 'state-of-the-art' uni-dimensional pre-processing methods, within the categories of missing data imputation, feature discretization and feature subset selection, are adapted to be used with MDBNs. A framework that includes the proposed multi-dimensional supervised pre-processing methods, coupled with a MDBN classifier, is tested with synthetic datasets and the real domain of fish recruitment forecasting. The rate of correctly forecasting three fish species (anchovy, sardine and hake) simultaneously is doubled (from 17.3% to 29.5%) using the multi-dimensional approach in comparison to mono-species models. The probability assessments also improve markedly, reducing the average error (estimated by means of the Brier score) from 0.35 to 0.27. Finally, these results also surpass forecasting the species in pairs. © 2012 Elsevier Ltd.
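
The Brier score used to quantify the probability-assessment error is just the mean squared difference between forecast probabilities and binary outcomes; a minimal sketch with made-up forecasts:

```python
def brier_score(forecast_probs, outcomes):
    """Mean squared error of probabilistic forecasts against 0/1 outcomes;
    lower is better (0 is perfect, 0.25 matches always saying 50%)."""
    return sum((p - o) ** 2 for p, o in zip(forecast_probs, outcomes)) / len(outcomes)

uninformative = brier_score([0.5, 0.5, 0.5, 0.5], [1, 0, 1, 1])
sharper       = brier_score([0.9, 0.2, 0.8, 0.7], [1, 0, 1, 1])
```

A drop such as the reported 0.35 to 0.27 means the forecast probabilities moved measurably closer to the observed recruitment outcomes.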

  6. Test set of gaseous analytes at Hanford tank farms

    International Nuclear Information System (INIS)

    1997-01-01

    DOE has stored toxic and radioactive waste materials in large underground tanks. When the vapors in the tank headspaces vent to the open atmosphere a potentially dangerous situation can occur for personnel in the area. An open-path atmospheric pollution monitor is being developed to monitor the open air space above these tanks. In developing this infrared spectra monitor as a safety alert instrument, it is important to know what hazardous gases, called the Analytes of Concern, are most likely to be found in dangerous concentrations. The monitor must consider other gases which could interfere with measurements of the Analytes of Concern. The total list of gases called the Test Set Analytes form the basis for testing the pollution monitor. Prior measurements in 54 tank headspaces have detected 102 toxic air pollutants (TAPs) and over 1000 other analytes. The hazardous Analytes are ranked herein by a Hazardous Atmosphere Rating which combines their measured concentration, their density relative to air, and the concentration at which they become dangerous. The top 20 toxic air pollutants, as ranked by the Hazardous Atmosphere Rating, and the top 20 other analytes, in terms of measured concentrations, are analyzed for possible inclusion in the Test Set Analytes. Of these 40 gases, 20 are selected. To these 20 gases are added the 6 omnipresent atmospheric gases with the highest concentrations, since their spectra could interfere with measurements of the other spectra. The 26 Test Set Analytes are divided into a Primary Set and a Secondary Set. The Primary Set, gases which must be detectable by the monitor, includes the 6 atmospheric gases and the 6 hazardous gases which have been measured at dangerous concentrations. The Secondary Set gases need not be monitored at this time. The infrared spectra indicates that the pollution monitor will detect all 26 Test Set Analytes by thermal emission and will detect 15 Test Set Analytes by laser absorption

  7. ASK Procedure for Instrumented Pre-cracked Charpy-Type Tests

    International Nuclear Information System (INIS)

    Varga, T.; Njo, D.H.; Prantl, G.

    1981-01-01

    The essential technical content of the ASK procedure originated from development work in Switzerland since 1963, and practical experiences gained since 1972. The remainder of the content and the format of the procedure are based on the ASTM E 24.03.03. (Tentative Draft Copy) 'Proposed Method for Pre-cracked Charpy Impact and Slow-Bend Testing of Metallic Materials' by C. E. Harbower, 1973. Two different velocities, 5 m/s and 0.1 m/s were used with a Schnadt-type machine of rigid construction. The stiffness of the machine proved to be very suitable for instrumented testing. The instrumented Schnadt-Type machine was equipped with strain gauges both on the top of the pendulum and on the chisel. A static force calibration was followed by energy calibration, comparing potential energy losses with the area under the force-deflection curve. Deflection was measured using a high frequency eddy current method on the pendulum, and for slow testing by means of an inductive gauge on the chisel. Charpy-Type specimens of 1.0 mm max notch depth and 0.12 mm max notch radius were pre-cracked using a resonant fatigue testing machine, or an eccentric drive machine. Crack propagation rate da/dN was measured using 'Russenberger' measuring gauges. In addition a new technique for the detection of dynamic crack initiation, developed at the Institute of Research and Technology (TVFA) in Vienna is discussed and some results presented

  8. CHARACTERIZATIONS OF FUZZY SOFT PRE SEPARATION AXIOMS

    OpenAIRE

    El-Latif, Alaa Mohamed Abd

    2015-01-01

    The notions of fuzzy pre open soft sets and fuzzy pre closed soft sets were introduced by Abd El-latif et al. [2]. In this paper, we continue the study on fuzzy soft topological spaces and investigate the properties of fuzzy pre open soft sets, fuzzy pre closed soft sets and study various properties and notions related to these structures. In particular, we study the relationship between fuzzy pre soft interior and fuzzy pre soft closure. Moreover, we study the properties of fuzzy soft pre regulars...

  9. Pre-test analyses for the NESC1 spinning cylinder experiment

    International Nuclear Information System (INIS)

    Fokkens, J.H.

    1995-10-01

    The spinning cylinder experiment organised by the Network for the Evaluation of Steel Components (NESC) is designed to investigate the cleavage initiation behaviour of both surface breaking and subclad defects in simulated end of life RPV material, exposed to a pressurised thermal shock transient. Pre-test structural integrity assessments are performed by the NESC Structural Analysis Task Group (TG3). The results of these structural integrity assessments are used to determine the design of the experiment and especially the sizes of the introduced defects. In this report the results of the pre-test analyses performed by the group Applied Mechanics at ECN - Nuclear Energy are described. Elastic as well as elasto-plastic structural analyses are performed for a surface breaking and a subclad defect in a forged cylinder with a 4 mm cladding. The semi elliptical defects have a depth of 40 mm and an aspect ratio of 1:3. (orig.)

  10. An intelligent system based on fuzzy probabilities for medical diagnosis – a study in aphasia diagnosis

    Directory of Open Access Journals (Sweden)

    Majid Moshtagh Khorasani

    2009-04-01

    Full Text Available

    • BACKGROUND: Aphasia diagnosis is particularly challenging due to the linguistic uncertainty and vagueness, inconsistencies in the definition of aphasic syndromes, large number of measurements with imprecision, natural diversity and subjectivity in test objects as well as in opinions of experts who diagnose the disease.
    • METHODS: Fuzzy probability is proposed here as the basic framework for handling the uncertainties in medical diagnosis and particularly aphasia diagnosis. To efficiently construct this fuzzy probabilistic mapping, statistical analysis is performed that constructs input membership functions as well as determines an effective set of input features.
    • RESULTS: Considering the high sensitivity of performance measures to different distribution of testing/training sets, a statistical t-test of significance is applied to compare fuzzy approach results with NN results as well as author’s earlier work using fuzzy logic. The proposed fuzzy probability estimator approach clearly provides better diagnosis for both classes of data sets. Specifically, for the first and second type of fuzzy probability classifiers, i.e. spontaneous speech and comprehensive model, P-values are 2.24E-08 and 0.0059, respectively, strongly rejecting the null hypothesis.
    • CONCLUSIONS: The technique is applied and compared on both comprehensive and spontaneous speech test data for diagnosis of four Aphasia types: Anomic, Broca, Global and Wernicke. Statistical analysis confirms that the proposed approach can significantly improve accuracy using fewer Aphasia features.
    • KEYWORDS: Aphasia, fuzzy probability, fuzzy logic, medical diagnosis, fuzzy rules.

  11. PreCam

    Energy Technology Data Exchange (ETDEWEB)

    Allam, Sahar S. [Fermilab; Tucker, Douglas L. [Fermilab

    2015-01-01

    The Dark Energy Survey (DES) will be taking the next step in probing the properties of Dark Energy and in understanding the physics of cosmic acceleration. A step towards the photometric calibration of DES is to have a quick, bright survey in the DES footprint (PreCam), using a pre-production set of the Dark Energy Camera (DECam) CCDs and a set of 100 mm×100 mm DES filters. The objective of the PreCam Survey is to create a network of calibrated DES grizY standard stars that will be used for DES nightly calibrations and to improve the DES global relative calibrations. Here, we describe the first year of PreCam observation, results, and photometric calibrations.

  12. Automotive RF immunity test set-up analysis

    NARCIS (Netherlands)

    Coenen, M.J.; Pues, H.; Bousquet, T.; Gillon, R.; Gielen, G.; Baric, A.

    2011-01-01

    Though the automotive RF emission and RF immunity requirements are highly justifiable, applying those requirements in a non-intended manner leads to false conclusions and unnecessary redesigns of the electronics involved. When the test results become too dependent upon the test set-up

  13. Numerical modelling as a cost-reduction tool for probability of detection of bolt hole eddy current testing

    Science.gov (United States)

    Mandache, C.; Khan, M.; Fahr, A.; Yanishevsky, M.

    2011-03-01

    Probability of detection (PoD) studies are broadly used to determine the reliability of specific nondestructive inspection procedures, as well as to provide data for damage tolerance life estimations and calculation of inspection intervals for critical components. They require inspections on a large set of samples, a fact that makes these statistical assessments time- and cost-consuming. Physics-based numerical simulations of nondestructive testing inspections could be used as a cost-effective alternative to empirical investigations. They realistically predict the inspection outputs as functions of the input characteristics related to the test piece, transducer and instrument settings, which are subsequently used to partially substitute and/or complement inspection data in PoD analysis. This work focuses on the numerical modelling aspects of eddy current testing for the bolt hole inspections of wing box structures typical of the Lockheed Martin C-130 Hercules and P-3 Orion aircraft, found in the air force inventory of many countries. Boundary element-based numerical modelling software was employed to predict the eddy current signal responses when varying inspection parameters related to probe characteristics, crack geometry and test piece properties. Two demonstrator exercises were used for eddy current signal prediction when lowering the driver probe frequency and changing the material's electrical conductivity, followed by subsequent discussions and examination of the implications on using simulated data in the PoD analysis. Despite some simplifying assumptions, the modelled eddy current signals were found to provide similar results to the actual inspections. It is concluded that physics-based numerical simulations have the potential to partially substitute or complement inspection data required for PoD studies, reducing the cost, time, effort and resources necessary for a full empirical PoD assessment.
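
PoD results of this kind are commonly summarised with a log-normal PoD curve, from which the 90%-detectable flaw size (a90) is read off; a minimal sketch using only the standard library (the µ and σ values are hypothetical, not fitted to the bolt hole data):

```python
import math
from statistics import NormalDist

def pod(crack_length, mu, sigma):
    """Log-normal PoD model: PoD(a) = Phi((ln a - mu) / sigma)."""
    return NormalDist().cdf((math.log(crack_length) - mu) / sigma)

def a90(mu, sigma):
    """Crack length detected with 90% probability under the same model."""
    return math.exp(mu + NormalDist().inv_cdf(0.90) * sigma)

length_90 = a90(mu=0.0, sigma=0.5)   # hypothetical parameters
```

Simulated eddy current signals can supply additional (signal, crack size) points for fitting µ and σ, which is exactly how modelled data can substitute for part of the empirical inspection campaign.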

  14. Ideas for Testing of Planetary Gear Sets of Automotive Transmissions

    Directory of Open Access Journals (Sweden)

    Achtenová Gabriela

    2017-06-01

    Full Text Available The article describes the concept of modular stand, where is possible to provide tests of gear pairs with fixed axes from mechanical automotive gearboxes, as well as tests of separate planetary sets from automatic gearboxes. Special attention in the article will be paid to the variant dedicated for testing of planetary gear sets. This variant is particularly interesting because: 1 it is rarely described in the literature, and 2 this topology allows big simplification with respect to testing of standard gearwheels. In the planetary closed-loop stand it is possible to directly link two identical planetary sets. Without any bracing flange or other connecting clutches, shafts or gear sets, just two planetary sets face-to-face will be assembled and connected to the electric motor.

  15. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
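
The central claim — that parameter error inflates the realised failure frequency above the nominal target — can be checked with a small Monte Carlo illustration (this is a simulation sketch, not the paper's exact analytical calculation):

```python
import random
import statistics

def realised_failure_rate(n_data, target_p=0.01, trials=2000, seed=1):
    """Set a threshold at the estimated (1 - target_p) quantile of a normal
    distribution fitted to n_data points, then check how often a fresh draw
    exceeds it. Estimation error in the mean and standard deviation pushes
    the realised failure frequency above the nominal target."""
    rng = random.Random(seed)
    z = statistics.NormalDist().inv_cdf(1.0 - target_p)
    failures = 0
    for _ in range(trials):
        sample = [rng.gauss(0.0, 1.0) for _ in range(n_data)]
        mu, sd = statistics.fmean(sample), statistics.stdev(sample)
        if rng.gauss(0.0, 1.0) > mu + z * sd:   # plug-in threshold
            failures += 1
    return failures / trials

rate_small_sample = realised_failure_rate(n_data=10)   # noticeably above 1%
```

With only ten data points the realised rate runs well above the 1% nominal target; approach (1) in the abstract corresponds to tightening `target_p` as a function of the sample size to compensate.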

  16. Effects of Different Water and Super Plasticizer Amount, Pre-Setting and Curing Regimes on the Behavior of Reactive Powder Concrete

    Directory of Open Access Journals (Sweden)

    M. A. Dashti Rahmatabadi

    2014-12-01

    Full Text Available Reactive Powder Concrete (RPC is an ultra high performance concrete which has superior mechanical and physical properties. The RPC is composed of cement and very fine powders such as crushed quartz (100–600 μm and silica fume with very low water/binder ratio (W/B (less than 0.20 and Super Plasticizer (SP. The RPC has a very high compressive and tensile strength with better durability properties than current high performance concretes. Application of very low water/binder ratio with a high dosage of super plasticizer, different heat curing processes and pre-setting pressure improve mechanical and physical properties of RPC. In this study, the RPC is composed of available materials in Iran. Two different mixing proportions, different water/binder ratios for preparation of samples, different super plasticizer dosages, five different (0, 25, 50, 100 and 150 MPa pre-setting pressure and 7 different curing regimes were used in samples preparation and experiments. Results showed that appropriate water/binder ratio and super plasticizer dosage, higher temperature and pre-setting pressure increase the workability, density and compressive strength of compositions.

  17. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
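    The nearest-neighbor version of such a "probability machine" is straightforward to sketch: estimate P(Y=1 | X=x) as the average response among the k nearest training points. The toy data below are synthetic and this is not the paper's R implementation, only an illustration of the idea on a one-dimensional predictor:

    ```python
    import random

    random.seed(0)

    # Synthetic binary data: true P(Y=1 | x) = x, with x uniform on [0, 1].
    n = 5000
    xs = [random.random() for _ in range(n)]
    ys = [1.0 if random.random() < x else 0.0 for x in xs]

    def knn_probability(xs, ys, x0, k=100):
        """Estimate P(Y=1 | X=x0) as the mean label among the k nearest
        training points -- a nearest-neighbor 'probability machine'."""
        nearest = sorted(range(len(xs)), key=lambda i: abs(xs[i] - x0))[:k]
        return sum(ys[i] for i in nearest) / k

    for x0 in (0.2, 0.5, 0.8):
        print(x0, round(knn_probability(xs, ys, x0), 2))
    # The estimates track the true conditional probabilities 0.2, 0.5, 0.8.
    ```

    Consistency here means that as n grows and k grows more slowly, this average converges to the true conditional probability; a regression random forest averages labels over data-adaptive neighborhoods in the same spirit.
    
    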

  18. In-Service and Pre-Service Early Childhood Teachers' Views and Intentions about ICT Use in Early Childhood Settings: A Comparative Study

    Science.gov (United States)

    Gialamas, Vasilis; Nikolopoulou, Kleopatra

    2010-01-01

    This paper regards a comparative study which investigates in-service and pre-service Greek early childhood teachers' views and intentions about integrating and using computers in early childhood settings. Views and intentions were investigated via a questionnaire administered to 240 in-service and 428 pre-service early childhood teachers.…

  19. A Teaching Method on Basic Chemistry for Freshman (II) : Teaching Method with Pre-test and Post-test

    OpenAIRE

    立木, 次郎; 武井, 庚二

    2004-01-01

    This report reviews a teaching method for basic chemistry for freshmen in the first semester. We reviewed this teaching method, which uses a pre-test and post-test, by means of official and private questionnaires. Several hints and thoughts on teaching skills were obtained from this analysis.

  20. Achievement of course outcome in vector calculus pre-test questions ...

    African Journals Online (AJOL)

    No Abstract. Keywords: pre-test; course outcome; Bloom taxonomy; Rasch measurement model; vector calculus.

  1. Integrative set enrichment testing for multiple omics platforms

    Directory of Open Access Journals (Sweden)

    Poisson Laila M

    2011-11-01

    Full Text Available Abstract Background Enrichment testing assesses the overall evidence of differential expression behavior of the elements within a defined set. When we have measured many molecular aspects, e.g. gene expression, metabolites, proteins, it is desirable to assess their differential tendencies jointly across platforms using an integrated set enrichment test. In this work we explore the properties of several methods for performing a combined enrichment test using gene expression and metabolomics as the motivating platforms. Results Using two simulation models we explored the properties of several enrichment methods including two novel methods: the logistic regression 2-degree of freedom Wald test and the 2-dimensional permutation p-value for the sum-of-squared statistics test. In relation to their univariate counterparts we find that the joint tests can improve our ability to detect results that are marginal univariately. We also find that joint tests improve the ranking of associated pathways compared to their univariate counterparts. However, there is a risk of Type I error inflation with some methods and self-contained methods lose specificity when the sets are not representative of underlying association. Conclusions In this work we show that consideration of data from multiple platforms, in conjunction with summarization via a priori pathway information, leads to increased power in detection of genomic associations with phenotypes.

  2. Empirical Investigation of Job Applicants' Reactions to Taking a Pre-Employment Honesty Test.

    Science.gov (United States)

    Jones, John W.; Joy, Dennis

    Employee theft is widespread and difficult to detect. Many companies have attempted to control the employee theft problem through pre-employment screening. The use of paper-and-pencil honesty tests in this process has become increasingly common. These two studies empirically investigated job applicants' (N=450) reactions to taking a pre-employment…

  3. A Test Set for stiff Initial Value Problem Solvers in the open source software R: Package deTestSet

    NARCIS (Netherlands)

    Mazzia, F.; Cash, J.R.; Soetaert, K.

    2012-01-01

    In this paper we present the R package deTestSet that includes challenging test problems written as ordinary differential equations (ODEs), differential algebraic equations (DAEs) of index up to 3, and implicit differential equations (IDEs). In addition, it includes six new codes to solve initial value problems.

  4. Rotavirus vaccine effectiveness in low-income settings: An evaluation of the test-negative design.

    Science.gov (United States)

    Schwartz, Lauren M; Halloran, M Elizabeth; Rowhani-Rahbar, Ali; Neuzil, Kathleen M; Victor, John C

    2017-01-03

    The test-negative design (TND), an epidemiologic method currently used to measure rotavirus vaccine (RV) effectiveness, compares the vaccination status of rotavirus-positive cases and rotavirus-negative controls meeting a pre-defined case definition for acute gastroenteritis. Despite the use of this study design in low-income settings, the TND has not been evaluated to measure rotavirus vaccine effectiveness. This study builds upon prior methods to evaluate the use of the TND for influenza vaccine using a randomized controlled clinical trial database. Test-negative vaccine effectiveness (VE-TND) estimates were derived from three large randomized placebo-controlled trials (RCTs) of monovalent (RV1) and pentavalent (RV5) rotavirus vaccines in sub-Saharan Africa and Asia. Derived VE-TND estimates were compared to the original RCT vaccine efficacy estimates (VE-RCTs). The core assumption of the TND (i.e., rotavirus vaccine has no effect on rotavirus-negative diarrhea) was also assessed. TND vaccine effectiveness estimates were nearly equivalent to original RCT vaccine efficacy estimates. Neither RV had a substantial effect on rotavirus-negative diarrhea. This study supports the TND as an appropriate epidemiologic study design to measure rotavirus vaccine effectiveness in low-income settings. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.
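    The TND estimate compared above is conventionally computed as one minus the odds ratio of vaccination among test-positive cases versus test-negative controls. A minimal sketch; the 2x2 counts below are hypothetical, not data from the trials:

    ```python
    # Hypothetical counts from a test-negative study (illustrative only):
    #                 rotavirus-positive   rotavirus-negative
    # vaccinated             60                  400
    # unvaccinated          140                  400
    cases_vacc, cases_unvacc = 60, 140
    ctrls_vacc, ctrls_unvacc = 400, 400

    # TND effectiveness: 1 minus the odds ratio of vaccination
    # among test-positive cases versus test-negative controls.
    odds_ratio = (cases_vacc / cases_unvacc) / (ctrls_vacc / ctrls_unvacc)
    ve = 1.0 - odds_ratio
    print(f"VE_TND = {ve:.1%}")  # 1 - (60/140)/(400/400) = 4/7, i.e. 57.1%
    ```

    The core TND assumption checked in the study is visible in the denominator: the controls' vaccination odds are an unbiased baseline only if the vaccine has no effect on test-negative illness.
    
    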

  5. The utility of imputed matched sets. Analyzing probabilistically linked databases in a low information setting.

    Science.gov (United States)

    Thomas, A M; Cook, L J; Dean, J M; Olson, L M

    2014-01-01

    To compare results from high probability matched sets versus imputed matched sets across differing levels of linkage information. A series of linkages with varying amounts of available information were performed on two simulated datasets derived from multiyear motor vehicle crash (MVC) and hospital databases, where true matches were known. Distributions of high probability and imputed matched sets were compared against the true match population for occupant age, MVC county, and MVC hour. Regression models were fit to simulated log hospital charges and hospitalization status. High probability and imputed matched sets were not significantly different from the true match population for occupant age, MVC county, and MVC hour in high information settings (p > 0.999). In low information settings, high probability matched sets were significantly different for occupant age and MVC county, whereas imputed matched sets were not (p > 0.493). High information settings saw no significant differences in inference of simulated log hospital charges and hospitalization status between the two methods. High probability and imputed matched sets were significantly different from the outcomes in low information settings; however, imputed matched sets were more robust. The level of information available to a linkage is an important consideration. High probability matched sets are suitable for high to moderate information settings and for situations involving case-specific analysis. Conversely, imputed matched sets are preferable for low information settings when conducting population-based analyses.

  6. Finite test sets development method for test execution of safety critical software

    International Nuclear Information System (INIS)

    El-Bordany Ayman; Yun, Won Young

    2014-01-01

    It reads inputs, computes new states, and updates outputs on each scan cycle. Korea Nuclear Instrumentation and Control System (KNICS) has recently developed a fully digitalized Reactor Protection System (RPS) based on PLD. As a digital system, this RPS is equipped with dedicated software. The reliability of this software is crucial to NPP safety, where a malfunction may cause irreversible consequences and affect the whole system as a Common Cause Failure (CCF). To guarantee the reliability of the whole system, the reliability of this software needs to be quantified. There are three representative methods for software reliability quantification: the Verification and Validation (V and V) quality-based method, the Software Reliability Growth Model (SRGM), and the test-based method. An important concept of the guidance is that the test sets represent 'trajectories' (series of successive values for the input variables of a program that occur during operation of the software over time) in the space of inputs to the software. In practice, the inputs to the software depend on the state of the plant at that time, and these inputs form a new internal state of the software by changing the values of some variables. In other words, the internal state of the software at a specific time depends on the history of past inputs. Here, the internal state of the software that can be changed by past inputs is named the Context of Software (CoS). In a certain CoS, a software failure occurs when a fault is triggered by some inputs. To cover the failure occurrence mechanism of software, preceding research insists that the inputs should take a trajectory form. However, in this approach there are two critical problems. One is the length of the trajectory input: the input trajectory should be long enough to cover the failure mechanism, but how long is long enough is not clear. What is worse, to cover some accident scenarios, one set of inputs should represent dozens of hours of successive values

  7. What probabilities tell about quantum systems, with application to entropy and entanglement

    CERN Document Server

    Myers, John M

    2010-01-01

    The use of parameters to describe an experimenter's control over the devices used in an experiment is familiar in quantum physics, for example in connection with Bell inequalities. Parameters are also interesting in a different but related context, as we noticed when we proved a formal separation in quantum mechanics between linear operators and the probabilities that these operators generate. In comparing an experiment against its description by a density operator and detection operators, one compares tallies of experimental outcomes against the probabilities generated by the operators but not directly against the operators. Recognizing that the accessibility of operators to experimental tests is only indirect, via probabilities, motivates us to ask what probabilities tell us about operators, or, put more precisely, “what combinations of a parameterized density operator and parameterized detection operators generate any given set of parametrized probabilities?”

  8. Finite-element pre-analysis for pressurized thermoshock tests

    International Nuclear Information System (INIS)

    Keinaenen, H.; Talja, H.; Lehtonen, M.; Rintamaa, R.; Bljumin, A.; Timofeev, B.

    1992-05-01

    The behaviour of a model pressure vessel is studied under pressurized thermal shock loading. The tests were performed at the Prometey Institute in St. Petersburg. The calculations were performed at the Technical Research Centre of Finland. The report describes the preliminary finite-element analyses for the fourth, fifth and sixth thermoshock tests with the first model pressure vessel. Seven pressurized thermoshock tests were made with the same model using five different flaw geometries. In the first three tests the flaw was actually a blunt notch. In the two following tests (tests 4 and 5) a sharp pre-crack was produced before the test. In the last two tests (tests 6 and 7) the old crack was used. According to the measurements and post-test ultrasonic examination of the crack front, the sixth test led to significant crack extension. Both temperatures and stresses were calculated using the finite-element method. The calculations were made using the idealized initial flaw geometry and preliminary material data. Both two- and three-dimensional models were used in the calculations. J-integral values were calculated from the elastic-plastic finite-element results. The stress intensity factor values were evaluated on the basis of the calculated J-integrals and compared with the preliminary material fracture toughness data obtained from the Prometey Institute

  9. FUMEX cases 1, 2, and 3 calculated pre-test and post-test results

    Energy Technology Data Exchange (ETDEWEB)

    Stefanova, S; Vitkova, M; Passage, G; Manolova, M; Simeonova, V [Bylgarska Akademiya na Naukite, Sofia (Bulgaria). Inst. za Yadrena Izsledvaniya i Yadrena Energetika; Scheglov, A; Proselkov, V [Russian Research Centre Kurchatov Inst., Moscow (Russian Federation); Kharalampieva, Ts [Kombinat Atomna Energetika, Kozloduj (Bulgaria)

    1994-12-31

    Two versions (modified pre-test and modified post-test) of the PIN-micro code were used to analyse the fuel rod behaviour in three FUMEX experiments. Experience of applying the PIN-micro code, with its simple structure and old conception of steady-state operation, shows significant difficulties in treating complex processes like those in the FUMEX experiments. These difficulties were partially overcome through different model modifications and corrections based on special engineering estimations, and the results obtained as a whole do not seem unreasonable. The calculations were performed by a group from two Bulgarian institutions in collaboration with specialists from the Kurchatov Research Center. 1 tab., 14 figs., 8 refs.

  10. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  11. Evaluation of test-strategies for estimating probability of low prevalence of paratuberculosis in Danish dairy herds

    DEFF Research Database (Denmark)

    Sergeant, E.S.G.; Nielsen, Søren S.; Toft, Nils

    2008-01-01

    The aim of this study was to develop a method to estimate the probability of low within-herd prevalence of paratuberculosis for Danish dairy herds. A stochastic simulation model was developed using the R programming environment. Features of this model included: use of age-specific estimates of test-sensitivity and specificity; use of a distribution of observed values (rather than a fixed, low value) for design prevalence; and estimates of the probability of low prevalence (Pr-Low) based on a specific number of test-positive animals, rather than for a result less than or equal to a specified cut-point number of reactors. Using this model, five herd-testing strategies were evaluated: (1) milk-ELISA on all lactating cows; (2) milk-ELISA on lactating cows <4 years old; (3) milk-ELISA on lactating cows >4 years old; (4) faecal culture on all lactating cows; and (5) milk-ELISA plus faecal culture in series on all lactating cows.
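    The basic quantity such a simulation rests on is the probability of observing a given number of test-positive animals for a herd with known prevalence and test characteristics. A minimal binomial sketch; the herd size, prevalence, sensitivity, and specificity below are illustrative assumptions, not the paper's age-specific estimates:

    ```python
    from math import comb

    def prob_positives(n_tested, prevalence, sens, spec, k):
        """P(exactly k test-positive animals) when each animal is infected
        with probability `prevalence`, and the test flags infected animals
        with probability `sens` and uninfected ones with probability 1 - spec."""
        p_pos = prevalence * sens + (1.0 - prevalence) * (1.0 - spec)
        return comb(n_tested, k) * p_pos**k * (1.0 - p_pos)**(n_tested - k)

    # Illustrative numbers (not from the paper): 100 lactating cows,
    # 5% within-herd prevalence, milk-ELISA sensitivity 0.30, specificity 0.98.
    n, prev, sens, spec = 100, 0.05, 0.30, 0.98
    for k in range(4):
        print(k, round(prob_positives(n, prev, sens, spec, k), 3))
    ```

    Inverting this calculation with Bayes' rule over a distribution of design prevalences, as the abstract describes, yields Pr-Low for a specific observed count of reactors rather than for a cut-point.
    
    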

  12. Behavior of pre-irradiated fuel under a simulated RIA condition. Results of NSRR Test JM-5

    International Nuclear Information System (INIS)

    Fuketa, Toyoshi; Sasajima, Hideo; Mori, Yukihide; Tanzawa, Sadamitsu; Ishijima, Kiyomi; Kobayashi, Shinsho; Kamata, Hiroshi; Homma, Kozo; Sakai, Haruyuki.

    1995-11-01

    This report presents results from the power burst experiment with a pre-irradiated fuel rod, Test JM-5, conducted in the Nuclear Safety Research Reactor (NSRR). The data concerning test method, pre-irradiation, pre-pulse fuel examination, pulse irradiation, transient records and post-pulse fuel examination are described, and interpretations and discussions of the results are presented. Prior to the pulse irradiation in the NSRR, the test fuel rod was irradiated in the Japan Materials Testing Reactor (JMTR) up to a fuel burnup of 25.7 MWd/kgU with an average linear heat rate of 33.4 kW/m. The fuel rod was subjected to the pulse irradiation, resulting in a deposited energy of 223 ± 7 cal/g·fuel (0.93 ± 0.03 kJ/g·fuel) and a peak fuel enthalpy of 167 ± 5 cal/g·fuel (0.70 ± 0.02 kJ/g·fuel) under stagnant water cooling conditions at atmospheric pressure and ambient temperature. Test fuel rod behavior was assessed from pre- and post-pulse fuel examinations and transient records during the pulse. Test JM-5 resulted in cladding failure. More than twenty small cracks were found in the post-test cladding, and most of the defects were located in a pre-existing locally hydrided region. The result indicates an occurrence of fuel failure by PCMI (pellet/cladding mechanical interaction) in combination with the decreased integrity of the hydrided cladding. (author)

  13. Probability-Based Method of Generating the Conflict Trajectories for ATC System

    Institute of Scientific and Technical Information of China (English)

    苏志刚; 眭聪聪; 吴仁彪

    2011-01-01

    For testing the short-term conflict alerting capability of an air traffic control (ATC) system, two methods are usually used. The first is to set a higher alert threshold and use real data to test whether the system alerts when the distance between two flights falls below the threshold; however, this method is not reliable. The second method is to simulate flights that will conflict, obtain their trajectories by calculation, and then send these data to the ATC system to observe its reaction. This method is usually too simple to test whether the system can effectively pre-detect a conflict. To solve these problems, a probabilistic approach is used in this paper to simulate aircraft with a given probability of conflicting. First, we derived the conflict probability of turning flights from Prandaini's method of conflict probability estimation for linear flight. Then, using reverse derivation, we obtained the motion parameters of two targets whose conflict probability was preset. Finally, we simulated this pair of targets' tracks and analysed their conflict probability. The simulation results show that the targets' probability of conflict is in line with the previous assumption. The trajectories generated by this algorithm are more realistic, so a more effective assessment of the ATC system's capability of short-term conflict alerting and pre-detection can be provided.

  14. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs

  15. Clinical utility of routine pre-operative axillary ultrasound and fine needle aspiration cytology in patient selection for sentinel lymph node biopsy.

    Science.gov (United States)

    Rattay, T; Muttalib, M; Khalifa, E; Duncan, A; Parker, S J

    2012-04-01

    In patients with operable breast cancer, pre-operative evaluation of the axilla may be of use in the selection of appropriate axillary surgery. Pre-operative axillary ultrasound (US) and fine needle aspiration cytology (FNAC) assessments have become routine practice in many breast units, although the evidence base is still gathering. This study assessed the clinical utility of US+/-FNAC in patient selection for either axillary node clearance (ANC) or sentinel lymph node biopsy (SLNB) in patients undergoing surgery for operable breast cancer. Over a two-year period, 348 patients with a clinically negative axilla underwent axillary US; 67 patients with suspicious nodes on US also underwent FNAC. The sensitivity and specificity of axillary investigations to determine nodal involvement were 56% (confidence interval: 47-64%) and 90% (84-93%) for US alone, and 76% (61-87%) and 100% (65-100%) for FNAC combined with US, respectively. With a positive US, the post-test probability was 78%. A negative US carried a post-test probability of 25%. When FNAC was positive, the post-test probability was unity. A negative FNAC yielded a post-test probability of 52%. All patients with positive FNAC and most patients with suspicious US were listed for axillary node clearance (ANC) after consideration at the multi-disciplinary team (MDT) meeting. With pre-operative axillary US+/-FNAC, 20% of patients were saved a potential second axillary procedure, facilitating a reduction in the overall re-operation rate to 12%. In this study, a positive pre-operative US+/-FNAC directs patients towards ANC. When the result is negative, other clinico-pathological factors need to be taken into account in the selection of the appropriate axillary procedure. Copyright © 2011 Elsevier Ltd. All rights reserved.
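    Post-test probabilities like those quoted above follow from Bayes' rule applied to the reported sensitivity and specificity. A minimal sketch using the study's US figures (sensitivity 56%, specificity 90%); the 40% pre-test probability of nodal involvement is an assumed illustrative value, not restated from the paper:

    ```python
    def post_test_prob(pre, sens, spec, positive):
        """Update a pre-test probability with a test result via Bayes' rule."""
        if positive:
            return (sens * pre) / (sens * pre + (1 - spec) * (1 - pre))
        return ((1 - sens) * pre) / ((1 - sens) * pre + spec * (1 - pre))

    # US sensitivity/specificity from the study; prevalence is assumed.
    sens, spec, pre = 0.56, 0.90, 0.40
    print(f"after positive US: {post_test_prob(pre, sens, spec, True):.0%}")
    print(f"after negative US: {post_test_prob(pre, sens, spec, False):.0%}")
    # With these assumptions the values land close to the 78% and 25%
    # reported in the study.
    ```

    With a specificity of 100%, as reported for FNAC combined with US, the same formula gives a post-test probability of exactly 1 for a positive result, consistent with such patients proceeding directly to ANC.
    
    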

  16. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories, whether quantum or classical, is considered. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of an observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  17. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  18. The importance of risk models for management of pulmonary nodules; Die Bedeutung von Risikomodellen fuer das Management pulmonaler Rundherde

    Energy Technology Data Exchange (ETDEWEB)

    Prosch, H.; Baltzer, P. [Medizinische Universitaet Wien, Allgemeines Krankenhaus, Universitaetsklinik fuer Radiologie und Nuklearmedizin, Wien (Austria)

    2014-05-15

    Pulmonary nodules are a frequent finding in computed tomography (CT) investigations. Further diagnostic work-up of detected nodules mainly depends on the so-called pre-test probability, i.e. the probability that the nodule is malignant or benign. The pre-test probability can be calculated by combining all relevant information, such as the age and the sex of the patient, the smoking history, and history of previous malignancies, as well as the size and CT morphology of the nodule. If additional investigations are performed to further investigate the nodules, all results must be interpreted taking into account the pre-test probability and the test performance of the investigation in order to estimate the post-test probability. In cases with a low pre-test probability, a negative result from an exact test can exclude malignancies but a positive test cannot prove malignancy in such a setting. In cases with a high pre-test probability, a positive test result can be considered as proof of malignancy but a negative test result does not exclude malignancy. (orig.)
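    The dependence of the post-test probability on the pre-test probability described in this abstract can be made concrete with the odds form of Bayes' rule. A minimal sketch; the sensitivity of 0.90 and specificity of 0.95 are assumed values for a hypothetical accurate test, not figures from the article:

    ```python
    def post_test(pre, lr):
        """Convert a pre-test probability to a post-test probability
        via a likelihood ratio, using the odds form of Bayes' rule."""
        odds = pre / (1 - pre) * lr
        return odds / (1 + odds)

    # Hypothetical accurate test: sensitivity 0.90, specificity 0.95.
    sens, spec = 0.90, 0.95
    lr_pos = sens / (1 - spec)      # likelihood ratio for a positive result
    lr_neg = (1 - sens) / spec      # likelihood ratio for a negative result

    for pre in (0.05, 0.50, 0.90):
        print(f"pre-test {pre:.0%}: "
              f"positive -> {post_test(pre, lr_pos):.1%}, "
              f"negative -> {post_test(pre, lr_neg):.1%}")
    ```

    At a 5% pre-test probability, a negative result pushes the post-test probability below 1% (effectively excluding malignancy) while a positive result only reaches about 50%; at a 90% pre-test probability the situation reverses, exactly as the abstract argues.
    
    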

  19. Geochronology, petrogenesis and tectonic settings of pre- and syn-ore granites from the W-Mo deposits (East Kounrad, Zhanet and Akshatau), Central Kazakhstan

    Science.gov (United States)

    Li, GuangMing; Cao, MingJian; Qin, KeZhang; Evans, Noreen J.; Hollings, Pete; Seitmuratova, Eleonora Yusupovha

    2016-05-01

    There is significant debate regarding the mineralization ages of the East Kounrad, Zhanet and Akshatau W-Mo deposits of Central Kazakhstan, and the petrogenesis and tectono-magmatic evolution of the granites associated with these deposits. To address these issues, we present molybdenite Re-Os dating, zircon U-Pb dating, whole rock geochemistry as well as Sr-Nd-Pb and zircon O-Hf isotopic analyses on the pre-mineralization and ore-forming granites. U-Pb dating of zircons from pre-mineralization granitic rocks yield Late Carboniferous ages of 320-309 Ma, whereas ore-forming granites have Early Permian ages of 298-285 Ma. Molybdenite Re-Os isotopic data indicate a mineralization age of 296 Ma at East Kounrad, 294 Ma at Akshatau and 285 Ma at Zhanet. The pre-ore and ore-forming granites are high-K calc-alkaline, metaluminous to slightly peraluminous I-type granites. The pre-mineralization granites are relatively unfractionated, whereas the ore-forming granites are highly fractionated. The fractionating mineral phases are probably K-feldspar, apatite, Ti-bearing phases and minor plagioclase. The pre-mineralization and ore-forming rocks are characterized by similar Sr-Nd-Pb-Hf-O isotopic compositions ((87Sr/86Sr)i = 0.70308-0.70501, εNd (t) = - 0.5 to + 2.8, 207Pb/204Pb = 15.60-15.82, zircon εHf (t) = + 1.2 to + 15.6 and δ18O = + 4.6 to + 10.3‰), whole rock TDMC (Nd) (840-1120 Ma) and zircon TDMC (Hf) (320-1240 Ma). The isotopic characteristics are consistent with a hybrid magma source caused by 10-30% assimilation of ancient crust by juvenile lower crust. The geochronology and geochemistry of these granites show that the Late Carboniferous pre-mineralization granitic rocks formed during subduction, whereas the Early Permian ore-forming, highly fractionated granite probably underwent significant fractionation with a restite assemblage of K-feldspar, apatite, Ti-bearing phases and minor plagioclase and developed during collision between the Yili and Kazakhstan

  20. Anesthesiologists' and surgeons' perceptions about routine pre-operative testing in low-risk patients: application of the Theoretical Domains Framework (TDF) to identify factors that influence physicians' decisions to order pre-operative tests.

    Science.gov (United States)

    Patey, Andrea M; Islam, Rafat; Francis, Jill J; Bryson, Gregory L; Grimshaw, Jeremy M

    2012-06-09

    Routine pre-operative tests for anesthesia management are often ordered by both anesthesiologists and surgeons for healthy patients undergoing low-risk surgery. The Theoretical Domains Framework (TDF) was developed to investigate determinants of behaviour and identify potential behaviour change interventions. In this study, the TDF is used to explore anaesthesiologists' and surgeons' perceptions of ordering routine tests for healthy patients undergoing low-risk surgery. Sixteen clinicians (eleven anesthesiologists and five surgeons) throughout Ontario were recruited. An interview guide based on the TDF was developed to identify beliefs about pre-operative testing practices. Content analysis of physicians' statements into the relevant theoretical domains was performed. Specific beliefs were identified by grouping similar utterances of the interview participants. Relevant domains were identified by noting the frequencies of the beliefs reported, presence of conflicting beliefs, and perceived influence on the performance of the behaviour under investigation. Seven of the twelve domains were identified as likely relevant to changing clinicians' behaviour about pre-operative test ordering for anesthesia management. Key beliefs were identified within these domains including: conflicting comments about who was responsible for the test-ordering (Social/professional role and identity); inability to cancel tests ordered by fellow physicians (Beliefs about capabilities and social influences); and the problem with tests being completed before the anesthesiologists see the patient (Beliefs about capabilities and Environmental context and resources). Often, tests were ordered by an anesthesiologist based on who may be the attending anesthesiologist on the day of surgery while surgeons ordered tests they thought anesthesiologists may need (Social influences). There were also conflicting comments about the potential consequences associated with reducing testing, from negative

  1. Appraising Pre-service EFL Teachers' Assessment in Language Testing Course Using Revised Bloom's Taxonomy

    Directory of Open Access Journals (Sweden)

    Elham Mohammadi

    2015-07-01

    Full Text Available Teachers need to be conceived as “change agents” and not as mere transmitters of knowledge and culture. In developing countries like Iran, one of the most significant concerns in the field of teacher education is the efficiency of pre-service programs. To this aim, the current descriptive-evaluative study set out to describe the state of pre-service teachers' assessment in the field of language testing by (a) examining the exam questions to find out whether they are aligned with curriculum objectives and syllabus (content validity), (b) exploring whether they address higher-order cognitive processes, and (c) finding which combinations of cognitive process levels and knowledge types in the Revised Bloom's Taxonomy are prevalent in the questions. The results exhibited an unbalanced coverage of content in the exams. The questions were also found to be inadequate in terms of measuring complex cognitive skills (Analyze and Evaluate): the Remember and Understand domains take up 91.6% of all questions, and no item was found for Create. Three combinations of cognitive process level and knowledge type were dominant in the data set: (1) Remember Factual Knowledge, (2) Understand Conceptual Knowledge, and (3) Apply Procedural Knowledge. These associations confirm Anderson and Krathwohl's (2001) proposition.

  2. Do Fitness Apps Need Text Reminders? An Experiment Testing Goal-Setting Text Message Reminders to Promote Self-Monitoring.

    Science.gov (United States)

    Liu, Shuang; Willoughby, Jessica F

    2018-01-01

    Fitness tracking apps have the potential to change unhealthy lifestyles, but users' lack of compliance is an issue. The current intervention examined the effectiveness of using goal-setting theory-based text message reminders to promote tracking activities on fitness apps. We conducted a 2-week experiment with pre- and post-tests with young adults (n = 50). Participants were randomly assigned to two groups-a goal-setting text message reminder group and a generic text message reminder group. Participants were asked to use a fitness tracking app to log physical activity and diet for the duration of the study. Participants who received goal-setting reminders logged significantly more physical activities than those who only received generic reminders. Further, participants who received goal-setting reminders liked the messages and showed significantly increased self-efficacy, awareness of personal goals, motivation, and intention to use the app. The study shows that incorporating goal-setting theory-based text message reminders can be useful to boost user compliance with self-monitoring fitness apps by reinforcing users' personal goals and enhancing cognitive factors associated with health behavior change.

  3. BWR spent fuel storage cask performance test. Volume 2. Pre- and post-test decay heat, heat transfer, and shielding analyses

    International Nuclear Information System (INIS)

    Wiles, L.E.; Lombardo, N.J.; Heeb, C.M.; Jenquin, U.P.; Michener, T.E.; Wheeler, C.L.; Creer, J.M.; McCann, R.A.

    1986-06-01

    This report describes the decay heat, heat transfer, and shielding analyses conducted in support of performance testing of a Ridihalgh, Eggers and Associates REA 2033 boiling water reactor (BWR) spent fuel storage cask. The cask testing program was conducted for the US Department of Energy (DOE) Commercial Spent Fuel Management Program by the Pacific Northwest Laboratory (PNL) and by General Electric at the latter's Morris Operation (GE-MO), as reported in Volume I. The analysis effort consisted of performing pretest calculations to (1) select spent fuel for the test; (2) symmetrically load the spent fuel assemblies in the cask to ensure lateral symmetry of decay heat generation rates; (3) optimally locate temperature and dose rate instrumentation in the cask and spent fuel assemblies; and (4) evaluate the ORIGEN2 (decay heat), HYDRA and COBRA-SFS (heat transfer), and QAD and DOT (shielding) computer codes. The emphasis of this second volume is on the comparison of code predictions to experimental test data in support of the code evaluation process. Code evaluations were accomplished by comparing pretest (actually pre-look, since some predictions were not completed until testing was in progress) predictions with the experimental cask testing data reported in Volume I. No attempt was made in this study to compare the two heat transfer codes, because the results of other evaluations have not been completed and a comparison based on one data set could lead to erroneous conclusions.

  4. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  5. Experimental data report for Test TS-2 reactivity initiated accident test in NSRR with pre-irradiated BWR fuel rod

    International Nuclear Information System (INIS)

    Nakamura, Takehiko; Yoshinaga, Makio; Sobajima, Makoto; Fujishiro, Toshio; Kobayashi, Shinsho; Yamahara, Takeshi; Sukegawa, Tomohide; Kikuchi, Teruo

    1993-02-01

    This report presents experimental data for Test TS-2, the second test in a series of Reactivity Initiated Accident (RIA) condition tests using pre-irradiated BWR fuel rods, performed at the Nuclear Safety Research Reactor (NSRR) in February 1990. The test fuel rod used in Test TS-2 was a short-sized BWR (7x7) type rod fabricated from a commercial rod irradiated at the Tsuruga Unit 1 power reactor. The fuel had an initial enrichment of 2.79% and a burnup of 21.3 GWd/tU (bundle average). A pulse irradiation of the test fuel rod was performed under a cooling condition of stagnant water at atmospheric pressure and ambient temperature, which simulated a BWR cold start-up RIA event. The energy deposition of the fuel rod in this test was evaluated to be 72±5 cal/g·fuel (66±5 cal/g·fuel in peak fuel enthalpy), and no fuel failure was observed. Test conditions, test procedures, the transient behavior of the test rod during the pulse irradiation, and the results of pre- and post-pulse irradiation examinations are described in this report. (author)

  6. Urine specimen validity test for drug abuse testing in workplace and court settings.

    Science.gov (United States)

    Lin, Shin-Yu; Lee, Hei-Hwa; Lee, Jong-Feng; Chen, Bai-Hsiun

    2018-01-01

    In recent decades, urine drug testing in the workplace has become common in many countries in the world. There have been several studies concerning the use of the urine specimen validity test (SVT) for drug abuse testing administered in the workplace. However, very little data exist concerning the urine SVT on drug abuse tests from court specimens, including dilute, substituted, adulterated, and invalid tests. We investigated 21,696 urine drug test samples submitted for SVT from workplace and court settings in southern Taiwan over 5 years. All immunoassay screen-positive urine specimen drug tests were confirmed by gas chromatography/mass spectrometry. We found that the mean 5-year prevalences of tampering (dilute, substituted, or invalid tests) in urine specimens from the workplace and court settings were 1.09% and 3.81%, respectively. The mean 5-year percentages of dilute, substituted, and invalid urine specimens from the workplace were 89.2%, 6.8%, and 4.1%, respectively. The mean 5-year percentages of dilute, substituted, and invalid urine specimens from the court were 94.8%, 1.4%, and 3.8%, respectively. No adulterated cases were found among the workplace or court samples. The most common drug identified in the workplace specimens was amphetamine, followed by opiates. The most common drug identified in the court specimens was ketamine, followed by amphetamine. We suggest that all urine specimens taken for drug testing from both workplace and court settings be tested for validity. Copyright © 2017. Published by Elsevier B.V.

  7. Sibship effects on dispersal behaviour in a pre-industrial human population.

    Science.gov (United States)

    Nitsch, A; Lummaa, V; Faurie, C

    2016-10-01

    Understanding dispersal behaviour and its determinants is critical for studies on life-history maximizing strategies. Although many studies have investigated the causes of dispersal, few have focused on the importance of sibship, despite the prediction that sibling interactions lead to intrafamilial differences in dispersal patterns. Using a large demographic data set from pre-industrial Finland (n = 9000), we tested whether the sex-specific probability of dispersal depended on the presence of same-sex or opposite-sex elder siblings, who can both compete and cooperate in the family. Overall, following our predictions, the presence of same-sex elder siblings increased the probability of dispersal from the natal population for both sexes, whereas the number of opposite-sex siblings had less influence. Among males, dispersal was strongly linked to access to land resources. Female dispersal was mainly associated with competition over availability of mates, but was likely mediated by competition over access to wealthy mates rather than mate availability per se. Besides ecological constraints, sibling interactions are strongly linked with dispersal decisions and need to be better considered in studies on the evolution of family dynamics and fitness maximizing strategies in humans and other species. © 2016 European Society For Evolutionary Biology.

  8. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. These NDE methods are required to detect real flaws such as cracks and crack-like flaws. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the Point Estimate Method, which NASA uses for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size (within some tolerance) is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls, while keeping the flaw sizes in the set as small as possible.
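The binomial arithmetic behind such a point-estimate demonstration can be sketched in a few lines. This is an illustration of the generic 29-flaw logic described above, not NASA's actual qualification procedure; the function name and the example POD values are our own:

```python
from math import comb

def prob_pass_demo(true_pod: float, n: int = 29, max_misses: int = 0) -> float:
    """Probability that a POD demonstration passes: at most `max_misses`
    of the n flaws go undetected, each flaw being detected independently
    with probability `true_pod` (binomial model)."""
    return sum(
        comb(n, k) * (1 - true_pod) ** k * true_pod ** (n - k)
        for k in range(max_misses + 1)
    )

# A 29/29 demonstration supports the usual "90/95" POD statement:
# a procedure whose true POD is only 0.90 passes with probability
# 0.90**29, i.e. under 5%.
print(round(prob_pass_demo(0.90), 4))  # 0.0471
print(round(prob_pass_demo(0.98), 4))  # 0.5566
```

Raising `max_misses` gives the pass probability for more lenient acceptance criteria, which is the kind of PPD/POF trade-off the optimization in the abstract is about.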

  9. Pre-screening Discussions and Prostate-Specific Antigen Testing for Prostate Cancer Screening.

    Science.gov (United States)

    Li, Jun; Zhao, Guixiang; Hall, Ingrid J

    2015-08-01

    For many men, the net benefit of prostate cancer screening with prostate-specific antigen (PSA) tests may be small. Many major medical organizations have issued recommendations for prostate cancer screening, stressing the need for shared decision making before ordering a test. The purpose of this study is to better understand associations between discussions about the benefits and harms of PSA testing and uptake of the test among men aged ≥40 years. Associations between pre-screening discussions and PSA testing were examined using self-reported data from the 2012 Behavioral Risk Factor Surveillance System. Unadjusted prevalence of PSA testing was estimated and AORs were calculated using logistic regression in 2014. The multivariate analysis showed that men who had ever discussed the advantages of PSA testing only, or both the advantages and disadvantages, were more likely, respectively, to report having had a test within the past year than men who had had no discussions. Men who had discussed both the advantages and disadvantages of PSA testing with their healthcare providers were more likely (AOR=2.75, 95% CI=2.00, 3.79) to report getting tested than men who had no discussions. Discussions of the benefits or harms of PSA testing are positively associated with increased uptake of the test. Given the conflicting recommendations for prostate cancer screening and the increasing importance of shared decision making, this study points to the need for understanding how pre-screening discussions are being conducted in clinical practice and the role played by patients' values and preferences in decisions about PSA testing. Published by Elsevier Inc.

  10. Data for a pre-performance test of self-developed electronic tongue sensors

    Directory of Open Access Journals (Sweden)

    Laura Isabell Immohr

    2016-12-01

    Full Text Available This article presents data that can be applied for a pre-performance test of self-developed electronic tongue sensors. The data relate to the research article “Impact of Sodium Lauryl Sulfate in oral liquids on E-Tongue Measurements” (http://dx.doi.org/10.1016/j.ijpharm.2016.10.045; L.I. Immohr, R. Turner, M. Pein-Hackelbusch, 2016 [1]). Sensor responses were obtained from 10 subsequent measurements and four different concentrations of quinine hydrochloride by electronic tongue (TS-5000Z, Insent Inc., Atsugi-Shi, Japan) measurements. From the pre-performance testing data, which were calculated from the fluctuation range of the sensor responses around the median, stability criteria and the required number of preconditioning cycles were defined.

  11. Investigating Pre-Service Early Childhood Teachers' Views and Intentions about Integrating and Using Computers in Early Childhood Settings: Compilation of an Instrument

    Science.gov (United States)

    Nikolopoulou, Kleopatra; Gialamas, Vasilis

    2009-01-01

    This paper discusses the compilation of an instrument in order to investigate pre-service early childhood teachers' views and intentions about integrating and using computers in early childhood settings. For the purpose of this study a questionnaire was compiled and administered to 258 pre-service early childhood teachers (PECTs), in Greece. A…

  12. Allergic contact dermatitis from ophthalmic products: can pre-treatment with sodium lauryl sulfate increase patch test sensitivity?

    Science.gov (United States)

    Corazza, Monica; Virgili, Annarosa

    2005-05-01

    In patients suspected of allergic contact dermatitis from topical ophthalmic medicaments, patch tests performed with the patients' own products are often negative. The irritant anionic surfactant sodium lauryl sulfate (SLS) may alter the stratum corneum and increase antigen penetration. In 15 selected patients, the skin was pre-treated with SLS 0.5% for 24 h at the sites of patch tests with the patients' own products. In patients whose own products had previously tested negative with conventional patch tests, SLS pre-treatment yielded 6 new relevant positive reactions and induced a stronger positive reaction in 1 patient. SLS pre-treatment could therefore be proposed as a promising alternative method that may increase the sensitivity of patch tests with patients' own products.

  13. Use of a national continuing medical education meeting to provide simulation-based training in temporary hemodialysis catheter insertion skills: a pre-test post-test study.

    Science.gov (United States)

    Clark, Edward G; Paparello, James J; Wayne, Diane B; Edwards, Cedric; Hoar, Stephanie; McQuillan, Rory; Schachter, Michael E; Barsuk, Jeffrey H

    2014-01-01

    Simulation-based-mastery-learning (SBML) is an effective method to train nephrology fellows to competently insert temporary, non-tunneled hemodialysis catheters (NTHCs). Previous studies of SBML for NTHC-insertion have been conducted at a local level. Determine if SBML for NTHC-insertion can be effective when provided at a national continuing medical education (CME) meeting. Describe the correlation of demographic factors, prior experience with NTHC-insertion and procedural self-confidence with simulated performance of the procedure. Pre-test - post-test study. 2014 Canadian Society of Nephrology annual meeting. Nephrology fellows, internal medicine residents and medical students. Participants were surveyed regarding demographics, prior NTHC-insertion experience, procedural self-confidence and attitudes regarding the training they received. NTHC-insertion skills were assessed using a 28-item checklist. Participants underwent a pre-test of their NTHC-insertion skills at the internal jugular site using a realistic patient simulator and ultrasound machine. Participants then had a training session that included a didactic presentation and 2 hours of deliberate practice using the simulator. On the following day, trainees completed a post-test of their NTHC-insertion skills. All participants were required to meet or exceed a minimum passing score (MPS) previously set at 79%. Trainees who did not reach the MPS were required to perform more deliberate practice until the MPS was achieved. Twenty-two individuals participated in SBML training. None met or exceeded the MPS at baseline with a median checklist score of 20 (IQR, 7.25 to 21). Seventeen of 22 participants (77%) completed post-testing and improved their scores to a median of 27 (IQR, 26 to 28; p < 0.001). All met or exceeded the MPS on their first attempt. There were no significant correlations between demographics, prior experience or procedural self-confidence with pre-test performance. Small sample-size and

  14. Staged-Fault Testing of Distance Protection Relay Settings

    Science.gov (United States)

    Havelka, J.; Malarić, R.; Frlan, K.

    2012-01-01

    In order to analyze the operation of the protection system during induced fault testing in the Croatian power system, a simulation using the CAPE software has been performed. The CAPE software (Computer-Aided Protection Engineering) is expert software intended primarily for relay protection engineers, which calculates current and voltage values during faults in the power system, so that relay protection devices can be properly set up. Once the accuracy of the simulation model had been confirmed, a series of simulations were performed in order to obtain the optimal fault location to test the protection system. The simulation results were used to specify the test sequence definitions for the end-to-end relay testing using advanced testing equipment with GPS synchronization for secondary injection in protection schemes based on communication. The objective of the end-to-end testing was to perform field validation of the protection settings, including verification of the circuit breaker operation, telecommunication channel time and the effectiveness of the relay algorithms. Once the end-to-end secondary injection testing had been completed, the induced fault testing was performed with three-end lines loaded and in service. This paper describes and analyses the test procedure, consisting of CAPE simulations, end-to-end test with advanced secondary equipment and staged-fault test of a three-end power line in the Croatian transmission system.

  15. Acceptability of HIV/AIDS testing among pre-marital couples in Iran (2012).

    Science.gov (United States)

    Ayatollahi, Jamshid; Nasab Sarab, Mohammad Ali Bagheri; Sharifi, Mohammad Reza; Shahcheraghi, Seyed Hossein

    2014-07-01

    Human immunodeficiency virus (HIV)/acquired immune deficiency syndrome (AIDS) is a lifestyle-related disease. The disease is transmitted through unprotected sex, contaminated needles, infected blood transfusion, and from mother to child during pregnancy and delivery. Prevention of infection with HIV, mainly through safe sex and needle exchange programmes, is a solution to prevent the spread of the disease. Knowledge of one's HIV status helps to prevent infection and subsequently reduce harm to the next generation. The purpose of this study was to assess the willingness of couples referred to the family regulation pre-marital counselling centre in Yazd to undergo HIV testing before marriage. In this descriptive study, simple random sampling was done among people referred to the Akbari clinic. The participants were 1000 men and 1000 women referred to the premarital counselling centre for pre-marital HIV testing in Yazd in 2012. They were counselled on transmission in situations of pregnancy, delivery, nursing and lactation. The data were analyzed using Statistical Package for the Social Sciences (SPSS) software and the chi-square test. There was a statistically significant difference between the age groups in willingness for HIV testing before marriage. Therefore, offering HIV testing before marriage as a routine test was suggested.

  16. Probabilistic Open Set Recognition

    Science.gov (United States)

    Jain, Lalit Prithviraj

    support vector machines. Building from the success of statistical EVT-based recognition methods such as PI-SVM and W-SVM on the open set problem, we present a new general supervised learning algorithm for multi-class classification and multi-class open set recognition called the Extreme Value Local Basis (EVLB). The design of this algorithm is motivated by the observation that extrema from known negative class distributions are the closest negative points to any positive sample during training, and thus should be used to define the parameters of a probabilistic decision model. In the EVLB, the kernel distribution for each positive training sample is estimated via an EVT distribution fit over the distances to the separating hyperplane between the positive training sample and its closest negative samples, with a subset of the overall positive training data retained to form a probabilistic decision boundary. Using this subset as a frame of reference, the probability of a sample at test time decreases as it moves away from the positive class. Possessing this property, the EVLB is well suited to open set recognition problems where samples from unknown or novel classes are encountered at test time. Our experimental evaluation shows that the EVLB provides a substantial improvement in scalability compared to standard radial basis function kernel machines, as well as PI-SVM and W-SVM, with improved accuracy in many cases. We evaluate our algorithm on open set variations of the standard visual learning benchmarks, as well as with an open subset of classes from Caltech 256 and ImageNet. Our experiments show that PI-SVM, W-SVM and EVLB provide significant advances over the previous state-of-the-art solutions for the same tasks.
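The core EVT idea in this abstract — fit an extreme-value distribution to the distances from a positive sample to its nearest negatives, then let the membership probability fall off with distance — can be sketched on toy data. This is our own minimal illustration, not the EVLB implementation: all data and names are invented, and distances to negative samples stand in for distances to the separating hyperplane.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(0)

# Toy data: one positive sample at the origin, a cloud of negatives.
positive = np.zeros(2)
negatives = rng.normal(loc=3.0, scale=1.0, size=(200, 2))

# The extrema of the negative class: its k smallest distances
# to the positive sample.
k = 20
tail = np.sort(np.linalg.norm(negatives - positive, axis=1))[:k]

# Fit a Weibull -- the EVT limit distribution for minima -- to the tail.
shape, loc, scale = weibull_min.fit(tail, floc=0.0)

def inclusion_probability(x):
    """Score that is high near the positive sample and decays
    across the fitted boundary to the negative class."""
    d = np.linalg.norm(np.asarray(x, dtype=float) - positive)
    return float(weibull_min.sf(d, shape, loc=loc, scale=scale))

print(inclusion_probability([0.2, 0.0]) > inclusion_probability([2.5, 2.5]))  # True
```

Because the score is a survival function of distance, samples from unknown classes far from every positive sample receive low probability, which is the property the abstract attributes to the EVLB.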

  17. Rejecting probability summation for radial frequency patterns, not so Quick!

    Science.gov (United States)

    Baldwin, Alex S; Schmidtmann, Gunnar; Kingdom, Frederick A A; Hess, Robert F

    2016-05-01

    Radial frequency (RF) patterns are used to assess how the visual system processes shape. They are thought to be detected globally. This is supported by studies that have found summation for RF patterns to be greater than would be possible if the parts were detected independently and performance improved with an increasing number of cycles only through probability summation between them. However, the model of probability summation employed in these previous studies was based on High Threshold Theory (HTT) rather than Signal Detection Theory (SDT). We conducted rating scale experiments to investigate the receiver operating characteristics. We find these are of the curved form predicted by SDT, rather than the straight lines predicted by HTT. This means that to test probability summation we must use a model based on SDT. We conducted a set of summation experiments and found that thresholds decrease as the number of modulated cycles increases, at approximately the same rate as previously found. As this could be consistent with either additive or probability summation, we performed maximum-likelihood fitting of a set of summation models (Matlab code provided in our Supplementary material) and assessed the fits using cross-validation. We are not able to distinguish whether the responses to the parts of an RF pattern are combined by additive or probability summation, because the predictions are too similar. We present similar results for summation between separate RF patterns, suggesting that the summation process there may be the same as that within a single RF pattern. Copyright © 2016 Elsevier Ltd. All rights reserved.
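For concreteness, the HTT-style probability-summation account that these earlier studies assumed can be written down directly. Under a Quick/Weibull psychometric function, independent detection of n cycles predicts detection thresholds falling as n^(-1/β). The parameter values below are arbitrary illustrations, not fits to the paper's data:

```python
import numpy as np

def quick(c, alpha=1.0, beta=3.0):
    """Quick/Weibull psychometric function: probability of detecting
    a stimulus of contrast c given threshold alpha and slope beta."""
    return 1.0 - 2.0 ** (-(c / alpha) ** beta)

def prob_summation(c, n, alpha=1.0, beta=3.0):
    """HTT probability summation: the pattern is detected if any of
    n independent detectors (one per modulated cycle) fires."""
    return 1.0 - (1.0 - quick(c, alpha, beta)) ** n

def threshold(n, target=0.5):
    """Contrast at which prob_summation reaches `target` (bisection)."""
    lo, hi = 1e-9, 10.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if prob_summation(mid, n) < target else (lo, mid)
    return 0.5 * (lo + hi)

# Threshold-vs-n slope on log-log axes is -1/beta under this model:
t1, t4 = threshold(1), threshold(4)
print(round(np.log(t1 / t4) / np.log(4), 3))  # 0.333
```

The paper's point is that this tidy prediction rests on HTT assumptions; once the curved ROCs force an SDT model, the probability-summation and additive-summation predictions become too similar to tell apart.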

  18. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: they do if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in which circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
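The paper's intervals are simultaneous (joint coverage 1−α), which requires the authors' construction; a simpler pointwise version already illustrates the idea, using the fact that the i-th order statistic of n Uniform(0,1) draws is Beta(i, n−i+1). The sketch below is pointwise only, so its joint coverage is below 1−α:

```python
import numpy as np
from scipy.stats import beta, norm

rng = np.random.default_rng(1)
n, alpha = 50, 0.05
sample = np.sort(rng.normal(size=n))  # a genuinely normal sample

# Pointwise 1-alpha bands for each plotted point of a normal
# probability plot: the i-th order statistic of n uniforms is
# Beta(i, n-i+1); map its quantiles through the normal inverse CDF.
i = np.arange(1, n + 1)
lo = norm.ppf(beta.ppf(alpha / 2, i, n - i + 1))
hi = norm.ppf(beta.ppf(1 - alpha / 2, i, n - i + 1))

inside = (sample >= lo) & (sample <= hi)
print(inside.mean())  # close to 1 for a normal sample
```

Widening these bands until all n points fall inside jointly with probability 1−α is exactly the simultaneous construction the paper provides.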

  19. Anesthesiologists’ and surgeons’ perceptions about routine pre-operative testing in low-risk patients: application of the Theoretical Domains Framework (TDF) to identify factors that influence physicians’ decisions to order pre-operative tests

    Directory of Open Access Journals (Sweden)

    Patey Andrea M

    2012-06-01

    Full Text Available Abstract Background Routine pre-operative tests for anesthesia management are often ordered by both anesthesiologists and surgeons for healthy patients undergoing low-risk surgery. The Theoretical Domains Framework (TDF) was developed to investigate determinants of behaviour and identify potential behaviour change interventions. In this study, the TDF is used to explore anaesthesiologists’ and surgeons’ perceptions of ordering routine tests for healthy patients undergoing low-risk surgery. Methods Sixteen clinicians (eleven anesthesiologists and five surgeons) throughout Ontario were recruited. An interview guide based on the TDF was developed to identify beliefs about pre-operative testing practices. Content analysis of physicians’ statements into the relevant theoretical domains was performed. Specific beliefs were identified by grouping similar utterances of the interview participants. Relevant domains were identified by noting the frequencies of the beliefs reported, presence of conflicting beliefs, and perceived influence on the performance of the behaviour under investigation. Results Seven of the twelve domains were identified as likely relevant to changing clinicians’ behaviour about pre-operative test ordering for anesthesia management. Key beliefs were identified within these domains, including: conflicting comments about who was responsible for the test-ordering (Social/professional role and identity); inability to cancel tests ordered by fellow physicians (Beliefs about capabilities and Social influences); and the problem with tests being completed before the anesthesiologists see the patient (Beliefs about capabilities and Environmental context and resources). Often, tests were ordered by an anesthesiologist based on who might be the attending anesthesiologist on the day of surgery, while surgeons ordered tests they thought anesthesiologists might need (Social influences). There were also conflicting comments about the potential

  20. Same day ART initiation versus clinic-based pre-ART assessment and counselling for individuals newly tested HIV-positive during community-based HIV testing in rural Lesotho - a randomized controlled trial (CASCADE trial).

    Science.gov (United States)

    Labhardt, Niklaus Daniel; Ringera, Isaac; Lejone, Thabo Ishmael; Masethothi, Phofu; Thaanyane, T'sepang; Kamele, Mashaete; Gupta, Ravi Shankar; Thin, Kyaw; Cerutti, Bernard; Klimkait, Thomas; Fritz, Christiane; Glass, Tracy Renée

    2016-04-14

    Achievement of the UNAIDS 90-90-90 targets in Sub-Saharan Africa is challenged by a weak care cascade with poor linkage to care and retention in care. Community-based HIV testing and counselling (HTC) is widely used in African countries. However, rates of linkage to care and initiation of antiretroviral therapy (ART) among individuals who test HIV-positive are often very low. A frequently cited reason for non-linkage to care is the time-consuming pre-ART assessment, which often requires several clinic visits before ART initiation. This two-armed open-label randomized controlled trial compares, in individuals who test HIV-positive during community-based HTC, the offer of same-day community-based ART initiation against the standard-of-care pre-ART assessment at the clinic. Home-based HTC campaigns will be conducted in the catchment areas of six clinics in rural Lesotho. Households where at least one individual tests HIV-positive will be randomized. In the standard-of-care group, individuals receive post-test counselling and referral to the nearest clinic for pre-ART assessment and counselling. Once they have started ART, the follow-up schedule foresees monthly clinic visits. Individuals randomized to the intervention group receive on-the-spot point-of-care pre-ART assessment and adherence counselling, with the offer to start ART that same day. Once they have started ART, follow-up clinic visits will be less frequent. The first primary outcome is linkage to care (the individual presents at the clinic at least once within 3 months after the HIV test). The second primary outcome is viral suppression 12 months after enrolment in the study. We plan to enrol a minimum of 260 households with 1:1 allocation and parallel assignment into both arms. This trial will show whether, in individuals who test HIV-positive during community-based HTC campaigns, the offer of same-day ART initiation in the community, combined with less frequent follow-up visits at the clinic, could be a pragmatic approach to

  1. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  2. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
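    The dice example above contrasts the forward problem (probability) with the inverse problem (statistics). A minimal sketch, with illustrative rolls, of both directions:

```python
from fractions import Fraction
from itertools import product

# Forward problem (probability): with two fair dice, each of the 36
# ordered outcomes is equally likely a priori, so the probability of a
# specified event is just a count of favourable outcomes.
outcomes = list(product(range(1, 7), repeat=2))
p_sum_7 = Fraction(sum(1 for a, b in outcomes if a + b == 7), len(outcomes))
print(p_sum_7)  # 1/6

# Inverse problem (statistics): given observed rolls, infer how likely
# each side is to turn up; the natural estimate is the relative frequency.
rolls = [6, 3, 6, 1, 6, 2, 6, 6]
p_six_hat = sum(1 for r in rolls if r == 6) / len(rolls)
print(p_six_hat)  # 0.625
```

    As the abstract notes, the inverse direction is the harder one: the frequency estimate above is exact for the data at hand but remains uncertain as a statement about the die itself.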

  3. S.E.T., CSNI Separate Effects Test Facility Validation Matrix

    International Nuclear Information System (INIS)

    1997-01-01

    1 - Description of test facility: The SET matrix of experiments is suitable for the developmental assessment of thermal-hydraulic transient system computer codes, by selecting individual tests from selected facilities relevant to each phenomenon. Test facilities differ from one another in geometrical dimensions, geometrical configuration and operating capabilities or conditions. Correlations between SET facilities and phenomena were assessed on the basis of suitability for model validation (meaning that a facility is designed in such a way as to simulate the phenomena assumed to occur in a plant and is sufficiently instrumented); limited suitability for model validation (meaning that a facility is designed in such a way as to simulate the phenomena assumed to occur in a plant but has problems associated with imperfect scaling, different test fluids or insufficient instrumentation); and unsuitability for model validation. 2 - Description of test: Whereas integral experiments are usually designed to follow the behaviour of a reactor system in various off-normal or accident transients, separate effects tests focus on the behaviour of a single component, or on the characteristics of one thermal-hydraulic phenomenon. The construction of a separate effects test matrix is an attempt to collect together the best sets of openly available test data for code validation, assessment and improvement, from the wide range of experiments that have been carried out world-wide in the field of thermal hydraulics. In all, 2094 tests are included in the SET matrix.

  4. An Independent Filter for Gene Set Testing Based on Spectral Enrichment

    NARCIS (Netherlands)

    Frost, H Robert; Li, Zhigang; Asselbergs, Folkert W; Moore, Jason H

    2015-01-01

    Gene set testing has become an indispensable tool for the analysis of high-dimensional genomic data. An important motivation for testing gene sets, rather than individual genomic variables, is to improve statistical power by reducing the number of tested hypotheses. Given the dramatic growth in

  5. Portland, Oregon Test Data Set Arterial Loop Detector Data

    Data.gov (United States)

    Department of Transportation — This set of data files was acquired under USDOT FHWA cooperative agreement DTFH61-11-H-00025 as one of the four test data sets acquired by the USDOT Data Capture and...

  6. Portland, Oregon Test Data Set Freeway Loop Detector Data

    Data.gov (United States)

    Department of Transportation — This set of data files was acquired under USDOT FHWA cooperative agreement DTFH61-11-H-00025 as one of the four test data sets acquired by the USDOT Data Capture and...

  7. Pre-irradiation tests on U-Si alloys

    International Nuclear Information System (INIS)

    Howe, L.M.; Bell, L.G.

    1958-05-01

    Pre-irradiation tests of hardness, density, electrical resistivity, and corrosion resistance, as well as metallographic and X-ray examinations, were undertaken on U-Si core material co-extruded in Zr-2, so that the effect of irradiation on alloys in the epsilon range could be assessed. In addition, a study of the epsilonization of arc-melted material was undertaken in order to gain familiarity with the epsilonization process and to obtain information on the corrosion behaviour of epsilonized material. Sheathed U-Si samples in the epsilonized and de-epsilonized conditions have been irradiated in the X-2 loop at a water temperature of 275 °C. The samples have been examined after 250 MWD/tonne and show no dimensional change. (author)

  8. Problem-Solving Test: Nucleocytoplasmic Shuttling of Pre-mRNA Binding Proteins

    Science.gov (United States)

    Szeberenyi, Jozsef

    2012-01-01

    Terms to be familiar with before you start to solve the test: transcription, pre-mRNA, RNA processing, RNA transport, RNA polymerase II, direct and indirect immunofluorescence staining, cell fractionation by centrifugation, oligo(dT)-cellulose chromatography, washing and elution of the column, ribonuclease, SDS-polyacrylamide gel electrophoresis,…

  9. Probability, statistics, and associated computing techniques

    International Nuclear Information System (INIS)

    James, F.

    1983-01-01

    This chapter attempts to explore the extent to which it is possible for the experimental physicist to find optimal statistical techniques to provide a unique and unambiguous quantitative measure of the significance of raw data. Discusses statistics as the inverse of probability; normal theory of parameter estimation; normal theory (Gaussian measurements); the universality of the Gaussian distribution; real-life resolution functions; combination and propagation of uncertainties; the sum or difference of 2 variables; local theory, or the propagation of small errors; error on the ratio of 2 discrete variables; the propagation of large errors; confidence intervals; classical theory; Bayesian theory; use of the likelihood function; the second derivative of the log-likelihood function; multiparameter confidence intervals; the method of MINOS; least squares; the Gauss-Markov theorem; maximum likelihood for uniform error distribution; the Chebyshev fit; the parameter uncertainties; the efficiency of the Chebyshev estimator; error symmetrization; robustness vs. efficiency; testing of hypotheses (e.g., the Neyman-Pearson test); goodness-of-fit; distribution-free tests; comparing two one-dimensional distributions; comparing multidimensional distributions; and permutation tests for comparing two point sets
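    One of the topics listed above, the propagation of small errors for the ratio of two variables, reduces to adding relative uncertainties in quadrature. A minimal sketch with illustrative values, assuming independent measurements:

```python
import math

# First-order error propagation for a ratio f = a / b:
#   (sigma_f / f)^2 = (sigma_a / a)^2 + (sigma_b / b)^2,
# valid when the relative errors are small and a, b are independent.
a, sigma_a = 150.0, 5.0
b, sigma_b = 30.0, 2.0

f = a / b
sigma_f = f * math.sqrt((sigma_a / a) ** 2 + (sigma_b / b) ** 2)
print(f, sigma_f)  # 5.0 with an uncertainty of about 0.37
```

    For large relative errors, as the chapter discusses, this local linearization breaks down and the full distribution of the ratio must be propagated instead.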

  10. Development of Pre-set Counter-rotating Streamwise Vortices in Wavy Channel

    KAUST Repository

    Budiman, A.C.

    2015-10-23

    Development of counter-rotating streamwise vortices in a rectangular channel with one-sided wavy surface has been experimentally quantified using hot-wire anemometry. The wavy surface has fixed amplitude of 3.75 mm. The counter-rotating vortices are pre-set by means of a sawtooth pattern cut at the leading edge of the wavy surface. Variations of the central streamwise velocity Uc with a channel gap H = 35 mm and 50 mm (corresponding to a Reynolds number from 1600 to 4400) change the instability of the flow which can be distinguished from the velocity contours at a certain spanwise plane. The streamwise velocity contours and turbulence intensity for Reynolds number Re = 3100 and H = 35 mm show the disappearance of the mushroom-like vortices prior to turbulence near the second peak of the wavy surface, while for higher Re, this phenomenon occurs earlier. Under certain conditions, for example, for Re = 4400 and H = 50 mm, the splitting of the vortices can also be observed.

  11. Development of Pre-set Counter-rotating Streamwise Vortices in Wavy Channel

    KAUST Repository

    Budiman, A.C.; Mitsudharmadi, Hatsari; Bouremel, Y.; Winoto, S.H.; Low, H.T.

    2015-01-01

    Development of counter-rotating streamwise vortices in a rectangular channel with one-sided wavy surface has been experimentally quantified using hot-wire anemometry. The wavy surface has fixed amplitude of 3.75 mm. The counter-rotating vortices are pre-set by means of a sawtooth pattern cut at the leading edge of the wavy surface. Variations of the central streamwise velocity Uc with a channel gap H = 35 mm and 50 mm (corresponding to a Reynolds number from 1600 to 4400) change the instability of the flow which can be distinguished from the velocity contours at a certain spanwise plane. The streamwise velocity contours and turbulence intensity for Reynolds number Re = 3100 and H = 35 mm show the disappearance of the mushroom-like vortices prior to turbulence near the second peak of the wavy surface, while for higher Re, this phenomenon occurs earlier. Under certain conditions, for example, for Re = 4400 and H = 50 mm, the splitting of the vortices can also be observed.

  12. Automated ultrasonic testing of nuclear reactor welds and overlays in pre-service and in-service inspections

    International Nuclear Information System (INIS)

    Sladky, J.

    1988-01-01

    Since 1982, automated pre-service and in-service checks have been made of welded joints and overlays on pressure vessels of WWER-440 nuclear reactors in Czechoslovakia. This is done using the SKODA REACTORTEST TRC facility, which is used for checking peripheral welded joints on the pressure vessel, neck joints, overlays in other selected areas of the cylindrical section of the pressure vessel, on radius transitions of the pressure vessel and of necks, and on the cylindrical part of necks, and also for checking the base material in selected parts of the pressure vessel and the base material of the neck extension piece. The tests are of two types, namely tests of peripheral welds and overlays of the cylindrical parts of the pressure vessel, and tests of the necks. Ultrasonic probe holders of entirely different designs are used for the two types of test. The ultrasonic probes initially used were of foreign make, while at present those of Czechoslovak make are used. For each pressure vessel a set of ultrasonic probes is used which should suffice for the life of the vessel. Experience gained so far is being used in work on the project of a new device for testing nuclear reactor pressure vessels from the inside. (Z.M.)

  13. Pre-examination factors affecting molecular diagnostic test results and interpretation: A case-based approach.

    Science.gov (United States)

    Payne, Deborah A; Baluchova, Katarina; Peoc'h, Katell H; van Schaik, Ron H N; Chan, K C Allen; Maekawa, Masato; Mamotte, Cyril; Russomando, Graciela; Rousseau, François; Ahmad-Nejad, Parviz

    2017-04-01

    Multiple organizations produce guidance documents that provide opportunities to harmonize quality practices for diagnostic testing. The International Organization for Standardization ISO 15189 standard addresses requirements for quality in management and technical aspects of the clinical laboratory. One technical aspect addresses the complexities of the pre-examination phase prior to diagnostic testing. The Committee for Molecular Diagnostics of the International Federation for Clinical Chemistry and Laboratory Medicine (also known as IFCC C-MD) conducted a survey of international molecular laboratories and determined ISO 15189 to be the most referenced guidance document. In this review, the IFCC C-MD provides case-based examples illustrating the value of select pre-examination processes as these processes relate to molecular diagnostic testing. Case-based examples in infectious disease, oncology, inherited disease and pharmacogenomics address the utility of: 1) providing information to patients and users, 2) designing requisition forms, 3) obtaining informed consent and 4) maintaining sample integrity prior to testing. The pre-examination phase requires extensive and consistent communication between the laboratory, the healthcare provider and the end user. The clinical vignettes presented in this paper illustrate the value of applying select ISO 15189 recommendations for the general laboratory to the more specialized area of molecular diagnostics. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Pre-scission 4He multiplicity in the 19F+197Au reaction

    International Nuclear Information System (INIS)

    Ikezoe, H.; Shikazono, N.; Nagame, Y.; Sugiyama, Y.; Tomita, Y.; Ideno, K.; Iwamoto, A.; Ohtsuki, T.

    1990-01-01

    Pre- and post-scission 4He particle multiplicities for the 19F+197Au reaction in the excitation energy range of 43 to 90 MeV have been measured in coincidence with fission fragments. The coincident 4He particles measured at backward angles are accounted for by evaporation from a compound nucleus and fission fragments. The most probable center-of-mass energy of the 4He particles measured at backward angles is shifted towards lower energies by ∼2 MeV compared to a statistical-model calculation performed by assuming 4He emission from a spherical compound nucleus. The observed pre-scission 4He multiplicity as a function of excitation energy is compared to a set of statistical-model calculations which also included the delayed onset of fission. The comparison shows that the observed energy dependence of the pre-scission 4He multiplicity is reproduced by the calculation without taking into account the delayed onset of fission if a reduced emission barrier for 4He is assumed in the calculation. The emission mechanism of the pre-scission 4He is discussed.

  15. 3D Orthorhombic Elastic Wave Propagation Pre-Test Simulation of SPE DAG-1 Test

    Science.gov (United States)

    Jensen, R. P.; Preston, L. A.

    2017-12-01

    A more realistic representation of many geologic media can be characterized as a dense system of vertically-aligned microfractures superimposed on a finely-layered horizontal geology found in shallow crustal rocks. This seismic anisotropy representation lends itself to being modeled as an orthorhombic elastic medium comprising three mutually orthogonal symmetry planes containing nine independent moduli. These moduli can be determined by observing (or prescribing) nine independent P-wave and S-wave phase speeds along different propagation directions. We have developed an explicit time-domain finite-difference (FD) algorithm for simulating 3D elastic wave propagation in a heterogeneous orthorhombic medium. The components of the particle velocity vector and the stress tensor are governed by a set of nine, coupled, first-order, linear, partial differential equations (PDEs) called the velocity-stress system. All time and space derivatives are discretized with centered and staggered FD operators possessing second- and fourth-order numerical accuracy, respectively. Additionally, we have implemented novel perfectly matched layer (PML) absorbing boundary conditions, specifically designed for orthorhombic media, to effectively suppress grid boundary reflections. In support of the Source Physics Experiment (SPE) Phase II, a series of underground chemical explosions at the Nevada National Security Site, the code has been used to perform pre-test estimates of the Dry Alluvium Geology - Experiment 1 (DAG-1). Based on literature searches, realistic geologic structure and values for orthorhombic P-wave and S-wave speeds have been estimated. Results and predictions from the simulations are presented.
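    The velocity-stress formulation described above can be illustrated in one dimension. The following is a minimal sketch of the staggered-grid leapfrog update pattern only (grid sizes, material values and the source wavelet are illustrative; the actual 3D orthorhombic code couples nine field components and uses fourth-order spatial stencils plus PML boundaries):

```python
import numpy as np

# 1-D velocity-stress system: dv/dt = (1/rho) ds/dx, ds/dt = M dv/dx,
# discretized with second-order centered differences on a grid where
# velocity and stress live at staggered positions and times.
nx, nt = 200, 400
dx, dt = 5.0, 5e-4                      # Courant number vp*dt/dx = 0.3
rho, M = 2000.0, 2000.0 * 3000.0 ** 2   # density and modulus (vp = 3 km/s)

v = np.zeros(nx)       # particle velocity at integer grid points
s = np.zeros(nx - 1)   # stress at half-integer grid points

for it in range(nt):
    # inject a simple Gaussian source wavelet into the stress field
    s[nx // 2] += np.exp(-((it * dt - 0.05) / 0.01) ** 2)
    v[1:-1] += dt / rho * (s[1:] - s[:-1]) / dx   # velocity update
    s += dt * M * (v[1:] - v[:-1]) / dx           # stress update
```

    The time-staggering (velocity updated from the current stress, then stress from the new velocity) is what makes the scheme explicit and leapfrog-stable under the usual Courant condition.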

  16. Change in settings for early-season influenza vaccination among US adults, 2012 to 2013

    Directory of Open Access Journals (Sweden)

    Sarah J. Clark, MPH

    2016-12-01

    Vaccination in non-medical settings is recommended as a strategy to increase access to seasonal influenza vaccine. To evaluate change in early-season influenza vaccination setting, we analyzed data from the National Internet Flu Survey. Bivariate comparison of respondent characteristics by location of vaccination was assessed using chi-square tests. Multinomial logistic regression was performed to compare the predicted probability of being vaccinated in medical, retail, and mobile settings in 2012 vs 2013. In both 2012 and 2013, vaccination in medical settings was more likely among elderly adults, those with chronic conditions, and adults with a high school education or less. Adults 18–64 without a chronic condition had a lower probability of vaccination in a medical setting, and a higher probability of vaccination in a retail or mobile setting, in 2013 compared to 2012. Adults 18–64 with a chronic condition had no change in their location of flu vaccination. Elderly adults had a lower probability of vaccination in a medical setting, and a higher probability of vaccination in a retail setting, in 2013 compared to 2012. Non-medical settings continue to play an increasing role in influenza vaccination of adults, particularly for adults without a chronic condition and elderly adults. Retail and mobile settings should continue to be viewed as important mechanisms to ensure broad access to influenza vaccination.

  17. Automotive RF immunity test set-up analysis : why test results can't compare

    NARCIS (Netherlands)

    Coenen, Mart; Pues, H.; Bousquet, T.

    2011-01-01

    Though the automotive RF emission and RF immunity requirements are highly justifiable, applying those requirements in a non-intended manner leads to false conclusions and unnecessary redesigns of the electronics involved. When the test results become too dependent upon the test set-up

  18. Experimental data report for test TS-3 Reactivity Initiated Accident test in the NSRR with pre-irradiated BWR fuel rod

    International Nuclear Information System (INIS)

    Nakamura, Takehiko; Yoshinaga, Makio; Fujishiro, Toshio; Kobayashi, Shinsho; Yamahara, Takeshi; Sukegawa, Tomohide; Kikuchi, Teruo; Sobajima, Makoto.

    1993-09-01

    This report presents experimental data for Test TS-3, the third test in a series of Reactivity Initiated Accident (RIA) tests using pre-irradiated BWR fuel rods, performed in the Nuclear Safety Research Reactor (NSRR) in September 1990. The test fuel rod used in Test TS-3 was a short-sized BWR (7 x 7) type rod which was re-fabricated from a commercial rod irradiated in the Tsuruga Unit 1 power reactor of Japan Atomic Power Co. The fuel had an initial enrichment of 2.79 % and a burnup of 26 GWd/tU. A pulse irradiation of the test fuel rod was performed under a cooling condition of stagnant water at atmospheric pressure and ambient temperature, simulating a BWR cold start-up RIA event. The energy deposition in the fuel rod in this test was evaluated to be 94 ± 4 cal/g · fuel (88 ± 4 cal/g · fuel in peak fuel enthalpy) and no fuel failure was observed. Test conditions, test procedures, the transient behavior of the test rod during the pulse irradiation, and the results of pre-pulse and post-pulse irradiation examinations are described in this report. (author)

  19. Development of a grinding-specific performance test set-up

    DEFF Research Database (Denmark)

    Olesen, C. G.; Larsen, B. H.; Andresen, E. L.

    2015-01-01

    The aim of this study was to develop a performance test set-up for America's Cup grinders. The test set-up had to mimic the on-boat grinding activity and be capable of collecting data for analysis and evaluation of grinding performance. This study included a literature-based analysis of grinding demands and a test protocol developed to accommodate the necessary physiological loads. This study resulted in a test protocol consisting of 10 intervals of 20 revolutions each interspersed with active resting periods of 50 s. The 20 revolutions are a combination of both forward and backward grinding and an exponentially rising resistance. A custom-made grinding ergometer was developed with computer-controlled resistance and capable of collecting data during the test. The data collected can be used to find measures of grinding performance such as peak power, time to complete and the decline in repeated grinding performance.

  20. Development of a grinding-specific performance test set-up.

    Science.gov (United States)

    Olesen, C G; Larsen, B H; Andresen, E L; de Zee, M

    2015-01-01

    The aim of this study was to develop a performance test set-up for America's Cup grinders. The test set-up had to mimic the on-boat grinding activity and be capable of collecting data for analysis and evaluation of grinding performance. This study included a literature-based analysis of grinding demands and a test protocol developed to accommodate the necessary physiological loads. This study resulted in a test protocol consisting of 10 intervals of 20 revolutions each interspersed with active resting periods of 50 s. The 20 revolutions are a combination of both forward and backward grinding and an exponentially rising resistance. A custom-made grinding ergometer was developed with computer-controlled resistance and capable of collecting data during the test. The data collected can be used to find measures of grinding performance such as peak power, time to complete and the decline in repeated grinding performance.

  1. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.

  2. Students' Understanding of Conditional Probability on Entering University

    Science.gov (United States)

    Reaburn, Robyn

    2013-01-01

    An understanding of conditional probability is essential for students of inferential statistics as it is used in Null Hypothesis Tests. Conditional probability is also used in Bayes' theorem, in the interpretation of medical screening tests and in quality control procedures. This study examines the understanding of conditional probability of…
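    The use of conditional probability in interpreting medical screening tests, mentioned above, is an application of Bayes' theorem: the post-test probability of disease depends on the pre-test probability (prevalence) as much as on test sensitivity and specificity. A minimal sketch with illustrative numbers:

```python
# Bayes' theorem for a screening test:
#   P(disease | positive) = sens * pre / (sens * pre + (1 - spec) * (1 - pre))
def post_test_probability(pre_test, sensitivity, specificity):
    p_positive = sensitivity * pre_test + (1 - specificity) * (1 - pre_test)
    return sensitivity * pre_test / p_positive

# A 99%-sensitive, 95%-specific test applied at 1% prevalence:
ppv = post_test_probability(0.01, 0.99, 0.95)
print(ppv)  # about 0.167: most positives are false positives
```

    The counter-intuitive result, a highly accurate test yielding a post-test probability of only one in six, is exactly the kind of conditional-probability reasoning the study examines.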

  3. Use of operational data for the assessment of pre-existing software

    International Nuclear Information System (INIS)

    Helminen, Atte; Gran, Bjoern Axel; Kristiansen, Monica; Winther, Rune

    2004-01-01

    To build sufficient confidence in the reliability of the safety systems of nuclear power plants, all available sources of information should be used. One important data source is the operational experience collected for the system. Operational experience is particularly applicable to systems based on pre-existing software. Even though systems and devices involving pre-existing software are not considered for the functions of the highest safety levels of nuclear power plants, they will most probably be introduced into functions of lower safety levels and into non-safety-related applications. In the paper we briefly discuss the use of operational experience data for the reliability assessment of pre-existing software in general, and the role of pre-existing software in relation to safety applications. We then discuss the modelling of operational profiles, the application of expert judgement on operational profiles and the need for a realistic test case. Finally, we discuss the application of operational experience data in Bayesian statistics. (Author)
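    One standard way to apply Bayesian statistics to operational experience data of the kind discussed above is a conjugate Beta-binomial update, turning observed failure-free demands into a posterior for the probability of failure per demand. A minimal sketch; the prior and the operational record below are illustrative, not from the paper:

```python
# Beta(a, b) prior on the probability of failure per demand, updated with
# an operational record of n_demands demands and n_failures failures.
a, b = 1.0, 1.0                      # uniform prior: no prior knowledge
n_demands, n_failures = 5000, 0      # illustrative failure-free experience

a_post = a + n_failures
b_post = b + n_demands - n_failures
mean_pfd = a_post / (a_post + b_post)    # posterior mean failure probability

# With zero observed failures the posterior is Beta(1, b_post), whose CDF
# is 1 - (1 - p)**b_post, so the 95% upper credible bound is closed-form.
ub95 = 1.0 - 0.05 ** (1.0 / b_post)
print(mean_pfd, ub95)  # roughly 2e-4 and 6e-4
```

    The closed-form bound holds only in the zero-failure case; with observed failures one would evaluate the Beta posterior quantile numerically. The validity of either number rests on the operational profile matching the demand profile, which is why the paper stresses operational-profile modelling.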

  4. In-situ medical simulation for pre-implementation testing of clinical service in a regional hospital in Hong Kong.

    Science.gov (United States)

    Chen, P P; Tsui, N Tk; Fung, A Sw; Chiu, A Hf; Wong, W Cw; Leong, H T; Lee, P Sf; Lau, J Yw

    2017-08-01

    The implementation of a new clinical service is associated with anxiety and challenges that may prevent smooth and safe execution of the service. Unexpected issues may not be apparent until the actual clinical service commences. We present a novel approach to testing the new clinical setting before actual implementation of our endovascular aortic repair service. In-situ simulation at the new clinical location would enable identification of potential process and system issues prior to implementation of the service. After preliminary planning, a simulation test utilising a case scenario with actual simulation of the entire care process was carried out to identify any logistic, equipment, setting or clinical workflow issues, and to trial a contingency plan for a surgical complication. All patient care, including anaesthetic, surgical, and nursing procedures and processes, was simulated and tested. Overall, 17 vital process and system issues were identified during the simulation as potential clinical concerns. They included difficult patient positioning, draping pattern, unsatisfactory equipment setup, inadequate critical surgical instruments, blood products logistics, and inadequate nursing support during crisis. In-situ simulation provides an innovative method to identify critical deficiencies and unexpected issues before implementation of a new clinical service. Life-threatening and serious practical issues can be identified and corrected before formal service commences. This article describes our experience with the use of simulation in pre-implementation testing of a clinical process or service. We found the method useful and would recommend it to others.

  5. Complot test section outlet CFD optimization (pre-test and dimensioning)

    International Nuclear Information System (INIS)

    Profir, M. M.; Moreau, V.; Kennedy, G.

    2013-01-01

    In the framework of the FP7 MAXSIMA European project, the COMPLOT (COMPonent LOop Testing) LBE experimental facility is employed for thermal-hydraulic experiments aimed to test and qualify, among other components, a buoyancy driven safety/control rods (SR/CR) system, as key components for the safe operation of the MYRRHA reactor. This paper focuses mainly on a simplified CFD representation of the SR test section outlet in order to optimise it for the testing program. Parametric cases, associated with different positions of the SR assembly have been set up and analysed. A quasi-static analysis has been performed for each case, accounting for the LBE volume displaced by the insertion of the SR bundle, by introducing appropriately positioned additional mass sources. Velocity and pressure fields, as well as pressure drop magnitudes and mass flow rates through relevant guide tube hole outlets have been calculated and compared. The CFD analysis proved that the outer boundary of the test section does not impact the expected performance of the SR (rapid transient downward insertion). Preliminary simulations reproducing the timely repositioning of the SR/CR in COMPLOT using procedures of automatic volume mesh regeneration, consistently with the rod imposed displacement, are illustrated. (authors)

  6. Pre-Test Analysis Predictions for the Shell Buckling Knockdown Factor Checkout Tests - TA01 and TA02

    Science.gov (United States)

    Thornburgh, Robert P.; Hilburger, Mark W.

    2011-01-01

    This report summarizes the pre-test analysis predictions for the SBKF-P2-CYL-TA01 and SBKF-P2-CYL-TA02 shell buckling tests conducted at the Marshall Space Flight Center (MSFC) in support of the Shell Buckling Knockdown Factor (SBKF) Project, NASA Engineering and Safety Center (NESC) Assessment. The test article (TA) is an 8-foot-diameter aluminum-lithium (Al-Li) orthogrid cylindrical shell with similar design features as that of the proposed Ares-I and Ares-V barrel structures. In support of the testing effort, detailed structural analyses were conducted and the results were used to monitor the behavior of the TA during the testing. A summary of predicted results for each of the five load sequences is presented herein.

  7. Evaluating Diagnostic Point-of-Care Tests in Resource-Limited Settings

    Science.gov (United States)

    Drain, Paul K; Hyle, Emily P; Noubary, Farzad; Freedberg, Kenneth A; Wilson, Douglas; Bishai, William; Rodriguez, William; Bassett, Ingrid V

    2014-01-01

    Diagnostic point-of-care (POC) testing is intended to minimize the time to obtain a test result, thereby allowing clinicians and patients to make an expeditious clinical decision. As POC tests expand into resource-limited settings (RLS), the benefits must outweigh the costs. To optimize POC testing in RLS, diagnostic POC tests need rigorous evaluations focused on relevant clinical outcomes and operational costs, which differ from evaluations of conventional diagnostic tests. Here, we reviewed published studies on POC testing in RLS, and found no clearly defined metric for the clinical utility of POC testing. Therefore, we propose a framework for evaluating POC tests, and suggest and define the term “test efficacy” to describe a diagnostic test’s capacity to support a clinical decision within its operational context. We also proposed revised criteria for an ideal diagnostic POC test in resource-limited settings. Through systematic evaluations, comparisons between centralized diagnostic testing and novel POC technologies can be more formalized, and health officials can better determine which POC technologies represent valuable additions to their clinical programs. PMID:24332389

  8. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability is replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated by the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic theories. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  9. Transition probabilities of Ce I obtained from Boltzmann analysis of visible and near-infrared emission spectra

    Science.gov (United States)

    Nitz, D. E.; Curry, J. J.; Buuck, M.; DeMann, A.; Mitchell, N.; Shull, W.

    2018-02-01

    We report radiative transition probabilities for 5029 emission lines of neutral cerium within the wavelength range 417-1110 nm. Transition probabilities for only 4% of these lines have been previously measured. These results are obtained from a Boltzmann analysis of two high resolution Fourier transform emission spectra used in previous studies of cerium, obtained from the digital archives of the National Solar Observatory at Kitt Peak. The set of transition probabilities used for the Boltzmann analysis are those published by Lawler et al (2010 J. Phys. B: At. Mol. Opt. Phys. 43 085701). Comparisons of branching ratios and transition probabilities for lines common to the two spectra provide important self-consistency checks and tests for the presence of self-absorption effects. Estimated 1σ uncertainties for our transition probability results range from 10% to 18%.
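The Boltzmann-plot relation underlying this analysis can be sketched numerically. The snippet below uses synthetic intensities and an illustrative temperature, not the paper's Ce I data: for optically thin emission, ln(Iλ/gA) is linear in the upper-level energy with slope −1/kT, so lines with known gA calibrate the plot, which then converts a measured intensity into a gA estimate for another line.

```python
import numpy as np

# Toy Boltzmann-plot analysis (illustrative parameters, not the paper's data).
# For an optically thin line, I ~ (g*A/lambda)*exp(-E_up/kT), so
# ln(I*lambda/(g*A)) is linear in E_up with slope -1/kT.
kT = 0.60  # assumed excitation temperature in eV for the synthetic spectrum

rng = np.random.default_rng(1)
E_up = rng.uniform(2.0, 5.0, 40)      # upper-level energies (eV)
gA = rng.uniform(1e6, 1e8, 40)        # known gA values (s^-1), playing the
                                      # role of the Lawler et al. reference set
lam = rng.uniform(417.0, 1110.0, 40)  # wavelengths (nm)
I = (gA / lam) * np.exp(-E_up / kT)   # synthetic line intensities

# Fit the Boltzmann plot to recover the temperature
slope, intercept = np.polyfit(E_up, np.log(I * lam / gA), 1)
kT_fit = -1.0 / slope

# The fitted plot then turns a measured intensity of a further line
# into a transition-probability estimate
E_new, lam_new, gA_true = 3.3, 650.0, 5e7
I_new = (gA_true / lam_new) * np.exp(-E_new / kT)
gA_est = I_new * lam_new * np.exp(E_new / kT_fit - intercept)
```

With noiseless synthetic data the fit recovers the input temperature and gA essentially exactly; real spectra add the 10-18% uncertainties the abstract quotes.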

  10. Exploring non-signalling polytopes with negative probability

    International Nuclear Information System (INIS)

    Oas, G; Barros, J Acacio de; Carvalhaes, C

    2014-01-01

    Bipartite and tripartite EPR–Bell type systems are examined via joint quasi-probability distributions where probabilities are permitted to be negative. It is shown that such distributions exist only when the no-signalling condition is satisfied. A characteristic measure, the probability mass, is introduced and, via its minimization, limits the number of quasi-distributions describing a given marginal probability distribution. The minimized probability mass is shown to be an alternative way to characterize non-local systems. Non-signalling polytopes for two to eight settings in the bipartite scenario are examined and compared to prior work. Examining perfect cloning of non-local systems within the tripartite scenario suggests defining two categories of signalling. It is seen that many properties of non-local systems can be efficiently described by quasi-probability theory. (paper)
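The no-signalling condition and the Popescu-Rohrlich box referenced above are standard constructions that can be checked directly. The sketch below (not code from the paper) verifies that the PR box has setting-independent marginals yet reaches the algebraic maximum CHSH value of 4:

```python
import itertools

# The Popescu-Rohrlich box: P(a,b|x,y) = 1/2 if a XOR b = x AND y, else 0.
def pr_box(a, b, x, y):
    return 0.5 if (a ^ b) == (x & y) else 0.0

# No-signalling: each party's marginal is independent of the other's setting
for x, a in itertools.product((0, 1), repeat=2):
    marginals = [sum(pr_box(a, b, x, y) for b in (0, 1)) for y in (0, 1)]
    assert marginals[0] == marginals[1] == 0.5

# Yet the box is maximally non-local: correlators E(x,y) give CHSH value 4,
# beyond the quantum (Tsirelson) bound of 2*sqrt(2)
def corr(x, y):
    return sum((-1) ** (a ^ b) * pr_box(a, b, x, y)
               for a in (0, 1) for b in (0, 1))

S = corr(0, 0) + corr(0, 1) + corr(1, 0) - corr(1, 1)
```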

  11. Integration of educational methods and physical settings: design guidelines for High/Scope methodology in pre-schools

    Directory of Open Access Journals (Sweden)

    Shirin Izadpanah

    2014-06-01

    Quality design and appropriate space organization in preschool settings can support preschool children's educational activities. Although the relationship between the well-being and development of children and physical settings has been emphasized by many early childhood researchers, there is still a need for theoretical design guidelines geared towards improving this relationship. This research focuses on High/Scope education and aims to shape a theoretical guideline that raises teachers' awareness of the potential of learning spaces and guides them in improving the quality of physical spaces. To create a theoretical framework, reliable sources are investigated in the light of High/Scope education and the requirements of pre-school children's educational spaces. Physical space characteristics, preschool children's requirements, and High/Scope methodology are integrated to identify the design inputs, design considerations, and recommendations that shape the final guideline for spatial arrangement in a High/Scope setting. Discussions and suggestions in this research benefit both designers and High/Scope teaching staff. The results help High/Scope teaching staff increase the quality of a space in an educational setting without having an architectural background. The theoretical framework of the research allows designers to consider key features and users' possible activities in High/Scope settings and shape their designs accordingly.

  12. Cytologic diagnosis: expression of probability by clinical pathologists.

    Science.gov (United States)

    Christopher, Mary M; Hotz, Christine S

    2004-01-01

    Clinical pathologists use descriptive terms or modifiers to express the probability or likelihood of a cytologic diagnosis. Words are imprecise in meaning, however, and may be used and interpreted differently by pathologists and clinicians. The goals of this study were to 1) assess the frequency of use of 18 modifiers, 2) determine the probability of a positive diagnosis implied by the modifiers, 3) identify preferred modifiers for different levels of probability, 4) ascertain the importance of factors that affect expression of diagnostic certainty, and 5) evaluate differences based on gender, employment, and experience. We surveyed 202 clinical pathologists who were board-certified by the American College of Veterinary Pathologists (Clinical Pathology). Surveys were distributed in October 2001 and returned by e-mail, fax, or surface mail over a 2-month period. Results were analyzed by parametric and nonparametric tests. Survey response rate was 47.5% (n = 96) and primarily included clinical pathologists at veterinary schools (n = 58) and diagnostic laboratories (n = 31). Eleven of 18 terms were used "often" or "sometimes" by ≥ 50% of respondents. Broad variability was found in the probability assigned to each term, especially those with median values of 75 to 90%. Preferred modifiers for 7 numerical probabilities ranging from 0 to 100% included 68 unique terms; however, a set of 10 terms was used by ≥ 50% of respondents. Cellularity and quality of the sample, experience of the pathologist, and implications of the diagnosis were the most important factors affecting the expression of probability. Because of wide discrepancy in the implied likelihood of a diagnosis using words, defined terminology and controlled vocabulary may be useful in improving communication and the quality of data in cytology reporting.

  13. Pre-Clinical Testing of Real-Time PCR Assays for Diarrheal Disease Agents of Genera Escherichia and Shigella

    Science.gov (United States)

    2014-05-16

    Reporting period: October 1, 2010 to September 30, 2013. MOA 2007-2013; Agreement No.: DODI 4000.19; AFI 25-201. Pre-clinical test results qualify ETEC and Shigella real-time PCR assays as lead

  14. Pre- and Post-Test Results of KEEP Class 2: 1973-74. Technical Report #40.

    Science.gov (United States)

    Fox, Candy

    This report presents the pre- and post-test results for the kindergarten year of the Kamehameha Early Education Program (KEEP) Class 2, 1973-1974. Results are presented for the Wechsler Preschool and Primary Scale of Intelligence (WPPSI), the Metropolitan Readiness Test (MRT), and the Standard English Repetition Test (SERT). Comparisons are made…

  15. Cost-Effectiveness Analysis Comparing Pre-diagnosis Autism Spectrum Disorder (ASD)-Targeted Intervention with Ontario's Autism Intervention Program.

    Science.gov (United States)

    Penner, Melanie; Rayar, Meera; Bashir, Naazish; Roberts, S Wendy; Hancock-Howard, Rebecca L; Coyte, Peter C

    2015-09-01

    Novel management strategies for autism spectrum disorder (ASD) propose providing interventions before diagnosis. We performed a cost-effectiveness analysis comparing the costs and dependency-free life years (DFLYs) generated by pre-diagnosis intensive Early Start Denver Model (ESDM-I); pre-diagnosis parent-delivered ESDM (ESDM-PD); and the Ontario Status Quo (SQ). The analyses took government and societal perspectives to age 65. We assigned probabilities of Independent, Semi-dependent or Dependent living based on projected IQ. Costs per person (in Canadian dollars) were ascribed to each living setting. From a government perspective, the ESDM-PD produced an additional 0.17 DFLYs for $8600 less than SQ. From a societal perspective, the ESDM-I produced an additional 0.53 DFLYs for $45,000 less than SQ. Pre-diagnosis interventions targeting ASD symptoms warrant further investigation.

  16. Barley Seed Germination/Root Elongation Toxicity Test For Evaluation Of Sludge Pre-Treatment

    DEFF Research Database (Denmark)

    Eriksson, Eva; Kusk, Kresten Ole; Barrett Sørensen, Mie

    Application of sludge from wastewater treatment plants (WWTPs) on agricultural land is an approach for nutrient recycling that raises challenges due to recalcitrant and harmful pollutants. In this study we assessed the feasibility of a seed germination test to evaluate sludge ecotoxicity and compared germination responses from two test parameters, root elongation and seed germination (sprout elongation), of barley (Hordeum vulgare). A second objective was to evaluate sewage sludge pre-treatments at batch scale on sludge samples from two WWTPs using anaerobic digestion, and thermal and ozonation pre-treatments. Glyphosate and eco-labelled soil were used as references. Inhibition of germination of seeds exposed to the glyphosate and sludge was registered, and thus germination was successfully applied for sludge ecotoxicity assessment; using the root elongation as the end-point was both faster and more precise…

  17. Experimental Implementation of a Kochen-Specker Set of Quantum Tests

    Directory of Open Access Journals (Sweden)

    Vincenzo D’Ambrosio

    2013-02-01

    The conflict between classical and quantum physics can be identified through a series of yes-no tests on quantum systems, without it being necessary that these systems be in special quantum states. Kochen-Specker (KS) sets of yes-no tests have this property and provide a quantum-versus-classical advantage that is free of the initialization problem that affects some quantum computers. Here, we report the first experimental implementation of a complete KS set that consists of 18 yes-no tests on four-dimensional quantum systems and show how to use the KS set to obtain a state-independent quantum advantage. We first demonstrate the unique power of this KS set for solving a task while avoiding the problem of state initialization. Such a demonstration is done by showing that, for 28 different quantum states encoded in the orbital-angular-momentum and polarization degrees of freedom of single photons, the KS set provides an impossible-to-beat solution. In a second experiment, we generate maximally contextual quantum correlations by performing compatible sequential measurements of the polarization and path of single photons. In this case, state independence is demonstrated for 15 different initial states. Maximum contextuality and state independence follow from the fact that the sequences of measurements project any initial quantum state onto one of the KS set’s eigenstates. Our results show that KS sets can be used for quantum-information processing and quantum computation and pave the way for future developments.

  18. Experimental Implementation of a Kochen-Specker Set of Quantum Tests

    Science.gov (United States)

    D'Ambrosio, Vincenzo; Herbauts, Isabelle; Amselem, Elias; Nagali, Eleonora; Bourennane, Mohamed; Sciarrino, Fabio; Cabello, Adán

    2013-01-01

    The conflict between classical and quantum physics can be identified through a series of yes-no tests on quantum systems, without it being necessary that these systems be in special quantum states. Kochen-Specker (KS) sets of yes-no tests have this property and provide a quantum-versus-classical advantage that is free of the initialization problem that affects some quantum computers. Here, we report the first experimental implementation of a complete KS set that consists of 18 yes-no tests on four-dimensional quantum systems and show how to use the KS set to obtain a state-independent quantum advantage. We first demonstrate the unique power of this KS set for solving a task while avoiding the problem of state initialization. Such a demonstration is done by showing that, for 28 different quantum states encoded in the orbital-angular-momentum and polarization degrees of freedom of single photons, the KS set provides an impossible-to-beat solution. In a second experiment, we generate maximally contextual quantum correlations by performing compatible sequential measurements of the polarization and path of single photons. In this case, state independence is demonstrated for 15 different initial states. Maximum contextuality and state independence follow from the fact that the sequences of measurements project any initial quantum state onto one of the KS set’s eigenstates. Our results show that KS sets can be used for quantum-information processing and quantum computation and pave the way for future developments.
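The combinatorial structure behind these two records can be checked by brute force. The sketch below assumes the 18-vector, 9-basis set in dimension 4 is the Cabello-Estebaranz-García-Alcaine construction (the experiment's KS set has the same size; the exact vectors are an assumption here). It verifies that each basis is orthogonal, that each ray appears in exactly two bases, and that no noncontextual {0,1} assignment puts exactly one "yes" in every basis:

```python
import itertools
import numpy as np

# An 18-vector, 9-basis KS set in dimension 4 (assumed to be the
# Cabello-Estebaranz-Garcia-Alcaine construction).
bases = [
    [(0,0,0,1), (0,0,1,0), (1,1,0,0), (1,-1,0,0)],
    [(0,0,0,1), (0,1,0,0), (1,0,1,0), (1,0,-1,0)],
    [(1,-1,1,-1), (1,-1,-1,1), (1,1,0,0), (0,0,1,1)],
    [(1,-1,1,-1), (1,1,1,1), (1,0,-1,0), (0,1,0,-1)],
    [(0,0,1,0), (0,1,0,0), (1,0,0,1), (1,0,0,-1)],
    [(1,-1,-1,1), (1,1,1,1), (1,0,0,-1), (0,1,-1,0)],
    [(1,1,-1,1), (1,1,1,-1), (1,-1,0,0), (0,0,1,1)],
    [(1,1,-1,1), (-1,1,1,1), (1,0,1,0), (0,1,0,-1)],
    [(1,1,1,-1), (-1,1,1,1), (1,0,0,1), (0,1,-1,0)],
]

vectors = sorted({v for basis in bases for v in basis})
assert len(vectors) == 18                  # 18 distinct rays
for basis in bases:                        # each basis is orthogonal
    g = np.array(basis) @ np.array(basis).T
    assert not (g - np.diag(np.diag(g))).any()
for v in vectors:                          # each ray sits in exactly 2 bases
    assert sum(v in basis for basis in bases) == 2

# Brute force over all 2^18 value assignments: none puts exactly one
# "yes" in every basis -- the Kochen-Specker contradiction
idx = {v: i for i, v in enumerate(vectors)}
basis_idx = [[idx[v] for v in basis] for basis in bases]
colorable = any(
    all(sum(bits[i] for i in b) == 1 for b in basis_idx)
    for bits in itertools.product((0, 1), repeat=18)
)
```

The parity argument explains the result: nine bases would require nine "yes" values in total (odd), but each ray is shared by two bases, so any assignment contributes an even count.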

  19. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  20. A Cryogenic Test Station for the Pre-series 2400 W @ 1.8 K Refrigeration Units for the LHC

    CERN Document Server

    Claudet, S; Gully, P; Jäger, B; Millet, F; Roussel, P; Tavian, L

    2002-01-01

    The cooling capacity below 2 K for the superconducting magnets in the Large Hadron Collider (LHC), at CERN, will be provided by eight refrigeration units at 1.8 K, each of them coupled to a 4.5 K refrigerator. The supply of the series units is linked to successful testing and acceptance of the pre-series delivered by the two selected vendors. To properly assess the performance of specific components such as cold compressors and some process specificities a dedicated test station is necessary. The test station is able to process up to 130 g/s between 4.5 & 20 K and aims at simulating the steady and transient operational modes foreseen for the LHC. After recalling the basic characteristics of the 1.8 K refrigeration units and the content of the acceptance tests of the pre-series, the principle of the test cryostat is detailed. The components of the test station and corresponding layout are described. The first testing experience is presented as well as preliminary results of the pre-series units.

  1. BETASCAN: probable beta-amyloids identified by pairwise probabilistic analysis.

    Directory of Open Access Journals (Sweden)

    Allen W Bryan

    2009-03-01

    Amyloids and prion proteins are clinically and biologically important beta-structures, whose supersecondary structures are difficult to determine by standard experimental or computational means. In addition, significant conformational heterogeneity is known or suspected to exist in many amyloid fibrils. Recent work has indicated the utility of pairwise probabilistic statistics in beta-structure prediction. We develop here a new strategy for beta-structure prediction, emphasizing the determination of beta-strands and pairs of beta-strands as fundamental units of beta-structure. Our program, BETASCAN, calculates likelihood scores for potential beta-strands and strand-pairs based on correlations observed in parallel beta-sheets. The program then determines the strands and pairs with the greatest local likelihood for all of the sequence's potential beta-structures. BETASCAN suggests multiple alternate folding patterns and assigns relative a priori probabilities based solely on amino acid sequence, probability tables, and pre-chosen parameters. The algorithm compares favorably with the results of previous algorithms (BETAPRO, PASTA, SALSA, TANGO, and Zyggregator) in beta-structure prediction and amyloid propensity prediction. Accurate prediction is demonstrated for experimentally determined amyloid beta-structures, for a set of known beta-aggregates, and for the parallel beta-strands of beta-helices, amyloid-like globular proteins. BETASCAN is able both to detect beta-strands with higher sensitivity and to detect the edges of beta-strands in a richly beta-like sequence. For two proteins (Abeta and Het-s), there exist multiple sets of experimental data implying contradictory structures; BETASCAN is able to detect each competing structure as a potential structure variant.
The ability to correlate multiple alternate beta-structures to experiment opens the possibility of computational investigation of prion strains and structural heterogeneity of amyloid

  2. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    finite set can occur as the outcome distribution of a quantum-mechanical von Neumann measurement with postselection, given that the scalar product between the initial and the final state is known as well as the success probability of the postselection. An intermediate von Neumann measurement can enhance transition probabilities between states such that the error probability shrinks by a factor of up to 2. Chapter 4: A presentation of the category of stochastic matrices. This chapter gives generators and relations for the strict monoidal category of probabilistic maps on finite cardinals (i.e., stochastic matrices). Chapter 5: Convex Spaces: Definition and Examples. We try to promote convex spaces as an abstract concept of convexity which was introduced by Stone as "barycentric calculus". A convex space is a set where one can take convex combinations in a consistent way. By identifying the corresponding Lawvere theory as the category from chapter 4 and using the results obtained there, we give a different proof of a result of Swirszcz which shows that convex spaces can be identified with algebras of a finitary version of the Giry monad. After giving an extensive list of examples of convex sets as they appear throughout mathematics and theoretical physics, we note that there also exist convex spaces that cannot be embedded into a vector space: semilattices are a class of examples of purely combinatorial type. In an information-theoretic interpretation, convex subsets of vector spaces are probabilistic, while semilattices are possibilistic. Convex spaces unify these two concepts. (orig.)

  3. Probability of spent fuel transportation accidents

    International Nuclear Information System (INIS)

    McClure, J.D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated probability of a spent fuel transport accident of 5 × 10⁻⁷ accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding the regulatory test for impact is approximately 10⁻⁹ per mile.
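A per-mile rate like the one quoted converts to expected accident counts through a Poisson model. The sketch below uses the reported 5 × 10⁻⁷ per-mile rate; the annual shipment mileage is an assumption for illustration, not a figure from the report:

```python
import math

# Illustrative Poisson arithmetic on the reported accident rate.
rate = 5e-7           # spent fuel accidents per shipment-mile (from report)
miles_per_year = 1e5  # hypothetical total annual shipment miles (assumption)

lam = rate * miles_per_year          # expected accidents per year
p_at_least_one = 1 - math.exp(-lam)  # Poisson P(N >= 1) in a year
years_to_expect_one = 1 / lam        # mean waiting time between accidents
```

For these assumed mileages the expected count is 0.05 accidents per year, i.e. roughly one accident expected every 20 years of shipping at that volume.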

  4. Cosmic Bell Test: Measurement Settings from Milky Way Stars

    Science.gov (United States)

    Handsteiner, Johannes; Friedman, Andrew S.; Rauch, Dominik; Gallicchio, Jason; Liu, Bo; Hosp, Hannes; Kofler, Johannes; Bricher, David; Fink, Matthias; Leung, Calvin; Mark, Anthony; Nguyen, Hien T.; Sanders, Isabella; Steinlechner, Fabian; Ursin, Rupert; Wengerowsky, Sören; Guth, Alan H.; Kaiser, David I.; Scheidl, Thomas; Zeilinger, Anton

    2017-02-01

    Bell's theorem states that some predictions of quantum mechanics cannot be reproduced by a local-realist theory. That conflict is expressed by Bell's inequality, which is usually derived under the assumption that there are no statistical correlations between the choices of measurement settings and anything else that can causally affect the measurement outcomes. In previous experiments, this "freedom of choice" was addressed by ensuring that selection of measurement settings via conventional "quantum random number generators" was spacelike separated from the entangled particle creation. This, however, left open the possibility that an unknown cause affected both the setting choices and measurement outcomes as recently as mere microseconds before each experimental trial. Here we report on a new experimental test of Bell's inequality that, for the first time, uses distant astronomical sources as "cosmic setting generators." In our tests with polarization-entangled photons, measurement settings were chosen using real-time observations of Milky Way stars while simultaneously ensuring locality. Assuming fair sampling for all detected photons, and that each stellar photon's color was set at emission, we observe statistically significant ≳7.31σ and ≳11.93σ violations of Bell's inequality with estimated p values of ≲1.8 × 10⁻¹³ and ≲4.0 × 10⁻³³, respectively, thereby pushing back by ~600 years the most recent time by which any local-realist influences could have engineered the observed Bell violation.
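The relation between the quoted significance levels and p values follows from the Gaussian tail. The sketch below converts an n-sigma deviation to a one-sided p value; the results land in the same range as the figures quoted in the abstract (the paper's exact statistics may differ):

```python
import math

# One-sided Gaussian tail probability: p = P(Z > n_sigma) = erfc(n/sqrt(2))/2
def p_value(n_sigma):
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

p1 = p_value(7.31)   # same order as the quoted ~1.8e-13
p2 = p_value(11.93)  # same order as the quoted ~4.0e-33
```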

  5. Modeling the probability distribution of peak discharge for infiltrating hillslopes

    Science.gov (United States)

    Baiamonte, Giorgio; Singh, Vijay P.

    2017-07-01

    Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted as GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that with the increase of probability, the expected effect of ASMC to increase the maximum discharge diminishes. Only for low values of probability, the critical duration of rainfall is influenced by ASMC, whereas its effect on the peak discharge seems to be less for any probability. For a set of parameters, the derived probability distribution of peak discharge seems to be fitted by the gamma distribution well. Finally, an application to a small watershed was carried out, with the aim of testing the possibility of preparing in advance the rational runoff coefficient tables to be used for the rational method, together with a comparison between peak discharges obtained by the GABS model and those measured in an experimental flume for a loamy-sand soil.
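Of the three coupled models, the Green-Ampt component can be sketched compactly. The snippet below uses illustrative soil parameters, not the paper's calibration; it solves the implicit Green-Ampt equation for cumulative infiltration by fixed-point iteration, and the moisture deficit Δθ is where the antecedent soil moisture condition (ASMC) enters:

```python
import math

# Minimal Green-Ampt sketch (illustrative parameters). Cumulative
# infiltration F(t) under ponding solves the implicit equation
#   F = Ks*t + psi*dtheta*ln(1 + F/(psi*dtheta))
Ks = 10.0      # saturated hydraulic conductivity (mm/h)
psi = 110.0    # wetting-front suction head (mm)
dtheta = 0.3   # moisture deficit; this is where ASMC enters the model

def cumulative_infiltration(t, tol=1e-10):
    """Solve the implicit Green-Ampt equation by fixed-point iteration."""
    a = psi * dtheta
    F = Ks * t  # initial guess
    for _ in range(200):
        F_new = Ks * t + a * math.log(1.0 + F / a)
        if abs(F_new - F) < tol:
            break
        F = F_new
    return F

def infiltration_rate(F):
    # instantaneous rate f = Ks * (1 + psi*dtheta / F), decaying toward Ks
    return Ks * (1.0 + psi * dtheta / F)

F2 = cumulative_infiltration(2.0)  # mm infiltrated after 2 h of ponding
f2 = infiltration_rate(F2)         # rate (mm/h), still above Ks
```

A wetter antecedent state (smaller Δθ) lowers F(t) and the infiltration rate, which is why ASMC propagates into the peak-discharge distribution.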

  6. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    The influence of "Grundbegriffe" by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory "calling for" an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures) and observables (dual maps to random variables) have very different "mathematical nature". Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events and elementary category theory, covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing "fractions" of classical random events, and we upgrade the notions of probability measure and random variable.

  7. Exploring the initial steps of the testing process: frequency and nature of pre-preanalytic errors.

    Science.gov (United States)

    Carraro, Paolo; Zago, Tatiana; Plebani, Mario

    2012-03-01

    Few data are available on the nature of errors in the so-called pre-preanalytic phase, the initial steps of the testing process. We therefore sought to evaluate pre-preanalytic errors using a study design that enabled us to observe the initial procedures performed in the ward, from the physician's test request to the delivery of specimens in the clinical laboratory. After a 1-week direct observational phase designed to identify the operating procedures followed in 3 clinical wards, we recorded all nonconformities and errors occurring over a 6-month period. Overall, the study considered 8547 test requests, for which 15 917 blood sample tubes were collected and 52 982 tests undertaken. No significant differences in error rates were found between the observational phase and the overall study period, but underfilling of coagulation tubes was found to occur more frequently in the direct observational phase (P = 0.043). In the overall study period, the frequency of errors was found to be particularly high regarding order transmission [29 916 parts per million (ppm)] and hemolysed samples (2537 ppm). The frequency of patient misidentification was 352 ppm, and the most frequent nonconformities were test requests recorded in the diary without the patient's name and failure to check the patient's identity at the time of blood draw. The data collected in our study confirm the relative frequency of pre-preanalytic errors and underline the need to consensually prepare and adopt effective standard operating procedures in the initial steps of laboratory testing and to monitor compliance with these procedures over time.

  8. Bias-Free Chemically Diverse Test Sets from Machine Learning.

    Science.gov (United States)

    Swann, Ellen T; Fernandez, Michael; Coote, Michelle L; Barnard, Amanda S

    2017-08-14

    Current benchmarking methods in quantum chemistry rely on databases that are built using a chemist's intuition. It is not fully understood how diverse or representative these databases truly are. Multivariate statistical techniques like archetypal analysis and K-means clustering have previously been used to summarize large sets of nanoparticles; however, molecules are more diverse and not as easily characterized by descriptors. In this work, we compare three sets of descriptors based on the one-, two-, and three-dimensional structure of a molecule. Using data from the NIST Computational Chemistry Comparison and Benchmark Database and machine learning techniques, we demonstrate the functional relationship between these structural descriptors and the electronic energy of molecules. Archetypes and prototypes found with topological or Coulomb matrix descriptors can be used to identify smaller, statistically significant test sets that better capture the diversity of chemical space. We apply this same method to find a diverse subset of organic molecules to demonstrate how the methods can easily be reapplied to individual research projects. Finally, we use our bias-free test sets to assess the performance of density functional theory and quantum Monte Carlo methods.
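The prototype-selection idea can be sketched with a plain K-means pass. The snippet below uses random vectors as stand-ins for molecular descriptors (not the paper's data or descriptor sets): cluster descriptor space, then keep the molecule nearest each centroid as a prototype, yielding a small set that covers the space:

```python
import numpy as np

# K-means prototype selection sketch; descriptors are random stand-ins.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 5))          # 200 "molecules" x 5 descriptors

def kmeans(X, k, iters=50):
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):             # keep old center if a cluster empties
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

centers, labels = kmeans(X, k=10)
dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
prototype_idx = dists.argmin(axis=0)   # one representative molecule per cluster
```

Archetypal analysis differs in that it seeks extreme points (archetypes) rather than central ones, but the selection step is analogous.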

  9. Mineralogic and petrologic investigation of pre-test core samples from the spent fuel test-climax

    International Nuclear Information System (INIS)

    Ryerson, F.J.; Qualheim, B.J.

    1983-12-01

    Pre-test samples obtained from just inside the perimeter of the canister emplacement holes of the Spent Fuel Test-Climax have been characterized by petrographic and microanalytical techniques. The primary quartz monzonite has undergone various degrees of hydrothermal alteration as a result of natural processes. Alteration is most apparent on primary plagioclase and biotite. The most common secondary phases on plagioclase are muscovite and calcite, while the most common secondary phases on biotite are epidote and chlorite. The major alteration zones encountered are localized along filled fractures, i.e. veins. The thickness and mineralogy of the alteration zones can be correlated with the vein mineralogy, becoming wider and more complex mineralogically when the veins contain calcite. 7 references, 10 figures, 4 tables

  10. Covariance Based Pre-Filters and Screening Criteria for Conjunction Analysis

    Science.gov (United States)

    George, E., Chan, K.

    2012-09-01

    Several relationships are developed relating object size, initial covariance and range at closest approach to probability of collision. These relationships address the following questions:
    - Given the objects' initial covariance and combined hard body size, what is the maximum possible value of the probability of collision (Pc)?
    - Given the objects' initial covariance, what is the maximum combined hard body radius for which the probability of collision does not exceed the tolerance limit?
    - Given the objects' initial covariance and the combined hard body radius, what is the minimum miss distance for which the probability of collision does not exceed the tolerance limit?
    - Given the objects' initial covariance and the miss distance, what is the maximum combined hard body radius for which the probability of collision does not exceed the tolerance limit?
    The first relationship above allows the elimination of object pairs from conjunction analysis (CA) on the basis of the initial covariance and hard-body sizes of the objects. The application of this pre-filter to present day catalogs with estimated covariance results in the elimination of approximately 35% of object pairs as unable to ever conjunct with a probability of collision exceeding 1 × 10⁻⁶. Because Pc is directly proportional to object size and inversely proportional to covariance size, this pre-filter will have a significantly larger impact on future catalogs, which are expected to contain a much larger fraction of small debris tracked only by a limited subset of available sensors. This relationship also provides a mathematically rigorous basis for eliminating objects from analysis entirely based on element set age or quality - a practice commonly done by rough rules of thumb today. Further, these relations can be used to determine the required geometric screening radius for all objects. This analysis reveals the screening volumes for small objects are much larger than needed, while the screening volumes for
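The quantity all of these questions bound is a 2D probability-of-collision integral in the encounter plane. The sketch below (illustrative numbers and an isotropic covariance, not the paper's relations) evaluates Pc by integrating a circular Gaussian over the combined hard-body disc, and shows how Pc collapses with miss distance, which is what makes covariance-based pre-filtering effective:

```python
import math

# 2D Pc sketch: bivariate normal (std. dev. sigma, isotropic) integrated
# over a disc of radius R (combined hard body) at miss distance m.
def collision_probability(m, R, sigma, n=400):
    total = 0.0
    for i in range(n):                       # polar grid over the disc
        r = (i + 0.5) * R / n
        for j in range(n):
            th = (j + 0.5) * 2.0 * math.pi / n
            x = m + r * math.cos(th)
            y = r * math.sin(th)
            pdf = math.exp(-(x * x + y * y) / (2.0 * sigma * sigma)) \
                  / (2.0 * math.pi * sigma * sigma)
            total += pdf * r * (R / n) * (2.0 * math.pi / n)
    return total

# With R << sigma, the head-on Pc is about R^2 / (2*sigma^2); at ten
# sigma of miss distance the probability is negligible.
pc_head_on = collision_probability(m=0.0, R=10.0, sigma=100.0)
pc_far = collision_probability(m=1000.0, R=10.0, sigma=100.0)
```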

  11. Does rational selection of training and test sets improve the outcome of QSAR modeling?

    Science.gov (United States)

    Martin, Todd M; Harten, Paul; Young, Douglas M; Muratov, Eugene N; Golbraikh, Alexander; Zhu, Hao; Tropsha, Alexander

    2012-10-22

    Prior to using a quantitative structure-activity relationship (QSAR) model for external predictions, its predictive power should be established and validated. In the absence of a true external data set, the best way to validate the predictive ability of a model is to perform statistical external validation, in which the overall data set is divided into training and test sets. Commonly, this splitting is performed using random division. Rational splitting methods instead divide data sets into training and test sets in an intelligent fashion. The purpose of this study was to determine whether rational division methods lead to more predictive models than random division. A special data-splitting procedure was used to facilitate the comparison between random and rational division methods. For each toxicity end point, the overall data set was divided into a modeling set (80% of the overall set) and an external evaluation set (20% of the overall set) using random division. The modeling set was then subdivided into a training set (80% of the modeling set) and a test set (20% of the modeling set) using both rational division methods and random division. The Kennard-Stone, minimal test set dissimilarity, and sphere exclusion algorithms were used as the rational division methods. The hierarchical clustering, random forest, and k-nearest neighbor (kNN) methods were used to develop QSAR models based on the training sets. For kNN QSAR, multiple training and test sets were generated, and multiple QSAR models were built. The results of this study indicate that models based on rational division methods generate better statistical results for the test sets than models based on random division, but the predictive power of both types of models is comparable.
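
    Of the rational division methods named above, the Kennard-Stone algorithm is the simplest to sketch: it greedily builds the training set by maximum minimum-distance coverage of descriptor space. A minimal illustration (not the authors' implementation; Euclidean distance on random descriptors is assumed):

```python
import numpy as np

def kennard_stone(X, n_train):
    """Greedy Kennard-Stone selection: seed with the two most distant
    points, then repeatedly add the point whose minimum distance to the
    already-selected set is largest. Returns sorted training indices."""
    X = np.asarray(X, dtype=float)
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Seed with the pair of points farthest apart.
    i, j = np.unravel_index(np.argmax(dist), dist.shape)
    selected = [i, j]
    remaining = set(range(len(X))) - set(selected)
    while len(selected) < n_train:
        rem = list(remaining)
        # For each candidate, distance to its nearest selected point.
        min_d = dist[np.ix_(rem, selected)].min(axis=1)
        pick = rem[int(np.argmax(min_d))]
        selected.append(pick)
        remaining.remove(pick)
    return sorted(selected)

# Hypothetical 20-compound descriptor matrix, 80/20 split as in the study.
X = np.random.default_rng(1).normal(size=(20, 3))
train_idx = kennard_stone(X, 16)
test_idx = [k for k in range(20) if k not in train_idx]
```

    Because the training points are spread to cover descriptor space, the held-out test points always lie near some training point, which is one intuition for why rational splits flatter test-set statistics.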

  12. Qualitative pre-test of Energy Star advertising : final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-01-01

    Natural Resources Canada launched a print advertising campaign and a 30-second television commercial to promote the Energy Star symbol and to acquaint the public with the program, which identifies energy-efficient products that reduce energy use, save money and reduce the greenhouse gas emissions that contribute to climate change. The Communications Branch of Natural Resources Canada wanted to pre-test the television and print ads. Each print ad focused on a particular product category: home comfort, appliances, electronics and office equipment. A qualitative research methodology was used in the pre-testing because it is the best learning tool for understanding the range and depth of reactions toward a subject at a given time. The findings are not quantifiable because they are not representative of the population at large. Ten focus groups with a total of 83 participants aged 18 to 54 were surveyed in January 2003 in 5 Canadian centres. The target groups included people who were informed about climate change issues as well as those who were not. Participants were questioned about the Energy Star program. Findings were consistent across all 5 locations. There was some general awareness of EnerGuide on appliances in all groups, but generally low awareness of the Energy Star symbol. Most people did not place energy efficiency as a high priority when purchasing appliances. This report presents the main findings on attitudes towards climate change, Kyoto and energy efficiency. Reactions to the television and print ads are also included, along with opinions regarding their main weaknesses and strengths. Some recommendations for improvement are included. Samples of the print advertisements are included in both English and French. tabs., figs.

  13. Introducing evidence based medicine to the journal club, using a structured pre and post test: a cohort study

    Directory of Open Access Journals (Sweden)

    Mahoney Martin C

    2001-11-01

    Full Text Available Abstract Background Journal Club at a University-based residency program was restructured to introduce, reinforce and evaluate residents' understanding of the concepts of Evidence Based Medicine. Methods Over the course of a year, structured pre- and post-tests were developed for use during each Journal Club. Questions were derived from the articles being reviewed. Performance on the key concepts of Evidence Based Medicine was assessed. Study subjects were 35 PGY2 and PGY3 residents in a University-based Family Practice program. Results Performance on the pre-test demonstrated a significant improvement over the course of the year, from a median of 54.5% to 78.9% (F 89.17, p Conclusions Following organizational revision, the introduction of a pre-test/post-test instrument supported achievement of the learning objectives, with a better understanding and utilization of the concepts of Evidence Based Medicine.

  14. JPSS-1 VIIRS Pre-Launch Radiometric Performance

    Science.gov (United States)

    Oudrari, Hassan; McIntire, Jeff; Xiong, Xiaoxiong; Butler, James; Efremova, Boryana; Ji, Jack; Lee, Shihyan; Schwarting, Tom

    2015-01-01

    The Visible Infrared Imaging Radiometer Suite (VIIRS) on board the first Joint Polar Satellite System (JPSS) completed its sensor-level testing in December 2014. The JPSS-1 (J1) mission is scheduled to launch in December 2016 and will be very similar to the Suomi National Polar-orbiting Partnership (SNPP) mission. The VIIRS instrument was designed to provide measurements of the globe twice daily. It is a wide-swath (3,040 kilometers) cross-track scanning radiometer with spatial resolutions of 370 and 740 meters at nadir for imaging and moderate bands, respectively. It covers the wavelength spectrum from reflective to long-wave infrared through 22 spectral bands (0.412 microns to 12.01 microns). VIIRS observations are used to generate 22 environmental data records (EDRs). This paper briefly describes the J1 VIIRS characterization and calibration performance and the methodologies executed during the pre-launch testing phases by the independent government team to generate the at-launch baseline radiometric performance and the metrics needed to populate the sensor data record (SDR) look-up tables (LUTs). The paper also provides an assessment of the sensor's pre-launch radiometric performance, including the signal-to-noise ratios (SNRs), dynamic range, reflective and emissive band calibration performance, polarization sensitivity, band spectral performance, response-versus-scan (RVS), and near-field and stray light responses. A set of performance metrics generated during the pre-launch testing program is compared to the SNPP VIIRS pre-launch performance.

  15. Understanding Pre-Quantitative Risk in Projects

    Science.gov (United States)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.

  16. IEEE guide for planning of pre-operational testing programs for class 1E power systems for nuclear-power generating stations

    International Nuclear Information System (INIS)

    Anon.

    1976-01-01

    The Institute of Electrical and Electronics Engineers (IEEE) guide for pre-operational testing of Class 1E power systems for nuclear power generating stations is presented. The guidelines apply to both the ac and dc power supplies, but not to the equipment that utilizes the ac and dc power. The pre-operational tests are performed after the appropriate construction tests.

  17. Fracture Testing of Honeycomb Core Sandwich Composites Using the DCB-UBM Test

    DEFF Research Database (Denmark)

    Saseendran, Vishnu; Berggreen, Christian; Carlsson, Leif A.

    2015-01-01

    of Linear Elastic Fracture Mechanics (LEFM). The Double Cantilever Beam subjected to Uneven Bending Moments (DCB-UBM) test set-up, which was introduced by Sørensen et al. [1], circumvents any dependency on the pre-crack length in the calculation of Gc. The new test setup is based on rotary actuators which

  18. Pre-HEAT: submillimeter site testing and astronomical spectra from Dome A, Antarctica

    Science.gov (United States)

    Kulesa, C. A.; Walker, C. K.; Schein, M.; Golish, D.; Tothill, N.; Siegel, P.; Weinreb, S.; Jones, G.; Bardin, J.; Jacobs, K.; Martin, C. L.; Storey, J.; Ashley, M.; Lawrence, J.; Luong-Van, D.; Everett, J.; Wang, L.; Feng, L.; Zhu, Z.; Yan, J.; Yang, J.; Zhang, X.-G.; Cui, X.; Yuan, X.; Hu, J.; Xu, Z.; Jiang, Z.; Yang, H.; Li, Y.; Sun, B.; Qin, W.; Shang, Z.

    2008-07-01

    Pre-HEAT is a 20 cm aperture submillimeter-wave telescope with a 660 GHz (450 micron) Schottky diode heterodyne receiver and digital FFT spectrometer for the Plateau Observatory (PLATO) developed by the University of New South Wales. In January 2008 it was deployed to Dome A, the summit of the Antarctic plateau, as part of a scientific traverse led by the Polar Research Institute of China and the Chinese Academy of Sciences. Dome A may be one of the best sites in the world for ground-based Terahertz astronomy, based on the exceptionally cold, dry and stable conditions which prevail there. Pre-HEAT is measuring the 450 micron sky opacity at Dome A and mapping the Galactic Plane in the 13CO J=6-5 line, constituting the first submillimeter measurements from Dome A. It is field-testing many of the key technologies for its namesake -- a successor mission called HEAT: the High Elevation Antarctic Terahertz telescope. Exciting prospects for submillimeter astronomy from Dome A and the status of Pre-HEAT will be presented.

  19. Probability density fittings of corrosion test-data: Implications on ...

    Indian Academy of Sciences (India)

    Steel-reinforced concrete; probability distribution functions; corrosion ... to be present in the corrosive system at a suitable concentration (Holoway et al 2004; Söylev & ...) voltage, equivalent to voltage drop, across a resistor divided by the ...

  20. Are RA patients from a non-endemic HCV population screened for HCV? A cross-sectional analysis of three different settings.

    Science.gov (United States)

    Skinner-Taylor, Cassandra Michelle; Erhard-Ramírez, Alejandro; Garza-Elizondo, Mario Alberto; Esquivel-Valerio, Jorge Antonio; Abud-Mendoza, Carlos; Martínez-Martínez, Marco Ulises; Vega-Morales, David; Arana-Guajardo, Ana

    In Mexico, other risk factors are associated with hepatitis C virus (HCV) infection: prior heroin use, living alone, widowhood, and residence in the northern region. Rheumatoid arthritis (RA) patients are considered immunosuppressed, and HCV testing is recommended before treatment. The aim of the study was to describe the characteristics of HCV testing in RA patients in three different medical care settings in a non-endemic area. A retrospective observational study was performed using medical records from 960 RA patients, describing the indications for HCV testing. The test was performed in 28.6% of patients, and the overall HCV frequency was 0.36%. Population characteristics were not associated with an increased risk of HCV infection; accordingly, anti-HCV positivity was low. The main reason for testing was before starting biological agents. Given the low pre-test probability, testing for HCV infection should be personalized, i.e., according to disease prevalence in the particular geographical location and individual risk factors. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Reumatología y Colegio Mexicano de Reumatología. All rights reserved.
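
    The role of pre-test probability here follows directly from Bayes' rule: at a prevalence of 0.36%, even a highly accurate test yields mostly false positives. A short sketch (the 99% sensitivity and specificity figures are illustrative assumptions, not values from the study):

```python
def post_test_probability(pre_test, sensitivity, specificity):
    """Positive post-test probability via Bayes' rule:
    P(disease | positive) = sens*p / (sens*p + (1 - spec)*(1 - p))."""
    p = pre_test
    true_pos = sensitivity * p
    false_pos = (1.0 - specificity) * (1.0 - p)
    return true_pos / (true_pos + false_pos)

# At the 0.36% frequency reported above, a hypothetical 99%/99% test
# still leaves most positive results false.
ppv = post_test_probability(0.0036, 0.99, 0.99)
```

    The same test applied to a high-prevalence population gives a much higher positive predictive value, which is the quantitative core of the "personalize testing by pre-test probability" recommendation.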

  1. Correlation Between Screening Mammography Interpretive Performance on a Test Set and Performance in Clinical Practice.

    Science.gov (United States)

    Miglioretti, Diana L; Ichikawa, Laura; Smith, Robert A; Buist, Diana S M; Carney, Patricia A; Geller, Berta; Monsees, Barbara; Onega, Tracy; Rosenberg, Robert; Sickles, Edward A; Yankaskas, Bonnie C; Kerlikowske, Karla

    2017-10-01

    Evidence is inconsistent about whether radiologists' interpretive performance on a screening mammography test set reflects their performance in clinical practice. This study aimed to estimate the correlation between test set and clinical performance and to determine whether the correlation is influenced by cancer prevalence or lesion difficulty in the test set. This institutional review board-approved study randomized 83 radiologists from six Breast Cancer Surveillance Consortium registries to assess one of four test sets of 109 screening mammograms each; 48 radiologists completed a fifth test set of 110 mammograms 2 years later. Test sets differed in the number of cancer cases and the difficulty of lesion detection. Test set sensitivity and specificity were estimated using woman-level and breast-level recall with cancer status and expert opinion as gold standards. Clinical performance was estimated using woman-level recall with cancer status as the gold standard. Spearman rank correlations between test set and clinical performance with 95% confidence intervals (CI) were estimated. For test sets with fewer cancers (N = 15) that were more difficult to detect, correlations were weak to moderate for sensitivity (woman level = 0.46, 95% CI = 0.16, 0.69; breast level = 0.35, 95% CI = 0.03, 0.61) and weak for specificity (0.24, 95% CI = 0.01, 0.45) relative to expert recall. Correlations for test sets with more cancers (N = 30) were close to 0 and not statistically significant. Correlations between screening performance on a test set and performance in clinical practice are not strong. Test set performance more accurately reflects performance in clinical practice if cancer prevalence is low and lesions are challenging to detect. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  2. Pre-Algebra Lexicon.

    Science.gov (United States)

    Hayden, Dunstan; Cuevas, Gilberto

    The pre-algebra lexicon is a set of classroom exercises designed to teach the technical words and phrases of pre-algebra mathematics, and includes the terms most commonly found in related mathematics courses. The lexicon has three parts, each with its own introduction. The first introduces vocabulary items in three groups forming a learning…

  3. The Sequential Probability Ratio Test: An efficient alternative to exact binomial testing for Clean Water Act 303(d) evaluation.

    Science.gov (United States)

    Chen, Connie; Gribble, Matthew O; Bartroff, Jay; Bay, Steven M; Goldstein, Larry

    2017-05-01

    The United States' Clean Water Act stipulates in section 303(d) that states must identify impaired water bodies, for which total maximum daily loads (TMDLs) of pollution inputs are then developed. Decision-making procedures about how to list, or delist, water bodies as impaired, or not, per Clean Water Act 303(d) differ across states. In states such as California, whether or not a particular monitoring sample suggests that water quality is impaired can be regarded as a binary outcome variable, and California's current regulatory framework invokes a version of the exact binomial test to consolidate evidence across samples and assess whether the overall water body complies with the Clean Water Act. Here, we contrast the performance of California's exact binomial test with one potential alternative, the Sequential Probability Ratio Test (SPRT). The SPRT uses a sequential testing framework, testing samples as they become available and evaluating evidence as it emerges, rather than measuring all the samples and calculating a test statistic at the end of the data collection process. Through simulations and theoretical derivations, we demonstrate that the SPRT on average requires fewer samples to achieve Type I and Type II error rates comparable to the current fixed-sample binomial test. Policymakers might consider efficient alternatives such as the SPRT to the current procedure. Copyright © 2017 Elsevier Ltd. All rights reserved.
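
    Wald's SPRT for a binary exceedance outcome can be sketched in a few lines. The rates, error levels, and sample sequence below are illustrative, not California's regulatory parameters:

```python
import math

def sprt_bernoulli(samples, p0, p1, alpha=0.05, beta=0.05):
    """Wald's Sequential Probability Ratio Test for a Bernoulli rate.

    H0: exceedance probability = p0 vs H1: = p1 (with p1 > p0).
    Processes samples one at a time and stops as soon as the cumulative
    log-likelihood ratio crosses either decision boundary.
    Returns ('accept_h0' | 'accept_h1' | 'continue', samples used).
    """
    upper = math.log((1 - beta) / alpha)   # crossing -> accept H1
    lower = math.log(beta / (1 - alpha))   # crossing -> accept H0
    llr = 0.0
    for n, x in enumerate(samples, start=1):
        if x:  # sample exceeds the water-quality standard
            llr += math.log(p1 / p0)
        else:
            llr += math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "accept_h1", n
        if llr <= lower:
            return "accept_h0", n
    return "continue", len(samples)

# A run of mostly-exceedance samples triggers the "impaired" decision
# (H1) well before all ten samples are measured.
decision, n_used = sprt_bernoulli([1, 1, 1, 0, 1, 1, 1, 1, 1, 1],
                                  p0=0.1, p1=0.5)
```

    Early stopping on strong evidence is exactly the source of the sample savings the abstract describes relative to the fixed-sample binomial test.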

  4. Pre-operational proof and leakage rate testing requirements for concrete containment structures for CANDU nuclear power plants

    International Nuclear Information System (INIS)

    1994-02-01

    This Standard provides the requirements for pre-operational proof tests and leakage rate tests of concrete containment structures of a containment system designed as Class Containment components. 1 fig

  5. Modeling experiments using quantum and Kolmogorov probability

    International Nuclear Information System (INIS)

    Hess, Karl

    2008-01-01

    Criteria are presented that permit a straightforward partition of experiments into sets that can be modeled using both quantum probability and the classical probability framework of Kolmogorov. These new criteria concentrate on the operational aspects of the experiments and lead beyond the commonly appreciated partition by relating experiments to commuting and non-commuting quantum operators as well as non-entangled and entangled wavefunctions. In other words, the space of experiments that can be understood using classical probability is larger than usually assumed. This knowledge provides advantages for areas such as nanoscience and engineering or quantum computation.

  6. Elapsed decision time affects the weighting of prior probability in a perceptual decision task

    Science.gov (United States)

    Hanks, Timothy D.; Mazurek, Mark E.; Kiani, Roozbeh; Hopp, Elizabeth; Shadlen, Michael N.

    2012-01-01

    Decisions are often based on a combination of new evidence with prior knowledge of the probable best choice. Optimal combination requires knowledge about the reliability of evidence, but in many realistic situations, this is unknown. Here we propose and test a novel theory: the brain exploits elapsed time during decision formation to combine sensory evidence with prior probability. Elapsed time is useful because (i) decisions that linger tend to arise from less reliable evidence, and (ii) the expected accuracy at a given decision time depends on the reliability of the evidence gathered up to that point. These regularities allow the brain to combine prior information with sensory evidence by weighting the latter in accordance with reliability. To test this theory, we manipulated the prior probability of the rewarded choice while subjects performed a reaction-time discrimination of motion direction using a range of stimulus reliabilities that varied from trial to trial. The theory explains the effect of prior probability on choice and reaction time over a wide range of stimulus strengths. We found that prior probability was incorporated into the decision process as a dynamic bias signal that increases as a function of decision time. This bias signal depends on the speed-accuracy setting of human subjects, and it is reflected in the firing rates of neurons in the lateral intraparietal cortex (LIP) of rhesus monkeys performing this task. PMID:21525274

  7. Combined use of clinical pre-test probability and D-dimer test in the diagnosis of preoperative deep venous thrombosis in colorectal cancer patients

    DEFF Research Database (Denmark)

    Stender, Mogens; Frøkjaer, Jens Brøndum; Hagedorn Nielsen, Tina Sandie

    2008-01-01

    The preoperative prevalence of deep venous thrombosis (DVT) in patients with colorectal cancer may be as high as 8%. In order to minimize the risk of pulmonary embolism, it is important to rule out preoperative DVT. A large study has confirmed that a negative D-dimer test in combination with a low...... preoperative DVT in colorectal cancer patients admitted for surgery. Preoperative D-dimer test and compression ultrasonography for DVT were performed in 193 consecutive patients with newly diagnosed colorectal cancer. Diagnostic accuracy indices of the D-dimer test were assessed according to the PTP score...... in ruling out preoperative DVT in colorectal cancer patients admitted for surgery....

  8. Validation of pre-coated ELISA tests to detect antibodies against T. congolense and T. vivax

    International Nuclear Information System (INIS)

    Shumba, W.

    2000-01-01

    The anti-trypanosomal antibody-detecting enzyme-linked immunosorbent assay (ELISA) was first described in 1977 and was further developed for use in large-scale surveys in Zimbabwe. More recently, the IAEA initiated a programme to improve the robustness and standardisation of the assay. The IAEA supplied plates pre-coated with either a crude T. congolense or T. vivax antigen and the reagents necessary for analysing samples. Parasitologically positive and negative sera were used to validate and determine the cut-off values of the two tests. The samples were tested and the results analysed using a variety of cut-off values. The tests provided similar information, although the T. congolense pre-coated plates gave significantly higher optical density values than the plates coated with T. vivax. Sensitivity and specificity values were calculated using the different cut-off points. Results indicate that the test using T. congolense antigen had the highest specificity and sensitivity for a given cut-off value. Although the test could distinguish positive from negative sera, it was quite difficult to provide a single suitable cut-off value; the value should be dictated by the use of the test. (author)

  9. Learning to Work with Databases in Astronomy: Quantitative Analysis of Science Educators' and Students' Pre-/Post-Tests

    Science.gov (United States)

    Schwortz, Andria C.; Burrows, Andrea C.; Myers, Adam D.

    2015-01-01

    Astronomy is increasingly moving towards working with large databases, from the state-of-the-art Sloan Digital Sky Survey Data Release 10 to the historical Digital Access to a Sky Century at Harvard. Non-astronomy fields tend to work with large datasets as well, be it in the form of warehouse inventory, health trends, or the stock market. However, very few fields explicitly teach students the skills necessary to analyze such data. The authors studied a matched set of 37 participants working with 200-entry databases in astronomy using Google Spreadsheets, with limited information about a random set of quasars drawn from SDSS DR5. Here the authors present the quantitative results from an eight-question pre-/post-test, with questions designed to span Bloom's taxonomy, on both the skills of using spreadsheets and the content of quasars. Participants included both Astro 101 summer students and professionals, including in-service K-12 teachers and science communicators. All groups showed statistically significant gains (as per Hake, 1998), with the greatest difference between women's gains of 0.196 and men's of 0.480.
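
    The gains cited follow Hake's (1998) normalized gain, g = (post - pre) / (100 - pre): the fraction of the possible improvement actually achieved. A small sketch (the pre/post percentages below are hypothetical; only the resulting gains 0.196 and 0.480 appear in the study):

```python
def normalized_gain(pre_pct, post_pct):
    """Hake (1998) normalized gain: the fraction of the available
    headroom (100 - pre) that the post-test score actually captured."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Hypothetical score pairs that would produce the reported gains.
g_low = normalized_gain(50.0, 59.8)
g_high = normalized_gain(50.0, 74.0)
```

    Normalizing by the headroom lets groups with different starting scores be compared on the same scale, which is why the measure is standard in pre-/post-test education research.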

  10. ENDF/B Pre-Processing Codes: Implementing and testing on a Personal Computer

    International Nuclear Information System (INIS)

    McLaughlin, P.K.

    1987-05-01

    This document describes the contents of the diskettes containing the ENDF/B Pre-Processing codes by D.E. Cullen, and example data for use in implementing and testing these codes on a Personal Computer of the type IBM-PC/AT. Upon request the codes are available from the IAEA Nuclear Data Section, free of charge, on a series of 7 diskettes. (author)

  11. High heat flux tests of the WENDELSTEIN 7-X pre-series target elements - experimental evaluation of the thermo-mechanical behaviour

    International Nuclear Information System (INIS)

    Greuner, H.; Boeswirth, B.; Boscary, J.; Plankensteiner, A.; Schedler, B.

    2006-01-01

    The HHF testing of WENDELSTEIN 7-X pre-series target elements is an indispensable step in the qualification of the manufacturing process. The final 890 divertor target elements are made of an actively water-cooled CuCrZr heat sink covered with flat tiles of CFC NB31 as plasma-facing material. A set of 20 full-scale pre-series elements was manufactured by PLANSEE to validate the materials and manufacturing technologies prior to the start of series production. Due to the large mismatch in the coefficients of thermal expansion of CFC and CuCrZr - resulting in high residual stresses as well as high operation-induced stresses - the bonding zone between CFC and CuCrZr was identified as the most critical issue for the operational behaviour of the target elements. To achieve a sufficiently high manufacturing quality together with a long lifetime during operation, thermal testing of full-scale mockups was performed in combination with extensive FEM analyses. In both cases heat loads were applied similar to the expected heat loads in W7-X. All pre-series elements were tested in the ion beam test facility GLADIS. The elements were tested with 100 cycles of 10 MW/m², and several elements with even higher cycle numbers and heat loads up to 24 MW/m². The instrumentation of the targets (thermocouples, strain gages) and the infrared camera observation of the heat-loaded surface allow an experimental evaluation of the thermo-mechanical behaviour of the tested elements. The main result is a good agreement between the experimental data and the numerically computed predictions. Hot spots were, however, observed at the edges of several tiles during the HHF tests, indicating local bonding problems. Therefore, a programme of fully 3D nonlinear thermo-mechanical FEM calculations was started to evaluate the thermo-mechanical behaviour of the target elements, with special focus on the optimization of the stress situation in the bonding zone between the CFC and the CuCrZr heat sink. This

  12. The Computerized Table Setting Test for Detecting Unilateral Neglect.

    Directory of Open Access Journals (Sweden)

    Seok Jong Chung

    Full Text Available Patients with unilateral neglect fail to respond normally to stimuli on the left side. To facilitate the evaluation of unilateral spatial neglect, we developed a new application that runs on a tablet device and investigated its feasibility in stroke patients. We designed the computerized table setting test (CTST) to run on the tablet computer. Forty acute ischemic stroke patients (20 patients with right hemispheric infarction with neglect, 10 patients with right hemispheric infarction without neglect, and 10 patients with left hemispheric infarction) and 10 healthy controls were prospectively enrolled to validate the CTST. The test requires subjects to set a table by dragging 12 dishes located below the table on the tablet screen. The horizontal deviation of the 12 dishes from the midline of the table, the selection tendency measured by the sequence of dish selection, and the elapsed time for table setting are calculated automatically. Parameters measured by the CTST correlated with the results of conventional neglect tests. The horizontal deviation was significantly higher in patients with right hemispheric infarction with neglect compared with the other groups. The selection tendency and elapsed time also differed significantly in patients with right hemispheric infarction with neglect compared with the left hemispheric infarction and control groups, but were similar to those with right hemispheric infarction without neglect. The CTST is feasible to administer and comparable with conventional neglect tests. This new application may be useful for the initial diagnosis and follow-up of neglect patients.
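
    The horizontal-deviation metric described above reduces to a simple signed mean. A toy sketch (coordinates, midline, and dish count are hypothetical, not the CTST's actual implementation):

```python
def horizontal_deviation(dish_positions, table_midline_x):
    """Mean signed horizontal deviation of placed dishes from the table
    midline; a positive (rightward) bias is the hallmark of left neglect."""
    return sum(x - table_midline_x for x, _y in dish_positions) / len(dish_positions)

# Hypothetical final placements (x, y) on a screen whose table midline
# is at x = 500: a neglect patient crowds the dishes into the right half.
placements = [(620, 100), (700, 140), (655, 210), (590, 260)]
bias = horizontal_deviation(placements, 500)
```

    A control subject distributing dishes symmetrically about the midline would produce a bias near zero, which is the contrast the test exploits.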

  13. Tactical asset allocation in a real-life setting

    OpenAIRE

    Hernæs, Vegard

    2014-01-01

    Master's thesis in economics and business administration, University of Agder, 2014. This thesis tests one of the most popular market-timing strategies using the longest available data set, ranging from 1857 to 2012. The market-timing strategy has already been shown to deliver superior results in a back-test over the period 1926-2012. For this reason, the performance of the pre-1926 period is compared to the post-1926 performance in a back-test. The performance of the two periods is similar, but period ...

  14. Single, Complete, Probability Spaces Consistent With EPR-Bohm-Bell Experimental Data

    Science.gov (United States)

    Avis, David; Fischer, Paul; Hilbert, Astrid; Khrennikov, Andrei

    2009-03-01

    We show that paradoxical consequences of violations of Bell's inequality are induced by the use of an unsuitable probabilistic description for the EPR-Bohm-Bell experiment. The conventional description (due to Bell) is based on a combination of statistical data collected for different settings of polarization beam splitters (PBSs). In fact, such data consist of conditional probabilities which only partially define a probability space. Ignoring this conditioning leads to apparent contradictions in the classical probabilistic model (due to Kolmogorov). We show how to make a completely consistent probabilistic model by taking into account the probabilities of selecting the settings of the PBSs. Our model both matches the experimental data and is consistent with classical probability theory.

  15. New modalities in the treatment of HCV in pre and post - transplantation setting.

    Science.gov (United States)

    Araz, Filiz; Durand, Christine M; Gürakar, Ahmet

    2015-05-01

    End-stage liver disease and hepatocellular carcinoma (HCC) secondary to hepatitis C virus (HCV) infection are the leading indications for liver transplantation (LT) in developed countries. Recurrence of HCV following LT is universal if the recipient has detectable serum HCV RNA at the time of LT. Recurrent HCV has an accelerated course and is associated with poor long-term patient and graft survival. Interferon (IFN)-based regimens have achieved low sustained virological response (SVR) rates in this setting and are associated with a high rate of adverse events, resulting in treatment discontinuation. With advances in understanding of the HCV life cycle, drugs targeting specific steps, particularly inhibitors of the NS3/4A protease, the NS5B RNA-dependent RNA polymerase and the NS5A protein, have been developed. Sofosbuvir (SOF), a nucleotide analogue inhibitor of the NS5B polymerase, was the first compound to enter the market. Combinations of SOF with new HCV antivirals from other classes have allowed for IFN-free regimens with low rates of adverse events and SVR rates >90%. With the availability of newer agents, the approach to the treatment of HCV infection during the pre- and post-liver transplantation period has changed. We review here the current status of HCV treatment and discuss potential future therapies in the transplant setting.

  16. PRE-DISCOVERY OBSERVATIONS OF DISRUPTING ASTEROID P/2010 A2

    International Nuclear Information System (INIS)

    Jewitt, David; Stuart, Joseph S.; Li Jing

    2011-01-01

    Solar system object P/2010 A2 is the first-noticed example of the aftermath of a recently disrupted asteroid, probably resulting from a collision. Nearly a year elapsed between its inferred initiation in early 2009 and its eventual detection in early 2010. Here, we use new observations to assess the factors underlying the visibility, especially to understand the delayed discovery. We present pre-discovery observations from the LINEAR telescope and set limits to the early-time brightness from SOHO and STEREO satellite coronagraphic images. Consideration of the circumstances of discovery of P/2010 A2 suggests that similar objects must be common, and that future all-sky surveys will reveal them in large numbers.

  17. Normal mammogram detection based on local probability difference transforms and support vector machines

    International Nuclear Information System (INIS)

    Chiracharit, W.; Kumhom, P.; Chamnongthai, K.; Sun, Y.; Delp, E.J.; Babbs, C.F

    2007-01-01

    Automatic detection of normal mammograms, as a ''first look'' for breast cancer, is a new approach to computer-aided diagnosis. This approach may be limited, however, by two main causes. The first problem is the presence of poorly separable ''crossed distributions'', in which the correct classification depends upon the value of each feature. The second problem is overlap between the feature distributions extracted from digitized mammograms of normal and abnormal patients. Here we introduce a new Support Vector Machine (SVM) based method utilizing the proposed uncrossing mapping and Local Probability Difference (LPD). Crossed-distribution feature pairs are identified and mapped into new features that can be separated by a zero-hyperplane along the new axis. The probability density functions of the features of normal and abnormal mammograms are then sampled, and local probability difference functions are estimated to enhance the features. From 1,000 mammograms with known ground truth, 250 normal and 250 abnormal cases, including spiculated lesions, circumscribed masses and microcalcifications, are used for training a support vector machine. The classification results, tested with a further 250 normal and 250 abnormal cases, show improved testing performance with 90% sensitivity and 89% specificity. (author)

  18. Determination method of optimum pre-stress in cables of cable-stayed bridges by using fuzzy sets theory; Fuzzy riron wo mochiita shachokyo cable no saiteki prestress chikara ketteiho

    Energy Technology Data Exchange (ETDEWEB)

    Furuta, H. [Kansai Univ., Osaka (Japan); Kaneyoshi, M.; Tanaka, H. [Hitachi Zosen Corp., Osaka (Japan); Kamei, M.

    1996-06-20

    Generally, optimum pre-stress is introduced into the cables of cable-stayed bridges to reduce and equalize the cross-sectional forces in the main girders, thereby allowing a reduction in the weight of the cable cross sections. However, conventional methods for determining the optimum pre-stress require the cross sections to be set repeatedly. Therefore, in order to omit iterative calculations and derive a rational pre-stress, a fuzzy sets theory was introduced. With this method, if upper and lower limits of the design values (targeted design values) that the designer wishes to realize, such as the cross-sectional forces in the main girders and towers and the cable tensions, are inputted, an optimum pre-stress can be derived automatically by means of a fuzzy linear regression analysis. The targeted design values are given by experience and engineering judgment, and resetting the cross section is not required as long as a target value that can be tolerated by the hypothetical cross section is given. Since the theory used is a fuzzy sets theory, the derived pre-stress may not be guaranteed to be a truly optimum pre-stress. In order to make the result approach the optimum solution, it is important to set adequate upper and lower limits of the targeted design values by referring to past construction examples and experience. 10 refs., 11 figs., 7 tabs.

  19. Pre-Analytical Conditions in Non-Invasive Prenatal Testing of Cell-Free Fetal RHD

    DEFF Research Database (Denmark)

    Clausen, Frederik Banch; Jakobsen, Tanja Roien; Rieneck, Klaus

    2013-01-01

    D positive fetus. Prophylaxis reduces the risk of immunization that may lead to hemolytic disease of the fetus and the newborn. The reliability of predicting the fetal RhD type depends on pre-analytical factors and assay sensitivity. We evaluated the testing setup in the Capital Region of Denmark, based...

  20. Impact of pre-equilibration and diffusion limited release kinetics on effluent concentration in column leaching tests: Insights from numerical simulations.

    Science.gov (United States)

    Finkel, Michael; Grathwohl, Peter

    2017-05-01

    Column leaching tests have become a standard method for assessing the leaching of pollutants from materials used, e.g., for road and railway construction and in landscaping measures. Column tests have proven practical in laboratories, yielding robust and reproducible results. However, considerable uncertainty still exists, related particularly to the degree of equilibration of the pore water with the solids during preparation (pre-equilibration) and percolation of the column. We analyse equilibration time scales and the sensitivity of concentrations in column leachate with respect to initial conditions in a series of numerical experiments covering a broad spectrum of material and solute properties. Slow release of pollutants from solid materials is described by a spherical diffusion model of kinetic sorption accounting for multiple grain size fractions and sorption capacities. Results show that the cumulative concentrations are rather independent of the pre-equilibration level for a broad spectrum of parameter settings, e.g. if intra-particle porosity is high, grain size is small, or the sorption coefficient is large. Sensitivity increases with decreasing liquid-solid ratios and contact time during percolation. Significant variations with initial column conditions are to be expected for material and compound properties leading to slow release kinetics. In these cases, sensitivity to initial conditions may have to be considered. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  2. Is Fetal Growth Restriction Associated with a More Severe Maternal Phenotype in the Setting of Early Onset Pre-Eclampsia? A Retrospective Study

    Science.gov (United States)

    Weiler, Jane; Tong, Stephen; Palmer, Kirsten R.

    2011-01-01

    Background Both pre-eclampsia and fetal growth restriction are thought to result from abnormal placental implantation in early pregnancy. Consistent with this shared pathophysiology, it is not uncommon to see growth restriction further confound the course of pre-eclampsia and vice versa. It has been previously suggested that superimposed growth restriction is associated with a more severe pre-eclamptic phenotype, however this has not been a consistent finding. Therefore, we set out to determine whether the presence of fetal growth restriction among women with severe early-onset pre-eclampsia was associated with more severe maternal disease compared to those without a growth-restricted fetus. Methods and Findings We undertook a retrospective cohort study of women presenting to a tertiary hospital with severe early-onset pre-eclampsia (restriction. However, no significant difference was seen in relation to the severity of pre-eclampsia between those with or without a growth-restricted baby. The presence of concomitant growth restriction was however associated with a significantly increased risk of stillbirth (p = 0.003) and total perinatal mortality (p = 0.02). Conclusions The presence of fetal growth restriction among women with severe early-onset pre-eclampsia is not associated with increased severity of maternal disease. However the incidence of stillbirth and perinatal death is significantly increased in this sub-population. PMID:22046419

  3. A test-based method for the assessment of pre-crash warning and braking systems.

    Science.gov (United States)

    Bálint, András; Fagerlind, Helen; Kullgren, Anders

    2013-10-01

    In this paper, a test-based assessment method for pre-crash warning and braking systems is presented where the effectiveness of a system is measured by its ability to reduce the number of injuries of a given type or severity in car-to-car rear-end collisions. Injuries with whiplash symptoms lasting longer than 1 month and MAIS2+ injuries in both vehicles involved in the crash are considered in the assessment. The injury reduction resulting from the impact speed reduction due to a pre-crash system is estimated using a method which has its roots in the dose-response model. Human-machine interaction is also taken into account in the assessment. The results reflect the self-protection as well as the partner-protection performance of a pre-crash system in the striking vehicle in rear-end collisions and enable a comparison between two or more systems. It is also shown how the method may be used to assess the importance of warning as part of a pre-crash system. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Collective animal behavior from Bayesian estimation and probability matching.

    Directory of Open Access Journals (Sweden)

    Alfonso Pérez-Escudero

    2011-11-01

    Full Text Available Animals living in groups make movement decisions that depend, among other factors, on social interactions with other group members. Our present understanding of social rules in animal collectives is mainly based on empirical fits to observations, with less emphasis in obtaining first-principles approaches that allow their derivation. Here we show that patterns of collective decisions can be derived from the basic ability of animals to make probabilistic estimations in the presence of uncertainty. We build a decision-making model with two stages: Bayesian estimation and probabilistic matching. In the first stage, each animal makes a Bayesian estimation of which behavior is best to perform taking into account personal information about the environment and social information collected by observing the behaviors of other animals. In the probability matching stage, each animal chooses a behavior with a probability equal to the Bayesian-estimated probability that this behavior is the most appropriate one. This model derives very simple rules of interaction in animal collectives that depend only on two types of reliability parameters, one that each animal assigns to the other animals and another given by the quality of the non-social information. We test our model by obtaining theoretically a rich set of observed collective patterns of decisions in three-spined sticklebacks, Gasterosteus aculeatus, a shoaling fish species. The quantitative link shown between probabilistic estimation and collective rules of behavior allows a better contact with other fields such as foraging, mate selection, neurobiology and psychology, and gives predictions for experiments directly testing the relationship between estimation and collective behavior.
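    The two-stage model described above can be sketched in a few lines. The parameterization below (prior odds, private likelihoods, and a single reliability factor applied per observed choice) is a simplified, hypothetical rendering of the paper's interaction rules, not its actual notation:

```python
import random

def posterior_a_is_best(prior_a, lik_a, lik_b, n_chose_a, n_chose_b, reliability):
    # Bayesian stage: combine prior odds, the animal's private likelihoods for
    # options A and B, and a social term in which every observed choice of A
    # multiplies the odds for A by `reliability` (the weight the focal animal
    # assigns to the other animals' decisions).
    odds = (prior_a / (1.0 - prior_a)) * (lik_a / lik_b) \
           * reliability ** (n_chose_a - n_chose_b)
    return odds / (1.0 + odds)

def probability_match(posterior, rng=random.random):
    # Probability-matching stage: choose A with probability equal to the
    # estimated probability that A is the better option, rather than always
    # taking the more probable option.
    return "A" if rng() < posterior else "B"
```

    With no private or social information the posterior stays at the prior; two extra animals seen choosing A with reliability 2 shift it to 0.8, after which the choice itself is stochastic.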

  5. Human Error Analysis by Fuzzy-Set

    International Nuclear Information System (INIS)

    Situmorang, Johnny

    1996-01-01

    In conventional HRA the probability of error is treated as a single, exact value obtained by constructing an event tree; here, however, fuzzy-set theory is used. Fuzzy-set theory treats the probability of error as a plausibility described by a linguistic variable. Many parameters or variables in human engineering are defined verbally (good, fairly good, worst, etc.), each describing a range of probability values. As an example, this analysis quantifies the human error in a calibration task, and the probability of miscalibration is found to be very low
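    As a minimal illustration of how a linguistic variable can replace a single exact error probability, the sketch below uses triangular membership functions. The term names and numeric ranges are invented for illustration and are not taken from the analysis:

```python
def triangular(x, a, b, c):
    # Triangular membership function: 0 outside (a, c), rising linearly to 1
    # at the peak b. Such functions encode linguistic terms as fuzzy sets.
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical linguistic terms for a miscalibration probability:
def very_low(p):
    return triangular(p, 0.0, 5e-4, 1e-3)

def low(p):
    return triangular(p, 5e-4, 1e-3, 5e-3)
```

    A candidate error probability then belongs to each term to some degree, e.g. a value of 5e-4 is fully "very low" while partially compatible with "low", instead of being forced into one exact point estimate.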

  6. Optimal testing input sets for reduced diagnosis time of nuclear power plant digital electronic circuits

    International Nuclear Information System (INIS)

    Kim, D.S.; Seong, P.H.

    1994-01-01

    This paper describes the optimal testing input sets required for the fault diagnosis of nuclear power plant digital electronic circuits. With complicated systems such as very large scale integration (VLSI) circuits, nuclear power plants (NPPs), and aircraft, testing is the major factor in the maintenance of the system. In particular, diagnosis time grows quickly with the complexity of the component. In this research, to reduce diagnosis time, the authors derived optimal testing sets, i.e., the minimal testing sets required for detecting a failure and for locating the failed component. Among many conventional methods, the technique presented by Hayes fits the approach to testing-set generation best. However, this method has the following disadvantages: (a) it considers only simple networks, and (b) it determines only whether the system is in a failed state or not and does not provide a way to locate the failed component. The authors therefore derived optimal testing input sets that resolve these problems while preserving the advantages of Hayes' method. When they applied the optimal testing sets to the automatic fault diagnosis system (AFDS), which incorporates an advanced artificial intelligence fault diagnosis technique, they found that fault diagnosis using the optimal testing sets makes testing the digital electronic circuits much faster than using exhaustive testing input sets; when they applied them to testing the Universal (UV) Card, a nuclear power plant digital input/output solid state protection system card, the testing time was reduced by a factor of up to about 100

  7. Assurance of Learning, "Closing the Loop": Utilizing a Pre and Post Test for Principles of Finance

    Science.gov (United States)

    Flanegin, Frank; Letterman, Denise; Racic, Stanko; Schimmel, Kurt

    2010-01-01

    Since there is no standard national Pre and Post Test for Principles of Finance, akin to the one for Economics, the authors created one by selecting questions from previously administered examinations. The Cronbach's alpha of 0.851, exceeding the minimum of 0.70 for a reliable pen-and-paper test, indicates that our Test can detect differences in…

  8. JPSS-1 VIIRS Pre-Launch Radiometric Performance

    Science.gov (United States)

    Oudrari, Hassan; Mcintire, Jeffrey; Xiong, Xiaoxiong; Butler, James; Ji, Qiang; Schwarting, Tom; Zeng, Jinan

    2015-01-01

    The first Joint Polar Satellite System (JPSS-1 or J1) mission is scheduled to launch in January 2017, and will be very similar to the Suomi-National Polar-orbiting Partnership (SNPP) mission. The Visible Infrared Imaging Radiometer Suite (VIIRS) on board the J1 spacecraft completed its sensor-level performance testing in December 2014. The VIIRS instrument is expected to provide valuable information about the Earth environment and properties on a daily basis, using a wide-swath (3,040 km) cross-track scanning radiometer. The design covers the wavelength spectrum from the reflective to the long-wave infrared through 22 spectral bands, from 0.412 µm to 12.01 µm, and has spatial resolutions of 370 m and 740 m at nadir for imaging and moderate bands, respectively. This paper will provide an overview of pre-launch J1 VIIRS performance testing and methodologies, describing the at-launch baseline radiometric performance as well as the metrics needed to calibrate the instrument once on orbit. Key sensor performance metrics include the sensor signal to noise ratios (SNRs), dynamic range, reflective and emissive bands calibration performance, polarization sensitivity, bands spectral performance, response-vs-scan (RVS), near field response, and stray light rejection. A set of performance metrics generated during the pre-launch testing program will be compared to the sensor requirements and to SNPP VIIRS pre-launch performance.

  9. Development of a fresh cadaver model for instruction of ultrasound-guided breast biopsy during the surgery clerkship: pre-test and post-test results among third-year medical students.

    Science.gov (United States)

    McCrary, Hilary C; Krate, Jonida; Savilo, Christine E; Tran, Melissa H; Ho, Hang T; Adamas-Rappaport, William J; Viscusi, Rebecca K

    2016-11-01

    The aim of our study was to determine if a fresh cadaver model is a viable method for teaching ultrasound (US)-guided breast biopsy of palpable breast lesions. Third-year medical students were assessed both preinstruction and postinstruction on their ability to perform US-guided needle aspiration or biopsy of artificially created masses using a 10-item checklist. Forty-one third-year medical students completed the cadaver laboratory as part of the surgery clerkship. Eight items on the checklist were found to be significantly different between pre-testing and post-testing. The mean preinstruction score was 2.4, whereas the mean postinstruction score was 7.10 (P cadaver models have been widely used in medical education. However, there are few fresh cadaver models that provide instruction on procedures done in the outpatient setting. Our model was found to be an effective method for the instruction of US-guided breast biopsy among medical students. Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P plots
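    For the classical one-sample case, the points of a P-P plot can be computed as follows. This is a generic textbook sketch (the paper's generalized plots index over closed intervals rather than single points):

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    # CDF of the normal distribution, computed via the error function.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def pp_points(sample, cdf=normal_cdf):
    # Classical one-sample P-P plot: for the sorted observations
    # x_(1) <= ... <= x_(n), pair the hypothesized CDF value F0(x_(i)) with
    # the empirical CDF value i/n. Under the null hypothesis the points
    # cluster around the diagonal.
    xs = sorted(sample)
    n = len(xs)
    return [(cdf(x), i / n) for i, x in enumerate(xs, start=1)]
```

    Plotting these pairs against the diagonal gives a visual goodness-of-fit check against the hypothesized distribution.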

  11. Pre-test genetic counseling services for hereditary breast and ovarian cancer delivered by non-genetics professionals in the state of Florida.

    Science.gov (United States)

    Vadaparampil, S T; Scherr, C L; Cragun, D; Malo, T L; Pal, T

    2015-05-01

    Genetic counseling and testing for hereditary breast and ovarian cancer now includes practitioners from multiple healthcare professions, specialties, and settings. This study examined whether non-genetics professionals (NGPs) perform guideline-based patient intake and informed consent before genetic testing. NGPs offering BRCA testing services in Florida (n = 386) were surveyed about clinical practices. Among 81 respondents (response rate = 22%), approximately half reported: sometimes scheduling a separate session for pre-test counseling lasting 11-30 min prior to testing, discussing familial implications of testing, benefits and limitations of risk management options, and discussing the potential psychological impact and insurance-related issues. Few constructed a three-generation pedigree, discussed alternative hereditary cancer syndromes, or the meaning of a variant result. This lack of adherence to guideline-based practice may result in direct harm to patients and their family members. NGPs who are unable to deliver guideline adherent cancer genetics services should focus on identification and referral of at-risk patients to in person or telephone services provided by genetics professionals. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  12. 31 CFR 544.702 - Pre-Penalty Notice; settlement.

    Science.gov (United States)

    2010-07-01

    ... set forth in this paragraph. The failure to submit a response within the applicable time period set... envelope in which the Pre-Penalty Notice was mailed. If the Pre-Penalty Notice was personally delivered by...

  13. Probability Estimation in the Framework of Intuitionistic Fuzzy Evidence Theory

    Directory of Open Access Journals (Sweden)

    Yafei Song

    2015-01-01

    Full Text Available Intuitionistic fuzzy (IF) evidence theory, as an extension of the Dempster-Shafer theory of evidence to the intuitionistic fuzzy environment, is exploited to process imprecise and vague information. Since its inception, much interest has been concentrated on IF evidence theory, and many works on belief functions in IF information systems have appeared. Although belief functions on IF sets can deal with uncertainty and vagueness well, they are not convenient for decision making. This paper addresses the issue of probability estimation in the framework of IF evidence theory with the hope of enabling rational decisions. Background knowledge about evidence theory, fuzzy sets, and IF sets is first reviewed, followed by an introduction to IF evidence theory. Axiomatic properties of probability distributions are then proposed to assist our interpretation. Finally, probability estimations based on fuzzy and IF belief functions, together with their proofs, are presented. It is verified that the probability estimation method based on IF belief functions is also potentially applicable to classical evidence theory and fuzzy evidence theory. Moreover, IF belief functions can be combined in a convenient way once they are transformed to interval-valued possibilities.
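    As background to the belief-to-probability transformations discussed above, the classical pignistic transform of Dempster-Shafer theory converts a mass function into a probability distribution by splitting each focal set's mass equally among its elements. The paper's interval-valued transform for IF belief functions generalizes this idea; the sketch below shows only the simplest classical instance:

```python
def pignistic(masses):
    # Pignistic probability transform of a classical Dempster-Shafer mass
    # function. `masses` maps frozensets (focal elements) to their mass;
    # each focal set's mass is shared equally among its member hypotheses.
    bet_p = {}
    for focal, m in masses.items():
        share = m / len(focal)
        for element in focal:
            bet_p[element] = bet_p.get(element, 0.0) + share
    return bet_p
```

    For example, with mass 0.6 on the ambiguous set {a, b} and 0.4 on {a}, hypothesis a receives 0.6/2 + 0.4 = 0.7 and b receives 0.3, yielding a proper probability distribution usable for decision making.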

  14. Diagnostic probability function for acute coronary heart disease garnered from experts' tacit knowledge.

    Science.gov (United States)

    Steurer, Johann; Held, Ulrike; Miettinen, Olli S

    2013-11-01

    Knowing about a diagnostic probability requires general knowledge about the way in which the probability depends on the diagnostic indicators involved in the specification of the case at issue. Diagnostic probability functions (DPFs) are generally unavailable at present. Our objective was to illustrate how diagnostic experts' case-specific tacit knowledge about diagnostic probabilities could be garnered in the form of DPFs. Focusing on diagnosis of acute coronary heart disease (ACHD), we presented doctors with extensive experience in hospitals' emergency departments a set of hypothetical cases specified in terms of an inclusive set of diagnostic indicators. We translated the medians of these experts' case-specific probabilities into a logistic DPF for ACHD. The principal result was the experts' typical diagnostic probability for ACHD as a joint function of the set of diagnostic indicators. A related result of note was the finding that the experts' probabilities in any given case had a surprising degree of variability. Garnering diagnostic experts' case-specific tacit knowledge about diagnostic probabilities in the form of DPFs is feasible to accomplish. Thus, once the methodology of this type of work has been "perfected," practice-guiding diagnostic expert systems can be developed. Copyright © 2013 Elsevier Inc. All rights reserved.
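    A minimal sketch of the final step described above, assuming a single diagnostic indicator: logit-transform the experts' median probabilities and fit a line by least squares, giving a logistic DPF. Real DPFs combine many indicators, and all names below are illustrative rather than the authors' method in detail:

```python
import math

def fit_logistic_dpf(indicator_values, expert_probs):
    # Fit logit(p) = b0 + b1 * x by ordinary least squares on the
    # logit-transformed expert probabilities (a one-indicator stand-in for
    # translating case-specific expert probabilities into a logistic DPF).
    ys = [math.log(p / (1.0 - p)) for p in expert_probs]
    n = len(indicator_values)
    mx = sum(indicator_values) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(indicator_values, ys))
    sxx = sum((x - mx) ** 2 for x in indicator_values)
    b1 = sxy / sxx
    b0 = my - b1 * mx
    return b0, b1

def dpf(x, b0, b1):
    # Diagnostic probability for a new case with indicator value x.
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
```

    Once fitted, `dpf` returns a typical expert probability for any case specified by the indicator, which is the practice-guiding object the paper aims at.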

  15. Tests of Cumulative Prospect Theory with graphical displays of probability

    Directory of Open Access Journals (Sweden)

    Michael H. Birnbaum

    2008-10-01

    Full Text Available Recent research reported evidence that contradicts cumulative prospect theory and the priority heuristic. The same body of research also violates two editing principles of original prospect theory: cancellation (the principle that people delete any attribute that is the same in both alternatives before deciding between them) and combination (the principle that people combine branches leading to the same consequence by adding their probabilities). This study was designed to replicate previous results and to test whether the violations of cumulative prospect theory might be eliminated or reduced by using formats for presentation of risky gambles in which cancellation and combination could be facilitated visually. Contrary to the idea that decision behavior contradicting cumulative prospect theory and the priority heuristic would be altered by use of these formats, however, data with two new graphical formats as well as fresh replication data continued to show the patterns of evidence that violate cumulative prospect theory, the priority heuristic, and the editing principles of combination and cancellation. Systematic violations of restricted branch independence also contradicted predictions of "stripped" prospect theory (subjectively weighted additive utility without the editing rules).

  16. Pre- and post-test calculation for the PARAMETER-SF1 experiment with ATHLET-CD

    Energy Technology Data Exchange (ETDEWEB)

    Erdmann, W.; Trambauer, K.; Stuckert, J. [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koln (Germany)

    2006-07-01

    The main objective of the PARAMETER-SF1 experiment, in the frame of ISTC project 3194, is the experimental and analytical investigation of the behavior of Russian VVER-1000 fuel rod assemblies under simulated severe accident conditions. Its special feature is the study of the effect of flooding a superheated test bundle from the top (top quenching), which has not yet been investigated at all. - Simulation of the PARAMETER test facility: to calculate the special effects of top quenching, several aspects are important: detailed simulation of the bundle top, the top and bottom quench fronts, heat losses at the top and bottom of the bundle, and the electrical heater power. - Main initial and boundary conditions: the initial and boundary conditions proposed for the double-blind pre-test calculation were quite different from the actual experimental data during the test, e.g. electric power, mass flow (water, steam, argon), and temperature. - Conclusions: this was the first experiment with top flooding; the initial conditions given in the specification could not be realized during the experiment, and the bundle parameters deviated from the anticipated values, so the pre-test calculations are not comparable with the experiment; the post-test calculations with ATHLET-CD showed good agreement with the experimental data, and top flooding is well predicted; the calculational results are sensitive with respect to the boundary conditions and the nodalization. (authors)

  17. A computerized pre-clinical test for cemented hip prostheses based on finite element techniques

    NARCIS (Netherlands)

    Stolk, Jan

    2003-01-01

    Despite the success of cemented total hip replacement (THR), high failure rates are occasionally reported for cemented hip implants that are introduced on the orthopaedic market. Rigorous pre-clinical testing of hip implants could prevent these disasters by detecting unsafe implant designs at a

  18. Increasing Classroom Compliance: Using a High-Probability Command Sequence with Noncompliant Students

    Science.gov (United States)

    Axelrod, Michael I.; Zank, Amber J.

    2012-01-01

    Noncompliance is one of the most problematic behaviors within the school setting. One strategy to increase compliance of noncompliant students is a high-probability command sequence (HPCS; i.e., a set of simple commands in which an individual is likely to comply immediately prior to the delivery of a command that has a lower probability of…

  19. Process for nondestructively testing with radioactive gas using a chill set sealant

    International Nuclear Information System (INIS)

    Gibbons, C.B.

    1975-01-01

    An article surface is nondestructively tested for substantially invisible surface voids by adsorbing a radioactive gas thereon. The adsorbed radioactive gas is disproportionately retained on those surfaces presented by the substantially invisible surface voids as compared to the remaining surfaces of the article contacted by the radioactive gas. The radiation released by the radioactive gas remaining adsorbed is used to identify the substantially invisible voids. To immobilize the radioactive gas adjacent to or within the surface voids, a sealant composition is provided which is capable of being chill set. The temperatures of the article surface to be tested and of the sealant composition are then related so that the article surface is at a temperature below the chill set temperature of the sealant composition and the sealant composition is at a temperature above its chill set temperature. The article portion to be tested is then coated with the sealant composition to form a chill set coating thereon of substantially uniform thickness. (U.S.)

  20. Conditional non-independence of radiographic image features and the derivation of post-test probabilities – A mammography BI-RADS example

    International Nuclear Information System (INIS)

    Benndorf, Matthias

    2012-01-01

    Bayes' theorem has proven to be one of the cornerstones in medical decision making. It allows for the derivation of post-test probabilities, which in case of a positive test result become positive predictive values. If several test results are observed successively Bayes' theorem may be used with assumed conditional independence of test results or with incorporated conditional dependencies. Herein it is examined whether radiographic image features should be considered conditionally independent diagnostic tests when post-test probabilities are to be derived. For this purpose the mammographic mass dataset from the UCI (University of California, Irvine) machine learning repository is analysed. It comprises the description of 961 (516 benign, 445 malignant) mammographic mass lesions according to the BI-RADS (Breast Imaging: Reporting and Data System) lexicon. Firstly, an exhaustive correlation matrix is presented for mammography BI-RADS features among benign and malignant lesions separately; correlation can be regarded as measure for conditional dependence. Secondly, it is shown that the derived positive predictive values for the conjunction of the two features “irregular shape” and “spiculated margin” differ significantly depending on whether conditional dependencies are incorporated into the decision process or not. It is concluded that radiographic image features should not generally be regarded as conditionally independent diagnostic tests.
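    The contrast drawn above can be made concrete with sequential Bayes updating in odds form. The sensitivities and specificities below are illustrative numbers, not the paper's BI-RADS estimates:

```python
def ppv_independent(prior, sens1, spec1, sens2, spec2):
    # Post-test probability after two positive findings, assuming conditional
    # independence given disease status: multiply the prior odds by each
    # feature's positive likelihood ratio, sensitivity / (1 - specificity).
    odds = prior / (1.0 - prior)
    odds *= (sens1 / (1.0 - spec1)) * (sens2 / (1.0 - spec2))
    return odds / (1.0 + odds)

def ppv_joint(prior, p_both_pos_malignant, p_both_pos_benign):
    # Post-test probability using the joint probability of observing both
    # features together, which absorbs any conditional dependence between them.
    odds = (prior / (1.0 - prior)) * (p_both_pos_malignant / p_both_pos_benign)
    return odds / (1.0 + odds)
```

    If the two features are positively correlated within each class (as "irregular shape" and "spiculated margin" are), the naive independent update multiplies two likelihood ratios that partly encode the same evidence and overstates the positive predictive value relative to the joint calculation.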

  1. Couples’ Educational Needs Referred to Ershad and Fazel Health Centers of Tehran for Pre-marriage Tests in 2014

    Directory of Open Access Journals (Sweden)

    Farima Mohammadi

    2016-12-01

    Full Text Available Background and Objective: Premarital education is the foundation for the development of couples' communication, sexual relationships, health and fertility. Therefore, this study was conducted with the purpose of determining the educational needs of couples referred to the Ershad and Fazel health centers of Tehran for pre-marriage tests. Materials and Methods: A cross-sectional study was conducted among couples referred to the health centers. The sample size was 1,672 and convenience sampling was used. Demographic data were collected through interviews with the women based on questionnaires. The couples then responded to the self-administered educational needs questionnaire. The collected data were entered into SPSS-21 statistical software and analyzed using the Chi-square test. Results: 883 women and 789 men completed the educational needs questionnaire. The mean (SD) ages of the women and men were 27.1 (4.5) and 30.2 (5.3) years, respectively. Of the female participants, 483 (54.7%) had a bachelor's degree, as did 369 (41.8%) of the male participants. 97.5% of the men and 57.1% of the women were employed. The ethnicity of about 72% of participants was Fars. The main pre-marriage educational needs of the men and women were marital health, the importance of pre-marriage tests, methods of preventing unintended pregnancy, correct marital relationships and prenatal care. 313 (64.6%) of the women and 238 (42.8%) of the men reported the pre-marriage educational class to be very helpful. The educational needs of the couples did not differ based on education, ethnicity or place of residence. Conclusion: The results of the current study show the need for pre-marriage education in all evaluated aspects and the importance of attending to it to make pre-marital classes more effective.

  2. Pre-exposure to food temptation reduces subsequent consumption: A test of the procedure with a South-African sample.

    Science.gov (United States)

    Duh, Helen Inseng; Grubliauskiene, Aiste; Dewitte, Siegfried

    2016-01-01

It has been suggested that consumption of an unhealthy Westernized diet in a context of poverty and resultant food insecurity may have contributed to South Africa's status as the third-fattest country in the world. Considering that many South Africans are reported to have experienced, or to be still experiencing, food insecurity, procedures shown to reduce the consumption of unhealthy food in higher-income countries may be ineffective in South Africa. We thus tested the robustness of the so-called pre-exposure procedure in South Africa, as well as the moderating role of childhood poverty in its effect. With the pre-exposure procedure, a respondent is exposed to a tempting unhealthy food (e.g. candy) in a context designed such that eating the food interferes with a task goal. The typical result is that this procedure spills over and reduces consumption of similar tempting food later on. An experimental study conducted in a South African laboratory showed that the pre-exposure effect is robust even in a sample where food insecurity prevails. Childhood poverty did not moderate the effect. This study indicates that behavioral procedures aimed at reducing the consumption of unhealthy food can also be valuable in less affluent, non-Western countries. Further testing of the robustness of the pre-exposure effect in other poorer, food-insecure countries is, however, recommended. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Decisions under risk in Parkinson's disease: preserved evaluation of probability and magnitude.

    Science.gov (United States)

    Sharp, Madeleine E; Viswanathan, Jayalakshmi; McKeown, Martin J; Appel-Cresswell, Silke; Stoessl, A Jon; Barton, Jason J S

    2013-11-01

Unmedicated Parkinson's disease patients tend to be risk-averse, while dopaminergic treatment causes a tendency to take risks. While dopamine agonists may result in clinically apparent impulse control disorders, treatment with levodopa also causes a shift in behaviour associated with an enhanced response to rewards. Two important determinants in decision-making are how subjects perceive the magnitude and probability of outcomes. Our objective was to determine whether patients with Parkinson's disease on or off levodopa showed differences in their perception of value when making decisions under risk. The Vancouver Gambling task presents subjects with a choice between one prospect with a larger outcome and a second with higher probability. Eighteen age-matched controls and eighteen patients with Parkinson's disease, tested before and after levodopa, participated. In the Gain Phase, subjects chose between one prospect with higher probability and another with larger reward to maximize their gains. In the Loss Phase, subjects played to minimize their losses. Patients with Parkinson's disease, on or off levodopa, were similar to controls when evaluating gains. However, in the Loss Phase before levodopa, they were more likely to avoid the prospect with lower probability but larger loss, as indicated by the steeper slope of their group psychometric function (t(24) = 2.21, p = 0.04). Modelling with prospect theory suggested that this was attributable to a 28% overestimation of the magnitude of loss, rather than an altered perception of its probability. While unmedicated patients with Parkinson's disease show risk-aversion for large losses, patients on levodopa have normal perception of magnitude and probability for both loss and gain. The finding of accurate and normally biased decisions under risk in medicated patients with PD is important because it indicates that, if there is indeed anomalous risk-seeking behaviour in such a cohort, it may derive from abnormalities in components of

  4. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

The current status of advanced theoretical methods for transition probabilities of atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test of the theoretical methods.

  5. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

In high-throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample-size-invariant universal scoring function. A probability density estimate is then determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function, which identifies atypical fluctuations. This criterion resists under- and over-fitting the data, as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic for visualizing the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities, including cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate that the method has general applicability for high-throughput statistical inference.

  6. Quantum Probability Zero-One Law for Sequential Terminal Events

    Science.gov (United States)

    Rehder, Wulf

    1980-07-01

On the basis of the Jauch-Piron quantum probability calculus, a zero-one law for sequential terminal events is proven, and the significance of certain crucial axioms in the quantum probability calculus is discussed. The result shows that the Jauch-Piron set of axioms is appropriate for the non-Boolean algebra of sequential events.

  7. Distance learning training in genetics and genomics testing for Italian health professionals: results of a pre and post-test evaluation

    Directory of Open Access Journals (Sweden)

    Maria Benedetta Michelazzo

    2015-09-01

Full Text Available Background: Progressive advances in DNA sequencing technologies and decreasing costs are allowing an easier diffusion of genetic and genomic tests. Physicians’ knowledge of and confidence in the topic is often low and not sufficient to manage this challenge. Tailored educational programs are required to promote more appropriate use of genetic technologies. Methods: A distance learning course was created by experts from different Italian medical associations with the support of the Italian Ministry of Health. The course was aimed at the professional figures involved in the prescription and interpretation of genetic tests. A pre-test/post-test study design was used to assess knowledge improvement. We analyzed the proportion of correct answers for each question before and after the course, as well as the mean score difference stratified by gender, age, professional status and medical specialty. Results: We observed an improvement in the proportion of correct answers for 12 of the 15 questions in the test. The overall mean score significantly increased in the post-test, from 9.44 to 12.49 (p-value < 0.0001). In the stratified analysis, we observed an improvement in the knowledge of all groups except geneticists; the pre-course mean score of this group was already very high and did not improve significantly. Conclusion: Distance learning is effective in improving the level of genetic knowledge. In the future, it will be useful to analyze which specialists benefit most from genetic education, in order to plan more tailored education for medical professionals.

  8. The development and validation of the Closed-set Mandarin Sentence (CMS) test.

    Science.gov (United States)

    Tao, Duo-Duo; Fu, Qian-Jie; Galvin, John J; Yu, Ya-Feng

    2017-09-01

Matrix-styled sentence tests offer a closed-set paradigm that may be useful when evaluating speech intelligibility. Ideally, sentence test materials should reflect the distribution of phonemes within the target language. We developed and validated the Closed-set Mandarin Sentence (CMS) test to assess Mandarin speech intelligibility in noise. CMS test materials were selected to be familiar words and to represent the natural distribution of vowels, consonants, and lexical tones found in Mandarin Chinese. Ten key words in each of five categories (Name, Verb, Number, Color, and Fruit) were produced by a native Mandarin talker, resulting in a total of 50 words that could be combined to produce 100,000 unique sentences. Normative data were collected in 10 normal-hearing, adult Mandarin-speaking Chinese listeners using a closed-set test paradigm. Two test runs were conducted for each subject, and 20 sentences per run were randomly generated while ensuring that each word was presented only twice in each run. First, the levels of the words in each category were adjusted to produce equal intelligibility in noise. Test-retest reliability for word-in-sentence recognition was excellent according to Cronbach's alpha (0.952). After the category level adjustments, speech reception thresholds (SRTs) for sentences in noise, defined as the signal-to-noise ratio (SNR) that produced 50% correct whole-sentence recognition, were measured adaptively by adjusting the SNR according to the correctness of the response. The mean SRT was -7.9 (SE=0.41) and -8.1 (SE=0.34) dB for runs 1 and 2, respectively. The mean standard deviation across runs was 0.93 dB, and paired t-tests showed no significant difference between runs 1 and 2 (p=0.74), despite random sentences being generated for each run and each subject. The results suggest that the CMS provides a large stimulus set with which to repeatedly and reliably measure Mandarin-speaking listeners' speech understanding in noise using a closed-set paradigm.
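The adaptive SRT measurement described above (lower the SNR after a correct whole-sentence response, raise it after an error, so that performance converges on 50% correct) can be sketched as a simple one-up/one-down track. The listener model, its slope, the step size, and the starting level below are hypothetical illustrations, not the CMS implementation:

```python
import random

def simulate_adaptive_srt(true_srt, step_db=2.0, n_trials=50, seed=0):
    """One-up/one-down adaptive track: lower the SNR after a correct
    whole-sentence response, raise it after an error, converging on the
    ~50%-correct point (the SRT). The listener is simulated with a
    logistic psychometric function (assumed 2 dB slope parameter)."""
    rng = random.Random(seed)
    snr = 0.0  # start at a clearly audible level
    reversals, last_dir = [], None
    for _ in range(n_trials):
        p_correct = 1.0 / (1.0 + 10 ** ((true_srt - snr) / 2.0))
        direction = -1 if rng.random() < p_correct else +1
        if last_dir is not None and direction != last_dir:
            reversals.append(snr)  # track changed direction here
        last_dir = direction
        snr += direction * step_db
    # Estimate the SRT as the mean SNR over the last few reversals.
    tail = reversals[-6:]
    return sum(tail) / len(tail)

print(simulate_adaptive_srt(-8.0))  # estimate near the true -8 dB SRT
```

In the study itself, two such runs per listener produced mean SRTs near -8 dB with test-retest differences under 1 dB.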

  9. Pre-Test and Work Plan as an Effective Learning Strategy in the Advanced Engineering Materials Practicum of the Department of Mechanical Engineering Education, Faculty of Engineering, Yogyakarta State University (FT UNY)

    Directory of Open Access Journals (Sweden)

    Nurdjito Nurdjito

    2013-09-01

Full Text Available To find the most effective learning strategy for the practicum in the materials laboratory of the Department of Mechanical Engineering Education, Faculty of Engineering, Yogyakarta State University (YSU), a study was conducted to determine the effect of applying a pre-test and work plan on students' learning activities and achievement in the laboratory. This action research used the purposive random sampling technique, with the pre-test and work plan as the treatment. The study data were collected through a test of students' achievement and analyzed with a t-test in SPSS. The results indicated that applying the pre-test and work plan in addition to the standard module was more effective than normative learning using the module alone (t = 3.055, p = 0.003 < 0.05). Implementing the pre-test and work plan alongside the standard modules improved the students' motivation, independence and readiness to learn, as well as cooperation among the students, and therefore also their achievement. Mastery of competencies increased significantly, as shown by the increase in the mode from 66 to 85 and in the mean from 73.12 to 79.32 in the experimental group.
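As a worked illustration of the kind of comparison reported here (experimental versus control achievement compared with a t-test), an independent-samples t statistic can be computed in a few lines. The score lists below are invented, not the study's data, and Welch's unequal-variance form is one common choice:

```python
import math
from statistics import mean, variance

def welch_t(x, y):
    """Welch's two-sample t statistic for independent groups
    (does not assume equal variances)."""
    return (mean(x) - mean(y)) / math.sqrt(
        variance(x) / len(x) + variance(y) / len(y)
    )

# Hypothetical post-test scores, not the study's raw data.
experiment = [80, 79, 82, 78, 81]
control = [73, 74, 72, 75, 71]
print(welch_t(experiment, control))  # → 7.0
```

A large positive t, as here, favours the experimental group; the p-value then follows from the t distribution with the appropriate degrees of freedom.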

  10. The Generalized Higher Criticism for Testing SNP-Set Effects in Genetic Association Studies

    Science.gov (United States)

    Barnett, Ian; Mukherjee, Rajarshi; Lin, Xihong

    2017-01-01

It is of substantial interest to study the effects of genes, genetic pathways, and networks on the risk of complex diseases. These genetic constructs each contain multiple SNPs, which are often correlated and function jointly, and might be large in number. However, only a sparse subset of SNPs in a genetic construct is generally associated with the disease of interest. In this article, we propose the generalized higher criticism (GHC) to test for the association between an SNP set and a disease outcome. The higher criticism is a test traditionally used in high-dimensional signal detection settings when marginal test statistics are independent and the number of parameters is very large. However, these assumptions do not always hold in genetic association studies, due to linkage disequilibrium among SNPs and the finite number of SNPs in an SNP set in each genetic construct. The proposed GHC overcomes the limitations of the higher criticism by allowing for arbitrary correlation structures among the SNPs in an SNP set, while performing accurate analytic p-value calculations for any finite number of SNPs in the SNP set. We obtain the detection boundary of the GHC test. Using simulations, we empirically compared the power of the GHC method with that of existing SNP-set tests over a range of genetic regions with varied correlation structures and signal sparsity. We apply the proposed methods to analyze the CGEM breast cancer genome-wide association study. Supplementary materials for this article are available online. PMID:28736464
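For readers unfamiliar with higher criticism, a minimal sketch of the classic statistic for independent p-values (the setting of Donoho and Jin) is shown below; the GHC extends this idea to correlated SNP test statistics, which this sketch does not attempt. The p-values here are simulated, not from any real study:

```python
import math
import random

def higher_criticism(pvals):
    """Classic higher-criticism statistic: the maximum standardized
    deviation between the empirical and uniform p-value CDFs."""
    n = len(pvals)
    return max(
        math.sqrt(n) * (i / n - p) / math.sqrt(p * (1 - p))
        for i, p in enumerate(sorted(pvals), start=1)
        if 0.0 < p < 1.0
    )

rng = random.Random(7)
null_p = [rng.random() for _ in range(1000)]  # no signal: uniform p-values
signal_p = null_p[:-5] + [1e-6] * 5           # sparse, strong signal
print(higher_criticism(null_p) < higher_criticism(signal_p))  # True
```

The statistic is large precisely when a small fraction of p-values is far smaller than expected under the null, which matches the sparse-signal regime described in the abstract.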

  11. Using cognitive pre-testing methods in the development of a new evidenced-based pressure ulcer risk assessment instrument

    Directory of Open Access Journals (Sweden)

    S. Coleman

    2016-11-01

Full Text Available Abstract Background: Variation in the development methods of pressure ulcer risk assessment instruments has led to inconsistent inclusion of risk factors and concerns about content validity. A new evidence-based risk assessment instrument, the Pressure Ulcer Risk Primary Or Secondary Evaluation Tool (PURPOSE-T), was developed as part of a National Institute for Health Research (NIHR)-funded Pressure Ulcer Research Programme (PURPOSE: RP-PG-0407-10056). This paper reports the pre-test phase to assess and improve PURPOSE-T acceptability and usability and to confirm content validity. Methods: A descriptive study incorporating cognitive pre-testing methods and integration of service user views was undertaken over 3 cycles comprising PURPOSE-T training, a focus group and one-to-one think-aloud interviews. Clinical nurses from 2 acute and 2 community NHS Trusts were grouped according to job role. Focus group participants used 3 vignettes to complete PURPOSE-T assessments and then participated in the focus group. Think-aloud participants were interviewed during their completion of PURPOSE-T. After each pre-test cycle, analysis was undertaken and adjustments/improvements were made to PURPOSE-T in an iterative process. This incorporated descriptive statistics for data completeness and decision rule compliance, and directed content analysis for the interview and focus group data. Data were collected April 2012-June 2012. Results: Thirty-four nurses participated in the 3 pre-test cycles. Data from 3 focus groups and 12 think-aloud interviews, incorporating 101 PURPOSE-T assessments, led to changes that improved instrument content and design, flow and format, decision support and item-specific wording. Acceptability and usability were demonstrated by improved data completion and appropriate risk pathway allocation. The pre-test also confirmed content validity with clinical nurses. 
Conclusions: The pre-test was an important step in the development of the preliminary PURPOSE-T and the

  12. Planning and pre-testing: the key to effective AIDS education materials.

    Science.gov (United States)

    Ostfield, M L; Romocki, L S

    1991-06-01

The steps in designing and producing effective AIDS prevention educational materials are outlined, using as an example a brochure created in St. Lucia for clients at STD clinics. The brochure was intended to be read by clients as they waited for their consultation; it was therefore targeted to a specific audience delimited by age, sex, language, educational level, religion and associated medical or behavioral characteristics. When researching the audience, it is necessary to learn which medium they respond to best, what they already know, what their present behavior is, how they talk about AIDS, what terms they use, how they perceive the benefits of AIDS prevention behavior, and which sources of information they trust. The minimum number of key messages should be selected. Next, the most appropriate channel of communication is identified. Mass media are not always best for a target audience; "little media" such as flyers and give-aways may be better. The draft is then pre-tested through focus groups and interviews, querying the text, images, color, format and style separately. Listen to the way the respondents talk about the draft. Modify the draft and pre-test again. Fine-tune the implications of the message for realism in emotional responses, respect, self-esteem, admiration and trust. To achieve wide distribution, it is a good idea to involve community leaders in the production of the materials, so that they will be more likely to take part in the distribution process.

  13. The Use of Conditional Probability Integral Transformation Method for Testing Accelerated Failure Time Models

    Directory of Open Access Journals (Sweden)

    Abdalla Ahmed Abdel-Ghaly

    2016-06-01

Full Text Available This paper suggests the use of the conditional probability integral transformation (CPIT) method as a goodness-of-fit (GOF) technique in the field of accelerated life testing (ALT), specifically for validating the underlying distributional assumption in the accelerated failure time (AFT) model. The method is based on transforming the data into independent and identically distributed (i.i.d.) Uniform(0, 1) random variables and then applying the modified Watson statistic to test the uniformity of the transformed random variables. This technique is used to validate each of the exponential, Weibull and lognormal distributional assumptions in the AFT model under constant stress and complete sampling. The performance of the CPIT method is investigated via a simulation study. It is concluded that the method performs well in the case of the exponential and lognormal distributions. Finally, a real-life example is provided to illustrate the application of the proposed procedure.
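The core idea, transforming the data through the fitted CDF and testing the result for uniformity, can be sketched with the plain probability integral transform; the CPIT additionally conditions on estimated parameters, which this sketch omits, and a Kolmogorov-Smirnov distance stands in for the modified Watson statistic. The rates and sample size are arbitrary:

```python
import math
import random

def pit_exponential(xs, rate):
    """Probability integral transform: if xs ~ Exp(rate), then
    u = F(x) = 1 - exp(-rate * x) is Uniform(0, 1)."""
    return [1.0 - math.exp(-rate * x) for x in xs]

def ks_uniform(us):
    """One-sample Kolmogorov-Smirnov distance from Uniform(0, 1)."""
    us = sorted(us)
    n = len(us)
    return max(max((i + 1) / n - u, u - i / n) for i, u in enumerate(us))

rng = random.Random(42)
xs = [rng.expovariate(2.0) for _ in range(1000)]
d_good = ks_uniform(pit_exponential(xs, 2.0))  # correct model: small distance
d_bad = ks_uniform(pit_exponential(xs, 0.5))   # wrong rate: large distance
print(d_good < d_bad)  # True
```

A correct distributional assumption leaves the transformed sample close to uniform; a wrong one produces a visibly non-uniform sample that the uniformity test rejects.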

  14. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features of the new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  15. How to conduct External Quality Assessment Schemes for the pre-analytical phase?

    Science.gov (United States)

    Kristensen, Gunn B B; Aakre, Kristin Moberg; Kristoffersen, Ann Helen; Sandberg, Sverre

    2014-01-01

In laboratory medicine, several studies have described the most frequent errors in the different phases of the total testing process, and a large proportion of these errors occur in the pre-analytical phase. Schemes for the registration of errors and subsequent feedback to participants have been conducted for decades for the analytical phase by External Quality Assessment (EQA) organizations operating in most countries. The aim of this paper is to present an overview of different types of EQA schemes for the pre-analytical phase and to give examples of some existing schemes. So far, very few EQA organizations have focused on the pre-analytical phase, and most do not offer pre-analytical EQA schemes (EQAS). It is more difficult to perform and standardize pre-analytical EQAS, and accreditation bodies do not ask laboratories for results from such schemes. However, some ongoing EQA programs for the pre-analytical phase do exist, and some examples are given in this paper. The methods used can be divided into three types: collecting information about pre-analytical laboratory procedures, circulating real samples to collect information about interferences that might affect the measurement procedure, or registering actual laboratory errors and relating these to quality indicators. These three types have different focuses and different challenges regarding implementation, and a combination of the three is probably necessary to detect and monitor the wide range of errors occurring in the pre-analytical phase.

  16. Improving information extraction using a probability-based approach

    DEFF Research Database (Denmark)

    Kim, S.; Ahmed, Saeema; Wallace, K.

    2007-01-01

    Information plays a crucial role during the entire life-cycle of a product. It has been shown that engineers frequently consult colleagues to obtain the information they require to solve problems. However, the industrial world is now more transient and key personnel move to other companies...... or retire. It is becoming essential to retrieve vital information from archived product documents, if it is available. There is, therefore, great interest in ways of extracting relevant and sharable information from documents. A keyword-based search is commonly used, but studies have shown...... the recall, while maintaining the high precision, a learning approach that makes identification decisions based on a probability model, rather than simply looking up the presence of the pre-defined variations, looks promising. This paper presents the results of developing such a probability-based entity...

  17. Uniform approximation is more appropriate for Wilcoxon Rank-Sum Test in gene set analysis.

    Directory of Open Access Journals (Sweden)

    Zhide Fang

Full Text Available Gene set analysis is widely used to facilitate biological interpretation in analyses of differential expression from high-throughput profiling data. The Wilcoxon Rank-Sum (WRS) test is one of the commonly used methods in gene set enrichment analysis. It compares the ranks of genes in a gene set against those of genes outside the gene set. This method is easy to implement, and it eliminates the dichotomization of genes into significant and non-significant in a competitive hypothesis test. Due to the large number of genes being examined, it is impractical to calculate the exact null distribution for the WRS test, so the normal distribution is commonly used as an approximation. However, as we demonstrate in this paper, the normal approximation is problematic when a gene set with a relatively small number of genes is tested against the large number of genes in the complementary set. In this situation, a uniform approximation is substantially more powerful, more accurate, and less computationally intensive. We demonstrate the advantage of the uniform approximation in Gene Ontology (GO) term analysis using simulations and real data sets.
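The normal approximation at issue can be made concrete with a short sketch: a pure-Python rank-sum p-value using the normal null approximation, checked on random small gene sets. The gene counts, set size, and trial count below are arbitrary illustrations:

```python
import math
import random

def rank_sum_pvalue_normal(ranks, n_total):
    """Upper-tail p-value for a gene set's rank sum, via the usual normal
    approximation to the Wilcoxon rank-sum null distribution."""
    m, w = len(ranks), sum(ranks)
    mean = m * (n_total + 1) / 2.0
    var = m * (n_total - m) * (n_total + 1) / 12.0
    z = (w - mean) / math.sqrt(var)
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Null calibration check: draw many random 5-gene sets from a 10,000-gene
# ranking. If the approximation were exact, P(p < 0.01) would be 0.01;
# for small gene sets, the tail fraction drifts away from the nominal level.
rng = random.Random(1)
n, m, trials = 10_000, 5, 20_000
pvals = [rank_sum_pvalue_normal(rng.sample(range(1, n + 1), m), n)
         for _ in range(trials)]
print(sum(p < 0.01 for p in pvals) / trials)
```

For small m, the rank sum is essentially a sum of a few uniforms, whose bounded, light-tailed distribution is exactly what the paper's uniform approximation targets.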

  18. Study of oxide film formed in a pre cracked CT specimen of AISI 304L during a rising displacement test in 288 C water

    International Nuclear Information System (INIS)

    Diaz S, A.; Castano M, V.

    2007-01-01

A study of the oxide film formed inside pre-cracked CT specimens during a rising displacement test in high-temperature water (288 C) was performed. The environmental conditions used during the experiments were similar to those found in Boiling Water Reactors (BWRs): Normal Water Condition (NWC, 200 ppb O2) and Hydrogen Water Chemistry (HWC, 125 ppb H2). The oxide films formed were analyzed by scanning electron microscopy (SEM), energy dispersive spectroscopy (EDS) and X-ray diffraction (XRD). In both cases the oxide film consisted of two layers identified as magnetite. In the case of HWC, the results agree with previous reports that mention magnetite as the stable phase in reducing conditions. However, the stable phase in oxidizing conditions is hematite, and this work shows the presence of magnetite crystals in the narrow crack of the CT specimens in spite of the oxidizing environmental condition. This confirms that inside the pre-cracked CT specimens the environmental conditions differed from those of the oxidizing bulk: poor oxygen access and stagnant conditions within the narrow crack probably promoted a localized reducing environment that permitted magnetite formation. It is evident that crack growth studies should consider the conditions inside the crack, because they are significantly different. (Author)

  19. Introduction to probability and statistics for engineers and scientists

    CERN Document Server

    Ross, Sheldon M

    2009-01-01

This updated text provides a superior introduction to applied probability and statistics for engineering or science majors. Ross emphasizes the manner in which probability yields insight into statistical problems, ultimately resulting in an intuitive understanding of the statistical procedures most often used by practicing engineers and scientists. Real data sets are incorporated in a wide variety of exercises and examples throughout the book, and this emphasis on data motivates the probability coverage. As with the previous editions, Ross' text has tremendously clear exposition, plus real-data

  20. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  1. A set of X-ray test objects for quality control in television fluoroscopy

    International Nuclear Information System (INIS)

    Hay, G.A.; Clarke, O.F.; Coleman, N.J.; Cowen, A.R.

    1985-01-01

    The history of performance testing in Leeds of television fluoroscopic systems is briefly outlined. Using the visual, physical and technological requirements as a basis, a set of nine test objects for quality control in television fluoroscopy is described. The factors measured by the test objects are listed in the introduction; the test objects and their function are fully described in the remainder of the paper. The test objects, in conjunction with a television oscilloscope, give both subjective and objective information about the X-ray system. Three of the test objects enable the physicist or engineer to adjust certain aspects of the performance of the X-ray system. The set of nine test objects is available commercially. (author)

  2. Phenotypic T cell exhaustion in a murine model of bacterial infection in the setting of pre-existing malignancy.

    Directory of Open Access Journals (Sweden)

    Rohit Mittal

    Full Text Available While much of cancer immunology research has focused on anti-tumor immunity both systemically and within the tumor microenvironment, little is known about the impact of pre-existing malignancy on pathogen-specific immune responses. Here, we sought to characterize the antigen-specific CD8+ T cell response following a bacterial infection in the setting of pre-existing pancreatic adenocarcinoma. Mice with established subcutaneous pancreatic adenocarcinomas were infected with Listeria monocytogenes, and antigen-specific CD8+ T cell responses were compared to those in control mice without cancer. While the kinetics and magnitude of antigen-specific CD8+ T cell expansion and accumulation was comparable between the cancer and non-cancer groups, bacterial antigen-specific CD8+ T cells and total CD4+ and CD8+ T cells in cancer mice exhibited increased expression of the coinhibitory receptors BTLA, PD-1, and 2B4. Furthermore, increased inhibitory receptor expression was associated with reduced IFN-γ and increased IL-2 production by bacterial antigen-specific CD8+ T cells in the cancer group. Taken together, these data suggest that cancer's immune suppressive effects are not limited to the tumor microenvironment, but that pre-existing malignancy induces phenotypic exhaustion in T cells by increasing expression of coinhibitory receptors and may impair pathogen-specific CD8+ T cell functionality and differentiation.

  3. Detecting pre-diabetes and the role of the pharmacist

    Directory of Open Access Journals (Sweden)

    Simoens S

    2011-06-01

Full Text Available Objective: This study aims to use a pharmacoepidemiological approach to study the drug use of patients during the year prior to diabetes diagnosis (i.e. pre-diabetic patients) and of control patients. Drug use might reveal cardiovascular, metabolic and/or endocrinological changes and help to identify indicators for active monitoring of Type 2 diabetes mellitus. Methods: A retrospective case-control study compared the drug use of patients with a future diagnosis of diabetes (experimental patients) with that of patients without a diabetes diagnosis (control patients), based on community pharmacy records. An experimental patient had used oral hypoglycaemic drugs during 2005 or 2006. Experimental and control patients were matched in terms of age, gender and quarter of index date. Drugs were selected based on possible co-morbidities of diabetes. Drug use was expressed as a binary variable, indicating whether or not a patient took specific drugs. Drug use of experimental patients during the year prior to diagnosis was compared with that of control patients using the chi-squared test. Results: Our dataset covered 5,064 patients (1,688 experimental and 3,376 control patients). A higher probability of taking cardiovascular drugs was observed for specific subgroups of patients with pre-diabetes as compared to control patients: this trend was observed for men as well as for women, for various cardiovascular drug classes, and for different age groups (p<0.05), although it was not always statistically significant for the 29-38 age group. For each selected age and gender group, patients with pre-diabetes had a higher probability of taking a combination of a lipid-modifying agent and an antihypertensive drug than control patients (p<0.005). Conclusions: Using community pharmacy data, this study demonstrated that age and a characteristic drug use pattern could contribute to detecting pre-diabetes. 
There is a potential role for community pharmacists to follow up drug indicators of patients
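The group comparisons in this record rest on the chi-squared test applied to 2x2 tables of drug use by group. A minimal sketch, with invented counts that only mirror the reported group sizes (1,688 experimental vs. 3,376 control), is:

```python
import math

def chi2_2x2(a, b, c, d):
    """Pearson chi-squared statistic (1 df, no continuity correction) for
    the 2x2 table [[a, b], [c, d]], with the upper-tail p-value from the
    chi-squared(1) distribution."""
    n = a + b + c + d
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(stat / 2.0))  # P(chi2_1 > stat)
    return stat, p

# Rows: pre-diabetic vs. control group; columns: takes the drug combination
# or not. These counts are hypothetical, not the study's data.
stat, p = chi2_2x2(300, 1388, 350, 3026)
print(round(stat, 1), p < 0.005)
```

With a higher exposure proportion among the pre-diabetic group, the statistic is large and the p-value small, mirroring the pattern of significant results the abstract reports.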

  4. Dilution testing using rapid diagnostic tests in a HIV diagnostic algorithm: a novel alternative for confirmation testing in resource limited settings.

    Science.gov (United States)

    Shanks, Leslie; Siddiqui, M Ruby; Abebe, Almaz; Piriou, Erwan; Pearce, Neil; Ariti, Cono; Masiga, Johnson; Muluneh, Libsework; Wazome, Joseph; Ritmeijer, Koert; Klarkowski, Derryck

    2015-05-14

    Current WHO testing guidelines for resource-limited settings diagnose HIV on the basis of screening tests without a confirmation test, due to cost constraints. This leads to a potential risk of false positive HIV diagnosis. In this paper, we evaluate the dilution test, a novel method for confirmation testing which is simple, rapid and low cost. The principle of the dilution test is to alter the sensitivity of a rapid diagnostic test (RDT) by dilution of the sample, in order to screen out the cross-reacting antibodies responsible for falsely positive RDT results. Participants were recruited from two testing centres in Ethiopia where a tiebreaker algorithm using 3 different RDTs in series is used to diagnose HIV. All samples positive on the initial screening RDT and every 10th negative sample underwent testing with the gold standard and the dilution test. Dilution testing was performed using the Determine™ rapid diagnostic test at 6 different dilutions. Results were compared to the gold standard of Western Blot; where Western Blot was indeterminate, PCR testing determined the final result. 2895 samples were recruited to the study. 247 were positive, for a prevalence of 8.5% (247/2895). A total of 495 samples underwent dilution testing. The RDT diagnostic algorithm misclassified 18 samples as positive. Dilution at the level of 1/160 was able to correctly identify all 18 false positives, but at the cost of a single false negative result (sensitivity 99.6%, 95% CI 97.8-100; specificity 100%, 95% CI 98.5-100). Concordance between the gold standard and the 1/160 dilution strength was 99.8%. This study provides proof of concept for a new, low cost method of confirming HIV diagnosis in resource-limited settings. It has potential for use as a supplementary test in a confirmatory algorithm, whereby double positive RDT results undergo dilution testing, with positive results confirming HIV infection. Negative results require nucleic acid testing to rule out false negatives.
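The sensitivity and specificity figures quoted above come from a confusion matrix against the Western Blot/PCR gold standard. A minimal sketch of that calculation with a Wilson score interval, using hypothetical counts chosen only to mirror the reported 99.6% sensitivity:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion (95% for z=1.96)."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half

# Hypothetical confusion-matrix counts in the spirit of the study:
# 246 of 247 true positives confirmed (one false negative) and all
# 248 sampled true negatives correctly ruled out.
tp, fn, tn, fp = 246, 1, 248, 0
sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
lo, hi = wilson_ci(tp, tp + fn)
print(f"sensitivity {sensitivity:.1%} (95% CI {lo:.3f}-{hi:.3f}), "
      f"specificity {specificity:.1%}")
```

The study itself reports exact binomial intervals; the Wilson interval is shown here only as one common choice.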

  5. HMM-ModE – Improved classification using profile hidden Markov models by optimising the discrimination threshold and modifying emission probabilities with negative training sequences

    Directory of Open Access Journals (Sweden)

    Nandi Soumyadeep

    2007-03-01

    Full Text Available Abstract Background Profile Hidden Markov Models (HMMs) are statistical representations of protein families derived from patterns of sequence conservation in multiple alignments and have been used in identifying remote homologues with considerable success. These conservation patterns arise from fold-specific signals, shared across multiple families, and function-specific signals unique to the families. The availability of sequences pre-classified according to their function permits the use of negative training sequences to improve the specificity of the HMM, both by optimizing the threshold cutoff and by modifying emission probabilities to minimize the influence of fold-specific signals. A protocol to generate family-specific HMMs is described that first constructs a profile HMM from an alignment of the family's sequences and then uses this model to identify sequences belonging to other classes that score above the default threshold (false positives). Ten-fold cross validation is used to optimise the discrimination threshold score for the model. The advent of fast multiple alignment methods enables the use of the profile alignments to align the true and false positive sequences, and the resulting alignments are used to modify the emission probabilities in the original model. Results The protocol, called HMM-ModE, was validated on a set of sequences belonging to six sub-families of the AGC family of kinases. These sequences have an average sequence similarity of 63% among the group, though each sub-group has a different substrate specificity. The optimisation of the discrimination threshold, by scoring negative sequences against the model, improves specificity in test cases from an average of 21% to 98%. Further discrimination by the HMM after modifying model probabilities using negative training sequences is provided in a few cases, the average specificity rising to 99%. Similar improvements were obtained with a sample of G-Protein coupled receptors.
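The cross-validated threshold optimisation described above amounts to scanning candidate cutoffs over the scores of positive and negative training sequences. A toy sketch (scores are invented; HMM-ModE itself derives them from profile HMMs) that picks the cutoff maximizing the Matthews correlation coefficient:

```python
import math

def best_threshold(pos_scores, neg_scores):
    """Scan candidate cutoffs and return the one maximizing the
    Matthews correlation coefficient between the two score sets."""
    best, best_mcc = None, -2.0
    for t in sorted(set(pos_scores) | set(neg_scores)):
        tp = sum(s >= t for s in pos_scores)
        fn = len(pos_scores) - tp
        fp = sum(s >= t for s in neg_scores)
        tn = len(neg_scores) - fp
        denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
        mcc = (tp * tn - fp * fn) / denom if denom else 0.0
        if mcc > best_mcc:
            best, best_mcc = t, mcc
    return best

# Toy bit scores for true-family (pos) and negative training sequences.
pos = [52.1, 48.7, 55.3, 60.2, 47.9]
neg = [30.5, 41.2, 44.8, 28.0, 39.9]
cutoff = best_threshold(pos, neg)
print(cutoff)  # a cutoff that separates the two toy sets perfectly
```

In the real protocol the scan is repeated across ten cross-validation folds; this sketch shows a single fold.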

  6. Single Trial Probability Applications: Can Subjectivity Evade Frequency Limitations?

    Directory of Open Access Journals (Sweden)

    David Howden

    2009-10-01

    Full Text Available Frequency probability theorists define an event’s probability distribution as the limit of a repeated set of trials belonging to a homogeneous collective. The subsets of this collective are events which we have deficient knowledge about on an individual level, although for the larger collective we have knowledge of its aggregate behavior. Hence, probabilities can only be achieved through repeated trials of these subsets, arriving at the established frequencies that define the probabilities. Crovelli (2009) argues that this is a mistaken approach, and that a subjective assessment of individual trials should be used instead. Bifurcating between the two concepts of risk and uncertainty, Crovelli first asserts that probability is the tool used to manage uncertain situations, and then attempts to rebuild a definition of probability theory with this in mind. We show that such an attempt has little to gain, and results in an indeterminate application of entrepreneurial forecasting to uncertain decisions—a process far-removed from any application of probability theory.

  7. Assigning probability gain for precursors of four large Chinese earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Cao, T.; Aki, K.

    1983-03-10

    We extend the concept of probability gain associated with a precursor (Aki, 1981) to a set of precursors which may be mutually dependent. Making use of a new formula, we derive a criterion for selecting precursors from a given data set in order to calculate the probability gain. The probabilities per unit time immediately before four large Chinese earthquakes are calculated. They are approximately 0.09, 0.09, 0.07 and 0.08 per day for 1975 Haicheng (M = 7.3), 1976 Tangshan (M = 7.8), 1976 Longling (M = 7.6), and Songpan (M = 7.2) earthquakes, respectively. These results are encouraging because they suggest that the investigated precursory phenomena may have included the complete information for earthquake prediction, at least for the above earthquakes. With this method, the step-by-step approach to prediction used in China may be quantified in terms of the probability of earthquake occurrence. The ln P versus t curve (where P is the probability of earthquake occurrence at time t) shows that ln P does not increase with t linearly but more rapidly as the time of earthquake approaches.
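Under Aki's (1981) framework, if precursors could be treated as mutually independent their probability gains would simply multiply the background rate; the paper's contribution is handling dependent precursors, which this simplified sketch does not attempt:

```python
def combined_probability(base_rate, gains):
    """Probability of occurrence per unit time after several precursors,
    under the simplifying assumption that the precursors are mutually
    independent so their probability gains multiply (capped at 1)."""
    p = base_rate
    for g in gains:
        p *= g
    return min(p, 1.0)

# Illustrative numbers only: a background rate of 1e-4 events/day and
# three precursors with individual probability gains of 30, 10 and 3
# yield roughly the ~0.09/day figures quoted in the abstract.
p = combined_probability(1e-4, [30, 10, 3])
print(p)  # ~0.09 per day
```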

  8. The exact probability law for the approximated similarity from the ...

    African Journals Online (AJOL)

    The exact probability law for the approximated similarity from the Minhashing method. Soumaila Dembele, Gane Samb Lo. Abstract. We propose a probabilistic setting in which we study the probability law of the Rajaraman and Ullman RU algorithm and a modified version of it denoted by RUM. These algorithms aim at ...
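The min-hashing setting studied here rests on the fact that two sets' min-hash values collide with probability equal to their Jaccard similarity, so the fraction of agreeing signature slots estimates it. A self-contained sketch (random affine hash functions; all parameters are illustrative, not from the paper):

```python
import random

P = 2_147_483_647  # a Mersenne prime for the affine hash functions

def minhash_signature(s, hash_funcs):
    """Signature: the minimum hash value of the set under each function."""
    return [min(h(x) for x in s) for h in hash_funcs]

def estimated_jaccard(sig_a, sig_b):
    """Fraction of agreeing slots; each slot collides with probability
    equal to the Jaccard similarity of the underlying sets."""
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

random.seed(0)
hash_funcs = [
    (lambda a, b: (lambda x: (a * x + b) % P))(
        random.randrange(1, P), random.randrange(P))
    for _ in range(200)
]

A = set(range(0, 60))
B = set(range(30, 90))  # true Jaccard = 30/90 = 1/3
est = estimated_jaccard(minhash_signature(A, hash_funcs),
                        minhash_signature(B, hash_funcs))
print(est)  # close to 1/3
```

The paper's subject is precisely the exact law of this estimator; the sketch only demonstrates the estimator itself.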

  9. Crash probability estimation via quantifying driver hazard perception.

    Science.gov (United States)

    Li, Yang; Zheng, Yang; Wang, Jianqiang; Kodaka, Kenji; Li, Keqiang

    2018-07-01

    Crash probability estimation is an important method to predict the potential reduction of crash probability contributed by forward collision avoidance technologies (FCATs). In this study, we propose a practical approach to estimate crash probability, which combines a field operational test and numerical simulations of a typical rear-end crash model. To consider driver hazard perception characteristics, we define a novel hazard perception measure, called as driver risk response time, by considering both time-to-collision (TTC) and driver braking response to impending collision risk in a near-crash scenario. Also, we establish a driving database under mixed Chinese traffic conditions based on a CMBS (Collision Mitigation Braking Systems)-equipped vehicle. Applying the crash probability estimation in this database, we estimate the potential decrease in crash probability owing to use of CMBS. A comparison of the results with CMBS on and off shows a 13.7% reduction of crash probability in a typical rear-end near-crash scenario with a one-second delay of driver's braking response. These results indicate that CMBS is effective in collision prevention, especially in the case of inattentive drivers or older drivers. The proposed crash probability estimation offers a practical way for evaluating the safety benefits in the design and testing of FCATs. Copyright © 2017 Elsevier Ltd. All rights reserved.
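Time-to-collision, one ingredient of the paper's driver-risk-response-time measure, is the gap divided by the closing speed. A minimal sketch with invented numbers (the paper's full measure also folds in the driver's braking response, omitted here):

```python
def time_to_collision(gap_m, v_follow, v_lead):
    """Time-to-collision in seconds; infinite when the gap is opening."""
    closing = v_follow - v_lead
    return gap_m / closing if closing > 0 else float("inf")

# Hypothetical near-crash: 25 m gap, follower at 20 m/s, leader at 10 m/s.
ttc = time_to_collision(25.0, 20.0, 10.0)
print(ttc)  # 2.5 s to impact if neither vehicle changes speed
```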

  10. Optimal selection for BRCA1 and BRCA2 mutation testing using a combination of 'easy to apply' probability models

    NARCIS (Netherlands)

    Bodmer, D.; Ligtenberg, M. J. L.; van der Hout, A. H.; Gloudemans, S.; Ansink, K.; Oosterwijk, J. C.; Hoogerbrugge, N.

    2006-01-01

    To establish an efficient, reliable and easy to apply risk assessment tool to select families with breast and/or ovarian cancer patients for BRCA mutation testing, using available probability models. In a retrospective study of 263 families with breast and/or ovarian cancer patients, the utility of

  11. An assessment of time involved in pre-test case review and counseling for a whole genome sequencing clinical research program.

    Science.gov (United States)

    Williams, Janet L; Faucett, W Andrew; Smith-Packard, Bethanny; Wagner, Monisa; Williams, Marc S

    2014-08-01

    Whole genome sequencing (WGS) is being used for evaluation of individuals with undiagnosed disease of suspected genetic origin. Implementing WGS into clinical practice will place an increased burden upon care teams with regard to pre-test patient education and counseling about results. To quantitate the time needed for appropriate pre-test evaluation of participants in WGS testing, we documented the time spent by our clinical research group on various activities related to program preparation, participant screening, and consent prior to WGS. Participants were children or young adults with autism, intellectual or developmental disability, and/or congenital anomalies, who have remained undiagnosed despite previous evaluation, and their biologic parents. Results showed that significant time was spent in securing allocation of clinical research space to counsel participants and families, and in acquisition and review of participant's medical records. Pre-enrollment chart review identified two individuals with existing diagnoses resulting in savings of $30,000 for the genome sequencing alone, as well as saving hours of personnel time for genome interpretation and communication of WGS results. New WGS programs should plan for costs associated with additional pre-test administrative planning and patient evaluation time that will be required to provide high quality care.

  12. A method for the calculation of the cumulative failure probability distribution of complex repairable systems

    International Nuclear Information System (INIS)

    Caldarola, L.

    1976-01-01

    A method is proposed for the analytical evaluation of the cumulative failure probability distribution of complex repairable systems. The method is based on a set of integral equations each one referring to a specific minimal cut set of the system. Each integral equation links the unavailability of a minimal cut set to its failure probability density distribution and to the probability that the minimal cut set is down at the time t under the condition that it was down at time t'(t'<=t). The limitations for the applicability of the method are also discussed. It has been concluded that the method is applicable if the process describing the failure of a minimal cut set is a 'delayed semi-regenerative process'. (Auth.)
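As background to the integral-equation method above: in the simplest static case with independent components, a minimal cut set is unavailable only when all of its components are down, and cut-set unavailabilities combine as below. This sketch is the elementary approximation, not Caldarola's time-dependent treatment:

```python
from functools import reduce

def cutset_unavailability(component_q):
    """A minimal cut set is down only when all of its (assumed
    independent) components are down simultaneously."""
    return reduce(lambda a, b: a * b, component_q, 1.0)

def system_unavailability(cut_sets):
    """Combine cut sets as if independent: Q = 1 - prod(1 - Q_k).
    An upper-bound approximation when cut sets share components."""
    q_up = 1.0
    for cs in cut_sets:
        q_up *= 1.0 - cutset_unavailability(cs)
    return 1.0 - q_up

# Two hypothetical minimal cut sets with component unavailabilities.
cuts = [[1e-2, 5e-3], [2e-3]]
q_sys = system_unavailability(cuts)
print(q_sys)  # ~2.05e-3
```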

  13. Posterior Probability Matching and Human Perceptual Decision Making.

    Directory of Open Access Journals (Sweden)

    Richard F Murray

    2015-06-01

    Full Text Available Probability matching is a classic theory of decision making that was first developed in models of cognition. Posterior probability matching, a variant in which observers match their response probabilities to the posterior probability of each response being correct, is being used increasingly often in models of perception. However, little is known about whether posterior probability matching is consistent with the vast literature on vision and hearing that has developed within signal detection theory. Here we test posterior probability matching models using two tools from detection theory. First, we examine the models' performance in a two-pass experiment, where each block of trials is presented twice, and we measure the proportion of times that the model gives the same response twice to repeated stimuli. We show that at low performance levels, posterior probability matching models give highly inconsistent responses across repeated presentations of identical trials. We find that practised human observers are more consistent across repeated trials than these models predict, and we find some evidence that less practised observers are more consistent as well. Second, we compare the performance of posterior probability matching models on a discrimination task to the performance of a theoretical ideal observer that achieves the best possible performance. We find that posterior probability matching is very inefficient at low-to-moderate performance levels, and that human observers can be more efficient than is ever possible according to posterior probability matching models. These findings support classic signal detection models, and rule out a broad class of posterior probability matching models for expert performance on perceptual tasks that range in complexity from contrast discrimination to symmetry detection. 
However, our findings leave open the possibility that inexperienced observers may show posterior probability matching behaviour, and our methods
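The two-pass consistency argument can be illustrated numerically: a posterior-probability-matching observer facing threshold-level trials (posterior near 0.5) agrees with itself on only about half of the repeated trials, whereas a deterministic maximum-posterior observer would agree on all of them. A simulation sketch:

```python
import random

def matching_response(posterior_a, rng):
    """Posterior-probability-matching observer: respond 'A' with
    probability equal to the posterior that 'A' is correct."""
    return 'A' if rng.random() < posterior_a else 'B'

def two_pass_consistency(posteriors, rng):
    """Fraction of trials answered identically on two repeated passes."""
    first = [matching_response(p, rng) for p in posteriors]
    second = [matching_response(p, rng) for p in posteriors]
    return sum(a == b for a, b in zip(first, second)) / len(posteriors)

rng = random.Random(42)
# Threshold-level trials: with posterior 0.5, expected agreement is
# p^2 + (1 - p)^2 = 0.5, far below a deterministic observer's 1.0.
consistency = two_pass_consistency([0.5] * 10_000, rng)
print(consistency)  # ~0.5
```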

  14. Integer Set Compression and Statistical Modeling

    DEFF Research Database (Denmark)

    Larsson, N. Jesper

    2014-01-01

    Compression of integer sets and sequences has been extensively studied for settings where elements follow a uniform probability distribution. In addition, methods exist that exploit clustering of elements in order to achieve higher compression performance. In this work, we address the case where enumeration of elements may be arbitrary or random, but where statistics is kept in order to estimate probabilities of elements. We present a recursive subset-size encoding method that is able to benefit from statistics, and explore the effects of permuting the enumeration order based on element probabilities…
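One standard ingredient of such set compression is enumerative coding: a k-subset of an n-element universe can be represented by its rank among the C(n, k) possibilities, which needs about log2 C(n, k) bits. A sketch of generic enumerative (lexicographic) ranking, not the paper's recursive subset-size method:

```python
from math import comb, log2

def subset_rank(n, subset):
    """Lexicographic rank of a k-subset of range(n): an integer in
    [0, C(n, k)), so about log2(C(n, k)) bits identify the subset."""
    rank, prev, k = 0, -1, len(subset)
    for i, x in enumerate(sorted(subset)):
        for y in range(prev + 1, x):
            # subsets that agree so far but pick the smaller y here
            rank += comb(n - y - 1, k - i - 1)
        prev = x
    return rank

s = [1, 4, 5]
r = subset_rank(8, s)
print(r, comb(8, 3), round(log2(comb(8, 3)), 2))  # 30 56 5.81
```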

  15. Introduction to probability and statistics for science, engineering, and finance

    CERN Document Server

    Rosenkrantz, Walter A

    2008-01-01

    Data Analysis Orientation The Role and Scope of Statistics in Science and Engineering Types of Data: Examples from Engineering, Public Health, and Finance The Frequency Distribution of a Variable Defined on a Population Quantiles of a Distribution Measures of Location (Central Value) and Variability Covariance, Correlation, and Regression: Computing a Stock's Beta Mathematical Details and Derivations Large Data Sets Probability Theory Orientation Sample Space, Events, Axioms of Probability Theory Mathematical Models of Random Sampling Conditional Probability and Baye

  16. Pre-test analysis of ATLAS SBO with RCP seal leakage scenario using MARS code

    Energy Technology Data Exchange (ETDEWEB)

    Pham, Quang Huy; Lee, Sang Young; Oh, Seung Jong [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-10-15

    This study presents a pre-test calculation for the Advanced Thermal-hydraulic Test Loop for Accident Simulation (ATLAS) SBO experiment with RCP seal leakage scenario. Initially, turbine-driven auxiliary feedwater pumps are used. Then, outside cooling water injection method is used for long term cooling. The analysis results would be useful for conducting the experiment to verify the APR 1400 extended SBO optimum mitigation strategy using outside cooling water injection in the future. The pre-test calculation for ATLAS extended SBO with RCP seal leakage and outside cooling water injection scenario is performed. After the Fukushima nuclear accident, the capability of coping with an extended station blackout (SBO) becomes important. Many NPPs are applying the FLEX approach as the main coping strategy for extended SBO scenarios. In FLEX strategies, outside cooling water injection to the reactor cooling system (RCS) and steam generators (SGs) is considered as an effective method to remove residual heat and maintain the inventory of the systems during the accident. It is worthwhile to examine the soundness of the outside cooling water injection method for extended SBO mitigation by both calculation and experimental demonstration. From the calculation results, outside cooling water injection into the RCS and SGs is verified as an effective method during extended SBO when RCS and SG depressurization is sufficiently performed.

  17. Evaluation of the separate effects tests (SET) validation matrix

    International Nuclear Information System (INIS)

    1996-11-01

    This work is the result of a one year extended mandate which has been given by the CSNI on the request of the PWG 2 and the Task Group on Thermal Hydraulic System Behaviour (TG THSB) in late 1994. The aim was to evaluate the SET validation matrix in order to define the real needs for further experimental work. The statistical evaluation tables of the SET matrix provide an overview of the data base including the parameter ranges covered for each phenomenon and selected parameters, and questions posed to obtain answers concerning the need for additional experimental data with regard to the objective of nuclear power plant safety. A global view of the data base is first presented, focussing on areas lacking in data and on hot topics. A new systematic evaluation has been done based on the authors' technical judgments, giving evaluation tables. In these tables, global and indicative information are included. Four main parameters have been chosen as the most important and relevant parameters: a state parameter given by the operating pressure of the tests, a flow parameter expressed as mass flux, mass flow rate or volumetric flow rate in the tests, a geometrical parameter provided through a typical dimension expressed by a diameter, an equivalent diameter (hydraulic or heated) or a cross sectional area of the test sections, and an energy or heat transfer parameter given as the fluid temperature, the heat flux or the heat transfer surface temperature of the tests.

  18. Sequential Probability Ratio Test for Collision Avoidance Maneuver Decisions Based on a Bank of Norm-Inequality-Constrained Epoch-State Filters

    Science.gov (United States)

    Carpenter, J. R.; Markley, F. L.; Alfriend, K. T.; Wright, C.; Arcido, J.

    2011-01-01

    Sequential probability ratio tests explicitly allow decision makers to incorporate false alarm and missed detection risks, and are potentially less sensitive to modeling errors than a procedure that relies solely on a probability of collision threshold. Recent work on constrained Kalman filtering has suggested an approach to formulating such a test for collision avoidance maneuver decisions: a filter bank with two norm-inequality-constrained epoch-state extended Kalman filters. One filter models the null hypothesis that the miss distance is inside the combined hard body radius at the predicted time of closest approach, and one filter models the alternative hypothesis. The epoch-state filter developed for this method explicitly accounts for any process noise present in the system. The method appears to work well using a realistic example based on an upcoming highly-elliptical orbit formation flying mission.
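A plain Wald SPRT, the building block behind the maneuver-decision test described above, accumulates log-likelihood ratios until a boundary set by the false-alarm rate alpha or missed-detection rate beta is crossed. A generic sketch (not the paper's filter-bank formulation; all numbers are toy values):

```python
import math

def sprt_decision(llr_increments, alpha=0.01, beta=0.01):
    """Wald sequential probability ratio test: accumulate per-observation
    log-likelihood ratios (alternative vs. null) until a boundary is
    crossed, with thresholds set by the two error rates."""
    upper = math.log((1 - beta) / alpha)  # accept the alternative
    lower = math.log(beta / (1 - alpha))  # accept the null
    llr = 0.0
    for i, inc in enumerate(llr_increments, start=1):
        llr += inc
        if llr >= upper:
            return "alternative", i
        if llr <= lower:
            return "null", i
    return "undecided", len(llr_increments)

# Toy stream in which each observation adds LLR +0.8 (true alternative).
decision, n = sprt_decision([0.8] * 20)
print(decision, n)  # alternative 6
```

The appeal, as the abstract notes, is that alpha and beta enter the decision rule explicitly rather than through a single collision-probability threshold.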

  19. Experimental data report for Test TS-1 Reactivity Initiated Accident Test in NSRR with pre-irradiated BWR fuel rod

    International Nuclear Information System (INIS)

    Nakamura, Takehiko; Yoshinaga, Makio; Sobajima, Makoto; Fujishiro, Toshio; Horiki, Ohichiro; Yamahara, Takeshi; Ichihashi, Yoshinori; Kikuchi, Teruo

    1992-01-01

    This report presents experimental data for Test TS-1 which was the first in a series of tests, simulating Reactivity Initiated Accident (RIA) conditions using pre-irradiated BWR fuel rods, performed in the Nuclear Safety Research Reactor (NSRR) in October, 1989. Test fuel rod used in the Test TS-1 was a short-sized BWR (7 x 7) type rod which was fabricated from a commercial rod provided from Tsuruga Unit 1 power reactor. The fuel had an initial enrichment of 2.79 % and burnup of 21.3 GWd/t (bundle average). Pulse irradiation was performed at a condition of stagnant water cooling, atmospheric pressure and ambient temperature using a newly developed double container-type capsule. Energy deposition of the rod in this test was evaluated to be about 61 cal/g·fuel (55 cal/g·fuel in peak fuel enthalpy) and no fuel failure was observed. Descriptions on test conditions, test procedures, fuel burnup measurements, transient behavior of the test rod during pulse irradiation and results of post pulse irradiation examinations are contained in this report. (author)

  20. Absolute transition probabilities for 559 strong lines of neutral cerium

    Energy Technology Data Exchange (ETDEWEB)

    Curry, J J, E-mail: jjcurry@nist.go [National Institute of Standards and Technology, Gaithersburg, MD 20899-8422 (United States)

    2009-07-07

    Absolute radiative transition probabilities are reported for 559 strong lines of neutral cerium covering the wavelength range 340-880 nm. These transition probabilities are obtained by scaling published relative line intensities (Meggers et al 1975 Tables of Spectral Line Intensities (National Bureau of Standards Monograph 145)) with a smaller set of published absolute transition probabilities (Bisson et al 1991 J. Opt. Soc. Am. B 8 1545). All 559 new values are for lines for which transition probabilities have not previously been available. The estimated relative random uncertainty of the new data is ±35% for nearly all lines.
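The scaling step described above, converting relative line intensities to absolute transition probabilities, can be sketched as a one-parameter least-squares fit over the lines present in both data sets (the numbers below are invented, not cerium data):

```python
def scale_factor(rel_intensities, abs_values):
    """Least-squares scale c minimizing sum (A_i - c * I_i)^2 over lines
    with both a relative intensity I and an absolute value A:
    c = sum(A * I) / sum(I * I)."""
    num = sum(a * i for a, i in zip(abs_values, rel_intensities))
    den = sum(i * i for i in rel_intensities)
    return num / den

# Invented overlap set: relative intensities and absolute transition
# probabilities (s^-1) that happen to satisfy A = 20000 * I exactly.
I = [120.0, 300.0, 80.0]
A = [2.4e6, 6.0e6, 1.6e6]
c = scale_factor(I, A)
print(c)  # 20000.0
```

The actual calibration must also account for level populations and wavelength response, which this sketch ignores.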

  1. Probability densities and the random variable transformation theorem

    International Nuclear Information System (INIS)

    Ramshaw, J.D.

    1985-01-01

    D. T. Gillespie recently derived a random variable transformation theorem relating to the joint probability densities of functionally dependent sets of random variables. The present author points out that the theorem can be derived as an immediate corollary of a simpler and more fundamental relation. In this relation the probability density is represented as a delta function averaged over an unspecified distribution of unspecified internal random variables. The random variable transformation is derived from this relation
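The delta-function representation underlying the random variable transformation theorem, p_Y(y) = ∫ δ(y − f(x)) p_X(x) dx, can be checked numerically: averaging the indicator of f(X) falling in a bin estimates the transformed probability mass in that bin. A Monte Carlo sketch for Y = X² with X uniform on (0, 1):

```python
import random

def bin_probability_mc(f, sample_x, y_lo, y_hi, n=200_000, seed=1):
    """Monte Carlo estimate of P(y_lo <= f(X) < y_hi): the delta
    function averaged over x and integrated over the bin."""
    rng = random.Random(seed)
    hits = sum(y_lo <= f(sample_x(rng)) < y_hi for _ in range(n))
    return hits / n

# Y = X^2 with X ~ Uniform(0, 1): p_Y(y) = 1 / (2 * sqrt(y)), hence
# P(0.25 <= Y < 0.36) = sqrt(0.36) - sqrt(0.25) = 0.1 exactly.
est = bin_probability_mc(lambda x: x * x, lambda r: r.random(), 0.25, 0.36)
print(est)  # ~0.1
```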

  2. New nuclear data set ABBN-90 and its testing on macroscopic experiments

    International Nuclear Information System (INIS)

    Kosh'cheev, V.N.; Manturov, G.N.; Nikolaev, M.N.; Rineyskiy, A.A.; Sinitsa, V.V.; Tsyboolya, A.M.; Zabrodskaya, S.V.

    1993-01-01

    The new group constant set ABBN-90 has now been developed. It is based on the FOND-2 evaluated neutron data library processed with the code GRUCON. Some results of testing the ABBN-90 set in different macroscopic experiments are presented. (author)

  3. An Estimation of a Passive Infra-Red Sensor Probability of Detection

    International Nuclear Information System (INIS)

    Osman, E.A.; El-Gazar, M.I.; Shaat, M.K.; El-Kafas, A.A.; Zidan, W.I.; Wadoud, A.A.

    2009-01-01

    Passive Infra-Red (PIR) sensors are one of many detection sensor types used to detect intrusion at nuclear sites. In this work, an estimation of a PIR sensor's probability of detection at a hypothetical facility is presented. Sensor performance testing is performed to determine whether a particular sensor will be acceptable in a proposed design. We have access to a sensor test field in which the sensor of interest is already properly installed and the parameters have been set to optimal levels by preliminary testing. The PIR sensor construction, operation and design for the investigated nuclear site are explained. Walking and running intrusion tests were carried out inside the field areas of the PIR sensor to evaluate the sensor performance during the intrusion process. Ten trials were performed experimentally for the intrusion process via a passive infra-red sensor network system. The performance and intrusion responses of the PIR sensors inside the internal zones were recorded and evaluated.
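With only ten intrusion trials, as in the study above, any detection-probability estimate is coarse; if every trial is detected, the "rule of three" gives an approximate 95% upper bound of 3/n on the miss probability. A sketch (this reading of the trials is an illustration, not the authors' analysis):

```python
def detection_probability(detections, trials):
    """Point estimate of the probability of detection, plus the
    'rule of three' ~95% upper bound on the miss probability when
    every trial was detected (None otherwise)."""
    p_hat = detections / trials
    miss_upper = 3.0 / trials if detections == trials else None
    return p_hat, miss_upper

# Ten intrusion trials, all detected (illustrative numbers).
p_hat, miss_upper = detection_probability(10, 10)
print(p_hat, miss_upper)  # 1.0 0.3 -> i.e. Pd > 0.7 at ~95% confidence
```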

  4. Identifying genetic marker sets associated with phenotypes via an efficient adaptive score test

    KAUST Repository

    Cai, T.

    2012-06-25

    In recent years, genome-wide association studies (GWAS) and gene-expression profiling have generated a large number of valuable datasets for assessing how genetic variations are related to disease outcomes. With such datasets, it is often of interest to assess the overall effect of a set of genetic markers, assembled based on biological knowledge. Genetic marker-set analyses have been advocated as more reliable and powerful approaches compared with the traditional marginal approaches (Curtis and others, 2005. Pathways to the analysis of microarray data. TRENDS in Biotechnology 23, 429-435; Efroni and others, 2007. Identification of key processes underlying cancer phenotypes using biologic pathway analysis. PLoS One 2, 425). Procedures for testing the overall effect of a marker-set have been actively studied in recent years. For example, score tests derived under an Empirical Bayes (EB) framework (Liu and others, 2007. Semiparametric regression of multidimensional genetic pathway data: least-squares kernel machines and linear mixed models. Biometrics 63, 1079-1088; Liu and others, 2008. Estimation and testing for the effect of a genetic pathway on a disease outcome using logistic kernel machine regression via logistic mixed models. BMC bioinformatics 9, 292-2; Wu and others, 2010. Powerful SNP-set analysis for case-control genome-wide association studies. American Journal of Human Genetics 86, 929) have been proposed as powerful alternatives to the standard Rao score test (Rao, 1948. Large sample tests of statistical hypotheses concerning several parameters with applications to problems of estimation. Mathematical Proceedings of the Cambridge Philosophical Society, 44, 50-57). The advantages of these EB-based tests are most apparent when the markers are correlated, due to the reduction in the degrees of freedom. In this paper, we propose an adaptive score test which up- or down-weights the contributions from each member of the marker-set based on the Z-scores of

  5. Comparing the IRT Pre-equating and Section Pre-equating: A Simulation Study.

    Science.gov (United States)

    Hwang, Chi-en; Cleary, T. Anne

    The results obtained from two basic types of pre-equating of tests were compared: item response theory (IRT) pre-equating and section pre-equating (SPE). The simulated data were generated from a modified three-parameter logistic model with a constant guessing parameter. Responses of two replication samples of 3000 examinees on two 72-item…
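The modified three-parameter logistic model used to generate the simulated responses has the standard 3PL form, in which a constant guessing parameter c floors the response probability. A sketch of the item response function (parameter values are illustrative):

```python
import math

def p_correct_3pl(theta, a, b, c):
    """Three-parameter logistic item response function: probability of a
    correct answer at ability theta, with discrimination a, difficulty b
    and a constant guessing floor c."""
    return c + (1 - c) / (1 + math.exp(-a * (theta - b)))

# At theta == b the probability sits midway between c and 1.
p_mid = p_correct_3pl(0.0, a=1.2, b=0.0, c=0.2)
print(p_mid)  # ~0.6
```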

  6. Pre-employment physical capacity testing as a predictor for musculoskeletal injury in paramedics: A review of the literature.

    Science.gov (United States)

    Jenkins, Natasha; Smith, Gavin; Stewart, Scott; Kamphuis, Catherine

    2016-11-22

    Workplace injuries place a significant physical, social and financial burden on organisations globally. Paramedics provide emergency management of workplace injuries, and are subjected to heightened injury risk as a direct consequence of providing such care. This review aims to identify the current evidence reporting workplace musculoskeletal injury generally, and to relate this to pre-employment physical capacity testing within the paramedic industry specifically. A search of the electronic databases (Ovid Medline, Cochrane Database of Systematic Reviews, NIOSHTIC-2, RILOSH, CISDOC and HSELINE) was completed using the keywords musculoskeletal, workplace, injury, industrial, accident, pre-employment physical capacity testing, paramedic, emergency service employee, firefighter, and police. Articles were excluded if they did not describe pre-employment physical capacity testing, musculoskeletal injuries, or were not available in English. The electronic literature search identified 765 articles. Following application of the exclusion criteria (669 excluded based on title/abstract, 62 for no relevance, and 4 unavailable in English), 30 articles were included in this review. The review identified that physical fitness, gender, age, equipment and demographic variables were key factors in the current high rate of paramedic workplace injury. However, there is little evidence available to quantify the relationship between pre-employment physical capacity testing and subsequent injury amongst the paramedic cohort. Despite evidence suggesting that pre-employment physical capacity testing scores may be predictive of subsequent musculoskeletal injury in paramedics, there are currently no studies in this area. Quantifying the potential association between factors affecting the conduct of paramedic work and the type of injuries that result requires examination through future research.

  7. The Influence of Pre-University Students' Mathematics Test Anxiety and Numerical Anxiety on Mathematics Achievement

    Science.gov (United States)

    Seng, Ernest Lim Kok

    2015-01-01

    This study examines the relationship between mathematics test anxiety and numerical anxiety on students' mathematics achievement. 140 pre-university students who studied at one of the institutes of higher learning were being investigated. Gender issue pertaining to mathematics anxieties was being addressed besides investigating the magnitude of…

  8. Theory of random sets

    CERN Document Server

    Molchanov, Ilya

    2017-01-01

    This monograph, now in a thoroughly revised second edition, offers the latest research on random sets. It has been extended to include substantial developments achieved since 2005, some of them motivated by applications of random sets to econometrics and finance. The present volume builds on the foundations laid by Matheron and others, including the vast advances in stochastic geometry, probability theory, set-valued analysis, and statistical inference. It shows the various interdisciplinary relationships of random set theory within other parts of mathematics, and at the same time fixes terminology and notation that often vary in the literature, establishing it as a natural part of modern probability theory and providing a platform for future development. It is completely self-contained, systematic and exhaustive, with the full proofs that are necessary to gain insight. Aimed at research level, Theory of Random Sets will be an invaluable reference for probabilists; mathematicians working in convex and integ...

  9. Safety Evaluation Test on Humans for Radiation Prevulcanised Natural Rubber Latex (RVNRL)

    International Nuclear Information System (INIS)

    Pairu Ibrahim; Wan Manshol Wan Zain; Chai, Chee Keong; Sofian Ibrahim; Saadiah Sulaiman; Sharifah Ismail

    2010-01-01

    This paper discusses clinical tests conducted to determine the safety, for human users, of latex examination gloves and latex films made from Radiation Vulcanized Natural Rubber Latex (RVNRL). Two types of test were adopted: i) the Modified Draize-95 test and ii) the Patch Test on Sensitized Individuals. The Modified Draize-95 test was conducted on 200 non-sensitized human subjects, and the Patch Test on Sensitized Individuals was conducted on 25 individuals who are allergic to the defined major chemical sensitizers present in natural rubber products. The Modified Draize-95 test found no clinical evidence of residual chemical additives in the RVNRL gloves at a level that might induce Type IV allergy in the unsensitized general user population. Meanwhile, the patch test conducted using the test article on the 25 individuals allergic to the defined major chemical sensitizers present in natural rubber products (thiuram, carbamates or thiazoles) produced a negative response, meeting the prerequisite for a claim of reduced reaction-inducing potential. (author)

  10. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations, and on the central limit theorem for sums of dependent random variables.

  11. The utility of repeat enzyme immunoassay testing for the diagnosis of Clostridium difficile infection: A systematic review of the literature

    Directory of Open Access Journals (Sweden)

    P S Garimella

    2012-01-01

    Over the last 20 years, the prevalence of healthcare-associated Clostridium difficile (C. diff) disease has increased. While multiple tests are available for the diagnosis of C. diff infection, enzyme immunoassay (EIA) testing for toxin is the most widely used. Repeat EIA testing, although of limited utility, is common in medical practice. Objective: to assess the utility of repeat EIA testing to diagnose C. diff infections. Design: systematic literature review. Eligible studies performed >1 EIA test for C. diff toxin and were published in English. Electronic searches of MEDLINE and EMBASE were performed, and bibliographies of review articles and conference abstracts were hand searched. Of 805 citations identified, 32 were reviewed in detail and nine were included in the final review. All studies except one were retrospective chart reviews. Seven studies reported the number of participants (32,526 in total), and the overall reporting of test setting and patient characteristics was poor. The prevalence of C. diff infection ranged from 9.1% to 18.5%. The yield of the first EIA test ranged from 8.4% to 16.6%, dropping to 1.5-4.7% for a second test. The utility of repeat testing was evident in outbreak settings, where its yield was 5%. Repeat C. diff testing for hospitalized patients has low clinical utility and may be considered in outbreak settings or when the pre-test probability of disease is high. Future studies should aim to identify patients with a high likelihood of disease and determine the utility of repeat testing compared with empiric treatment.
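    The falling yield of a repeat test is what Bayes' rule predicts: a negative first EIA lowers the pre-test probability that feeds the second test. A minimal sketch with assumed test characteristics and prevalence (illustrative values, not figures extracted from the review):

```python
# Assumed (hypothetical) EIA characteristics and pre-test probability.
sens, spec = 0.85, 0.97     # sensitivity, specificity
pretest = 0.15              # prevalence among tested patients

# Expected positive rate of the first test.
first_yield = pretest * sens + (1 - pretest) * (1 - spec)

# Post-test probability of infection after one negative result (Bayes' rule).
post_neg = (pretest * (1 - sens)) / (pretest * (1 - sens) + (1 - pretest) * spec)

# Expected positive rate of a repeat test, given the first was negative.
second_yield = post_neg * sens + (1 - post_neg) * (1 - spec)

print(f"{first_yield:.1%} -> {second_yield:.1%}")  # → 15.3% -> 5.2%
```

    Under these assumptions the repeat test's yield falls by roughly two thirds, mirroring the drop (and the higher yield in high-prevalence outbreak settings) reported in the review.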

  12. Probability Modeling and Thinking: What Can We Learn from Practice?

    Science.gov (United States)

    Pfannkuch, Maxine; Budgett, Stephanie; Fewster, Rachel; Fitch, Marie; Pattenwise, Simeon; Wild, Chris; Ziedins, Ilze

    2016-01-01

    Because new learning technologies are enabling students to build and explore probability models, we believe that there is a need to determine the big enduring ideas that underpin probabilistic thinking and modeling. By uncovering the elements of the thinking modes of expert users of probability models we aim to provide a base for the setting of…

  13. Pre-test calculations for FAL-19 and FAL-20 using the ITHACA code

    International Nuclear Information System (INIS)

    Bradley, S.J.; Ketchell, N.

    1992-08-01

    Falcon is a small scale experimental apparatus, designed to simulate the transport of fission products through the primary circuit and containment of a nuclear power reactor under severe accident conditions. Information gained from the experiments in Falcon will be used to guide and assist in understanding the much larger Phebus-FP experiments. This report presents the results of pre-test calculations performed using ITHACA for the two tests: FAL-19 and FAL-20. Initial calculations were concerned solely with the thermal-hydraulic conditions in the containment while later ones briefly investigated the effect of the injection of an insoluble aerosol into the containment with the same thermal-hydraulic conditions. (author)

  14. PENGEMBANGAN MODUL PEMBELAJARAN KIMIA BERVISI SETS BERORIENTASI CHEMO-ENTREPRENEURSHIP (CEP PADA MATERI LARUTAN ASAM BASA

    Directory of Open Access Journals (Sweden)

    M. Agus Prayitno

    2016-05-01

    Chemistry is one of the subjects related to everyday life, in terms of the environment, technology and society. The aim of this study was to produce a SETS-visioned, chemo-entrepreneurship (CEP)-oriented chemistry learning module on acid-base solutions that is feasible and effective for use in teaching to improve students' learning motivation, entrepreneurial interest, and learning outcomes. This is development research. Feasibility testing of the product was carried out at MA Mu'allimin Mu'allimat using a one-group pre-test and post-test design, and at MAN Rembang using a pre-test and post-test control-group design. Research data were obtained through validation, observation, documentation, tests, and Likert scales. Validation by subject-matter experts, media experts, and practitioners showed that the SETS-visioned, CEP-oriented chemistry learning module is highly feasible for use in chemistry teaching, with average scores of 95.00 for graphics, 95.33 for presentation, 95.00 for language, and 94.44 for graphics. Module trials at MA Mu'allimin Mu'allimat showed increases of 20% in learning motivation, 25% in entrepreneurial interest, and 79% in student learning outcomes. In the MAN Rembang trial, the experimental class improved in learning motivation, entrepreneurial interest, and learning outcomes by 27%, 17% and 66% respectively, against 0.4%, 11% and 24% for the control class.

  15. Assessment of Random Assignment in Training and Test Sets using Generalized Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACĂ

    2011-06-01

    Aim: The appropriateness of the random assignment of compounds to training and validation sets was assessed using the generalized cluster technique. Material and Method: A quantitative structure-activity relationship model using the Molecular Descriptors Family on Vertices was evaluated in terms of the assignment of carboquinone derivatives to training and test sets during leave-many-out analysis. Assignment of compounds was investigated using five variables: observed anticancer activity and four structure descriptors. Generalized cluster analysis with the K-means algorithm was applied in order to investigate whether the assignment of compounds was proper. The Euclidean distance was used, with maximization of the initial distance and cross-validation with a v-fold of 10. Results: All five variables included in the analysis proved to have a statistically significant contribution to the identification of clusters. Three clusters were identified, each of them containing carboquinone derivatives belonging both to the training and to the test set. The observed activity of the carboquinone derivatives proved to be normally distributed within every cluster. The presence of training and test compounds in all clusters identified by generalized cluster analysis with the K-means algorithm, and the distribution of observed activity within clusters, sustain a proper assignment of compounds to training and test sets. Conclusion: Generalized cluster analysis using the K-means algorithm proved to be a valid method for assessing the random assignment of carboquinone derivatives to training and test sets.
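    The assignment check can be sketched as follows. This is an assumed reimplementation on synthetic one-dimensional data, not the authors' code or the carboquinone data: cluster the pooled compounds with K-means and verify that every cluster contains members of both the training and the test set.

```python
import random

random.seed(7)

# Three well-separated "activity" blobs stand in for compound families.
centers = [0.0, 10.0, 20.0]
points = [c + random.gauss(0, 0.5) for c in centers for _ in range(30)]

# Deterministic 80/20 split: every 5th compound goes to the test set.
split = ["test" if i % 5 == 0 else "train" for i in range(len(points))]

def kmeans_1d(xs, init, iters=20):
    """Plain Lloyd's algorithm in one dimension."""
    cents = list(init)
    assign = [0] * len(xs)
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        assign = [min(range(len(cents)), key=lambda j: (x - cents[j]) ** 2)
                  for x in xs]
        # Move each centroid to the mean of its members.
        for j in range(len(cents)):
            members = [x for x, a in zip(xs, assign) if a == j]
            if members:
                cents[j] = sum(members) / len(members)
    return assign

assignment = kmeans_1d(points, init=[1.0, 9.0, 21.0])

# A proper random split leaves no cluster without test (or train) members.
mixed = all(
    {s for s, a in zip(split, assignment) if a == j} == {"train", "test"}
    for j in range(3)
)
```

    If some cluster contained only training compounds, the split would be suspect: that region of the activity space would never be validated.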

  16. Savannah River Site TEP-SET tests uncertainty report

    International Nuclear Information System (INIS)

    Taylor, D.J.N.

    1993-09-01

    This document presents a measurement uncertainty analysis for the instruments used in Phases I, II and III of the Savannah River One-Fourth Linear Scale, One-Sixth Sector, Tank/Muff/Pump (TMP) Separate Effects Tests (SET) Experiment Series. The Idaho National Engineering Laboratory conducted the tests for the Savannah River Site (SRS). The tests represented a range of hydraulic conditions and geometries that bound anticipated Large Break Loss of Coolant Accidents in the SRS reactors. Important hydraulic phenomena were identified from the experiments, and code calculations will be benchmarked against them. The experimental system includes the following measurement groups: coolant density; absolute and differential pressures; turbine flowmeters (liquid phase); thermal flowmeters (gas phase); ultrasonic liquid level meters; temperatures; pump torque; pump speed; moderator tank liquid inventory via a load cell measurement; and relative humidity meters. This document also analyzes the data acquisition system, including the presampling filters, as it relates to these measurements.

  17. NESC-1 spinning cylinder experiment. Pre-test fracture analysis evaluation

    International Nuclear Information System (INIS)

    Moinereau, D.; Pitard-Bouet, J.M.

    1996-10-01

    A pre-test structural analysis evaluation has been conducted by Electricite de France (EDF), comprising several three-dimensional elastic and elastic-plastic computations. Two cylinder geometries have been studied. Higher values of the stress intensity factor are obtained in both geometries in the elastic-plastic computations, due to the yielding of the cladding during the thermal transient. Comparisons between the stress intensity factors and the expected base metal toughness show that cleavage initiation should occur preferentially in the base metal near the interface with the cladding. The comparison between the two geometries also shows that the thicker vessel, with a deeper (70 mm) semi-elliptical sub-clad flaw, is more favourable to cleavage initiation near the base metal - cladding interface. (K.A.)

  18. MATLAB-SIMULINK BASED INFORMATION SUPPORT FOR DIGITAL OVERCURRENT PROTECTION TEST SETS

    Directory of Open Access Journals (Sweden)

    I. V. Novash

    2017-01-01

    The implementation of information support for PC-based and hardware-software based test sets for digital overcurrent protection devices and their models, using the MatLab-Simulink environment, is considered. It is demonstrated that the mathematical modeling of a part of the power system (the generalized electric power object) can be based on rigid or flexible models. Rigid models, implemented on the basis of a mathematical description of the electrical and magnetic circuits of a power system, can serve as reference models against which results obtained with other simulation systems are compared. It is proposed to implement flexible models of the generalized electric power object in the MatLab-Simulink environment, which includes the SimPowerSystems component library targeted at power system modeling. The calculation of parameters for the SimPowerSystems library blocks from which the power system model is formed is considered. Models of wye-connected current transformers, as well as of the digital overcurrent protection missing from the component library, were composed out of standard Simulink blocks. Simulation results for one and the same generalized electric power object implemented in various PC-based software packages were compared; the divergence did not exceed 3%, which allows the MatLab-Simulink environment to be recommended for creating information support for hardware-software based test sets for digital overcurrent protection devices. A structure for a hardware-software based set for digital overcurrent protection device testing using the Omicron CMC 356 is suggested. A time-to-trip comparison between the real digital protection device МР 801 and a model whose parameters exactly match those of the prototype device was carried out using identical test inputs. The results of the tests

  19. Using Set Covering with Item Sampling to Analyze the Infeasibility of Linear Programming Test Assembly Models

    Science.gov (United States)

    Huitzing, Hiddo A.

    2004-01-01

    This article shows how set covering with item sampling (SCIS) methods can be used in the analysis and preanalysis of linear programming models for test assembly (LPTA). LPTA models can construct tests, fulfilling a set of constraints set by the test assembler. Sometimes, no solution to the LPTA model exists. The model is then said to be…

  20. Test-retest reliability of the Middlesex Assessment of Mental State (MEAMS): a preliminary investigation in people with probable dementia.

    Science.gov (United States)

    Powell, T; Brooker, D J; Papadopolous, A

    1993-05-01

    Relative and absolute test-retest reliability of the MEAMS was examined in 12 subjects with probable dementia and 12 matched controls. Relative reliability was good. Measures of absolute reliability showed scores changing by up to 3 points over an interval of a week. A version effect was found to be in evidence.

  1. Pre-start timing information is used to set final linear speed in a C-start manoeuvre.

    Science.gov (United States)

    Reinel, Caroline; Schuster, Stefan

    2014-08-15

    In their unique hunting behaviour, archerfish use a complex motor decision to secure their prey: based solely on how dislodged prey initially falls, they select an adapted C-start manoeuvre that turns the fish right towards the point on the water surface where their prey will later land. Furthermore, they take off at a speed that is set so as to arrive in time. We show here that the C-start manoeuvre and not subsequent tail beating is necessary and sufficient for setting this adaptive level of speed. Furthermore, the C-start pattern is adjusted to independently determine both the turning angle and the take-off speed. The selection of both aspects requires no a priori information and is done based on information sampled from the onset of target motion until the C-start is launched. Fin strokes can occur right after the C-start manoeuvre but are not required to fine-tune take-off speed, but rather to maintain it. By probing the way in which the fish set their take-off speed in a wide range of conditions in which distance from the later catching point and time until impact varied widely and unpredictably, we found that the C-start manoeuvre is programmed based on pre-C-start estimates of distance and time until impact. Our study hence provides the first evidence for a C-start that is fine-tuned to produce an adaptive speed level. © 2014. Published by The Company of Biologists Ltd.

  2. An exponential combination procedure for set-based association tests in sequencing studies.

    Science.gov (United States)

    Chen, Lin S; Hsu, Li; Gamazon, Eric R; Cox, Nancy J; Nicolae, Dan L

    2012-12-07

    State-of-the-art next-generation-sequencing technologies can facilitate in-depth explorations of the human genome by investigating both common and rare variants. For the identification of genetic factors that are associated with disease risk or other complex phenotypes, methods have been proposed for jointly analyzing variants in a set (e.g., all coding SNPs in a gene). Variants in a properly defined set can be associated with risk or phenotype in a concerted fashion, and by accumulating information across them, one can improve power to detect genetic risk factors. Many set-based methods in the literature are based on statistics that can be written as the summation of variant statistics. Here, we propose taking the summation of the exponential of variant statistics as the set summary for association testing. From both Bayesian and frequentist perspectives, we provide theoretical justification for taking the sum of the exponential of variant statistics: it is particularly powerful for sparse alternatives, that is, when only relatively few of the many variants tested in a set are associated with disease risk, a distinctive feature of genetic data. We applied the exponential combination gene-based test to a sequencing study in anticancer pharmacogenomics and uncovered mechanistic insights into genes and pathways related to chemotherapeutic susceptibility for an important class of oncologic drugs. Copyright © 2012 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
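    The advantage of exponentiating before summing can be seen in a deterministic toy comparison (assumed z-like scores, unrelated to the study's data): two variant sets carry the same total evidence, but in one it is concentrated in a single variant. The plain sum cannot distinguish them, while the exponential combination favours the sparse signal.

```python
import math

m = 50
sparse = [4.0] + [0.0] * (m - 1)   # one strongly associated variant
dense = [4.0 / m] * m              # same total evidence, spread thinly

def sum_stat(zs):
    """Classical set summary: sum of variant statistics."""
    return sum(zs)

def exp_stat(zs):
    """Exponential combination: sum of exp of variant statistics."""
    return sum(math.exp(z) for z in zs)

# Identical under the plain sum...
assert abs(sum_stat(sparse) - sum_stat(dense)) < 1e-9
# ...but the exponential summary rewards the concentrated signal.
print(round(exp_stat(sparse), 1), round(exp_stat(dense), 1))  # → 103.6 54.2
```

    In a real analysis the set summary would be compared against its null distribution (e.g., by permutation); the toy numbers only illustrate why the convex exp() weighting is sensitive to sparse alternatives.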

  3. Data for TROTS – The Radiotherapy Optimisation Test Set

    Directory of Open Access Journals (Sweden)

    Sebastiaan Breedveld

    2017-06-01

    The Radiotherapy Optimisation Test Set (TROTS) is an extensive set of problems originating from radiotherapy (radiation therapy) treatment planning. This dataset was created for two purposes: (1) to supply a large-scale dense dataset for measuring the performance and quality of mathematical solvers, and (2) to supply a dataset for investigating the multi-criteria optimisation and decision-making nature of the radiotherapy problem. The dataset contains 120 problems (patients), divided over 6 different treatment protocols/tumour types. Each problem contains numerical data, a configuration for the optimisation problem, and data required to visualise and interpret the results. The data is stored as HDF5-compatible Matlab files, and includes scripts to work with the dataset.

  4. Impact of introduction of rapid diagnostic tests for malaria on antibiotic prescribing: analysis of observational and randomised studies in public and private healthcare settings.

    Science.gov (United States)

    Hopkins, Heidi; Bruxvoort, Katia J; Cairns, Matthew E; Chandler, Clare I R; Leurent, Baptiste; Ansah, Evelyn K; Baiden, Frank; Baltzell, Kimberly A; Björkman, Anders; Burchett, Helen E D; Clarke, Siân E; DiLiberto, Deborah D; Elfving, Kristina; Goodman, Catherine; Hansen, Kristian S; Kachur, S Patrick; Lal, Sham; Lalloo, David G; Leslie, Toby; Magnussen, Pascal; Jefferies, Lindsay Mangham; Mårtensson, Andreas; Mayan, Ismail; Mbonye, Anthony K; Msellem, Mwinyi I; Onwujekwe, Obinna E; Owusu-Agyei, Seth; Reyburn, Hugh; Rowland, Mark W; Shakely, Delér; Vestergaard, Lasse S; Webster, Jayne; Wiseman, Virginia L; Yeung, Shunmay; Schellenberg, David; Staedke, Sarah G; Whitty, Christopher J M

    2017-03-29

    up untargeted use of antibiotics. That 69% of patients were prescribed antibiotics when test results were negative probably represents overprescription. This included antibiotics from several classes, including those like metronidazole that are seldom appropriate for febrile illness, across varied clinical, health system, and epidemiological settings. It is often assumed that better disease-specific diagnostics will reduce antimicrobial overuse, but they might simply shift it from one antimicrobial class to another. Current global implementation of malaria testing might increase untargeted antibiotic use and must be examined. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  5. Influence of Pre-question and genre-based instructional strategies on reading

    Directory of Open Access Journals (Sweden)

    Titi J. Fola-Adebayo

    2014-12-01

    This study investigated the influence of pre-question and genre-based instructional strategies on science undergraduates' achievement in, and attitude to, reading. Using purposive sampling, two specialised universities in Nigeria were selected, and stratified sampling was employed in assigning students to research groups based on gender and performance in a verbal ability test. Two hundred and eighty-five students participated in the study. A pre-post randomised block experimental design was used with three experimental groups and one control group. The experimental procedures, involving pre-question, genre-based instruction, and a combination of pre-question and genre-based instructional strategies, were used for the experimental groups for four weeks, whilst the control group received normal teacher input. Data were collected through a Reading Comprehension Achievement Test and a Students' Attitude Questionnaire. Qualitative data, obtained from videotapes of classroom interactions, were subjected to conversation and interaction analyses, and quantitative data were analysed with Analysis of Covariance (ANCOVA). The results indicate that although there was no significant main effect of instructional strategy on students' achievement in reading comprehension, there was a significant main effect of instructional strategy on students' attitude to reading (F(3,231) = 30.9; p < .05). Findings from the qualitative enquiry revealed that female students were more voluble and assertive in their responses, probably because of the need to resist male domination, whilst male students used discourse strategies to affirm their authority. The study indicated that the combination of the pre-question and genre-based approaches was the most effective in enhancing the students' attitude to reading. Reading is one of the most useful of the Language Arts skills which learners need for academic reasons and for lifelong learning. The globalised world demands that the second language

  6. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  7. Dynamic Toughness Testing of Pre-Cracked Charpy V-Notch Specimens. Convention ELECTRABEL - SCK-CEN

    Energy Technology Data Exchange (ETDEWEB)

    Lucon, E

    1999-04-01

    This document describes the experimental and analytical procedures which have been adopted at the laboratories of the Belgian Nuclear Research Centre SCK-CEN for performing dynamic toughness tests on pre-cracked Charpy-V specimens. Such procedures were chosen on the basis of the existing literature on the subject, with several updates in the data analysis stages which reflect more recent developments in fracture toughness testing. Qualification tests have been carried out on PCCv specimens of JRQ steel, in order to assess the reliability of the results obtained; straightforward comparisons with reference data have been performed, as well as more advanced analyses using the Master Curve approach. Aspects related to machine compliance and dynamic tup calibration have also been addressed.

  8. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...

  9. To test, or not to test: time for a MODY calculator?

    Science.gov (United States)

    Njølstad, P R; Molven, A

    2012-05-01

    To test, or not to test, that is often the question in diabetes genetics, which is why the paper by Shields et al in the current issue of Diabetologia is so warmly welcomed. MODY is the most common form of monogenic diabetes. Nevertheless, the optimal way of identifying MODY families still poses a challenge for both researchers and clinicians. Hattersley's group in Exeter, UK, have developed an easy-to-use MODY prediction model that can help to identify cases appropriate for genetic testing. By answering eight simple questions on the internet ( www.diabetesgenes.org/content/mody-probability-calculator ), the doctor receives a positive predictive value in return: the probability that the patient has MODY. Thus, the classical binary (yes/no) assessment provided by clinical diagnostic criteria has been replaced by a more rational, quantitative estimate. The model appears to discriminate well between MODY and type 1 and type 2 diabetes when diabetes is diagnosed before the age of 35 years. However, the performance of the MODY probability calculator should now be validated in settings other than the one in which it was developed, and, as always, there is room for some improvements and modifications.
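    The calculator's output, a positive predictive value, can be illustrated with Bayes' rule. The prevalence, sensitivity and specificity below are hypothetical placeholders, not figures from the Exeter model:

```python
def ppv(pretest, sens, spec):
    """Positive predictive value of a positive screen, by Bayes' rule."""
    true_pos = sens * pretest
    false_pos = (1 - spec) * (1 - pretest)
    return true_pos / (true_pos + false_pos)

# Assumed: 3% prevalence of MODY among young-onset diabetes, and a binary
# clinical-criteria screen with 90% sensitivity and 95% specificity.
low = ppv(pretest=0.03, sens=0.90, spec=0.95)    # ~0.36
high = ppv(pretest=0.10, sens=0.90, spec=0.95)   # ~0.67
```

    The same positive screen means very different things at different pre-test probabilities, which is one reason a quantitative estimate is more informative than a fixed yes/no criterion.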

  10. Burden of high fracture probability worldwide: secular increases 2010-2040.

    Science.gov (United States)

    Odén, A; McCloskey, E V; Kanis, J A; Harvey, N C; Johansson, H

    2015-09-01

    The number of individuals aged 50 years or more at high risk of osteoporotic fracture worldwide in 2010 was estimated at 158 million and is set to double by 2040. The aim of this study was to quantify the number of individuals worldwide aged 50 years or more at high risk of osteoporotic fracture in 2010 and 2040. A threshold of high fracture probability was set at the age-specific 10-year probability of a major fracture (clinical vertebral, forearm, humeral or hip fracture) which was equivalent to that of a woman with a BMI of 24 kg/m(2) and a prior fragility fracture but no other clinical risk factors. The prevalence of high risk was determined worldwide and by continent using all available country-specific FRAX models and applied the population demography for each country. Twenty-one million men and 137 million women had a fracture probability at or above the threshold in the world for the year 2010. The greatest number of men and women at high risk were from Asia (55 %). Worldwide, the number of high-risk individuals is expected to double over the next 40 years. We conclude that individuals with high probability of osteoporotic fractures comprise a very significant disease burden to society, particularly in Asia, and that this burden is set to increase markedly in the future. These analyses provide a platform for the evaluation of risk assessment and intervention strategies.

  11. Overweight and abdominal obesity as determinants of undiagnosed diabetes and pre-diabetes in Bangladesh

    OpenAIRE

    Alam, Dewan S; Talukder, Shamim H; Chowdhury, Muhammad Ashique Haider; Siddiquee, Ali Tanweer; Ahmed, Shyfuddin; Pervin, Sonia; Khan, Sushmita; Hasan, Khaled; Koehlmoos, Tracey L P; Niessen, Louis

    2016-01-01

    Background\\ud Type 2 diabetes and pre-diabetes are an increasing pandemic globally and often remain undiagnosed long after onset in low-income settings. The objective of this study is to assess the determinants and prevalence of undiagnosed diabetes and pre-diabetes among adults in Bangladesh.\\ud \\ud Methods\\ud In an exploratory study, we performed oral glucose tolerance test on 1243 adults ≥20 years of age from urban Mirpur, Dhaka (n = 518) and rural Matlab, Chandpur (n = 725) who had never ...

  12. The association of pre-pregnancy alcohol drinking with child neuropsychological functioning

    DEFF Research Database (Denmark)

    Kesmodel, Ulrik Schiøler; Kjærsgaard, Maiken Ina Siegismund; Denny, Clark H.

    2015-01-01

    Objective: To examine the effects of pre-pregnancy alcohol drinking on child neuropsychological functioning. Design: Prospective follow-up study. Setting and population: 154 women and their children sampled from the Danish National Birth Cohort. Methods: Participants were sampled based on maternal...... of Executive Function (BRIEF) was completed by the mothers and a preschool teacher. Parental education, maternal IQ, prenatal maternal smoking, child’s age at testing, child’s sex, and maternal alcohol intake during pregnancy were considered potential confounders. Main outcome measures: Performance...... and sustained attention. Assessment of pre-pregnancy drinking provides additional information regarding potential prenatal alcohol exposure and its implications for child neurodevelopment....

  13. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability, concurrent with and integrated with statistics, through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). * Incorporates more than 1,000 engaging problems with answers * Includes more than 300 solved examples * Uses varied problem solving methods

  14. Exact calculation of loop formation probability identifies folding motifs in RNA secondary structures

    Science.gov (United States)

    Sloma, Michael F.; Mathews, David H.

    2016-01-01

    RNA secondary structure prediction is widely used to analyze RNA sequences. In an RNA partition function calculation, free energy nearest neighbor parameters are used in a dynamic programming algorithm to estimate statistical properties of the secondary structure ensemble. Previously, partition functions have largely been used to estimate the probability that a given pair of nucleotides form a base pair, the conditional stacking probability, the accessibility to binding of a continuous stretch of nucleotides, or a representative sample of RNA structures. Here it is demonstrated that an RNA partition function can also be used to calculate the exact probability of formation of hairpin loops, internal loops, bulge loops, or multibranch loops at a given position. This calculation can also be used to estimate the probability of formation of specific helices. Benchmarking on a set of RNA sequences with known secondary structures indicated that loops that were calculated to be more probable were more likely to be present in the known structure than less probable loops. Furthermore, highly probable loops are more likely to be in the known structure than the set of loops predicted in the lowest free energy structures. PMID:27852924
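    The quantity being computed, the probability that a motif appears in the thermodynamic ensemble, can be sketched on a toy ensemble. The structures, motif labels and free energies below are invented for illustration and are not real nearest-neighbor parameters:

```python
import math

RT = 0.616  # kcal/mol near 37 °C (approximate)

# Hypothetical mini-ensemble: (motifs contained, free energy in kcal/mol).
ensemble = [
    ({"hairpin@12", "helix:3-7"},  -4.2),
    ({"hairpin@12"},               -3.1),
    ({"internal@20", "helix:3-7"}, -2.5),
    (set(),                         0.0),  # unfolded reference
]

# Partition function: sum of Boltzmann weights over all structures.
Z = sum(math.exp(-E / RT) for _, E in ensemble)

def motif_probability(motif):
    """Exact probability that `motif` occurs: Z restricted to it, over Z."""
    Zm = sum(math.exp(-E / RT) for motifs, E in ensemble if motif in motifs)
    return Zm / Z

p_hairpin = motif_probability("hairpin@12")   # dominates this toy ensemble
p_helix = motif_probability("helix:3-7")
```

    A real partition-function calculation obtains the same ratio Z_motif / Z without enumeration, via dynamic programming over the nearest-neighbor model.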

  15. Unemployment in Iraqi Refugees: The Interaction of Pre and Post-Displacement Trauma

    Science.gov (United States)

    Wright, A. Michelle; Dhalimi, Abir; Lumley, Mark A.; Jamil, Hikmet; Pole, Nnamdi; Arnetz, Judith E.; Arnetz, Bengt B.

    2016-01-01

    Previous refugee research has been unable to link pre-displacement trauma with unemployment in the host country. The current study assessed the role of pre-displacement trauma, post-displacement trauma, and the interaction of both trauma types to prospectively examine unemployment in a random sample of newly arrived Iraqi refugees. Participants (N=286) were interviewed three times over the first two years post-arrival. Refugees were assessed for pre-displacement trauma exposure, post-displacement trauma exposure, a history of unemployment in the country of origin and host country, and symptoms of posttraumatic stress disorder (PTSD) and depression. Analyses found that neither pre-displacement nor post-displacement trauma independently predicted unemployment 2 years post-arrival; however, the interaction of pre- and post-displacement trauma predicted 2-year unemployment. Refugees with high levels of both pre- and post-displacement trauma had a 91% predicted probability of unemployment, whereas those with low levels of both traumas had a 20% predicted probability. This interaction remained significant after controlling for sociodemographic variables and mental health upon arrival to the U.S. Resettlement agencies and community organizations should consider the interactive effect of encountering additional trauma after escaping the hardships of the refugee's country of origin. PMID:27535348

  16. Pre-employment screening of latent tuberculosis infection among healthcare workers using tuberculin skin test and QuantiFERON-TB Gold test at a tertiary care hospital in Saudi Arabia

    Directory of Open Access Journals (Sweden)

    Mohamed El-Helaly

    2014-11-01

    Full Text Available Summary: Objective: To assess the agreement between the tuberculin skin test (TST) and the QuantiFERON-TB Gold test (QFT-G) as pre-employment screening tests for latent tuberculosis infection (LTBI) among healthcare workers. Methods: A retrospective cross-sectional study was conducted among 1412 healthcare workers who were screened for LTBI during the period from August 2009 to May 2011 at a tertiary-care hospital in the Kingdom of Saudi Arabia (KSA). The studied population was screened for LTBI using both TST and QFT-G simultaneously. The agreement between both tests was quantified using the Kappa coefficient (κ). Results: Comparing the results of QFT-G with TST, the tests had a significant overall agreement of 73.7% (1040/1412; κ = 0.33; p < 0.01). Negative concordance comprised 60.1% of the results, and positive concordance comprised 13.5%. However, positive TST but negative QFT-G comprised 16.3% of the results, and negative TST but positive QFT-G comprised 10.1%. Concordance was significantly associated with young age, female gender, Saudi-born nationals, and early career, but not with job type (clinical versus non-clinical) or status of Bacillus Calmette–Guérin (BCG) vaccination. Conclusions: This study demonstrated 73.7% overall agreement between TST and QFT-G results among healthcare workers during pre-employment screening for LTBI. The results need to be confirmed in future studies before recommending QFT-G as a pre-employment screening test for LTBI. Keywords: Latent tuberculosis infection, Healthcare workers, Tuberculin skin test, QuantiFERON-TB Gold test
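
    The reported kappa can be reproduced from the four agreement percentages quoted in this abstract. A minimal sketch (the function and its parameter names are ours, not from the study; only the cell proportions come from the abstract):

```python
def cohens_kappa(pp, nn, pn, np_):
    """Cohen's kappa from the four cell proportions of a 2x2 agreement table.

    pp: both tests positive, nn: both negative,
    pn: first test positive / second negative, np_: first negative / second positive.
    """
    po = pp + nn                            # observed agreement
    p1_pos, p2_pos = pp + pn, pp + np_      # marginal positive rates of each test
    pe = p1_pos * p2_pos + (1 - p1_pos) * (1 - p2_pos)  # chance agreement
    return (po - pe) / (1 - pe)

# Proportions reported above (TST vs. QFT-G, n = 1412):
kappa = cohens_kappa(pp=0.135, nn=0.601, pn=0.163, np_=0.101)
print(round(kappa, 2))  # 0.33, matching the reported kappa
```

    This makes explicit why a 73.7% raw agreement can still yield only fair chance-corrected agreement: the large negative-concordant cell inflates expected agreement.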

  17. Mechanical and Permeability Characteristics of Latex-Modified Pre-Packed Pavement Repair Concrete as a Function of the Rapid-Set Binder Content

    Directory of Open Access Journals (Sweden)

    Jae-Woong Han

    2015-10-01

    Full Text Available We evaluated the strength and durability characteristics of latex-polymer-modified, pre-packed pavement repair concrete (LMPPRC) with a rapid-set binder. The rapid-set binder was a mixture of rapid-set cement and silica sand, where the fluidity was controlled using a latex polymer. The resulting mix exhibited a compressive strength of ≥21 MPa and a flexural strength of ≥3.5 MPa after 4 h of curing (i.e., within the traffic-opening window for emergency pavement repairs). The ratio of latex polymer to rapid-set binder material was varied through 0.40, 0.33, 0.29, and 0.25. Mechanical characterization revealed that the mechanical performance, permeability resistance, and impact resistance increased as the ratio of latex polymer to rapid-set binder decreased. The mixture exhibited a compressive strength of ≥21 MPa after 4 h when the ratio of latex polymer to rapid-set binder material was ≤0.29, and a flexural strength of ≥3.5 MPa after 4 h when the ratio was ≤0.33. The permeability resistance to chloride ions satisfied the 2000 C (coulomb) criterion after 7 days of curing for all ratios. The ratio of latex polymer to rapid-set binder material that satisfied all conditions for emergency pavement repair was ≤0.29.

  18. Gaussian Hypothesis Testing and Quantum Illumination.

    Science.gov (United States)

    Wilde, Mark M; Tomamichel, Marco; Lloyd, Seth; Berta, Mario

    2017-09-22

    Quantum hypothesis testing is one of the most basic tasks in quantum information theory and has fundamental links with quantum communication and estimation theory. In this paper, we establish a formula that characterizes the decay rate of the minimal type-II error probability in a quantum hypothesis test of two Gaussian states given a fixed constraint on the type-I error probability. This formula is a direct function of the mean vectors and covariance matrices of the quantum Gaussian states in question. We give an application to quantum illumination, which is the task of determining whether there is a low-reflectivity object embedded in a target region with a bright thermal-noise bath. For the asymmetric-error setting, we find that a quantum illumination transmitter can achieve an error probability exponent stronger than a coherent-state transmitter of the same mean photon number, and furthermore, that it requires far fewer trials to do so. This occurs when the background thermal noise is either low or bright, which means that a quantum advantage is even easier to witness than in the symmetric-error setting because it occurs for a larger range of parameters. Going forward from here, we expect our formula to have applications in settings well beyond those considered in this paper, especially to quantum communication tasks involving quantum Gaussian channels.

  19. Silane pre-treatments on copper and aluminium

    International Nuclear Information System (INIS)

    Deflorian, F.; Rossi, S.; Fedrizzi, L.

    2006-01-01

    A large part of aluminium products is coated with an organic layer in order to improve corrosion resistance. Copper surfaces are also sometimes protected with an organic coating to improve durability or aesthetic properties. Examples of industrial applications are household appliances and heat exchanger components. For these applications there is often an industrial need to treat aluminium and copper components at the same time. In order to extend the service life of organic-coated copper, a specific surface pre-treatment is often required. Nevertheless, probably because of the limited market for this application, no specific pre-treatments for copper have been industrially developed, with the exception of cleaning procedures; instead, extensions of existing pre-treatments optimised for other metals (aluminium, zinc) are used. The application of silane pre-treatments as adhesion promoters for organic-coated metals has increased remarkably over the last decade, because silanes offer very good performance together with high environmental compatibility. The idea is therefore to try to develop a specific silane-based pre-treatment for copper. The starting point is existing silane products for aluminium, optimising the composition and the application conditions (concentration, temperature, pH of the bath, etc.) in order to develop a high-performance copper alloy pre-treatment increasing the protective properties and the adhesion of a subsequently applied organic coating. Moreover, these pre-treatments could be used for aluminium alloys too and therefore could be suggested for multi-metal components. The deposits were analysed using FTIR spectroscopy and optical and electron microscopic observations. A careful electrochemical characterisation, mainly by electrochemical impedance spectroscopy (EIS) measurements, was carried out to highlight the presence of silane and to evaluate the performance of the different deposits.

  20. Modelling uncertainty with generalized credal sets: application to conjunction and decision

    Science.gov (United States)

    Bronevich, Andrey G.; Rozenberg, Igor N.

    2018-01-01

    To model conflict, non-specificity and contradiction in information, upper and lower generalized credal sets are introduced. Any upper generalized credal set is a convex subset of plausibility measures interpreted as lower probabilities whose bodies of evidence consist of singletons and a certain event. Analogously, contradiction is modelled in the theory of evidence by a belief function that is greater than zero at the empty set. Based on generalized credal sets, we extend the conjunctive rule to contradictory sources of information, introduce constructions like the natural extension in the theory of imprecise probabilities, and show that the model of generalized credal sets coincides with the model of imprecise probabilities if the profile of a generalized credal set consists of probability measures. We show how the introduced model can be applied to decision problems.

  1. Testing the effect of defaults on the thermostat settings of OECD employees

    International Nuclear Information System (INIS)

    Brown, Zachary; Johnstone, Nick; Haščič, Ivan; Vong, Laura; Barascud, Francis

    2013-01-01

    We describe a randomized controlled experiment in which the default settings on office thermostats in an OECD office building were manipulated during the winter heating season, and employees' chosen thermostat setting observed over a 6-week period. Using difference-in-differences, panel, and censored regression models (to control for maximum allowable thermostat settings), we find that a 1 °C decrease in the default caused a reduction in the chosen setting by 0.38 °C, on average. Sixty-five percent of this effect could be attributed to office occupant behavior (p-value = 0.044). The difference-in-differences models show that small decreases in the default (1°) led to a greater reduction in chosen settings than large decreases (2°). We also find that office occupants who were more apt to adjust their thermostats prior to the intervention were less susceptible to the default. We conclude that this kind of intervention can increase building-level energy efficiency, and discuss potential explanations and broader policy implications of our findings. - Highlights: • We conduct a randomized controlled trial to test if thermostat defaults affect agent behavior. • Two treatments (schedules of default settings) were tested against a control for 6 weeks at OECD. • Small changes in defaults had a greater effect on chosen settings than larger changes in defaults. • Occupants who frequently changed their thermostats in baseline were less affected by defaults. • Thermostat defaults in office environments can be manipulated to increase energy efficiency
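
    The difference-in-differences logic behind the headline estimate can be sketched in a few lines: the change in the treated group's mean chosen setting minus the change in the control group's mean. All numbers below are hypothetical illustrations, not data from the study; only the estimator form reflects the design described above:

```python
from statistics import mean

def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Difference-in-differences: (treated post - treated pre) minus
    (control post - control pre), using group means."""
    return (mean(treat_post) - mean(treat_pre)) - (mean(ctrl_post) - mean(ctrl_pre))

# Hypothetical chosen thermostat settings (deg C) before/after a 1 deg default cut:
treat_pre, treat_post = [22.0, 21.5, 22.5], [21.6, 21.1, 22.1]  # default lowered
ctrl_pre, ctrl_post = [22.0, 22.2, 21.8], [22.0, 22.1, 21.9]    # default unchanged
print(did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post))  # close to -0.4
```

    The subtraction of the control group's change nets out common shocks (e.g. weather), which is why the design can attribute the residual change to the default itself.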

  2. CONSTOR registered V/TC drop tests. Pre-test analysis by finite element method

    International Nuclear Information System (INIS)

    Voelzer, W.; Koenig, S.; Klein, K.; Tso, C.F.; Owen, S.; Monk, C.

    2004-01-01

    The CONSTOR registered family of steel-concrete-steel sandwich cask designs has been developed to fulfil both the internationally valid IAEA criteria for transportation and the requirements for long-term intermediate storage in the US and various European countries. A comprehensive drop testing programme using a full-scale prototype test cask (CONSTOR registered V/TC) has been developed as part of the application for a transport license in both Germany and the US. The drop tests using the full-scale cask will be performed by BAM at test facilities in Horstwalde. The tests will include five different 9m drops onto flat unyielding targets and seven different 1m drops onto a punch. The first drop test, a 9m side drop, will be performed during PATRAM 2004. The other drop tests will take place during the following year. The development of the cask design and the formulation of the drop test programme have been supported by an extensive series of finite element analyses. The objectives of the finite element analyses were: to provide an intermediate step in demonstrating the performance of the CONSTOR registered in fulfilling the requirements of 10 CFR 71 and the IAEA transport regulations; to justify the selection of drop tests; to predict the performance of V/TC during the drop tests; to estimate the strain and acceleration time histories at measuring points on the test cask and to aid in the setting up of the test instrumentation; and to develop an analysis model that can be used in future safety analyses for transport and storage license applications and which can confidently be used to demonstrate the performance of the package. This paper presents an overview of the analyses performed, including a summary of all the different drop orientations that were considered. The major assumptions employed during the analyses are also discussed, as are the specifics of the modelling techniques that were employed.
At the end of the paper, the key results obtained from the analyses are presented.

  3. Pre-service proof pressure and leak rate tests for the Qinshan CANDU project reactor buildings

    International Nuclear Information System (INIS)

    Petrunik, K.J.; Khan, A.; Ricciuti, R.; Ivanov, A.; Chen, S.

    2003-01-01

    The Qinshan CANDU Project Reactor Buildings (Units 1 and 2) have been successfully tested for the Pre-Service Proof Pressure and Integrated Leak Rate Tests. The Unit 1 tests took place from May 3 to May 9, 2002 and from May 22 to May 25, 2002, and the Unit 2 tests took place from January 21 to January 27, 2003. This paper discusses the significant steps taken at minimum cost on the Qinshan CANDU Project, which has resulted in a) very good leak rate (0.21%) for Unit 1 and excellent leak rate (0.130%) for Unit 2; b) continuous monitoring of the structural behaviour during the Proof Pressure Test, thus eliminating any repeat of the structural test due to lack of data; and c) significant schedule reduction achieved for these tests in Unit 2. (author)

  4. Most probable mixing state of aerosols in Delhi NCR, northern India

    Science.gov (United States)

    Srivastava, Parul; Dey, Sagnik; Srivastava, Atul Kumar; Singh, Sachchidanand; Tiwari, Suresh

    2018-02-01

    Unknown mixing state is one of the major sources of uncertainty in estimating aerosol direct radiative forcing (DRF). Aerosol DRF in India is usually reported for external mixing, and any deviation from this would lead to high bias and error. Limited information on aerosol composition hinders the resolution of this issue in India. Here we use two years of aerosol chemical composition data measured at megacity Delhi to examine the most probable aerosol mixing state by comparing the simulated clear-sky downward surface flux with the measured flux. We consider external, internal, and four combinations of core-shell (black carbon, BC over dust; water-soluble, WS over dust; WS over water-insoluble, WINS; and BC over WINS) mixing. Our analysis reveals that the choice of external mixing (usually considered in satellite retrievals and climate models) seems reasonable in Delhi only in the pre-monsoon (Mar-Jun) season. During the winter (Dec-Feb) and monsoon (Jul-Sep) seasons, 'WS coating over dust' externally mixed with BC and WINS appears to be the most probable mixing state, while 'WS coating over WINS' externally mixed with BC and dust seems to be the most probable mixing state in the post-monsoon (Oct-Nov) season. Mean seasonal TOA (surface) aerosol DRF for the most probable mixing states is 4.4 ± 3.9 (-25.9 ± 3.9), -16.3 ± 5.7 (-42.4 ± 10.5), 13.6 ± 11.4 (-76.6 ± 16.6) and -5.4 ± 7.7 (-80.0 ± 7.2) W m-2, respectively, in the pre-monsoon, monsoon, post-monsoon and winter seasons. Our results highlight the importance of realistic mixing state treatment in estimating aerosol DRF to aid policy making to combat climate change.

  5. A tool for simulating collision probabilities of animals with marine renewable energy devices.

    Directory of Open Access Journals (Sweden)

    Pál Schmitt

    Full Text Available The mathematical problem of establishing a collision probability distribution is often not trivial. The shape and motion of the animal as well as of the device must be evaluated in a four-dimensional space (3D motion over time). Earlier work on wind and tidal turbines was limited to a simplified two-dimensional representation, which cannot be applied to many new structures. We present a numerical algorithm to obtain such probability distributions using transient, three-dimensional numerical simulations. The method is demonstrated using a sub-surface tidal kite as an example. Necessary pre- and post-processing of the data created by the model is explained, and numerical details, potential issues, and limitations in the application of the resulting probability distributions are highlighted.

  6. A tool for simulating collision probabilities of animals with marine renewable energy devices.

    Science.gov (United States)

    Schmitt, Pál; Culloch, Ross; Lieber, Lilian; Molander, Sverker; Hammar, Linus; Kregting, Louise

    2017-01-01

    The mathematical problem of establishing a collision probability distribution is often not trivial. The shape and motion of the animal as well as of the device must be evaluated in a four-dimensional space (3D motion over time). Earlier work on wind and tidal turbines was limited to a simplified two-dimensional representation, which cannot be applied to many new structures. We present a numerical algorithm to obtain such probability distributions using transient, three-dimensional numerical simulations. The method is demonstrated using a sub-surface tidal kite as an example. Necessary pre- and post-processing of the data created by the model is explained, and numerical details, potential issues, and limitations in the application of the resulting probability distributions are highlighted.

  7. Free Fall Misconceptions: Results of a Graph Based Pre-Test of Sophomore Civil Engineering Students

    Science.gov (United States)

    Montecinos, Alicia M.

    2014-01-01

    A partially unusual behaviour was found among 14 sophomore students of civil engineering who took a pre-test for a free fall laboratory session, in the context of a general mechanics course. An analysis of the consistency between the mathematical and physical models was made. In all cases, the students presented evidence favoring a correct free…

  8. Contingency bias in probability judgement may arise from ambiguity regarding additional causes.

    Science.gov (United States)

    Mitchell, Chris J; Griffiths, Oren; More, Pranjal; Lovibond, Peter F

    2013-09-01

    In laboratory contingency learning tasks, people usually give accurate estimates of the degree of contingency between a cue and an outcome. However, if they are asked to estimate the probability of the outcome in the presence of the cue, they tend to be biased by the probability of the outcome in the absence of the cue. This bias is often attributed to an automatic contingency detection mechanism, which is said to act via an excitatory associative link to activate the outcome representation at the time of testing. We conducted 3 experiments to test alternative accounts of contingency bias. Participants were exposed to the same outcome probability in the presence of the cue, but different outcome probabilities in the absence of the cue. Phrasing the test question in terms of frequency rather than probability and clarifying the test instructions reduced but did not eliminate contingency bias. However, removal of ambiguity regarding the presence of additional causes during the test phase did eliminate contingency bias. We conclude that contingency bias may be due to ambiguity in the test question, and therefore it does not require postulation of a separate associative link-based mechanism.
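
    The objective contingency in such tasks is usually summarized as ΔP: the outcome probability in the presence of the cue minus its probability in the absence of the cue. A minimal sketch of the measure that participants' probability judgements are compared against (the counts below are hypothetical, not from these experiments):

```python
def delta_p(cue_outcome, cue_no_outcome, nocue_outcome, nocue_no_outcome):
    """Delta-P contingency from the four cell counts of a cue/outcome table."""
    p_o_given_cue = cue_outcome / (cue_outcome + cue_no_outcome)
    p_o_given_nocue = nocue_outcome / (nocue_outcome + nocue_no_outcome)
    return p_o_given_cue - p_o_given_nocue

# Same P(outcome | cue) = 0.8 in both designs, different no-cue base rates:
print(delta_p(32, 8, 8, 32))   # high contingency: 0.8 - 0.2
print(delta_p(32, 8, 24, 16))  # low contingency:  0.8 - 0.6
```

    The two calls illustrate the bias described above: if judgements of P(outcome | cue) drift toward ΔP, the second design should produce lower probability estimates than the first even though P(outcome | cue) is identical.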

  9. The meaning of diagnostic test results: A spreadsheet for swift data analysis

    International Nuclear Information System (INIS)

    MacEneaney, Peter M.; Malone, Dermot E.

    2000-01-01

    AIMS: To design a spreadsheet program to: (a) rapidly analyse diagnostic test result data produced in local research or reported in the literature; (b) correct reported predictive values for disease prevalence in any population; (c) estimate the post-test probability of disease in individual patients. MATERIALS AND METHODS: Microsoft Excel™ was used. Section A: a contingency (2 x 2) table was incorporated into the spreadsheet. Formulae for standard calculations [sample size, disease prevalence, sensitivity and specificity with 95% confidence intervals, predictive values and likelihood ratios (LRs)] were linked to this table. The results change automatically when the data in the true or false negative and positive cells are changed. Section B: this estimates predictive values in any population, compensating for altered disease prevalence. Sections C-F: Bayes' theorem was incorporated to generate individual post-test probabilities. The spreadsheet generates 95% confidence intervals, LRs and a table and graph of conditional probabilities once the sensitivity and specificity of the test are entered. The latter shows the expected post-test probability of disease for any pre-test probability when a test of known sensitivity and specificity is positive or negative. RESULTS: This spreadsheet can be used on desktop and palmtop computers. The MS Excel™ version can be downloaded via the Internet from the URL ftp://radiography.com/pub/Rad-data99.xls CONCLUSION: A spreadsheet is useful for contingency table data analysis and assessment of the clinical meaning of diagnostic test results. MacEneaney, P.M., Malone, D.E. (2000)
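
    The Bayesian step in sections C-F reduces to the odds form of Bayes' theorem: post-test odds = pre-test odds × likelihood ratio. A minimal sketch for a positive result (function names and the example numbers are ours, for illustration only):

```python
def positive_lr(sensitivity, specificity):
    """Likelihood ratio of a positive test result: sens / (1 - spec)."""
    return sensitivity / (1 - specificity)

def post_test_probability(pre_test_p, lr):
    """Bayes' theorem in odds form: post-test odds = pre-test odds * LR."""
    pre_odds = pre_test_p / (1 - pre_test_p)
    post_odds = pre_odds * lr
    return post_odds / (1 + post_odds)

# Example: sensitivity 0.90, specificity 0.80, pre-test probability 0.30
lr_pos = positive_lr(0.90, 0.80)        # LR+ = 4.5
p = post_test_probability(0.30, lr_pos)
print(round(p, 3))  # 0.659
```

    Sweeping `pre_test_p` from 0 to 1 reproduces the spreadsheet's conditional-probability graph for a test of fixed sensitivity and specificity.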

  10. A multi-center field study of two point-of-care tests for circulating Wuchereria bancrofti antigenemia in Africa.

    Directory of Open Access Journals (Sweden)

    Cédric B Chesnais

    2017-09-01

    Full Text Available The Global Programme to Eliminate Lymphatic Filariasis uses point-of-care tests for circulating filarial antigenemia (CFA) to map endemic areas and for monitoring and evaluating the success of mass drug administration (MDA) programs. We compared the performance of the reference BinaxNOW Filariasis card test (ICT), introduced in 1997, with the Alere Filariasis Test Strip (FTS), introduced in 2013, in 5 endemic study sites in Africa. The tests were compared prior to MDA in two study sites (Congo and Côte d'Ivoire) and in three sites that had received MDA (DRC and 2 sites in Liberia). Data were analyzed with regard to % positivity, % agreement, and heterogeneity. Models evaluated potential effects of age, gender, and blood microfilaria (Mf) counts in individuals, and effects of endemicity and history of MDA at the village level, as potential factors linked to higher sensitivity of the FTS. Lastly, we assessed relationships between CFA scores and Mf in pre- and post-MDA settings. Paired test results were available for 3,682 individuals. Antigenemia rates were 8% and 22% higher by FTS than by ICT in pre-MDA and in post-MDA sites, respectively. FTS/ICT ratios were higher in areas with low infection rates. The probability of having microfilaremia was much higher in persons with CFA scores >1 in untreated areas. However, this was not true in post-MDA settings. This study has provided extensive new information on the performance of the FTS compared to ICT in Africa, and it has confirmed the increased sensitivity of FTS reported in prior studies. Variability in FTS/ICT was related in part to endemicity level, history of MDA, and perhaps to the medications used for MDA. These results suggest that FTS should be superior to ICT for mapping, for transmission assessment surveys, and for post-MDA surveillance.

  11. Failure probability analysis on mercury target vessel

    International Nuclear Information System (INIS)

    Ishikura, Syuichi; Futakawa, Masatoshi; Kogawa, Hiroyuki; Sato, Hiroshi; Haga, Katsuhiro; Ikeda, Yujiro

    2005-03-01

    Failure probability analysis was carried out to estimate the lifetime of the mercury target which will be installed into the JSNS (Japan Spallation Neutron Source) in J-PARC (Japan Proton Accelerator Research Complex). The lifetime was estimated taking the loading conditions and materials degradation into account. The loads imposed on the target vessel were the static stresses due to thermal expansion and static pre-pressure on the He gas and mercury, and the dynamic stresses due to the thermally shocked pressure waves generated repeatedly at 25 Hz. Materials used in the target vessel will be degraded by fatigue, neutron and proton irradiation, mercury immersion, pitting damage, etc. The imposed stresses were evaluated through static and dynamic structural analyses. The material degradations were deduced based on published experimental data. As a result, it was quantitatively confirmed that the failure probability over the design lifetime is very low, 10^-11 for the safety hull, meaning that it is very unlikely to fail during the design lifetime. On the other hand, the beam window of the mercury vessel, which is subjected to high-pressure waves, exhibits a failure probability of 12%. It was concluded, therefore, that any mercury leaked from a failure at the beam window is adequately kept in the space between the safety hull and the mercury vessel, monitored by mercury-leakage sensors. (author)

  12. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.

  13. A probability-based multi-cycle sorting method for 4D-MRI: A simulation study.

    Science.gov (United States)

    Liang, Xiao; Yin, Fang-Fang; Liu, Yilin; Cai, Jing

    2016-12-01

    To develop a novel probability-based sorting method capable of generating multiple breathing cycles of 4D-MRI images, and to evaluate the performance of this new method by comparing it with conventional phase-based methods in terms of image quality and tumor motion measurement. Based on previous findings that the breathing motion probability density function (PDF) of a single breathing cycle is dramatically different from the true stabilized PDF that results from many breathing cycles, it is expected that a probability-based sorting method capable of generating multiple breathing cycles of 4D images may capture breathing variation information missing from conventional single-cycle sorting methods. The overall idea is to identify a few main breathing cycles (and their corresponding weightings) that can best represent the main breathing patterns of the patient, and then reconstruct a set of 4D images for each of the identified main breathing cycles. This method is implemented in three steps: (1) the breathing signal is decomposed into individual breathing cycles, characterized by amplitude and period; (2) individual breathing cycles are grouped based on amplitude and period to determine the main breathing cycles. If a group contains more than 10% of all breathing cycles in a breathing signal, it is determined to be a main breathing pattern group and is represented by the average of the individual breathing cycles in the group; (3) for each main breathing cycle, a set of 4D images is reconstructed using a result-driven sorting method adapted from our previous study. The probability-based sorting method was first tested on 26 patients' breathing signals to evaluate its feasibility of improving target motion PDF. The new method was subsequently tested for a sequential image acquisition scheme on the 4D digital extended cardiac torso (XCAT) phantom.
Performance of the probability-based and conventional sorting methods was evaluated in terms of target volume precision and accuracy.
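
    Step (2) of the sorting method, grouping cycles by amplitude and period and keeping groups holding more than 10% of all cycles, can be sketched as follows. The bin widths, threshold handling, and data are our own illustrative choices, not values from the study:

```python
from collections import defaultdict

def main_breathing_cycles(cycles, amp_bin=0.5, period_bin=0.5, threshold=0.10):
    """Group (amplitude, period) cycles into bins; a bin containing more than
    `threshold` of all cycles is a main breathing pattern, represented by the
    average of its member cycles, returned together with its weighting."""
    groups = defaultdict(list)
    for amp, period in cycles:
        key = (round(amp / amp_bin), round(period / period_bin))
        groups[key].append((amp, period))
    mains = []
    for members in groups.values():
        weight = len(members) / len(cycles)
        if weight > threshold:
            avg_amp = sum(a for a, _ in members) / len(members)
            avg_per = sum(p for _, p in members) / len(members)
            mains.append(((avg_amp, avg_per), weight))
    return mains

# Hypothetical cycles: (amplitude in cm, period in s); the lone outlier cycle
# holds only 10% of the signal and so does not form a main pattern group.
cycles = [(1.0, 4.0)] * 6 + [(2.0, 5.0)] * 3 + [(3.5, 6.5)] * 1
for (amp, per), w in main_breathing_cycles(cycles):
    print(f"main cycle: amp={amp:.1f} cm, period={per:.1f} s, weight={w:.0%}")
```

    Each returned main cycle would then seed its own 4D reconstruction in step (3), with the weightings indicating how representative each pattern is.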

  14. [Interpretation and use of routine pulmonary function tests: Spirometry, static lung volumes, lung diffusion, arterial blood gas, methacholine challenge test and 6-minute walk test].

    Science.gov (United States)

    Bokov, P; Delclaux, C

    2016-02-01

    Resting pulmonary function tests (PFT) include the assessment of ventilatory capacity: spirometry (forced expiratory flows and mobilisable volumes) and static volume assessment, notably using body plethysmography. Spirometry allows the potential definition of an obstructive defect, while static volume assessment allows the potential definition of a restrictive defect (decrease in total lung capacity) and thoracic hyperinflation (increase in static volumes). It must be kept in mind that this evaluation is incomplete and that an assessment of ventilatory demand is often warranted, especially when facing dyspnoea: evaluation of arterial blood gas (searching for respiratory insufficiency) and measurement of the transfer coefficient of the lung, which, together with the measurement of alveolar volume, allows calculation of the diffusing capacity of the lung for CO (DLCO: assessment of the alveolar-capillary wall and capillary blood volume). All these pulmonary function tests were the subject of an American-European task force on the standardisation of lung function testing published in 2005 and translated into French in 2007. Interpretative strategies for lung function tests have been recommended, which define abnormal lung function tests using the 5th and 95th percentiles of predicted values (lower and upper limits of normal values). Thus, these recommendations need to be implemented in all pulmonary function test units. A methacholine challenge test will only be performed in the presence of an intermediate pre-test probability for asthma (diagnostic uncertainty), which is an infrequent setting. The most convenient exertional test is the 6-minute walk test, which allows the assessment of walking performance, the search for arterial desaturation and the quantification of the dyspnoea complaint. Copyright © 2015 Société nationale française de médecine interne (SNFMI). Published by Elsevier SAS. All rights reserved.

  15. Failure Probability Calculation Method Using Kriging Metamodel-based Importance Sampling Method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seunggyu [Korea Aerospace Research Institue, Daejeon (Korea, Republic of); Kim, Jae Hoon [Chungnam Nat’l Univ., Daejeon (Korea, Republic of)

    2017-05-15

    The kernel density was determined based on sampling points obtained in a Markov chain simulation and was used as the importance sampling function. A Kriging metamodel was constructed in greater detail in the vicinity of the limit state. The failure probability was calculated by importance sampling performed on the Kriging metamodel. A pre-existing method was modified to obtain more sampling points for the kernel density in the vicinity of the limit state, and a stable numerical method was proposed to find the kernel density parameter. To assess the adequacy of the Kriging metamodel, the possible change in the calculated failure probability due to the metamodel's uncertainty was also quantified.
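    The paper's kernel-density proposal and Kriging surrogate are specific to its workflow, but the underlying importance-sampling estimator of a failure probability can be illustrated with a plain sketch (toy limit state and shifted-Gaussian proposal of my choosing, not the authors' method):

```python
import numpy as np

def failure_probability_is(g, proposal_mean, n=200_000, seed=0):
    """Estimate P(g(X) < 0) for X ~ N(0, 1) by importance sampling,
    drawing from a Gaussian proposal centred near the limit state."""
    rng = np.random.default_rng(seed)
    x = rng.normal(proposal_mean, 1.0, n)            # draws from proposal q
    # weight = standard-normal pdf / proposal pdf (log form for stability)
    log_w = -0.5 * x**2 + 0.5 * (x - proposal_mean) ** 2
    return float(np.mean((g(x) < 0.0) * np.exp(log_w)))

# Toy limit state g(x) = 3 - x: failure means x > 3, whose exact
# probability is Phi(-3), roughly 1.35e-3.
p_hat = failure_probability_is(lambda x: 3.0 - x, proposal_mean=3.0)
```

    Centring the proposal on the limit state is what makes rare failures common under the sampling distribution; the weights undo the bias, which is the same idea the kernel-density construction serves in the paper.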

  16. Prioritizing molecular markers to test for in the initial workup of advanced non-small cell lung cancer: wants versus needs.

    Science.gov (United States)

    West, Howard

    2017-09-01

    The current standard of care for molecular marker testing in patients with advanced non-small cell lung cancer (NSCLC) has been evolving over several years and is a product of the quality of the evidence supporting a targeted therapy for a specific molecular marker, the pre-test probability of that marker in the population, and the magnitude of benefit seen with that treatment. Among the markers that have one or more matched targeted therapies, only a few are most clearly worth prioritizing for detection in the first-line setting, so that their matched therapies can supplant other first-line alternatives, and only in a subset of patients, as defined currently by NSCLC histology. Specifically, this currently includes testing for an activating epidermal growth factor receptor (EGFR) mutation or an anaplastic lymphoma kinase (ALK) or ROS1 rearrangement. This article reviews the history and data supporting the prioritization of these markers in patients with non-squamous NSCLC, a histologically selected population in whom the probability of these markers, combined with the anticipated efficacy of targeted therapies against them, is high enough to favor these treatments in the first-line setting. In reviewing the evidence supporting this very limited core subset of the most valuable molecular markers to detect in the initial workup of such patients, we can also see the criteria that other actionable markers need to meet in order to be widely recognized as valuable enough to warrant prioritization in the initial workup of advanced NSCLC as well.

  17. Time-dependent earthquake probability calculations for southern Kanto after the 2011 M9.0 Tohoku earthquake

    Science.gov (United States)

    Nanjo, K. Z.; Sakai, S.; Kato, A.; Tsuruoka, H.; Hirata, N.

    2013-05-01

    Seismicity in southern Kanto activated with the 2011 March 11 Tohoku earthquake of magnitude M9.0, but does this cause a significant difference in the probability of more earthquakes at the present or in the future? To answer this question, we examine the effect of a change in the seismicity rate on the probability of earthquakes. Our data set is from the Japan Meteorological Agency earthquake catalogue, downloaded on 2012 May 30. Our approach is based on time-dependent earthquake probability calculations, often used for aftershock hazard assessment, which rest on two statistical laws: the Gutenberg-Richter (GR) frequency-magnitude law and the Omori-Utsu (OU) aftershock-decay law. We first confirm that the seismicity following a quake of M4 or larger is well modelled by the GR law with b ˜ 1. We also find good agreement with the OU law with p ˜ 0.5, which indicates that the decay was notably slow. Based on these results, we then calculate the most probable estimates of future M6-7-class events for various periods, all with a starting date of 2012 May 30. The estimates are higher than pre-quake levels if we consider a period of 3-yr duration or shorter. However, for statistics-based forecasting such as this, errors that arise from parameter estimation must be considered. Taking into account the contribution of these errors to the probability calculations, we conclude that any increase in the probability of earthquakes is insignificant. Although we try to avoid overstating the change in probability, our observations combined with results from previous studies support the likelihood that afterslip (fault creep) in southern Kanto will slowly relax a stress step caused by the Tohoku earthquake. This afterslip in turn reminds us of the potential for stress redistribution to the surrounding regions. We note the importance of varying hazards not only in time but also in space to improve the probabilistic seismic hazard assessment for southern Kanto.
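    The two laws combine into a standard aftershock forecast: integrate the Omori-Utsu rate over the forecast window, scale it down to the target magnitude with the Gutenberg-Richter b-value, and convert the expected count into a probability under a Poisson assumption. A generic sketch (the parameter values in any real use would come from the catalogue fit, not from this code):

```python
import math

def expected_events(k, c, p, b, m_ref, m_target, t1, t2):
    """Expected number of events with magnitude >= m_target in the window
    [t1, t2] (days), given an Omori-Utsu rate k / (t + c)**p fitted for
    magnitude >= m_ref and a Gutenberg-Richter b-value."""
    if abs(p - 1.0) < 1e-9:
        integral = math.log((t2 + c) / (t1 + c))
    else:
        integral = ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)
    return k * integral * 10.0 ** (-b * (m_target - m_ref))

def prob_at_least_one(expected):
    """Poisson probability of one or more events: 1 - exp(-E[N])."""
    return 1.0 - math.exp(-expected)
```

    With p ˜ 0.5 as estimated in the study, the time integral decays slowly, which is why multi-year windows can still carry expected counts above pre-quake levels.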

  18. Use of a field model to analyze probable fire environments encountered within the complex geometries of nuclear power plants

    International Nuclear Information System (INIS)

    Boccio, J.L.; Usher, J.L.; Singhal, A.K.; Tam, L.T.

    1985-08-01

    A fire in a nuclear power plant (NPP) can damage equipment needed to safely operate the plant and thereby either directly cause an accident or else reduce the plant's margin of safety. The development of a field-model fire code to analyze the probable fire environments encountered within NPPs is discussed. A set of fire tests carried out under the aegis of the US Nuclear Regulatory Commission (NRC) is described. The results of these tests are then utilized to validate the field model.

  19. Parametric modeling of probability of bank loan default in Kenya ...

    African Journals Online (AJOL)

    This makes the study on probability of a customer defaulting very useful while analyzing the credit risk policies. In this paper, we use a raw data set that contains demographic information about the borrowers. The data sets have been used to identify which risk factors associated with the borrowers contribute towards default.

  20. Student Perceptions of the Progress Test in Two Settings and the Implications for Test Deployment

    Science.gov (United States)

    Wade, Louise; Harrison, Chris; Hollands, James; Mattick, Karen; Ricketts, Chris; Wass, Val

    2012-01-01

    Background: The Progress Test (PT) was developed to assess student learning within integrated curricula. Whilst it is effective in promoting and rewarding deep approaches to learning in some settings, we hypothesised that implementation of the curriculum (design and assessment) may impact on students' preparation for the PT and their learning.…

  1. The Valuation of Insurance under Uncertainty: Does Information about Probability Matter?

    OpenAIRE

    Carmela Di Mauro; Anna Maffioletti

    2001-01-01

    In a laboratory experiment we test the hypothesis that consumers' valuation of insurance is sensitive to the amount of information available on the probability of a potential loss. In order to test this hypothesis we simulate a market in which we elicit individuals' willingness to pay to insure against a loss characterised either by known or else vague probabilities. We use two distinct treatments by providing subjects with different information over the vague probabilities of loss. In genera...

  2. Comparison of pre-test analyses with the Sizewell-B 1:10 scale prestressed concrete containment test

    International Nuclear Information System (INIS)

    Dameron, R.A.; Rashid, Y.R.; Parks, M.B.

    1991-01-01

    This paper describes pretest analyses of a one-tenth scale model of the 'Sizewell-B' prestressed concrete containment building. The work was performed by ANATECH Research Corp. under contract with Sandia National Laboratories (SNL). Hydraulic testing of the model was conducted in the United Kingdom by the Central Electricity Generating Board (CEGB). In order to further their understanding of containment behavior, the USNRC, through an agreement with the United Kingdom Atomic Energy Authority (UKAEA), also participated in the test program with SNL serving as their technical agent. The analyses that were conducted included two global axisymmetric models with 'bonded' and 'unbonded' analytical treatment of meridional tendons, a 3D quarter model of the structure, an axisymmetric representation of the equipment hatch region, and local plane stress and r-θ models of a buttress. Results of these analyses are described and compared with the results of the test. A global hoop failure at midheight of the cylinder and a shear/bending type failure at the base of the cylinder wall were both found to have roughly equal probability of occurrence; however, the shear failure mode had higher uncertainty associated with it. Consequently, significant effort was dedicated to improving the modeling capability for concrete shear behavior. This work is also described briefly. (author)

  3. Comparison of pre-test analyses with the Sizewell-B 1:10 scale prestressed concrete containment test

    International Nuclear Information System (INIS)

    Dameron, R.A.; Rashid, Y.R.; Parks, M.B.

    1991-01-01

    This paper describes pretest analyses of a one-tenth scale model of the Sizewell-B prestressed concrete containment building. The work was performed by ANATECH Research Corp. under contract with Sandia National Laboratories (SNL). Hydraulic testing of the model was conducted in the United Kingdom by the Central Electricity Generating Board (CEGB). In order to further their understanding of containment behavior, the USNRC, through an agreement with the United Kingdom Atomic Energy Authority (UKAEA), also participated in the test program with SNL serving as their technical agent. The analyses that were conducted included two global axisymmetric models with 'bonded' and 'unbonded' analytical treatment of meridional tendons, a 3D quarter model of the structure, an axisymmetric representation of the equipment hatch region, and local plane stress and r-θ models of a buttress. Results of these analyses are described and compared with the results of the test. A global hoop failure at midheight of the cylinder and a shear/bending type failure at the base of the cylinder wall were both found to have roughly equal probability of occurrence; however, the shear failure mode had higher uncertainty associated with it. Consequently, significant effort was dedicated to improving the modeling capability for concrete shear behavior. This work is also described briefly. 5 refs., 7 figs

  4. Theoretical value of pre-trade testing for Salmonella in Swedish cattle herds.

    Science.gov (United States)

    Sternberg Lewerin, Susanna

    2018-05-01

    The Swedish Salmonella control programme includes mandatory action if Salmonella is detected in a herd. The aim of this study was to assess the relative value of different strategies for pre-movement testing of cattle. Three fictitious herds were included: dairy, beef and specialised calf-fattening. The yearly risks of introducing Salmonella with and without individual serological or bulk milk testing were assessed as well as the effects of sourcing animals from low-prevalence areas or reducing the number of source herds. The initial risk was highest for the calf-fattening herd and lowest for the beef herd. For the beef and dairy herds, the yearly risk of Salmonella introduction was reduced by about 75% with individual testing. Sourcing animals from low-prevalence areas reduced the risk by >99%. For the calf-fattening herd, the yearly risk was reduced by almost 50% by individual testing or sourcing animals from a maximum of five herds. The method was useful for illustrating effects of risk mitigation when introducing animals into a herd. Sourcing animals from low-risk areas (or herds) is more effective than single testing of individual animals or bulk milk. A comprehensive approach to reduce the risk of introducing Salmonella from source herds is justified. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Salt intake, obesity, and pre-hypertension among Iranian adults: a cross-sectional study

    International Nuclear Information System (INIS)

    Khosravi, A.; Toghianifar, N.; Sarrafzodegan, N

    2012-01-01

    Objective: Overweight and obese subjects are prone to have a high salt intake. This study aimed to investigate the relationship between salt intake and pre-hypertension among overweight and obese subjects. Methodology: This was a cross-sectional study performed in the setting of a community-based intervention: the Isfahan Healthy Heart Program (IHHP). In total, 806 subjects with normal blood pressure or pre-hypertension entered the study. Salt intake, BMI, and blood pressure were measured using standard methods. Results: Salt intake was 9.19 +- 5.34, 11.6 +- 6.87, and 11.64 +- 6.68 gm/d in normal-weight, overweight and obese subjects with normal blood pressure, respectively (p=0.0001). The values for normal-weight, overweight and obese pre-hypertensive subjects were 12.04 +- 8.03, 12.41 +- 6.45, and 12.52 +- 7.63 gm/d, respectively (p=0.236). The unadjusted odds ratio for pre-hypertension among obese subjects was 4.78 (95% CI: 2.38-9.60). The odds ratio was 4.73 (95% CI: 2.19-10.19), 4.65 (95% CI: 2.15-10.05), and 3.37 (95% CI: 1.45-7.85) after adjustment for sociodemographic characteristics, lifestyle factors, and salt intake, respectively. An increase of one gram per day in salt intake increased the probability of having pre-hypertension by 5% after adjusting for age, education, BMI, and lifestyle factors. Conclusion: The findings of this study support a role for high salt intake in the high blood pressure of overweight and obese subjects. (author)
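    If, as the surrounding odds ratios suggest, the per-gram 5% figure comes from an adjusted logistic model, it is an odds ratio, which acts multiplicatively on the odds rather than directly on the probability. A small conversion sketch (the baseline probability below is illustrative, not taken from the study):

```python
def probability_after_or(baseline_prob, odds_ratio):
    """Apply an odds ratio to a baseline probability:
    new_odds = baseline_odds * OR, then convert back to a probability."""
    odds = baseline_prob / (1.0 - baseline_prob) * odds_ratio
    return odds / (1.0 + odds)

# e.g. a per-gram OR of 1.05 compounded over 5 extra grams of salt per day
p = 0.30
for _ in range(5):
    p = probability_after_or(p, 1.05)
```

    The distinction matters at high baseline probabilities, where a 5% increase in odds corresponds to a much smaller absolute change in probability.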

  6. Pre-processing by data augmentation for improved ellipse fitting.

    Science.gov (United States)

    Kumar, Pankaj; Belchamber, Erika R; Miklavcic, Stanley J

    2018-01-01

    Ellipse fitting is a highly researched and mature topic. Surprisingly, however, no existing method has thus far considered the data point eccentricity in its ellipse fitting procedure. Here, we introduce the concept of eccentricity of a data point, in analogy with the idea of ellipse eccentricity. We then show empirically that, irrespective of ellipse fitting method used, the root mean square error (RMSE) of a fit increases with the eccentricity of the data point set. The main contribution of the paper is based on the hypothesis that if the data point set were pre-processed to strategically add additional data points in regions of high eccentricity, then the quality of a fit could be improved. Conditional validity of this hypothesis is demonstrated mathematically using a model scenario. Based on this confirmation we propose an algorithm that pre-processes the data so that data points with high eccentricity are replicated. The improvement of ellipse fitting is then demonstrated empirically in real-world application of 3D reconstruction of a plant root system for phenotypic analysis. The degree of improvement for different underlying ellipse fitting methods as a function of data noise level is also analysed. We show that almost every method tested, irrespective of whether it minimizes algebraic error or geometric error, shows improvement in the fit following data augmentation using the proposed pre-processing algorithm.

  7. Different goodness of fit tests for Rayleigh distribution in ranked set sampling

    Directory of Open Access Journals (Sweden)

    Amer Al-Omari

    2016-03-01

    Full Text Available In this paper, different goodness of fit tests for the Rayleigh distribution are considered based on simple random sampling (SRS) and ranked set sampling (RSS) techniques. The performance of the suggested tests is evaluated in terms of power by using Monte Carlo simulation. It is found that the suggested RSS tests perform better than their SRS counterparts.
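    The RSS mechanics behind such power comparisons can be sketched directly: in each cycle, draw m sets of m units, rank each set, and keep the i-th order statistic from the i-th set. A minimal sketch with a Rayleigh(σ) parent (σ, the set size and the cycle count are arbitrary choices, not the paper's simulation design):

```python
import math
import random

def rayleigh_draw(sigma, rng):
    """Inverse-CDF draw from F(x) = 1 - exp(-x^2 / (2 sigma^2))."""
    return sigma * math.sqrt(-2.0 * math.log(1.0 - rng.random()))

def ranked_set_sample(sigma, set_size, cycles, rng):
    """Ranked set sampling: per cycle, draw `set_size` sets of `set_size`
    units each, and keep the i-th smallest value from the i-th set."""
    sample = []
    for _ in range(cycles):
        for i in range(set_size):
            ordered = sorted(rayleigh_draw(sigma, rng) for _ in range(set_size))
            sample.append(ordered[i])
    return sample

rss = ranked_set_sample(sigma=1.0, set_size=3, cycles=2000, rng=random.Random(42))
```

    The stratification across order statistics is what gives RSS-based tests their extra power relative to an SRS of the same size.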

  8. Changing Default Fluoroscopy Equipment Settings Decreases Entrance Skin Dose in Patients.

    Science.gov (United States)

    Canales, Benjamin K; Sinclair, Lindsay; Kang, Diana; Mench, Anna M; Arreola, Manuel; Bird, Vincent G

    2016-04-01

    Proper fluoroscopic education and protocols may reduce the patient radiation dose but few prospective studies in urology have been performed. Using optically stimulated luminescent dosimeters we tested whether fluoroscopy time and/or entrance skin dose would decrease after educational and radiation reduction protocols. At default manufacturer settings fluoroscopy time and entrance skin dose were prospectively measured using optically stimulated luminescent dosimeters in patients undergoing ureteroscopy, retrograde pyelogram/stent or percutaneous nephrolithotomy with access for stone disease. A validated radiation safety competency test was administered to urology faculty and residents before and after web-based, hands-on fluoroscopy training. Default fluoroscopy settings were changed from continuous to intermittent pulse rate and from standard to half-dose output. Fluoroscopy time and entrance skin dose were then measured again. The cohorts of 44 pre-protocol and 50 post-protocol patients with stones were similarly matched. The change in mean fluoroscopy time and entrance skin dose from pre-protocol to post-protocol was -0.6 minutes and -11.6 mGy (33%) for percutaneous nephrolithotomy (p = 0.62). Changing default settings to intermittent pulse rate (12 frames per second) and half-dose output lowered the entrance skin dose by 30% across all endourology patients, most significantly during percutaneous nephrolithotomy. To limit patient radiation exposure, fluoroscopy default settings should be decreased before all endourology procedures, and imaging equipment manufacturers should consider lowering standard default renal settings. Copyright © 2016 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  9. Using Fuzzy Probability Weights in Cumulative Prospect Theory

    Directory of Open Access Journals (Sweden)

    Užga-Rebrovs Oļegs

    2016-12-01

    Full Text Available During the past years, a rapid growth has been seen in the descriptive approaches to decision choice. As opposed to normative expected utility theory, these approaches are based on the subjective perception of probabilities by the individuals, which takes place in real situations of risky choice. The modelling of this kind of perceptions is made on the basis of probability weighting functions. In cumulative prospect theory, which is the focus of this paper, decision prospect outcome weights are calculated using the obtained probability weights. If the value functions are constructed in the sets of positive and negative outcomes, then, based on the outcome value evaluations and outcome decision weights, generalised evaluations of prospect value are calculated, which are the basis for choosing an optimal prospect.
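    The construction described above can be made concrete with the inverse-S weighting function of Tversky and Kahneman (1992) for gains; γ = 0.61 is their commonly cited estimate, and the paper may use a different parameterisation:

```python
def w(p, gamma=0.61):
    """Tversky-Kahneman probability weighting function:
    w(p) = p^g / (p^g + (1-p)^g)^(1/g)."""
    return p**gamma / (p**gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

def decision_weights(probs):
    """Rank-dependent decision weights for outcomes ordered best to worst:
    pi_i = w(p_1 + ... + p_i) - w(p_1 + ... + p_(i-1))."""
    weights, cum_prev, cum = [], 0.0, 0.0
    for p in probs:
        cum += p
        weights.append(w(cum) - w(cum_prev))
        cum_prev = cum
    return weights
```

    The prospect value is then the weighted sum of outcome values, Σ π_i v(x_i), with separate weight and value functions on the gain and loss sides.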

  10. Method for Automatic Selection of Parameters in Normal Tissue Complication Probability Modeling.

    Science.gov (United States)

    Christophides, Damianos; Appelt, Ane L; Gusnanto, Arief; Lilley, John; Sebag-Montefiore, David

    2018-07-01

    To present a fully automatic method to generate multiparameter normal tissue complication probability (NTCP) models and compare its results with those of a published model, using the same patient cohort. Data were analyzed from 345 rectal cancer patients treated with external radiation therapy to predict the risk of patients developing grade 1 or ≥2 cystitis. In total, 23 clinical factors were included in the analysis as candidate predictors of cystitis. Principal component analysis was used to decompose the bladder dose-volume histogram into 8 principal components, explaining more than 95% of the variance. The data set of clinical factors and principal components was divided into training (70%) and test (30%) data sets, with the training data set used by the algorithm to compute an NTCP model. The first step of the algorithm was to obtain a bootstrap sample, followed by multicollinearity reduction using the variance inflation factor and genetic algorithm optimization to determine an ordinal logistic regression model that minimizes the Bayesian information criterion. The process was repeated 100 times, and the model with the minimum Bayesian information criterion was recorded on each iteration. The most frequent model was selected as the final "automatically generated model" (AGM). The published model and AGM were fitted on the training data sets, and the risk of cystitis was calculated. The 2 models had no significant differences in predictive performance, both for the training and test data sets (P value > .05) and found similar clinical and dosimetric factors as predictors. Both models exhibited good explanatory performance on the training data set (P values > .44), which was reduced on the test data sets (P values < .05). The predictive value of the AGM is equivalent to that of the expert-derived published model. It demonstrates potential in saving time, tackling problems with a large number of parameters, and standardizing variable selection in NTCP
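    The objective minimized inside the selection loop is the Bayesian information criterion; a minimal sketch of that comparison step follows (standard BIC definition assumed; the full pipeline with PCA, variance-inflation-factor screening and the genetic algorithm is not reproduced here):

```python
import math

def bic(log_likelihood, n_params, n_obs):
    """Bayesian information criterion: lower is better. The log(n)
    penalty makes BIC favour sparser models than AIC as n grows."""
    return n_params * math.log(n_obs) - 2.0 * log_likelihood

# With n_obs = 345 (as in this cohort), a candidate with one extra
# parameter must raise the log-likelihood by at least 0.5 * log(345)
# (about 2.9) to win on BIC.
```

    Repeating the fit over bootstrap resamples and keeping the modal model, as the authors do, guards against any single resample's BIC winner being an artefact.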

  11. Probability of islanding in utility networks due to grid connected photovoltaic power systems

    Energy Technology Data Exchange (ETDEWEB)

    Verhoeven, B.

    2002-09-15

    This report for the International Energy Agency (IEA), prepared by Task 5 of the Photovoltaic Power Systems (PVPS) programme, takes a look at the probability of islanding in utility networks due to grid-connected photovoltaic power systems. The mission of the Photovoltaic Power Systems Programme is to enhance the international collaboration efforts which accelerate the development and deployment of photovoltaic solar energy. Task 5 deals with issues concerning grid-interconnection and distributed PV power systems. This report summarises the results of a study on the probability of islanding in power networks with a high penetration level of grid-connected PV-systems. The results are based on measurements performed during one year in a Dutch utility network. The measurements of active and reactive power were taken every second and stored in a computer for off-line analysis. The area examined and its characteristics are described, as are the test set-up and the equipment used. The ratios between load and PV-power are discussed. The general conclusion is that the probability of islanding is virtually zero for low, medium and high penetration levels of PV-systems.

  12. Transitional Probabilities Are Prioritized over Stimulus/Pattern Probabilities in Auditory Deviance Detection: Memory Basis for Predictive Sound Processing.

    Science.gov (United States)

    Mittag, Maria; Takegata, Rika; Winkler, István

    2016-09-14

    Representations encoding the probabilities of auditory events do not directly support predictive processing. In contrast, information about the probability with which a given sound follows another (transitional probability) allows predictions of upcoming sounds. We tested whether behavioral and cortical auditory deviance detection (the latter indexed by the mismatch negativity event-related potential) relies on probabilities of sound patterns or on transitional probabilities. We presented healthy adult volunteers with three types of rare tone-triplets among frequent standard triplets of high-low-high (H-L-H) or L-H-L pitch structure: proximity deviant (H-H-H/L-L-L), reversal deviant (L-H-L/H-L-H), and first-tone deviant (L-L-H/H-H-L). If deviance detection was based on pattern probability, reversal and first-tone deviants should be detected with similar latency because both differ from the standard at the first pattern position. If deviance detection was based on transitional probabilities, then reversal deviants should be the most difficult to detect because, unlike the other two deviants, they contain no low-probability pitch transitions. The data clearly showed that both behavioral and cortical auditory deviance detection uses transitional probabilities. Thus, the memory traces underlying cortical deviance detection may provide a link between stimulus probability-based change/novelty detectors operating at lower levels of the auditory system and higher auditory cognitive functions that involve predictive processing. Our research presents the first definite evidence for the auditory system prioritizing transitional probabilities over probabilities of individual sensory events. Forming representations for transitional probabilities paves the way for predictions of upcoming sounds. Several recent theories suggest that predictive processing provides the general basis of human perception, including important auditory functions, such as auditory scene analysis. Our
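    The distinction the study tests can be made concrete by estimating both quantities from a tone stream: stimulus/pattern probabilities are plain frequencies, whereas transitional probabilities condition each tone on its predecessor. A small estimation sketch on a toy H/L sequence (not the actual stimulus protocol):

```python
from collections import Counter

def transition_probabilities(seq):
    """Maximum-likelihood estimate of P(next tone | current tone)
    from bigram counts in a symbol sequence."""
    bigrams = Counter(zip(seq, seq[1:]))
    totals = Counter(seq[:-1])
    return {(a, b): n / totals[a] for (a, b), n in bigrams.items()}

# In a stream built from repeated H-L-H triplets, L is always followed
# by H, so the L -> H transitional probability is 1 here, while H can be
# followed by either L (within a triplet) or H (across triplets).
tp = transition_probabilities("HLH" * 20)
```

    A deviant tone is then "surprising" to a transitional-probability code exactly when its bigram has a low estimated probability, even if the tone itself is frequent overall.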

  13. On the Cut-off Point for Combinatorial Group Testing

    DEFF Research Database (Denmark)

    Fischer, Paul; Klasner, N.; Wegener, I.

    1999-01-01

    is answered by 1 if Q contains at least one essential object and by 0 otherwise. In the statistical setting the objects are essential, independently of each other, with a given probability p; in the combinatorial setting the number k of essential objects is known. ... The cut-off point of statistical group testing is equal to p* = ½(3 − √5) ≈ 0.382, i.e., the strategy of testing each object individually minimizes the average number of queries iff p >= p* or n = 1. In the combinatorial setting the worst case number of queries is of interest. It has been conjectured that the cut-off point of combinatorial
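    The cut-off can be illustrated with the simplest two-stage (Dorfman) pooling scheme, which is cruder than the optimal strategies the paper analyses: even this scheme beats individual testing for small p, while the paper's cut-off for optimal strategies, p* = ½(3 − √5) ≈ 0.382, marks where no pooling strategy can beat individual testing at all.

```python
import math

def dorfman_tests_per_item(p, k):
    """Expected tests per item for two-stage pooling with groups of k:
    one pooled query, then k individual queries if the pool is positive.
    Individual testing costs exactly 1 test per item."""
    q = 1.0 - p
    return 1.0 / k + (1.0 - q**k)

def best_group_size(p, k_max=64):
    """Group size minimizing the Dorfman cost for a given defect rate p."""
    return min(range(2, k_max + 1), key=lambda k: dorfman_tests_per_item(p, k))

p_star = (3.0 - math.sqrt(5.0)) / 2.0   # the paper's cut-off, ~0.382
```

    The Dorfman scheme stops paying off well below p*; closing the gap between ~0.3 and p* requires the deeper adaptive strategies studied in the paper.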

  14. Evaluation of the effect of prostate volume change on tumor control probability in LDR brachytherapy.

    Science.gov (United States)

    Knaup, Courtney; Mavroidis, Panayiotis; Stathakis, Sotirios; Smith, Mark; Swanson, Gregory; Papanikolaou, Niko

    2011-09-01

    This study evaluates low dose-rate brachytherapy (LDR) prostate plans to determine the biological effect of dose degradation due to prostate volume changes. In this study, 39 patients were evaluated. Pre-implant prostate volume was determined using ultrasound. These images were used with the treatment planning system (Nucletron Spot Pro 3.1(®)) to create treatment plans using (103)Pd seeds. Following the implant, patients were imaged using CT for post-implant dosimetry. From the pre and post-implant DVHs, the biologically equivalent dose and the tumor control probability (TCP) were determined using the biologically effective uniform dose. The model used RBE = 1.75 and α/β = 2 Gy. The prostate volume change between pre and post-implant image sets ranged from -8% to 110%. TCP and the mean dose were reduced up to 21% and 56%, respectively. TCP is observed to decrease as the mean dose to the prostate decreases. The post-implant tumor dose was generally observed to decrease, compared to the planned dose. A critical uniform dose of 130 Gy was established. Below this dose, TCP begins to fall off. It was also determined that patients with small prostates were more likely to suffer TCP decrease. The biological effect of post-operative prostate growth due to operative trauma in LDR was evaluated using this concept. The post-implant dose was lower than the planned dose due to an increase of prostate volume post-implant. A critical uniform dose of 130 Gy was determined, below which TCP began to decline.
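    The qualitative fall-off of TCP below a critical dose can be sketched with the basic Poisson TCP model; the clonogen number and radiosensitivity below are illustrative placeholders, not the study's fitted LDR parameters (which also involve RBE = 1.75 and α/β = 2 Gy):

```python
import math

def tcp_poisson(dose_gy, n_clonogens=1e6, alpha_per_gy=0.15):
    """Poisson tumour control probability TCP = exp(-N * S(D)), with
    simple exponential cell survival S(D) = exp(-alpha * D)."""
    surviving = n_clonogens * math.exp(-alpha_per_gy * dose_gy)
    return math.exp(-surviving)
```

    With these placeholder values, TCP sits near its plateau at high doses and collapses steeply over a narrow dose window, the same sigmoid behaviour behind the 130 Gy critical uniform dose reported in the study.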

  15. Evaluation of the effect of prostate volume change on tumor control probability in LDR brachytherapy

    Directory of Open Access Journals (Sweden)

    Courtney Knaup

    2011-09-01

    Full Text Available Purpose: This study evaluates low dose-rate brachytherapy (LDR) prostate plans to determine the biological effect of dose degradation due to prostate volume changes. Material and methods: In this study, 39 patients were evaluated. Pre-implant prostate volume was determined using ultrasound. These images were used with the treatment planning system (Nucletron Spot Pro 3.1®) to create treatment plans using 103Pd seeds. Following the implant, patients were imaged using CT for post-implant dosimetry. From the pre and post-implant DVHs, the biologically equivalent dose and the tumor control probability (TCP) were determined using the biologically effective uniform dose. The model used RBE = 1.75 and α/β = 2 Gy. Results: The prostate volume change between pre and post-implant image sets ranged from –8% to 110%. TCP and the mean dose were reduced up to 21% and 56%, respectively. TCP is observed to decrease as the mean dose to the prostate decreases. The post-implant tumor dose was generally observed to decrease, compared to the planned dose. A critical uniform dose of 130 Gy was established. Below this dose, TCP begins to fall off. It was also determined that patients with small prostates were more likely to suffer TCP decrease. Conclusions: The biological effect of post-operative prostate growth due to operative trauma in LDR was evaluated using this concept. The post-implant dose was lower than the planned dose due to an increase of prostate volume post-implant. A critical uniform dose of 130 Gy was determined, below which TCP began to decline.

  16. The Pacific Marine Energy Center - South Energy Test Site (PMEC-SETS)

    Energy Technology Data Exchange (ETDEWEB)

    Batten, Belinda [Oregon State Univ., Corvallis, OR (United States); Hellin, Dan [Oregon State Univ., Corvallis, OR (United States)

    2018-02-07

    The overall goal of this project was to build on existing progress to establish the Pacific Marine Energy Center South Energy Test Site (PMEC-SETS) as the nation's first fully permitted test site for wave energy converter arrays. Specifically, it plays an essential role in reducing levelized cost of energy for the wave energy industry by providing both the facility and resources to address the challenges of cost reduction.

  17. System and method for pre-cooling of buildings

    Science.gov (United States)

    Springer, David A.; Rainer, Leo I.

    2011-08-09

    A method for nighttime pre-cooling of a building comprising inputting one or more user settings, lowering the indoor temperature reading of the building during nighttime by operating an outside air ventilation system followed, if necessary, by a vapor compression cooling system. The method provides for nighttime pre-cooling of a building that maintains indoor temperatures within a comfort range based on the user input settings, calculated operational settings, and predictions of indoor and outdoor temperature trends for a future period of time such as the next day.

  18. Bayesian probability analysis: a prospective demonstration of its clinical utility in diagnosing coronary disease

    International Nuclear Information System (INIS)

    Detrano, R.; Yiannikas, J.; Salcedo, E.E.; Rincon, G.; Go, R.T.; Williams, G.; Leatherman, J.

    1984-01-01

    One hundred fifty-four patients referred for coronary arteriography were prospectively studied with stress electrocardiography, stress thallium scintigraphy, cine fluoroscopy (for coronary calcifications), and coronary angiography. Pretest probabilities of coronary disease were determined based on age, sex, and type of chest pain. These and pooled literature values for the conditional probabilities of test results based on disease state were used in Bayes theorem to calculate posttest probabilities of disease. The results of the three noninvasive tests were compared for statistical independence, a necessary condition for their simultaneous use in Bayes theorem. The test results were found to demonstrate pairwise independence in patients with and those without disease. Some dependencies that were observed between the test results and the clinical variables of age and sex were not sufficient to invalidate application of the theorem. Sixty-eight of the study patients had at least one major coronary artery obstruction of greater than 50%. When these patients were divided into low-, intermediate-, and high-probability subgroups according to their pretest probabilities, noninvasive test results analyzed by Bayesian probability analysis appropriately advanced 17 of them by at least one probability subgroup while only seven were moved backward. Of the 76 patients without disease, 34 were appropriately moved into a lower probability subgroup while 10 were incorrectly moved up. We conclude that posttest probabilities calculated from Bayes theorem more accurately classified patients with and without disease than did pretest probabilities, thus demonstrating the utility of the theorem in this application
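    The sequential application of Bayes' theorem described here can be sketched directly: each test converts a pretest probability into a posttest probability, and under the pairwise independence verified in the study, the output of one test serves as the input to the next. Sensitivity/specificity values below are illustrative, not the pooled literature values the authors used:

```python
def posttest_probability(pretest, sensitivity, specificity, positive=True):
    """Bayes' theorem for one dichotomous test result."""
    if positive:
        num = sensitivity * pretest
        den = num + (1.0 - specificity) * (1.0 - pretest)
    else:
        num = (1.0 - sensitivity) * pretest
        den = num + specificity * (1.0 - pretest)
    return num / den

# Chain conditionally independent tests: the posttest probability of
# one test becomes the pretest probability of the next.
p = 0.30                                   # from age, sex and chest-pain type
p = posttest_probability(p, 0.68, 0.77)    # positive stress ECG (illustrative)
p = posttest_probability(p, 0.85, 0.90)    # positive thallium scan (illustrative)
```

    This chaining is only valid when the tests are conditionally independent given disease state, which is exactly why the study checks pairwise independence before applying the theorem.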

  19. Set-up and Test Procedure for Suction Installation and Uninstallation of Bucket Foundation

    DEFF Research Database (Denmark)

    Koteras, Aleksandra Katarzyna

    This technical report describes the set-up and the test procedures for installation and uninstallation of a medium-scale model of a bucket foundation that can be performed in the geotechnical laboratory at Aalborg University. The installation of the bucket foundation can be tested with the use of suction under the bucket lid or by applying additional force through a hydraulic piston, forcing the bucket to penetrate into the soil. Tests for uninstallation are performed also with the use of water pressure, as a reverse process to the suction installation. The set-up and loading frame used for these tests have already been used for axially static and cyclic loading of piles (Thomassen, 2015a) and for axially static and cyclic loading of bucket foundations (Vaitkunaite et al., 2015).

  20. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations are derived that the transition probabilities of hopping algorithms in surface hopping calculations must obey in order to ensure equality between the average quantum and classical populations. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are supposed to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations assuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm that takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the proposed collective probabilities algorithm, the limitations of the FS algorithm, and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method.
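    A toy reading of the CP idea (not the authors' derivation): with the hopping decision made collectively, the minimal number of hops needed to keep the ensemble's classical state fractions tracking a two-state quantum population is simply proportional to the population change per step:

```python
def collective_hops(n_traj, pop_old, pop_new):
    """Minimal number of hops that keeps the classical ensemble fraction
    tracking the quantum population of the target state (two-state toy)."""
    delta = round(n_traj * (pop_new - pop_old))
    if delta > 0:
        return delta, 0          # trajectories hopping 1 -> 2, 2 -> 1
    return 0, -delta

# Quantum population of state 2 rises from 0.10 to 0.25 over one time step,
# with an ensemble of 200 trajectories: 30 hop up, none hop down.
print(collective_hops(200, 0.10, 0.25))
```

    In the IP limit, by contrast, each trajectory would hop independently with probability equal to the target-state quantum population, so the hop count would fluctuate around the collective value.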

  1. Assessing future vent opening locations at the Somma-Vesuvio volcanic complex: 2. Probability maps of the caldera for a future Plinian/sub-Plinian event with uncertainty quantification

    Science.gov (United States)

    Tadini, A.; Bevilacqua, A.; Neri, A.; Cioni, R.; Aspinall, W. P.; Bisson, M.; Isaia, R.; Mazzarini, F.; Valentine, G. A.; Vitale, S.; Baxter, P. J.; Bertagnini, A.; Cerminara, M.; de Michieli Vitturi, M.; Di Roberto, A.; Engwell, S.; Esposti Ongaro, T.; Flandoli, F.; Pistolesi, M.

    2017-06-01

    In this study, we combine reconstructions of volcanological data sets and inputs from a structured expert judgment to produce a first long-term probability map for vent opening location for the next Plinian or sub-Plinian eruption of Somma-Vesuvio. In the past, the volcano has exhibited significant spatial variability in vent location; this can exert a significant control on where hazards materialize (particularly of pyroclastic density currents). The new vent opening probability mapping has been performed through (i) development of spatial probability density maps with Gaussian kernel functions for different data sets and (ii) weighted linear combination of these spatial density maps. The epistemic uncertainties affecting these data sets were quantified explicitly with expert judgments and implemented following a doubly stochastic approach. Various elicitation pooling metrics and subgroupings of experts and target questions were tested to evaluate the robustness of outcomes. Our findings indicate that (a) Somma-Vesuvio vent opening probabilities are distributed inside the whole caldera, with a peak corresponding to the area of the present crater, but with more than 50% probability that the next vent could open elsewhere within the caldera; (b) there is a mean probability of about 30% that the next vent will open west of the present edifice; (c) there is a mean probability of about 9.5% that the next medium-large eruption will enlarge the present Somma-Vesuvio caldera, and (d) there is a nonnegligible probability (mean value of 6-10%) that the next Plinian or sub-Plinian eruption will have its initial vent opening outside the present Somma-Vesuvio caldera.
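    The two-step mapping procedure (a Gaussian kernel density map per data set, then a weighted linear combination) can be sketched as follows; the bandwidth, weights, and vent coordinates are invented for illustration and carry none of the paper's expert-judgment values:

```python
import math

def gaussian_kde(points, bandwidth):
    """2-D Gaussian kernel density estimate built from (x, y) sample points."""
    def density(x, y):
        s = sum(math.exp(-((x - px) ** 2 + (y - py) ** 2) / (2 * bandwidth ** 2))
                for px, py in points)
        return s / (len(points) * 2 * math.pi * bandwidth ** 2)
    return density

def combined_map(data_sets, weights, bandwidth=0.5):
    """Weighted linear combination of per-data-set spatial density maps."""
    maps = [gaussian_kde(pts, bandwidth) for pts in data_sets]
    total = sum(weights)
    return lambda x, y: sum(w * m(x, y) for w, m in zip(weights, maps)) / total

# Two hypothetical vent-location data sets, weighted 2:1 by expert judgment.
vents_a = [(0.0, 0.0), (0.2, 0.1)]
vents_b = [(1.0, 1.0)]
density = combined_map([vents_a, vents_b], weights=[2.0, 1.0])
print(density(0.0, 0.0) > density(1.0, 1.0))
```

    The doubly stochastic treatment in the study goes further, sampling the weights themselves from expert-elicited distributions to propagate epistemic uncertainty into the map.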

  2. Estimating the Probability of Vegetation to Be Groundwater Dependent Based on the Evaluation of Tree Models

    Directory of Open Access Journals (Sweden)

    Isabel C. Pérez Hoyos

    2016-04-01

    Full Text Available Groundwater Dependent Ecosystems (GDEs) are increasingly threatened by humans’ rising demand for water resources. Consequently, it is imperative to identify the location of GDEs to protect them. This paper develops a methodology to identify the probability of an ecosystem to be groundwater dependent. Probabilities are obtained by modeling the relationship between the known locations of GDEs and factors influencing groundwater dependence, namely water table depth and climatic aridity index. Probabilities are derived for the state of Nevada, USA, using modeled water table depth and aridity index values obtained from the Global Aridity database. The selected model results from a performance comparison of classification trees (CT) and random forests (RF). Based on a threshold-independent accuracy measure, RF has a better ability to generate probability estimates. Considering a threshold that minimizes the misclassification rate for each model, RF also proves to be more accurate. Regarding training accuracy, performance measures such as accuracy, sensitivity, and specificity are higher for RF. For the test set, higher values of accuracy and kappa for CT highlight the fact that these measures are greatly affected by low prevalence. As shown for RF, the choice of the cutoff probability value has important consequences on model accuracy and the overall proportion of locations where GDEs are found.
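    A typical threshold-independent accuracy measure of the kind referred to here is the area under the ROC curve, which can be computed directly as the probability that a known GDE site outranks a non-GDE site; the scores below are made up for illustration:

```python
def auc(scores_pos, scores_neg):
    """Threshold-independent accuracy: the probability that a randomly chosen
    positive site receives a higher predicted probability than a randomly
    chosen negative site (equivalent to the area under the ROC curve)."""
    wins = sum((p > n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical model scores for known GDE and non-GDE locations.
gde_scores = [0.9, 0.8, 0.4]
non_gde_scores = [0.7, 0.3, 0.2]
print(round(auc(gde_scores, non_gde_scores), 3))
```

    Unlike accuracy or kappa, this ranking-based measure is unaffected by the low prevalence of GDE sites that the authors flag for the test set.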

  3. LAMP-B: a Fortran program set for the lattice cell analysis by collision probability method

    International Nuclear Information System (INIS)

    Tsuchihashi, Keiichiro

    1979-02-01

    Nature of physical problem solved: LAMP-B solves an integral transport equation by the collision probability method for a wide variety of lattice cell geometries: spherical, plane and cylindrical lattice cells; square and hexagonal arrays of pin rods; annular clusters and square clusters. LAMP-B produces homogenized constants for multi- and/or few-group diffusion theory programs. Method of solution: LAMP-B performs an exact numerical integration to obtain the collision probabilities. Restrictions on the complexity of the problem: Not more than 68 groups in the fast group calculation, and not more than 20 regions in the resonance integral calculation. Typical running time: It varies with the number of energy groups and the selection of the geometry. Unusual features of the program: Any constituent subprogram, or any combination of them, can be used, so that partial use of this program is available. (author)

  4. Frequency, probability, and prediction: easy solutions to cognitive illusions?

    Science.gov (United States)

    Griffin, D; Buehler, R

    1999-02-01

    Many errors in probabilistic judgment have been attributed to people's inability to think in statistical terms when faced with information about a single case. Prior theoretical analyses and empirical results imply that the errors associated with case-specific reasoning may be reduced when people make frequentistic predictions about a set of cases. In studies of three previously identified cognitive biases, we find that frequency-based predictions are different from, but no better than, case-specific judgments of probability. First, in studies of the "planning fallacy," we compare the accuracy of aggregate frequency and case-specific probability judgments in predictions of students' real-life projects. When aggregate and single-case predictions are collected from different respondents, there is little difference between the two: both are overly optimistic and show little predictive validity. However, in within-subject comparisons, the aggregate judgments are significantly more conservative than the single-case predictions, though still optimistically biased. Results from studies of overconfidence in general knowledge and base rate neglect in categorical prediction underline a general conclusion: frequentistic predictions made for sets of events are no more statistically sophisticated, nor more accurate, than predictions made for individual events using subjective probability. Copyright 1999 Academic Press.

  5. Internal Medicine residents use heuristics to estimate disease probability

    OpenAIRE

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Background: Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. Method: We randomized 55 In...

  6. EDRN Pre-Validation of Multiplex Biomarker in Urine — EDRN Public Portal

    Science.gov (United States)

    The goal of this proposal is to begin to establish an EDRN “pre-validation” trial of a multiplex set of transcripts, including the ETS gene fusions, in post-DRE urine sediments. As can be evidenced by our preliminary data, we have established the utility of this multiplex urine test (which includes TMPRSS-ERG, SPINK1, PCA3 and GOLPH2) in a cohort of prospectively collected urine sediments from the University of Michigan EDRN CEVC site (collected by co-I, Dr. John Wei). In this proposal, we will run this multiplex assay on prospectively collected post-DRE urines collected from other EDRN sites. The idea is to couple this “pre-validation” study with an EDRN validation trial under consideration for the Gen-Probe PCA3 urine test (directed by Drs. John Wei and Harry Rittenhouse).

  7. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  8. Foreign Language Optical Character Recognition, Phase II: Arabic and Persian Training and Test Data Sets

    National Research Council Canada - National Science Library

    Davidson, Robert

    1997-01-01

    .... Each data set is divided into a training set, which is made available to developers, and a carefully matched equal-sized set of closely analogous samples, which is reserved for testing of the developers' products...

  9. Pre-Discovery Detection of ASASSN-18fv by Evryscope

    Science.gov (United States)

    Corbett, H.; Law, N.; Goeke, E.; Ratzloff, J.; Howard, W.; Fors, O.; del Ser, D.; Quimby, R. M.

    2018-03-01

    We have identified pre-discovery imaging of the probable classical nova ASASSN-18fv by Evryscope-South (http://evryscope.astro.unc.edu/), an array of 6-cm telescopes continuously monitoring 8000 square degrees of sky at 2-minute cadence from CTIO, Chile.

  10. Dental age estimation: the role of probability estimates at the 10 year threshold.

    Science.gov (United States)

    Lucas, Victoria S; McDonald, Fraser; Neil, Monica; Roberts, Graham

    2014-08-01

    The use of probability at the 18 year threshold has simplified the reporting of dental age estimates for emerging adults. The availability of simple-to-use, widely available software has enabled the development of the probability threshold for individual teeth in growing children. Tooth development stage data from a previous study at the 10 year threshold were reused to estimate the probability of developing teeth being above or below the 10 year threshold using the NORMDIST function in Microsoft Excel. The probabilities within an individual subject are averaged to give a single probability that a subject is above or below 10 years old. To test the validity of this approach, dental panoramic radiographs of 50 female and 50 male children within 2 years of the chronological age were assessed with the chronological age masked. Once the whole validation set of 100 radiographs had been assessed, the masking was removed and the chronological age and dental age compared. The dental age was compared with chronological age to determine whether the dental age correctly or incorrectly identified a validation subject as above or below the 10 year threshold. The probability estimates correctly identified children as above or below the threshold on 94% of occasions. Only 2% of the validation group with a chronological age of less than 10 years were assigned to the over-10-year group. This study indicates the very high accuracy of assignment at the 10 year threshold. Further work at other legally important age thresholds is needed to explore the value of this approach to the technique of age estimation. Copyright © 2014. Published by Elsevier Ltd.
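    The averaging procedure can be sketched as follows, assuming each observed tooth stage contributes a normal age distribution (the same assumption behind Excel's NORMDIST); the per-tooth means and SDs are hypothetical:

```python
import math

def prob_above_threshold(mean_age, sd_age, threshold=10.0):
    """P(true age > threshold) for one tooth, assuming a normal age-at-stage
    distribution (as modelled with Excel's NORMDIST function)."""
    z = (threshold - mean_age) / sd_age
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def subject_probability(teeth):
    """Average the per-tooth probabilities into one subject-level probability."""
    return sum(prob_above_threshold(m, s) for m, s in teeth) / len(teeth)

# Hypothetical (mean age, SD) pairs for the developmental stages observed.
teeth = [(10.4, 0.9), (9.8, 1.1), (10.9, 0.8)]
print(round(subject_probability(teeth), 3))
```

    A subject-level value above 0.5 would assign the child to the over-10 group under this scheme.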

  11. Maximizing the probability of satisfying the clinical goals in radiation therapy treatment planning under setup uncertainty

    International Nuclear Information System (INIS)

    Fredriksson, Albin; Hårdemark, Björn; Forsgren, Anders

    2015-01-01

    Purpose: This paper introduces a method that maximizes the probability of satisfying the clinical goals in intensity-modulated radiation therapy treatments subject to setup uncertainty. Methods: The authors perform robust optimization in which the clinical goals are constrained to be satisfied whenever the setup error falls within an uncertainty set. The shape of the uncertainty set is included as a variable in the optimization. The goal of the optimization is to modify the shape of the uncertainty set in order to maximize the probability that the setup error will fall within the modified set. Because the constraints enforce the clinical goals to be satisfied under all setup errors within the uncertainty set, this is equivalent to maximizing the probability of satisfying the clinical goals. This type of robust optimization is studied with respect to photon and proton therapy applied to a prostate case and compared to robust optimization using an a priori defined uncertainty set. Results: Slight reductions of the uncertainty sets resulted in plans that satisfied a larger number of clinical goals than optimization with respect to a priori defined uncertainty sets, both within the reduced uncertainty sets and within the a priori, nonreduced, uncertainty sets. For the prostate case, the plans taking reduced uncertainty sets into account satisfied 1.4 (photons) and 1.5 (protons) times as many clinical goals over the scenarios as the method taking a priori uncertainty sets into account. Conclusions: Reducing the uncertainty sets enabled the optimization to find better solutions with respect to the errors within the reduced as well as the nonreduced uncertainty sets and thereby achieve higher probability of satisfying the clinical goals. This shows that asking for a little less in the optimization sometimes leads to better overall plan quality

  12. EMC Pre-Compliance Tests and Educational Aspects

    Directory of Open Access Journals (Sweden)

    Lia Elena Aciu

    2018-05-01

    Full Text Available The aim of this paper is to present the results obtained from pre-compliance EMC measurements, according to the European standards, for a microcontroller-based device. The EMC measurements complement the education of students in electronics and electrical engineering, who, after building microcontroller devices, can see their impact on the electromagnetic environment and their immunity to electromagnetic disturbances.

  13. Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.

    Science.gov (United States)

    Costello, Fintan; Watts, Paul

    2018-01-01

    We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. The random noise has an anti-regressive effect in inferential judgment, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. The model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgment, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining the systematic biases away from normatively correct probabilities seen in other tasks. Copyright © 2018 Cognitive Science Society, Inc.
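    The regressive effect of random noise on descriptive estimation can be demonstrated with a small simulation; the flip-noise mechanism below is a common reading of this class of model, not the authors' exact formulation:

```python
import random

def noisy_estimate(p, d, n_samples, rng):
    """Recalled relative frequency of an event with true probability p when
    each stored instance is read back with flip (noise) probability d."""
    hits = 0
    for _ in range(n_samples):
        occurred = rng.random() < p
        if rng.random() < d:        # random noise flips the recalled outcome
            occurred = not occurred
        hits += occurred
    return hits / n_samples

rng = random.Random(42)
p, d = 0.9, 0.1
estimate = noisy_estimate(p, d, 100_000, rng)
expected = (1 - 2 * d) * p + d      # regression toward 0.5: 0.82 rather than 0.9
print(round(estimate, 3), round(expected, 3))
```

    High true probabilities are recalled as lower and low ones as higher, producing exactly the regression toward the center of the scale described in the abstract.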

  14. Sequential probability ratio controllers for safeguards radiation monitors

    International Nuclear Information System (INIS)

    Fehlau, P.E.; Coop, K.L.; Nixon, K.V.

    1984-01-01

    Sequential hypothesis tests applied to nuclear safeguards accounting methods make the methods more sensitive to detecting diversion. The sequential tests also improve transient signal detection in safeguards radiation monitors. This paper describes three microprocessor control units with sequential probability-ratio tests for detecting transient increases in radiation intensity. The control units are designed for three specific applications: low-intensity monitoring with Poisson probability ratios, higher intensity gamma-ray monitoring where fixed counting intervals are shortened by sequential testing, and monitoring moving traffic where the sequential technique responds to variable-duration signals. The fixed-interval controller shortens a customary 50-s monitoring time to an average of 18 s, making the monitoring delay less bothersome. The controller for monitoring moving vehicles benefits from the sequential technique by maintaining more than half its sensitivity when the normal passage speed doubles
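    In the spirit of the low-intensity controller with Poisson probability ratios, a Wald sequential probability-ratio test on successive counts can be sketched as follows; the error rates, background, and source terms are illustrative, not the monitors' actual settings:

```python
import math

def sprt_poisson(counts, bkg, sig, alpha=0.01, beta=0.1):
    """Wald sequential probability-ratio test on successive Poisson counts.
    H0: mean = bkg (background only) vs H1: mean = bkg + sig (source present).
    Returns 'alarm', 'clear', or 'continue'."""
    upper = math.log((1 - beta) / alpha)   # cross above: decide H1 (alarm)
    lower = math.log(beta / (1 - alpha))   # cross below: decide H0 (clear)
    llr = 0.0
    for n in counts:
        llr += n * math.log((bkg + sig) / bkg) - sig
        if llr >= upper:
            return "alarm"
        if llr <= lower:
            return "clear"
    return "continue"

# A sustained increase over a background of 5 counts per interval.
print(sprt_poisson([8, 9, 10, 11], bkg=5.0, sig=4.0))
```

    Because the test stops as soon as either boundary is crossed, quiet intervals clear quickly, which is how sequential testing shortens a fixed 50-s monitoring time to an 18-s average.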

  15. A simple assessment of physical activity is associated with obesity and motor fitness in pre-school children.

    Science.gov (United States)

    Bayer, Otmar; Bolte, Gabriele; Morlock, Gabriele; Rückinger, Simon; von Kries, Rüdiger

    2009-08-01

    Physical activity is an important determinant of energy balance. However, its impact on overweight/obesity has proved difficult to measure in pre-school children and few studies have found significant associations. A set of simple questions was used to distinguish pre-school children with high and low physical activity, and the association of this classification with childhood overweight/obesity and performance in an established motor test was investigated. Survey, cross-sectional. Weight and height were measured in 12,556 children taking part in the obligatory school entrance health examination 2004-5 and 2005-6 in three urban and three rural Bavarian regions. Their parents were asked to answer a questionnaire with a set of questions on physical activity. The mean age of the children evaluated was 5.78 (sd 0.43) years, 6535 (52.1 %) were boys. Physically active children were less likely to be overweight (OR = 0.786, 95 % CI 0.687, 0.898) or obese (OR = 0.655, 95 % CI 0.506, 0.849) and achieved 6.7 (95 % CI 5.8, 7.7) % more jumps per 30 s than less active children in a motor test, adjusted for a number of potentially confounding variables. Classification of pre-school children as physically active or not, based on a small set of questions, revealed significant associations with overweight/obesity and a motor test. Once further validated, this classification might provide a valuable tool to assess the impact of physical activity on the risk of childhood overweight and obesity.

  16. Do action learning sets facilitate collaborative, deliberative learning?: A focus group evaluation of Graduate Entry Pre-registration Nursing (GEN) students' experience.

    Science.gov (United States)

    Maddison, Charlotte; Strang, Gus

    2018-01-01

    The aim of this study was to investigate if by participating in action learning sets, Graduate Entry Pre-registration Nursing (GEN) students were able to engage in collaborative and deliberative learning. A single focus group interview involving eleven participants was used to collect data. Data analysis identified five themes: collaborative learning; reflection; learning through case study and problem-solving; communication; and rejection of codified learning. The themes are discussed and further analysed in the context of collaborative and deliberative learning. The evidence from this small-scale study suggests that action learning sets do provide an environment where collaborative and deliberative learning can occur. However, students perceived some of them, particularly during year one, to be too 'teacher led', which stifled learning. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Electromagnetic analysis of the Korean helium cooled ceramic reflector test blanket module set

    International Nuclear Information System (INIS)

    Lee, Youngmin; Ku, Duck Young; Lee, Dong Won; Ahn, Mu-Young; Park, Yi-Hyun; Cho, Seungyon

    2016-01-01

    The Korean helium cooled ceramic reflector (HCCR) test blanket module set (TBM-set) will be installed at equatorial port #18 of the Vacuum Vessel in ITER in order to test breeding blanket performance for forthcoming fusion power plants. Since the ITER tokamak has a set of electromagnetic coils (the Central Solenoid and the Poloidal Field and Toroidal Field coil sets) around the Vacuum Vessel, the HCCR TBM-set, i.e. the TBM and its associated shield, is greatly influenced by the magnetic field generated by these coils. In the case of fast transient electromagnetic events such as a major disruption, a vertical displacement event or a magnet fast discharge, the magnetic field and the induced eddy currents result in large electromagnetic loads, known as Lorentz loads, on the HCCR TBM-set. In addition, the TBM-set experiences electromagnetic loads due to magnetization of the structural material, not only during fast transient events but also during normal operation, since the HCCR TBM adopts Reduced Activation Ferritic Martensitic (RAFM) steel as its structural material. This is known as the Maxwell load, which includes the Lorentz load as well as the load due to magnetization of the structural material. This paper presents electromagnetic analysis results for the HCCR TBM-set. For the analysis, a 20° sector finite model was constructed considering the ITER configuration, including the Vacuum Vessel, the ITER shield blankets, the Central Solenoid, the Poloidal Field and Toroidal Field coil sets, and the HCCR TBM-set. Three major disruptions (operational event, likely event and highly unlikely event) were selected for analysis based on the load specifications. ANSYS-EMAG was used as the calculation tool. The results of the EM analysis will be used as input data for the structural analysis.

  18. Electromagnetic analysis of the Korean helium cooled ceramic reflector test blanket module set

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Youngmin, E-mail: ymlee@nfri.re.kr [National Fusion Research Institute, Daejeon (Korea, Republic of); Ku, Duck Young [National Fusion Research Institute, Daejeon (Korea, Republic of); Lee, Dong Won [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Ahn, Mu-Young; Park, Yi-Hyun; Cho, Seungyon [National Fusion Research Institute, Daejeon (Korea, Republic of)

    2016-11-01

    The Korean helium cooled ceramic reflector (HCCR) test blanket module set (TBM-set) will be installed at equatorial port #18 of the Vacuum Vessel in ITER in order to test breeding blanket performance for forthcoming fusion power plants. Since the ITER tokamak has a set of electromagnetic coils (the Central Solenoid and the Poloidal Field and Toroidal Field coil sets) around the Vacuum Vessel, the HCCR TBM-set, i.e. the TBM and its associated shield, is greatly influenced by the magnetic field generated by these coils. In the case of fast transient electromagnetic events such as a major disruption, a vertical displacement event or a magnet fast discharge, the magnetic field and the induced eddy currents result in large electromagnetic loads, known as Lorentz loads, on the HCCR TBM-set. In addition, the TBM-set experiences electromagnetic loads due to magnetization of the structural material, not only during fast transient events but also during normal operation, since the HCCR TBM adopts Reduced Activation Ferritic Martensitic (RAFM) steel as its structural material. This is known as the Maxwell load, which includes the Lorentz load as well as the load due to magnetization of the structural material. This paper presents electromagnetic analysis results for the HCCR TBM-set. For the analysis, a 20° sector finite model was constructed considering the ITER configuration, including the Vacuum Vessel, the ITER shield blankets, the Central Solenoid, the Poloidal Field and Toroidal Field coil sets, and the HCCR TBM-set. Three major disruptions (operational event, likely event and highly unlikely event) were selected for analysis based on the load specifications. ANSYS-EMAG was used as the calculation tool. The results of the EM analysis will be used as input data for the structural analysis.

  19. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
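    COVAL transforms the input distributions numerically; a Monte Carlo stand-in for the same task, the distribution of a function of random variables applied to a load/strength reliability margin, can be sketched as follows (the distributions and parameters are invented for illustration):

```python
import random

def function_distribution(f, samplers, n=100_000, seed=1):
    """Monte Carlo approximation of the distribution of f(X1, ..., Xk),
    given samplers for the input variables (a numerical alternative to
    transforming the probability distributions analytically)."""
    rng = random.Random(seed)
    return sorted(f(*(s(rng) for s in samplers)) for _ in range(n))

# Reliability-style example: margin = strength - load, both normal.
strength = lambda rng: rng.gauss(10.0, 1.0)
load = lambda rng: rng.gauss(7.0, 1.5)
margin = function_distribution(lambda s, l: s - l, [strength, load])
failure_prob = sum(m < 0 for m in margin) / len(margin)
print(round(failure_prob, 3))
```

    The sorted sample is an empirical distribution function, so quantiles and failure probabilities can be read off it directly.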

  20. Can Probability Maps of Swept-Source Optical Coherence Tomography Predict Visual Field Changes in Preperimetric Glaucoma?

    Science.gov (United States)

    Lee, Won June; Kim, Young Kook; Jeoung, Jin Wook; Park, Ki Ho

    2017-12-01

    To determine the usefulness of swept-source optical coherence tomography (SS-OCT) probability maps in detecting locations with significant reduction in visual field (VF) sensitivity, or in predicting future VF changes, in patients with classically defined preperimetric glaucoma (PPG). In this longitudinal study, 43 eyes of 43 PPG patients followed up every 6 months for at least 2 years were analyzed. The patients underwent wide-field SS-OCT scanning and standard automated perimetry (SAP) at the time of enrollment. With this wide-scan protocol, probability maps originating from the corresponding thickness map and overlaid with SAP VF test points could be generated. We evaluated the vulnerable VF points with SS-OCT probability maps as well as the prevalence of locations with significant VF reduction or subsequent VF changes observed in the corresponding damaged areas of the probability maps. The vulnerable VF points were arranged in superior and inferior arcuate patterns near the central fixation. In 19 of 43 PPG eyes (44.2%), significant reduction in baseline VF was detected within the areas of structural change on the SS-OCT probability maps. In 16 of 43 PPG eyes (37.2%), subsequent VF changes within the areas of SS-OCT probability map change were observed over the course of the follow-up. Structural changes on SS-OCT probability maps could detect or predict VF changes measured with SAP in a considerable number of PPG eyes. Careful comparison of probability maps with SAP results could be useful in diagnosing and monitoring PPG patients in the clinical setting.

  1. A Balanced Approach to Adaptive Probability Density Estimation

    Directory of Open Access Journals (Sweden)

    Julio A. Kovacs

    2017-04-01

    Full Text Available Our development of a Fast (Mutual) Information Matching (FIM) method for molecular dynamics time series data led us to the general problem of how to accurately estimate the probability density function of a random variable, especially in cases of very uneven samples. Here, we propose a novel Balanced Adaptive Density Estimation (BADE) method that effectively optimizes the amount of smoothing at each point. To do this, BADE relies on an efficient nearest-neighbor search which results in good scaling for large data sizes. Our tests on simulated data show that BADE exhibits equal or better accuracy than existing methods, and visual tests on univariate and bivariate experimental data show that the results are also aesthetically pleasing. This is due in part to the use of a visual criterion for setting the smoothing level of the density estimate. Our results suggest that BADE offers an attractive new take on the fundamental density estimation problem in statistics. We have applied it on molecular dynamics simulations of membrane pore formation. We also expect BADE to be generally useful for low-dimensional applications in other statistical application domains such as bioinformatics, signal processing and econometrics.

  2. JPP: A Java Pre-Processor

    OpenAIRE

    Kiniry, Joseph R.; Cheong, Elaine

    1998-01-01

    The Java Pre-Processor, or JPP for short, is a parsing pre-processor for the Java programming language. Unlike its namesake (the C/C++ Pre-Processor, cpp), JPP provides functionality above and beyond simple textual substitution. JPP's capabilities include code beautification, code standard conformance checking, class and interface specification and testing, and documentation generation.

  3. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events $y$, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.

  4. Using probability modelling and genetic parentage assignment to test the role of local mate availability in mating system variation.

    Science.gov (United States)

    Blyton, Michaela D J; Banks, Sam C; Peakall, Rod; Lindenmayer, David B

    2012-02-01

    The formal testing of mating system theories with empirical data is important for evaluating the relative importance of different processes in shaping mating systems in wild populations. Here, we present a generally applicable probability modelling framework to test the role of local mate availability in determining a population's level of genetic monogamy. We provide a significance test for detecting departures in observed mating patterns from model expectations based on mate availability alone, allowing the presence and direction of behavioural effects to be inferred. The assessment of mate availability can be flexible and in this study it was based on population density, sex ratio and spatial arrangement. This approach provides a useful tool for (1) isolating the effect of mate availability in variable mating systems and (2) in combination with genetic parentage analyses, gaining insights into the nature of mating behaviours in elusive species. To illustrate this modelling approach, we have applied it to investigate the variable mating system of the mountain brushtail possum (Trichosurus cunninghami) and compared the model expectations with the outcomes of genetic parentage analysis over an 18-year study. The observed level of monogamy was higher than predicted under the model. Thus, behavioural traits, such as mate guarding or selective mate choice, may increase the population level of monogamy. We show that combining genetic parentage data with probability modelling can facilitate an improved understanding of the complex interactions between behavioural adaptations and demographic dynamics in driving mating system variation. © 2011 Blackwell Publishing Ltd.
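    The modelling logic above, comparing observed monogamy with what mate availability alone would produce, can be sketched as a Monte Carlo significance test. This is a hypothetical simplification (random mating within each female's locally available males, two genotyped offspring per female), not the authors' exact model:

```python
import numpy as np

def monogamy_null_test(males_available, observed_monogamous,
                       n_sim=5000, seed=1):
    """Null model: each female's offspring sires are drawn uniformly at
    random from her locally available males; a female is monogamous if
    both offspring share one sire. Returns the expected number of
    monogamous females under the null and a one-sided p-value for the
    observed count being at least that large."""
    rng = np.random.default_rng(seed)
    counts = np.empty(n_sim)
    for s in range(n_sim):
        mono = 0
        for avail in males_available:          # avail: candidate sire ids
            sires = rng.choice(avail, size=2)  # with replacement
            mono += int(sires[0] == sires[1])
        counts[s] = mono
    return counts.mean(), float(np.mean(counts >= observed_monogamous))

# 20 females, each with the same 4 males available; 12 observed monogamous
expected, p = monogamy_null_test([list(range(4))] * 20, observed_monogamous=12)
# expected is about 5 (probability 1/4 per female); a small p suggests
# behavioural effects (e.g. mate guarding) beyond availability alone
```

    In a real application, each female's availability set would be derived from density, sex ratio and spatial arrangement, as the abstract describes.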

  5. Joint probabilities reproducing three EPR experiments on two qubits

    NARCIS (Netherlands)

    Roy, S. M.; Atkinson, D.; Auberson, G.; Mahoux, G.; Singh, V.

    2007-01-01

    An eight-parameter family of the most general non-negative quadruple probabilities is constructed for EPR-Bohm-Aharonov experiments when only three pairs of analyser settings are used. It is a simultaneous representation of three different Bohr-incompatible experimental configurations involving

  6. Box-particle probability hypothesis density filtering

    OpenAIRE

    Schikora, M.; Gning, A.; Mihaylova, L.; Cremers, D.; Koch, W.

    2014-01-01

    This paper develops a novel approach for multitarget tracking, called box-particle probability hypothesis density filter (box-PHD filter). The approach is able to track multiple targets and estimates the unknown number of targets. Furthermore, it is capable of dealing with three sources of uncertainty: stochastic, set-theoretic, and data association uncertainty. The box-PHD filter reduces the number of particles significantly, which improves the runtime considerably. The small number of box-p...

  7. Trait mindfulness, reasons for living and general symptom severity as predictors of suicide probability in males with substance abuse or dependence.

    Directory of Open Access Journals (Sweden)

    Parvaneh Mohammadkhani

    2015-03-01

    The aim of this study was to evaluate suicide probability in Iranian males with substance abuse or dependence disorder and to investigate the predictors of suicide probability based on trait mindfulness, reasons for living, and severity of general psychiatric symptoms. Participants were 324 individuals with substance abuse or dependence in an outpatient setting and prison. The Reasons for Living Questionnaire, Mindful Attention Awareness Scale and Suicide Probability Scale were used as instruments. The sample was selected based on a convenience sampling method. Data were analyzed using SPSS and AMOS. The life-time prevalence of suicide attempt was 35% in the outpatient setting and 42% in the prison setting. Suicide probability in the prison setting was significantly higher than in the outpatient setting (p<0.001). The severity of general symptoms strongly correlated with suicide probability. Trait mindfulness, not reasons-for-living beliefs, had a mediating effect in the relationship between the severity of general symptoms and suicide probability. Fear of social disapproval, survival and coping beliefs, and child-related concerns significantly predicted suicide probability (p<0.001). It could be suggested that trait mindfulness was more effective in preventing suicide probability than beliefs about reasons for living in individuals with substance abuse or dependence disorders. The severity of general symptoms should be regarded as an important risk factor for suicide probability.

  8. A Method to Select Software Test Cases in Consideration of Past Input Sequence

    International Nuclear Information System (INIS)

    Kim, Hee Eun; Kim, Bo Gyung; Kang, Hyun Gook

    2015-01-01

    In the Korea Nuclear I and C Systems (KNICS) project, the software for the fully-digitalized reactor protection system (RPS) was developed under a strict procedure. Even though the behavior of the software is deterministic, the randomness of the input sequence produces probabilistic behavior of the software. A software failure occurs when some inputs to the software occur and interact with the internal state of the digital system to trigger a fault that was introduced into the software during the software lifecycle. In this paper, a method to select a test set for software failure probability estimation is suggested. This test set reflects the past input sequence of the software and covers all possible cases. To obtain the profile of paired state variables, the relationships of the variables need to be considered, and the effect of input from the human operator also has to be considered. As an example, the test set of the PZR-PR-Lo-Trip logic was examined. This method provides a framework for selecting test cases of safety-critical software

  9. The Impact of Enactive /Vicarious pre-reading Tasks on Reading Comprehension and Self-Efficacy of Iranian Pre-Intermediate EFL Learners

    Directory of Open Access Journals (Sweden)

    Arezoo Eshghipour

    2016-01-01

    This study investigated the effect of enactive pre-reading tasks on Iranian pre-intermediate EFL learners' reading comprehension and self-efficacy. Moreover, it explored whether Iranian pre-intermediate EFL learners' reading comprehension and self-efficacy are influenced by vicarious pre-reading tasks. The required data were gathered through a reading comprehension passage entailing 20 comprehension questions and a 30-item self-efficacy questionnaire with 5-point Likert-scale response options. A total of 66 participants (34 individuals in the enactive group and 32 learners in the vicarious one) took part in this study. The Pearson formula, an independent t-test, a paired t-test, and the Mann-Whitney U test were used to analyze the data. Based on the findings of the study, enactive pre-reading tasks played a key role in the Iranian pre-intermediate EFL learners' reading comprehension ability. Moreover, it was found that vicarious pre-reading tasks served an important role in the Iranian pre-intermediate EFL learners' self-efficacy.

  10. Modeling spatial variability of sand-lenses in clay till settings using transition probability and multiple-point geostatistics

    DEFF Research Database (Denmark)

    Kessler, Timo Christian; Nilsson, Bertel; Klint, Knud Erik

    2010-01-01

    ... of sand-lenses in clay till. Sand-lenses mainly account for horizontal transport and are prioritised in this study. Based on field observations, the distribution has been modeled using two different geostatistical approaches. One method uses a Markov chain model calculating the transition probabilities (TPROGS) of alternating geological facies. The second method, multiple-point statistics, uses training images to estimate the conditional probability of sand-lenses at a certain location. Both methods respect field observations such as local stratigraphy; however, only the multiple-point statistics can...

  11. Operation and control strategies in pre-series testing of cold circulating pumps for ITER

    International Nuclear Information System (INIS)

    Bhattacharya, R.; Vaghela, H.; Sarkar, B.; Srinivas, M.; Choukekar, K.

    2013-01-01

    The cryo-distribution system of ITER is responsible for the distribution and control of forced-flow supercritical helium for cooling of the superconducting magnets and the cryo-pumps. The requirements on the cold circulating pumps (CCP) for mass flow rate and performance are much higher than for presently existing, commercially available pumps operating with helium at 4.0 K. Design up-scaling with pre-series testing of the CCP has been proposed, including the test infrastructure. Operation and control strategies for the test distribution box (TDB) of the test infrastructure have been developed and analyzed using steady-state and dynamic process simulation to cope with the functional requirements of the CCPs. An off-normal scenario with CCP inlet pressure variation is an important concern; dynamic process responses during such a scenario have been evaluated to verify the operability of the CCP. The paper describes the process simulation addressing the functional requirements of the CCPs, along with the evaluation of the off-normal scenario to verify the operability of the CCP. (author)

  12. Setting up pre-admission visits for children undergoing day surgery: a practice development initiative.

    Science.gov (United States)

    O'Shea, Maria; Cummins, Ann; Kelleher, Ann

    2010-06-01

    The hospital experience can bring about a range of negative emotions for children. The literature clearly states that children who are prepared for surgery recover faster and have fewer negative effects. Pre-admission programmes seek to prepare children (and their parents) for surgery. This paper describes in detail how a pre-admission programme was established for children and their families who were scheduled for day case surgery.

  13. Background reduction of the KATRIN spectrometers. Transmission function of the pre-spectrometer and systematic tests of the main-spectrometer wire electrode

    Energy Technology Data Exchange (ETDEWEB)

    Prall, Matthias

    2011-07-04

    precision). These measurements, together with the results of various quality assurance tests, are stored in a database allowing the properties and history of each single electrode module to be reconstructed. The UHV-compatible, fail-safe, non-magnetic high-voltage distribution, routing 46 voltages to the electrode modules inside the MS, was designed, tested systematically, and its installation was started. The pre-spectrometer (PS), which has the same working principle as the MS, is placed in front of it in the KATRIN experiment. It is foreseen to operate the MS at a potential of about 18.57 kV and the PS at about 18.3 kV, reducing the rate of β-decay electrons entering the MS to about 10^3 s^-1. This measure reduces the electron scattering probability in the MS and thus the background rate. In this configuration, however, electrons are confined longitudinally between the spectrometer potentials and radially by the strong magnetic field of a solenoid placed between them. The electrons trapped in this Penning trap can also produce background. This work investigates the possibility of diminishing this background source by reducing the PS potential by several kV, worsening the trapping conditions. In this configuration, however, the PS transmission probability could be reduced, as an adiabaticity requirement, guaranteeing that electrons follow the magnetic field lines through the PS and are transmitted, is potentially violated. This phenomenon would introduce an additional systematic uncertainty for KATRIN. Therefore, the pre-spectrometer transmission was measured at several magnetic field settings. These investigations show that there are no non-adiabatic transmission losses for magnetic fields larger than or equal to 2.25 T (50% of the KATRIN design value).

  14. Recommendations for translation and reliability testing of International Spinal Cord Injury Data Sets.

    Science.gov (United States)

    Biering-Sørensen, F; Alexander, M S; Burns, S; Charlifue, S; DeVivo, M; Dietz, V; Krassioukov, A; Marino, R; Noonan, V; Post, M W M; Stripling, T; Vogel, L; Wing, P

    2011-03-01

    To provide recommendations regarding translation and reliability testing of International Spinal Cord Injury (SCI) Data Sets. The Executive Committee for the International SCI Standards and Data Sets. Translation of any specific International SCI Data Set can be accomplished by translating from the English version into the target language, followed by a back-translation into English to confirm that the original meaning has been preserved. Another approach is to have the initial translation performed by translators who have knowledge of SCI, and afterwards checked by other person(s) with the same kind of knowledge. The translation process includes both language translation and cultural adaptation, and therefore shall not be made word for word, but will strive for conceptual equivalence. At a minimum, the inter-rater reliability should be tested by no fewer than two independent observers, and preferably in multiple countries. Translations must include information on the name, role and background of everyone involved in the translation process, and shall be dated and noted with a version number. By following the proposed guidelines, translated data sets should assure comparability of data acquisition across countries and cultures. If the translation process identifies irregularities or misrepresentation in either the original English version or the target language, the working group for the particular International SCI Data Set shall revise the data set accordingly, which may include re-wording of the original English version in order to reach a compromise in the content of the data set.

  15. Detection of nuclear testing from surface concentration measurements: Analysis of radioxenon from the February 2013 underground test in North Korea

    Science.gov (United States)

    Kurzeja, R. J.; Buckley, R. L.; Werth, D. W.; Chiswell, S. R.

    2018-03-01

    A method is outlined and tested to detect low-level nuclear or chemical sources from time series of concentration measurements. The method uses a mesoscale atmospheric model to simulate the concentration signature from a known or suspected source at a receptor, which is then regressed successively against segments of the measurement series to create time series of metrics that measure the goodness of fit between the signatures and the measurement segments. The method was applied to radioxenon data from the Comprehensive Test Ban Treaty (CTBT) collection site in Ussuriysk, Russia (RN58) after the Democratic People's Republic of Korea (North Korea) underground nuclear test on February 12, 2013 near Punggye. The metrics were found to be a good screening tool to locate data segments with a strong likelihood of origin from Punggye, especially when multiplied together to determine the joint probability. Metrics from RN58 were also used to find the probability that activity measured in February and April of 2013 originated from the February 12 test. A detailed analysis of an RN58 data segment from April 3/4, 2013 was also carried out for a grid of source locations around Punggye and identified Punggye as the most likely point of origin. Thus, the results support the strong possibility that radioxenon was emitted from the test site at various times in April and was detected intermittently at RN58, depending on the wind direction. The method does not locate unsuspected sources; instead, it evaluates the probability of a source at a specified location. However, it can be extended to include a set of suspected sources. Extension of the method to higher-resolution data sets, arbitrary sampling, and time-varying sources is discussed, along with a path to evaluate uncertainty in the calculated probabilities.
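    The core screening step, regressing a simulated source signature against successive measurement segments, can be sketched as a sliding-window least-squares fit whose R² serves as the goodness-of-fit metric. This is an illustrative reconstruction, not the authors' code; the synthetic decay signature and embedding point are invented:

```python
import numpy as np

def signature_scan(measured, signature):
    """Slide the modelled signature along the measurement series; for each
    equal-length segment, regress the segment on the signature (plus an
    intercept) and record R^2 as the goodness-of-fit metric."""
    measured = np.asarray(measured, float)
    signature = np.asarray(signature, float)
    m = len(signature)
    A = np.column_stack([signature, np.ones(m)])
    r2 = np.empty(len(measured) - m + 1)
    for i in range(len(r2)):
        seg = measured[i:i + m]
        coef, *_ = np.linalg.lstsq(A, seg, rcond=None)
        resid = seg - A @ coef
        ss_tot = ((seg - seg.mean()) ** 2).sum()
        r2[i] = 1.0 - resid @ resid / ss_tot if ss_tot > 0 else 0.0
    return r2

rng = np.random.default_rng(2)
sig = np.exp(-np.arange(20) / 5.0)   # hypothetical simulated plume signature
series = 0.1 * rng.normal(size=120)
series[40:60] += 3.0 * sig           # bury a scaled copy of the signature at t = 40
score = signature_scan(series, sig)
best = int(np.argmax(score))         # should recover the embedding point
```

    Multiplying such per-metric probabilities across independent segments is what the abstract refers to as the joint probability.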

  16. Intervening to decrease the probability of alcohol-impaired driving: Impact of novel field sobriety tests.

    Science.gov (United States)

    Smith, Ryan C; Robinson, Zechariah; Bazdar, Alexandra; Geller, E Scott

    2016-01-01

    The efficacy of novel field sobriety tests to predict breath alcohol content (BAC) and perceptions of driving risk was evaluated. Participants (N = 210) were passersby at two downtown locations near local bars and one on-campus location near a late-night dining facility between the hours of 10:00 p.m. and 2:00 a.m. Participants gave ratings of their perceived risk to drive at their current level of intoxication, then completed three sobriety tests (a hand-pat, tracing test, and Romberg test), and finally provided new ratings of their perceived risk to drive. After completing the final set of questions, participants were administered a Lifeloc FC20 breath alcohol test (±.005 g/dL). Each of the sobriety tests performed better than chance at predicting participant intoxication, but the performance feedback did not enhance awareness of one's risk to drive at a given BAC. Actually, after the sobriety tests, Greek-life females perceived themselves to be less at-risk to drive.

  17. Monitoring training response in young Friesian dressage horses using two different standardised exercise tests (SETs)

    NARCIS (Netherlands)

    de Bruijn, Cornelis Marinus; Houterman, Willem; Ploeg, Margreet; Ducro, Bart; Boshuizen, Berit; Goethals, Klaartje; Verdegaal, Elisabeth-Lidwien; Delesalle, Catherine

    2017-01-01

    BACKGROUND: Most Friesian horses reach their anaerobic threshold during a standardized exercise test (SET) that requires lower-intensity exercise than daily routine training. AIM: To study the strengths and weaknesses of an alternative SET protocol. Two different SETs (SETA and SETB) were applied

  19. [Quality Management and Quality Specifications of Laboratory Tests in Clinical Studies--Challenges in Pre-Analytical Processes in Clinical Laboratories].

    Science.gov (United States)

    Ishibashi, Midori

    2015-01-01

    Cost, speed, and quality are the three important factors recently indicated by the Ministry of Health, Labour and Welfare (MHLW) for the purpose of accelerating clinical studies. Against this background, the importance of laboratory tests is increasing, especially in the evaluation of clinical study participants' entry and safety, and of drug efficacy. To assure the quality of laboratory tests, providing high-quality laboratory tests is mandatory. For adequate quality assurance in laboratory tests, quality control in the three fields of pre-analytical, analytical, and post-analytical processes is extremely important. There are, however, no detailed written requirements concerning specimen collection, handling, preparation, storage, and shipping. Most laboratory tests for clinical studies are performed onsite in a local laboratory; however, some laboratory tests are done in offsite central laboratories after specimen shipping. As factors affecting laboratory tests, individual and inter-individual variations are well known. Besides these factors, standardizing specimen collection, handling, preparation, storage, and shipping may improve and maintain the high quality of clinical studies in general. Furthermore, the analytical method, units, and reference interval are also important factors. It is concluded that, to overcome the problems derived from pre-analytical processes, it is necessary to standardize specimen handling in a broad sense.

  20. New Graphical Methods and Test Statistics for Testing Composite Normality

    Directory of Open Access Journals (Sweden)

    Marc S. Paolella

    2015-07-01

    Several graphical methods for testing univariate composite normality from an i.i.d. sample are presented. They are endowed with correct simultaneous error bounds and yield size-correct tests. As all are based on the empirical CDF, they are also consistent against all alternatives. For one test, called the modified stabilized probability test, or MSP, a highly simplified computational method is derived, which delivers the test statistic and also a highly accurate p-value approximation, essentially instantaneously. The MSP test is demonstrated to have higher power against asymmetric alternatives than the well-known and powerful Jarque-Bera test. A further size-correct test, based on combining two test statistics, is shown to have yet higher power. The methodology employed is fully general and can be applied to any i.i.d. univariate continuous distribution setting.
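    The composite setting (parameters estimated from the same sample) is what invalidates the classical KS tables. A generic EDF-based composite normality test can be calibrated by parametric bootstrap, in the spirit of the size-correct tests described above; this Lilliefors-style sketch is not the paper's MSP procedure:

```python
import numpy as np
from math import erf, sqrt

def ks_composite(x):
    """KS distance between the empirical CDF and a normal CDF whose mean
    and standard deviation are estimated from the same sample."""
    x = np.sort(np.asarray(x, float))
    n = len(x)
    z = (x - x.mean()) / x.std(ddof=1)
    cdf = np.array([0.5 * (1.0 + erf(v / sqrt(2.0))) for v in z])
    d_plus = (np.arange(1, n + 1) / n - cdf).max()
    d_minus = (cdf - np.arange(n) / n).max()
    return max(d_plus, d_minus)

def composite_normality_pvalue(x, n_boot=300, seed=0):
    """Parameter estimation shrinks the null distribution of the KS
    statistic, so calibrate it by simulating normal samples of the same
    size (parametric bootstrap) instead of using the classical table."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, float)
    d_obs = ks_composite(x)
    d_null = np.array([ks_composite(rng.normal(size=len(x)))
                       for _ in range(n_boot)])
    return float(np.mean(d_null >= d_obs))

rng = np.random.default_rng(7)
p_normal = composite_normality_pvalue(rng.normal(size=200))
p_expo = composite_normality_pvalue(rng.exponential(size=200))
# p_expo should be near 0; p_normal should not be systematically small
```

    The MSP test replaces the KS distance with a stabilized-probability statistic and supplies an analytic p-value approximation, avoiding the bootstrap entirely.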

  1. Assessment of clinical reasoning: A Script Concordance test designed for pre-clinical medical students.

    Science.gov (United States)

    Humbert, Aloysius J; Johnson, Mary T; Miech, Edward; Friedberg, Fred; Grackin, Janice A; Seidman, Peggy A

    2011-01-01

    The Script Concordance Test (SCT) measures clinical reasoning in the context of uncertainty by comparing the responses of examinees and expert clinicians. It uses the level of agreement with a panel of experts to assign credit for the examinee's answers. This study describes the development and validation of a SCT for pre-clinical medical students. Faculty from two US medical schools developed SCT items in the domains of anatomy, biochemistry, physiology, and histology. Scoring procedures utilized data from a panel of 30 expert physicians. Validation focused on internal reliability and the ability of the SCT to distinguish between different cohorts. The SCT was administered to an aggregate of 411 second-year and 70 fourth-year students from both schools. Internal consistency for the 75 test items was satisfactory (Cronbach's alpha = 0.73). The SCT successfully differentiated second- from fourth-year students and both student groups from the expert panel in a one-way analysis of variance (F(2,508) = 120.4; p < 0.001). Scores of students from the two schools were not significantly different (p = 0.20). This SCT successfully differentiated pre-clinical medical students from fourth-year medical students and both cohorts of medical students from expert clinicians across different institutions and geographic areas. The SCT shows promise as an easy-to-administer measure of "problem-solving" performance in competency evaluation even in the beginning years of medical education.
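    The aggregate-scoring rule, credit proportional to expert agreement with the modal panel answer worth full credit, can be sketched as follows. This is a generic SCT scoring scheme; the 30-expert panel responses are invented for illustration:

```python
from collections import Counter

def sct_credits(panel_answers):
    """Map each possible answer to its credit: the count of experts who
    chose it divided by the count of the modal answer, so the most
    popular expert answer earns 1 and an unchosen answer earns 0."""
    counts = Counter(panel_answers)
    top = max(counts.values())
    return {answer: count / top for answer, count in counts.items()}

# hypothetical item: 30 experts answer on a -2..+2 Likert scale
panel = [-1] * 3 + [0] * 6 + [1] * 15 + [2] * 6
credits = sct_credits(panel)
# an examinee answering +1 earns 1.0; answering 0 earns 6/15 = 0.4
```

    An examinee's total score is then the sum of these per-item credits, which is what gets compared across the student and expert cohorts.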

  2. HIGHLY EFFECTIVE CHEMICAL MODIFIERS FOR PRODUCTION OF CONCRETES WITH PRE-SET PROPERTIES

    Directory of Open Access Journals (Sweden)

    Tkach Evgeniya Vladimirovna

    2012-10-01

    The paper demonstrates the application of industrial by-products and recycled materials. Waterproofing admixtures improve the structure and the properties of the cement stone. Development and preparation of highly effective waterproofing modifiers of durable effect, as well as development of the process procedure parameters, including mixing, activation, heat treatment, etc., are to be implemented. The composition of waterproofing modifiers is to be fine-tuned to synergize the behaviour of the various ingredients of cement systems to assure a substantial improvement of their strength, freeze resistance and corrosion resistance. Multi-functional waterproofing admixtures were used to produce highly effective modified concretes. The key idea of the new method of modifying cement-based building materials is that the waterproofing admixture concentration is to exceed 10% of the weight of the binding agent within the per-unit weight of the cement stone, given that its strength does not deteriorate. A GKM-type modifier coupled with the organo-mineral waterproofing agent GT-M may be recommended for mass use in the manufacturing of hydraulic concrete and reinforced concrete products. An overview of their practical implementation has proven that waterproofing modifier GKM-S, if coupled with waterproofing agent GT-M, improves corrosion control inside the cement stone and makes it possible to manufacture durable concrete and reinforced concrete products that demonstrate pre-set physical and processing behaviour. Comprehensive concrete modification by modifier GKM-S and waterproofing agent GT-M may be regarded as one of the most ambitious methods of production of highly effective waterproof concretes.

  3. Input-profile-based software failure probability quantification for safety signal generation systems

    International Nuclear Information System (INIS)

    Kang, Hyun Gook; Lim, Ho Gon; Lee, Ho Jung; Kim, Man Cheol; Jang, Seung Cheol

    2009-01-01

    Approaches to software failure probability estimation are mainly based on the results of testing. Test cases represent the inputs which are encountered in actual use. The test inputs for a safety-critical application such as a reactor protection system (RPS) of a nuclear power plant are the inputs which cause the activation of a protective action such as a reactor trip. A digital system treats inputs from instrumentation sensors as discrete digital values by using an analog-to-digital converter. The input profile must be determined in consideration of these characteristics for effective software failure probability quantification. Another important characteristic of software testing is that we do not have to repeat the test for the same input value, since the software response is deterministic for each specific digital input. With these considerations, we propose an effective software testing method for quantifying the failure probability. As an example application, the input profile of the digital RPS is developed based on typical plant data. The proposed method is expected to provide a simple but realistic means to quantify the software failure probability based on the input profile and system dynamics.
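    The key observation, that deterministic software means each distinct digital input needs only one test, reduces the failure probability to a profile-weighted sum over the failing inputs. A minimal sketch; the profile values and trip logic below are invented stand-ins, not the actual RPS:

```python
def failure_probability(input_profile, fails_on):
    """P(failure) = sum of the profile probabilities of the inputs
    observed to fail. Because the software response is deterministic,
    each distinct input is tested exactly once."""
    return sum(p for inp, p in input_profile.items() if fails_on(inp))

# hypothetical profile over discretized sensor readings (probabilities sum to 1),
# which in practice would be derived from typical plant data
profile = {1650: 0.50, 1700: 0.30, 1750: 0.15, 1800: 0.05}

# hypothetical buggy trip logic: fails to act on the 1750 reading
def fails_on(reading):
    return reading == 1750

p_fail = failure_probability(profile, fails_on)   # 0.15
```

    The hard part in practice is the profile itself: the joint distribution of paired state variables and operator inputs, as the abstract notes.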

  4. FATTY MUSCLE INFILTRATION IN CUFF TEAR: PRE AND POST OPERATIVE EVALUATION BY MRI.

    Science.gov (United States)

    Miyazaki, Alberto Naoki; Santos, Pedro Doneux; da Silva, Luciana Andrade; Sella, Guilherme do Val; Miranda, Eduardo Régis de Alencar Bona; Zampieri, Rodrigo

    2015-01-01

    To evaluate fatty infiltration and atrophy of the supraspinatus before and after surgery for a rotator cuff lesion (RCL), by MRI. Ten patients with full-thickness rotator cuff tears who had undergone arthroscopic rotator cuff repair between September and December 2011 were included. This is a prospective study, with analysis and comparison of fatty infiltration and atrophy of the supraspinatus. The occupation ratio was measured using the magic selection tool in Adobe Photoshop CS3(r) on T1 oblique sagittal Y-view MRI. Through Photoshop, the proportion occupied by the muscle belly relative to its fossa was calculated. There was a statistically significant increase in the muscle ratio (p=0.013) comparing pre- and postoperative images, analyzed by the Wilcoxon T test. The proportion of the supraspinatus muscle occupying the fossa increases in the immediate postoperative period, probably due to the traction exerted on the tendon at the time of repair. Level of Evidence II, Cohort Study.

  5. Assessment of Oral Fluid HIV Test Performance in an HIV Pre-Exposure Prophylaxis Trial in Bangkok, Thailand.

    Directory of Open Access Journals (Sweden)

    Pravan Suntharasamai

    Rapid, easy-to-use HIV tests offer opportunities to increase HIV testing among populations at risk of infection. We used the OraQuick Rapid HIV-1/2 antibody test (OraQuick) in the Bangkok Tenofovir Study, an HIV pre-exposure prophylaxis trial among people who inject drugs. The Bangkok Tenofovir Study was a randomized, double-blind, placebo-controlled trial. We tested participants' oral fluid for HIV using OraQuick monthly and blood using a nucleic-acid amplification test (NAAT) every 3 months. We used Kaplan-Meier methods to estimate the duration from a positive HIV NAAT until the mid-point between the last non-reactive and first reactive oral fluid test, and proportional hazards to examine factors associated with the time until the test was reactive. We screened 3678 people for HIV using OraQuick. Among 447 with reactive results, 436 (97.5%) were confirmed HIV-infected, 10 (2.2%) HIV-uninfected, and one (0.2%) had indeterminate results. Two participants with non-reactive OraQuick results were, in fact, HIV-infected at screening, yielding 99.5% sensitivity, 99.7% specificity, a 97.8% positive predictive value, and a 99.9% negative predictive value. Participants receiving tenofovir took longer to develop a reactive OraQuick (191.8 days) than participants receiving placebo (16.8 days) (p = 0.02), and participants infected with HIV CRF01_AE developed a reactive OraQuick earlier than participants infected with other subtypes (p = 0.04). The oral fluid HIV test performed well at screening, suggesting it can be used when rapid results and non-invasive tools are preferred. However, participants receiving tenofovir took longer to develop a reactive oral fluid test result than those receiving placebo. Thus, among people using pre-exposure prophylaxis, a blood-based HIV test may be an appropriate choice. ClinicalTrials.gov NCT00119106.
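    The interval-censored delay described above, from NAAT positivity to the midpoint between the last non-reactive and first reactive oral-fluid visits, amounts to the following; the study days are invented for illustration:

```python
def oraquick_delay(naat_positive_day, last_nonreactive_day, first_reactive_day):
    """The oral-fluid test turned reactive somewhere in the visit
    interval; take the interval midpoint and measure the lag from the
    NAAT-positive date. These per-participant lags feed the
    Kaplan-Meier estimate."""
    midpoint = (last_nonreactive_day + first_reactive_day) / 2.0
    return midpoint - naat_positive_day

# hypothetical participant: NAAT positive on study day 90, oral fluid
# still non-reactive on day 120, first reactive on day 150
delay = oraquick_delay(90, 120, 150)   # 45.0 days
```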

  6. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)] of a quantum system with d-dimensional Hilbert space H(d) is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H1, H2), which quantifies deviations from Kolmogorov probability theory, is introduced, and it is shown to be intimately related to the commutator of the projectors P(H1), P(H2) onto the subspaces H1, H2. As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin-1/2 particles is valid for Kolmogorov probabilities, but not for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities
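    The additivity relation in question, and one plausible reading of the deviation operator D(H1, H2) from the abstract (the exact definition is in the paper), can be written as:

```latex
% Kolmogorov additivity, valid within a Boolean subalgebra:
p(H_1 \vee H_2) = p(H_1) + p(H_2) - p(H_1 \wedge H_2)

% A deviation operator that vanishes exactly when additivity holds,
% with \Pi(H) the projector onto subspace H:
D(H_1, H_2) = \Pi(H_1 \vee H_2) - \Pi(H_1) - \Pi(H_2) + \Pi(H_1 \wedge H_2)
```

    On subspaces whose projectors commute, D(H1, H2) vanishes and quantum probabilities behave Kolmogorov-like; its failure to vanish in the full lattice is what motivates the Dempster-Shafer framework.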

  7. Association test based on SNP set: logistic kernel machine based test vs. principal component analysis.

    Directory of Open Access Journals (Sweden)

    Yang Zhao

    GWAS has greatly facilitated the discovery of risk SNPs associated with complex diseases. Traditional methods analyze SNPs individually and are limited by low power and reproducibility, since correction for multiple comparisons is necessary. Several methods have been proposed based on grouping SNPs into SNP sets using biological knowledge and/or genomic features. In this article, we compare the linear kernel machine based test (LKM) and the principal components analysis based approach (PCA) using simulated datasets under scenarios of 0 to 3 causal SNPs, as well as simple and complex linkage disequilibrium (LD) structures of the simulated regions. Our simulation study demonstrates that both LKM and PCA can control the type I error at the significance level of 0.05. If the causal SNP is in strong LD with the genotyped SNPs, both the PCA with a small number of principal components (PCs) and the LKM with a linear or identical-by-state kernel are valid tests. However, if the LD structure is complex, such as several LD blocks in the SNP set, or when the causal SNP is not in the LD block in which most of the genotyped SNPs reside, more PCs should be included to capture the information of the causal SNP. Simulation studies also demonstrate the ability of LKM and PCA to combine information from multiple causal SNPs and to provide increased power over individual SNP analysis. We also apply LKM and PCA to analyze two SNP sets extracted from an actual GWAS dataset on non-small cell lung cancer.
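    The PCA side of the comparison can be sketched as follows: project the centred genotype matrix onto its top principal components and regress the phenotype on them; the model-to-residual sum-of-squares ratio is the association statistic. This is an illustrative sketch with simulated data, not the paper's implementation, and a real analysis would convert the statistic to a p-value via an F reference distribution:

```python
import numpy as np

def pca_set_stat(G, y, n_pc=3):
    """PCA-based SNP-set association: regress phenotype y on the top
    n_pc principal components of the genotype matrix G (samples x SNPs)
    and return the model-to-residual sum-of-squares ratio."""
    Gc = G - G.mean(axis=0)
    U, s, _ = np.linalg.svd(Gc, full_matrices=False)
    pcs = U[:, :n_pc] * s[:n_pc]                  # PC scores
    X = np.column_stack([np.ones(len(y)), pcs])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = X @ beta
    ss_model = ((fitted - y.mean()) ** 2).sum()
    ss_resid = ((y - fitted) ** 2).sum()
    return ss_model / ss_resid

rng = np.random.default_rng(3)
G = rng.binomial(2, 0.3, size=(300, 10)).astype(float)  # 10 SNPs, 300 subjects
y = 1.5 * G[:, 4] + rng.normal(size=300)                # SNP 5 is causal
# with weak LD among SNPs, enough PCs are needed to capture the causal SNP
stat_assoc = pca_set_stat(G, y, n_pc=10)
stat_null = pca_set_stat(G, rng.permutation(y), n_pc=10)  # break the association
```

    The need to raise `n_pc` here, because the simulated SNPs are nearly uncorrelated, mirrors the abstract's point that more PCs are required when the causal SNP is not well tagged by the dominant LD structure.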

  8. Sludge pre-treatment with pulsed electric fields

    Energy Technology Data Exchange (ETDEWEB)

    Kopplow, O.; Barjenbruch, M.; Heinz, V.

    2003-07-01

    The anaerobic stabilization process depends, among other factors, on the bio-availability of organic carbon. Pre-treatment of the sludge, which destroys micro-organisms and releases cell contents (disintegration), allows the carbon to be converted microbially better and faster. Moreover, effects on digestion are likely. However, only little experience is available with sludge treatment using pulsed electric fields. Laboratory-scale digestion tests were run to analyse the influence of pulsed electric fields on the properties of sludge, anaerobic degradation, sludge-water reload and foaming of digesters. The results are compared with those of other disintegration methods (high-pressure homogenisation, thermal treatment). The effect of pre-treatment on the sludge is shown by the COD release. Degrees of disintegration of up to 20% were achieved. The specific energy input was high; energy consumption was decreased by initial improvements (pre-heating to 55°C). The filament bacteria were partially destroyed. The foam reduction in the digesters was marginal. The anaerobic degradation performance was improved in every case; the degradation rate of organic matter increased by about 9%. Due to the increase in degradation, there is a higher reload of the sludge water with COD and nitrogen compounds. (author)

  9. Delayed hydride cracking: alternative pre-cracking method

    International Nuclear Information System (INIS)

    Mieza, Juan I.; Ponzoni, Lucio M.E.; Vigna, Gustavo L.; Domizzi, Gladys

    2009-01-01

    The internal components of nuclear reactors made of Zr alloys are prone to a failure mechanism known as Delayed Hydride Cracking (DHC). This situation has triggered numerous scientific studies aimed at measuring the crack propagation velocity and the threshold stress intensity factor associated with DHC. Tests are carried out on fatigue pre-cracked samples to ensure similar test conditions and comparable results. Due to difficulties in implementing the fatigue pre-cracking method, it would be desirable to replace it with a pre-crack produced by the DHC process itself, for which it is necessary to demonstrate the equivalence of these two methods. In this work, tests on samples extracted from two Zr-2.5 Nb tubes were conducted. Some of the samples were heat treated to obtain a range of metallurgical properties as well as different DHC velocities. A comparison between velocities measured in test samples pre-cracked by fatigue and by DHC shows that the pre-cracking method does not affect the measured velocity value. In addition, the incubation time (t_inc), the time between the application of the load and the first signal of crack propagation, was measured in samples pre-cracked by DHC. These times were found to be sufficiently short, even in the worst cases (lowest velocity), and similar to those of fatigue pre-cracked samples. (author)

  10. Relationship of academic success of medical students with motivation and pre-admission grades.

    Science.gov (United States)

    Luqman, Muhammad

    2013-01-01

    To determine the predictive validity of pre-admission scores of medical students and to evaluate the correlation between level of motivation and later academic success in a medical college. Analytical study. Foundation University Medical College, Islamabad, from June to August 2011. A non-probability convenience sample of students from 1st to final year MBBS classes was taken after obtaining informed consent. These students filled out the 'Strength of Motivation for Medical School' (SMMS) questionnaire. The pre-admission grades of these students, along with their academic success in college according to examination results in different years, were collected. The correlation of the pre-admission grades and SMMS questionnaire scores with academic success in medical college was assessed by applying Pearson's coefficient of correlation in order to determine predictive validity. Only 46% of students revealed strong motivation. A significant, moderate correlation was found between pre-admission scores and academic success in the 1st year modular examination (0.52), which became weaker in the professional examinations of higher classes. However, no significant correlation was observed between motivation and academic success of medical students in college. Selecting medical students by pre-admission scores or motivation level alone may not be desirable. A combination of measures of cognitive ability (FSc/pre-admission test scores) and non-cognitive skills (personality traits), using the right tools, is recommended for the selection of students in medical schools.

  11. A method for estimating failure rates for low probability events arising in PSA

    International Nuclear Information System (INIS)

    Thorne, M.C.; Williams, M.M.R.

    1995-01-01

    The authors develop a method for predicting failure rates and failure probabilities per event when, over a given test period or number of demands, no failures have occurred. A Bayesian approach is adopted to calculate a posterior probability distribution for the failure rate or failure probability per event subsequent to the test period. This posterior is then used to estimate effective failure rates or probabilities over a subsequent period of time or number of demands. In special circumstances, the authors' results reduce to the well-known rules of thumb, viz. 1/N and 1/T, where N is the number of failure-free demands during the test period and T is the failure-free test period. However, the authors are able to give strict conditions on the validity of these rules of thumb and to improve on them when necessary.
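
    The 1/T rule of thumb can be checked numerically; a minimal sketch under the simplest assumptions (flat prior, Poisson likelihood with zero observed failures — a plausible special case, not necessarily the authors' exact formulation):

```python
import math

def posterior_mean_rate(T, lam_max=100.0, n=100000):
    # Zero failures observed over test time T: the Poisson likelihood is
    # exp(-lam * T), and with a flat prior the posterior for the failure
    # rate lam is exponential with mean 1/T. Crude numerical integration
    # of the posterior mean confirms the 1/T rule of thumb.
    dl = lam_max / n
    num = den = 0.0
    for k in range(1, n + 1):
        lam = k * dl
        w = math.exp(-lam * T)
        num += lam * w * dl   # integrand of E[lam | no failures]
        den += w * dl         # normalizing constant
    return num / den

T = 10.0  # hypothetical failure-free test period (arbitrary time units)
print(posterior_mean_rate(T))  # approximately 1/T = 0.1
```

    The same posterior also yields credible upper bounds on the rate, which is where the stricter validity conditions mentioned in the abstract become relevant.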

  12. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  13. Predicting Pre-Service Classroom Teachers' Civil Servant Recruitment Examination's Educational Sciences Test Scores Using Artificial Neural Networks

    Science.gov (United States)

    Demir, Metin

    2015-01-01

    This study predicts the number of correct answers given by pre-service classroom teachers in Civil Servant Recruitment Examination's (CSRE) educational sciences test based on their high school grade point averages, university entrance scores, and grades (mid-term and final exams) from their undergraduate educational courses. This study was…

  14. Posterior probability of linkage and maximal lod score.

    Science.gov (United States)

    Génin, E; Martinez, M; Clerget-Darpoux, F

    1995-01-01

    To detect linkage between a trait and a marker, Morton (1955) proposed calculating the lod score z(theta1) at a given value theta1 of the recombination fraction. If z(theta1) reaches +3, then linkage is concluded. In practice, however, lod scores are calculated for different values of the recombination fraction between 0 and 0.5, and the test is based on the maximum value of the lod score, Zmax. The impact of this deviation on the probability that linkage does not in fact exist when linkage was concluded is documented here. This posterior probability of no linkage can be derived by using Bayes' theorem. It is less than 5% when the lod score at a predetermined theta1 is used for the test. But for a Zmax of +3, we show that it can reach 16.4%. Thus, considering a composite alternative hypothesis instead of a single one decreases the reliability of the test. The reliability decreases rapidly when Zmax is less than +3: given a Zmax of +2.5, there is a 33% chance that linkage does not exist. Moreover, the posterior probability depends not only on the value of Zmax but also jointly on the family structures and on the genetic model. For a given Zmax, the chance that linkage exists may then vary.
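
    The Bayes computation behind these posterior probabilities can be sketched for the single-hypothesis (fixed theta) case; the prior below, roughly 1 chance in 50 that two loci are linked, is an illustrative assumption, not the paper's genetic model:

```python
def posterior_prob_no_linkage(lod, prior_linkage):
    # Bayes' theorem with the antilog of the lod score taken as the
    # likelihood ratio in favour of linkage (single-hypothesis case):
    # posterior odds of linkage = prior odds * 10**lod.
    lr = 10.0 ** lod
    prior_odds = prior_linkage / (1.0 - prior_linkage)
    posterior_linkage = (prior_odds * lr) / (1.0 + prior_odds * lr)
    return 1.0 - posterior_linkage

prior = 1.0 / 50.0  # illustrative prior probability of linkage
print(posterior_prob_no_linkage(3.0, prior))  # below 5% at lod +3
print(posterior_prob_no_linkage(2.0, prior))  # considerably larger
```

    Under the composite alternative (maximizing over theta) the antilog of Zmax overstates the evidence, which is why the posterior probability of no linkage climbs to the 16.4% and 33% figures quoted in the abstract.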

  15. Estimation of functional failure probability of passive systems based on adaptive importance sampling method

    International Nuclear Information System (INIS)

    Wang Baosheng; Wang Dongqing; Zhang Jianmin; Jiang Jing

    2012-01-01

    In order to estimate the functional failure probability of passive systems, an innovative adaptive importance sampling methodology is presented. In the proposed methodology, information about the variables is extracted by pre-sampling points in the failure region. An importance sampling density is then constructed from the sample distribution in the failure region. Taking the AP1000 passive residual heat removal system as an example, the uncertainties related to the model of a passive system and the numerical values of its input parameters are considered in this paper. The probability of functional failure is then estimated with a combination of the response surface method and the adaptive importance sampling method. The numerical results demonstrate the high computational efficiency and excellent accuracy of the methodology compared with traditional probability analysis methods. (authors)
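
    The pre-sampling idea can be illustrated on a scalar toy problem; a hedged sketch in which the standard normal variable, the failure threshold of 3, and the unit-variance sampling density are all illustrative assumptions, not the AP1000 model:

```python
import math
import random

random.seed(1)

def limit_state(x):
    # toy performance function: "failure" when x exceeds 3
    return x > 3.0

# Step 1: crude pre-sampling to locate the failure region
pre = [x for x in (random.gauss(0.0, 1.0) for _ in range(200000))
       if limit_state(x)]
mu_is = sum(pre) / len(pre)  # centre the sampling density on the failures

# Step 2: importance sampling from N(mu_is, 1) with likelihood-ratio weights
def phi(x, mu):
    # standard-deviation-1 normal density at x with mean mu
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2.0 * math.pi)

n = 50000
est = 0.0
for _ in range(n):
    x = random.gauss(mu_is, 1.0)
    if limit_state(x):
        est += phi(x, 0.0) / phi(x, mu_is)  # weight back to the true density
est /= n

exact = 0.5 * math.erfc(3.0 / math.sqrt(2.0))  # P(X > 3), standard normal
print(est, exact)  # the two values should agree to a few percent
```

    Shifting the sampling density into the failure region is what makes the estimator efficient for small probabilities; crude Monte Carlo would need orders of magnitude more samples for the same accuracy.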

  16. Isokinetic Strength and Endurance Tests used Pre- and Post-Spaceflight: Test-Retest Reliability

    Science.gov (United States)

    Laughlin, Mitzi S.; Lee, Stuart M. C.; Loehr, James A.; Amonette, William E.

    2009-01-01

    To assess changes in muscular strength and endurance after microgravity exposure, NASA measures isokinetic strength and endurance across multiple sessions before and after long-duration space flight. Accurate interpretation of pre- and post-flight measures depends upon the reliability of each measure. The purpose of this study was to evaluate the test-retest reliability of the NASA International Space Station (ISS) isokinetic protocol. Twenty-four healthy subjects (12 M/12 F, 32.0 +/- 5.6 years) volunteered to participate. Isokinetic knee, ankle, and trunk flexion and extension strength, as well as endurance of the knee flexors and extensors, were measured using a Cybex NORM isokinetic dynamometer. The first weekly session was considered a familiarization session; data were collected and analyzed for weeks 2-4. Repeated measures analysis of variance (alpha=0.05) was used to identify weekly differences in isokinetic measures. Test-retest reliability was evaluated by intraclass correlation coefficients, ICC(3,1). No significant differences were found between weeks in any of the strength measures, and the reliability of the strength measures was considered excellent (ICC greater than 0.9) in all cases except concentric ankle dorsiflexion (ICC=0.67). Although a significant difference was noted in weekly endurance measures of knee extension (p less than 0.01), the reliability of the weekly endurance measures was considered excellent for both knee flexion (ICC=0.97) and knee extension (ICC=0.96). Except for concentric ankle dorsiflexion, the isokinetic strength and endurance measures are highly reliable when following the NASA ISS protocol. This protocol should allow accurate interpretation of isokinetic data even with a small number of crew members.
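
    ICC(3,1), the reliability coefficient used here, can be computed from the two-way ANOVA mean squares; a sketch with hypothetical weekly strength scores (illustrative numbers only):

```python
def icc_3_1(data):
    # data: rows = subjects, columns = repeated test sessions.
    # ICC(3,1) = (BMS - EMS) / (BMS + (k - 1) * EMS), the two-way mixed,
    # single-measure, consistency form (Shrout & Fleiss convention).
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    ss_rows = k * sum((sum(row) / k - grand) ** 2 for row in data)
    ss_cols = n * sum((sum(data[i][j] for i in range(n)) / n - grand) ** 2
                      for j in range(k))
    ss_tot = sum((x - grand) ** 2 for row in data for x in row)
    bms = ss_rows / (n - 1)                              # between subjects
    ems = (ss_tot - ss_rows - ss_cols) / ((n - 1) * (k - 1))  # residual
    return (bms - ems) / (bms + (k - 1) * ems)

# hypothetical strength scores for 4 subjects over 3 weekly sessions
scores = [[100, 102, 101],
          [120, 119, 121],
          [ 90,  92,  91],
          [110, 111, 109]]
print(icc_3_1(scores))  # close to 1 for highly consistent measures
```

    Large between-subject spread combined with small week-to-week variation is exactly the pattern that produces the "excellent" (greater than 0.9) ICC values reported in the abstract.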

  17. Efficient assignment of the temperature set for Parallel Tempering

    International Nuclear Information System (INIS)

    Guidetti, M.; Rolando, V.; Tripiccione, R.

    2012-01-01

    We propose a simple algorithm able to identify a set of temperatures for a Parallel Tempering Monte Carlo simulation that maximizes the probability that configurations drift across all temperature values, from the coldest to the hottest and vice versa. The proposed algorithm starts from data gathered in relatively short Monte Carlo simulations and is straightforward to implement. We assess its effectiveness on a test-case simulation of an Edwards-Anderson spin glass on a lattice of 12^3 sites.
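
    A common starting point for such temperature sets is geometric spacing, which for a scale-invariant energy model already yields roughly equal swap acceptance between neighbours; a hedged sketch with toy Gamma-distributed energies (not the authors' algorithm):

```python
import math
import random

random.seed(2)

def geometric_ladder(t_min, t_max, n):
    # constant-ratio ("geometric") ladder: the usual first guess for a
    # parallel tempering temperature set
    r = (t_max / t_min) ** (1.0 / (n - 1))
    return [t_min * r ** k for k in range(n)]

def swap_acceptance(t1, t2, d=16, n_draws=20000):
    # Monte Carlo estimate of the replica-swap acceptance rate for a toy
    # system with Gamma-distributed energy (d quadratic degrees of freedom):
    # a swap is accepted with probability min(1, exp(db * de)), where
    # db = 1/t1 - 1/t2 and de = e1 - e2.
    acc = 0.0
    for _ in range(n_draws):
        e1 = t1 * random.gammavariate(d / 2.0, 1.0)
        e2 = t2 * random.gammavariate(d / 2.0, 1.0)
        acc += min(1.0, math.exp((1.0 / t1 - 1.0 / t2) * (e1 - e2)))
    return acc / n_draws

ladder = geometric_ladder(0.5, 2.0, 8)
rates = [swap_acceptance(a, b) for a, b in zip(ladder, ladder[1:])]
print([round(x, 2) for x in rates])  # roughly flat across the ladder
```

    Real spin-glass energy distributions are not scale invariant, which is why data-driven adjustment of the ladder, as in the abstract, improves on the geometric guess.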

  18. CORRELATION OF SPOT URINE ALBUMIN AND 12-HOUR URINE PROTEIN WITH 24-HOUR URINE PROTEIN IN PRE-ECLAMPSIA

    Directory of Open Access Journals (Sweden)

    S. Vinayachandran

    2017-11-01

    Full Text Available BACKGROUND Pre-eclampsia is defined as the development of new-onset hypertension in the second half of pregnancy, often accompanied by new-onset proteinuria, with other signs and symptoms. Proteinuria is defined as the excretion of 300 mg or more of protein in a 24-hour urine collection. To avoid the time consumed in collecting 24-hour urine specimens, efforts have been made to develop faster methods of determining the concentration of urine protein. Preliminary studies have suggested that a 12-hour urine protein collection may be adequate for the evaluation of pre-eclampsia, with the advantages of earlier diagnosis and treatment of pre-eclampsia as well as the potential for early hospital discharge and increased compliance with specimen collection. The aim of the study is to evaluate and correlate spot urine albumin and 12-hour urine protein with 24-hour urine protein in pre-eclampsia. MATERIALS AND METHODS A diagnostic evaluation study: 24-hour urine protein, 12-hour urine protein and spot urine albumin results were analysed. The correlation of 12-hour urine protein and spot urine albumin with 24-hour urine protein was analysed using SPSS software. The strength of correlation was measured by Pearson's correlation coefficient (r). Student's t-test and Chi-square tests were used to compare patients with and without 24-hour urine protein ≥300 mg. The correlation of spot urine albumin above 165 mg with 24-hour urine protein ≥300 mg suggests that this test has a role in the evaluation of women with suspected pre-eclampsia and could be substituted for the 24-hour urine protein as a simpler, faster and cheaper method.
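
    Pearson's r, the statistic used in this study, is straightforward to compute; a sketch with hypothetical paired measurements (illustrative numbers only, not the study's data):

```python
import math

def pearson_r(x, y):
    # Pearson product-moment correlation coefficient
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# hypothetical paired measurements (mg): spot albumin vs 24-hour protein
spot   = [80, 150, 170, 220, 300, 410]
full24 = [120, 260, 310, 420, 610, 800]
print(round(pearson_r(spot, full24), 3))  # strong positive correlation
```

    A high r between the quick test and the 24-hour reference is the quantitative basis for substituting the former for the latter.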

  19. An Enhanced Non-Coherent Pre-Filter Design for Tracking Error Estimation in GNSS Receivers.

    Science.gov (United States)

    Luo, Zhibin; Ding, Jicheng; Zhao, Lin; Wu, Mouyan

    2017-11-18

    Tracking error estimation is of great importance in global navigation satellite system (GNSS) receivers. Any inaccuracy in estimating the tracking error degrades the signal tracking ability of the tracking loops and the accuracy of position fixing, velocity determination, and timing. Tracking error estimation can be done by a traditional discriminator or by a Kalman filter-based pre-filter. Pre-filters can be divided into two categories: coherent and non-coherent. This paper focuses on performance improvements of the non-coherent pre-filter. Firstly, the signal characteristics of coherent and non-coherent integration, which are the basis of tracking error estimation, are analyzed in detail. After that, the probability distribution of the estimation noise of the four-quadrant arctangent (ATAN2) discriminator is derived from the mathematical model of coherent integration. Secondly, the statistical properties of the observation noise of the non-coherent pre-filter are studied through Monte Carlo simulation in order to set the observation noise variance matrix correctly. Thirdly, a simple fault detection and exclusion (FDE) structure is introduced into the non-coherent pre-filter design, extending its effective working range for carrier phase error estimation from (-0.25 cycle, 0.25 cycle) to (-0.5 cycle, 0.5 cycle). Finally, the estimation accuracies of the discriminator, the coherent pre-filter, and the enhanced non-coherent pre-filter are evaluated comprehensively in a carefully designed experiment scenario. The pre-filter outperforms the traditional discriminator in estimation accuracy. In a highly dynamic scenario, the enhanced non-coherent pre-filter provides accuracy improvements of 41.6%, 46.4%, and 50.36% for carrier phase error, carrier frequency error, and code phase error estimation, respectively, when compared with the coherent pre-filter. The enhanced non-coherent pre-filter also outperforms the coherent pre-filter in code phase error estimation when the carrier-to-noise density ratio ...
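
    The extended (-0.5, 0.5) cycle range of the four-quadrant arctangent versus the two-quadrant arctangent can be seen with noise-free correlator outputs; a minimal sketch (illustrative only, not the paper's pre-filter):

```python
import math

def atan_discriminator(i, q):
    # two-quadrant arctangent: insensitive to the data-bit sign, but its
    # unambiguous range is only (-0.25, 0.25) cycle
    return math.atan(q / i) / (2.0 * math.pi)

def atan2_discriminator(i, q):
    # four-quadrant arctangent (ATAN2): unambiguous over (-0.5, 0.5) cycle
    return math.atan2(q, i) / (2.0 * math.pi)

def iq(phase_err_cycles):
    # noise-free prompt correlator outputs for a given carrier phase error
    return (math.cos(2.0 * math.pi * phase_err_cycles),
            math.sin(2.0 * math.pi * phase_err_cycles))

for true_err in (0.1, 0.2, 0.35):
    i, q = iq(true_err)
    print(true_err, round(atan_discriminator(i, q), 3),
          round(atan2_discriminator(i, q), 3))
```

    At 0.35 cycle the two-quadrant form wraps to a wrong estimate while ATAN2 recovers the true error, which is the range extension the FDE-augmented pre-filter exploits.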

  20. Test-retest reliability of the Military Pre-training Questionnaire.

    Science.gov (United States)

    Robinson, M; Stokes, K; Bilzon, J; Standage, M; Brown, P; Thompson, D

    2010-09-01

    Musculoskeletal injuries are a significant cause of morbidity during military training. A brief, inexpensive and user-friendly tool that demonstrates reliability and validity is needed to effectively monitor the relationship between multiple predictor variables and injury incidence in military populations. To examine the test-retest reliability of the Military Pre-training Questionnaire (MPQ), designed specifically to assess risk factors for injury among military trainees across five domains (physical activity, injury history, diet, alcohol and smoking). Analyses were based on a convenience sample of 58 male British Army trainees. Kappa (kappa), weighted kappa (kappa_w) and intraclass correlation coefficients (ICC) were used to evaluate the 2-week test-retest reliability of the MPQ. For index measures constituting the assessment of a given construct, internal consistency was assessed by Cronbach's alpha coefficients. Reliability of individual items ranged from poor to almost perfect (kappa range = 0.45-0.86; kappa_w range = 0.11-0.91; ICC range = 0.34-0.86), with most items demonstrating moderate reliability. Overall scores for the physical activity, diet, alcohol and smoking constructs were reliable between the two administrations (ICC = 0.63-0.85). Support for the internal consistency of the incorporated alcohol (alpha = 0.78) and cigarette (alpha = 0.75) scales was also provided. The MPQ is a reliable self-report instrument for assessing multiple injury-related risk factors during initial military training. Further assessment of the psychometric properties of the MPQ (e.g. different types of validity) in military populations/samples will support its interpretation and use in future surveillance and epidemiological studies.
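
    Cohen's kappa, the item-level test-retest statistic reported here, corrects observed agreement for agreement expected by chance; a sketch with hypothetical questionnaire responses (illustrative data only):

```python
def cohens_kappa(r1, r2):
    # Cohen's kappa for two categorical ratings of the same items:
    # (observed agreement - chance agreement) / (1 - chance agreement)
    n = len(r1)
    cats = set(r1) | set(r2)
    po = sum(a == b for a, b in zip(r1, r2)) / n
    pe = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)
    return (po - pe) / (1.0 - pe)

# hypothetical test-retest answers to a yes/no questionnaire item
test1 = ["y", "y", "n", "y", "n", "n", "y", "y", "n", "y"]
test2 = ["y", "y", "n", "y", "n", "y", "y", "y", "n", "n"]
print(round(cohens_kappa(test1, test2), 2))  # moderate agreement
```

    Unlike raw percent agreement, kappa stays near zero when two administrations agree only as often as chance predicts, which is why it is preferred for reliability reporting.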